Tag: google

  • The iPhone 12 Pro LiDAR Scanner is the Gateway to AR, But Not in the Way You Think

    I’m sure everyone knows about it by now: the iPhone 12 Pro has a LiDAR scanner. Apple touts it as helping you take better pictures in low light and do some rudimentary AR on the iPhone. But what this scanner does today isn’t where the power will be tomorrow.

    Apple cares a ton about photo quality, so a LiDAR scanner helps immensely with taking these pictures. If there is one reason today to have that scanner, it is for pictures. But the real power of the scanner is for AR. And AR isn’t ready today, no matter how many demos you see in Apple’s event. Holding up an iPhone to see how big a couch would look in your room is interesting, but only about as interesting as using your phone to find the nearest Starbucks.

    Apple has spent a lot of time working on interior spaces in Apple Maps. They’ve also spent a ton of time working on sensors in the phone for positioning inside buildings. This is all building to an AR navigation space inside public buildings and private buildings in which owners share their 3D plans. But what if hundreds of millions of mobile devices could create these 3D worlds automatically as they go about their business helping users find that Starbucks?

    The future with this scanner is bright, though. It helps Apple and developers get familiar with what LiDAR can do for AR applications. This is critically important on the hardware side because Apple Glass, no matter how little is known about it, is the future for AR. The same goes for Google Glass: the eventual consumer product of these wearable AR devices (ignoring the junk that the first Google Glass was) will change the world, not so much because you’ll see an arrow as you navigate to the Starbucks, but because it will give you insight into smart buildings and all the IoT devices around you.

    The inevitable outcome is in the maintenance of smart buildings

    Digital Twins are valuable when they link data feeds to a 3D world that can be interrogated. But the real value comes when those 3D worlds can be leveraged using Augmented Reality to give owners, maintenance workers, planners, engineers, and tenants the information they need to service their buildings and improve the quality of building maintenance. The best-built LEED building is only as good as the ongoing maintenance put into it.

    The iPhone 12 Pro and the iPad Pro that Apple released this year both have LiDAR to improve photo taking and enable rudimentary AR, but the experience gained from seeing consumer LiDAR used in millions of devices will go a long way toward making these Apple/Google Glass devices truly usable in the real world. I’m still waiting to get my iPhone 12, but my wife’s arrived today. I’m looking forward to seeing what the LiDAR can do.

  • Google AI Project Recreating Historical Streetscapes in 3D

    When this caught my eye I got really interested. Google AI is launching a website titled rǝ which reconstructs cities from historical maps and photos. You might have seen the underlying tool last month, but this productizes it a bit. What I find compelling about this effort is that the output is a 3D city you can navigate and review by going back in time to see what a particular area looked like in the past.

    Of course, Scottsdale, my town, is not worth attempting this on, but older cities that have seen a ton of change will give some great insight into how neighborhoods have changed over the past century.

    Street level view of 3D-reconstructed Chelsea, Manhattan

    Just take a look at the image above; it really does give the feel of New York back in the ’40s and earlier. People remember how a neighborhood looked, but recreating it this way gives others key insights into how development has changed the way certain areas of cities look and act.

    This tool is probably more aimed at history professors and community activists, but as we grow cities into smarter, cleaner places to live, understanding the past is how we can hope to create a better future. I’d love to see these tools incorporated into smart city planning efforts. The great part of all this is that it is crowdsourced, open-sourced, and worth doing. I’m starting to take a deeper dive into the GitHub repository to look at how the output of this project can help plan better cities.

  • Uber and Google Sign 4 Year Agreement on Google Maps

    This is one of those surprised/not surprised things.

    Uber Technologies Inc. announced that it has entered into a Google master agreement under which the ride-hailing company will get access to Google Maps platform rides and deliveries services.

    I mean, today Uber already uses Google Maps in their app, even on iOS. This is basically a continuation of the previous agreement with some changes that better align with how Uber does business. Rather than being based on the number of requests Uber makes to Google Maps services, pricing is based on billable trips booked through Uber, a much more manageable deal for Uber. Last year it came out that Uber had paid Google $58 million over the previous three years for access to Google Maps. This quote really strikes me as bold:

    “We do not believe that an alternative mapping solution exists that can provide the global functionality that we require to offer our platform in all of the markets in which we operate. We do not control all mapping functions employed by our platform or Drivers using our platform, and it is possible that such mapping functions may not be reliable.”

    For as much money as Uber has invested in mapping, they don’t believe their technology is reliable enough to roll out to the public. That is mapping services in a nutshell: when your business depends on the best routing and addressing, you pick Google every time. All that time and effort to build a mapping platform, and they still pay another company tens of millions of dollars.

    I’ve read so much about how Uber is just about ready to release their own mapping platform built on OSM. But in the end the business requires the best mapping platform and routing services, and clearly nobody has come close to Google in this regard. Google Maps is not only the standard but, at this point, almost a requirement.

  • It is Different With COVID-19…

    I started blogging in May of 2005. Right before Katrina hit and everything we knew about GIS disaster response changed. Katrina was that moment where the static image PDF of a map changed to a map service that ran on almost any modern (at the time) web browser. Immediately every GIS map server that was out there became irrelevant at best, dead to the world at worst. Remember though, Google bought Keyhole, the company behind Google Earth, almost a year before Katrina, and Google Maps didn’t launch until early 2005. The tools that created this disaster response revolution were in place, but not too many people used them or had heard of them. Yet less than 6 months after Google Maps hit the web, Katrina response was almost entirely driven by their tools.

    Remember this? Don’t try and pan!

    If you look at my blog entries from September and October, you can see attempts by Esri, Microsoft, Yahoo!, and others to address this new paradigm of mapping, but none of them stuck. Everyone, and I mean everyone, was using Google. Esri ArcScripts back then probably had 50 tools to convert SHP to KML or MXD to KML. We had tools like Arc2Earth that specialized in making maps easier with Google. And while Esri tools were still being used to generate the data, the display was happening on other platforms.

    This of course gave rise to the Neogeography revolution. I’ll spare you the bare-breasted Andrew Turner graphic, but at this time we had so many people doing things with GIS who had no idea what GIS was, let alone what Esri was. The barriers to getting started with mapping went down, and all you needed was a computer and a text editor to make a map. My blog is littered with examples of Neogeography, from EVS Islands to all that great Flickr mapping that Dan Catt and crew did back then. People didn’t ask for permission; they just did it. It all culminated in what I consider the greatest crowdsourced disaster mapping effort: the wildfires in San Diego back in 2007 (feel free to choose the Haiti response over this, that’s fine; I really like the example of using Google My Maps in your own backyard).

    In all fairness, Andrew wasn’t literally saying it killed GIS.

    But something happened after this. It isn’t that people stopped mapping; look at OSM growth. The amount of crowdsourced data continues to grow exponentially. But responses to disasters seemed to be run by Google and Microsoft themselves. Tools like Google My Maps continue to exist, but I truly can’t recall using one in the past 10 years. Or, if the disaster wasn’t interesting enough for Google, you’d see people using government websites to get that information. Esri mapping had finally caught up to the point where people would use the fire maps from the DOI and other three-letter agencies without complaining. The citizen effort moved to Twitter, where it continues to show great promise, just not as a Google My Map. Take a look at the Bush Fire here in Arizona on Twitter. So many great posts by people, but the maps are either shared static images or links to traditional InciWeb maps.

    This brings us full circle to COVID-19 mapping. Think of the best and most up-to-date COVID websites. They are built on Esri technology. Google has websites, and Microsoft has them too, but the Esri dashboard has finally had its moment in the sun. I wonder if this is because the market has matured, because the tools have matured, or because the data set lends itself to a more scientific approach to display rather than simple lines and points. The Johns Hopkins COVID-19 Maps & Trends website is the bible for this epidemic.

    GIS is no longer a sideshow in this response. I’m guessing that because this is more structured government data, Esri is uniquely positioned to be in the middle of it, but even then, their tools have come a long way from the ArcIMS/ArcWeb madness we dealt with during Katrina. The COVID-19 dashboard is the opposite of Neogeography, and that is OK. The influence of citizens on mapping is clearly visible in the Esri tools we deal with today. They still drive me nuts from time to time, but let’s be honest, they really do work for this situation. As we close out the first half of 2020, hopefully we can keep the need for disaster response to a minimum.

  • Facebook Acquiring Mapillary is More Than You Think

    I’ve been working on this blog post all weekend and I’ve rewritten it many times. It comes back to the confusion about why Mapillary and Facebook are now part of the same team. I wrote down about 10 guesses as to why Facebook decided it needed Mapillary, and needed them now, but Joe Morrison did such a good job outlining many of them that I’ll share it here. Go read it and come back after you’re done; I’ll wait.

    Welcome back. Now what do I think about this? Hard to say honestly; I can talk myself out of any idea. Get back at Google? I don’t think things are that emotional. Sure, they probably should own their own mapping solution, as sending all their users to another platform leaks their secret sauce and is probably a boon for Google. But this isn’t something they haven’t already been working on, and I can’t see how Mapillary, as amazing as it is, moves the needle on this at all. Any work toward a Facebook Maps platform has been done and is probably close to happening. I could see that amazing Mapillary team being an acqui-hire that could help in the long term given their expertise with OpenStreetMap.

    Computer vision, AR/VR, and the rest *could* be a reason, but remember that Facebook owns Oculus and has done so much in AR that, again, Mapillary is a rounding error here. While Oculus has not paid off the way I’m sure Facebook hoped it would, the engineering and development teams there have clearly influenced Facebook. Mapillary, as amazing as those guys are, just doesn’t have the horsepower that the existing AR/VR/CV teams at Facebook do. Again, maybe an acqui-hire.

    A place database is of course the holy grail of mapping. The maps are a commodity, but the places are not. And let’s be honest, there are very few companies that have better place data than Facebook. They might not have had street-level view data, but they sure had more pictures of these venues than almost anyone else. I get that people like street view data, but how often do people really say, “let me see a street view image from 2011” when they are looking at directions? THEY DON’T. Street view is the coffee shop mapping example: it sounds interesting and looks great in demos, but in the end it’s not as important as a 3D world built from satellite imagery and lidar. But wait, that’s where Mapillary does come in.

    The most likely reasons, I feel, that Facebook bought Mapillary are their expertise with OpenStreetMap and OpenSfM. Facebook is one of the largest users of OSM out there, so bringing in a group that is as experienced with OSM, if not more so, helps move the needle with their mapping efforts. The second thing Mapillary brings is their skill at making 3D worlds out of imagery. As I said, who has better pictures of venues than Facebook? Start stitching those together and you get an amazing 3D city that is updated quicker than driving stupid cars down streets. Encourage people to take pictures and they update the 3D world for you. That, and they get some of the best OSM ninjas out there all at once.

    Now what happens to the crowdsourced data? Will people continue to participate, given that there are few companies more reviled for their data management than Facebook? That is what I’m most interested in: Mapillary the product, does it continue? Time will tell.

  • The Story Behind Earthgoogle

    If you search my blog you’ll find an interesting post titled earthgoogle. Well it really isn’t that interesting, it just has a link to download Google Earth and a link to my blog. So what is this thing and why does it have such a weird title?

    For those that might not remember, 2005 was a crazy time for GIS blogs. Katrina brought satellite imagery to everyone and people searched the internet for ways to find out more. Google Earth was probably the easiest and best way for the average person to learn more about satellite imagery and get some really helpful tools to mark up the area.

    This was about as amazing as anything anyone had seen outside of our industry.

    About this time, in September 2005, I noticed a lot of people arriving at my blog via the search term “earthgoogle”. Like most people who blogged back then, I loved to talk about blogging, so I created a simple blog post asking what this was all about.

    To all those reaching this site using MSN search with the term “earthgoogle” hello. You’ve been filling up my server logs with this request. I’m curious why you’ve typed this in to only MSN search and not Google/Yahoo/other search engines.

    So obvious, right? MSN users, not typing a URL correctly? Anyway, what this blog post of mine actually did was make this page the number one result in Google for the search term “earthgoogle”. I got so much traffic by being the way most people who didn’t understand how URLs work found Google Earth. Eventually I changed the page to what you see now.

    I put Google AdSense on that page too. I mean everywhere (I really wish I had taken a screenshot because it was so tacky). The result of all that tackiness was that I was making over $1,000 a month in ad revenue from that blog post alone. People who wanted to find “earthgoogle” apparently also liked to click on ads.

    Eventually the traffic died down and people stopped being directed to my blog via searches for “earthgoogle”. I probably pulled the ads off the blog in 2006 and couldn’t care less. But the page remains, a reminder of how crazy Google Earth was back in 2005.

  • Download Your Fusion Tables Data

    I first wrote about Fusion Tables back in 2010.

    Google Fusion Tables – Are you kidding me? These stuff is “teh awesome”. Fusion tables are going to be more “killer” than Google Maps was. Yup, pay attention.

    cageyjames

    “teh awesome”? Seriously, who says that? Well, I guess I did, and that’s OK. Was it more “killer” than Google Maps? Obviously not. It’s not that Fusion Tables was wrong; it’s just that there are so many alternatives that it no longer matters the way it did when it first arrived.

    Well if you’re like me, you probably have a lot of data in Fusion Tables and Google just sent out an email explaining how to get it out.

    If you created many tables over the years, we’ve made it easy to download all your data in one step with a new dedicated Fusion Tables option in Google Takeout. You can save the rows, metadata and geometries of any base tables that you own, and export this data in the following formats: JSON, CSV and KML.

    It’s a really nice tool, just tried it myself on some baseball data that I had in there. Google explains the tool as such:

    The data for each table is saved to its own “archive”. The data will be saved in a Google Sheet; for datasets beyond the size limits of Sheets, you’ll get a CSV. This archive is stored in a top level folder called “ft-archive” in your Drive.

    A Google Maps visualization is automatically created with the archived data. This map preserves many of the original Fusion Tables styling configurations. Any changes you make to the Sheet or CSV will appear in the map visualization.

    A listing of all archived tables is stored in a Sheet. This handy Sheet is called “ft-archive-index” and lives within the “ft-archive” folder. The index Sheet summarizes each run of the archive tool and preserves the visualization URLs with encoded styles. Each time you run the archive tool, you will get additional archives based on the current data in your tables along with corresponding new rows in the archive directory.

    You have until December 3, 2019 to get your data out. Google Takeout makes it easy, which is really nice.
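
    If you’d rather push one of those exported tables somewhere other than a Google Sheet, a quick sketch like this works. It assumes the Takeout CSV has simple latitude/longitude columns; the column names below are hypothetical, so check what your own table actually used.

    ```python
    import csv
    import json

    # Minimal sketch: convert a Fusion Tables CSV export to GeoJSON.
    # Assumes the export has "name", "latitude" and "longitude" columns --
    # those names are hypothetical; adjust to whatever your table used.

    def csv_to_geojson(csv_path, out_path):
        features = []
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                try:
                    lat = float(row["latitude"])
                    lng = float(row["longitude"])
                except (KeyError, ValueError):
                    continue  # skip rows without usable coordinates
                features.append({
                    "type": "Feature",
                    "geometry": {"type": "Point", "coordinates": [lng, lat]},
                    "properties": {"name": row.get("name", "")},
                })
        with open(out_path, "w") as f:
            json.dump({"type": "FeatureCollection", "features": features}, f)

    csv_to_geojson("ft-archive-export.csv", "fusion_tables.geojson")
    ```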

  • Plus Codes: Another Attempt at Addressing Places Without Street Addresses

    Yes, everyone knows about What3Words. It was an attempt to come up with an easy way to assign addresses to places where there are none. In the end, a proprietary addressing system will never gain traction, and of course the inevitable eventually happened. My personal feeling is that What3Words never really got us beyond x/y numbering, and the logic behind an addressing system was not there. Enter Plus codes, which come at this problem from a different perspective. There is a very detailed analysis of existing methods and why they chose to go this direction that I’ll leave it up to you to read.

    Probably the biggest reason to pay attention is that this open addressing system was developed by Google. In fact, they are already implementing it in India as we speak, which goes a very long way toward making this happen.

    All these systems are built on the idea that the world is a grid, and how deeply you drill down into that grid is your address. Things need not be a single point; they can be an area, which opens up many exciting ideas for addressing, especially outside of North America and Europe. A rough sketch of that grid idea is below. Check out the GitHub project to learn more.
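
    To make the grid idea concrete, here’s a rough Python sketch of how a Plus code gets built: each pair of characters subdivides the current cell by 20 in both latitude and longitude, so a longer code means a smaller area. This is a simplified illustration of the published scheme, not the official Open Location Code library (which also handles clipping, padding, and short codes), so use Google’s implementation for anything real.

    ```python
    # Rough sketch of the grid idea behind Plus codes (Open Location Code).
    # Simplified illustration only -- not the reference implementation.

    ALPHABET = "23456789CFGHJMPQRVWX"  # 20 symbols, no vowels or look-alikes

    def encode_sketch(lat, lng, pairs=5):
        """Encode a point by repeatedly subdividing the world grid.

        Each pair of characters refines the cell by a factor of 20 in both
        latitude and longitude, so more characters = a smaller area.
        """
        lat += 90.0    # shift latitude into 0..180
        lng += 180.0   # shift longitude into 0..360
        code = ""
        lat_res = lng_res = 20.0  # the first pair uses 20-degree cells
        for i in range(pairs):
            lat_digit = int(lat / lat_res)
            lng_digit = int(lng / lng_res)
            code += ALPHABET[lat_digit] + ALPHABET[lng_digit]
            lat -= lat_digit * lat_res
            lng -= lng_digit * lng_res
            lat_res /= 20.0
            lng_res /= 20.0
            if i == 3:  # standard codes put a '+' after eight characters
                code += "+"
        return code

    # Example: a point in central Phoenix, AZ
    print(encode_sketch(33.45, -112.07))
    ```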

  • Waze sued for allegedly stealing data from another navigation app

    Well, I’m not sure how much this has to do with Waze being owned by Google, but PhantomAlert is suing Waze.

    Before the advent of GPS and navigation apps, cartographers sneaked “paper towns” and “trap streets” into their maps—fake points of interest that they used to detect plagiarism. If someone copied their map, it would be easily identifiable through the inclusion of those locations. That same trick has found its way into modern-day mapping systems: A new lawsuit brought against Google and its traffic app Waze cites sham points of interest as evidence that the Google-owned service copied from a competitor’s database.

    Apparently these two companies tried to make a deal before Google snapped up Waze, and PhantomAlert is alleging that Waze used their database to “boost its profile”. One of the biggest concerns in the OpenStreetMap community is allowing these intentional mistakes into their database. Copyright Easter Eggs are well documented on the OSM website.

    Copyright Easter Egg, in terms of mapping, is a feature that is drawn in a distinctive way in order to help identify its original author. It may be a nonexistent, or slightly or heavily distorted, map feature, or its name may be wrongly or unusually spelt.

    The supposed main purpose of such a feature is to strengthen the author’s case in a copyright dispute. If he can show that his own unique feature appears in the defendant’s work, it is easier to prove that the defendant’s work is a copy of his.


    Hey look, I got to use the new Google logo already!

    Yeah, so if this is true, PhantomAlert has a pretty good case that Waze stole their data, and it could mean big trouble for Google. Having a closed database like this opens Waze up to these kinds of lawsuits because they are unable to have the community police the data. The big question is whether this data was imported into Waze intentionally or by accident. I don’t think the latter will get them off the hook, but if there was intent it could be costly. We’ll have to see. The Waze byline about “outsmarting traffic, together” might not be too smart.
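
    For what it’s worth, the detection side of a trap street scheme is simple enough to sketch. Here’s a hypothetical check that looks for your planted fake features in someone else’s point database; the POI names, coordinates, and the 50-meter tolerance are all made up, but finding several matches like this is exactly the kind of evidence PhantomAlert is pointing to.

    ```python
    import math

    # Hypothetical sketch of a "trap feature" check: see how many of the fake
    # POIs we planted in our own database show up in a competitor's data.

    def haversine_m(lat1, lng1, lat2, lng2):
        """Great-circle distance between two points in meters."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lng2 - lng1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def find_trap_matches(planted, other_db, tolerance_m=50.0):
        """Return the planted fake POIs that also appear in the other database."""
        matches = []
        for name, lat, lng in planted:
            for other_name, olat, olng in other_db:
                close = haversine_m(lat, lng, olat, olng) <= tolerance_m
                same_name = name.lower() == other_name.lower()
                if close and same_name:
                    matches.append((name, other_name))
                    break
        return matches

    # Fake planted features (they don't exist anywhere -- that's the point)
    planted = [("Elm St Speed Trap 42", 33.4942, -111.9261),
               ("Old Mill Rd Camera", 33.4180, -111.9400)]
    competitor = [("elm st speed trap 42", 33.4943, -111.9262),
                  ("Some Real Restaurant", 33.4000, -111.9000)]

    print(find_trap_matches(planted, competitor))
    ```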

  • Google Here Never Was

    Indoor mapping is the white whale of our Spatial IT industry. We’re always reading about how our smartphones will lead us to the best deals, or how I can find the specific nail I need in Home Depot without having to ask anyone or walk down every aisle. The key to all this is essentially iBeacon.

    You can search Google News for all the latest excitement about the concept, but essentially it is a way for your phone to know where things are, and for vendors to know where your phone is, through Bluetooth. Imagine walking into a store, getting an alert that your favorite beer is on sale, and then being able to navigate directly to it. Sexy, right? Plus, we’ve been anticipating this happening for years. Except:

    Google was set to launch a new product that added context to one of its most successful apps, Google Maps. But earlier this year, it was shut down by Alphabet CEO Larry Page, according to people familiar with the project.

    Google Here worked by sending a notification to a smartphone user’s lock screen within five seconds of their entering a partner’s location. If the user clicked on the notification, a full screen HTML5 “app” experience would launch. Google Here would know when to send the notification via Google Maps and beacons placed in the stores of participating partners. Google planned to supply the beacons to partners for the launch, according to the document. The experience could also be found by going to the Google Maps app.

    Exactly what we thought everyone wanted. In testing, the application was deemed too invasive and Google feared no retailers would sign up. That’s right, Google didn’t think it could get partners to install cheap beacons in their stores AND it feared the product was too Big Brother. Seems weird, doesn’t it? If there is one company that can get companies to spend money on ads, it is Google. And since when did Google ever think pushing ads on us was “invasive”?

    The magic of Google Here (Here as in not the Here that was owned by Nokia) was that you didn’t need an app running for it to work. Think about that for a minute: ads would appear on your phone based on where you were, and you didn’t need to opt in to get them. Now we see why Google was very concerned that Here was going to get a large backlash. Being able to push ads on users would have been something they really could have sold well to companies; I’m not sure there would have been any fear of companies not wanting to push ads on us.

    Beacons are still very important to Google. Their Eddystone project talks about lots of uses for beacons, just not ad delivery. Clearly there was feedback on this project, and it jolted Google out of their normal sell-more-ads business model. I think beacons will be very valuable as they start appearing in more places, but I for one don’t need to get an ad for fabric softener every time I walk into a Target.
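
    If you’re curious what a beacon actually broadcasts, here’s a rough Python sketch of pulling the namespace and instance IDs out of an Eddystone-UID advertisement frame. The byte offsets follow the published Eddystone layout as I understand it (frame type, TX power, 10-byte namespace, 6-byte instance), but treat it as illustrative: the sample bytes are made up, and real BLE scanning is left to whatever Bluetooth stack you’re using.

    ```python
    # Rough sketch: decode an Eddystone-UID frame (the service data advertised
    # under UUID 0xFEAA). Layout, per the published spec as I understand it:
    #   byte 0       frame type (0x00 = UID)
    #   byte 1       calibrated TX power at 0 m (signed)
    #   bytes 2-11   10-byte namespace ID
    #   bytes 12-17  6-byte instance ID
    # The sample frame below is made up for illustration.

    def parse_eddystone_uid(frame: bytes):
        if len(frame) < 18 or frame[0] != 0x00:
            raise ValueError("not an Eddystone-UID frame")
        tx_power = int.from_bytes(frame[1:2], "big", signed=True)
        namespace = frame[2:12].hex()
        instance = frame[12:18].hex()
        return {"tx_power_dbm": tx_power, "namespace": namespace, "instance": instance}

    sample = bytes([0x00, 0xEB]) + bytes(range(1, 11)) + bytes(range(11, 17))
    print(parse_eddystone_uid(sample))
    ```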