Want to know more about Google’s Project Tango? The most frequently asked questions about Project Tango answered by Steegle.com
Google ATAP is Google’s Advanced Technology and Projects group. The group is best known for being in charge of both Project Ara and Project Tango. Formerly known as Motorola's Advanced Technology and Projects group, ATAP recently transferred to Google following the sale of Motorola Mobility to Lenovo. Fairly radical in the way they work, they describe themselves as “A small band of pirates. Believers. Makers. We like epic shit.” They are based in California in the United States of America.
Project Tango is an attempt to create a mobile device which understands space and motion just like humans do. Google ATAP has worked with universities, research labs, and industrial partners to put the last decade of work in robotics and computer vision into a unique mobile phone. They are trying to enable mobile phones to see the world beyond the boundaries of the screen.
When Google initially came up with Project Tango, in early 2013, they described it as a Simultaneous Localization and Mapping (SLAM) system capable of operating in real time on a phone. Google’s ATAP teamed up with a number of organisations to create Project Tango from this description.
A number of companies came together to develop Project Tango. All of these are listed in the credits of the Google Project Tango introduction video called “Say hello to Project Tango!” Each company has had a different amount of involvement. The participating companies listed in that video include:
· NASA JPL
What stage of development is Project Tango at?
Google ATAP, alongside a number of other companies, has created 200 prototypes that are being distributed to eager developers and partners. They have built the hardware and software and now want other developers to help them imagine the opportunities that this technology offers.
What problem does Project Tango solve?
Google ATAP explains that the world currently has a problem with mapping indoors. Put simply, this is the main issue that Project Tango looks to solve. Yet, by solving this issue, the device could have a far-reaching impact on a variety of problems in a host of different industries.
What is the Project Tango prototype?
The current prototype is a 5” Android phone. This phone has highly customised hardware and software which tracks the 3-dimensional motion of the device while creating a map of the environment. What makes it unique is that its sensors allow the phone to make over a quarter of a million 3D measurements every second. The phone updates its position and orientation in real time and combines all of that data into a single 3D model.
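As a rough illustration of what one of those measurements involves, the sketch below (not Tango's actual code; the camera intrinsics `fx`, `fy`, `cx`, `cy` are made-up example values) back-projects a single depth reading into a 3D point using a standard pinhole camera model:

```python
# Illustrative sketch, not the Tango SDK: back-projecting one depth
# measurement into a 3D point with a pinhole camera model. The
# intrinsics (fx, fy, cx, cy) below are made-up example values.

def depth_pixel_to_3d(u, v, depth_m, fx=520.0, fy=520.0, cx=320.0, cy=240.0):
    """Convert a pixel (u, v) with a depth reading (in metres) to (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A quarter of a million conversions like this per second yield the
# point cloud that the device fuses into a single 3D model.
point = depth_pixel_to_3d(320, 240, 2.0)  # pixel at the optical centre
print(point)  # → (0.0, 0.0, 2.0)
```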
How will Project Tango be used?
The possibilities are huge. Project Tango’s sensors make it possible to show where particular objects are in a 3D space. So, the most obvious developments are likely to be 3D-mapping apps to assist room and building planning. Google ATAP gives the examples that the phone could show you where the items you want to buy are located in a shop, and that it could allow you to play hide-and-seek with your favourite video game characters.
So, say you are a theatre designer; you could capture the dimensions of the stage before you start building the pieces of furniture. Also, this could have a major impact on gaming. A number of developers are exploring using real spaces to play a variety of virtual games.
Google believes the possibilities are vast and they are now working with developers and external companies to help them achieve this and more.
Can I get one for myself?
Not yet, but you may be able to soon. Currently, they only have early prototypes which are not ready for everyone, as the functionality is a bit rough around the edges. Yet, it may not be too far away, as these prototypes are now in the hands of developers who are keen to test its boundaries and help it evolve.
What coding language does Project Tango use?
The main languages are Java and C/C++ on the Android platform. The phone runs Android, Google’s standard mobile operating system. The ATAP team have included development tools that show developers how to make all of the software work together. Java and C/C++ APIs are also used to provide position, orientation and depth data.
The Project Tango device is made up of a camera and sensor array which runs on an Android phone. The hardware was built by the ATAP team, alongside other companies, and mainly consists of a cell phone, a wide angle camera, depth sensors, an inertial measurement unit and hardware which processes low power images.
For those who are more technically minded, the actual arrangement that enables depth-sensing includes: an infrared projector, a 4 MP rear-facing RGB/IR camera and a 180° field-of-view fisheye rear-facing camera. The other key components are:
· Snapdragon 800 quad core (up to 2.3 GHz per core) CPU with 2 GB LPDDR3 RAM
· 64 GB internal storage, expandable by microSD
· 5" LCD screen
· 9-axis accelerometer/gyroscope/compass
Can you connect other hardware to Project Tango?
Yes. Project Tango includes several ports, including:
· Micro HDMI
· USB 3.0
· A microphone grille
What battery does Project Tango use?
Project Tango uses a 3000 mAh battery.
What is the memory and processing power of Project Tango?
The following components are powering Project Tango:
· Elpida FA164A1PB 2 GB LPDDR3 RAM
· Qualcomm 8974 (Snapdragon 800) processor
· 2 Movidius Myriad 1 computer vision co-processors
· 2 AMIC A25L016 16 Mbit low voltage serial flash memory ICs
· InvenSense MPU-9150 9-axis gyroscope/accelerometer/compass MEMS motion tracking device
· Winbond W25Q16CV 16 Mbit SPI flash
· PrimeSense PSX1200 Capri PS1200 3D sensor SoC
· SanDisk SDIN7DP4-64G 64 GB iNAND flash memory
Can developers get involved?
Yes. Google is already working with a number of professional developers. A number of these are initially helping them to build some touch-screen apps.
The Project Tango device has a unique ability to sense 3D motion and geometry. Google wants these partners to take their technology forward and to build exceptional experiences for future users. Google explains in their original brief that Project Tango is able to capture a wealth of data which has never before been available to app developers, including: depth, object-tracking and real-time 3D mapping.
They have already given 200 prototype development kits to developers, with the hope that these will be used to create artistic and imaginative software. Some of the projects that are currently being developed with these prototype development kits include:
· Projects in indoor navigation/mapping
· Single and multiplayer games that use physical space
· New algorithms for processing sensor data
If you are a developer and want to get involved, you should keep looking out for further announcements as it is likely that they will need additional developers in the future too.
Prior to distributing the first 200 developer kits, Google ATAP created a few demonstration apps to reveal the capabilities of Project Tango’s hardware.
One of these apps displays a distance heat map on top of the camera’s general display. This works by layering blue colours on distant objects and red colours on objects that are nearby.
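That heat-map idea can be sketched in a few lines. The mapping below is an assumed linear one, and the 0.5 m and 4 m range limits are made-up example values rather than Tango's actual scale:

```python
# Minimal sketch of the depth heat map: map each depth reading to a
# colour, red for near objects and blue for distant ones. The 0.5 m
# and 4.0 m range limits are assumed values, not Tango's actual ones.

def depth_to_colour(depth_m, near=0.5, far=4.0):
    """Return an (r, g, b) tuple; near readings are red, far are blue."""
    t = (depth_m - near) / (far - near)
    t = max(0.0, min(1.0, t))          # clamp to the [0, 1] range
    return (int(255 * (1 - t)), 0, int(255 * t))

print(depth_to_colour(0.5))   # → (255, 0, 0)  right next to the camera
print(depth_to_colour(4.0))   # → (0, 0, 255)  far away
```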
A second app shows movement paths, accurate to within one percent. This works by taking data from the image sensors and pairing it with the device's standard motion sensors and gyroscopes. Once again, this is all plotted onto an interactive 3D map.
Another app captures a 3D model of the scene around the device in real time, drawing it on the Project Tango display as the device is moved around the room.
A lot of software will have been created exclusively for Project Tango. Currently, this isn’t all public knowledge. However, we do know a little about the software originally written for the SLAM application, part of Project Tango. This includes:
· A realtime Visual Inertial Odometry system to track the 3D position and orientation of the device as it moves around the world.
· A realtime full 6 DOF SLAM solution which is able to correct odometry drift.
· A mapping system with the capacity to take information from the device’s depth sensor and build a 3D reconstruction of the device’s environment.
· A re-localisation system built on top of the place recognition module that allows users to determine their position relative to a known map.
· Mechanisms for sharing maps between users, allowing users to operate off of the same map within the same environment. This in turn, opens the possibility of collaborative map building.
· Mechanisms for improving map quality through the utilization of cloud services. Maps are shipped from the device to the cloud where they can be refined.
· Infrastructure for testing algorithms, monitoring progress of the project, and preventing code regressions.
· APIs with associated code to support use of mapping information in Unity and Java applications.
Project Tango uses a number of technologies that were not created specifically for the device. Reports originally showed that some of Project Tango’s 3D capabilities are powered by the Movidius Myriad 1 3D-sensing chip. Yet, we now know that the device uses a number of other technologies as well. Many people were most surprised to find out that Project Tango is partly powered by Apple technology. The device uses the PrimeSense Capri PS1200 3D imaging system-on-a-chip, a technology that Apple acquired when it purchased PrimeSense.
What does PrimeSense do?
One of the key components behind Project Tango is actually Apple-owned technology from PrimeSense. A website called iFixit gave the device a teardown (taking the device apart to discover what it is made of). They found that the 3D imaging system is controlled by a PrimeSense chip contained within the Project Tango developer hardware. The chip processes the data from the device’s four cameras and turns it into a 3D map of the surroundings. Apple bought PrimeSense late last year, and many technologists had assumed that the technology would appear in an iOS device first. However, Project Tango has used it first, to help map 3D spaces.
How could Project Tango help shopping?
Google suggested on their Project Tango webpage that this device could have a large influence on the future of shopping. They propose that Project Tango is able to create an indoor map of a shop so that you can be guided around the aisles and directed to the shelf holding the item that you want. In addition, Google also suggests that the device could be used to map your own rooms before buying furniture, so that you could preview how well a large piece would complement the space before purchasing it.
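Once a space has been measured, the furniture idea reduces to a simple geometric check. The sketch below is a hypothetical illustration of that check in two dimensions; all the dimensions used are example values:

```python
# Hypothetical sketch of the furniture-preview check: once the device
# has measured a space, an app could test whether an item fits before
# you buy it. Dimensions are (width, depth) in metres, all examples.

def fits(space, item):
    """True if the item fits the space in either orientation (rotated 90 degrees)."""
    (sw, sd), (iw, idp) = space, item
    return (iw <= sw and idp <= sd) or (idp <= sw and iw <= sd)

alcove = (1.2, 0.6)
bookcase = (0.5, 1.1)            # fits, but only when rotated
print(fits(alcove, bookcase))    # → True
print(fits(alcove, (1.3, 0.5)))  # → False
```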
How will this device help with directions?
Project Tango is very likely to be used to help people direct themselves around a building in ways that current mapping systems cannot. Whether in large office buildings, public spaces or even government buildings and airports – these new technologies could have large implications on indoor directions.
Could this new device also help with emergency response?
It could in a number of ways. Other areas of Google, specifically the teams working on Google’s Project Glass, have spent some time looking at how innovative technologies could assist emergency response teams. As an example, it has been posited that Project Tango could help firefighters find their way around burning buildings. Similarly, Project Tango could be equally useful to other emergency services entering unfamiliar locations.
Are NASA using Project Tango on the International Space Station?
Mashable certainly believe so, as they explained in their article “Google's Project Tango Will Power NASA Robots in Space”. According to them, the Google ATAP team plan on getting a prototype inside the International Space Station. It will be used to assist NASA astronauts with a variety of different tasks.
The device will be attached to a robot (on a platform called NASA SPHERES) and will be able to navigate the International Space Station in zero gravity. Project Tango and NASA have already tested this concept on a zero-gravity flight from Texas, and Google say that the technology is likely to launch into orbit starting this summer.
El-Ouzzane from Movidius, one of the companies that have been developing the processing chips used in Project Tango, explained that the technology used is not particularly different from what NASA's Exploration Rovers used to map the surface of Mars.
Could Project Tango assist the visually-impaired?
Absolutely. In Google’s video that summarises Project Tango, they explain that the device could have some useful applications for the visually-impaired community. This could involve developments which enable the visually-impaired to use their phone to map out a room and get alerts about any obstacles in it. This would assist them when navigating unfamiliar places.
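The obstacle-alert idea can be sketched as follows. The one-metre threshold and the depth readings below are assumed example values, not anything specified by Google:

```python
# Sketch of an obstacle alert built on depth data: scan the readings
# in front of the user and warn when anything comes within a chosen
# threshold. The threshold and readings are assumed example values.

def obstacle_alert(depth_readings_m, threshold_m=1.0):
    """Return a warning string if any reading is closer than the threshold."""
    nearest = min(depth_readings_m)
    if nearest < threshold_m:
        return f"Obstacle {nearest:.1f} m ahead"
    return None

print(obstacle_alert([3.2, 2.8, 0.7, 4.1]))  # → Obstacle 0.7 m ahead
print(obstacle_alert([3.2, 2.8, 2.4]))       # → None
```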
Will Project Tango be used for Augmented reality gaming?
Inevitably, a device like Project Tango lends itself very well to augmented reality. For those who do not know, augmented reality is where computer-generated content is brought into a physical, real-world environment. This normally involves elements such as altered sound, video, graphics or GPS. Project Tango supports augmented reality in a variety of ways, so it is very likely that some of these will be used for gaming.
Google has also indicated that this is one of the major opportunities. They explained on their main website that this device could have major implications on gaming, especially where developers combine room-mapping with augmented reality. They explain on their Project Tango webpage: “Imagine competing against a friend for control over territories in your own home with your own miniature army”.
What is next for Project Tango?
These are very early days. Currently, this technology is only just starting to move away from the research labs, so we will not see it reaching the hands of millions of people just yet. Google hopes for big things and is excited to see this project evolve. Now that 200 dev kits have been distributed, the next step is for these professional developers to build some touch-screen apps. The areas that will be tested first are indoor navigation/mapping, augmented reality games and new algorithms to process sensor data.