Augmented Reality

New ARKit demos: Tesla Model 3, food ordering & inter-dimensional portal

With the new ARKit framework for building augmented reality (AR) apps, Apple is turning existing iOS devices into the largest AR platform in the world.

We previously shared a bunch of interesting ARKit demos showcasing the possibilities of the framework, including an upcoming furniture-ordering app from Ikea, a measuring tape, an ARKit-powered VR mode in Maps and more.

Today, we're highlighting three more examples of ARKit-driven apps: a Tesla Model 3 app by an impatient fan, an inter-dimensional portal in the middle of the street and an app that promises to change how we order food.

First up is a demo by an impatient Tesla fan who couldn't wait for the Model 3 he ordered, so he built an ARKit-powered app that lets him drive a virtual Model 3 around the real world, activate its headlights and so forth.

@elonmusk Couldn't wait 4 my #Model3, so made this AR app, what do you think? #ARkit pic.twitter.com/lIRLTZox7N

— Jelmer Verhoog (@JelmerVerhoog) July 1, 2017

Food ordering will never be the same, says developer Alper Guler, who created an ARKit-driven app that renders various foods on your table, letting you pan around them, zoom in and out, rotate them and more.

https://www.youtube.com/watch?v=oFdgVNg4ryM

And last but not least, French consulting agency Nedd came up with a great example of AR combined with VR, via iPhoneAddict.fr.

https://www.youtube.com/watch?v=rIPfpGCxONQ

Before signing off, here's a quick volumetric capture example with ARKit.

hint pic.twitter.com/WOuqVer1Ph

— Made With ARKit (@madewithARKit) June 27, 2017

Very impressive so far, don't you think?

New demos show how easy it is to bring 3D models to life with ARKit

One of the best aspects of ARKit, Apple's new framework for building augmented reality apps, is the fact that it does all the incredibly complex heavy lifting like detecting room dimensions, horizontal planes and light sources, freeing up developers to focus on other things.

ARKit analyzes the scene presented by the camera view in real time.

Combined with sensor data, it is able to detect horizontal planes, such as tables and floors, as well as track and place objects on smaller feature points with great precision.

And because it uses the camera sensor, ARKit can accurately estimate the total amount of light available in a scene to apply the correct amount of lighting to virtual objects.
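That light estimate is handed to apps each frame (ARKit exposes it as `ARFrame.lightEstimate.ambientIntensity`, reported in lumens, with roughly 1000 corresponding to a neutrally lit scene). A minimal sketch of how an app might turn that estimate into a brightness multiplier for a virtual object; the normalization and clamping range here are illustrative choices, not part of the API:

```python
def lighting_multiplier(ambient_intensity_lumens: float) -> float:
    """Scale factor for a virtual object's lighting.

    ARKit reports an ambient light estimate per frame in lumens, where
    ~1000 corresponds to a neutrally lit scene. The clamping range below
    is an illustrative choice, not part of ARKit's API.
    """
    NEUTRAL = 1000.0
    normalized = ambient_intensity_lumens / NEUTRAL  # 1.0 in average lighting
    return min(max(normalized, 0.1), 2.0)            # clamp to a sane range
```

Applying a multiplier like this per frame is what makes a virtual cube dim when the room lights dim, as shown in several of the demos below.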

First, check out this demo from Tomás Garcia.

https://www.youtube.com/watch?v=1G2YbQuQHps

Children's bedtime stories will never be the same come this fall!

Another developer has put together a quick demo showing off his AI bot, named “Pepper”.

https://www.youtube.com/watch?v=brFKo_tSkw8

According to the video's description:

I've been working on an AI bot for a while now. To be short, it's like V.I.K.I in the movie “I, Robot”. With the help of ARKit, I was able to bring it close to a real life assistant.

Due to obvious reasons, I'm not demonstrating her functionality in this video. So I ended up showing you guys how easy and simple it is to bring 3D models to life with Apple's new framework.

These videos clearly demonstrate how easy ARKit makes it for developers to match the shadows of their virtual objects to lighting conditions in the real world.

https://www.youtube.com/watch?v=lQt96saECfM

Here are some additional ARKit-enabled demos.

https://www.youtube.com/watch?v=-o7qr1NpeNI

 

https://www.youtube.com/watch?v=sJR_f7XCuvA

 


ARKit requires A9 or A10 processors, meaning ARKit apps will require an iPhone 6s or newer or one of the latest iPad Pro models, either the 9.7-inch or the 10.5-inch one.

If anything, these videos demonstrate just how easy Apple has made it to put together an AR app. Are you looking forward to ARKit, and why? Leave a comment below to let us know.

Apple acquires German eye-tracking firm SensoMotoric Instruments

Apple may have quietly acquired SensoMotoric Instruments, a German company that develops eye-tracking technology. MacRumors was first to report yesterday that SensoMotoric has been acquired for an undisclosed sum by Apple's shell company, called Vineyard Capital.

The company holds multiple patents relating to eye tracking and virtual reality.

“Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans,” an Apple spokesperson said in a boilerplate statement issued to Axios.

Gene Levoff, Apple's Vice President of Corporate Law representing Delaware's Vineyard Capital Corporation, granted power of attorney to a German law firm to represent the shell company, which in turn acquired SensoMotoric Instruments on June 16. Levoff even notarized the document in Cupertino, California, where Apple is headquartered.

Tellingly, SensoMotoric recently removed over a dozen pages from its official website. It no longer has a jobs portal, news blog, schedule of events and workshops, contact information, list of distributors and resellers or mailing list signup form.

SensoMotoric's managing director, Eberhard Schmidt, was replaced by Dr. Ali Sahin, one of the German attorneys representing Vineyard Capital Corporation. Christian Villwock, the company's Director of OEM Solutions Business, was removed from the website, too.

Here's an example of SensoMotoric's eye-tracking technology in Samsung Gear.

https://www.youtube.com/watch?v=mDvgP2tnMHQ

Another video embedded further below shows off SensoMotoric's eye-tracking glasses with Natural Gaze Head Gear used by young athletes playing tennis to accurately capture their natural gaze, which helps them evaluate and improve their visual performance.

https://www.youtube.com/watch?v=VEZP_corY3Q

The proprietary eyeglass hardware the athletes in the video are wearing captures a person's natural gaze behavior at a rate of 120 scans per second.

On the hardware side, Apple could use SensoMotoric technology in its rumored augmented reality glasses product. Eye-tracking technology can significantly reduce motion sickness for users of virtual reality headsets such as Facebook's Oculus Rift.

One specific aspect of SensoMotoric's technology, called foveated rendering, allows a virtual reality headset to save power by rendering in high resolution only what you're actually looking at, with anything in your peripheral vision rendered at lower resolution.

This reduces the amount of processing power needed to render a virtual world.
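To get a feel for the savings, here is a back-of-the-envelope model of a simple two-zone foveated scheme: a small fovea region rendered at full resolution, with everything else rendered at a reduced scale in both dimensions. The numbers and the two-zone split are illustrative assumptions; SMI's actual pipeline is more sophisticated.

```python
def foveated_pixel_cost(width: int, height: int,
                        fovea_fraction: float, periphery_scale: float) -> float:
    """Approximate pixels shaded per frame with a two-zone foveated scheme.

    The fovea region is rendered at full resolution; the periphery is
    rendered at a reduced scale in both dimensions, so its pixel cost
    shrinks by the square of the scale factor.
    """
    total = width * height
    fovea = total * fovea_fraction
    periphery = total * (1.0 - fovea_fraction)
    return fovea + periphery * periphery_scale ** 2
```

For a hypothetical 2160x1200 headset display with 10 percent of the frame in the fovea and the periphery at half resolution, this works out to roughly 842,000 shaded pixels instead of about 2.6 million, around a two-thirds reduction.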

https://www.youtube.com/watch?v=-w7r0IGRlTY

On the software side, Apple could improve iPhone 8's rumored 3D facial recognition security feature through eye tracking and even allow apps and games to track the user's eye movement so that they could, for instance, aim in a game with their gaze.

Founded in 1991, SensoMotoric Instruments is headquartered in Teltow, Germany, with a satellite office in Boston, Massachusetts. The company employs about 60 engineers.

“I do think that a significant portion of the population of developed countries, and eventually all countries, will have AR experiences every day, almost like eating three meals a day,” Apple CEO Tim Cook said last year.

More ARKit demos: Falcon 9 rocket landing, Van Gogh bedroom tour & more

Wouldn't it be great if you could take a tour of Van Gogh's virtual bedroom in augmented reality? How about witnessing a Falcon 9 rocket descending from the skies?

ARKit, Apple's new framework for building augmented reality apps for iPhone and iPad, has captured the imagination of many iOS developers out there who have already created some truly awesome examples of what's possible with ARKit.

For starters, here's an example of ARKit's accurate tracking.

https://www.youtube.com/watch?v=dMEWp45WAUg

Developer Mark Dawson used ARKit to create a virtual copy of Van Gogh's bedroom, which you can walk around while examining detailed furniture, paintings on the wall and more.

ARKit has “amazing tracking,” Dawson said.

https://www.youtube.com/watch?v=lvjVgt_ce5Q

ARKit combines the live camera feed with sensor data to find tables, floors and other horizontal planes in your real world. Speaking of which, this example shows ARKit's plane detection.

https://www.youtube.com/watch?v=9ZnG9wrVxtM

And this is adding geometry and physics with ARKit.

https://www.youtube.com/watch?v=Vsk9erdCvdk

In this demo, ARKit is tracking a virtual cube and providing a light estimate for the scene, which makes it easy to change the light intensity of the virtual object to match the real world.

https://www.youtube.com/watch?v=7Kk6iVr5ULo

Notice how when the lights are dimmed down the virtual cube also automatically dims, then when the lights are raised the virtual cube also gets brighter. Pretty neat, wouldn't you say so?

And here you can see ARKit detecting horizontal planes in the real world and rendering content using SceneKit with physically based rendering.

https://www.youtube.com/watch?v=rNFQl7I4T6Y

Developer Tomás García shared this cool demo depicting a Falcon 9 landing on an autonomous spaceport drone ship (ASDS) in a swimming pool, which he accomplished using ARKit and Unity.

https://www.youtube.com/watch?v=NodGjd3C0SQ

And lastly, German company Econsor Mobile GmbH has been working on an ARKit-powered app for commissioning of construction projects directly on the construction site.

https://www.youtube.com/watch?v=PJWTXefVDK8

Apple CEO Tim Cook thinks augmented reality is a big idea like the smartphone.

“The smartphone is for everyone, we don’t have to think iPhone is about a certain demographic, or country or vertical market: it’s for everyone. I think augmented reality is that big, it’s huge. I get excited because of the things that could be done that could improve a lot of lives. And be entertaining,” he said in the past.

How do you like these latest ARKit demos, which one is your favorite, and why? Does ARKit show a lot of promise, do you think? Share your thoughts by posting in the comment section.

Watch new ARKit demos: Minecraft and measuring tape

The website madewitharkit.com, dedicated to highlighting cool apps made with Apple's new ARKit framework, was updated today with a pair of new video demonstrations showing off some of the augmented reality possibilities coming to iPhone and iPad with iOS 11 this fall.

The first demo has the user selecting two spots in the real world, as viewed through an iPhone's lens, to calculate the distance between them, transforming the device into a working tape measure. That's a great example of the power of the ARKit framework.

https://www.youtube.com/watch?v=z7DYC_zbZCM

The app was built by Laan Labs and, like other ARKit-enabled apps, uses an iOS device's camera along with sensor data to precisely find horizontal planes in the real world, such as tables, floors and other objects.

You can beta-test the app by signing up at armeasure.com.
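Once ARKit supplies world coordinates for the two tapped spots, the measurement itself is simple geometry. A minimal sketch, with plain (x, y, z) tuples standing in for the hit-test results a real ARKit app would get back (that substitution is an assumption for illustration):

```python
import math

def distance_between(a, b):
    """Straight-line distance, in meters, between two points in ARKit's
    world coordinate space. In a real app these points would come from
    hit tests against a detected horizontal plane; here they are plain
    (x, y, z) tuples."""
    return math.sqrt(sum((pa - pb) ** 2 for pa, pb in zip(a, b)))
```

Because ARKit's world units are meters, no extra calibration step is needed to turn the result into a real-world measurement.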

Measure distances with your iPhone. Just because you can. Clever little #ARKit app by @BalestraPatrick https://t.co/b2mXe2FS84 pic.twitter.com/pyoHp99Yts

— Made With ARKit (@madewithARKit) June 25, 2017

Laan Labs has other examples of proof-of-concept apps built using ARKit on their Twitter, like the following example of impressive 3D drawing in augmented reality.

https://twitter.com/laanlabs/status/878692051889655808

As for an AR-enabled Minecraft, we don't know if Minecraft creator Mojang is working on one, but that didn't stop developer Matthew Hallberg from recreating Minecraft in AR using the ARKit framework and the Unity engine.

By superimposing Minecraft building blocks on top of the real world, and taking advantage of ARKit's highly accurate tracking, the user is able to walk around their environment and place Minecraft blocks at arbitrary spots. “I love that you are able to place life size objects because the tracking with ARkit is so good,” Matthew said.

https://www.youtube.com/watch?v=qFGx9QcE5Gk

Apple is also using ARKit tracking for an impressive virtual reality mode in Apple Maps on iOS 11. The Cupertino giant is even helping Ikea build an ARKit-powered app which will let you try out virtual furniture at home before purchasing it.

ARKit requires a device with an Apple A9 or A10 chip because those processors deliver “breakthrough performance that enables fast scene understanding and lets you build detailed and compelling virtual content on top of real-world scenes,” as per Apple.

How do you like the aforementioned ARKit demos? Are you looking forward to augmented reality-enabled apps, and why? Chime in with your thoughts in the comments section.

iOS 11 Maps has crazy cool VR mode that lets you move around by walking

Apple Maps on iOS 11 beta 2 features a great new virtual reality (VR) mode that takes advantage of Apple's new ARKit framework to let you move around in 3D by walking.

This unapologetically cool feature seems to be tied to Flyover, which replaces satellite imagery with three-dimensional buildings, landmarks and other points of interest.

The new VR mode on iOS 11 Maps was highlighted yesterday by Twitter user @StijnDV, but it appears to have been originally discovered by Tweetbot developer Paul Haddad on Wednesday.

To try it out yourself, open Maps on iOS 11 beta 2, switch to 3D mode by tapping “3D”, then use the search field at the bottom to find a place that has Flyover.

On the place card, tap the Flyover button and move the device around to rotate the view. Better still, why don't you actually move forward, backward or side to side to explore the map in VR?

Mind blown.

So, how do we know this nifty feature actually uses ARKit? Because it displays a message when you cover the camera, just like any ARKit-powered app does, saying you should aim the device at a different surface because “more contrast is required”.

As a quick backgrounder, ARKit analyzes live camera feed in real-time, using computer vision to find horizontal planes in your real world, such as tables and floors. I was able to successfully test the feature on my iPhone 6s running a second beta of iOS 11. Because I don't currently own an iPad, I couldn't test VR mode in Maps on the Apple tablet.

WOW There is an VR mode in Apple maps on iOS 11! It seems to use ARKit for positioning! pic.twitter.com/IdXiGoed26

— Stijn (@StijnDV) June 24, 2017

At any rate, this appears to be the default mode for Flyover now, not a special setting. But don't you worry, there's the option to switch back to the old Flyover mode where you rotate and zoom your Flyover view using touch interactions.

This is honestly one of the coolest features in iOS 11! pic.twitter.com/Zjr6RRkKHk

— Stijn (@StijnDV) June 24, 2017

This is a wicked cool feature and I cannot help but wonder what it might look like when experienced through Apple's rumored digital glasses which, per Robert Scoble, should use optics by German lens specialist and optical instruments maker Carl Zeiss.

You can actually move around by walking! This is crazy cool! pic.twitter.com/ttR6RaAo7D

— Stijn (@StijnDV) June 24, 2017

Some people couldn't get Maps' new VR mode to work, but I suspect it may have something to do with their hardware. Maps' VR mode uses ARKit, which tracks your actual position in the real world with the camera but requires newer hardware.

Holy Flyover Magic Window batman. pic.twitter.com/Fb8nPeLT5J

— Paul Haddad (@tapbot_paul) June 22, 2017

According to Apple, ARKit runs on the Apple A9 and A10 processors. “These processors deliver breakthrough performance that enables fast scene understanding and lets you build detailed and compelling virtual content on top of real-world scenes,” says the company.

In other words, anything older than iPhone 6s, iPhone 6s Plus, iPhone 7, iPhone 7 Plus, the 9.7-inch iPad (early-2017 model) or iPad Pro won't be able to run iOS 11 Maps' VR mode.

So, is this cool or what?

We'd obviously love to hear your thoughts and predictions regarding iOS 11 Maps' new VR mode and what it might signify in terms of possible new VR hardware from Apple.

Do us a favor and chime in with your thoughts in the comments section.

Videos: cool stuff made with ARKit

Eager to learn why Apple's new ARKit framework is such a big deal? Look no further than a new website which offers a hand-picked curation of some of the coolest stuff developers have made thus far with ARKit, via The Loop.

For the uninitiated, augmented reality experiences superimpose computer-generated imagery on top of live video feed of the real world. According to Apple, ARKit uses a technique known as Visual Inertial Odometry to accurately track the world around an iPhone or iPad by fusing camera sensor data with motion data.

“These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration,” the company says.
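The idea behind fusing those two inputs is that each source covers the other's weakness: camera tracking is accurate but noisy and intermittent, while motion-sensor integration is smooth but drifts over time. Here is a drastically simplified one-dimensional illustration of that blending; the weighted average below is a stand-in for illustration only, not Apple's actual visual-inertial odometry algorithm:

```python
def fuse(camera_position: float, imu_position: float,
         camera_weight: float = 0.8) -> float:
    """Blend a drift-free but noisy camera-based position estimate with a
    smooth but drifting IMU-based estimate. A crude stand-in for
    visual-inertial odometry; real VIO is far more sophisticated
    (e.g. Kalman-filter based) and operates on full 6-DOF poses."""
    return camera_weight * camera_position + (1.0 - camera_weight) * imu_position
```

When the camera briefly loses useful features (a blank wall, motion blur), a system like this can lean on the inertial estimate, which is why ARKit keeps tracking through quick movements.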

The following videos offer a look at the capabilities of the ARKit framework.

https://www.youtube.com/watch?v=nMd0dIAEJuc

 

https://www.youtube.com/watch?v=1hvfpxaxGwc

 

https://www.youtube.com/watch?v=6OV2mBbNtVk

 

https://www.youtube.com/watch?v=2xrVFDRJ8HQ

 

https://www.youtube.com/watch?v=R4OeFjZCi9o

 

https://www.youtube.com/watch?v=OHJRExynkuI

 

https://www.youtube.com/watch?v=VdaWHv6hmJk

 

https://www.youtube.com/watch?v=njQSiO2uj0s

 

https://www.youtube.com/watch?v=Rq2NChZ3c4E

 

https://www.youtube.com/watch?v=4HY868Jskrc

The ARKit framework uses computer vision to determine the layout of your surroundings and automatically find horizontal planes like tables and floors. It can track and place objects on smaller feature points and apply the right type of light to a virtual object in order to match the current lighting conditions in your room.

Ikea is working on an AR app in partnership with Apple that will let users try out furniture in augmented reality before buying it. Apple's WWDC 2017 keynote demos included an upcoming ARKit-driven game, called Wingnut AR, by director Peter Jackson's AR company.

If you like these demos, be sure to follow @madewithARKit on Twitter.

Siri design patent updated to include smart glasses, making hotel reservations & more

Apple's Siri design patent in the European Union and Hong Kong has been updated ahead of today's live-streamed WWDC keynote to also cover “smart glasses” as a category. As noted by PatentlyApple, the patent now lists “smart glasses” under the “remote control for” category.

KGI Securities analyst Ming-Chi Kuo gave the rumored standalone Siri speaker a 70 percent chance of being formally unveiled during the keynote presentation, but he had nothing to say about any kind of Apple-branded smart glasses.

A sketchy report that appeared on Reddit this weekend, attributed to an alleged Foxconn insider, leaked more info about the purported Google Glass-like augmented reality accessory.

Apparently code-named “Project Mirrorshades,” the glasses are said to include polarized or prescription lenses with smart optics from German maker Carl Zeiss, noise-cancelling microphones, a light sensor, bone-conduction modules, an accelerometer for tracking steps and head movement, a magnetometer for navigation, a capacitive panel on one side, a ceramic battery, an Apple-designed chipset and more.

The bone-conduction modules would presumably allow for Siri commands and hands-free phone calls, among other features. The poster mentions there's a 65 percent chance the project gets cancelled, so take the report with a pinch of salt.

Another new entry covers making hotel reservations. Siri can currently make a reservation at a restaurant, but not a hotel. The Cupertino technology giant is expected to announce plans today to make Siri work with a larger variety of apps, Reuters reported yesterday.

The updated patent is no guarantee that Siri could be used to control the rumored smart glasses or that Apple's voice assistant could play a major role in its augmented reality plans.

All shall be revealed in less than an hour and a half.

Any predictions?

Gatwick Airport rolls out iBeacons for augmented reality indoor navigation

Gatwick Airport, the UK’s second busiest airport after London Heathrow, has installed 2,000 battery-powered iBeacons for indoor navigation and passenger tracking.

Available across Gatwick Airport's two terminals, the system is accurate to within three meters, far more reliable than GPS indoors, and enables an augmented-reality wayfinding tool so passengers can be shown directions in the camera view of their mobile device.
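Beacon-based positioning like this typically starts from signal strength: the phone converts each beacon's received signal strength (RSSI) into a rough distance, then combines distances from several beacons to fix a position. A common textbook conversion is the log-distance path-loss model sketched below; the default values are illustrative assumptions, not Gatwick's calibration:

```python
def estimated_distance_m(rssi_dbm: float,
                         measured_power_dbm: float = -59.0,
                         path_loss_exponent: float = 2.0) -> float:
    """Rough beacon distance from received signal strength, using the
    standard log-distance path-loss model.

    measured_power_dbm is the expected RSSI at 1 meter (a per-beacon
    calibration value); the exponent is ~2 in free space and higher in
    cluttered indoor spaces. Defaults here are illustrative only.
    """
    return 10.0 ** ((measured_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

In practice iOS hides this math behind Core Location's beacon-ranging API, which reports a proximity class and accuracy estimate per beacon, but the underlying trade-off is the same: signal strength degrades predictably with distance, so dense beacon coverage yields meter-level accuracy where GPS gets none.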

It could be used to inform passengers they’re running late and help them avoid missing flights.

Conceivably, Gatwick Airport could also take advantage of iBeacons for efficient queue management and to reduce congestion by being able to determine whether to offload luggage if a late passenger is far away.

According to the airport, they won't be collecting any personal data from the beacons with the exception of “generic information on ‘people densities’ in different beacon zones.”

Deployed in just three weeks, iBeacons form part of Gatwick’s £2.5 billion (about $3.1 billion) transformation initiative. The technology is currently being integrated into some of the Gatwick apps.

The airport is in discussion with other airlines to enable the indoor positioning and wayfinding tools to also feature on their apps and services.

According to Gatwick:

Airlines could go further—and with the consent of their passengers—may send reminders on their airline app to late running passengers, for example, or find out where they are and make an informed decision on whether to wait or offload their luggage so the aircraft can take off on time.

The lack of satellite signals makes navigation systems like Apple Maps unreliable indoors. That's why Apple developed iBeacon, a protocol for tiny, inexpensive battery-powered Bluetooth transmitters.

The best Augmented Reality apps for iPhone

More often than not, the term Augmented Reality still has that elusive, techy ring to it, particularly when brought up in conjunction with Apple’s purported eyewear project. Curiously, many of us have it down as tomorrow’s technology rather than today’s, when the truth is that AR apps have populated the App Store for years.

While some of these apps are admittedly little more than shoddy tech demos, separating the wheat from the chaff turns up some really cool apps conceived to boost your business or creativity, or simply keep you entertained in novel ways. With the preamble out of the way, here are the best Augmented Reality apps for iPhone available today.

Barclays: iPhone 8 to sport 3D sensors on both front and back

KGI Securities analyst Ming-Chi Kuo, along with a few other sources, believes Apple will bet heavily on augmented reality with iPhone 8. He predicted the device's front-facing camera will use a bespoke 3D sensor to let users take 3D selfies, map their surroundings, scan nearly any real-world object in three dimensions and more.

Analysts Andrew Gardiner, Hiral Patel, Joseph Wolf, Blayne Curtis and Mark Moskowitz reported in a Friday research note, obtained by 9to5Mac, that iPhone 8 may use a second 3D sensor on the back for augmented reality features.

iPhone 8’s augmented reality 3D sensor could be built by Himax Technologies

KGI Securities analyst Ming-Chi Kuo predicted that iPhone 8's front-facing camera would take advantage of a new sensor for “revolutionary” features such as augmented reality applications, advanced facial recognition, 3D selfies, 3D object scanning/modeling and more. On Tuesday, Barron's noted that Apple may have contracted a company called Himax Technologies to build the rumored sensor for Apple's OLED-based iPhone 8.