Layar

Blog: Layar Vision

AR Tracking Improvements in 6.2

Dirk Groten March 30, 2012

If you’ve downloaded Layar v6.2 for Android or iOS and have tried out some of the vision layers, you’ll probably have noticed that tracking was improved tremendously in this version. I’d like to shed a bit of light on what we’ve done technically to get this far.

So what is tracking in the first place? It’s the ability of the device to recognize and follow the object that is being viewed through the camera and determine its position in space. That way, if it’s augmented in a vision layer, the device knows where to draw the augmented content.
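To make "position in space" concrete, here is a minimal sketch of what a tracker's output looks like and how it is used to draw content. The names (`Pose`, `toCameraSpace`, `project`) and the pinhole camera model are illustrative assumptions, not Layar's actual internals:

```cpp
#include <array>
#include <cstddef>

// A camera pose: rotation matrix R (row-major 3x3) and translation t.
// This is what a tracker estimates for the object in view.
struct Pose {
    std::array<double, 9> R;
    std::array<double, 3> t;
};

// Transform a point from object space into camera space: p' = R * p + t.
std::array<double, 3> toCameraSpace(const Pose& pose,
                                    const std::array<double, 3>& p) {
    std::array<double, 3> out{};
    for (std::size_t row = 0; row < 3; ++row) {
        out[row] = pose.t[row];
        for (std::size_t col = 0; col < 3; ++col)
            out[row] += pose.R[row * 3 + col] * p[col];
    }
    return out;
}

// Project into the image with a simple pinhole model
// (focal lengths fx, fy; principal point cx, cy).
std::array<double, 2> project(const std::array<double, 3>& pc,
                              double fx, double fy, double cx, double cy) {
    return {fx * pc[0] / pc[2] + cx, fy * pc[1] / pc[2] + cy};
}
```

Given a pose estimate for each frame, the renderer knows exactly where on screen the augmented content belongs.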

Making augments stick

Until v6.1, we showed the live camera feed in the AR view with the augmented content drawn on top of it. To do that, we'd take a camera frame from the live feed, run a lot of calculations to determine the position of the tracked object, and then draw the augmented content. That takes some processing time, so the drawn content always lagged slightly behind the camera images you saw.

We've changed that in v6.2: we now show the camera frame that belongs to the calculation for the augmented content, so the content sticks much better to the object. You can see this by moving the object smoothly: there is a slight delay between your movement and the image you see, but the content really sticks to the object you move. Don't move too fast, though: that will blur the image and the app will lose track of the object.
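One way to picture this change: instead of always displaying the newest frame, frames are held until the pose computed for that specific frame is ready, and then frame and augmentation are shown together. This is a hypothetical sketch (the class and field names are illustrative, not Layar's code):

```cpp
#include <deque>
#include <optional>
#include <utility>
#include <vector>

struct Frame { long id; std::vector<unsigned char> pixels; };
struct PoseResult { long frameId; /* pose data elided */ };

// Buffers camera frames until the tracker finishes the pose for one of them,
// so the displayed frame always matches the drawn augmentation.
class FramePoseSynchronizer {
public:
    void onCameraFrame(Frame f) { pending_.push_back(std::move(f)); }

    // Called when the tracker finishes a frame; returns the matching frame
    // to display, discarding older frames the tracker skipped.
    std::optional<Frame> onPoseReady(const PoseResult& r) {
        while (!pending_.empty()) {
            Frame f = std::move(pending_.front());
            pending_.pop_front();
            if (f.id == r.frameId) return f;  // show this frame + content
        }
        return std::nullopt;  // pose refers to a frame we no longer hold
    }

private:
    std::deque<Frame> pending_;
};
```

The cost is the slight display delay the post mentions; the benefit is that content and camera image are always consistent.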

Zooming into the augments

There's another big improvement in 6.2: when a layer developer uploads a reference image (the image of the object she wants to augment), we create a fingerprint of that image that is used to recognize and track the features of that object. You obviously need a certain number of features inside the camera view to calculate a good estimate of the object's position. If the user holds the camera close to the object, zooming into it, there are fewer features that can be tracked, and at some point we'd lose track of the object.

We’re now adding features dynamically as you zoom in, allowing the app to track an object much longer. This is really nice when the augments on a page are quite small and you need to get closer to read them. 
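The idea of topping up features as they fall out of view can be sketched as follows. This is a simplified illustration under assumed names and thresholds; the real detector (something like a corner detector running on the live frame) is stubbed out:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Feature { float x, y; };

// Stand-in for a real feature detector run on the current camera frame;
// returns up to n new candidate features.
std::vector<Feature> detectFeatures(const std::vector<Feature>& candidates,
                                    std::size_t n) {
    std::vector<Feature> out(candidates.begin(),
                             candidates.begin() +
                                 std::min(n, candidates.size()));
    return out;
}

// If zooming in has pushed too many features out of view, detect fresh
// features in what is still visible so tracking can continue.
void topUpFeatures(std::vector<Feature>& tracked,
                   const std::vector<Feature>& frameCandidates,
                   std::size_t minCount) {
    if (tracked.size() >= minCount) return;  // still enough features
    std::size_t missing = minCount - tracked.size();
    auto fresh = detectFeatures(frameCandidates, missing);
    tracked.insert(tracked.end(), fresh.begin(), fresh.end());
}
```

Running this kind of top-up every few frames is what lets tracking survive a close zoom, at the cost of some extra per-frame processing.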

A cool example is the Eppo comic book, where you can view the original draft drawings on top of many of the pages. Well, you can now also zoom in really close, as shown in the screenshots below. Don’t zoom in too fast though, the app still needs some time to process the images.

[Screenshots: zooming in close on augmented Eppo comic pages]

Taming the ARM processors

A lot of the code we write for Layar is platform-independent. All of the vision algorithms are written in C++ and integrated into iOS using Objective-C++ and into Android using JNI. This way we write the code once and use it on both platforms, which is what has allowed us, since Layar v6.0, to ship simultaneous updates for iOS and Android.
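A common pattern for this kind of setup is a shared C++ core behind a plain C ABI, which Objective-C++ can call directly and which a thin JNI layer can forward to on Android. The function names below are purely illustrative, not Layar's actual API:

```cpp
#include <cstdint>

namespace vision {

// Platform-independent core: the algorithms live here, written once.
// (A trivial stand-in operation, for illustration only.)
inline std::int32_t countDarkPixels(const std::uint8_t* gray,
                                    std::int32_t n,
                                    std::uint8_t threshold) {
    std::int32_t count = 0;
    for (std::int32_t i = 0; i < n; ++i)
        if (gray[i] < threshold) ++count;
    return count;
}

}  // namespace vision

// C ABI boundary: an Objective-C++ file on iOS calls this directly;
// on Android, a small JNI wrapper forwards the Java call here.
extern "C" std::int32_t layar_count_dark_pixels(const std::uint8_t* gray,
                                                std::int32_t n,
                                                std::uint8_t threshold) {
    return vision::countDarkPixels(gray, n, threshold);
}
```

Keeping the platform-specific glue this thin is what makes simultaneous releases on both platforms practical.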

For Layar v6.2 we decided to move one level down and write the processor-intensive code directly in assembly using ARM NEON. The nice thing is that until recently, all iOS and Android devices ran on ARM processors. Not all of them support NEON, however (NVIDIA's Tegra chips, for example), so not all of them benefit from our new optimized code.

In our reference implementation, we've been able to cut the processing time of certain critical operations by roughly a factor of 8: preparing a frame for display, for example, takes 2 ms instead of 15 ms with the ARM NEON implementation. We obviously don't achieve the same speed-up in Layar itself, where a lot more is going on, but writing NEON-optimized code has proven a huge improvement.
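The kind of per-pixel work that benefits most from NEON looks like the portable scalar sketch below. The real optimization replaces the inner loop with NEON instructions that process 16 bytes at a time (for example, a `vld1q_u8` load followed by a `vqaddq_u8` saturating add); this version only illustrates the operation itself, with an assumed function name:

```cpp
#include <cstddef>
#include <cstdint>

// Brighten a grayscale frame by delta, clamping at 255. Every pixel is
// independent, which is exactly what makes the loop a good NEON target:
// vqaddq_u8 performs this saturating add for 16 pixels per instruction.
void brightenFrame(const std::uint8_t* src, std::uint8_t* dst,
                   std::size_t n, std::uint8_t delta) {
    for (std::size_t i = 0; i < n; ++i) {
        unsigned v = static_cast<unsigned>(src[i]) + delta;
        dst[i] = v > 255 ? 255 : static_cast<std::uint8_t>(v);
    }
}
```

Replacing a byte-at-a-time loop like this with 16-lane SIMD is where speed-ups on the order of the quoted factor of 8 come from.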

Thanks to this optimization we were able to add support for streaming video as AR objects without a significant loss in performance. We've decided to support this only on dual-core ARM processors, where different tasks, like rendering video and analyzing the camera image for pose estimation, can be distributed across cores.
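Splitting the work across two cores can be sketched with standard threads: one thread keeps the video/render path going while another analyzes the latest camera frame. The structure and the stand-in workloads here are illustrative assumptions:

```cpp
#include <cstdint>
#include <thread>
#include <vector>

struct Results {
    std::uint64_t rendered = 0;  // stand-in for render-path output
    std::uint64_t tracked = 0;   // stand-in for pose-estimation output
};

// Run one cycle with rendering and tracking on separate threads, so that
// on a dual-core chip the two tasks land on different cores.
Results runOneCycle(const std::vector<std::uint8_t>& frame) {
    Results r;
    std::thread renderThread([&] {
        // Stand-in for decoding and drawing a video frame.
        r.rendered = frame.size();
    });
    std::thread trackThread([&] {
        // Stand-in for analyzing the camera image for pose estimation.
        std::uint64_t sum = 0;
        for (auto px : frame) sum += px;
        r.tracked = sum;
    });
    renderThread.join();
    trackThread.join();
    return r;
}
```

On a single-core device the two threads would just time-slice, which is why restricting the feature to dual-core processors makes sense.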

Now Intel has joined the club with its x86-based Atom processors. That's a game changer, and obviously our NEON-optimized code won't run there. But we've been pleasantly surprised by the performance of the x86 chip: even without optimization, it runs our C++ code faster than comparable ARM chips.

We can conclude that with Intel's arrival in the smartphone market, a new era of speed improvements has begun, which is great news for processor-hungry apps like Layar.

Dirk Groten, CTO

All credits to Michael Hofmann, Lawrence Lee, Ronald van der Lingen, Anatoliy Samara and Andrey Koval for these achievements.

Permalink: www.layar.com/news/blog/374


Layar and Speakers Academy Partner to Engage Magazine Readers

Chris Cameron March 5, 2012

Speakers Academy, a leading management agency that represents professional speakers across the globe, has embedded Layar into its biannual magazine and speaker catalogue.

In the new Spring 2012 issue, interviews and profiles of featured speakers are accompanied by “View with Layar” logos on the pages of the magazine. Scanning these pages displays bonus videos, giving readers a look at the speakers in action.

“It’s very hard to explain in the text what they can expect,” said Speakers Academy commercial director Rene Warmerdam of the difficulties of showcasing speakers in the print medium. “Now, when using Layar, [they] immediately get access toward the video that explains it all. It really, really helps.”

The magazine also features a full page Layar ad in the front which explains how readers can scan the pages to view additional content. The “View with Layar” logo is our way of signaling to readers that they should scan pages with Layar. You can learn all about the logo here, including links to download the logo.

We visited Speakers Academy headquarters in Rotterdam recently to talk with Rene Warmerdam. Check out the video below to see the details of how the magazine has adopted Layar to engage its print readership.

Permalink: www.layar.com/news/blog/370


Layar 6.1 Update Now in App Stores

Chris Cameron February 14, 2012

Today we’ve released an update to Layar that you can now find in both the iTunes App Store and the Android Market. While it’s a small release from 6.0 to 6.1, it’s an important one, and here’s why.

With Layar 6.1, we’re focusing more on Layar Vision, and making the experience of viewing Vision content easier and faster. To get a better understanding of the changes coming in Layar 6.1, we crafted a quick video introducing you to the new features.

As you can see, the changes are minor but important:

  • Firstly, when you launch Layar, you are now brought immediately to the Scan View screen. This way, when you see the View with Layar logo out in the world, you’re only one tap away from scanning and viewing great Layar Vision content. If you want to skip this screen and head to the layer catalogue, just tap the “Layers” button in the lower left corner. 
  • And when you do scan items, you can do so by tapping anywhere on the Scan View screen. Previously we had a small area where you could tap, but now, simply tap anywhere to get started. It’s a small change, but it can go a long way to making Layar Vision a better experience.
  • Finally, when you scan for Vision content, the layer will immediately launch if it is the only result found. Previously we displayed a results screen, and when only one result is listed, this is unnecessary. Just another small change making the use of Layar Vision a little more user-friendly.

Layar 6.1 is available now in the App Store and Android Market, so make sure you update today!

Permalink: www.layar.com/news/blog/366


Renowned Ad Agency Tribal DDB Enlists Layar Vision

Chris Cameron January 19, 2012

Our friends over at Tribal DDB, a global digital advertising agency, recently employed the help of Layar Vision to enrich one of its projects. The project is a desk reference “calendar” that the company sends to prospective clients to inform them all about mobile.

It’s called “Mobile Engagement: A Guide to the Mobile World,” and now with Layar’s help, the calendar comes to life with augmented information within the pages. In the video above, we sat down with Joeri Kiekebosch, who worked on the project, to get an idea of how and why they chose Layar Vision.

“We had the feeling that a static desk reference was… static. And we’re a digital agency that does digital stuff, so we needed something that made it interactive,” Kiekebosch told us. “If you talk about mobile, you should be able to do something with your mobile. Why not scan the calendar with your mobile and see extra content on top of that?”

“There are a lot of big shots walking around here, and these people kind of like geekiness but are not geeks. AR is perfect for these people,” he adds. “Everybody has an iPhone, they were like ‘Oh this is cool, this is fun!’ They got it installed easily, they understood it. For these kinds of people it really worked.”

Take a look at our interview with Joeri to see the calendar in action!

Permalink: www.layar.com/news/blog/364
