Apps-Driven Design: Is there a bigger challenge than “Augmented Reality”?

A recent video segment by Ali Velshi on CNN’s “The Big I” (as in “The Big Idea”) shows Associate Professor Blair MacIntyre and research scientist Alex Hill from Georgia Tech’s Augmented Environments Lab demonstrating their Argon augmented reality (AR) browser app on a mobile phone. Argon itself is based on an AR development platform called KHARMA (KML/HTML Augmented Reality Mobile Architecture) and uses a markup language called KARML, an extended version of KML (Keyhole Markup Language), the XML-based markup language originally designed for geographic annotations in mapping applications. (My apologies for the huge level of nesting in that last sentence, but it reflects the huge level of nesting in the development itself.) If you’re still with me, KHARMA allows an application to query and receive information from additional servers based on GPS-driven location and other orientation cues (compass, image, and video), if present. The client device (a mobile phone in this example) merges the incoming stream of information with a real-time representation of reality to produce AR-enhanced images.
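To make that nesting concrete, here’s what the plain-KML base layer of a geographic annotation looks like — the kind of document a KARML-style extension would build on. The placemark name and coordinates are illustrative only, and the AR-specific KARML extensions themselves aren’t shown.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <!-- Name and coordinates are illustrative, not from the Argon demo -->
    <name>Tech Tower</name>
    <!-- KML descriptions can carry HTML; KHARMA builds on this
         KML/HTML pairing to anchor richer AR content at a location -->
    <description><![CDATA[<p>Campus landmark, Georgia Tech</p>]]></description>
    <Point>
      <!-- KML coordinate order is longitude,latitude,altitude -->
      <coordinates>-84.3947,33.7725,0</coordinates>
    </Point>
  </Placemark>
</kml>
```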

The huge number of complex tasks occurring simultaneously in this AR application soaks up a lot of processing power on both the client and the servers. The client simultaneously generates a video stream of the captured scene (camera processing, compression, short-term storage, real-time display), monitors its GPS location, and wirelessly transmits images and coordinates to the AR server(s). The servers must recognize images well enough to identify what they show, aided by the accompanying GPS information; must translate printed material in real time (think of pointing your phone at a street sign or a restaurant menu written in a foreign language and having it translated for you); and must then send the processed information back to the client, where it’s combined with the video stream and displayed. (Come to think of it, we’d probably want street signs translated and then spoken by the client app in addition to, or instead of, an augmented display.)
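For a rough sense of the client’s half of that round trip, here’s a minimal Python sketch assuming a simple HTTP exchange. Everything specific in it is a hypothetical illustration — the endpoint URL, the request fields, and the two stubbed device functions are not the actual Argon/KHARMA protocol.

```python
# A sketch of one client-to-server round trip, assuming a simple
# HTTP-based exchange. The endpoint URL, request fields, and the two
# stubbed device functions are hypothetical illustrations, not the
# actual Argon/KHARMA protocol.
import requests

AR_SERVER = "https://example.com/ar/annotate"  # hypothetical endpoint


def capture_frame() -> bytes:
    """Return one compressed (e.g., JPEG) frame from the camera."""
    raise NotImplementedError("platform-specific camera capture")


def read_gps() -> tuple[float, float]:
    """Return the device's current (latitude, longitude)."""
    raise NotImplementedError("platform-specific location API")


def annotate_scene() -> dict:
    frame = capture_frame()
    lat, lon = read_gps()
    # Ship the compressed frame plus coordinates to the AR server...
    response = requests.post(
        AR_SERVER,
        files={"frame": ("frame.jpg", frame, "image/jpeg")},
        data={"lat": lat, "lon": lon},
        timeout=5.0,
    )
    response.raise_for_status()
    # ...and get back annotations (e.g., recognized objects, translated
    # text, and screen positions) to composite over the live video.
    return response.json()
```

In the real app an exchange like this runs continuously against the live video stream while the display is being composited, which is exactly why the cycle count on both ends climbs so quickly.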

That’s a whole pile of cycle-crunching apps working together to produce a result that looks simple but clearly isn’t. It’s apps like this that will drive the need for comprehensive, apps-aware System Realization, SoC Realization, and Silicon Realization tools in the immediate future.

And if you haven’t seen the CNN video of augmented reality in action, click here.
