
IDF 2012: What will it feel like to be a human in 2020?


Against the backdrop of a week that will be marred by the madness and frenzy of an impending Apple device launch, Intel started its proceedings on Day 0 of IDF 2012 on a refreshingly different note. Intel’s opening session, a day before the main proceedings begin on Tuesday, was more about the interesting and futuristic computing experiences we can all look forward to in the years to come and less about the hardware and processing power one-upmanship that we have all been so accustomed to in the past.

This is a significant shift, and one that can only be fully appreciated when you see first-hand the projects that have been incubating in Intel’s labs: projects that have very little to do with the guts of computation and processing power, and everything to do with experiences and the future of technology and computing.

What will it feel like to be a human in the year 2020? 
This is a question that Brian David Johnson, Intel Futurist, tackles head on. As a Futurist, it is his job to look 5, 10, or 15 years ahead and develop visions that Intel engineers can use to create technology. His job is a complicated mix of sociology and research: looking deeply into how people interact with computers and computation today to anticipate how those interactions will evolve over time. The work that Brian’s team does trickles down into teams all over Intel and becomes the blueprints and spec sheets for products that Intel then develops in Intel Labs.
Brian David Johnson, Futurist, Intel Corporation


Brian’s answer to that question is powerfully presented. He postulates that as we look out to 2020, something really remarkable happens: the size of meaningful computational power approaches zero. Simply put, chips become so small and fast that meaningful computing takes up almost no volume at all. And because the chips by then will be so small and powerful that they are effectively invisible, they could become a part of everything. We could turn anything into a computer: a table, someone’s shirt, sometimes even our own bodies. At that point, computing experiences will be limited solely by the power of our imagination.
Science and technology have progressed to the point where what we build is only constrained by the limits of our own imaginations--Justin Rattner, Intel CTO


With this power of imagination, Intel then demonstrated a whole bunch of really cool projects that its researchers are working on. These interactive and context-aware innovations further demonstrate how people’s relationships with technology are changing and will continue to evolve as we look into the future.

Display without Boundaries: Changing how and where we display and interact with our content
Easily one of the most exciting demos at the session was the Display without Boundaries setup, which uses unique algorithms to transform any surface in a home or work environment into an interactive display. Intel researchers showed how they stitch together images from multiple sources into a seamless whole that can wrap around objects and corner surfaces in a typical home. If you look carefully, you’ll see the researcher move digital pictures around in a bowl, tap on them as if they were real objects, and then take one out and slap it onto a wall. It all happens seamlessly!
Display Without Boundaries uses unique algorithms to transform any surface in a home or work environment into an interactive display

Apologies for the poor audio quality due to high ambient noise, but we hope you get the gist of what the kit looks like in real life and how it works. If you look carefully, much of the setup involves projectors and, interestingly enough, a bunch of Microsoft Kinect-enabled Xbox 360s; of course, there’s Intel hardware and software hard at work in the background making it all look smooth. And while it looks unwieldy at the moment, you can imagine, as the researcher says in the video, all of this shrinking to a simple light-bulb installation in the years to come, letting you instantly bring ordinary objects in your home to life.

I, for one, can already imagine restaurateurs salivating over this concept: it could instantly turn tables into interactive menus, and customers could even Instagram what they’re eating without any device between them and the dishes. It could truly enable interface-less interaction, with no actual gadget in front of us.
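
To make the idea a little more concrete, here is a minimal sketch of the core geometric trick behind this kind of projection mapping: warping a flat image so that it lands squarely on an arbitrary quadrilateral surface as seen by the projector. It uses OpenCV’s homography functions; the stand-in image, corner coordinates, and output size are all hypothetical, and Intel’s actual stitching algorithms are, of course, not public.

import cv2
import numpy as np

def warp_to_surface(image, surface_corners, out_size):
    """Warp `image` onto the quadrilateral `surface_corners`
    (four x/y points in projector space, clockwise from top-left)."""
    h, w = image.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(surface_corners)
    H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography
    return cv2.warpPerspective(image, H, out_size)

# Hypothetical example: project a 640x480 photo onto a skewed tabletop
photo = np.full((480, 640, 3), (40, 120, 200), dtype=np.uint8)  # stand-in image
tabletop = [(150, 90), (1180, 60), (1250, 680), (100, 700)]     # projector-space corners
frame = warp_to_surface(photo, tabletop, (1280, 720))           # (width, height)
cv2.imwrite("projector_frame.png", frame)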

Emotions through Images: Changing how we interact with our devices through emotion
This installation explores the emotional associations people have with images, and the potential of images to foster emotionally rich exchanges. In the demo, images taken by individuals on mobile phones with Instagram are projected on a large interactive display, and real-time sentiment analysis of their captions is used to infer the mood of each image. Intel’s sentiment analysis software uses the Circumplex Model of Emotion, and algorithms translate this classification into a colour-coding of the images on the interactive display. Individuals are then invited to express how an image makes them feel with a touchscreen Mood Map, and their adjustments are reflected through colour, text and sound. They can also associate images that affect them with emotion tags to compose an arrangement out of the entire installation; these compositions make it possible to capture and share the collective vibe of an event.
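
As a rough illustration of how that colour-coding could work: the Circumplex Model places every emotion on a two-axis plane of valence (unpleasant to pleasant) and arousal (calm to excited), so a caption’s position on that plane can be turned directly into a hue. The toy sketch below does exactly that; the tiny word lexicon and the exact colour mapping are illustrative assumptions of mine, not Intel’s software.

import colorsys
import math

# Hypothetical lexicon: word -> (valence, arousal), each in [-1, 1]
LEXICON = {
    "love": (0.9, 0.6), "happy": (0.8, 0.5), "calm": (0.6, -0.7),
    "bored": (-0.5, -0.6), "angry": (-0.7, 0.8), "sad": (-0.7, -0.4),
}

def caption_mood(caption):
    """Average the (valence, arousal) scores of known words in a caption."""
    hits = [LEXICON[w] for w in caption.lower().split() if w in LEXICON]
    if not hits:
        return (0.0, 0.0)  # neutral: no emotional words found
    n = len(hits)
    return (sum(v for v, _ in hits) / n, sum(a for _, a in hits) / n)

def mood_to_rgb(valence, arousal):
    """Map the angle on the circumplex plane to a hue, and the
    distance from the centre to colour saturation."""
    hue = (math.atan2(arousal, valence) / (2 * math.pi)) % 1.0
    sat = min(1.0, math.hypot(valence, arousal))
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

print(mood_to_rgb(*caption_mood("so happy and calm at the beach")))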

Interactive Shopping: Changing how we shop offline
We’ve already seen the impact of online shopping in India. It is truly transforming the way we purchase goods and make buying decisions. But it is also having a profound impact on brick-and-mortar retailers, who are now struggling to compete with an onslaught of deals, payment options and overall convenience.
Transforming the physical store to mimic the online shopping experience


All of that could change in the years to come if offline retailers integrate interactive digital signage into the retail environment, bringing the same online shopping benefits to the physical store. The signage can draw on motion sensors, temperature sensors, cameras, Wi-Fi/3G, Ethernet, touchscreens, NFC and cloud connectivity. Most of these capabilities are already present in the smartphones we carry today, and by weaving these sensors into the retail experience, retailers could provide contextually aware data to shoppers and manufacturers alike, enabling a new paradigm of shopping at a brick-and-mortar store.
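
A rough sketch of the fusion pattern this implies: a few sensor readings are combined into a shopper “context”, which then drives what the signage shows. All the sensor names, thresholds, and rules below are hypothetical; the point is the pattern, not Intel’s actual retail pipeline.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorFrame:
    motion_detected: bool    # motion sensor near the display
    shoppers_in_view: int    # people count from the camera
    temperature_c: float     # in-store temperature sensor
    nfc_tap: Optional[str]   # loyalty-card ID from an NFC tap, if any

def choose_content(frame: SensorFrame) -> str:
    """Very simple rule-based content selection for an interactive sign."""
    if frame.nfc_tap is not None:
        return f"personalised-offers:{frame.nfc_tap}"  # a known shopper tapped in
    if frame.shoppers_in_view >= 3:
        return "group-promotion"                       # a crowd is nearby
    if frame.motion_detected:
        return "cold-drinks-ad" if frame.temperature_c > 28 else "featured-product"
    return "ambient-brand-loop"                        # nobody around

print(choose_content(SensorFrame(True, 1, 31.5, None)))  # -> cold-drinks-ad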

Situational Sensitive Communication: Changing the way we reach each other
People often call a friend to say hello and end up interrupting a meeting or an important event. Wouldn’t it be great if a phone gave callers a glimpse of what its owner is up to before they dial, so they could make an informed decision about how and when to reach them?
The Lava Xolo, running Android Gingerbread, in action at the Situational Sensitive Communication demo.


This demonstration was really something that I thought should jump straight out of Intel Labs and land in the Google Play Store as an app. The concept is remarkably simple, but one that we haven’t yet begun to consider (aside from leaving status and social media updates). The app uses your phone to infer your activities and social interactions through a range of hard and soft sensing capabilities, coupled with machine learning algorithms. Jump to the video and you’ll see Intel researchers demonstrate how the phone of the future can infer people’s activity and availability, and leverage this knowledge to inform their trusted circle.

The phone can act appropriately on behalf of its user when he or she receives calls, and choose the best way to send or receive messages (e.g. text when in a meeting, but voice while driving). Apologies again for the poor quality of the audio and video.
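
For a feel of what that routing decision might look like in code, here is a toy sketch. The hard part, inferring the owner’s activity from sensors and machine learning, is stubbed out, and the activities and rules are illustrative assumptions rather than anything taken from the actual demo.

from enum import Enum

class Activity(Enum):
    IN_MEETING = "in a meeting"
    DRIVING = "driving"
    SLEEPING = "sleeping"
    AVAILABLE = "available"

def infer_activity() -> Activity:
    """Stub for the on-device classifier (calendar, accelerometer,
    microphone, location... fed into a learned model)."""
    return Activity.IN_MEETING

def route_incoming_call(caller: str, trusted: bool) -> str:
    """Decide how the phone should act on its owner's behalf."""
    activity = infer_activity()
    if activity is Activity.IN_MEETING:
        # Silent context: deflect to text; trusted callers learn why.
        note = f" (owner is {activity.value})" if trusted else ""
        return f"auto-reply by SMS to {caller}{note}"
    if activity is Activity.DRIVING:
        return f"answer {caller} hands-free over voice"  # eyes stay on the road
    if activity is Activity.SLEEPING:
        return f"send {caller} to voicemail"
    return f"ring normally for {caller}"

print(route_incoming_call("Asha", trusted=True))
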
It’s hard not to feel excited about the future of computing, especially if you unhinge it completely from devices and gadgets and make it entirely about experiences: a future focused on less gear, less personal tech hardware, and interface-less experiences that may actually better our lives instead of just cluttering our drawers with more discarded hardware year after year.
