The Augmented World Expo: Go XR or Become Extinct

Dan Feinberg, Technology Editor, I-Connect007 | 06-13-2018

The Augmented World Expo (AWE USA), now in its ninth year, is perhaps the largest event for professionals focused on providing science fiction-like abilities through XR (cross reality) and associated wearable technology. This year’s event, held at the Santa Clara Convention Center, offered over 100,000 square feet of exhibit space and drew approximately 6,000–7,000 attendees. In addition to the exhibits, there were numerous meeting rooms as well as three large presentation halls that hosted a constant stream of presentations and discussions, covering topics from the latest XR uses and devices to the business of marketing and monetizing the technology.

The presentations were organized into several tracks (Inspire, Life, Design, Develop and Start-up Pitches) as well as various workshops. Talks and panel discussions ran continuously at scheduled times, so attendees could catch the sessions they wanted and visit the nearby exhibit halls in between. Each session was slotted into the track that best fit its subject, and all included ample time for Q&A; questions could be submitted on-screen from a laptop or other mobile device.

The keynote speaker was Ori Inbar, co-founder and executive director of AWE. The keynote literally set the stage: an on-stage hologram of Inbar greeted the audience and then introduced his live self, who walked out to join his holographic double on stage. Their conversation focused on the state of XR, and more.

I spent the rest of Day 1 listening to another keynote and to more presentations and demonstrations from the various tracks, ranging from introductions and demos of the latest devices, including some very interesting new startups, to the business of XR, including an informative presentation on ways to monetize the technology. After all, if there is no profit in it, there is little incentive to do it. A few of my favorites: “Fulfilling the Potential of AR for Enterprise” by Mark Sage; “Introductory AR Workshop” by Will Hellwarth; “NVIDIA Holodeck” by Zvi Greenstein; “The AR Glasses Story, from Soldier to Worker to You” by Colleen Emaus; and perhaps my favorite, backed up by an excellent show floor training demo, “Microsoft’s Vision for Mixed Reality in the Modern Workplace” by Dioselin Gonzalez.

Qualcomm’s Hugo Swart talked about building an ambient world and the possible convergence of MR devices in the next 10 years. He stated, “XR is the future of mobile computing. It’s going to change the way we work. It’s going to change the way we play. It’s going to change the way we socialize. The question is: How are we going to get there?”

A presentation by NVIDIA was, as always, interesting and informative. They explained that with Holodeck used for car design, it is no longer necessary to build a physical model for prototype review and modification; you can build an XR model for local and global review instead. Because the XR prototype is interactive, dynamic and behavior-simulated, getting through the prototype stage is much faster and far less expensive.

Watch the NVIDIA Holodeck video [1] and, while watching, keep in mind that the robot-like avatars are real people interacting with the XR prototype. If you are interested in XR, this is an excellent video to watch. Note that the avatars, and the people they represent, can interact with the overall vehicle or with any and all of its parts and components. I imagine that competing against a company using this technology while continuing to work the old 20th-century way would add credence to the phrase, “Go XR or go extinct.”

While most of the presentations were given by humans aided by XR, some were delivered by virtual speakers. I could write two more columns just on the sessions I attended, at least in part, and there were many more that I could not get to. I suggest browsing the full list of presentations [2] for the ones that interest you.

On Day 2, even with many more presentations on offer, I spent my time on the show floor. The exhibition was divided into three main areas: conventional booths showing currently available XR products and processes; startup companies and prerelease devices; and the play area, where you could enter various virtual worlds using the latest technology and devices and…play.

As mentioned earlier, the Microsoft area was especially interesting. You could put on an XR headset, stand in front of a malfunctioning piece of equipment, and call an offsite expert who could see what you see and superimpose instructions, augmenting your reality with virtual pointing and guidance; in no time, you could repair and reset the device. I went through the process and found it much faster, less expensive and more satisfying than waiting for a repair person to show up days later, or trying to have an expert talk you through it on the phone.

One interesting conversation was with an IEEE representative, who invited us to meet and discuss some of the more pressing topics. I met with IEEE Director of Innovation Jay Iorio and Senior Program Director Kathy Grise, as well as Minu Seshasayee, senior director at Interprose. As they noted, “XR is poised to become the standard, general-purpose, ubiquitous-as-sunglasses sensory interface for a blizzard of AI-curated content, informed by sensors.”

We discussed the future of XR and how IEEE’s scope now extends well beyond engineering. The organization is far more broadly involved than in the ‘60s, when I received my IEEE certification (yes, I have been around that long). It is, of course, still engineering-centric, but its scope now runs from engineering to ethics, and from supporting young professionals and women in engineering to students, researchers and much more. I suggest that anyone on a career path in the tech industry, especially early in their career, check out what IEEE has to offer.

We spoke on a broad range of issues, including some that are not yet widely discussed, such as the dangers of exposure to EMF and RF radiation of various power levels and frequencies from smartphones, wireless headphones, Wi-Fi, Bluetooth and the growing megatrend of IoT. I recounted an experience from a Day 1 presentation on IoT by a provider of wireless devices, where I had the opportunity to ask, “Are you concerned about the increasing exposure to varying sources of RF radiation for hours on end, day after day?” My question was skipped over; it was obvious to those in attendance that this was not a topic the speaker wanted to discuss. When I recounted the experience to Jay Iorio of IEEE, he seemed to understand and did not seem surprised.

It was an interesting and informative session, and I now have a greater appreciation for IEEE’s efforts in the modern age. Their following statement says a great deal: “Because the potential of XR combined with machine learning and sensor networks is arguably unprecedented in the history of technology, this revolution is poised to become a powerful example of how technology can potentially have vast and unpredictable effects, as well as how its creators might evolve to see their role as broader than simply enabling commerce. This moment could be a unique opportunity to build an ethical foundation for the development of technology so that the needs of all stakeholders can be addressed, and so that the full diversity of the human community can be reflected in our created worlds.” As we wrapped up, I resolved to pay more attention to the activities of the IEEE.

My next visit was in the startup pavilion with Kaaya Tech, who were introducing the Holo Suit, a wearable that incorporates various haptics. Haptics is “the study or use of tactile sensations and the sense of touch as a method of interacting with computers and electronic devices. Haptics allows you to feel and manipulate digitized objects in a virtual 3D environment [3].” Most XR devices use separate haptic devices, such as handheld controllers, to let you interact with the virtual world. The Holo Suit takes full-body motion capture beyond gaming and into the real world, making it an ideal solution for applications in sports, healthcare, education, entertainment or industrial operations. The lightweight four-piece suit consists of a jacket, pants and two gloves, which can be purchased separately or as a complete suit. The complete suit features up to 36 body motion sensors, along with nine integrated haptic exciters to provide precise physical feedback to the user. It connects to your device (PC, smartphone, or other XR hardware such as the HTC Vive or the HoloLens) wirelessly over Bluetooth or Wi-Fi. Presently, the suit can be found on Kickstarter.

I was able to see numerous new headsets and smart glasses. One such device, by VisionAR (Univer Optical Technologies), is a pair of lightweight, good-quality smart glasses that let users keep their own glasses on if needed while superimposing a virtual world over the real one. The feature that got my attention was that they connect via a thin, unobtrusive cable to a small box worn on your belt or in a pocket. The box provides long battery life and the wireless connection to your computer, phone, etc., eliminating the issue of RF radiation against your head. There are also shielded pocket protectors and even shielded underwear available to keep the RF away from your body while allowing the signal to connect through the side facing away from you. 615 Technologies was also touting a similar configuration designed to reduce radiation exposure to the user.

To consider the next level of mixed reality, putting very realistic but not real images into the real world, we must learn about ray tracing, a technology I only became aware of recently. I had seen its results, but gaining even a basic understanding of how it works and what it can do required some research and focused learning. Ray tracing is the technique modern high-quality movies rely on to generate or enhance their amazing special effects: artificial but super-realistic objects, reflections, refractions and shadows. It is what makes the spaceships and exploding planets in sci-fi epics so amazing, and what makes fast cars, screaming fighter planes and epic war scenes with mega explosions look so frightening and real.

To provide a basic understanding of ray tracing, let me quote from a blog post by NVIDIA’s Brian Caulfield:

“What is ray tracing? The easiest way to think of ray tracing is to look around you, right now. The objects you’re seeing are illuminated by beams of light. Now turn that around and follow the path of those beams backwards from your eye to the objects that light interacts with. That’s ray tracing. Historically, though, computer hardware hasn’t been fast enough to use these techniques in real time, such as in video games. Moviemakers can take as long as they like to render a single frame, so they do it offline in render farms. Video games have only a fraction of a second. As a result, real-time computer graphics have long used another technique, “rasterization,” to display three-dimensional objects on a two-dimensional screen. It’s fast. And, the results have gotten very good. With rasterization, objects on the screen are created from a mesh of virtual triangles, or polygons, that create 3D models of objects.

“Ray tracing is different. In the real-world, the 3D objects we see are illuminated by light sources, and photons can bounce from one object to another before reaching the viewer’s eyes. Ray tracing captures those effects by working back from our eye. It traces the path of a light ray through each pixel on a 2D viewing surface out into a 3D model of the scene. As GPUs continue to grow more powerful, putting ray tracing to work for ever more people is the next logical step. For example, armed with ray-tracing tools—and powerful GPUs—product designers and architects use ray tracing to generate photorealistic mockups of their products in seconds. As GPUs offer ever more computing power, video games are the next frontier for this technology.”
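Caulfield’s backwards-from-the-eye description can be sketched in a few dozen lines of code. The following is a minimal, purely illustrative ray tracer (not NVIDIA’s implementation, and far from movie quality): it fires one ray per pixel from the eye, tests it against a single sphere, and shades hits by the angle between the surface normal and the light, the classic Lambertian model.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t along a unit-length ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a == 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace(width=8, height=8):
    """Fire one ray per pixel back from the eye; return a brightness grid."""
    center, radius = (0.0, 0.0, -3.0), 1.0   # one sphere in front of the eye
    light = (2.0, 2.0, 0.0)                  # a single point light
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on a virtual screen at z = -1.
            px = (x + 0.5) / width * 2.0 - 1.0
            py = 1.0 - (y + 0.5) / height * 2.0
            d = (px, py, -1.0)
            n = math.sqrt(sum(v * v for v in d))
            d = tuple(v / n for v in d)      # normalize the ray direction
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row.append(0.0)              # background: no light returned
                continue
            hit = tuple(t * v for v in d)
            normal = tuple((h - c) / radius for h, c in zip(hit, center))
            to_light = tuple(l - h for l, h in zip(light, hit))
            ln = math.sqrt(sum(v * v for v in to_light))
            to_light = tuple(v / ln for v in to_light)
            # Lambertian shading: brightness follows the light-normal angle.
            row.append(max(0.0, sum(a * b for a, b in zip(normal, to_light))))
        image.append(row)
    return image
```

Production renderers add reflection, refraction and shadow rays that bounce recursively through the scene, which is exactly what makes real-time ray tracing such a heavy load for hardware.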

What does this have to do with XR and the AWE show? ADSHIR, a small and previously little-known company, was showing its innovative real-time AR ray tracing technology, which brings to XR a level of graphics quality previously possible only in the animated film industry, placing virtual objects convincingly into the real world. You can see the virtual objects on the screen, and if you hold up a mirror to reflect the area, you can also see them in the mirror’s reflection on the screen. To get a good idea, check out the short video below. Ray tracing is playing an increasing role in XR; as people enjoy new movies, advanced gaming and other forms of XR, they can begin to understand how it is possible, and how virtually anything is becoming possible.

When working or playing in a virtual world, one challenge has been how to control the objects around you. In the real world, we simply think about what we want to pick up and move, or which button to push, and our brain directs our hand to do it; we hardly have to think about it at all. In an XR world, devices such as keyboards, mice, and now handheld and even wearable controllers give us that control.

%%https://www.youtube.com/embed/qkfKGstGt70%%

Two new devices I was shown at AWE take the control and movement of objects in the XR universe to the next step. One is by Wearable Devices, an Israeli company. Their MUDRA touch-free wearable controller lets you control a device using a wristband that reads the signals passing between your brain and your hand and transmits them to the XR device.

The MUDRA is not measuring the movement of your fingers, as can be proved by wearing the device and having someone else move your fingers for you: nothing happens. But when it is your own brain causing the movement, both your real hand and the hand of the virtual or remote device move; you have full control, just as if it were part of your body. These neural-sensing wristbands will soon be available for smartwatches, XR headsets and much more. Using one was an amazing experience: the band can detect when a user touches a virtual item, and can detect the grabbing action of a user picking up a physical item.
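Conceptually, this kind of wristband samples tiny electrical signals at the wrist and classifies them into gestures. The sketch below is purely illustrative and assumes nothing about Wearable Devices’ actual algorithms: it reduces a window of sensor samples to a single energy value and applies a threshold, where a real product would feed multi-channel features into a trained classifier.

```python
import math

def rms(window):
    """Root-mean-square energy of one window of sensor samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_gesture(window, threshold=0.5):
    """Label a window of wrist-sensor samples as 'tap' or 'rest'.

    Purely illustrative: the threshold and the single-channel input
    are hypothetical stand-ins for a real gesture-recognition model.
    """
    return "tap" if rms(window) > threshold else "rest"
```

The key idea the demo made vivid is that the signal precedes the motion: the intent is picked up at the nerve, so even a suppressed movement still produces a classifiable event.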

The other control device I had the opportunity to experience is called the Keyboard of the Future, by TAP. The TAP wearable keyboard creates text each time you tap your fingers. It does not use the QWERTY layout, or any keyboard at all; instead, you create letters by tapping one or more finger combinations on any surface. I took a brief training exercise, which was much easier to master than expected; in just a few minutes I was able to tap out a dozen or so letters. For instance, when you tap your thumb, the letter A is sent to the Bluetooth device paired with your TAP. Based on my experience, I feel that with a few hours’ practice you could type basic text by tapping, without a keyboard. There were literally hundreds of other devices.
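Decoding chorded input like TAP’s amounts to a lookup from finger combinations to characters. In the sketch below, only the thumb-alone mapping to “A” comes from the demo described above; every other entry is a hypothetical placeholder, not TAP’s actual layout.

```python
# Hypothetical chord table. Only "thumb alone -> A" comes from the demo;
# the remaining entries are invented placeholders, not TAP's real layout.
CHORDS = {
    frozenset(["thumb"]): "A",
    frozenset(["index"]): "E",
    frozenset(["middle"]): "I",
    frozenset(["thumb", "index"]): "N",
    frozenset(["thumb", "index", "middle", "ring", "pinky"]): " ",
}

def decode(taps):
    """Turn a sequence of finger-combination taps into text.

    Each tap is the set of fingers that struck the surface together;
    unknown combinations decode to '?'.
    """
    return "".join(CHORDS.get(frozenset(tap), "?") for tap in taps)
```

For example, `decode([["thumb"], ["index"]])` yields "AE". Because each chord is a set, the order in which fingers land within one tap does not matter, which is part of what makes the scheme quick to learn.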

There were far more items to see than could be taken in over a few days or covered in a few articles. XR is a coming megatrend in entertainment, medicine, the military, product service and just about any area you can imagine, so you can be sure we will continue to cover it, as well as the AWE event.

Think about what is driving our electronics industry today: XR, autonomous transportation, standard automotive electronics, AE, quantum computing and so many more areas that did not even exist 25 or so years ago, except in science fiction.


References

  1. NVIDIA Holodeck video
  2. augmentedworldexpo.com
  3. From dictionary.com.