INTEL® DEEP LEARNING
BOOST

Section 1

DEMOCRATIZING MARKER-LESS MOTION CAPTURE

Putting the capabilities of Intel® Deep Learning Boost into more hands (and arms, legs, torsos…).

Increasing Image Expectations are Becoming Unsustainable

Video recording technology is nothing new, but it has come a long way in the last few decades, moving from the hands of very few to a tool for the masses. Motion capture technology, more recently, has been put to use at the highest levels in Hollywood and professional sports, improving audience experiences and athletic performance, respectively. Until now, such technology has been out of reach for most consumers, requiring expensive, specialized equipment. wrnch is challenging that with AI-driven, marker-less motion capture that works with any 2D webcam.

“Using Intel® Deep Learning Boost through the OpenVINO™ toolkit, our developers worked with Intel engineering to optimize wrnch to run great on 10th Gen Intel® Core™ i5 processors.” – Paul Kruszewski, CEO, wrnch

Figure 1: Woman doing a plank with a visual overlay
Section 2

Between the Eye and the Brain of your Thin and Light Laptop

wrnch can turn any camera into, essentially, a visual cortex, translating raw visual information into an understanding of human movement. By helping computers make connections between what they “see” and what they “know,” the technology teaches computers to better understand human motion, activity, and body language. The application potential is widespread, from entertainment to security. The question of where to take the technology falls to the creative minds of today’s independent software vendors.
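wrnch's own SDK is proprietary and not shown in this piece, but the basic pipeline the paragraph describes, a webcam frame in and pose data out, can be illustrated with the OpenVINO™ runtime mentioned above and a publicly available pose-estimation model from the Open Model Zoo. The sketch below is a minimal, hypothetical example under those assumptions; the model file, input handling, and output interpretation stand in for, and are not, wrnch's implementation.

# Minimal sketch: webcam frame in, pose data out, using the OpenVINO runtime.
# NOTE: this is NOT the wrnch SDK. "human-pose-estimation-0001" is a generic
# Open Model Zoo model used purely for illustration.
import cv2
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("human-pose-estimation-0001.xml")  # assumed local IR files
compiled = core.compile_model(model, "CPU")  # INT8 models use Intel DL Boost (VNNI)
                                             # automatically on CPUs that support it

cap = cv2.VideoCapture(0)                    # any standard 2D webcam
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Could not read a frame from the webcam")

# The model expects a fixed-size image laid out as NCHW.
n, c, h, w = list(compiled.input(0).shape)
blob = cv2.resize(frame, (int(w), int(h))).transpose(2, 0, 1)[np.newaxis].astype(np.float32)

# The outputs are joint heatmaps (and part-affinity fields); the peak of each
# heatmap is the most likely image location of that joint.
results = compiled([blob])
for out in compiled.outputs:
    print(out.any_name, results[out].shape)

Running a quantized (INT8) version of such a model is where Intel® Deep Learning Boost's VNNI instructions come into play on supported CPUs, which is the kind of optimization the quote above refers to.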

That Looks Right: The Path to Perfection for Professional Athletes (and Beyond)

By combining motion capture technology with computer vision, coaches and trainers can access powerful insights at a level of precision unavailable to human senses. This wealth of information allows for more precise training, which ultimately results in better athletes. Until now, however, this technology has been expensive and inaccessible to the wider public. But amateur athletes come in all shapes and sizes. By removing the need for wearable sensors and allowing the computer to “see” more naturally, using only a standard webcam, computer vision-assisted training can be harnessed in the home gym.

Section 3

Democratizing the Virtual Personal Trainer

Exercise is a multimillion-dollar industry that crosses every demographic, thanks in part to sports and rehabilitation. As a use-case demo for this developing technology, wrnch has been testing a virtual personal trainer application built on the software. Personal trainers can be economically unrealistic for much of the population. Workout videos have long been a popular alternative for those who prefer to do lunges in the privacy of their own rooms. More recently, thanks to virtual assistants, workout apps are making regular exercise even easier to work into a daily routine. In the future, wrnch would like to see workout apps that use computer vision to count reps and to help people make sure they are doing exercises correctly, as sketched below. It’s one thing to watch a workout host pull a perfect plank, but quite another for your home computer to tell you when you need to straighten your legs.
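Rep counting reduces to simple geometry on the 2D keypoints a pose estimator returns each frame. The sketch below is a rough, hypothetical illustration; the joint names, keypoint format, and angle thresholds are assumptions for the example, not wrnch's API: a knee angle that dips below one threshold and recovers above another counts one squat or lunge rep, and a near-straight angle covers the "straighten your legs" cue.

import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by 2D points a-b-c."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

class RepCounter:
    """Counts reps by watching a joint angle dip below and recover above thresholds."""
    def __init__(self, down_deg=100.0, up_deg=160.0):
        self.down_deg, self.up_deg = down_deg, up_deg
        self.reps, self.going_down = 0, False

    def update(self, angle_deg):
        if not self.going_down and angle_deg < self.down_deg:
            self.going_down = True       # descended into the rep
        elif self.going_down and angle_deg > self.up_deg:
            self.going_down = False      # stood back up: one full rep
            self.reps += 1
        return self.reps

# Hypothetical (x, y) keypoints for one video frame, in pixel coordinates.
keypoints = {"hip": (210, 300), "knee": (215, 400), "ankle": (220, 495)}
counter = RepCounter()
knee = joint_angle(keypoints["hip"], keypoints["knee"], keypoints["ankle"])
print(f"knee angle {knee:.0f} deg, reps so far: {counter.update(knee)}")
if knee > 165:
    print("Legs are straight")

In a real application the update would run on every frame of keypoints streaming from the webcam, rather than on a single hard-coded pose.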

Safety First

Exercise, and exercising correctly, goes beyond physical fitness. The same personal training program that can talk a yoga practitioner into an improved tree pose could be used for home rehabilitation exercises after a hip or knee replacement. Physical therapists will tell you that exercises work best when performed in a particular way. Using the integrated webcam on an Intel® Core™ i5 processor-powered laptop, wrnch can be applied to help ensure a body is in proper alignment for maximum safety and results. In a similar vein, a computer can be trained to recognize the physical signs of a person who is in distress and needs assistance. This application could be helpful in the homes of the elderly and other groups at risk of injury at home.
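As a concrete, if simplified, picture of what such an alignment check could look like, the sketch below tests whether shoulder, hip, and ankle keypoints from a single frame stay close to a straight line, the cue a trainer would give for the plank in Figure 1. The keypoints, tolerance, and coaching message are illustrative assumptions, not part of the wrnch software.

import numpy as np

def alignment_error(shoulder, hip, ankle):
    """Degrees by which the shoulder-hip-ankle line deviates from straight (180 deg)."""
    v1 = np.asarray(shoulder) - np.asarray(hip)
    v2 = np.asarray(ankle) - np.asarray(hip)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return 180.0 - np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical keypoints from one frame of a plank, in pixel coordinates.
error = alignment_error(shoulder=(120, 210), hip=(260, 235), ankle=(400, 250))
if error > 15:
    print(f"Hips are {error:.0f} degrees out of line; aim for a straight shoulder-hip-ankle line")
else:
    print("Good plank alignment")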

Figure 2: Woman doing a lunge with a visual overlay
Section 4

Computer Vision Application Potential Across Industries

The far-reaching applications of wrnch will be in the hands of today’s talented developers, whose creativity will doubtless find new uses for the real-time, marker-less motion capture technology. In the developing field of robotics, this virtual visual cortex is already helping computers learn to interpret human behavior in order to create more natural and helpful interactions. Even simple human cues, like body positioning, can help a robot to understand whether a human is trying to interact with it. Similarly, the actions and body language of pedestrians could help self-driving cars recognize when a person might try to cross the street.

From a security standpoint, a computer’s ability to quickly spot and interpret suspicious behavior could go a long way toward keeping public and private spaces secure, from banks to airports and schools. As robots take on more human-adjacent roles, their ability to understand and predict our actions will become increasingly critical to successful adoption.

REMEMBERING
APOLLO 11

Article

Video: “IBM: 50th Anniversary of Apollo 11 Lunar Landing,” from VolvoxLabs on Vimeo.

Want to walk on the moon? For the 50th anniversary of the Apollo 11 lunar landing, VolvoxLabs [VVOX] collaborated with Ogilvy USA to create the IBM Moon Walk experience, exhibited at the iconic World Trade Center Transportation Hub, the “Oculus.” Visitors could step into the IBM Moon Walk experience and see themselves as astronauts next to the Eagle lander. After their walk on the moon, visitors received a souvenir digital postcard to share online.

VolvoxLabs integrated wrnchAI machine learning for real-time skeletal tracking, TouchDesigner for data processing, and Unreal Engine for real-time graphics. Each was seamlessly connected into the IBM Moon Walk installation for a family-friendly, immersive lunar experience from July 18 through Saturday, July 20, 2019.
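The specifics of how the skeletal tracker, TouchDesigner, and Unreal Engine were wired together are not documented here. As a rough sketch of how such stages are commonly connected, tracked keypoints can be packaged as JSON and pushed over UDP to a local port that a downstream tool listens on; the port, message format, and joint names below are assumptions for illustration, not VolvoxLabs' actual integration.

import json
import socket
import time

# Hypothetical destination: a UDP listener (for example, a DAT in TouchDesigner
# or a socket reader in a game engine) on the same machine. Port is an assumption.
DEST = ("127.0.0.1", 7000)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_skeleton(person_id, joints):
    """Package one tracked skeleton as JSON and push it to the downstream tool."""
    message = {
        "t": time.time(),
        "person": person_id,
        "joints": joints,   # e.g. {"head": [x, y], "l_wrist": [x, y], ...}
    }
    sock.sendto(json.dumps(message).encode("utf-8"), DEST)

# Example frame with made-up normalized coordinates.
send_skeleton(0, {"head": [0.48, 0.12], "l_wrist": [0.30, 0.55], "r_wrist": [0.66, 0.57]})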

DESIGN, DEVELOPMENT & INSTALLATION BY VOLVOX LABS [VVOX]
DOCUMENTARY VIDEO: Why Not Labs

The Jaguar ‘catwalk’
project

wrnchAI is used by The Mill to create the first real-time interactive Jaguar, currently featured at motor show events worldwide.

 
The Mill @MillChannel creates the world’s most photo-realistic interactive digital @Jaguar, whose visual cortex is powered by @wrnchAI and rendered in @UnrealEngine. Both the AI inferencing and the computer graphics rendering are powered by @NVIDIA GPUs.