IoT powers this robotic suit

They've been around for a while, but robotic suits are finally becoming more than a novelty thanks to IoT.

You get in an accident or have a stroke, and suddenly you can't move some part of your body. That's the reality for the estimated 5.6 million people living with paralysis in the U.S.

Recent breakthroughs, particularly in robotics, have held some promise of making life a lot easier for people with mobility issues related to paralysis. But exoskeletons and robotic braces are kind of like flying cars--we've seen them demonstrated, they exist in some form, but they haven't really made a big impact in daily life for the people who need them most (as an Angeleno with a seething disdain for the 405, I think flying cars should be a basic human right).

Here's where we stand: Companies like Ekso Bionics, which evolved out of a robotics lab at U.C. Berkeley, and Cyberdyne, which is an awesome name for a robotics company, even if it does herald Skynet and certain destruction, are currently making robotic suits for people recovering from strokes and spinal cord injuries. Mostly their suits aren't being used for everyday mobility (look ma, I can walk to the store), but for rehab and to reduce secondary complications associated with sitting in a wheelchair all day (look ma, I can walk along an ungraded path indoors under the close supervision of a physical therapist).

The issue is that the robotic suits are trying to mimic the human gait, and humans don't follow any kind of replicable pattern when they walk in the real world. Humans are constantly falling forward and catching themselves, so there's not much room for error. Our brains take in tons of sensory information to plan each step and distribute weight accordingly. Since a person with lower limb paralysis doesn't have much or any control over their legs, the robot has to figure out the best way to plan and execute each step. So far the results have been pretty clunky.
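To make that sense-plan-execute cycle concrete, here's a deliberately simplified sketch of the kind of per-step decision loop an exoskeleton controller has to run. Everything in it is hypothetical for illustration: the sensor fields, the step-length heuristic, and the numeric constants are invented, not drawn from any real suit's control software.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    # Hypothetical inputs a controller might fuse on every cycle
    trunk_lean_deg: float   # forward lean of the torso, from an IMU
    load_left: float        # fraction of body weight on the left foot (0..1)
    load_right: float       # fraction of body weight on the right foot (0..1)

def plan_step_length(reading: SensorReading, base_step_m: float = 0.35) -> float:
    """Toy planner: the farther the wearer leans forward, the longer the step,
    capped so the foot never lands beyond a safe base of support."""
    lean_factor = 1.0 + 0.02 * reading.trunk_lean_deg
    return min(base_step_m * lean_factor, 0.50)

def swing_leg(reading: SensorReading) -> str:
    """Swing whichever leg is carrying less load; the other leg bears weight."""
    return "left" if reading.load_left < reading.load_right else "right"

# One cycle of the loop: read sensors, pick the swing leg, plan the step.
reading = SensorReading(trunk_lean_deg=5.0, load_left=0.7, load_right=0.3)
print(swing_leg(reading), round(plan_step_length(reading), 3))
```

A real controller runs a loop like this hundreds of times per second, which is why the paragraph above matters: with the wearer constantly "falling forward," the plan has to be recomputed before the body finishes the fall.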

But there's some hope, and it's coming thanks to the sensors and controllers driving the Internet of Things revolution (seriously, is there anything IoT can't do?).

Hyundai (yup, the car company) has been working on a robotic suit. To be effective, the next generation of robotic suits will have to react to a user's intentions and to the environment in real time. That takes a lot of sensors and a lot of real-time processing power. Hyundai engineers turned to National Instruments for help. National Instruments makes embedded sensors, as well as a platform that combines a real-time CPU with a field-programmable gate array (FPGA) for ultra-fast processing that wasn't available just a few months ago.

The exoskeleton follows a trend of IoT technology making its way into medical hardware. In fact, the use of IoT in healthcare is expected to grow from $32.47 billion in 2015 to $163.24 billion by 2020. Forty-seven percent of hospitals are looking to further expand their use of connected health technologies.

Hyundai's Medical Robot is designed for the elderly and for patients with spinal cord injuries. In a recent clinical trial, the wearable system used real-time data on human movement patterns to complete complex tasks and react quickly to the wearer's motions and to external forces, with multiple actuators running on NI's platform-based technology. Enabled by the IoT, Hyundai and NI can connect the wearable robotics to smart devices to give users more data on control, diagnostics and rehabilitation.

Watching video of the device in action, what's striking is that the people using it don't walk like robots. That's not to say they're walking naturally--if anything, the older gentleman in the video is shuffling along like someone hobbled by arthritis. But that's huge progress over the previous generation of suits, which took rigidly prescribed steps and couldn't deal with obstacles at all.

So while the suits aren't quite there yet, the processing speed and sensor technology coming out of IoT advances are getting engineers closer. No word yet on the flying car.