In the alphabet soup of wireless protocols in the Smart Home, several stand out. Everyone knows Wi-Fi and Bluetooth. Bluetooth has fragmented into Bluetooth Low Energy (BLE), Bluetooth Mesh, and plain old Bluetooth. Consumers tend to buy what they know, which historically has given a boost to Wi-Fi and Bluetooth over Zigbee and Z-Wave. Given that most of the successful service providers, including Vivint, Comcast, and ADT, are using Z-Wave, you would expect this to be changing. It turns out it is.
Z-Wave, so far in 2017, has been beating out all the other protocols in the Smart Home market except for Wi-Fi and HomeKit. For the first time since we’ve been tracking the Smart Home conversations on Twitter, Bluetooth has dropped below Z-Wave. What does this mean? Part of this is the growth in adoption of Z-Wave-enabled products within the space. Part of it is the efforts by the Z-Wave Alliance and its members to drive awareness of the benefits of their protocol over others. Their recent announcement of the S2 security improvements has also ridden the wave of increased concern over the security of our increasingly connected homes.
We track the mindshare of the entire Smart Home market, as well as the major (and minor) devices and apps that form the ecosystem. If you want to have access to these tools and predictive insights, just check out the Argus Analyzer.
Discussions of Artificial Intelligence (AI) are everywhere; ironically enough, some of it is written and published by bots. Elon Musk continues to gain headlines for his crusade against the unchecked development of AI, citing concerns that if we pick the wrong utility function, humans could be optimized out of existence (from his talk with Walter Isaacson: if the goal is to eliminate spam, the AI could decide that eliminating humans is the most efficient path). IBM’s Watson has been the poster child of deep learning, promised to be the panacea for everything from call center routing to finding copyright infringement to winning at Jeopardy. But what does this mean for the everyday consumer? What happens when our homes really become ‘smart?’ Does the home punish us for not cleaning the toilet? Does it decide, based on reading the news feeds, that it would be safer for the family not to leave the house and lock us in, delivering food via Grubhub and Amazon drones?
We need a way to think about AI in the home so we can better design these systems, anticipating the benefits as well as where a ‘rogue’ intelligence could go off the rails. Much the way our own brains segment functions into different sections of our gray matter, home intelligence could also benefit from some sense of specialization and hierarchy. AI systems are really good at recognizing patterns and determining action based on those patterns. Within the home, you can consider five types of intelligences needed to realize the true promise of the Smart Home.
- Visual Intelligence
- Behavioral Recognition
- Human Interface Engine
- Threat Detection and Abatement
- Ethics Engine
The Visual Intelligence is the most common in use today, looking at streams of visual data, static and dynamic, coming from security cameras, phone snapshots and more. These maturing intelligences can distinguish between dogs and cats in a scene, ensure only family members are allowed in the back door, and, with Apple’s new A11 Bionic on the iPhone X, process your face fast enough to paint your expressions onto an animated emoji of poo. The trend is to push as much of this recognition to the edge so that homes do not have to push terabytes of data up to the cloud every time someone rings your doorbell camera. In the tug-of-war between privacy and security, edge processing of the visual data from your home also ensures your data stays yours as well as lowers the latency of recognition. The interpretation of visual information in the home is a key requirement for any home intelligence, as correct evaluation of what is happening visually drives much of the follow-on action in the home: adjusting the thermostat based on the number of people in the room, letting people know when the dog went outside for a potty break, and more.
Behavioral Recognition builds on the information within the visual scene of the home and interprets the behavior of the occupants to look for patterns of engagement that can be automated. Vivint’s new Sky intelligence examines what actions the occupants take inside of their homes and takes on those tasks over time. For example, if when you leave in the morning you turn down the thermostat, turn on the back porch light, switch off your stream of classical music from Alexa, and send a text to the nanny reminding them of pickup, Sky promises to, over time, do most of these as you walk out the door. The same happens when you come home: the right lights turn on, smooth jazz is piped through the kitchen, and the house has already been working on achieving the right comfort level as soon as it knew you were headed home. This Behavioral Recognition extrapolates from your routine (and less routine) actions to determine candidates for automation, while at the same time understanding when you deviate from your normal routines and how to react appropriately. Say you come home with extra kids for a playdate; smooth jazz probably isn’t the vibe you are going for. Or your thermostat shifts because Mom is visiting. Behavioral Recognition is critical to having your home feel intelligent and tailored to your context.
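At its simplest, this kind of routine learning can be framed as counting which actions reliably follow a trigger event and promoting the consistent ones to automations. Here is a minimal sketch of that idea; the event names, the 0.8 consistency threshold, and the function itself are our own illustrative assumptions, not Vivint’s actual algorithm.

```python
# Toy behavioral recognition: which actions consistently follow a trigger
# event (like leaving the house)? Those become candidates for automation.
from collections import Counter

def learn_routine(history, trigger, threshold=0.8):
    """history is a list of (event, actions) observations."""
    occurrences = 0
    action_counts = Counter()
    for event, actions in history:
        if event == trigger:
            occurrences += 1
            action_counts.update(actions)
    # Automate actions seen in at least `threshold` of the trigger's occurrences.
    return {action for action, count in action_counts.items()
            if occurrences and count / occurrences >= threshold}

history = [
    ("leave_home", ["thermostat_down", "porch_light_on", "music_off"]),
    ("leave_home", ["thermostat_down", "porch_light_on"]),
    ("arrive_home", ["music_on"]),
    ("leave_home", ["thermostat_down", "porch_light_on", "text_nanny"]),
]
print(learn_routine(history, "leave_home"))
# thermostat_down and porch_light_on follow every departure, so they
# qualify; music_off and text_nanny happened only once, so they do not.
```

A real system would also need to detect deviations (the playdate, Mom visiting) and suppress the automation, which is exactly why this recognition has to stay context-aware rather than blindly replaying habits.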
Of course the occupants need to interact with the home. The Human Interface Engine bridges that gap between human and home. We’ve seen the rapid rise of Alexa and Google Home (and eventually Bixby and HomePod) leverage voice as a key interface to our home environment. But there are also gestures, apps, touch and more that drive our engagement. The Human Interface Engine will be that collection of intelligences that gather and interpret those actions on behalf of the human occupants. Want the kids to exercise more? Require 200 jumping jacks before they can turn on the Xbox. Wave goodnight to your living room to turn off the music and the lights before retiring for the night. Filter out your teenager’s commands every time she wants Alexa to play death metal. Without the ability to recognize and react to the ultimate users in the Smart Home, the humans, any system would only be guessing at what to do on behalf of its occupants.
Security becomes critical as well within the home environment, both because bad guys can try to gain access to our resources and because nefarious actors can leverage our homes to attack others. As such, a local intelligence focused on Threat Detection and Abatement is critical to any Smart Home intelligence solution. Products like Cujo bring protections typically reserved for the enterprise into the home and are a good start toward the type of security needed there. Cujo and others leverage the attack patterns sensed across their customers to continuously update their own understanding of evolving threats. Eventually, the abatement of threats to the home could also include counter-intelligence capabilities, spoofing data and usage information as a way to throw off the intentions of the bad guys. Are you on vacation or just taking a quick trip to Costco? Home with a sick child or on a business trip? This type of misinformation becomes a new direction of capabilities for these intelligences, aiding in the overall security of the home and its users.
The final intelligence we believe is necessary in the home is an Ethics Engine. The Ethics Engine establishes the bounds of the correct actions for the home to take. Just because you can turn the thermostat down to 40 degrees does not mean that you should. Just because the dash button can order three lifetimes of Cheetos to be delivered by Friday does not mean it should. This notion has received a lot of attention within autonomous vehicles, where the vehicle makes decisions regarding whether to hit the squirrel or the telephone pole. Within the home, this is also critical as more of the home’s systems become controllable by these intelligences. This ‘brain of brains’ will have the final sign-off on any actions taken by the Smart Home as the last line of defense for the users’ health, safety and wellbeing. Models of robo-ethics have been around since Isaac Asimov first published his laws of robotics, but they have not yet been applied in the Smart Home context. The Disney Channel movie Smart House saw a resurgence in popularity recently because of its campy portrayal of a home gone rogue. An Ethics Engine would prevent these and more dangerous scenarios from happening, ensuring the home does not fulfill Elon Musk’s nightmare scenario of a rampant AI that eradicates the family hamster.
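Architecturally, the ‘brain of brains’ idea amounts to a final gate that every proposed action must pass before the home executes it. The sketch below illustrates that sign-off pattern; every class, method, and limit here is a hypothetical illustration of the concept, not any vendor’s API.

```python
# A minimal sketch of the Ethics Engine as a final sign-off gate:
# the other intelligences propose actions, and this one vetoes any
# action outside configured safe bounds (e.g. thermostat 55-85 F).

class EthicsEngine:
    """The 'brain of brains': last line of defense before execution."""
    def __init__(self, limits):
        # limits maps an action name to its allowed (low, high) range
        self.limits = limits

    def approve(self, action):
        name, value = action
        low, high = self.limits.get(name, (float("-inf"), float("inf")))
        return low <= value <= high

class SmartHome:
    def __init__(self, ethics):
        self.ethics = ethics

    def execute(self, action):
        # Visual, behavioral, interface, and threat intelligences all
        # propose actions; nothing runs without the ethics sign-off.
        if self.ethics.approve(action):
            return f"executed {action}"
        return f"vetoed {action}"

home = SmartHome(EthicsEngine({"thermostat_f": (55, 85)}))
print(home.execute(("thermostat_f", 68)))  # a reasonable setting passes
print(home.execute(("thermostat_f", 40)))  # possible, but not permitted
```

The point of the design is that the check happens in one place, after all other reasoning, so a clever but misguided automation cannot route around it.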
Within each of these five intelligences, you can see how this framework helps give context to the types of AI that need to be brought together in concert in the Smart Home. True intelligence in our abodes will come from the seamless integration and collaboration among these critical capabilities. While no one company has brought all of these elements together, we see the capabilities evolving. Look for service providers like Vivint and Comcast to lead the charge, integrating solutions where feasible and building their own where necessary. Eventually, we will have that home of the future, today, and feel safe using it.
Some of you may be old enough to remember the series The Six Million Dollar Man. In addition to having his own ambient soundtrack when he ran, Steve Austin also had a Bionic Eye that would do wondrous things in his fight against the bad guys. Since the 1970s, research has continued on how to craft an artificial retina to help those who cannot see. Not quite the super sight of Steve Austin or Mad-Eye Moody, but the basics of vision. This effort to create a bionic eye good enough to replicate the full function of the human eye has missed a key target segment.
In honor of International Talk Like a Pirate Day, we should consider using our latest advances in bionic eyes to help hapless pirates get their sight back. Imagine how much better their aim would be with binocular vision! Who needs a spyglass when you have digital zoom built in?
Pirates. Pirates, with their poor health plans, dangerous work environments and high levels of scurvy, represent an underserved population with keen vision problems. In recognition of International Talk Like a Pirate Day, we humbly suggest that we work together with the tech industry to eliminate eyepatches from the pirate wardrobe. MIT (my alma mater) did a recent review of the latest advances, including the Argus II (a name we are partial to here at Argus Insights).
Today’s implants are not quite up to the Six Million Dollar Man performance level but cost less at a mere $150k. Most pirates have enough booty stashed somewhere to pay for the required implant surgery, though it’s unclear whether the medical centers will take doubloons or bitcoin.
While we do not replace retinas here at Argus Insights, we do help our clients see better. Using the Argus Analyzer, our clients can visualize where the markets are going, peek at what is driving consumer adoption, and peer into the future based on the Internet of Things trends we see driving adoption. If you’d like to learn more, you can check out the Analyzer by clicking below.
Brain Machine Interfaces (BMI) have been a staple of science fiction for years, but research brings us closer to that reality every day. We are closer to enabling the control of devices without lifting a finger, a boon for those with injuries or disabilities that prevent them from controlling most devices. This could also work the other way. Sure, a BMI helped Neo learn kung fu in seconds, but it also left his mind, and those of others jacked into the Matrix, vulnerable to connections going from the machine world into their own carbon-based processors.
We can already exert influence in the other direction just by controlling what content we consume. Everyone has their mood music: a playlist for coding, focusing, getting down and more. Facebook proved, through some internal research, that it could control the moods of members by adjusting their news feeds. Now imagine if you didn’t have to go through people’s eyes but directly to their brains. This Internet of Thoughts could be a new type of digital voyeurism where celebrities and everyday people could share their emotional playlists. Want to get amped up for the big swim meet? Jack into Michael Phelps. Need a creative boost to hit the deadline? Access the soul and feelings of Ogilvy for a few minutes. Several firms, like Thync, have already proven they can induce calming states externally with the right electrical stimulation.
In this world where our filter bubbles have already erected intellectual firewalls to protect us from views, theories, and evidence that might challenge our cherished world views, imagine if there were a way to hack around those firewalls and target our emotional centers directly. What if someone could change your anchors so that every time you saw chocolate, it triggered disgust? So that every glance at fake news resulted in ecstasy? The potential for behavioral therapy is fantastic, but there is that dark side to be concerned with. An evolving Internet of Thoughts, coupled with the ability to control them more directly, could edge us down that dystopian path first laid out by George Orwell in 1984.
At Argus Insights, we have a bit of an antidote to these filter bubbles. Because we track every single piece of content related to a market, we help users pop their brand-biased filter bubbles to see what is happening across the entire Internet of Things market. Our Analyzer platform helps you avoid the tunnel vision of brand-biased engagement and improves your market kung fu in a matter of seconds. If you are ready to take the red pill, check out the Analyzer today.
It has been said that ‘he who dies with the most toys, wins.’ But not all toys are created equal. Some spark joy, trigger endless hours of replay, and bring smiles to parents and children alike. Others sit, untouched, lonelier than the citizens of the Island of Misfit Toys. A few are still in their packages, never inciting enough interest to even be opened, stuck in that purgatory of a good intention mating with bad execution.
Why can we not leverage the nanny cams already installed in smart homes around the country to see what objects are being used, or at least gazed at adoringly? Imagine a class of algorithms able to cross-reference the inventory of Toys R Us or Amazon (wait, you can already do that) and then turn your playroom into a continuously running focus group! Maybe you’ll earn enough benjamins to pay for college! It’s unlikely any proceeds gained from offloading the unwanted toys at a garage sale will even pay for books their first year, I mean “content subscriptions.”
We do not condone the surveillance of children for profit at Argus Insights but we do support the monitoring of markets. If you’d like to see if your content fits in the category of misfit toys, let us know. Since we see everything, we can tell you what ideas and brands are being ‘played with’ by the Internet of Things market and what is being left to fester in the corner under so much dust, despite the best efforts of their makers.