The Heat of Artificial Intelligence
Self-driving car technologies, Amazon’s Alexa and robotics “stole” the show at CES this year.
“Smart” devices have been a big theme at CES over the past few years. The word "smart" has generally meant connectivity, typically via BLE or Wi-Fi; a connection to the cloud (and an app) in turn enables remote control and monitoring. Examples include smart cameras, smart locks, and smart toys. This year, we're seeing the word "smart" being replaced with "intelligent", which can mean anything from simple rule-based systems to sophisticated deep learning models. We saw "AI"-powered cameras and conversational speakers (e.g. the Amazon Echo). We also see intelligence being pushed to the "edge" – integrated into each device rather than dependent on the cloud – sometimes for privacy or bandwidth reasons (e.g. security cameras), and sometimes because the application demands real-time action (e.g. self-driving cars).
Speaking of self-driving cars, it appears that every mobile object is gaining autonomous navigation capability – vacuums, delivery robots, drones, etc. Some applications are waiting for low-cost LiDAR sensors, but many have opted for a computer-vision-based approach. Our sense is that automotive will be the industry that drives LiDAR sensors to high volumes, which in turn will lower the cost enough for more applications to benefit from these capabilities.
Robotics Steals the Show
For the first time in CES history, robotics start-ups occupied their own distinct section of the main floor at the Sands hall. The majority of the start-ups on display seemed to be working on consumer-facing robots, e.g. humanoids assigned day-to-day tasks such as acting as companions for kids or seniors, running errands inside the house, or serving as guides in public spaces such as airports and hotels. Most of the designs we saw are still not good enough to create any emotional attachment with the humans they are supposed to serve, and the display and voice-communication interfaces will need to become much better in order to deeply engage humans over the longer term, beyond the initial novelty factor.
More interestingly, there are also early signs of robots focused on replacing time-wasting routine human tasks, with both consumer and commercial applications, which – once reliable and economical enough – could become mainstream solutions. One that impressed us is Laundroid, a Japanese robot that folds and sorts freshly washed and dried clothes (by type of clothing or by family member). Although the CES demo was still slow and the device is initially targeted at home use, this robot has substantial potential both as a residential device and in the laundry-services and hospitality industries. The company behind Laundroid recently raised $50M from Panasonic, which clearly intends to integrate it with its own line of washing and drying machines.
Overall, we were impressed by the electromechanical innovations in several Japanese mobility products, including Whill’s modern and powerful wheelchairs, as well as Honda’s new self-driving car and motorcycle assisted-driving solutions. On the flip side, there are other devices, like the window-cleaning Winbot, that assist with human tasks but do not appear well thought out in terms of time savings and ease of use, and cannot expand into comparable commercial offerings at scale. Some of these robots seem to be over-engineered.
In a separate technology sector, we also liked some of the latest glass display products shown by Corning, one of the oldest companies exhibiting every year at CES. Home, office and school collaboration surfaces have been a long time coming: overhyped a few years ago, they have been constrained largely by high cost and the limited availability of custom software solutions, and hindered by the growing presence of personal smart devices of all types in every environment. These surfaces, though, offer unquestionable display and productivity advantages, and it is only a matter of time before they become omnipresent in our daily lives.
Amazon Alexa is Everywhere
Amazon’s voice control system, Alexa, was absolutely one of the most popular technologies at this year’s CES show. Not only are many established home products, like Nest or Philips’ Hue smart lightbulbs, already connected to Alexa; end users are now also asking for the same level of integration, convenience and intelligence in several other electric appliances at home, such as refrigerators, microwaves and blind openers. Vivint seems to have captured this opportunity and surpassed its competitors to become one of the leading smart-home solution companies in the market. Alexa, the voice behind Amazon’s Echo device, was first launched at the end of 2014 at a price of US$180. Since then, the device and its cloud-based voice service have steadily gained a solid foothold in the smart-home market, and Amazon has reportedly sold over 5 million units in the first 18 months.
The trend shows that home builders are also starting to incorporate voice-activation technology into home design, which will enable more digital devices and home appliances to be connected to the internet and controlled by voice. Beyond the home, workplaces and cars are the two other primary spaces that Amazon and several other companies are exploring as the next areas of opportunity. At CES 2017, we saw partnership announcements between Alexa and both Ford and Volkswagen. Ford demonstrated integrated Alexa voice control for cars running its Sync 3 platform, which is expected to be available to consumers in the summer of 2017. This step is critical for Amazon to secure Alexa’s ubiquitous presence in everyone’s daily lives. Currently there are over 7,000 skills listed in the Alexa Skill Store that can help accomplish our most common daily tasks via voice control.
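To give a sense of what one of these skills looks like under the hood, here is a minimal sketch of a custom-skill backend, written as an AWS Lambda handler in Python. The request and response JSON shapes follow the Alexa Skills Kit format, but the "GoodMorningIntent" name and the spoken replies are hypothetical, purely for illustration.

# A minimal, hypothetical Alexa custom-skill backend (AWS Lambda handler).
# The request/response JSON shapes follow the Alexa Skills Kit format;
# the intent name and replies below are invented for illustration only.

def lambda_handler(event, context):
    # Alexa delivers each user utterance as a JSON "request" envelope.
    request = event["request"]

    if request["type"] == "LaunchRequest":
        # User opened the skill without asking for anything specific.
        speech = "Welcome. What would you like me to do?"
    elif request["type"] == "IntentRequest":
        intent_name = request["intent"]["name"]
        if intent_name == "GoodMorningIntent":  # hypothetical intent
            speech = "Good morning. Turning on the lights and starting the coffee."
        else:
            speech = "Sorry, I can't do that yet."
    else:
        # SessionEndedRequest and anything else: reply politely and stop.
        speech = "Goodbye."

    # Standard Alexa Skills Kit response: plain-text speech, end the session.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

In practice, a skill also declares an interaction model (intents and sample utterances) in the Alexa developer console, and Alexa’s cloud service performs the speech recognition before the request ever reaches the handler.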
Amazon’s Alexa was definitely one of the stars of CES 2017. However, it is still too early to conclude that Alexa will be the ultimate winner in the intelligent voice-agent market. Google recently announced Google Home and its partnership with Korean car manufacturer Hyundai. The real competition in this space has just begun.
Self-Driving Cars Are On!
As one of the most important fields in AI, self-driving technology has made prominent progress over the past few years. At this year’s CES, we saw almost all the autonomous car vendors take the stage to demonstrate their latest progress in the area. Based on our observations, visitors from China seemed particularly enthusiastic: in the space of two hours at the exhibition hall, we met a good number of autonomous car start-ups, investors and industry analysts from China, including Robin Li, the founder and CEO of Baidu.
Currently there are a number of different approaches to solving the self-driving problem, and the use of LiDAR sensors is an important differentiator. In order to minimize blind spots in the perception system, companies like Google propose using costly but higher-precision LiDAR as their core sensor to obtain precise spatial modeling data. Meanwhile, companies like Tesla and its former partner Mobileye rely primarily on cameras, aiming also at cost reduction.
In any case, no one can deny the importance of LiDAR. LiDAR sensor hardware company Velodyne was also at the show, as was Quanergy, the company that blew everyone away with its solid-state S3 LiDAR at CES 2016. Interestingly, Quanergy, after developing a solid-state LiDAR for less than $500, unveiled two mechanical LiDAR units this time round. Meanwhile, Velodyne, the leading mechanical LiDAR vendor, announced that it had successfully developed its own solid-state solution. Quanergy’s mechanical LiDAR, however, will be used mainly for security purposes due to its lower accuracy. In our opinion, it would be very difficult for any mainstream car manufacturer to adopt solid-state LiDAR in the near term. However, Velodyne itself seems to be aware that, given the cost issues around mechanical solutions, solid-state LiDAR could well be the long-term trend.
Besides talking with upstream vendors, the Sinovation Ventures team also got to experience an L3 test drive with a self-driving start-up that is aiming at L5. The test drive took place in a typical Las Vegas residential area, primarily on local, relatively quiet roads. During the drive, the vehicle was able to execute various types of turns and gave priority to pedestrians and other vehicles when necessary. Although the speed was modest and the driver needed to intervene occasionally, we were still very encouraged by the remarkable progress this start-up had made on self-driving technology within a short timeframe. Likewise, we were proud to see our very own portfolio company, UISEE, unveil its autonomous campus electric vehicle.
While self-driving technologies continue to evolve, we are seeing unprecedented enthusiasm from both entrepreneurs and investors. However, we all need to recognize that autonomous driving technology is still far from mature, and the policy and regulatory issues are even more pressing and complicated. In fact, it will be almost impossible to deploy self-driving cars at large scale in the near term. Some people are asking whether self-driving will end up the way wearable devices did. From our point of view, a temporary cooling-down is probably inevitable. Overall, though, self-driving cars carry massive potential to solve significant transportation problems. It’s not a bubble.