Xiaomi launched a bipedal humanoid robot at a launch event for its foldable phones. The CyberOne can perceive 3D space and recognize individuals, gestures, and expressions. The robot can also identify 45 classifications of human emotion and comfort users in times of sadness. Xiaomi envisions a range of real-world applications for the bot, from manufacturing assistance to human companionship.
Gone are the days when a consumer electronics company could simply announce a phone and call it a night. At this morning's big launch event in Beijing, Xiaomi followed its foldable phone news by ceding the stage to CyberOne. The bipedal humanoid robot followed CEO Lei Jun onstage, greeting him and presenting him with a long-stemmed flower.
At first glance, the robot isn't exactly sprightly, movement-wise, but it's still an encouraging demonstration, and very much not a person in a spandex suit. It's the latest sign of Xiaomi's burgeoning robotics ambitions, which began with vacuums and have since grown to include last year's Spot-like CyberDog.
We've seen plenty of consumer brands flex their robotics muscles at events like this, including Samsung and LG, so it's difficult to determine where CyberOne falls on the spectrum between serious pursuit and stage show.
The CEO was quick to note the company's financial commitment to the sector, explaining that CyberOne's AI and mechanical capabilities were all developed in-house by Xiaomi Robotics Lab. He said the company has invested heavily in research and development spanning multiple areas, including software, hardware, and algorithm development.
— leijun (@leijun) August 11, 2022
The claims here are wide-ranging, including the ability to read people's emotions. Xiaomi notes that humanoid robots rely on vision to make sense of their surroundings. Equipped with a self-developed Mi-Sense depth vision module and combined with an AI interaction algorithm, CyberOne is capable of perceiving 3D space and recognizing individuals, gestures, and expressions, allowing it not only to see but to process its environment. In order to communicate with the world around it, CyberOne comes with a self-developed MiAI environment semantics recognition engine and a MiAI vocal emotion identification engine, enabling it to recognize 85 types of environmental sounds and 45 classifications of human emotion. CyberOne is able to detect happiness, and can even comfort the user in times of sadness. All of these features are integrated into CyberOne's processing units, which are paired with a curved OLED module to display real-time interactive information.
Equally broad are the promised real-world uses, ranging from manufacturing assistance to human companionship. There will be plenty of use for both of those feature sets in the years to come, but that's a long way off from this demo. For now, it probably makes the most sense to view CyberOne as something of an analog to, say, Honda's Asimo: a promising experiment that serves as a good brand ambassador for the heavy lifting being done in the lab.