Android Technology Episode 2: Should Robots Ever Look Like Us?

Published on Jun 21, 2020



There are conflicting views among roboticists about whether robots should ever look like humans.


Dor Skuler, co-founder of Intuition Robotics, believes it is ethically wrong for robots to pretend to be human.


And Dr. Reid Simmons of Carnegie Mellon University's Robotics Institute believes it is okay for robots to possess essential human characteristics, as long as they are not hyper-realistic.


On the other side of the spectrum are companies like Hanson Robotics and Realbotix that aim to create robots that resemble humans as much as they possibly can.


If you agree with Skuler and Simmons and think hyper-realistic robots are wrong, you don't have to worry anytime soon, given the current state of AI and robotics.


Professor John Thangarajah, an AI researcher at RMIT University, says we are far away from replicating the AI depicted in Westworld.

And as for robotics, the closest thing that seems possible in our lifetime is the Old Bill character from season one, and even that seems a long way off.


Scientists are currently working on the foundational technology that would make Old Bill possible, like electronic skin, or e-skin.


The ideal e-skin would mirror the capabilities of human skin, such as sensing pressure, temperature, and vibrations.


At the same time, the skin must be durable, scalable, and easily integrated.


Our brains process a lot of information from our skin receptors without much effort.

Every square centimeter of our skin has around 200 pain receptors, 15 pressure receptors, 6 cold receptors, and 1 heat receptor.


And our sensitive areas have more, like our fingertips, which have around 3,000 receptors.
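
To put those densities in perspective, here is a quick back-of-the-envelope calculation in Python. The ~1.8 m² figure for average adult skin area is an assumption added for illustration, not a number from the episode.

```python
# Back-of-the-envelope: how many receptors a whole-body e-skin would
# need to match, using the per-square-centimeter counts above.
# The ~1.8 m^2 average adult skin area is an assumed round figure.

RECEPTORS_PER_CM2 = 200 + 15 + 6 + 1   # pain + pressure + cold + heat
SKIN_AREA_CM2 = 1.8 * 10_000           # 1 m^2 = 10,000 cm^2 (assumed area)

total = RECEPTORS_PER_CM2 * SKIN_AREA_CM2
print(f"~{total:,.0f} receptors over the whole body")  # ~3,996,000
```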


One major challenge in creating the ideal e-skin is computing power, because there is a limit on how much sensor information can be processed at once.


Researchers in Germany, led by Professor Gordon Cheng at the Technical University of Munich, have been working on e-skin technology for ten years.

And their latest development is an e-skin system consisting of hexagonal sensing modules.


The modules link together into a flexible artificial skin that can attach to a variety of surfaces.

What makes this skin special is that it requires less computing power than other e-skins, allowing it to cover more surface area.

They achieved this by mimicking how human skin receptors only send signals to the brain when they detect changes in temperature or pressure.


Likewise, their e-skin system only transmits information when an input crosses a programmed threshold, which dramatically reduces computing demands.
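
A minimal sketch of that event-driven idea in Python (the class, threshold, and readings below are invented for illustration and are not the actual interface of the TUM system):

```python
# Minimal sketch of event-driven sensing: a module transmits only when
# a reading changes by more than a threshold, instead of streaming
# every sample. Idle skin therefore generates no traffic.

class SensorModule:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.last_reported = None

    def sample(self, reading: float):
        """Return the reading if it crosses the change threshold, else None."""
        if (self.last_reported is None
                or abs(reading - self.last_reported) >= self.threshold):
            self.last_reported = reading
            return reading   # event: worth transmitting
        return None          # no meaningful change: stay silent

pressure = SensorModule(threshold=0.05)
samples = [0.00, 0.01, 0.02, 0.30, 0.31, 0.29, 0.90]
events = [s for s in samples if pressure.sample(s) is not None]
print(events)  # only the big jumps get transmitted: [0.0, 0.3, 0.9]
```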


So now Cheng's team aims to achieve higher sensing densities by reducing the size of the modules.


And they hope their research will lead to applications such as touch-sensitive prosthetics and improved safety in human-robot interaction.


So while this project aims to allow robots to feel their environment, scientists out of Osaka University are developing robots that can express what they feel.


You see, the scientists in Osaka want their research to lead to more realistic robots capable of more in-depth interactions with humans.


In 2011, they developed an android child named Affetto that can react to sensations with a variety of facial expressions.


The team has developed a second-generation Affetto that is more expressive.


Affetto’s face has deformation units on 116 facial points, each with a set of mechanisms that create facial contortions, such as lowering or raising of part of a lip or eyelid.
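
To make the idea of deformation units concrete, here is a hypothetical sketch: treat an expression as a set of displacement targets for named facial points, then blend between expressions. The unit names and values are invented for illustration and are not Affetto's actual control interface.

```python
# Hypothetical sketch of expression control via deformation units:
# each unit moves one facial point, and an expression is a dictionary
# of target displacements (names and values are illustrative only).

NEUTRAL = 0.0

def blend(expr_a: dict, expr_b: dict, t: float) -> dict:
    """Linearly interpolate between two expressions (t in [0, 1])."""
    units = set(expr_a) | set(expr_b)
    return {u: (1 - t) * expr_a.get(u, NEUTRAL) + t * expr_b.get(u, NEUTRAL)
            for u in units}

smile = {"lip_corner_left": 0.8, "lip_corner_right": 0.8, "eyelid_upper": 0.1}
frown = {"lip_corner_left": -0.6, "lip_corner_right": -0.6, "brow_inner": 0.7}

# Halfway between a smile and a frown:
print(blend(smile, frown, 0.5))
```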


While this is fascinating to watch, Affetto falls within the uncanny valley, for me at least.


The uncanny valley is a concept coined by Masahiro Mori that describes how robots become more appealing the more human-like they are.


But at a certain point, the appeal dips to a feeling of strangeness and a sense of unease.
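
Mori drew this as a curve rather than giving an equation, but the shape is easy to caricature in code: affinity rises with human-likeness, dips sharply near (but short of) full realism, then recovers. The function below is purely illustrative.

```python
# Purely illustrative caricature of Mori's uncanny valley curve:
# affinity rises with human-likeness h, dips sharply near h ~ 0.85,
# and recovers as h approaches 1. Mori proposed no actual equation.

import math

def affinity(h: float) -> float:
    rising = h                                           # more human, more appealing
    valley = 0.9 * math.exp(-((h - 0.85) / 0.07) ** 2)   # Gaussian dip near realism
    return rising - valley

for h in [0.0, 0.5, 0.8, 0.85, 0.9, 1.0]:
    print(f"human-likeness {h:.2f} -> affinity {affinity(h):+.2f}")
```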


Bionic Eye:


Scientists have also built a bionic eye around nanowires made from perovskite, a conductive, light-sensitive material that is also used in solar cells.
They formed aluminum oxide into a hemispherical shape, which allows the eye to focus, and connected the nanowires to it to form an artificial retina.
The nanowires in this retina are packed at a higher density than the photoreceptors in human eyes.
The retina is housed in an artificial eye filled with an ionic liquid that lets charged particles move through it.
The artificial eye can process patterns of light in 19 milliseconds, about twice as fast as a human eye.
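
Read as a refresh rate, those numbers work out as follows; the ~38 ms human figure is not stated directly but is implied by the "twice as fast" comparison.

```python
# Rough refresh-rate reading of the response times quoted above:
# 19 ms per light pattern for the artificial eye, and roughly double
# that for a human eye, per the "twice as fast" comparison.

artificial_ms = 19
human_ms = artificial_ms * 2   # ~38 ms, implied rather than stated

print(f"artificial eye: ~{1000 / artificial_ms:.0f} patterns per second")  # ~53
print(f"human eye:      ~{1000 / human_ms:.0f} patterns per second")       # ~26
```
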
This technology can be applied to robot vision.
So, these are just a few examples of foundational research that may make Old Bill possible in our lifetimes.
There is no question that robots could improve our lives.
We all have chores around the house that we would love to have robots help with, but the question is, do we want them looking like us?
Ultimately, the market will decide if and when the time comes, but there is still a long way to go.
