
Scientist Claims to be On the Verge of Making An AI That ‘Feels’ True Emotions

MAJOR BREAKTHROUGH?

AI has made great strides in the past few years, beating humans at our own games and augmenting, or even replacing, human-controlled systems. Still, some researchers remain unimpressed with these developments and argue that more should be done.

Such is the view of Professor Alexi Samsonovich, who announced that Russia "is on the verge" of a major AI milestone: robots that can feel human emotion.
The announcement was made at the 2016 Annual International Conference on Biologically Inspired Cognitive Architectures (BICA) in New York City. Specifically, Samsonovich described free-thinking machines capable of feeling and understanding human emotions, understanding narratives and thinking in those narratives, and actively learning on their own.
Samsonovich did not go into the specifics of the Russian AI advances, however, saying only that the breakthrough will come within several years and that more funding will be needed to complete it.

THOUGHTS AND IDEAS

In an interview with Digital Trends, Samsonovich was still light on specifics but full of thoughts and ideas about AI. He said that the recent uptick in the number of AI publications, together with the money governments and companies are investing in development, will make new things possible: "…a machine [is] capable of feeling human emotions and exhibiting human-level socially emotional intelligence in a variety of settings."
Samsonovich is careful not to claim that AI is developing consciousness. He believes consciousness is not a worthwhile goal to pursue, since it cannot be validated in anyone other than oneself. Instead, he envisions a future of robots with a different kind of behavior and internal organization. "When you will see this kind of behavior (human-like feelings) exhibited consistently over time in many circumstances and without occasional 'presence breaks' … you will believe that this entity is alive and is in a social contact with you, and you will interact with it accordingly."
