Tuesday, 5 January 2016

The Echo Men


What if the next person you talked to was being fed lines by a hyper-advanced computer? Would you know? No. At least, it seems you probably would not. I found this out a while back while listening to a great podcast by the guys at 'Stuff to Blow Your Mind' on the subject of Echoborgs. (Listen Here)

Apparently we find it difficult, if not impossible, to tell whether a stranger speaks with their own mind or is being fed lines by a director. Have a look here for some excerpts from the tests into this phenomenon. Why, in that case, would a computer ever need to construct a robotic avatar to hold a conversation? Would it not just employ people to be Echoborgs? An Echoborg is a living mouthpiece for the computer, letting it seem more human when it announces decisions and allowing it to take part in human activities such as a company retreat. All in all, a human is much cheaper than a robot that mimics a human.

The questions roll out from that point onward for me. The discussion in the podcast seemed to see the Echoborg as a 'meatpuppet', merely dancing at the whim of the computer. This is not efficient. The AI would itself need to know exactly how humans communicate, or it would need to dictate a huge raft of information, or spend a lot of time training the Echo to get the intonation and inflection correct; otherwise the conversation is limited. Sarcasm may be a great example of what I mean here. Typing a sarcastic remark often strips it of its emphasis. Imagine being fed a line that should be read sarcastically and delivering it deadpan. The meaning is changed, and if you are the Echo of a FTSE 100 company talking to shareholders, there's going to be a problem. Why is this important? Can't we just drop sarcasm? Well, if the exercise is to come across as human, then we need the full range of expression; otherwise we get Data from Star Trek. Merely humanish. So either the AI director has to learn to imitate sarcasm and all the other subtleties of communication itself, or it must brief the Echo on so much information that any important conversation becomes borderline scripted.

So, to save time, the computer puts some trust in its Echo. The Echo may be selected for a purpose: perhaps they are very gregarious, for social situations, or particularly threatening, for when the computer needs to make a point. Either way, rather than dictating the entire speech, the computer gives directives and occasionally takes direct control when it needs to relay a fact or give direct orders. Over time the Echo learns to interpret the will of the AI director with greater accuracy, improving their performance. Eventually the two may become inseparable, as people grow used to speaking to that face and hearing the answers. This partnership is perhaps the best solution, a balance of autonomy and function, but what if there was a better way?

What if we and the AI exchanged nigh on everything? We get an overview of the data the AI is working with and the logic it is applying, and in return it experiences the world through our senses, feeling our pleasure, pain, sadness and joy. We generate data on other people at a startling rate: we pick up expressions that last milliseconds and hear slight changes in tone and pacing that can shift a sentence from one meaning to another. In return, we can learn how to present and handle data. Our ability to predict how other humans will react is going to be superior to a computer's understanding.
