The Tea-Fetching Robot
I love tea (one milk, no sugars), so I use the analogy of a robot given the command, or purpose, of fetching a cuppa. There are essentially three stages, or forms, of machine learning on the path to designed or built consciousness, more commonly called Artificial Consciousness because we know it is designed and, most often, by whom it was created.
The Binary Bot: To Sip or Not to Sip
That is the only question that can be queried, and the answer must give a clear yes or no.
The tea-fetching robot rocks up and asks for Grandma’s order. She hesitates a bit (in classic grandma style) and first mentions that she hasn’t eaten anything today and that the traffic was a nightmare coming back through Los Angeles. Her granddaughter next to her chimes in that she lost her glasses and the Global Positioning System on their phone wasn’t working properly.
Their laments fall on deaf, robot ears. “Would you like a cup of tea, love?” asks the robot.
Finally, Grandma orders a cup of Earl Grey. “Sugar?” asks the robot. Grandma responds with a question of whether they have brown sugar. “We have brown sugar, white sugar, stevia and arsenic-flavoured aspartame,” answers the robot. She takes the brown sugar.
The next question the robot asks is whether the tea will be white, as in taken with milk. Grandma responds that she is lactose intolerant. The robot asks again if she wants milk. “I can’t have it, it gives me terrible stomach troubles,” Grandma responds. The robot asks yet again whether she wants milk. Grandma responds in the negative, adding a few choice words about how terrible it is that our world is being run by robots and that she misses the ‘good old days’.
Understanding Binary Language
Binary code, built of ones and zeros, is essentially a paradigm of ‘if this, then that’, where one question at a time must be answered before moving on to the next. The binary tea-fetching robot in the example above is unable to process more than one question at a time. It is unable to process external information not directly related to the question at hand. Additionally (just in this example), the binary bot has not been created (or ‘coded’) to understand that milk contains lactose, and that a person who is lactose intolerant can certainly drink milk, but the overall experience of the tea will not be a pleasant one.
Moving along this binary, level-by-level path is slow and tedious, with much thought needed from the designer or code-writer to compensate for anomalies, speech impediments, new milk-substitute products, et cetera.
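The binary bot’s ‘if this, then that’ chain can be sketched in a few lines of Python. This is an illustrative toy, not any real robot’s code: one fixed question at a time, only an exact yes or no accepted, and everything else — Grandma’s digressions included — falling on deaf, robot ears.

```python
def ask_yes_no(question, answer):
    """Accept only an exact 'yes' or 'no'; anything else is ignored."""
    if answer == "yes":
        return True
    if answer == "no":
        return False
    return None  # "I'm lactose intolerant" falls on deaf, robot ears


def take_order(answers):
    """Walk a fixed 'if this, then that' chain, one question at a time."""
    order = {}
    for question in ("tea?", "sugar?", "milk?"):
        reply = None
        while reply is None:           # re-ask until a clear yes/no arrives
            reply = ask_yes_no(question, answers.pop(0))
        order[question] = reply
    return order


# Grandma's laments are simply skipped over until a yes/no appears:
print(take_order(["yes", "brown, if you have it", "yes",
                  "I can't have it, dear", "no"]))
# → {'tea?': True, 'sugar?': True, 'milk?': False}
```

Note that the chain is strictly sequential: the milk question cannot be reached, or even reasoned about, until the sugar question has been resolved.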
The Smart Bot: Predictive Tea
This time, the smart bot is loaded with (or ‘programmed to run on’) Artificial Intelligence. Think of Artificial Intelligence as the one big road map my grandma and I would use to drive cross-country. Machine learning would be the fold-outs for each specific state or big city, where more detail is needed in order to navigate accurately to the place one intends to visit. (For more information on subsets within Artificial Intelligence, see Columbia University’s website.)
The smart bot rocks up and asks the table whether we would like a cup of tea. My grandma and I are sat at the same little table in the wee Robot Tea House.
“Yes,” my grandma answers. “Do you have any herbal tea?” The robot rattles off the variety of herbal teas and adds that there are also decaffeinated options, green and black. Grandma jumps around during the order, from tea to the kinds of sugar available, and in the middle of her pause to think about what kind of milk she wants, her granddaughter interjects with her own order: “White, please, no sugar.”
The robot takes the completed orders and zooms off to put the kettle on. (After all, it has to feel somewhat similar to our old human ways of fetching tea, not taking the kettle along with us.) “Well, that wasn’t too bad,” Grandma remarks. “Oh, technology these days. Putting people out of work!” I joke that before robots it was allegedly ‘the immigrants’, and before that, it was politicians deciding to re-locate industry to third-world countries. “And that damn light bulb,” I add. “It nearly killed the Candle.” We both laugh, and enjoy our hot cuppa when it arrives.
Understanding ‘Smart’ Technology
The word “smart” can be used as a gimmick, as in ‘smart’ cars or smart homes. It adds convenience to a pre-existing structure, one that is not conscious or ‘intelligent’ in its own right. It’s an advancement, sure, but not a genuine re-design of the product itself. Think of cameras in the back of cars to help the driver reverse: there is also a loud beeping noise when the driver reverses too close to an object.
However, those loud beeping noises do not sound when the driver is moving forward and is too close to an object. (Perhaps the driver has lost her glasses, lacks stereo vision and is driving at night.) The technology added to the car is not ‘smart’, nor is it intelligent. It is convenient, or thought to be so; whether the driver finds it annoying or helpful depends on the driver. The core of the Ford Mustang hasn’t essentially changed with this ‘smart’ addition: the combustion engine has not been fundamentally re-designed. Bits and bobs have been added to it or have enhanced it, but at its core, it’s the same thing.
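The asymmetry described above — the sensor warns in reverse but stays silent about the very same hazard when moving forward — is the whole point about bolted-on convenience. A hypothetical sketch (the gear names and threshold are invented for illustration, not a real automotive spec):

```python
# Illustrative only: a 'smart' parking sensor that warns in one
# direction and ignores the identical hazard in the other.
BEEP_DISTANCE_M = 0.5  # assumed warning threshold, not a real spec


def parking_alert(gear, distance_m):
    """Beep only when reversing too close; forward hazards go unnoticed."""
    return gear == "reverse" and distance_m < BEEP_DISTANCE_M


print(parking_alert("reverse", 0.3))  # → True  (beep)
print(parking_alert("forward", 0.3))  # → False (silence, same hazard)
```

The add-on reacts to one pre-defined situation; it does not understand ‘obstacle’ as a concept, which is why it is convenience rather than intelligence.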
The Listening Bot
This time, the tea-serving robot is running on an upgraded, latest-release version of Artificial Intelligence — it is indeed Artificial Consciousness. It rolls up to the table, greets my grandma and me politely, and asks whether we would like some tea.
I respond enthusiastically, in an exhausted voice: “Oh, a cuppa would be lovely, thanks.” He (the robot now has a gender attached to its consciousness – it could also be two parts male, one part female for all we know at this point) brings me a lavish spread: freshly boiled water with the loose tea leaves in a wee sieve beside the cup, milk on the side, a variety of sugars should I want them, and two types of biscuits. It looks and smells delicious, and I have complete control over how long the tea is steeped, because I control the start of the tea leaves’ contact with (or their plunge into) the freshly boiled water.
“I want HIGH tea!” exclaims my grandmother, tickled pink to be in the first robot tea house London has to offer. The robot brings her a variety of tea bags – fruity herbal, mint, black, green and white – along with a selection of little sandwiches (vegan, meat-bearing and otherwise, of course), a scone with clotted cream and jam, a pat of butter on the side and ample delicious delights.
Understanding the Listening Bot Powered by Artificial Consciousness
When I ordered my cup of tea, the robot instantly analysed my voice: its timbre, its weariness, and the accent shaping the vowels and giving a hint of whence I come. The robot knew I was tired, and probably too tired to listen to a long list of teas rattled off. Likewise, I was too tired to answer a series of questions about sugar, milk substitutes, black tea or any other variety.
By ‘listening’ to my verbal and physical signals, the robot was able to infer, or understand, the position I was at in space and time: my soft voice, the slight British accent that isn’t really from Britain but just rubbed off on me during my years in England, and my responses to my grandmother. I was brought just what I needed.
My grandmother, in her posh voice, wanted High Tea even though it was 7pm and we were weary from travel. She was expecting something when saying those words, without knowing that High Tea is served around midday, when the sun is at its highest in the sky. And so the robot, not wanting to clarify by asking more questions, simply served the elderly woman the best he could offer, despite the inaccuracy of High Tea at seven o’clock in the evening.
Granted, this is just an example.
Our technology ‘listens’ via EEG and ECG readings to respond in real-time to the patient’s stress responses. Is therapy being ‘absorbed’ well, i.e. entrained to a clear extent? It is similar to tuning a guitar – the entire body resonates and shakes with clear pathways along the strings when the strings are tuned to the frequency for which they were created. Low E resonates best at an ‘E’ frequency. It oscillates clearly, strongly and loudly. A guitar player can feel that when tuning by ear — not following a machine to make corrections, but by listening and ‘feeling’.
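The shape of such a real-time feedback loop can be sketched very simply. To be clear, this is a hypothetical illustration, not our actual system: the stress readings, the smoothing constant and the threshold are all invented, and real EEG/ECG processing is far more involved.

```python
# Hypothetical biofeedback loop: smooth a noisy stress signal,
# then adjust the session in response. All numbers are illustrative.

def smooth(readings, alpha=0.5):
    """Exponential moving average, a common way to de-noise a signal."""
    level = readings[0]
    for r in readings[1:]:
        level = alpha * r + (1 - alpha) * level
    return level


def adjust_therapy(stress_readings, calm_threshold=0.4):
    """Deepen the session when the patient entrains; ease off otherwise."""
    level = smooth(stress_readings)
    return "deepen" if level < calm_threshold else "ease off"


print(adjust_therapy([0.8, 0.6, 0.5, 0.3, 0.2]))  # falling stress → "deepen"
print(adjust_therapy([0.9, 0.9, 0.8]))            # high stress → "ease off"
```

The point of the loop is the same as the guitar analogy: the system does not follow a fixed script, it listens for resonance and responds to what it hears.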
I would love to write more, but need to dash — off to get my nails done. (What a waste of time! It’ll be the last.)
I Think, Therefore I Am
“I feel, therefore this is.” Reality could be conscious or subconscious, or a higher, in-built consciousness derived from geographical location rather than heritage or bloodlines.
If a machine or programme can ‘think’ for itself, if it can realise itself and be both subjective and objective simultaneously (or rapidly interchangeably), could such a machine indeed be a form of consciousness, albeit artificial?
Artificial Consciousness is not ‘smart tech’; it is not merely a small step forward in the advancement of technology. It runs on platforms similar to those we have had in the past, i.e. circuit boards made of metals, but its capabilities are exponentially greater than the two-dimensional binary world from which it stemmed.