Monday, November 23, 2009

PERCEPTRON

YEAR: 2067 A.D.
PLACE: ROBOTICS DIVISION Laboratory,
IW RESEARCH INSTITUTE,
CHENNAI
INDIA

There I was, sitting beside a desktop PC in the laboratory room and waiting. It had been 00:08 min since my wait sequence started. Oh! I forgot to mention that I was waiting for Marvin-I. He, or it, was the first product of the perceptobot project, and I was waiting to begin the testing.

Before I narrate the rest of the events, a brief introduction to what this testing of the perceptobot is all about. I am Sembai Xavier, age 40, PhD in Robotics, a research scientist. Robotics is the happening field in research, as the yields are high owing to the automation it delivers. Here in our robotics division, we have developed the concept of a robotic brain using artificial neural networks. Marvin-I is named after Marvin Minsky, one of the pioneers of this field. The capabilities of the networks developed here in our research laboratory are almost close to those of the human brain (or so we think). Marvin-I is one such product of ours. It is the first of its kind, as it is the first robot with a brain of its own: an artificial neural network brain.

When the design of this robot began, there was a huge discussion over many points, out of which I want to mention two:
(1) The primary purpose of the robot
(2) The evaluation/training procedure of the robot
There were many candidates for the primary purpose of the robot, such as research assistant, remote sensing data analyst, etc. Finally, it was decided that the robot would be a patient data collector, which would gather data from the patient monitoring devices used in hospitals and generate appropriate reports for each patient. This was chosen mainly so that the robot could analyze the monitored data and suggest possibilities for diagnosis. This would reduce human error in such activities, so that the patient would receive the correct treatment at the right time. Coming to the second point, the evaluation/training procedure of the robot: as the brain of the robot is made of perceptrons (an artificial neural network), hence the name perceptobot, it requires initial training to shape the perceptrons for the desired task. After this training, the robot is subjected to a series of acceptance tests, one conducted by a design engineer from each team, i.e. one engineer from the software team, one from the electrical team, one from the mechanical team, and so on.
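As an aside, for readers who have not met the term: the “perceptron” the story keeps returning to is the simplest trainable neural unit, and its classic learning rule fits in a few lines. The sketch below is purely illustrative and is not part of the story; the AND-gate task, learning rate and epoch count are my own assumptions.

```python
# A minimal perceptron learning-rule sketch (illustrative only).
# The AND-gate data, learning rate and epoch count are assumptions.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    w = [0.0] * len(samples[0])   # one weight per input
    b = 0.0                       # bias
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            # Step activation: fire (1) if the weighted sum exceeds zero
            output = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = target - output
            # Classic perceptron update: nudge weights toward the target
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Toy example: learning the AND gate
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
print(train_perceptron(samples, labels))
```

The “initial training to shape the perceptrons” mentioned above is this same idea of adjusting weights from examples, only scaled up far beyond a single unit.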

And here I was, waiting on behalf of the electrical team to conduct the acceptance test of the first perceptobot. The perceptobot was brought in on an automated trolley operated by a laboratory assistant; even though the perceptobot was capable of walking, the mechanical team was hesitant to allow it to walk more than 1000 meters. The perceptobot was designed to look like a human, with arms, legs and, most importantly, a head with eyes and a mouth, where a camera and a speaker were fitted respectively. I asked the laboratory assistant to leave the lab and prepared for my first encounter with the liveliest of my products (the feeling of possessiveness increased the more I looked at the perceptobot).

I cleared my throat and looked once again at the test procedure in my hand to reconfirm the sequence of tests that the electrical team had prepared for this occasion.
I began “Hello Marvin”.
There was a response after 1 sec (approx.): “Hello doctor” – the LED near the mouth was blinking. A thought occurred to me: ‘Would a female voice be more appropriate, considering the primary objective of this perceptobot?’
“Do you know me?” said I.
“Yes. I was told that I was to be tested by you on behalf of the electrical team.”
I was beginning to relax. “Good. Before we get started, let me do some sanity checks.”
“I cannot understand. You want to check whether I am sane or insane?” asked the perceptobot.
“No.” I explained, “I want to check some of the electrical circuits in you. ‘Sanity check’ is a colloquial term for checking basic functionality.” And I made an entry in the notes section that an explanation of ‘sanity check’ was given to the perceptobot.
I continued “What is your current body temperature?”
“Do you want the temperature in Celsius or Fahrenheit?” asked the robot.
“Celsius”, said I.
“51.2658 degrees Celsius” came back the reply.
I wondered whether mentioning 4 decimal places was necessary. I decided not to note it and to observe further. I looked at what was next in the test procedure. First on the agenda was pattern recognition. I had loads of ECG data from various patients, and the perceptobot was to observe each patient’s data and bring out the similarities among them.
“Marvin, I am going to show you data from a list of various patients. You need to analyze the data, point out the similarities in them and, if possible, suggest any additional monitoring that needs to be done. Am I clear?” said I.
“Yes doctor” said the perceptobot.
I showed him the data. Each data report had a dummy picture of a patient. Two reports contained the same photo as part of a sub-test. The perceptobot was quick to identify it: “Excuse me doctor, there might be a confusion here. The photo in form #374 is similar to the photo in form #289.”
I looked at the photo and said, “It must have been a mistake. Ignore the photo. Can you imagine an imaginary face for this form?” while making a note in the notes section.

The perceptobot said, “An imaginary face? I can only imagine the faces that I have met so far, starting from the training engineers up to you. Can I pick any one face at random?”

I wondered what the probability was that it would choose my face. I said “Yes” and proceeded to the next exercise, the emergency drill. Before that I asked for its body temperature and noted it down. If the temperature was OK, then the circuits were OK. There were also indicators on the perceptobot’s body to flag any malfunctioning circuit. So far so good, I thought.

I started the next exercise: “Have a look around the room once and give me a report of the various emergency exits the room has.” The purpose was to check the emergency-response training. The perceptobot started to walk around to locate the emergency exits, and I kept watching it.

At this point I had an uneasy feeling that something was not right. I remembered the perceptobot saying, “Can I pick any one face at random?” This was not the expected answer: the perceptobot was not trained to make random choices; it should have given me the options and let me choose. I made a note and underlined it. The perceptobot returned and produced its report, explaining the exits in the room, and I wondered what class of medical device this would fall under, if it could be called a medical device at all. The testing continued.

Three hours later I was at the program director’s office. He offered me a seat, and it was clear that he was surprised at my sudden visit. I spoke first: “Sir, this is about the perceptobot Marvin-I.” His eyebrows went up. “Yes,” he said. “Sir, I don’t want to recommend the perceptobot for the field,” said I. He was taken aback. “And the reason?” he asked.
I handed over my notes to him. He read them very carefully and looked at me, still puzzled.
I said, “There is an underlined section in the report where the perceptobot showed signs above expectation.”
“Was this the reason for your recommendation?” the director asked.
“No, sir. During the test procedure, I permitted the perceptobot to interrupt me and ask doubts, if any. It started asking basic doubts like the purpose of trip sheets, self-temperature monitoring, etc. The last doubt that the perceptobot asked was:
“If a patient dies, would I be blamed?”