Monday 13th March
A pre-arranged meeting took place between Professor Fairclough and myself to discuss in more detail affective computing and its integration into digital avatars/agents. The meeting itself concerned a related project; however, its contents are relevant to the collaboration research.
Professor Fairclough works in a subset of affective computing called physiological computing. Physiological computing can change the way we interact with computers by creating new models of interaction using biocybernetic adaptation, which provides the computer with a digital representation of the physiological state of the user.
From this, hybrid systems with multiple functions could be created, such as a brain-computer interface combined with biocybernetic adaptation.
Emotion is a key ingredient of intelligent perception; it tells us as humans what to pay attention to and what to ignore.
Physiological computing looks at the hidden aspects of emotional behaviour. These behaviours are monitored using skin conductance, heart rate fluctuations, muscle tension, pupil dilation and facial muscle activity. These changes in the body consistently relate to emotions.
This has driven a major trend in wearables. With wearable devices, we can collect biometric data on an individual, for instance pulse, electrodermal activity and motion. This data could be used by an artificial intelligence to monitor the person's health, or to engage the individual if it perceives varying emotional states.
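The idea above can be sketched in a few lines of code. This is purely illustrative and not from the meeting: the signal names, sample format and thresholds are hypothetical placeholders for how an AI might screen wearable readings for elevated arousal.

```python
# Hypothetical sketch: flag wearable samples whose heart rate (bpm) and
# electrodermal activity (microsiemens) both exceed resting thresholds.
# The threshold values here are illustrative assumptions, not clinical ones.

def flag_elevated_arousal(samples, hr_threshold=100, eda_threshold=8.0):
    """Return indices of samples suggesting an elevated arousal state."""
    flagged = []
    for i, sample in enumerate(samples):
        if sample["hr"] > hr_threshold and sample["eda"] > eda_threshold:
            flagged.append(i)
    return flagged

readings = [
    {"hr": 72, "eda": 2.1},   # resting
    {"hr": 110, "eda": 9.4},  # possible stress response
    {"hr": 98, "eda": 8.5},   # elevated EDA only
]
print(flag_elevated_arousal(readings))  # → [1]
```

A real system would of course use continuous signal processing rather than fixed thresholds; this only shows the shape of the idea.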
We discussed the ethics and morality of giving a computer or robot human traits and personalities. The storage of people's physiological states as data is another ethical issue that has not been actively researched.
To create an intelligent interface or robot that can recognise emotions, we have to give it human characteristics. Are we ready as a species to give this level of sophistication to artificial intelligence? One that can read our emotions through hidden aspects of behaviour, physiological states that we ourselves cannot pick up from another human, for instance a rapid heartbeat. We as a species are on the cusp of another industrial revolution, and more research needs to be conducted into the impact that artificial intelligence will have on society.
Professor Fairclough believes the situation has to be investigated in a multidisciplinary manner, drawing on psychology, neuroscience, anthropology and cognitive science.
With varying theories of emotion, the question arises whether the computer is perceiving emotions or just physiological states. We can create realistic digital faces at this moment in time using computer graphics. It is when these faces move and emote that the believability can break down, leading to the uncanny valley. To sidestep the competing theories of emotion, computer scientists use the Facial Action Coding System (FACS). FACS is used because it is based on anatomical muscle movements of the face. Within the system, action units (AUs) relate to certain muscle movements. Combining these action units can create facial expressions on the digital face.
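The AU-combination idea can be sketched as a small lookup. The AU numbers and the pairings for happiness and surprise follow the commonly cited FACS descriptions; the code structure itself is my own illustrative assumption, not a Face lab tool.

```python
# Minimal sketch of combining FACS action units (AUs) into expressions.
# AU numbers and names follow the standard FACS descriptions.

ACTION_UNITS = {
    1: "inner brow raiser",
    2: "outer brow raiser",
    5: "upper lid raiser",
    6: "cheek raiser",
    12: "lip corner puller",
    26: "jaw drop",
}

EXPRESSIONS = {
    "happiness": [6, 12],       # Duchenne smile: cheek raiser + lip corner puller
    "surprise": [1, 2, 5, 26],  # raised brows and lids, dropped jaw
}

def describe(expression):
    """List the muscle movements that compose an expression."""
    return [ACTION_UNITS[au] for au in EXPRESSIONS[expression]]

print(describe("happiness"))  # → ['cheek raiser', 'lip corner puller']
```

In a facial animation pipeline, each AU would typically drive one or more blendshapes rather than a text label, but the anatomical grounding is the same.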
This talk with Professor Fairclough highlighted the flaws that are still present in emotional theory, and the research that must be carried out in future.
Tuesday 14th March, 10am
A meeting was arranged to talk to Mike O’Shaughnessy, mainly to discuss exhibition and publication when disseminating the collaboration.
Mike opened a dialogue about my reflections on the project. It was important that I take ownership of my exhibition space, speaking to the audience I am aiming at.
As the research is still ongoing as a research agenda for Face lab, it is important to present a snapshot of where I am at present, showing the public side of research at Face lab.
A key point is to look at how my publication sits in the identity of the collaborative research, creating a synthesis between the theory and the practice that has taken place. This means shifting my methodology from doing, while creating the project, to showing, in order to disseminate the project findings so far.
I asked Mike about the practical application of publications. Mike discussed the economy of A4 for books, saying it is best to be hands-on before entering InDesign: getting a feel for the information I have available and how I can construct it in a pleasing manner. This allows me to get a feel for the tactile nature of publication. A key point Mike made was to keep the text at around 8 points for print; this negates the zoom of the text when it is printed at a larger format.
I also received a practical tutorial on large-scale printing with roller printers. These printers allow for higher colour accuracy and quality than standard printers.
Tuesday 14th March 3pm
I attended a meeting with Chris from Face lab to discuss both of our work to date. Chris has been working on version two of the anatomically correct eyeballs. He has coded a script to allow the eyes to be inserted into any scene, with options to change the colour of the iris and to animate pupil dilation and contraction.
It was agreed that I would research rigging techniques for the eye area. This could potentially add to the blendshapes by adding micro movements to the soft tissue of the eye region. This was added to the Google Drive weekly timeline document. Sub-tasks were also added, time permitting, as the most important aspect is the deformation of the eye region. The sub-tasks include looking at the skin shader of the facemask, the liquid around the eye, and texturing.
Friday 17th March
A brief talk took place. Time was short as both Chris and I had prior engagements. Chris has finished version two of the eyes and will upload it to the Google Drive folder. This will allow me to place the new eyes correctly and start adding more blendshapes. The eye blink blendshape is at a working stage but generally needs refinement. Next week I will continue refining it, adding more blendshapes as I go.
Eye blink in ZBrush.
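For reference, the underlying mechanics of a blendshape like the eye blink can be sketched in a few lines. This is not the Face lab pipeline itself, just a generic sketch under the standard assumption that each shape stores per-vertex offsets from the neutral mesh, scaled by a weight in [0, 1].

```python
# Generic blendshape sketch: deformed = neutral + sum(weight * offsets).
# The two-vertex "eyelid" mesh below is a toy example, not real scan data.

def apply_blendshapes(neutral, shapes, weights):
    """neutral: list of (x, y, z) vertices; shapes: name -> per-vertex
    offsets; weights: name -> blend weight. Returns deformed vertices."""
    result = [list(v) for v in neutral]
    for name, deltas in shapes.items():
        w = weights.get(name, 0.0)
        for i, (dx, dy, dz) in enumerate(deltas):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

neutral = [(0.0, 1.0, 0.0), (0.0, -1.0, 0.0)]            # upper / lower lid
shapes = {"blink": [(0.0, -1.0, 0.0), (0.0, 1.0, 0.0)]}  # lids close together
half_blink = apply_blendshapes(neutral, shapes, {"blink": 0.5})
print(half_blink)  # → [(0.0, 0.5, 0.0), (0.0, -0.5, 0.0)]
```

The micro movements discussed with Chris would simply be additional shapes in the same dictionary, each with small, localised offsets around the eye region.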
Chris also gave me some ideas about running a blog through a coding platform. I believe this will ease my workload, as I can create an easy-to-read, stylish blog using a predefined blog platform.
I also spoke to Mark about problems I have encountered with the scan texture not being read correctly by ZBrush. Mark was unsure what may be causing the problem but will look into it further; he can do this as the file is still saved on his Face lab computer.
The texture is shown incorrectly in ZBrush.
Unfortunately, I was admitted to hospital as an emergency during the week, so I was not able to do as much on the project as I had hoped. Despite this, I should be able to catch up over the next week.