Computer Dreams 2 (1988)

First facial motion capture, first polygonal motion capture

Computer Dreams
Original cover art (VHS)


Mike the talking head (on set)

Reviews for "Mike the talking head"
1) Robertson B. (1988) Mike the talking head. Computer Graphics World 11(7)
Mike the talking head is a step towards animators being able to directly control their characters rather than
drawing their actions. Silicon Graphics and deGraf-Wahrman Inc are working together to produce a new type
of animation tool to allow animators to work with their characters in the same manner as
puppeteers work with puppets.
The two companies hope to produce a real time, full rendering system with the ability to take input from
different sources. The input will be able to change the expression of the character as well as its colour and
the materials it is made of. The image will be able to be scaled, rotated and distorted, and to
mouth words.
To create the original face to work with, a real person, Mike Gribble, was used as a model. His face was
scanned in using a 3D digitizer to get about 256,000 points of digital data. These points are converted to
polygon data, which makes
shading of the image possible. To give accurate data without redundancy, the polygons were smaller in areas
which required greater detail and larger in the flatter areas, like the cheeks.
The talking component of Mike was achieved by scanning in the real Mike as he mouthed each phoneme.
Phonemes are the subparts of words used in pronunciation. To simulate speech, the implementors developed
code to interpolate between phoneme positions. Possible input devices include data gloves and speech
recognition systems. The glove could be used in a similar manner to a puppeteer's hand inside a puppet. The
speech recognition system could have Mike mouthing the words as a person speaks into a microphone.
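The phoneme-interpolation idea described above can be sketched simply: each phoneme is stored as a target set of vertex positions (as with the scanned mouth shapes), and the in-between frames are produced by blending consecutive targets. This is a minimal illustrative sketch, not the original system's code; the phoneme shapes are toy stand-ins for real scan data.

```python
# Sketch of phoneme interpolation for a talking head: each phoneme is a
# stored "target" list of vertex positions; intermediate frames are
# produced by linear interpolation between consecutive targets.

def lerp(a, b, t):
    """Linearly interpolate between two vertex lists at parameter t in [0, 1]."""
    return [(ax + (bx - ax) * t, ay + (by - ay) * t)
            for (ax, ay), (bx, by) in zip(a, b)]

def animate(targets, sequence, steps):
    """Yield one vertex list per frame, blending through the phoneme sequence."""
    for cur, nxt in zip(sequence, sequence[1:]):
        for f in range(steps):
            yield lerp(targets[cur], targets[nxt], f / steps)
    yield targets[sequence[-1]]  # finish exactly on the last phoneme pose

# Two toy mouth shapes (three 2D lip vertices each) -- invented data:
targets = {
    "MM": [(0.0, 0.0), (1.0, -0.1), (2.0, 0.0)],  # lips closed
    "AA": [(0.0, 0.0), (1.0, -0.5), (2.0, 0.0)],  # mouth open
}
frames = list(animate(targets, ["MM", "AA", "MM"], 4))
print(len(frames))   # 9 frames: two 4-step transitions plus the final pose
print(frames[2][1])  # middle lip vertex halfway open, y is about -0.3
```

A real system would key the transitions to phoneme timing from a script or a speech recogniser, but the core blend step is the same.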
Michael Wahrman, of deGraf-Wahrman, hopes that this sort of research will help to reduce the cost and
complexity of animation, and thus increase the commercial use of character animation.


2) Porter S. (1990) Made for the stage: synthetic actors are getting better. Computer Graphics World 13(8)
Stephen Porter contacted the leaders in the field of computer animation and asked them what they think
the future of their area holds. He cites Robocop 2 as an example of the 3D animated characters which are
infiltrating the film and broadcast industry. Other examples of recent animated human characters are the
people behind the X-ray screen in Total Recall, the pseudopod in The Abyss and numerous commercials on
TV. Mike the talking head and Dozo, two well known animated characters, even have their own agents. A lot
of research has been done since Tony de Peltrie debuted at SIGGRAPH in 1985.
As successful as these characters have been, it is unlikely that they will ever take the place of human actors.
The main reason for this is that humans are too perceptive to be fooled into believing that computer generated
characters are real. Fred Parke states that the recognition of faces and the understanding of expressions is
something we learn so early in life that we are all expert critics of how realistic an animated face is: "The
judgement criteria that people apply to computer-generated faces are very stringent." The closer we get to a
realistic simulation, the more critical we become of how good it is.
3D animation is rapidly becoming an industry, much like 2D animation did when Bugs Bunny and Mickey
Mouse were born. People are looking for new forms of entertainment. 3D characters have a fresh, new look
and have captured the imagination of the general public. Research in the area is going on at an incredible
rate. Nearly 80 percent of the research into facial animation has been done in the last two years.
The evil Robocop only had an animated face to display its humanity. The people at deGraf/Wahrman did an
impressive job of bringing Robocop's face to life using parametric animation techniques. These methods
of animation let the animator control the facial expressions by changing the values of preset parameters that
are tied in to different facial movements.
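Parametric animation in this sense means the animator adjusts a handful of named parameters, each tied to a fixed set of vertex displacements on the face. The sketch below illustrates that idea under stated assumptions; the parameter names, offsets and tiny mesh are invented for illustration, not taken from deGraf/Wahrman's system.

```python
# Sketch of parametric facial animation: each named parameter owns a set
# of per-vertex displacements that are scaled by the parameter's value
# (0..1) and added to a neutral mesh. All data here is illustrative.

NEUTRAL = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]  # toy 3-vertex "face"

# parameter name -> {vertex index: (dx, dy) displacement at full strength}
PARAMS = {
    "jaw_open": {1: (0.0, -1.0)},
    "smile":    {0: (-0.2, 0.3), 2: (0.2, 0.3)},
}

def pose(values):
    """Apply parameter values in [0, 1] to the neutral mesh."""
    mesh = [list(v) for v in NEUTRAL]
    for name, weight in values.items():
        for idx, (dx, dy) in PARAMS[name].items():
            mesh[idx][0] += dx * weight
            mesh[idx][1] += dy * weight
    return [tuple(v) for v in mesh]

face = pose({"jaw_open": 0.5, "smile": 1.0})
print(face)  # centre vertex drops by 0.5; corners pulled out and up by the smile
```

The appeal of the approach is exactly what the article describes: the animator never touches individual vertices, only the small set of meaningful controls.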
Mike the talking head was a precursor of Robocop. Mike's creators didn't use as many points/polygons for
Mike as they did for Robocop, because Mike required a fast refresh rate for real-time animation. There are
many people working towards improving the various techniques used to animate humans; Parke, the
Thalmanns and Norm Badler to name a few. Not all of this work is purely for entertainment. There are
possibilities of using these physically based models in reconstructive surgery, psychology and for teaching
lip-reading to the deaf. When it all comes together, the quest for realism in character animation is but a test
of how good the animator is at using the tools at their disposal.


1) From compilation "Computer Dreams", including film "Mike the talking head"

2) Made by deGraf and Wahrman, and Robert Abel

3) First CGI motion capture was in The Stick Man (1967).

4) The second video of this compilation shows the first polygonal digital puppet (that is how people referred to instant motion capture then). Later that effect was used in Robocop 2 (1990).