Please use this identifier to cite or link to this item:
http://theses.ncl.ac.uk/jspui/handle/10443/333
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Woolard, Adrian | - |
dc.date.accessioned | 2009-08-06T08:36:31Z | - |
dc.date.available | 2009-08-06T08:36:31Z | - |
dc.date.issued | 1994 | - |
dc.identifier.uri | http://hdl.handle.net/10443/333 | - |
dc.description | PhD Thesis | en_US |
dc.description.abstract | This thesis presents initial exploratory research into a novel technique for enhancing performance control in animatronics. An animatronic system is defined here as a three-dimensional, electro-mechanically driven facial model whose movements, controlled by a human performer, create the "illusion of life" for a viewer. The vital elements of this form of performance are the synchronisation of lip movements to an acoustic speech signal and the animation of emotive expressions. A novel optical sensing technique is proposed, based on the hypothesis that capturing distinctive articulatory or emotive movements from the performer's own face provides a more 'natural' form of control. To realise this, it is proposed that the movement of a minimal set of points at key positions on the face can produce sufficient control information to describe the overall facial action. A comprehensive investigation into human communication, including visual speech perception and non-verbal facial expression, is described in order to define the optimum set of key points, and conclusions are drawn on the primary facial actions required for successful lip synchronisation. Both the theoretical and practical aspects of realising a prototype system are described. A methodology is presented for assessing the sensing system and the overall objectives, based on the design and construction of an animatronic face, of the same dimensions as the researcher's, capable of animating the desired actions with similar displacements. Objective analysis is achieved by comparing the sensor system's measurements of the performer's key-point movements with those of the animatronic model; perceptual data are generated through visual analysis of the animated facial movement. The results and their analysis indicate that, given certain valid assumptions, the sensor system is capable of consistent facial motion detection and can provide sufficient control for the animatronic model to produce a limited set of facial actions in a realistic manner. The results also indicate the potential for improved lip synchronisation and, hence, improved "overall character" performance. | en_US |
dc.description.sponsorship | Jim Henson Creature Shop, London | en_US |
dc.language.iso | en | en_US |
dc.publisher | Newcastle University | en_US |
dc.title | Animatronics : the development of a facial action sensing system to enhance performance control | en_US |
dc.type | Thesis | en_US |
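
The abstract above describes objective analysis as a comparison between the sensor system's measurements of the performer's key-point movements and those measured on the animatronic model. As a purely illustrative aid, and not taken from the thesis, the sketch below shows one way such a comparison of displacement traces could be expressed in Python; the key-point names, array shapes, and the choice of RMS error and correlation as metrics are all assumptions for illustration only.

```python
# Illustrative sketch (assumptions only, not the thesis's method): compare
# displacement traces of facial key points measured on the performer with the
# corresponding traces measured on the animatronic model.
import numpy as np

# Hypothetical key-point names, chosen for illustration.
KEY_POINTS = ["upper_lip", "lower_lip", "left_mouth_corner", "right_mouth_corner", "brow"]


def compare_displacements(performer: np.ndarray, model: np.ndarray) -> dict:
    """Compare two displacement arrays of shape (frames, key_points).

    Returns, per key point, the RMS error and the Pearson correlation between
    the performer's measured movement and the animatronic model's response.
    """
    assert performer.shape == model.shape
    rms = np.sqrt(np.mean((performer - model) ** 2, axis=0))
    corr = np.array([
        np.corrcoef(performer[:, i], model[:, i])[0, 1]
        for i in range(performer.shape[1])
    ])
    return dict(zip(KEY_POINTS, zip(rms, corr)))


if __name__ == "__main__":
    # Simulated data: sinusoidal performer movements and a slightly
    # attenuated, noisy model response.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 200)
    performer = np.stack([np.sin(t + i) for i in range(len(KEY_POINTS))], axis=1)
    model = 0.9 * performer + 0.05 * rng.standard_normal(performer.shape)

    for point, (rms, corr) in compare_displacements(performer, model).items():
        print(f"{point}: RMS error = {rms:.3f}, correlation = {corr:.3f}")
```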
Appears in Collections: School of Electrical, Electronic and Computer Engineering
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
Woolard94.pdf | Thesis | 91.33 MB | Adobe PDF |
dspacelicence.pdf | Licence | 43.82 kB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.