Directors of television programs, commercials, and similar productions usually convey their intentions to actors and production staff using storyboards. However, because storyboards show only momentary images of scenes, it is difficult to convey a director's intentions precisely, and directors therefore spend considerable time communicating them. To address this problem, this study develops a system that automatically generates animated storyboards, i.e., moving images. The system, called the "Virtual Studio System", analyzes a scenario written by a director in natural language and automatically creates the corresponding moving images. Because the input is natural language, the resulting moving images can be changed simply by editing the scenario. In addition, a method for generating the facial expressions of characters in the virtual studio has been developed. With this system, anyone can easily create and edit animated storyboards that represent a scenario.
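The abstract does not give implementation details, but the described pipeline (natural-language scenario in, scene directives and facial expressions out) can be pictured with a minimal, purely illustrative sketch. This is not the authors' method; the names SceneDirective, EMOTION_KEYWORDS, and parse_line, and the keyword-based "analysis", are hypothetical simplifications.

# Hypothetical illustration only -- not the paper's implementation.
# Maps one scenario sentence to a scene directive plus a facial-expression label.
from dataclasses import dataclass

# Assumed keyword table; a real system would use proper natural-language analysis.
EMOTION_KEYWORDS = {
    "smiling": "joy",
    "frowning": "anger",
    "crying": "sadness",
    "surprised": "surprise",
}

@dataclass
class SceneDirective:
    character: str   # who acts in the shot
    action: str      # what the character does
    emotion: str     # label handed to the facial-expression module

def parse_line(line: str) -> SceneDirective:
    # Very rough parse: first word = character, second word = action,
    # and any known emotion keyword sets the facial expression.
    words = line.rstrip(".").split()
    emotion = "neutral"
    for w in words:
        if w.strip(",").lower() in EMOTION_KEYWORDS:
            emotion = EMOTION_KEYWORDS[w.strip(",").lower()]
    return SceneDirective(character=words[0], action=words[1], emotion=emotion)

if __name__ == "__main__":
    d = parse_line("Alice walks to the door, smiling.")
    print(d)  # SceneDirective(character='Alice', action='walks', emotion='joy')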
ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference
September 4–7, 2007
Las Vegas, Nevada, USA
Conference Sponsors:
- Design Engineering Division and Computers and Information in Engineering Division
ISBN:
0-7918-4803-5
PROCEEDINGS PAPER
Virtual Studio System and Facial Emotion-Expression
Hideki Aoyama
Keio University, Yokohama, Japan
Ryo Haginoya
NTT Data Corporation, Japan
Umezawa
Ricoh Company, Ltd., Japan
Paper No:
DETC2007-35023, pp. 507-516; 10 pages
Published Online:
May 20, 2009
Citation
Aoyama, H, Haginoya, R, & Umezawa, "Virtual Studio System and Facial Emotion-Expression." Proceedings of the ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. Volume 2: 27th Computers and Information in Engineering Conference, Parts A and B. Las Vegas, Nevada, USA. September 4–7, 2007. pp. 507-516. ASME. https://doi.org/10.1115/DETC2007-35023