
Title page for etd-0813110-130449


URN: etd-0813110-130449
Statistics: This thesis has been viewed 2645 times and downloaded 680 times.
Author: An-ting Chan
Author's Email Address: not public
Department: Industrial Design
Year: 2009    Semester: 2
Degree: Master    Type of Document: Master's Thesis
Language: Chinese (zh-TW, Big5)    Page Count: 148
Title: A Study of the Animation Character Design for Emotion Communications on Social Network Platforms
Keywords
  • Online community
  • Facial expression factor
  • 2D character animation
  • Emotion communication
Abstract
    Nowadays, the online community has become a part of everyday life. At the same time, this type of human interaction, namely interaction through the Internet, has become increasingly popular and diversified. Some researchers (Chew et al., 1988) found that verbal language supports only 35% of face-to-face communication, while the remaining 65% depends on body language, or non-verbal communication. Their study emphasized the essential role that non-verbal language plays in interpersonal communication.
    Communication in the virtual world, on the other hand, requires a variety of additional tools to express the emotions conveyed in non-verbal messages; commonly used tools include emoticons, punctuation marks, animations, and avatars. However, there is no standard or rule for emotion communication in the online community. This study investigated the problems and inadequacies of using these tools to express emotions in the online community, and it provided suggestions for the design of animation characters for emotion communication on social network platforms.
    The first investigation examined users' experiences with internet communities and instant messaging software. The results showed that most users used communication tools while in a positive emotional state and preferred pictures and animations for communicating. The tools were used most frequently among acquaintances and for social purposes. The results also indicated that most users had trouble expressing and receiving the intended feeling through these tools; misinterpretation of emotions often resulted in misunderstandings. This may be due to the lack of an important communication factor: context.
    Therefore, in the second stage of the study, animations featuring two characters were created for comparison. The animations combined two design factors: the facial expression components that changed (A: brows/forehead with B: eyes/eyelids, versus B: eyes/eyelids with C: cheeks/mouth) and the distance between the characters (near or far). Among the basic emotions, four were chosen for investigation: Joy, Sorrow, Love, and Surprise. Four animations were designed for each of the selected emotions. Twenty observers were divided into two groups, senders and receivers; they watched each animation and then filled out a questionnaire.
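    To make the factorial structure above concrete, the following sketch (in Python, with illustrative names not taken from the thesis) enumerates the 2 (facial expression component set) x 2 (character distance) x 4 (emotion) conditions, which yields the four animations per emotion mentioned in the abstract.

        # Illustrative sketch only: enumerate the animation conditions implied
        # by the 2 x 2 x 4 factorial design described in the abstract.
        from itertools import product

        EXPRESSION_SETS = {
            "A+B": "brows/forehead and eyes/eyelids",
            "B+C": "eyes/eyelids and cheeks/mouth",
        }
        DISTANCES = ("near", "far")
        EMOTIONS = ("Joy", "Sorrow", "Love", "Surprise")

        conditions = [
            {"emotion": e, "expression_set": x, "distance": d}
            for e, x, d in product(EMOTIONS, EXPRESSION_SETS, DISTANCES)
        ]

        # 4 emotions x 2 expression sets x 2 distances = 16 animations,
        # i.e. four animations per emotion, as stated above.
        assert len(conditions) == 16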
    Statistical analyses revealed that the facial expression factor had a significant effect on conveying the Love and Surprise emotions, while the distance factor had a significant effect on conveying the Joy, Sorrow, and Love emotions. Changing the eyes/eyelids and cheeks/mouth in combination with a near distance was suggested as the most appropriate combination for conveying Joy, Sorrow, and Love, and changing the eyes/eyelids and cheeks/mouth was also recommended for conveying Surprise. In addition, the facial expression factor had a significant effect only on the reception of the Sorrow emotion, and the distance factor had no significant effect on the reception of any of the four emotions. Combining the eyes/eyelids and cheeks/mouth expression with a far distance resulted in the lowest error score in conveying Sorrow.
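    The abstract does not name the statistical test used, but for a two-factor design like this one a two-way ANOVA on the observers' scores is a natural reading. The sketch below is a hypothetical illustration of such an analysis; the column names (emotion, expression_set, distance, score) and the data file are assumptions, not taken from the thesis.

        # Hypothetical analysis sketch: a two-way ANOVA per emotion, assuming a
        # long-format table with columns emotion, expression_set, distance, score.
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        responses = pd.read_csv("responses.csv")  # assumed data file

        for emotion, subset in responses.groupby("emotion"):
            # Test the main effects of the two design factors and their interaction.
            model = smf.ols("score ~ C(expression_set) * C(distance)", data=subset).fit()
            print(emotion)
            print(anova_lm(model, typ=2))  # Type II sums of squares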
    Advisor Committee
  • Li-Chieh Chen - advisor
  • Ming-Chih Huang - co-chair
  • Wen-Yuan Lee - co-chair
Files: access worldwide
Date of Defense: 2010-07-15
Date of Submission: 2010-08-15

