In recent years, emotion analysis has become a popular research topic. Facial expression plays an important role in emotion analysis because it is both immediate and highly variable. Most traditional expression classification systems track facial component regions such as the eyes, eyebrows, and mouth. Although these prominent features provide the main cues for recognition, finer changes in the facial muscles can also be used to perceive variations in expression. This paper therefore combines facial components with dynamic facial textures such as frown lines, nose wrinkles, and nasolabial folds to classify facial expressions. First, we integrate AdaBoost and the Active Shape Model (ASM) to detect the face accurately. We then use the facial feature points located by ASM to extract the important facial feature regions. Gabor filters and Laplacian of Gaussian edge detection are applied to extract texture features from these regions; the resulting feature vectors capture how facial texture changes from one expression to another. Finally, a Support Vector Machine classifies the six expression types: neutral, happiness, surprise, anger, disgust, and fear. On the Cohn-Kanade database, the proposed method achieves an average recognition rate of 91.7%. In addition, in a real-time test with five subjects, the recognition rate reaches 93%.
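The Gabor-filter texture step mentioned above can be illustrated with a minimal sketch. The kernel sizes, wavelengths, and the mean-magnitude pooling below are illustrative assumptions, not the paper's actual parameters; the bank of four orientations is likewise hypothetical.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lambd, gamma=0.5, psi=0.0):
    """Real part of a Gabor kernel (illustrative parameter choices)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by the filter orientation theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) \
        * np.cos(2 * np.pi * xr / lambd + psi)

def gabor_features(patch, orientations=4):
    """Filter a grayscale patch with a small Gabor bank and pool
    the mean absolute response per orientation into a feature vector."""
    feats = []
    for k in range(orientations):
        kern = gabor_kernel(9, sigma=2.0, theta=k * np.pi / orientations,
                            lambd=4.0)
        # Circular convolution via FFT, kept simple for the sketch
        resp = np.real(np.fft.ifft2(np.fft.fft2(patch)
                                    * np.fft.fft2(kern, patch.shape)))
        feats.append(np.mean(np.abs(resp)))
    return np.array(feats)

# Stand-in for a feature region (e.g. a nasolabial-fold patch) cut
# out around the ASM landmarks; here just synthetic noise.
patch = np.random.default_rng(0).random((32, 32))
features = gabor_features(patch)
print(features.shape)
```

In a full pipeline, vectors like `features` from each landmark-defined region would be concatenated (together with the Laplacian of Gaussian edge responses) and passed to the SVM classifier.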