Journal of Textile Engineering & Fashion Technology
eISSN: 2574-8114

Research Article | Volume 2, Issue 1

Design of interactive fashion (IF) related to emotion recognition based on detection of physiological signal data

Crease Xia,1 Li Zuwang,2 Frankie Ng1

1The Hong Kong Polytechnic University, Hong Kong
2Beijing Institute of Fashion Technology, China

Correspondence: Li Zuwang, School of Fashion Art and Engineering, Beijing Institute of Fashion Technology (BIFT), No.2 Yinghua Road, Chaoyang District, Beijing, P.R. China, Tel 8601064288183, 86013661065131

Received: February 15, 2017 | Published: June 16, 2017

Citation: Xia C, Zuwang L, Ng F. Design of Interactive Fashion (IF) related to emotion recognition based on detection of physiological signal data. J Textile Eng Fashion Technol. 2017;2(1):297-306. DOI: 10.15406/jteft.2017.02.00048


Abstract

Modern living has become increasingly interactive, and the psychology of emotion during these interactions has drawn equal attention. It is therefore timely to aspire to the creation of Interactive Fashion (IF) and to a deeper understanding of the interactions it mediates among wearers and between wearer and clothing. In this paper, an original theoretical system of IF is established to elucidate the interaction processes, interactive relations and levels peculiar to interactivity in fashion. This theoretical system guided the design and creation of IF in the later stages of this work. In practice, the emphasis is placed on IF that detects and recognizes emotional changes from measured physiological signal data and expresses them visually during interaction, demonstrated through the two prototypes created in this study, the ‘Breathing Dress’ and the ‘Heartthrob Dress’. Such understanding and recognition are of considerable value to the long-term development of emotion management and medical/physiological diagnosis.

Keywords: interactive fashion, emotion recognition, physiological signal

Introduction

Today, our life is increasingly interactive. ‘Interactivity’, the origin of this research, was first expressed in art as early as the 1960s,1 and it began to spread into various design disciplines such as installation, architecture and product design, as well as fashion and textiles, from the 1980s.2 With the development of computer science and digital technology, multimedia technologies were incorporated into artwork, giving rise to new art forms, i.e., New Media Art.3 As the main characteristic of new media art, interactivity detached from it and developed into a new art form in its own right – Interactive Art.4 The term ‘Interaction Design’ was first proposed by Bill Moggridge and Bill Verplank in the late 1980s.

The realization of interactivity is closely inter-related with electronic design: electronic design technology is the base technology for setting up interaction processes and operating the whole system of IF. When electronics are applied to clothing, the result is commonly termed wearable electronics, a category to which IF belongs. The earliest wearable electronics can be traced back to the ‘wearable computer’ that appeared in the 1960s. With the rapid development of science and technology, wearable electronics has overcome many limitations of earlier technologies and has become a hot topic in cross-disciplinary research and application. Over the past few years, many textile scientists and electronic engineers have been conducting research on various technologies of textiles and clothing related to interactivity. Besides research on functionality, a number of fashion designers have also started to design interactive intelligent clothing and smart fashion.5

Over the past few decades, there has been increasing emphasis on research on emotion. Emotion has become a branch of psychology in its own right and has its own niche in sociology, retailing, computing, design, etc. In the research area of human-computer interaction in particular, computers are being made to recognize emotion in order to meet people’s affective needs. This capability ultimately enables computers to interact with humans as naturally and vividly as humans do with each other.6 Human psychological needs eventually return to the simplest level, while computers with emotional intelligence can serve as the medium for their transformation and realization. The design of IF is likewise based on these concepts of human-computer interaction, focusing on emotion recognition to help people ultimately achieve interactive communication with one another.

In this study, systematic theoretical research is presented. It includes the establishment of an independent theoretical system of IF and the study of emotion recognition via physiological signal data as applied to IF. The experiments focus in particular on the measurement of physiological signal data for emotion recognition. Two prototypes, named ‘Breathing Dress’ and ‘Heartthrob Dress’, were designed and produced. They advance a holistic concept of IF that is both structurally innovative and functionally sophisticated, whereby clothing is elevated to a new level: garments are not only fashionable and comfortable to wear, but can also indicate the real-time moods and emotions of the wearers for timely responses and create subtle interactive effects among people.

Successful design of IF not only expands the aesthetic and technological dimensions of fashion, contributing to subsequent redefinitions of fashion as objet d’art as well as utility, of the humanities as well as technology, but also reshapes our lifestyle and the cultural context in which we live.

Theoretical research

Definition of IF

Conventional fashion lacks motility and vitality; its fashion language is limited to a single wearing mode. Modern society demands that individuals change their roles and emotions in tune with the needs of different social situations, and to this end a dynamic and changeable fashion language is required. On the other hand, there is a pressing need to monitor and regulate our psychology and emotions in a society where individuals are under ever-increasing pressures of different kinds. Being a medium of frequent and intimate contact with humans, fashion needs to be enriched with deeper meanings and missions and the ability to activate its fashion language. The concept of IF is therefore presented based on these needs. Yet, what is IF?

IF is a new fashion language: a fashion language that can express itself on its own initiative. It possesses activeness, self-initiation, motility and reasoning, by which direct and visible interaction or communication among garments, their wearers and others is created.

  1. Expression of activeness: IF is enlivened with liveliness. It is no longer a still layer covering the body, or a fashion language that cannot speak for itself. With activeness, fashion transforms itself from an introvert into an expressive extrovert. IF responds to human emotions such as happiness, sadness and anger, and/or to human intentions, through the major elements of fashion language – color, pattern and style – for simultaneous interaction. Besides color, pattern and style, IF can also use other forms of expression such as sound and music. IF delivers fashion language more directly and makes its expression more lively.
  2. Expression of reasoning: to a certain extent, IF is like fashion with a brain, having reasoning ability. From active perception to a corresponding response is a course of reasoning. IF reacts to the various inputs it receives after some kind of consideration in its ‘mind’; the responses it offers are therefore results of reasoning. The reasoning IF possesses literally makes fashion language a language that communicates.
  3. Expression of self-initiation: IF changes fashion from entirely passive to simultaneously active. Because IF has considerable reasoning power, it can determine the form and content of its responses through its ‘mind’, ultimately forming interactions and functions of communication.
  4. Expression of motility: IF is capable of instant reception and reaction. This means that IF can change to an appropriate color, pattern or style at once when it becomes aware of human needs from the information it receives. In turn, IF can offer appropriate responses such as sound, temperature and immediate changes of the traditional elements when it detects human emotional changes, as shown in Figure 1.

Figure 1 Comparison between FASHION and INTERACTIVE FASHION (IF).

With reference to the preceding concept of interactive art, IF attempts to translate ideology into reality, often in a pragmatic context.7 A brand-new form of communication and a synthesized fashion language are formed by capitalizing on the distinctive characteristics and ideology of Interactive Art in fashion design. Expressed through clothing in a three-dimensional setting and a four-dimensional space-time interaction with time, sound, light, text and motion,8 an infinite range of new forms, relationships and ideas can be conveyed through fashion. Like any interaction design, fashion as a design carrier requires the injection of new expressive forms to enhance its meaning; it is an integration of traditional designs with burgeoning ones. Based on the concept of “interactivity”, IF often merges electronic engineering and novel materials with fashion, advancing new forms of fashion that feed back to the wearer and the observers.9 The originally static clothing is vitalized with a multitude of wearing modes, which in turn enrich the expressive forms and artistic content of fashion during its interactions and exchanges among clothes and people, e.g., between fashion and wearer, fashion and fashion, and wearer and wearer.

Design principles of IF

The relations and levels of the interaction process of IF are summarized in Table 1 below:

Sorted by Relation of interactivity:

Relation 1:

  1. IF → Environment
  2. IF → Human

In Relation 1, IF influences the environment or a human directly through a prescribed course.

Relation 2:

  1. Environment → IF → Environment/Human
  2. Human → IF → Environment/Human

In Relation 2, IF can take in information from the environment or from a human and respond to it. This process is not yet recurring.

Relation 3:

  1. Human ⇄  IF
  2. Subject ⇄  IF

In Relation 3, the interactivity is a recurring one.

Relation 4:

  1. Human A ⇄ IF A ⇄ IF B ⇄ Human B

In Relation 4, the occurrence of interactivity has transformed from direct contact to a more sophisticated interactive course. It can take place through wireless communication, and is recurring.

Sorted by Levels of interactivity:

Level 1: Non-interactive (Relation 1):

  1. IF → Environment
  2. IF → Human

Level 2: Reactive (Relation 2):

  1. Environment → IF → Environment/Human
  2. Human → IF → Environment/Human

Level 3: Interactive (Relation 3):

  1. Human ⇄ IF
  2. Subject ⇄ IF

Level 4: Communicative (Relation 4):

  1. Human A ⇄ IF A ⇄ IF B ⇄ Human B

Relation | Level
IF → Environment [a] | Non-interactive [N]
IF → Human [b] | Non-interactive [N]
Environment → IF → Environment/Human [c] | Reactive [R]
Human → IF → Environment/Human [d] | Reactive [R]
Human ⇄ IF [e] | Interactive [I]
Subject ⇄ IF [f] | Interactive [I]
Human A ⇄ IF A ⇄ IF B ⇄ Human B [g] | Communicative [C]

Table 1 The relations and levels of the interaction process of IF

The classification of the level of interactivity is made primarily according to the interactive relationship between the artifact and the environment and/or humans, as well as its ability to interact. It ranges from passive, direct transmission of a message (non-interactive), to reception of a message and response (reactive), to active reception of a message and response (interactive), and finally to voluntary interflow (communicative).
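For readers who prefer a computational view, the relation-to-level mapping of Table 1 can be encoded compactly. The following Python sketch is illustrative only; the names and structure are not part of the original theoretical system:

```python
from enum import Enum

class Level(Enum):
    NON_INTERACTIVE = "N"   # one-way output, no sensing
    REACTIVE = "R"          # senses an input, responds once
    INTERACTIVE = "I"       # recurring two-way exchange
    COMMUNICATIVE = "C"     # wearer-to-wearer exchange via two IF garments

# Relation labels [a]-[g] from Table 1 mapped to their interactivity level
RELATION_LEVEL = {
    "a": Level.NON_INTERACTIVE,  # IF -> Environment
    "b": Level.NON_INTERACTIVE,  # IF -> Human
    "c": Level.REACTIVE,         # Environment -> IF -> Environment/Human
    "d": Level.REACTIVE,         # Human -> IF -> Environment/Human
    "e": Level.INTERACTIVE,      # Human <-> IF
    "f": Level.INTERACTIVE,      # Subject <-> IF
    "g": Level.COMMUNICATIVE,    # Human A <-> IF A <-> IF B <-> Human B
}
```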

Based on this classification, Figure 2 proposes the interaction process that specifically addresses IF.

Figure 2 The basic interaction process in detail of IF.

Practical experiments

In this study, emotion recognition was based on the physiological data measured when an emotional reaction was generated. An emotional reaction involves a series of physiological changes during emotional activities, and these changes can be recorded as physiological data via specific instruments for further research purposes. Two of the basic emotional reactions were selected in this study for measurement and collection of the corresponding physiological data, i.e., heart rate/pulse rate and respiration; their corresponding physiological signal data were ECG/BVP signal data and RSP signal data (highlighted in yellow), as shown in Figure 3.

Figure 3 Selected emotional reaction and physiological signal data.

In this study, the experimental procedure for the collection of emotional and physiological data made reference to the experiments conducted at the University of Augsburg in Germany.10 Forty male and female subjects aged 20-30 were invited to watch four different types of movie clips intended to elicit joy, anger, fear and peace, respectively. The physiological signal data of the corresponding subjects were measured using special equipment, and the aim of the experiment was to establish an emotion model.

Electrocardiography (ECG) measurement-heartbeat

An electrocardiography (ECG) signal is produced by cardiac cells when the human heart beats and serves to monitor the heart rate; it can also reflect the physiological changes of the heart over time. Ekman et al.11 discovered that the heart rate is fastest when one is in a state of anger or fear; when happy, the heart rate is moderately fast, yet it slows down when the subject is in a state of sadness or surprise, and it is at its lowest when the subject experiences disgust. In addition, heart rate changes are affected by both gender and emotions; for example, the heart rate of female subjects has been found to be higher than that of male subjects. Lower heart rate variability (HRV) indicates a relaxed state, while enhanced HRV indicates a possible mental state of tension and frustration.12

The measurement process:

  1. The subject sits quietly. Glasses, watch, cell phone and other electrical appliances are removed, and all muscles are relaxed.
  2. The control buttons on the ECG panel are set to the appropriate positions according to the requirements. The ECG input interface is connected to the ECG lead electrodes, and the ECG channel is connected to a computer acquisition system which records the ECG of the human body. The power is connected after the ECG machine or computer has been properly grounded.
  3. Electrode placement: first, the sites where electrodes are to be placed are cleaned using alcohol-soaked cotton balls and then coated with conductive paste to reduce skin resistance. Electrode holders are placed in positions with less muscle; in general, about 3 cm above the wrist (flexor side) or 3 cm above the inner ankle.
  4. Lead wire connection: lead wires are connected correctly according to ECG conventions. Generally, lead wires of five different colors connect to electrodes on the corresponding body parts.
  5. The baseline adjustment devices are adjusted so as to place the baseline in the appropriate location.
  6. Standard voltage input: when the input switch is turned on, the working status of the ECG is adjusted and the standard voltage calibration (1 mV = 10 mm) is applied.
  7. The ECG is then recorded.

The Blood Volume Pulse (BVP) measurement procedure was relatively simple. As shown in Figures 4a-4c, the BVP sensor was positioned close to the skin of the finger, and red light was emitted from the sensor onto the skin surface. The intensity of the reflected red light, which changes with subcutaneous blood flow, can then be measured. The real-time BVP waveform was displayed when the sensor was connected to the corresponding monitoring devices. The BVP signal reflects the pressure of the pulse. When a subject is surprised, scared or excited, the signal envelope tends to tighten, whereas when the subject is relaxed, blood flows to the periphery and the BVP amplitude increases. Blood volume is one of the parameters affecting cardiac output, so the pulse associated with blood volume was adopted as a parameter. The pulse signal is relatively weak; for normal adults its frequency lies within the range 0.01-40 Hz, of which 99% of the energy is distributed between 0.01 and 10 Hz.
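Because 99% of the pulse energy lies below 10 Hz, a digital low-pass stage is a natural first step once the BVP has been sampled (the BVP sampling rate used later in this study is 64 Hz). The following Python sketch is illustrative only and not the authors' processing code; the cut-off and sampling rate come from the text, while the function name and filter order are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_bvp(bvp, fs=64.0, cutoff=10.0, order=4):
    """Low-pass filter a raw BVP trace.

    bvp    : 1-D array of raw sensor samples
    fs     : sampling rate in Hz (64 Hz in this study)
    cutoff : cut-off in Hz (99% of pulse energy lies below 10 Hz)
    """
    bvp = np.asarray(bvp, dtype=float)
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, bvp)  # zero-phase filtering preserves peak timing
```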

Figure 4a ECG measurement of a subject tested.

Figure 4b Blood Volume Pulse (BVP) measurement procedure.

Figure 4c RSP sensor and the measured waveform of a subject.

RSP measurement-respiration

Measurement of RSP is carried out by measuring the chest circumference. Respiration causes slight changes in the chest: the chest expands when the lungs inhale air and contracts when they exhale, and this cyclic process repeats, so in general there should be periodic, regular rises and falls. The RSP signal changes in speed and depth with changes in a person’s emotional state. An emotional reaction of agony usually accelerates and deepens respiration, while sudden panic will temporarily interrupt respiration, and both ecstasy and sorrow can trigger respiratory spasms. In general, the RSP signal frequency range of a normal adult is 0-0.35 Hz, and the respiratory rate is 16-20 breaths per minute.
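Given the 0-0.35 Hz band and the typical 16-20 breaths per minute, the respiration rate can be estimated from an RSP trace by counting its cyclic peaks. The sketch below is illustrative only; the 64 Hz sampling rate is the one reported later for the RSP signal, and the helper name is an assumption:

```python
import numpy as np
from scipy.signal import find_peaks

def respiration_rate(rsp, fs=64.0):
    """Estimate breaths per minute from an RSP (chest circumference) trace.
    Successive breaths are assumed at least 1/0.35 Hz ~ 2.9 s apart,
    matching the 0-0.35 Hz band of adult respiration."""
    rsp = np.asarray(rsp, dtype=float)
    peaks, _ = find_peaks(rsp, distance=int(fs / 0.35))
    duration_min = len(rsp) / fs / 60.0
    return len(peaks) / duration_min
```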



Figure 5 Comparison of the three physiological signal waveforms of one subject under four emotions.

Each of the three physiological signals is 2 minutes long; the ECG sampling rate is 256 Hz, while the BVP and RSP sampling rates are 64 Hz. Figure 5 shows typical physiological signals under the four emotion states (Joy, Anger, Fear and Peace).

Feature extraction from the data is based on statistical characteristics, with the mathematical expressions of the features (Johannes, Jonghwa and Elisabeth, 2005) being as follows:

  1. Normalization

$\tilde{X}_n = \dfrac{X_n - \mu_x}{\sigma_x}$    (1)

  2. Mean value

$\mu_x = \dfrac{1}{N}\sum_{n=1}^{N} X_n$    (2)

  3. Mean square deviation

$\sigma_x = \left( \dfrac{1}{N-1}\sum_{n=1}^{N} \left( X_n - \mu_x \right)^2 \right)^{\frac{1}{2}}$    (3)

    By applying the above mathematical equations, fifteen features were extracted from the ECG signal, fifteen features from the BVP signal, and ten features from the RSP signal. Figure 6 shows the composition of the selected features.
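As an illustration of how such statistical features can be computed, a minimal Python sketch of Eqs. (1)-(3) follows; it is not the authors' code, and the full set of fifteen/ten features per signal is not reproduced here:

```python
import numpy as np

def basic_features(x):
    """Return mean, standard deviation and the normalized signal
    for a 1-D physiological signal segment x (Eqs. 1-3)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()                      # Eq. (2)
    sigma = x.std(ddof=1)              # Eq. (3), with the 1/(N-1) factor
    x_norm = (x - mu) / sigma          # Eq. (1)
    return mu, sigma, x_norm
```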

    Figure 6 Feature matrix.

One sample was withheld each time, and the remaining samples were used to build an SVM classifier13 (SVM1) to distinguish positive and negative emotions. The final emotion was then recognized by classifying the result through SVM2 (Joy/Peace) or SVM3 (Anger/Fear) (Manevitz and Yousef, 2001).

Figure 7 shows the Binary Tree Method14 of classification. The Binary Tree Method divides all classes into two sub-classes, then further divides each sub-class into two sub-categories, and the cycle continues until a single category is obtained, hence a binary tree is formed. SVM1 in the figure distinguishes the large group of Joy/Peace from the large group of Anger/Fear, which also reflects the degree of emotional arousal. The experimental results were obtained by this classification method.
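A minimal sketch of this two-stage (binary tree) SVM scheme, using scikit-learn, is given below; the feature matrix, labels and kernel choice are placeholders rather than the study's actual configuration:

```python
import numpy as np
from sklearn.svm import SVC

def train_binary_tree(X, y):
    """X: feature matrix (one row per 2-minute sample),
    y: emotion labels among "joy", "peace", "anger", "fear" (placeholders)."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    pos = np.isin(y, ["joy", "peace"])            # positive-emotion group
    svm1 = SVC(kernel="rbf").fit(X, pos)           # SVM1: Joy/Peace vs Anger/Fear
    svm2 = SVC(kernel="rbf").fit(X[pos], y[pos])   # SVM2: Joy vs Peace
    svm3 = SVC(kernel="rbf").fit(X[~pos], y[~pos]) # SVM3: Anger vs Fear
    return svm1, svm2, svm3

def classify(sample, svm1, svm2, svm3):
    """Walk the binary tree for a single feature vector."""
    sample = np.asarray(sample, dtype=float).reshape(1, -1)
    if svm1.predict(sample)[0]:                    # positive branch
        return svm2.predict(sample)[0]
    return svm3.predict(sample)[0]
```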

Table 2 shows the model of emotion recognition classification incorporating feature extraction from the collected physiological signal data (a minimal rule-based sketch is given after the table).

The results of the experiments performed to collect data representative of emotional reactions and physiological signals were applied in the following electronic system design. The system hardware included conditioning of the input physiological signals, signal sampling, data transmission and storage, a real-time clock, an LED display, etc. The entire electronic detection system is shown in the block diagram in Figure 8. The physiological signals detected via this system were BVP and RSP.

    Figure 7 Binary tree method emotion recognition diagram.

    Figure 8 Electronic detection system diagram.

The serial circuit mainly performs conversion between the TTL logic level and the RS-232 level used by the host computer. The system uses a classic level-conversion chip, the MAX232. The detailed serial circuit diagram is shown in Appendix A.2.5. The holistic framework of the circuit design of the electronic system is shown in Figure 9.

Figure 10 shows the main board of the circuit of the electronic system. The MCU, ADC, voltage reference (VREF), DC-DC converter, serial circuit chip, USB controller, USB interface, IIC interface and DC input are distributed on the main board.

The main circuit diagrams are shown below (Figure 11a) (Figure 11b).

    Figure 9 The holistic framework of circuit design of electronic system.

    Figure 10 The main board of the circuit of the electronic system.

    Figure 11a ADC circuit diagram.

    Figure 11b MCU circuit diagram.

    Figure 12 The interaction process applied into Prototype A.

    Figure 13a Operation process of thoracic and abdominal respiratory movement wave sensor.

Emotion states | Positive emotion (joy-peace) | Negative emotion (anger-fear)
Physiological signal data of emotional reaction: | |
Variation of heartbeat rate and waveform | Heartbeat rate range: 70-85 beats/min; waveform: regular and stable | Heartbeat rate range: more than 85 beats/min; waveform: irregular and unstable
Variation of respiration rate and waveform | Respiration rate range: 16-20 breaths/min; waveform: regular and smooth | Respiration rate range: more than 20 breaths/min; waveform: irregular and unsmooth

Table 2 The incorporation of the model of emotion recognition classification
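The thresholds in Table 2 lend themselves to a simple rule-based check. The following sketch (ours, not the prototypes' firmware) labels a sample as a positive or negative emotion from heart rate and respiration rate alone, leaving waveform regularity aside:

```python
def classify_emotion(heart_rate_bpm, resp_rate_bpm):
    """Rule-based positive/negative classification using the Table 2 ranges."""
    hr_positive = 70 <= heart_rate_bpm <= 85    # regular, calm heartbeat
    rsp_positive = 16 <= resp_rate_bpm <= 20    # regular, calm respiration
    return ("positive (joy/peace)" if hr_positive and rsp_positive
            else "negative (anger/fear)")

# Example: an excited wearer
print(classify_emotion(92, 24))   # -> negative (anger/fear)
```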

Input | Sensor | Processor | Actuator | Output | Relation | Level
Movement | Motion sensor | IAP | LED display | Vision | Human ⇄ Prototype A [e] | Interactive [I]

Table 3 The elements list of the interaction process of Prototype A

Prototype A: based on breath

With reference to the design principles of IF and the previous practical experiments, the form of the interaction process applied in Prototype A is shown in Figure 12.

In the interaction process of Prototype A, the ‘input’ is the chest and abdomen movement induced by breathing, and the ‘output’ is the LED display. The elements of the interaction process are listed in Table 3:

The design of ‘Based on Breath’ covered breath (respiration) rate, breath (respiration) depth and the corresponding breath (respiration) wave. With reference to the research and analysis of physiological signal data acquisition, the RSP measurement method was adopted to obtain respiration data for the system design of Prototype A. RSP data were obtained by sensing and recording the thoracic and abdominal movement waves caused by breathing. The sensing device applied in Prototype A was adapted from a thoracic and abdominal movement wave sensor placed inside the vest. One thoracic and abdominal expansion and contraction is regarded as one breath, i.e., expansion when inhaling and contraction when exhaling; this was used to record the respiratory rate per minute and respiratory depth. The operation process of the thoracic and abdominal respiratory movement wave sensor is shown in Figure 13a.

According to the setting of the interaction process and the establishment of the input sensing mode, and in order to visualize the process of respiration, the interactive performance modes can be divided into real-time monitoring and emotional state recognition.

The interactive performance modes are outlined in detail in Table 4 (a minimal sketch of the Mode 1 mapping is given after the table):

The signal processing circuit first amplified the differential respiratory signal output by a factor of 25.6. After the differential amplifier, a low-pass filter with a cut-off frequency of 8 Hz effectively removed power-frequency interference and high-frequency myoelectric interference from the RSP signal. The circuit diagram is shown in Figure 13b.
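A software analogue of this conditioning chain (differential gain of 25.6 followed by an 8 Hz low-pass) can be expressed as follows; this is a hypothetical digital equivalent of the analog circuit, not a description of it:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def condition_rsp(raw, fs=64.0, gain=25.6, cutoff=8.0, order=2):
    """Software analogue of the RSP conditioning chain:
    differential gain of 25.6, then an 8 Hz low-pass filter."""
    amplified = gain * np.asarray(raw, dtype=float)
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, amplified)
```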

    Figure 13b RSP signal detection circuit diagram.

Modes | Input data from sensor | Output presentation
Mode 1: Real-time monitoring | Real-time monitoring of the respiration signal | Gradual change of the lights following the rhythm of the respiration rate (inhale - lights brighten → exhale - lights dim)
Mode 2: Emotional state recognition | Record of respiration signal data within a given period | Regular changes → irregular flashes

Table 4 The outline of interactive performance modes of Prototype A
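As a rough illustration of Mode 1 in Table 4, the mapping from breathing to LED brightness could be expressed as below; the scaling and brightness range are assumptions, since the actual firmware mapping is not published:

```python
import numpy as np

def breath_to_brightness(rsp, out_max=255):
    """Map an RSP sample stream to LED brightness (Mode 1 of Table 4):
    lights brighten toward full inhalation and dim toward full exhalation.
    Placeholder scaling, not the prototype's firmware."""
    rsp = np.asarray(rsp, dtype=float)
    lo, hi = rsp.min(), rsp.max()
    if hi == lo:                              # flat signal -> constant mid brightness
        return np.full_like(rsp, out_max / 2)
    return (rsp - lo) / (hi - lo) * out_max   # 0 = dark (exhale), 255 = bright (inhale)
```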

    Figure 14 The outline of electronic components and circuit distribution of Prototype A.

The distribution locations of the electronic components and the related circuit arrangement were determined according to the design locations and practical operation, with waterproof thermal insulation as a safety measure. The outline of the electronic components and circuit distribution can be seen in Figure 14. The realized work of Prototype A is shown in Figures 15a-15c.

    Prototype B: based on pulse

    According to design principles of IF and previous practical experiments, the form of the interaction process which was applied in Prototype B design is shown in Figure 16.

In the interaction process of Prototype B, the pulse waves of the heartbeat/pulse were identified as the input. As pulse waves generate very slight vibrations, they were treated as touch input. The LED display was the output. Table 5 lists all the elements of the interaction process.

Heartbeat rate and pulse rate are the same, so Prototype B could obtain heart rate data by acquiring pulse rate data through the BVP measurement method. The fingertip, where both flexibility and sensitivity are high, was selected as the site for acquiring BVP physiological signals. A BVP photoelectric sensor located close to the finger shone red light on the skin surface and measured the intensity of the reflected red light, which is related to changes in subcutaneous blood flow; these changes in blood flow form the blood volume pulse from which the pulse rate was measured. Figure 17 shows the photoelectric finger-tip pulse wave sensor used as the BVP sensor in Prototype B.
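A pulse rate in beats per minute can be estimated from a filtered BVP trace by counting systolic peaks; the sketch below is illustrative (the 64 Hz sampling rate is the one reported for the BVP signal, and the 200 bpm ceiling is an assumption):

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_rate_bpm(bvp, fs=64.0):
    """Estimate pulse rate (beats per minute) from a filtered BVP trace by
    counting systolic peaks. 'distance' assumes the pulse stays below 200 bpm."""
    bvp = np.asarray(bvp, dtype=float)
    peaks, _ = find_peaks(bvp, distance=int(fs * 60 / 200))
    duration_min = len(bvp) / fs / 60.0
    return len(peaks) / duration_min
```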

    Figure 15a Prototype A before interaction.

    Figure 15b Prototype A turning on via wearer’s breathing.

    Figure 15c The real expression of interaction process of Prototype A.

    Figure 16 The interaction process applied into Prototype B.

    Figure 17 Photoelectric finger-tip movement pulse wave sensor.

Similar to Prototype A, the interactive performance modes were divided into real-time monitoring and emotional state recognition. The interactive performance modes are outlined in detail in Table 6.

BVP signal acquisition works as follows: as the heart beats, the fingertip capillaries undergo corresponding pulse volume changes, and light of a specific wavelength is emitted by the optical transmitter circuit. The photo-emission circuit used an emission wavelength within the range of 600-700 nm and a voltage drop generally within 1.5-2.0 V. Figure 18 shows the BVP signal acquisition circuit.

Input | Sensor | Processor | Actuator | Output | Relation | Level
Touch | Pressure sensor | IAP | LED display | Vision | Human ⇄ Prototype B [e] | Interactive [I]

Table 5 The elements list of the interaction process of Prototype B

Modes | Input data from sensor | Output presentation
Mode 1: Real-time monitoring | Real-time monitoring of the pulse signal | Light/dark beating following the real-time pulse rate rhythm and strength
Mode 2: Emotional state recognition | Record of pulse signal data in one minute | LED light colors displayed corresponding to the range level of the pulse rate in one minute

Table 6 The outline of interactive performance modes of Prototype B in detail

    Figure 18 BVP signal detection circuit diagram.

The distribution locations of the electronic components and the related circuit arrangement were determined according to the design locations and practical operation, as shown in Figure 19.

The realized work of Prototype B is shown in Figures 20a & 20b.

The range level of the pulse rate record acquired in Mode 1 was discriminated according to the four pulse rate levels set in the system program, i.e., i) pulse rate < 70/min, ii) 70/min < pulse rate ≤ 80/min, iii) 80/min < pulse rate ≤ 95/min, iv) pulse rate > 95/min. Different range levels correspond to different LED light colors, as shown in Figure 20c.
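In code, this four-level discrimination could look like the sketch below; the level boundaries come from the text, while the color names are placeholders for the actual LED colors shown in Figure 20c:

```python
def pulse_rate_level(rate_bpm):
    """Map a one-minute pulse rate to the four range levels of the program.
    The color labels are illustrative placeholders, not the prototype's colors."""
    if rate_bpm < 70:
        return 1, "color A"
    elif rate_bpm <= 80:
        return 2, "color B"
    elif rate_bpm <= 95:
        return 3, "color C"
    return 4, "color D"
```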

    Figure 19 The outline of electronic components and circuit distribution of Prototype B.

    Figure 20a Prototype B before interaction.

    Figure 20b Prototype B turning on via wearer’s pulse beat.

    Figure 20c Prototype B in different range levels of pulse rate.

    Conclusion

This research proposes and establishes a specific theoretical system peculiar to Interactive Fashion, through which valuable reference theory for future research is provided and future settings of IF modes are facilitated. In addition, the system for recognizing basic emotions through physiological signals in IF, and an IF mode capable of transmitting them visually, were developed through interdisciplinary research on psychology, physiology, electronics and clothing materials. Finally, the purpose of this study and the design concepts are presented through creative fashion design and prototype production. The prototype design work included determination of the illustrations to be used in production in the draft program, setting of the targeted design rationale, determination of the electronics programming and production of the electronic system, as well as the selection and arrangement of design materials for the distribution of electronic components and circuits, and modeling for the final integration and completion of the design.

Furthermore, this study also raised some unresolved issues and constraints, for example the expansion of the channels used to acquire emotion-recognition physiological signals, the selection of new electronic components and application materials, the development of interaction modes and systems, wireless sensor applications, etc., which provide new directions and space for future IF research and design.

    Acknowledgements

    None.

    Conflict of interest

The authors declare there is no conflict of interest in publishing this article.

    References

1. Liu YP, Shrum LJ. What is Interactivity and is it always Such a Good Thing? Implications of Definition, Person, and Situation for the Influence of Interactivity on Advertising Effectiveness. J Advertising. 2002;31(4):53‒64.
    2. Saffer D. Designing for Interaction. New Riders, Peachpit Press, USA; 2006. p. 256.
    3. Rush M. New media in art. UK: Thames & Hudson Press; 2005. p. 240.
    4. Crawford C. The Art of Interactive Design. No Starch Press, San Francisco, USA; 2003. p. 410.
    5. Braddock SE, Mahony M. Techno textiles: revolutionary fabrics for fashion and design. 2nd edn. Thames and Hudson, London, UK; 2006. p. 208.
6. Sharp H, Rogers Y, Preece J. Interaction design: beyond human-computer interaction. John Wiley & Sons, USA; 2011.
    7. Ascott R. Telematic Embrace: visionary theories of art, technology and consciousness. University of California Press, Berkeley, California, USA; 2003. 441 p.
    8. Peng CZ. Design through digital interaction: computing communications and collaboration design. Intellect Books, UK; 2001. p. 212.
    9. Seymour S. Fashionable technology: the intersection of design, fashion, science, and technology. Springer; 2008.
    10. Jonghwa K. Emotion Recognition from Physiological Measurement. LMKA, University of Augsburg, Germany; 2005.
    11. Ekman P, Scherer KR. Questions about emotion: An Introduction. In Scherer K, Ekman P, editors. Approaches to Emotion. Lawrence Erlbaum, Hillsdale, New Jersey, USA; 1984. p. 1−8.
    12. Clifford GD. Signal Processing Methods for Heart Rate Variability Analysis. University of Oxford, UK; 2002. p. 6.
    13. Manevitz LM, Yousef M. One-Class SVMs for Document Classification. J Machine Learning Research. 2001;2:139‒154.
    14. Melinik OV. Methods of electro cardio signal processing and analysis. Med Tekh. 2007;(6):8‒12.
Creative Commons Attribution License

©2017 Xia et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.