This brief report investigates the relationship between the lip color of women’s faces and the latency and amplitude of the P1, N170, and early posterior negativity (EPN) event-related potential components. To examine whether lipsticks of different colors affect face perception processing, we used EEG to record these components in 19 participants exposed to visual stimuli under four conditions: red lips, yellow lips, blue lips, and no-makeup. The results indicate a significantly higher attractiveness score for red lips than for the other three conditions and a significantly shorter P1 peak latency for red lips than for blue lips or no-makeup. This may reflect that red lips attract attention more than blue or natural lips in the early stages of face processing. The EPN peak for red lips occurred significantly later than for yellow lips, blue lips, or no-makeup, and EPN amplitudes were significantly larger for red lips than for blue lips or no-makeup. These results may indicate that, at later stages of face processing, the high attractiveness of red lips is associated with slower, more careful processing, whereas blue lips, which received a low attractiveness score, are processed more quickly and less carefully. The present results suggest the novel possibility that P1 and the EPN can be used as biomarkers of the temporal processing of facial attractiveness in the human brain.
Event-related brain potentials (ERPs) have been recorded in response to various aspects of facial perception, from which the face-sensitive P1, N170, and early posterior negativity (EPN) components have been identified. P1 consists of a sharp positive deflection in the ERP, appearing about 100 ms after presentation of a human face. The P1 component is reported to relate to the perception of low-level visual features of the human face, such as color [1, 2, 3, 4, 5, 6]. Previous studies reported that P1 has a medial or lateral-occipital scalp distribution [2, 7, 8, 9, 10, 11, 12, 13, 14]. Moreover, the neural generators of P1 in the human brain are thought to lie adjacent to the occipital face area (OFA) [15, 16].
The N170 has been related to face perception and the structural encoding of human faces [8, 17, 18]. This component comprises a sharp negative deflection in the ERP, which is maximal close to 170 ms after presentation of a human face and distributed over the posterior temporal region of the scalp [17, 19, 20, 21, 22]. N170 amplitude is larger for human faces than for other objects, including hands, cars, houses, furniture, plane figures, solid figures, and scrambled faces [7, 17, 18, 23, 24, 25]. The generators of N170 have been localized to the fusiform face area (FFA) [16, 26]. Rossion and Caharel [3] reported that the face processing represented by the P1 and N170 components can be functionally dissociated, with P1 reflecting low-level visual features and N170 reflecting the high-level perception of human faces. In particular, the N170 component is related to changes in facial perception resulting from cosmetic makeup [27, 28], suggesting its usefulness for investigating the effect of red lipstick on face perception.
The EPN component is sensitive to the emotional content of faces [29, 30]. In addition, a larger EPN amplitude is reported for faces perceived to be attractive relative to those perceived to be unattractive [31]. The EPN is observed between 150 and 350 ms after stimulus onset and is usually distributed across the posterior temporal scalp [29, 32, 33].
Haxby et al. [34] proposed a cortical model of the neural system for face perception in the human brain. Pitcher et al. [15] subsequently modified Haxby’s cortical model by adding temporal information from intracranial ERP studies [35, 36, 37, 38]. In the latter model [15], early stages of face processing occur in three regions of the occipitotemporal visual cortex, specifically the OFA, FFA, and superior temporal sulcus. Later stages of face processing occur both in other regions (e.g., the amygdala, insula, limbic system, and auditory cortex) and in these three regions.
The neural generators of P1 and N170 are reported to lie in the OFA [15, 16] and FFA [16, 26], respectively. Therefore, these components may reflect the early stages of face processing. At later stages, both models [15, 34] assume interactions between the three regions of the occipitotemporal visual cortex (OFA, FFA, and superior temporal sulcus) and the other regions above (amygdala, insula, and limbic system). Because the limbic system processes emotions in the later stages of face processing [34], the emotion-sensitive EPN component may reflect these later stages.
DaSilva et al. [39] highlighted the significance of the mouth in facial processing: expressions featuring teeth (open mouth) elicited significantly larger N170 amplitudes than expressions without teeth (closed mouth) [39]. Pesciarelli et al. [40] reported significantly larger N170 amplitudes for the mouth than for the eyes in upright faces. Krautheim et al. [41] showed that lip protrusion was processed in the superior temporal cortices, the temporal-occipital junctions, the postcentral gyri, and the left fusiform gyrus. Cortical models of face perception processing [15, 34] implicate the superior temporal sulcus in the perception of lip movement. Tanaka [27] showed, as indexed by N170 amplitudes, that facial cosmetics exert a greater influence on processing of the mouth than of the eyes. Taken together, these previous findings [15, 27, 34, 39, 40, 41] suggest that the mouth exerts a greater influence on face perception processing than the eyes and support the notion that lip color affects face-sensitive ERP components.
The purpose of our work is to examine the relationship between lip color and the P1, N170, and EPN components during the perception of women’s faces. Tanaka [27] reported that a lipstick condition (red lip painting) elicited a larger N170 amplitude than a no-makeup condition, suggesting that N170 is particularly sensitive to lip color, but did not examine whether lip colors other than red affect the ERP components (P1, N170, and EPN). We therefore compared face-sensitive ERP components (P1, N170, and EPN) across four lip color conditions (red lips, yellow lips, blue lips, and no-makeup). Since the P1 and N170 components reflect early stages of face processing and the EPN component reflects later stages [15, 34], we predicted that P1 and N170 would be sensitive to the red lips condition, and that the emotion-sensitive EPN would be sensitive to both the red lips (enhanced EPN amplitude) and blue lips (reduced EPN amplitude) conditions. Moreover, Leder et al. [42] showed that the more attractive a face became, the longer it was looked at. Therefore, attracting attention by changing facial attractiveness with lipsticks of different colors may affect the emotion-sensitive EPN peak latency.
Nineteen healthy, right-handed, East Asian participants (men, n = 11; women, n = 8; mean age (standard deviation: SD) = 21.1 (1.08) years, age range = 18–23 years) took part. All participants had normal or corrected-to-normal vision and, in accordance with the Declaration of Helsinki, provided written informed consent. The ethics committee of Otemon Gakuin University provided ethical approval, and participants were recruited from the student population at Otemon Gakuin University.
The stimuli were the faces of eight young adult East Asian women, all unfamiliar to the participants. The faces, which had neutral expressions, were collected from the Internet (http://www.air-lights.com/recruit.html and http://ameblo.jp/studioaquarius/entry-11473277532.html). In total, 32 face stimuli were created, with each of the eight model faces presented in four conditions: red lips (wearing red lipstick), yellow lips (wearing yellow lipstick), blue lips (wearing blue lipstick), and no-makeup (no lipstick applied).
Each face stimulus was digitally edited to change the lip color and reconstructed from the original using the YouCam Makeup app on an iPad (http://jp.perfectcorp.com/#ymk) (Fig. 1A, [27]). No-makeup faces were used as the adaptation stimulus and were presented before each target stimulus (red lips, yellow lips, blue lips, or no-makeup) so that the two could be compared. Each face image was also digitally edited on the iPad to have the same hairstyle and hair color (black). Finally, all face stimuli were presented in the same orientation on a white background prepared with Adobe Photoshop 12.
All face stimuli had the same visual angle of 3.4
We used faces with neutral expressions because N170 is affected by emotional expression [43, 44, 45, 46, 47, 48]. Because previous studies reported larger N170 amplitudes for other-race than for own-race facial stimuli [49, 50], all face stimuli were of East Asian women and all participants were East Asian. Furthermore, because N170 amplitude varies with face viewpoint [18, 51], all faces were presented in a front-facing view. Similarly, because Tanaka [52] reported that a change in hair length influences N170 latency, we presented faces with the same hair length, hairstyle, and hair color (black) in all conditions. Finally, Russell [53] reported that enhanced luminance contrast between the facial features (lips and eyes) and the facial skin increases the attractiveness and femininity of women’s faces but decreases the attractiveness and masculinity of men’s faces. Therefore, we used only women’s faces as stimuli.
All participants were seated comfortably 80 cm in front of a 22-inch CRT monitor, on which faces were presented using a Multi Trigger System (Medical Try System, Kodaira, Tokyo, Japan). Each trial proceeded as follows: (i) a white screen was presented as a 1000 ms interval, followed by a fixation mark (+) for 500 ms; (ii) a white screen was presented as a 1000 ms interval, followed by an adaptation face stimulus (no-makeup) for 500 ms; (iii) a white screen was presented as a 500 ms interval, followed by a target face stimulus (red lips, yellow lips, blue lips, or no-makeup) for 500 ms; and (iv) a judgment screen was presented for 1000 ms (Fig. 1B). The inter-trial interval varied randomly between 500 and 1500 ms.
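The trial sequence described above can be sketched as plain data. This is a hypothetical illustration only (the actual experiment used a Multi Trigger System, not this code); the event names are invented, while the durations are taken from the Methods:

```python
# Hypothetical sketch of one trial as (event name, duration in ms) pairs,
# following the sequence in the Methods. Not the actual presentation software.
trial = [
    ("blank", 1000),           # white-screen interval
    ("fixation", 500),         # fixation mark (+)
    ("blank", 1000),           # white-screen interval
    ("adaptation_face", 500),  # no-makeup adaptation stimulus
    ("blank", 500),            # white-screen interval
    ("target_face", 500),      # red, yellow, blue, or no-makeup lips
    ("judgment", 1000),        # attractiveness rating screen
]

total_ms = sum(duration for _, duration in trial)
print(total_ms)  # 5000 ms per trial, before the variable 500-1500 ms ITI
```

Laying the trial out as data makes the fixed 5000 ms core of each trial explicit, with only the inter-trial interval varying.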
Example of stimuli and the single trial. (A) Examples of four face stimuli. (B) Timeline of the single trial. This figure has been modified from Tanaka [27].
On the judgment screen, participants rated each face stimulus for attractiveness. Attractiveness was assigned a number, defined as the attractiveness score: 1 = very unattractive; 2 = unattractive; 3 = attractive; and 4 = very attractive. Participants were instructed to compare the adaptation face with the target face and rate the attractiveness, responding by pressing the button corresponding to 1, 2, 3, or 4 with the index finger of the right hand. Reaction times (RTs) were measured from the onset of the judgment screen to the button press.
Furthermore, responses
Electroencephalography (EEG) and electrooculography (EOG) data were recorded with a 128-channel Sensor Net and the standard EGI Net Station 5.2.01 package (GES300, Electrical Geodesics, Inc., Eugene, USA). EEG data were obtained with Ag/AgCl electrodes positioned according to the 10-5 system [54, 55]. The 128 electrodes were referenced to Cz and re-referenced offline to the common average. Electrodes for EOG recording were placed above, below, and at the outer canthi of both eyes. EEG and EOG were recorded with a band-pass filter of 0.01–30 Hz, and electrode impedance was maintained
Stimulus-locked ERPs were obtained separately for each of the four target faces (red lips, yellow lips, blue lips, or no-makeup) from 200 ms before to 400 ms after stimulus presentation, and baselines were corrected using the 200 ms pre-stimulus window. For P1 analyses, electrode sites O1 and O2 were analyzed; the positive peak of the ERP in the window from 60 to 110 ms after stimulus presentation defined the P1 latency and amplitude. For N170 and EPN analyses, electrode sites P7, PO7, PO8, and P8 were analyzed; the negative peak of the ERP in the window from 110 to 180 ms after stimulus presentation defined the N170 latency and amplitude, and the negative peak in the window from 200 to 300 ms defined the EPN latency and amplitude. Peaks were identified by manual inspection, and mean RTs and mean ERP latencies and amplitudes (peak-to-peak) were calculated for each participant and stimulus type. Mean ERP latencies were computed by averaging, across the 19 participants, the peak latencies within each of the above time windows.
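The baseline correction and windowed peak search described above can be sketched in plain NumPy. This is a minimal illustration under stated assumptions: the waveform, sampling rate, and Gaussian-shaped peak are synthetic, amplitude is measured baseline-to-peak for simplicity (the paper reports peak-to-peak values), and the actual study identified peaks by manual inspection in Net Station:

```python
import numpy as np

def extract_peak(erp, times, window, polarity):
    """Find the peak latency and amplitude of an ERP component.

    erp:      1-D array of amplitudes (µV), stimulus-locked
    times:    matching array of time points (ms)
    window:   (start_ms, end_ms) search window, inclusive
    polarity: +1 for positive components (P1), -1 for negative (N170, EPN)
    """
    mask = (times >= window[0]) & (times <= window[1])
    segment = erp[mask] * polarity
    idx = np.argmax(segment)          # index of the extremum in the window
    return times[mask][idx], erp[mask][idx]

# Synthetic single-condition average at one occipital site, 1 kHz sampling:
# a 6 µV Gaussian peak centered at 90 ms (hypothetical values).
times = np.arange(-200, 400)          # -200 to 399 ms
erp = np.zeros_like(times, dtype=float)
post = times >= 0
erp[post] = 6.0 * np.exp(-((times[post] - 90) / 15.0) ** 2)

# Baseline correction using the 200 ms pre-stimulus window
erp -= erp[times < 0].mean()

p1_lat, p1_amp = extract_peak(erp, times, (60, 110), polarity=+1)  # P1 window
print(p1_lat, p1_amp)  # 90 6.0
```

The same function covers N170 (window 110–180 ms, `polarity=-1`) and EPN (200–300 ms, `polarity=-1`) by changing the window and sign.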
Stimulus-locked average event-related potential (ERP) waveforms (P1 component) at O1 and O2 for each condition.
The attractiveness scores and RTs were analyzed with a one-way repeated-measures analysis of variance (ANOVA) with lip condition (red lips, yellow lips, blue lips, or no-makeup) as the main effect. Moreover, to test for a participant-gender effect on the facial attractiveness ratings, the attractiveness scores were also analyzed with a two-way (4
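The one-way repeated-measures ANOVA above can be sketched in plain NumPy. This is an illustrative computation, not the software used in the study: the per-participant scores are simulated (the condition means are taken from Table 1; the noise level and seed are hypothetical), and only the F ratio and degrees of freedom are computed:

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA for a (n_subjects, n_conditions) array.

    Partitions total variability into condition, subject, and error terms,
    and returns (F, df_effect, df_error).
    """
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj                  # residual
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error

# Simulated attractiveness scores: 19 participants x 4 lip conditions.
# Condition means follow Table 1; the noise SD of 0.5 is a hypothetical choice.
rng = np.random.default_rng(0)
means = np.array([3.24, 1.65, 1.46, 2.78])   # red, yellow, blue, no-makeup
scores = means + rng.normal(0.0, 0.5, size=(19, 4))

F, df1, df2 = rm_anova_oneway(scores)
print(df1, df2)  # 3 54, matching the F(3, 54) reported in the Results
```

With 19 participants and 4 conditions, the degrees of freedom come out to (3, 54), matching the F statistics reported throughout the Results.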
Table 1 shows the mean attractiveness scores and RTs for the four lip conditions
(red lips, yellow lips, blue lips, and no-makeup). A significant main effect was
observed for lip condition on the attractiveness score (F(3, 54) =
50.09, p
Red lips | Yellow lips | Blue lips | No-makeup | |||||
Mean | SD | Mean | SD | Mean | SD | Mean | SD | |
Attractiveness scores | 3.24 | 0.64 | 1.65 | 0.50 | 1.46 | 0.52 | 2.78 | 0.64 |
Reaction times (ms) | 348.12 | 68.54 | 345.01 | 58.15 | 360.83 | 76.01 | 376.11 | 79.41 |
A significant main effect was also detected for lip condition on RTs
(F(3, 54) = 3.17, p = 0.039,
Fig. 2 shows the ERP waveforms for the four lip conditions. A positive deflection was evident between 60 and 110 ms after face onset in each condition; this positive component was identified as P1. Table 2 shows the mean P1 peak latency and amplitude for the four lip conditions at the two electrode sites.
Electrode | Red lips | Yellow lips | Blue lips | No-makeup | ||||
Mean | SD | Mean | SD | Mean | SD | Mean | SD | |
P1 latency (ms) ||||||||
O1 | 92.21 | 13.04 | 94.95 | 16.86 | 97.37 | 11.52 | 96.53 | 13.39 |
O2 | 86.74 | 14.34 | 92.00 | 13.78 | 94.11 | 10.81 | 96.32 | 15.74 |
P1 amplitude (µV) ||||||||
O1 | 5.51 | 2.79 | 5.67 | 3.07 | 5.35 | 2.85 | 5.41 | 3.02 |
O2 | 6.25 | 3.84 | 5.91 | 2.91 | 5.06 | 2.66 | 5.16 | 3.09 |
There was a significant main effect of lip condition on P1 latency
(F(3, 54) = 2.86, p = 0.05,
Fig. 3 shows the ERP waveforms for the four lip conditions. In each condition, a negative deflection was evident between 110 and 180 ms after the onset of the face stimulus; this negative component was identified as N170.
Table 3 shows the mean N170 peak latency for each lip condition at the four electrode sites. There were significant main effects of lip condition
(F(3, 54) = 5.35, p = 0.007,
Electrode | Red lips | Yellow lips | Blue lips | No-makeup | ||||
Mean | SD | Mean | SD | Mean | SD | Mean | SD | |
N170 latency (ms) ||||||||
P7 | 140.42 | 10.91 | 141.58 | 14.53 | 144.00 | 13.87 | 136.21 | 12.61 |
PO7 | 139.47 | 12.68 | 139.89 | 13.24 | 141.16 | 12.84 | 135.68 | 14.91 |
PO8 | 134.84 | 14.74 | 134.95 | 14.15 | 140.95 | 14.92 | 128.11 | 15.33 |
P8 | 134.11 | 12.82 | 136.42 | 12.42 | 138.63 | 16.31 | 129.89 | 14.65 |
N170 amplitude (µV) ||||||||
P7 | 4.33 | 2.69 | 3.84 | 2.37 | 3.88 | 2.78 | 4.22 | 2.79 |
PO7 | 4.85 | 3.13 | 4.67 | 2.87 | 4.78 | 2.98 | 5.06 | 3.12 |
PO8 | 5.96 | 3.47 | 6.10 | 3.01 | 5.10 | 3.09 | 5.76 | 2.76 |
P8 | 5.40 | 3.08 | 5.66 | 2.76 | 4.89 | 2.16 | 5.03 | 2.27 |
Table 3 also shows the mean N170 amplitude (peak-to-peak) for the four lip conditions at the four electrode sites. There was a significant main effect of
hemisphere on N170 amplitude (F(1, 18) = 5.88, p = 0.026,
In each condition, a negative deflection was evident between 200 and 300 ms after the onset of the face stimulus (Fig. 3); this negative component was identified as EPN. Table 4 shows the mean EPN peak latency for each lip condition at the four electrode sites. There were significant main effects of lip
condition (F(3, 54) = 17.68, p
Electrode | Red lips | Yellow lips | Blue lips | No-makeup | ||||
Mean | SD | Mean | SD | Mean | SD | Mean | SD | |
EPN latency (ms) ||||||||
P7 | 277.89 | 29.30 | 245.37 | 24.98 | 249.26 | 20.20 | 262.32 | 25.19 |
PO7 | 277.89 | 31.88 | 250.21 | 20.24 | 242.74 | 16.25 | 260.42 | 26.35 |
PO8 | 260.11 | 18.43 | 243.79 | 17.33 | 246.32 | 19.63 | 244.74 | 16.38 |
P8 | 264.32 | 17.56 | 243.89 | 18.24 | 242.21 | 19.70 | 241.68 | 17.79 |
EPN amplitude (µV) ||||||||
P7 | 3.23 | 2.74 | 2.24 | 1.37 | 2.12 | 1.57 | 2.44 | 2.07 |
PO7 | 3.81 | 2.34 | 2.47 | 1.16 | 1.82 | 1.46 | 2.43 | 1.83 |
PO8 | 3.30 | 2.04 | 2.71 | 1.86 | 1.70 | 1.60 | 2.01 | 1.71 |
P8 | 2.92 | 1.38 | 2.27 | 1.51 | 1.47 | 1.23 | 1.83 | 1.30 |
Table 4 also shows the mean EPN amplitude (peak-to-peak) for each lip condition at the four electrode sites. There was a significant main effect of lip condition on
EPN amplitude (F(3, 54) = 7.49, p
Stimulus-locked average event-related potential waveforms (N170 and early posterior negativity [EPN] components) at P7, P8, PO7, and PO8 for each condition.
We investigated the relationship between lip color and three ERP components (P1, N170, and EPN) during the perception of women’s faces. Face-sensitive ERP components (P1, N170, and EPN) were compared across four lip color conditions (red lips, yellow lips, blue lips, and no-makeup) together with the attractiveness scores and RTs. The results demonstrated that the attractiveness scores for red lips were significantly higher than for no-makeup, and those for no-makeup were significantly higher than for yellow lips and blue lips. These differences were further reflected in the ERP components.
The P1 peak for red lips occurred at a significantly shorter latency than for blue lips and no-makeup. Moreover, the N170 peak for no-makeup occurred significantly earlier than for yellow lips and blue lips, and N170 peak latency was significantly earlier in the right hemisphere than in the left. N170 amplitude was significantly larger at P8 than at P7. Furthermore, the EPN peak for red lips showed a significantly longer latency than for yellow lips, blue lips, and no-makeup, while EPN peak latency was significantly earlier in the right hemisphere than in the left. In addition, EPN amplitude was significantly larger in response to red lips than to blue lips and no-makeup. Therefore, in the early stages of face processing, red lips, being highly attractive, attract the participants’ attention; this is reflected in a significantly shorter P1 peak latency for red lips than for blue lips and no-makeup, which received significantly lower attractiveness scores. Whereas Tanaka [27] found that P1 amplitude was unaffected by a single lipstick color, the present results suggest the novel possibility that P1 can be used as a biomarker for the processing of facial attractiveness across lipsticks of various colors. In addition, previous studies [56, 57] indicated that the P1 component is related to the positive, rewarding value of stimuli; this may be reflected in the significantly shorter P1 peak latency for red lips, whose high attractiveness scores suggest a positive, rewarding value. However, there was no P1 peak latency difference between yellow lips and red lips, perhaps because differences in lip color contrast also affect P1 peak latency.
At later stages of face processing, the enhanced facial attractiveness of red lips is reflected in a significantly longer emotion-sensitive EPN peak latency for the red lips condition than for yellow lips, blue lips, and no-makeup, all of which received significantly lower attractiveness scores. Leder et al. [42] showed that the more attractive a face became, the longer it was looked at. Our results may therefore suggest that red lips are processed slowly and carefully at later stages of face processing because of their high attractiveness, whereas blue lips are processed quickly and less carefully because of their low attractiveness. The amplitude of the emotion-sensitive EPN component was also significantly larger in response to red lips than to blue lips and no-makeup. These results are consistent with reports of larger EPN amplitudes for attractive faces [31] and support cortical models of the temporal processing of human faces [15, 34]. The EPN component may be sensitive to the emotional content of human faces, reflecting an interaction between the FFA and the limbic system, which processes emotional content in the later stages of face processing [34]. However, to correctly interpret the present results, it is important to note that the task structure drew the participants’ attention explicitly to lip colors; this may have affected facial processing and produced a pseudo-attention to lip colors that might not be observed in less biased circumstances.
Although we investigated the relationship between facial attractiveness and face-sensitive ERP components (P1, N170, and EPN), the present results show an indirect correlation between behavioral and neural measures, not a direct causal relationship. In addition, only eight different pictures were used; across the 300 trials, each face was presented at least 10 times in total. After so many repetitions, the faces might have become familiar, and this higher-than-usual number of repetitions may have affected the results obtained.
Tanaka [27] showed that a lipstick condition (red lips) elicited a significantly larger N170 amplitude than a no-makeup condition, whereas N170 amplitude did not differ significantly between an eye shadow condition and the no-makeup condition [27]. In contrast, in the current results, the N170 amplitude for red lips did not differ significantly from the other three conditions (yellow lips, blue lips, and no-makeup). Tanaka [27] manipulated a wider region of the face (including the eyes and mouth), whereas we manipulated only a local region (the mouth). Therefore, these inconsistencies in N170 amplitude results may reflect differences in experimental procedures.
Our results demonstrated that the N170 peak for no-makeup occurred at a significantly shorter latency than for yellow and blue lips. In the no-makeup stimuli, the light lip color had low contrast, while the deep lip colors of the yellow and blue lips had high contrast. Because only the no-makeup condition had a light lip color, it was conspicuous among the four conditions. Therefore, it is possible that, in the early stages of face processing, the light lip color of no-makeup attracted more attention from participants than the deep lip colors of yellow and blue lips. In addition, Zhang et al. [58] showed that repeatedly presented highly attractive faces tended to elicit a larger N170 amplitude, whereas we found no significant difference in N170 amplitude between the four lip conditions (red lips, yellow lips, blue lips, and no-makeup). Zhang et al. [58] used gray-scaled face stimuli, and the difference in stimulus contrast may account for the inconsistency between their N170 amplitude results and ours.
Our results demonstrated that N170 peak latency was significantly earlier in the right hemisphere than in the left, similar to the findings of Tanaka [52]. Moreover, previous functional magnetic resonance imaging (fMRI) studies reported that the N170 neural generators are located in the FFA [16, 26] and the middle and posterior fusiform gyri [59]. Several studies [60, 61, 62, 63] have highlighted that face perception processing is dominated by the FFA in the right hemisphere in humans. In particular, Gao et al. [59] showed larger current density reconstruction (CDR) values in the posterior fusiform gyri and FFA in response to faces than to houses, and these values were also larger in the right hemisphere than in the left. Chance et al. [63] found that the fusiform gyrus in the right hemisphere of the human brain contained narrower minicolumns and smaller pyramidal neurons than in the left. Taken together, the previous studies [52, 59, 60, 61, 62, 63] and the present results support a right-hemisphere dominance of face processing in humans.
Previous studies showed that facial expression processing occurs earlier (N170 and EPN, 150–290 ms post-stimulus onset) than facial attractiveness processing (P3b, 400–700 ms) [64] and that highly attractive faces elicit significantly larger EPN and late positive potential (LPP) amplitudes [65]; facial attractiveness should therefore also be investigated using other ERP components. We investigated the relationship between the lip color of women’s faces and cortical models of the temporal processing of human faces [15, 34], but we used ERP techniques only. The role of lip color in facial perception should also be investigated using additional neuroimaging techniques such as fMRI or magnetoencephalography.
Furthermore, Ikeda et al. [66] showed that facial skin radiance affects the facial attractiveness and affective impressions of women’s faces. For example, if the lips, eyes, and facial skin were simultaneously modified by cosmetic makeup, what effect would this multiplex cosmetic makeup have on facial attractiveness? Together with Ikeda et al.’s [66] results, this indicates that further research is needed to investigate the relationship between facial attractiveness and ERP components under the multiplex effect of cosmetic makeup on face perception. In addition, Zaki et al. [67] indicated that participants who first rated facial attractiveness subsequently changed their second ratings to conform to those of other participants. Therefore, further research is also needed to investigate facial attractiveness from the perspective of social processing.
The attractiveness scores were significantly higher for red lips than for the other three lip conditions. Moreover, the P1 peak for red lips occurred at a significantly shorter latency than for blue lips and no-makeup, while the EPN peak for red lips occurred at a significantly longer latency than for yellow lips, blue lips, and no-makeup. In addition, the EPN amplitude was significantly larger in response to red lips than to blue lips and no-makeup. These results suggest that red lips attract the participants’ attention in the early stages of face processing owing to their high attractiveness, which is reflected in the P1 peak latency. The results may also indicate that, in later stages of facial processing, the attractiveness of red lips is processed slowly and carefully, whereas blue lips, being less attractive, are processed more quickly and carelessly. These results suggest the novel possibility that P1 and the EPN can be used as biomarkers of the temporal processing of facial attractiveness in the human brain.
ERP, event-related potential; EPN, early posterior negativity; OFA, occipital face area; FFA, fusiform face area; fMRI, functional magnetic resonance imaging; CDR, current density reconstruction.
HT conceived and designed the experiments; HT performed the experiments; HT analyzed the data; HT contributed materials; HT wrote the paper.
Written informed consent was obtained from all participants. The ethics committee of Otemon Gakuin University provided ethical approval (ethics code number 2017017), and participants were recruited from the student population at Otemon Gakuin University.
The author would like to thank all individuals who participated in the present work.
Research grants of Otemon Gakuin University funded this research. These research grants were used to pay for proofreading by a native English speaker.
The author declares no conflict of interest.