Prof. Wataru Teramoto

Profile

 1976.01 Born in Omagari, Akita, Japan

 

 


Education

 1999.03 Bachelor, Department of Psychology, Kobe University, Japan
 2001.03 Master, Department of Psychology, Kobe University, Japan
 2004.09 Ph.D., Department of Psychology, Kobe University, Japan
  

Professional Positions


 2002.06–2003.03 NEDO Graduate Student Research Fellow
 2004.10–2005.03 Postdoc Researcher, Department of Psychology, Kobe University
 2005.04–2008.03 JSPS Postdoc Research Fellow, National Institute of Advanced Industrial Science and Technology, Japan
 2006.04–2007.09 Visiting Researcher, Max Planck Institute for Biological Cybernetics (Bülthoff Lab.), Tübingen, Germany
 2008.04–2011.03 Postdoc Researcher, Department of Psychology and Research Institute of Electrical Communication, Tohoku University, Japan
 2011.04–2012.06 Assistant Professor, Department of Computer Science and Systems Engineering, Muroran Institute of Technology, Japan
 2012.07–2015.03 Associate Professor, Department of Computer Science and Systems Engineering, Muroran Institute of Technology, Japan
 2015.04–2017.12 Associate Professor, Division of Psychology, Graduate School of Social and Cultural Sciences, Kumamoto University, Japan
 2018.01–present Professor, Division of Psychology, Faculty of Humanities and Social Sciences, Kumamoto University, Japan

 

Research Interests


My current research interests are in multisensory processing, in particular perceptual (associative) learning, self-motion perception, bodily self (and other) representation, and the perception of reality. I employ psychophysics, EEG, fMRI, and virtual reality in my current research projects.

 


Publications

 

Kawagoe, T., & Teramoto, W. (2024). The center of a face catches the eye in face perception. Experimental Brain Research.
Kotegawa, K., Kuroda, N., Sakata, J., & Teramoto, W. (2024). Association between visuo-spatial working memory and gait motor imagery. Human Movement Science, 94, 103185.
Teraoka, R., Kuroda, N., & Teramoto, W. (2024). Comparison of peripersonal space in front and rear spaces. Experimental Brain Research, 242, 797-808.
Fujii, Y., Kuroda, N., Teraoka, R., Harada, S., & Teramoto, W. (2023). Age-related differences in temporal binding and the influence of action body parts. i-Perception, 14(5), 20416695231208547.
Kawagoe, T., & Teramoto, W. (2023). Mask wearing provides psychological ease but does not affect facial expression intensity estimation. Royal Society Open Science, 10(8), 230653.
Teramoto, W. & Ernst, M. O. (2023). Effects of invisible lip movements on phonetic perception. Scientific Reports, 13, 6478.
Teraoka, R., Kuroda, N., & Teramoto, W. (2023). Interoceptive sensibility is associated with the temporal update of body position perception. Psychologia, 65, 4-16.
Fujii, Y., Teraoka, R., Kuroda, N., & Teramoto, W. (2023). Inhibition of intentional binding by an additional sound presentation. Experimental Brain Research, 241, 301–311.
Teraoka, R., Hayashida, Y., & Teramoto, W. (2023). Difference in auditory time-to-contact estimation between the rear and other directions. Acoustical Science and Technology, 44, 77–83.
Kuroda, N., Ikeda, K., & Teramoto, W. (2022). Visual self-motion information contributes to passable width perception during a bike riding situation. Frontiers in Neuroscience, 16:938446.
Teramoto, W. (2022). Age-related changes in visuo-proprioceptive processing in perceived body position. Scientific Reports, 12, 8330.
Kiridoshi, A., Otani, M., & Teramoto, W. (2022). Spatial auditory presentation of a partner’s presence induces the social Simon effect. Scientific Reports, 12, 5637.
Kotegawa, K. & Teramoto, W. (2022). Association of executive function capacity with gait motor imagery ability and PFC activity: An fNIRS study. Neuroscience Letters, 766, 136350.
Kuroda, N. & Teramoto, W. (2022). Contribution of motor and proprioceptive information to visuotactile interaction in peripersonal space during bike riding. Experimental Brain Research, 240, 491-501.
Kawagoe, T., Sueyoshi, R., Kuroda, N., & Teramoto, W. (2021). Automatic gaze to the nose region cannot be inhibited during observation of facial expression in Eastern observers. Consciousness and Cognition, 94, 103179.
Hidaka, S., Sasaki, K., Kawagoe, T., Asai, N., & Teramoto, W. (2021). Bodily ownership and agency sensations in a natural state. Scientific Reports, 11, 8651.
Hide, M., Ito, Y., Kuroda, N., Kanda, M., & Teramoto, W. (2021). Multisensory integration involved in the body perception of community-dwelling older adults. Scientific Reports, 11, 1581.
Kotegawa, K., Yasumura, A., and Teramoto, W. (2021). Changes in prefrontal cortical activation during motor imagery of precision gait with age and task difficulty. Behavioural Brain Research, 399, 113064.
Kuroda, N. & Teramoto, W. (2021). Expansion of space for visuotactile interaction during visually-induced self-motion. Experimental Brain Research, 239, 257–265.
Teramoto, W. (2020). Development of fall prevention virtual-reality programs based on older adults' body-related information processing. The Japanese Journal of Psychonomic Science, 39, 80-89 (in Japanese).
Kawagoe, T., Kihara, K., and Teramoto, W. (2020). Eastern observers cannot inhibit their gaze to eye and nose regions in face perception. Consciousness and Cognition, 79, 102881.
Kotegawa, K., Yasumura, A., and Teramoto, W. (2020). Activity in the prefrontal cortex during motor imagery of precision gait: An fNIRS study. Experimental Brain Research, 238, 221–228.
Kuroda, N. and Teramoto, W. (2019). Influence of self-motion speed on the spatial extent of human peripersonal space. Transactions of the Virtual Reality Society of Japan, 24, 325–328 (in Japanese).
Kuroda, N., and Teramoto, W. (2019). Sex difference in facilitation effects of auditory motion stimuli on vection. Transactions of the Virtual Reality Society of Japan, 24, 329–334 (in Japanese).
Kotegawa, K., Teramoto, W., and Sekiyama, K. (2019). Motor imagery in older adults: Comparing the JMIQ-R questionnaire and the imagery-pointing task. The Japanese Journal of Cognitive Psychology, 17, 27–36 (in Japanese).
Teramoto, W., Kawano, S., Mori, S. and Sekiyama, K. (2019). Word scanning in native and non-native languages: insights into reading with declined accommodation. Experimental Brain Research, 237, 2411–2421.
Sugita, Y., Hidaka, S., and Teramoto, W. (2018). Visual percepts modify iconic memory in humans. Scientific Reports, 8, 13396.
Teramoto, W. (2018) A behavioral approach to shared mapping of peripersonal space between oneself and others. Scientific Reports, 8, 5432.
Teramoto, W., Hidaka, S., and Sugita, Y. (in press). Auditory bias in visual motion perception. In Hubbard, T. L. (Ed.), Spatial biases in perception and cognition. Cambridge, UK: Cambridge University Press.
Teraoka, R., Watanabe, O., and Teramoto, W. (2017). An ERP study on sound-contingent visual motion perception. Interdisciplinary Information Science, 23(2), 175-178.
Hidaka, S., Higuchi, S., Teramoto, W., and Sugita, Y. (2017). Neural mechanisms underlying sound-induced visual motion perception: An fMRI study. Acta Psychologica, 178, 66-72.
Teramoto, W., Honda, K., Furuta, K., and Sekiyama, K. (2017) Visuotactile interaction even in far sagittal space in older adults with decreased gait and balance functions. Experimental Brain Research, 235, 2391-2405.
Teraoka, R., and Teramoto, W. (2017) Touch-contingent visual motion perception: tactile events drive visual motion perception. Experimental Brain Research, 235, 903-912.
Suzuki, N., Asai, N., and Teramoto, W. (2016) A sense of being together in shared virtual environments: An analysis using the social Simon paradigm. Transactions of the Virtual Reality Society of Japan, 21, 53-62 (in Japanese).
Kondo, Y., Teramoto, W., Kobayashi, M., and Otani, M. (2016) Effects of auditory-somatosensory interactions on auditory distance discrimination accuracy. Transactions of the Virtual Reality Society of Japan, 21, 49-52 (in Japanese).
Teramoto, W., Nakazaki, T., Sekiyama, K., and Mori, S. (2016) Effects of word width and word length on optimal character size for reading of horizontally scrolling Japanese words. Frontiers in Psychology, 7, 127.
Hidaka, S., Teramoto, W., and Sugita, Y. (2015) Spatiotemporal processing in crossmodal interactions for perception of the external world: A Review. Frontiers in Integrative Neuroscience, 9, 62.
Sakamoto, S., Teramoto, W., Terashima, H., and Gyoba, J. (2015) Effect of active self-motion on auditory space perception. Interdisciplinary Information Science, 21(2), 167-172.
Teramoto, W. and Kakuya, T. (2015) Visuotactile peripersonal space in healthy humans: Evidence from crossmodal congruency and redundant target effects. Interdisciplinary Information Science, 21(2), 133-142.
Teramoto, W., Cui Z., Sakamoto, S., and Gyoba, J. (2014) Distortion of auditory space during visually induced self-motion in depth. Frontiers in Psychology, 5, 848.
Hidaka, S., Teramoto, W., Keetels, M., and Vroomen, J. (2013) Effect of pitch-space correspondence on sound-induced visual motion perception. Experimental Brain Research, 231(1), 117–126.
Teramoto, W., Kobayashi, M., Hidaka, S., Sugita, Y. (2013) Vision contingent auditory pitch aftereffects. Experimental Brain Research, 229(1), 97–102.
Teramoto, W., Nozoe, Y., Sekiyama, K. (2013) Audiotactile interactions beyond the space and body parts around the head. Experimental Brain Research, 228(4), 427–436.
Takahashi, J., Hidaka, S., Teramoto, W., and Gyoba, J. (2013) Temporal characteristics of the effects of visual pattern redundancy on encoding and storage processes: evidence from rapid serial visual presentation. Psychological Research, 77, 687–697.
Honda, A., Kanda, T., Shibata, H., Asai, N., Teramoto, W., Sakamoto, S., Iwaya, Y., Gyoba, J., and Suzuki, Y. (2013) Determinants of sense of presence and vraisemblance in audio-visual contents, Transactions of the Virtual Reality Society of Japan, vol.18(1), 93–101 (in Japanese).
Teramoto, W., Tao, K., Sekiyama, K. and Mori, S. (2012) Reading performance in middle-aged adults with declined accommodation. Attention, Perception, & Psychophysics, 74(8), 1722-1731.
Teramoto, W., Sakamoto, S., Furune, F., Gyoba, J. and Suzuki, Y. (2012) Compression of auditory space during forward self-motion. PLoS ONE, 7(6), e39402.
Kobayashi, M., Teramoto, W., Hidaka, S. and Sugita, Y. (2012) Sound frequency and aural selectivity in sound-contingent visual motion aftereffects. PLoS ONE, 7(5), e36803.
Kobayashi, M., Teramoto, W., Hidaka, S. and Sugita, Y. (2012) Indiscriminable sounds determine the direction of visual motion. Scientific Reports, 2, 365.
Hidaka, S., Teramoto, W., and Nagai, M. (2012) Sound can enhance the suppression of visual target detection in apparent motion trajectory. Vision Research, 59, 25-33.
Cui, Z., Teramoto, W., Sakamoto, S., Iwaya, Y., and Suzuki, Y. (2012) The effect of self-motion perception on horizontal sound localization in front and rear spaces, The Transaction of Human Interface Society, 14, 41-48 (in Japanese).
Teramoto, W., Hidaka, S., Sugita, Y., Sakamoto, S., Gyoba, J., Iwaya, Y., and Suzuki, Y. (2012) Sounds can alter the perceived direction of a moving visual object, Journal of Vision, 12(3), article 11.
Hidaka, S., Teramoto, W., Kobayashi, M., and Sugita, Y. (2011) Sound-contingent visual motion aftereffect. BMC Neuroscience, 12, 1-6.
Hidaka, S., Teramoto, W., Sugita, Y., Manaka, Y., Sakamoto, S., and Suzuki, Y. (2011) Auditory motion information drives visual motion perception. PLoS ONE, 6, e17499.
Teramoto, W., and Riecke, B. E. (2010) Dynamic visual information facilitates object recognition from novel views, Journal of Vision, 10, article 11.
Teramoto, W., Hidaka, S., and Sugita, Y. (2010) Sounds move a static visual object, PLoS ONE, 5, e12255.
Hidaka, S., Teramoto, W., Gyoba, J., and Suzuki, Y. (2010) Sound can prolong the visible persistence of moving visual objects, Vision Research, 50, 2093–2099.
Teramoto, W., Hidaka, S., Gyoba, J., and Suzuki, Y. (2010) Auditory temporal cues can modulate visual representational momentum, Attention, Perception & Psychophysics, 72, 2215–2226.
Teramoto, W., Manaka, Y., Hidaka, S., Sugita, Y., Miyauchi, R., Sakamoto, S., Gyoba, J., Iwaya, Y., and Suzuki, Y. (2010) Visual motion perception induced by sounds in vertical plane, Neuroscience Letters, 479, 221–225.
Teramoto, W., Yoshida, K., Hidaka, S., Asai, N., Gyoba, J., Sakamoto, S., Iwaya, Y., and Suzuki, Y. (2010), Spatio-temporal characteristics responsible for high “Vraisemblance!”, Transactions of the Virtual Reality Society of Japan, vol.15(3), 483–486 (in Japanese).
Teramoto, W., Yoshida, K., Asai, N., Hidaka, S., Gyoba, J., and Suzuki, Y. (2010) What is “sense of presence”: A non-researcher's understanding of sense of presence, Transactions of the Virtual Reality Society of Japan, 15, 7–16 (in Japanese).
Hidaka, S., Manaka, Y., Teramoto, W., Sugita, Y., Miyauchi, R., Gyoba, J., Suzuki, Y., and Iwaya, Y. (2009) Alternation of Sound Location Induces Visual Motion Perception of a Static Object, PLoS ONE, 4, e8188.
Teramoto, W., Watanabe, H., Umemura, H. and Kita, S. (2008) Change of temporal-order judgment of sounds during long-lasting exposure to large-field visual motion, Perception, 37, 1649–1666.
Watanabe, H., Teramoto, W., and Umemura, H. (2007) Effect of predictive sign of acceleration on heart rate variability in passive translation situation: preliminary evidence using visual and vestibular stimuli in VR environment, Journal of NeuroEngineering and Rehabilitation, 4, 36.
Teramoto, W., Watanabe, H., Umemura, H., Matsuoka, K., and Kita, S. (2004) Judgment Biases of Temporal Order during Apparent Self-Motion, IEICE Transactions on Information and Systems, E87-D, 1466–1476.
Teramoto, W., Watanabe, H., Umemura, H., Matsuoka, K., and Kita, S. (2004) Objective measure of visually induced self-motion perception using optokinetic nystagmus, Transactions of the Virtual Reality Society of Japan, 9, 51–60 (in Japanese).


Grants


FY2023–FY2025 Grant-in-Aid for Scientific Research (A) (23H00076), "Elucidation of multisensory integration processes and their neural bases in body perception of older adults," Principal Investigator
FY2023–FY2025 Grant-in-Aid for Challenging Research (Exploratory) (23K17644), "Age-related changes in a primitive system for understanding others' actions," Principal Investigator
FY2022–FY2025 Grant-in-Aid for Scientific Research (A) (22H00523), "Realization of a shared-auditory-space communication platform based on human selective information processing," Co-Investigator (PI: Shuichi Sakamoto)
FY2020–FY2024 Grant-in-Aid for Transformative Research Areas (A) (20H05801), "Age-related changes in knowledge-acquisition mechanisms of the perceptual system," Principal Investigator
FY2020–FY2022 Grant-in-Aid for Challenging Research (Exploratory) (20K20867), "Influence of interoception on body representations in older adults," Principal Investigator
FY2020–FY2022 Grant-in-Aid for Scientific Research (C) (20K03484), "Investigation of unconscious perceptual processing of the simultaneity of multisensory information using physiological reflexes as an index," Co-Investigator (PI: Souta Hidaka, Rikkyo University)
FY2019–FY2021 Grant-in-Aid for Scientific Research (A) (19H00631), "Elucidation of older adults' body models and their neural bases," Principal Investigator
FY2019–FY2022 Grant-in-Aid for Scientific Research (B) (19H04145), "Establishment of fundamental technologies for creating auditory spaces that support intellectual and affective activities, considering interactions between people and places," Co-Investigator (PI: Shuichi Sakamoto)
FY2017–FY2019 Grant-in-Aid for Challenging Research (Exploratory) (17K18708), "Crossmodal interactions in affective touch," Principal Investigator
FY2017–FY2019 Grant-in-Aid for Scientific Research (C) (17K00263), "Spatial asymmetry in auditory space perception during self-motion," Co-Investigator (PI: Zhenglie Cui)
FY2016–FY2020 Grant-in-Aid for Scientific Research (S) (16H06325), "Lifestyle and brain function: psychological science for thriving in a super-aged society," Co-Investigator (PI: Kaoru Sekiyama)
FY2016–FY2018 Fund for the Promotion of Joint International Research (Fostering Joint International Research) (15KK0092), "Elucidation of learning mechanisms of crossmodal information integration and their neural bases," Principal Investigator
FY2015–FY2017 Grant-in-Aid for Scientific Research (C) (15K00329), "Research on universal codebooks for image and video coding and low-bit-rate communication," Co-Investigator (PI: Koji Suzuki)
FY2015–FY2016 JSPS Grant-in-Aid for Challenging Exploratory Research (15K12039), "Sense of reality of 'persons' presented in virtual reality space," Principal Investigator
FY2014–FY2017 JSPS Grant-in-Aid for Scientific Research (B) (26285160), "Elucidation of learning mechanisms of crossmodal information integration and their neural bases," Principal Investigator
FY2014–FY2016 JSPS Grant-in-Aid for Scientific Research (B) (26280067), "Establishment of self-motion-sensitive virtual audio-visual space creation technologies based on human processes of external space perception," Co-Investigator (PI: Shuichi Sakamoto)
FY2013–FY2016 JSPS Grant-in-Aid for Scientific Research (A) (25245068), "Lifespan development of embodied cognition and neural plasticity," Co-Investigator (PI: Kaoru Sekiyama)
FY2012–FY2013 JSPS Grant-in-Aid for Challenging Exploratory Research (24650059), "Elucidation of the mechanisms for perceiving the presence of others: auditory-tactile interaction," Co-Investigator (PI: Maori Kobayashi)
FY2012–FY2014 JSPS Grant-in-Aid for Scientific Research (C) (24500270), "Research on low-bit-rate image and video communication usable even in disasters," Co-Investigator (PI: Koji Suzuki)
FY2011 JST Adaptable and Seamless Technology Transfer Program (A-STEP), Feasibility Study Stage, Exploratory Type, "Development of psychological and physiological evaluation methods for virtual reality assessment," Principal Investigator
FY2011–FY2013 JSPS Grant-in-Aid for Young Scientists (B) (23730693), "Elucidation of the effects of visual and vestibular self-motion information on the formation of auditory space," Principal Investigator
FY2010–FY2011 JSPS Grant-in-Aid for Scientific Research (B) (22300091), "Active haptic object perception: psychological and physiological experiments on attention shifts and motor control," Co-Investigator (PI: Shinichi Kita)
FY2009–FY2010 MEXT Grant-in-Aid for Young Scientists (B) (21730585), "Examination of distortions of auditory space formed by large-field visual motion information," Principal Investigator
FY2005–FY2007 JSPS Grant-in-Aid for JSPS Fellows (05J00057), "Virtual reality experiments on human perceptual and behavioral characteristics in dynamic visual environments," Principal Investigator
FY2004 Foundation for the Fusion Of Science and Technology (FOST) research grant, "Research on advanced and original methods in simulation and gaming," Principal Investigator