- Computer Arts Laboratory
- Courses Taught - Undergraduate
- LI10 Introduction to Multimedia Systems
IT09 Sound and Audio Processing
FU14 Intro. to Software Engineering (exercise class)
FU15 Introduction to Data Management (exercise class)
- Courses Taught - Graduate
- Spatial Hearing and Virtual 3D Sound
Introduction to Sound and Audio
Digital Audio Effects
- I am interested in spatial sound, audio signal processing, phonetics, psychoacoustics, and aural/oral human-computer interaction.
- 2021 - Senior Associate Professor, University of Aizu.
2013 - Associate Professor, University of Aizu.
2010 - Researcher, Ikerbasque - University of the Basque Country.
2010 - Ph.D. in Computer Science and Engineering, University of Aizu.
- PSYPHON: Psychoacoustic features for Phonation prediction
- Aural/oral human-computer interaction, real-time programming, visual programming
- • Audio Engineering Society;
• Acoustical Society of Japan;
• Acoustical Society of America;
• IEEE
- Running, snowboarding, playing music, etc.
- • “Catch 22” by Joseph Heller;
• "The Man Who Mistook His Wife For A Hat: And Other Clinical Tales" by Oliver Sacks;
• “The Hitchhiker's Guide to the Galaxy” by Douglas Adams
- Distrust authority. Your strategy in life should be to listen carefully to everybody, test things for yourself, and finally make up your own mind.
- Encuentros entre Colombia y Japón: homenaje a 100 años de amistad (Encounters between Colombia and Japan: a tribute to 100 years of friendship), chapter "De como el mundo es un pañuelo y de las misteriosas maneras" (Of how the world is a handkerchief and of the mysterious ways). Colombian Ministry of Foreign Affairs, Bogotá D.C., Colombia, 2010. Fiction, in Spanish.
- Sound and Audio Technologies
We are interested in sound as a vehicle to transmit information between humans and machines. In our research we focus mainly on spatial sound, applied psychoacoustics, and applied phonetics.
• Spatial sound
The visual channel is saturated with information from the gadgets we use daily; we want to find ways to convey part of that information via spatial (3D) sound using loudspeakers or headphones. We are particularly interested in synthesizing auditory distance and elevation in virtual environments and multi-sensory interfaces.
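As a minimal illustration of the kind of headphone rendering involved (a toy sketch, not the lab's actual method; head radius, level-difference slope, and the Woodworth delay formula are textbook approximations, and the function name is ours), a mono signal can be given an apparent azimuth with crude interaural time and level differences:

```python
import numpy as np

FS = 44100  # sample rate (Hz)

def binaural_pan(mono, azimuth_deg, fs=FS, head_radius=0.0875, c=343.0):
    """Place a mono signal at an apparent azimuth using crude
    interaural time and level differences (ITD/ILD).
    azimuth_deg: 0 = straight ahead, positive = to the right."""
    az = abs(np.radians(azimuth_deg))
    # Woodworth's ITD approximation: (r / c) * (angle + sin(angle))
    itd = head_radius / c * (az + np.sin(az))           # seconds
    delay = int(round(itd * fs))                        # far-ear delay, samples
    gain_far = 10 ** (-6 * abs(azimuth_deg) / 90 / 20)  # far ear up to ~6 dB quieter
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[:len(mono)] * gain_far
    if azimuth_deg >= 0:        # source on the right: right ear is the near ear
        left, right = far, near
    else:
        left, right = near, far
    return np.stack([left, right], axis=1)              # shape (samples, 2)

# Example: a 1 kHz tone placed 45 degrees to the right
t = np.arange(FS) / FS
stereo = binaural_pan(0.5 * np.sin(2 * np.pi * 1000 * t), 45)
```

Real binaural rendering would instead convolve the source with measured head-related transfer functions, but the delayed-and-attenuated far ear above already produces a clear lateral impression on headphones.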
• Applied psychoacoustics
Modern audio hardware can produce signals that exceed the processing limits of the human auditory system. This opens opportunities for new interfaces explored in our lab, such as near-ultrasound communication, bass enhancement using vibration motors, etc.
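To give a flavour of near-ultrasound communication (a toy sketch, not an actual protocol from our lab; the 18.5 kHz carrier, bit rate, and function names are illustrative assumptions), bits can be on-off-keyed onto a carrier near the upper edge of adult hearing and recovered from per-symbol energy:

```python
import numpy as np

FS = 48000          # sample rate (Hz)
CARRIER = 18500.0   # carrier near the edge of audibility (Hz)
BIT_DUR = 0.05      # seconds per bit

def encode_bits(bits, fs=FS, f=CARRIER, bit_dur=BIT_DUR):
    """On-off-key a list of bits onto a near-ultrasound carrier."""
    n = int(fs * bit_dur)
    t = np.arange(n) / fs
    symbol = np.sin(2 * np.pi * f * t)
    return np.concatenate([symbol * b for b in bits])

def decode_bits(signal, fs=FS, bit_dur=BIT_DUR, thresh=0.1):
    """Recover the bits by thresholding per-symbol RMS energy."""
    n = int(fs * bit_dur)
    out = []
    for i in range(0, len(signal), n):
        rms = np.sqrt(np.mean(signal[i:i + n] ** 2))
        out.append(1 if rms > thresh else 0)
    return out

payload = [1, 0, 1, 1, 0]
recovered = decode_bits(encode_bits(payload))
```

Over a real acoustic channel one would add synchronization, error correction, and windowing to suppress audible clicks at symbol boundaries; the sketch only shows the core idea of hiding data just above the typical hearing range.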
• Applied phonetics
In collaborative research, we study the effects of noise on speech, multilingualism, and articulation and phonation phenomena. Speech is the ultimate interaction method for human-machine communication, and understanding how it is produced and perceived in different settings is of paramount importance for speech technologies.
We use sound regularly to communicate with others, yet our understanding of it is so limited that many opportunities for new technologies may still be waiting to be discovered. This is a difficult task that requires a common effort.
We are always open to research collaborations; contact us (convert this into a valid email address: julian at u-aizu period ac dot jp) if you are interested. We are particularly interested in Master's and Doctoral students.
- E. Ly and J. Villegas, "Generating artificial reverberation via genetic algorithms for real-time applications," Entropy, vol. 22, p. 1309, Nov. 2020. DOI: 10.3390/e22111309.
J. Villegas, K. Markov, J. Perkins, and S. J. Lee, "Prediction of creaky speech by recurrent neural networks using psychoacoustic roughness," IEEE J. of Selected Topics in Signal Processing, vol. 14, pp. 355–366, Feb. 2020. DOI: 10.1109/JSTSP.2019.2949422.
J. Villegas, "Movement perception of Risset tones presented diotically," Acoustical Science and Technology, vol. 41, Jan. 2020. DOI: 10.1250/ast.41.430.
I. de la Cruz Pavía, G. E. Alcibar, J. Villegas, J. Gervain, and I. Laka, "Segmental information drives adult bilingual phrase segmentation preference," Int. J. of Bilingual Education and Bilingualism, Jan. 2020. DOI: 10.1080/13670050.2020.1713045.