Creating Audio Reactive Visuals to Provide Immersive Experience for Hearing Impaired Audience in Live Concert Performance

Dr. Dynaya Bhutipunthu and Yuttana Santivong

Abstract :

Our present-day world reflects the “BANI WORLD” concept introduced by Jamais Cascio, which is the theme of this symposium: B – Brittle, A – Anxious, N – Nonlinear, and I – Incomprehensible (Cascio, J., 2022). One counter-response to the BANI world that the investigators of this project focus on is promoting individuals’ “good health and well-being” by embracing “diversity” among all of us, including inclusive groups. Applying Goals No. 3 and No. 10 of the United Nations Sustainable Development Goals (UN SDG, 2015), the investigators set the emphasis of this project on actions that help reduce inequality and promote the well-being of one of these inclusive groups, the Hard of Hearing (HoH), through the use of music, visual communication design, and new technologies.

New technologies and new approaches are available to facilitate and assist HoH audiences in enjoying music in their own interpretative ways. In visual communication, however, visually driven design work that uses interactive design with audio reactive visuals to create immersive experiences for this group of audiences is not emphasized enough, even though it is one way to create enjoyable musical experiences in live concert performances, not only for HoH audiences but for all audiences, in line with the “Music for All” approach (Music for All Inc., 2024). This encourages an integrated environment in which diversity is embraced. The objective of this study is therefore to search for a way to create a series of interactive visuals that integrates audiences’ insights and needs into the design executions, and to use the findings to create visual design works that provide an immersive experience for HoH audiences in live concert performance, using the song “Ji-Ja” (จิ๊จ๊ะ) by the band “Silly Fools” (GMM Music Public Co., Ltd., 2000) as the case study of this project. This study is part of the “SZENSE Music Festival 2024” event, hosted through the collaborative efforts of the Mahidol University International College, the College of Music, and Ratchasuda College to create musical experiences for all.

Objectives :

To search for a way to create a series of interactive visuals through a co-creation design process, and to create such a series for the selected song (Ji-Ja – จิ๊จ๊ะ) that provides an immersive experience for HoH audiences in live concert performance.

Conceptual Framework :

The conceptual framework of this study covers; 1.) A literature review and a review of best practice projects, 2.) Focus group discussions with experts and Hard of Hearing (HoH) audiences from Ratchasuda College, and the conclusion of the findings, 3.) The creation process of a series of interactive visuals for one of the 15 songs in the concert performance, “Ji-Ja” (จิ๊จ๊ะ) by the band “Silly Fools” (GMM Music Public Co., Ltd., 2000), and 4.) Conclusions and suggestions drawn for future development.

Research by Adrian Bossey, “Accessibility in all areas? UK live music industry perceptions of current practice and Information and Communication Technology improvements to accessibility for music festival attendees who are deaf or disabled” (Bossey, A., 2020), draws attention to the inclusion of HoH audiences in live concert events.

New technologies are available to facilitate and assist HoH audiences in enjoying music in their own interpretative ways. One example is SUBPAC, a tactile audio platform developed by SUBPAC Head of R&D Sarosh Khwaja and electronics designer Andrew Kilpatrick in 2013 (Khwaja, S., Kilpatrick, A., 2024); the device was recently used in “Coldplay”’s inclusive concert experience in 2022 (CBS, 2022). Vibration responding to music frequency is also widely used for HoH audiences, as in Jason Torres’s research project, “Perception of Music in the Deaf and Hard of Hearing” (Jason, T., 2019). In Thailand, there was also a series of concerts for HoH audiences sponsored by the LoveiS music record label in 2020 called “Love Is Hear” (LoveiS, 2020). In addition, the investigators reviewed best practice projects on motion projection displays and works that use digital interaction to create immersive experiences for audiences, drawn from the “Global Design Awards” collections of the Society for Experiential Graphic Design (SEGD) under the “Digital Experience” category (SEGD, 2024). Applying Neuro Design in the selection of key graphics and visual arrangements for audiences’ sensory engagement (Bridger, D., 2017) and integrating a Participatory Design process (Armstrong, H. and Stojmirovic, Z., 2011) in the creation of the series of motion graphics and audio reactive visuals are the focus of this study.

Process / Methodology :

The creation process of the design work starts from 1.) The integration of the information received from the review of best practice projects with the conclusions of the findings on the audiences’ pain points, needs, and insights. 2.) The study of the selected song “Ji-Ja” (จิ๊จ๊ะ) (GMM Music Public Co., Ltd., 2000), including the interpretation of the meaning of the lyrics, the tempo/rhythm (BPM), and the amplitude (AMP) of the song’s music, in order to create motion graphic clips and audio reactive visuals for the song. 3.) The creation of a storyboard for the motion graphic clip as a base clip displayed with the backing track of the song, together with the key graphics and asset designs for the audio reactive visuals shown with the music’s backing track in real time.
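As a minimal, illustrative sketch of the analysis in step 2.), the Python code below shows how the tempo (BPM) and a normalized amplitude envelope of a backing track could be estimated offline, assuming the librosa library and a placeholder file name; in the project itself this analysis was handled in real time by Resolume rather than by custom code.

```python
# Minimal sketch (not the production pipeline): offline estimation of the
# tempo (BPM) and a normalized amplitude envelope of a backing track.
# The file name is a placeholder, not an actual project asset.
import numpy as np
import librosa

AUDIO_PATH = "ji_ja_backing_track.wav"  # hypothetical path

# Load the backing track as mono audio at its native sample rate.
y, sr = librosa.load(AUDIO_PATH, sr=None, mono=True)

# Estimate the global tempo and the beat positions (in seconds).
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
tempo_bpm = float(np.atleast_1d(tempo)[0])
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

# Compute a frame-wise RMS amplitude envelope, normalized to 0..1,
# which could later drive the scale or opacity of a visual asset.
rms = librosa.feature.rms(y=y)[0]
amplitude_envelope = rms / (rms.max() + 1e-9)

print(f"Estimated tempo: {tempo_bpm:.1f} BPM")
print(f"First beat times (s): {np.round(beat_times[:4], 2)}")
print(f"Amplitude envelope frames: {amplitude_envelope.shape[0]}")
```

The two quantities sketched here, beats per minute and amplitude, are the same parameters that drive the two families of audio reactive assets described below.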

There are three sets of works projected on the main screen in synchronization with the concert performance. 3.1) The first set was created as the main motion graphic clip displayed throughout the performance; its storyboard, including character designs, background drawings, and graphic elements, was created based on the investigators’ interpretation of the story of the song “Ji-Ja” (GMM Music Public Co., Ltd., 2000) and the relationship dynamics between its two characters.

3.2) The second set of designs is a series of key graphic assets used to synchronize with the “Rhythm/Tempo,” or beats per minute (BPM), of the song in real time; these asset designs are called “audio reactive visuals.” The asset designs responding to the Rhythm/Tempo (BPM) of the song cover 1.) A black and white graphic line-drawing illustration of a vintage radio and 2.) Graphics of generative particles. 3.3) The third set of designs comprises the audio reactive visuals responding to the “Amplitude” (AMP) of the song’s music, including 1.) Graphics of the characters’ hands depicting various sign languages and 2.) Hand-drawn letterings of keywords reflecting the storyline and representing the interpreted content of the song.
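As a minimal sketch of how these two kinds of responses could be mapped to visual parameters, the code below uses assumed parameter values (decay rate and scale range are illustrative, not the project’s actual Resolume settings): beat events drive a decaying pulse for the BPM-synchronized assets such as the vintage radio illustration, while the normalized amplitude drives the scale of the AMP-reactive assets such as the hand graphics and letterings.

```python
# Minimal mapping sketch with assumed parameter values (decay rate and
# scale range are illustrative, not the project's actual settings).
import math

def beat_pulse(t: float, last_beat_time: float, decay: float = 6.0) -> float:
    """Return a 0..1 pulse that spikes on each beat and decays exponentially."""
    dt = max(t - last_beat_time, 0.0)
    return math.exp(-decay * dt)

def amplitude_to_scale(amplitude: float,
                       min_scale: float = 0.8,
                       max_scale: float = 1.4) -> float:
    """Map a normalized amplitude (0..1) to an asset scale factor."""
    a = min(max(amplitude, 0.0), 1.0)
    return min_scale + a * (max_scale - min_scale)

# Example: 0.1 s after a beat, during a loud passage (amplitude 0.9).
print(round(beat_pulse(t=10.1, last_beat_time=10.0), 3))  # ~0.549
print(round(amplitude_to_scale(0.9), 3))                  # 1.34
```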

Techniques and Materials :

Graphics of the generative particles and other additional assets were created in the software “Resolume Arena” (https://www.resolume.com/), which the investigators used to combine all pre-designed assets with Resolume’s generative graphics into a four-second clip of two layers; 1.) Audio reactive visuals responding to the “Amplitude” (AMP) of the song’s music and 2.) A layer of graphic assets synchronized to the “Rhythm/Tempo,” or beats per minute (BPM), of the song. These assets were synchronized with Resolume’s CompositionFFT, which analyzes the audio frequency spectrum to identify audio intensity; the resulting intensity values were then used to drive additional layers of audio reactive visuals.
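The sketch below is a rough approximation of what such an FFT-based intensity analysis involves; it is not the Resolume API, and the band edges and normalization constant are assumptions for illustration only. One audio block is windowed, its magnitude spectrum is split into low, mid, and high bands, and each band’s mean magnitude is normalized to a 0..1 intensity that could drive the opacity of a visual layer.

```python
# Rough approximation of an FFT-based audio intensity analysis (this is
# NOT the Resolume API; band edges and normalization are assumptions).
import numpy as np

def band_intensities(block: np.ndarray, sr: int) -> dict:
    """Return normalized 0..1 intensities for low/mid/high frequency bands
    of one mono audio block with samples in the range -1..1."""
    window = np.hanning(len(block))
    spectrum = np.abs(np.fft.rfft(block * window))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sr)

    bands = {"low": (20, 250), "mid": (250, 2000), "high": (2000, 8000)}
    intensities = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        energy = spectrum[mask].mean() if mask.any() else 0.0
        intensities[name] = float(np.clip(energy / 10.0, 0.0, 1.0))
    return intensities

# Example: a 1024-sample block of a 100 Hz tone at 44.1 kHz registers
# mostly in the "low" band.
sr = 44100
t = np.arange(1024) / sr
block = 0.8 * np.sin(2 * np.pi * 100 * t)
print(band_intensities(block, sr))
```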

The revision and finalization of the works included a series of rehearsals in the studio on the 5th floor of the Aditayathorn Building, Mahidol University International College, and in the concert hall, the “Blackbox Theater,” on the 6th floor of the Southeast Asia Music Museum, College of Music, Mahidol University. Each rehearsal reflected what worked and what needed improvement, and enhanced knowledge transfer among the university’s academic staff and everyone else involved through the working collaborations. Working with lighting design was another essential part of the creation process: the lighting designer was involved from the beginning of the process, and the investigators consistently gathered feedback, tested, and revised how the lighting works with the visuals to allow a better understanding of the visual interpretation of the song’s music. Finalizing all clips was part of the preparation for the final rehearsal of the concert, using Adobe After Effects to compose and render out all clips. The investigators tested the organization of all playlist clip sequences in Resolume to assist with backing-track synchronization in real time in the actual environment, the concert hall.

The video clip of the final motion graphic for the song and the recorded video from the concert day can be viewed through the links below; 1.) The final motion video clip with all layers and lyrics: https://drive.google.com/file/d/1efioRLC9nSS4qXZP89SU2XWU4vHgkvvZ/view?usp=sharing and 2.) The recorded video clip of the concert day’s performance: https://drive.google.com/file/d/1GftLcVyHehgvCFqd65WTc1N__Ojv_V32/view?usp=sharing.

Result / Conclusion :

In conclusion, the investigators identified a working process for the creation of a series of audio reactive visuals for the selected song (Ji-Ja – จิ๊จ๊ะ) to help provide immersive experiences for HoH audiences in live concert performance. The process integrates the findings from the audiences’ insights with the executions of the final design, focusing on three main parts of the work; 1.) A series of audio reactive visuals created to; 1.1) Communicate the storyline of the song, displayed throughout the song’s backing track, 1.2) Correspond to the Rhythm/Tempo (BPM) of the music in real time, and 1.3) Reflect the Amplitude (AMP) of the song. Resolume was used in this first part to assist in organizing the display sequences of all asset clips (1.1–1.3) and to implement the additional generative graphic assets. 2.) Part 2 of the work is to have sign language interpreters communicating the song’s lyrics and storylines, the performers’ conversations, and the emotional dynamics to the audiences in real time in a live concert environment. 3.) Part 3 is the integration of other sensory channels, especially the vibrations of the seating stage on which the audiences sit and stand during the performance; this also helps the audiences embrace and enhance the immersive experience.

Additional suggestions for the improvement of the work can be drawn from post-event evaluation and feedback from all stakeholders who were involved in the production of the motion graphics and the organization of the concert, as well as those who attended and participated in the concert performances. A Phase II of the study, aiming to gather this information and to identify revision directions for future improvement of the work, is recommended.

The collaboration among all parties involved in the development of the project not only helped create knowledge transfer among multidisciplinary academic staff, university partners, and students, but also encouraged a positive working environment, one that shares a similar focus on making inclusive groups part of all groups, part of us, “embracing diversity and supporting equality,” as one of the sustainable approaches to counter the “BANI WORLD.”

References :

  1. AIGA (The Professional Association for Design). (2016). The Design Process from Unit 3A Curriculum, AIGA Minnesota Innovative grant funded project. Academic and Design Education. Retrieved 12 March 2025 from: https://www.aiga.org/sites/default/files/2021-03/3A_DesignProcess_Introduction.pdf.
  2. Armstrong, H. and Stojmirovic, Z. (2011). Participate: Designing with User-Generated Content, A Designer’s Guide to Co-Creation: Princeton Architectural Press, New York.
  3. Bossey, A. (2020), Accessibility in all areas? UK live music industry perceptions of current practice and Information and Communication Technology improvements to accessibility for music festival attendees who are deaf or disabled: International Journal of Event and Festival Management, Vol. 11 No. 1, pp. 6-25. https://doi.org/10.1108/IJEFM-03-2019-0022.
  4. Bridger, D. (2017). Neuro Design; Neuromarketing insights to boost engagement and profitability. Kogan Page Limited; New York.
  5. Cascio, J., (2022). Think Tank: BANI World. Retrieved 12 March 2025 from: https://futurist.com/futurist-thinktank/jamais-cascio-futurist-speaker/.
  6. CBS. (2022). Coldplay x SUBPAC Inclusive Concert Experience. Retrieved Oct 9, 2024, from: https://www.youtube.com/watch?v=xypUnMXpXFA.
  7. Eagleman, D. (2021). Buzz; Neosensory Sound-Sensing Wristband for the Deaf & Hard of Hearing. The Henry Ford’s Innovation Nation. Retrieved Oct 9, 2024, from: https://www.youtube.com/watch?v=ZKoicU-zorA.
  8. Garrix., M. (2016). 7UP + Martin Garrix; A Concert for the Deaf. Retrieved Oct 9, 2024, from: https://www.youtube.com/watch?v=vGF1KlaGa1E.
  9. GMM Music Public Co., Ltd. (2000). Ji-Ja (จิ๊จ๊ะ) by Silly Fools: Official Music Video. Retrieved 12 March 2025 from: https://www.youtube.com/watch?v=1hOzfYC-YCI.
  10. Interaction Design Foundation. (2023). What is Participatory Design? Interaction Design Foundation (IxDF). Retrieved Oct 9, 2024, from: https://www.interaction-design.org/literature/topics/participatory-design.
  11. Jason, T. (2019). Perception of Music in the Deaf and Hard of Hearing. Capstone Projects and Master’s Thesis. 681. Retrieved Oct 9, 2024, from: https://digitalcommons.csumb.edu/caps_thes_all/681.
  12. Khwaja, S., Kilpatrick, A. (2024). SUBPAC technology. Retrieved Oct 9, 2024, from: https://subpac.com/what-is-the-subpac/.
  13. LOVEiS. (2020). “Love Is Hear” concert. LoveiS music records. Happening and Friends. Retrieved Oct 9, 2024 from: https://happeningandfriends.com/article-detail/228?lang=th.
  14. Mahidol Music Channel. (2024). SZENSE Music Festival 2024: filmed and produced by Music Journey Show. College of Music, Mahidol University. Retrieved 12 February 2025 from: https://www.youtube.com/watch?v=tq_2tgbzNk0.
  15. Miles, M. B. and Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook: Sage Publications, Inc.
  16. Music for All. (2024). What is Music for All? Music for All Inc. Retrieved Oct 9, 2024, from: https://musicforall.org/.
  17. Resolume B.V. (2024). Resolume Arena & Avenue. Retrieved 12 March 2025 from: https://resolume.com/software/avenue-arena.
  18. SEGD. (2024). Global Design Awards. Retrieved Oct 9, 2024, from: https://segd.org/projects/?_practice_area=digital-experiences.
  19. Touch Designer, Derivative. (2024). Touch Designer. Retrieved Nov 9, 2024, from: https://derivative.ca.
  20. United Nations. (2015). United Nations: 17 Sustainable Development Goals (SDGs). Retrieved 12 March 2025 from: https://sdgs.un.org/goals.
  21. WNYC Studio. (2022). Replay: Deaf Concert-Goers Can Feel the Beat. The TakeAway. Retrieved Oct 9, 2024, from: https://www.wnycstudios.org/podcasts/takeaway/segments/deaf-concert-goers-feel-beat.
