If there is one thing I have learned this semester in Digital Humanities, it is that technology has nearly infinite uses. We have seen it applied in architecture, medicine, entertainment, education, the environment, and much more. It is changing the way we go about everyday life, creating a fast-paced and efficient society. For my semester project, I wanted to use Google Glass and the Structure Sensor to improve an aspect of human life, and I ultimately decided on improving the experience of a musical event for the hearing impaired.
Last summer, I attended a Cirque du Soleil event. The performance was incredible: vibrant lights, daring stunts, and feel-good music echoing throughout the stadium. As I watched, I noticed that the section of seats right in front of me was reserved for deaf and hearing-impaired audience members. Throughout the show I found myself concentrating on the signed words of the interpreter and occasionally losing focus on the performance in the background. Although the hearing-impaired audience members could not hear what was going on around them, they could clearly feel the vibrations, judge the mood from the movements of the dancers, and piece it all together with the lyrics. While we cannot give the deaf and hearing impaired back their hearing in all cases, I thought it would be really neat if, for entertainment purposes (like Cirque du Soleil), we could enhance or over-compensate for one of their other senses and add another visual element.
Thus the idea of projecting musical notes and “moods” onto Google Glass was born. What I wish I had the capability to create is 3D imaging of musical sound waves using the Structure Sensor. I imagine using a cymatics machine, playing sounds and instruments of various pitches, and creating a sort of dictionary of images for sounds (for an example of cymatics, see https://www.youtube.com/watch?v=GtiSCBXbHAg). With the Structure Sensor we could map these sounds and create an accurate image of what we hear. However, the image alone projected on the glasses would not be enough; the images would need to be color-coordinated as well. In my music gen ed, “The Lively Arts”, we have talked a lot about the perception, moods, and feelings of artistic pieces. What I have learned is that many colors convey particular emotions: a dark blue is more likely to convey a heavy sadness than a bright yellow would. For this reason, I think each note shape would need to be paired with a color to better convey what is going on in the music. Once these images are created and paired with colors, we could use them on Google Glass. Audience members would wear the glasses during the performance; the Glass microphone would detect the sounds being played, match them against the dictionary, and instantly project the corresponding images. Along with the sign language interpreter, the vibrations and visual performance, and possibly some sort of rhythmic sensor mousepad, I think the Google Glass projections could be extremely helpful.
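To make the detect-match-project pipeline a little more concrete, here is a minimal sketch of the sound-to-image “dictionary” lookup in Python. Everything in it is a hypothetical illustration: the pitch ranges, shape names, and mood colors are invented for the example, and a real system would analyze live microphone audio on the Glass rather than take a frequency as input.

```python
# Hypothetical sound dictionary: pitch ranges (in Hz) paired with a
# cymatic shape and a mood color, as described above. These entries
# are illustrative only, not real cymatics data.
SOUND_DICTIONARY = [
    # (low_hz, high_hz, shape, color)
    (20,   250,  "broad ring",    "dark blue"),     # low, heavy tones
    (250,  1000, "six-fold star", "deep purple"),   # mid-range tones
    (1000, 4000, "fine lattice",  "bright yellow"), # high, bright tones
]

def image_for_pitch(frequency_hz):
    """Return the shape/color pair to project for a detected pitch,
    or None if the pitch falls outside the dictionary."""
    for low, high, shape, color in SOUND_DICTIONARY:
        if low <= frequency_hz < high:
            return {"shape": shape, "color": color}
    return None

# Example: a 440 Hz note (concert A) falls in the mid range.
print(image_for_pitch(440))
```

The same range-lookup idea would extend naturally to the mood-color pairing: instead of one table, the system could consult one mapping for shape (from pitch) and another for color (from the mood of the passage).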
I do see some areas of trouble with my semester project. The first is the creation of the dictionary. An important thing to note about art is that not every observer will have the same reactions, interpret a piece the same way, or associate the same colors with the same moods. This could pose an issue when building the dictionary, because there is no clear way to decide which shape and color should represent a given sound. It could be especially difficult for audience members who have potentially never heard a note of music in their lives. Additionally, I do not think Google Glass could be used alone to accurately portray the musical tone of a concert. There are so many moving parts in a musical piece that it could be difficult to display them all on the small surface area of Google Glass. This might require picking out the most important pieces of the music beforehand, possibly in collaboration with the composers. With all of that being said, I also think there is great potential for expanding the Google Glass projection tool. In the future, it could be used for audiobooks, subtitles during speeches, or something as routine as a conversation. With technology taking over every aspect of our lives, it can be both scary and exciting to see just how far it will go.