Music in City

Oct 22, 2020

This project is a work in progress toward the final and also serves as the midterm project.

Introduction

With Music in City (tentative title), users can plant and discover virtual speakers playing music they love on the streets. Instead of being annoyed by music imposed on them by street performers, people get an augmented experience that filters the soundscape, leaving only the channels that match the genres they like.
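
As a concrete sketch of that filtering, the snippet below keeps only the nearby speakers whose tags overlap a listener's taste. The `Speaker` and `Genre` types, the field names, and the 30 m hearing radius are all hypothetical, not taken from the prototype.

```swift
import simd

// Hypothetical genre tags a listener can subscribe to.
enum Genre: Hashable {
    case jazz, lofi, punk, classical
}

// A planted speaker: a channel, its genre tags, and a position.
struct Speaker {
    let channel: String
    let genres: Set<Genre>
    let position: SIMD3<Float>   // meters, in a shared world frame
}

// Keep only speakers that are close enough AND share a genre with
// the listener's taste; everything else stays silent.
func audibleSpeakers(around listener: SIMD3<Float>,
                     taste: Set<Genre>,
                     in speakers: [Speaker],
                     hearingRadius: Float = 30) -> [Speaker] {
    speakers.filter { speaker in
        simd_distance(speaker.position, listener) < hearingRadius
            && !speaker.genres.isDisjoint(with: taste)
    }
}
```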

Sound is the natural interface of this experience, which begins before users even pull out their phones. This extra layer of interactivity expands the realm of AR from the visual to the auditory, and from focused attention to the periphery.

People mark and endorse the channels they love, giving those channels more exposure and a larger broadcasting area, and eventually shaping the music map of the city. Much like bullet comments (danmaku), the project lets people gather and interact with each other asynchronously in the virtual world, giving it greater potential to be discussed during and after this very time of pandemic. (tentative)
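
One way to turn endorsements into a larger broadcasting area, sketched below, is to grow a speaker's radius logarithmically with its endorsement count and cap it so a single popular channel cannot blanket a whole neighborhood. `broadcastRadius` and all of its constants are placeholders to tune, not values from the project.

```swift
import Foundation

// Hypothetical mapping from endorsements to broadcast radius.
// Grows logarithmically, so early endorsements matter most,
// and is capped to keep any one channel from dominating.
func broadcastRadius(endorsements: Int,
                     baseRadius: Float = 10,     // meters for a fresh speaker
                     gainPerDecade: Float = 15,  // meters per 10x endorsements
                     maxRadius: Float = 120) -> Float {
    let grown = baseRadius + gainPerDecade * log10(Float(max(endorsements, 1)))
    return min(grown, maxRadius)
}
```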

Week 1 & 2

Models

Prototype

  • Plant a new music speaker.
  • Simple interactivity with the object.
  • Object-based AR sound (see the sketch after this list).

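As a minimal sketch of object-based AR sound, the snippet below attaches a positional, looping audio source to a SceneKit node, letting the engine pan and attenuate the track relative to the listener. The asset name `track.mp3` is a placeholder, and whether the prototype uses SceneKit audio at all is an assumption.

```swift
import SceneKit

// Attach a looping, spatialized track to the speaker's node so the
// engine pans and attenuates it as the listener moves around.
func attachMusic(to speakerNode: SCNNode) {
    guard let source = SCNAudioSource(fileNamed: "track.mp3") else { return }
    source.isPositional = true   // render as a 3D point source
    source.loops = true          // street speakers never stop
    source.shouldStream = false
    source.load()                // decode up front to avoid a hitch
    speakerNode.addAudioPlayer(SCNAudioPlayer(source: source))
}
```
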
(Oct. 22) I prototyped the basic journey of the whole experience, from planting a speaker out of the music library to exploring the virtual music map. Some parts of the prototype didn't handle the junction between these two phases well: for example, the HUD updates when the user taps on objects during exploration, but not during sharing and planting.
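
One way to close that gap, sketched below with hypothetical phase names and a made-up `updateHUD` hook, is to route every tap through a single handler that refreshes the HUD unconditionally, so neither phase can forget it.

```swift
// Hypothetical app phases for the two halves of the journey.
enum Phase {
    case planting    // picking a track and placing the speaker
    case exploring   // walking the virtual music map
}

struct MusicHUD {
    var phase: Phase = .planting

    // Single entry point for taps: the HUD refresh happens
    // unconditionally, so no phase can skip it.
    func handleTap(on speaker: String) {
        switch phase {
        case .planting:
            print("confirm placement of \(speaker)")
        case .exploring:
            print("show details for \(speaker)")
        }
        updateHUD(selected: speaker)  // shared across both phases
    }

    func updateHUD(selected: String) {
        print("HUD now shows \(selected)")
    }
}
```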

TODO

  • Object generation, design, and dimensions.
  • Amplitude attenuation curve (a first sketch follows this list).
  • Proximal information (HUD and app interface, e.g. music details, library) versus distal information (AR, e.g. the virtual representation of the music, virtual interactivity), and the transition between the two within one experience.
  • Physical features of the virtual representation, e.g. gathering dust when nobody stops by (see the second sketch below).
  • Interaction with virtual objects.
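
For the attenuation curve, a reasonable first sketch is inverse-distance rolloff: full volume inside a reference distance, gain falling off as 1/d beyond it, and a hard cutoff at the broadcast edge. The constants below are guesses to tune by ear, not values from the project.

```swift
// Inverse-distance rolloff: full volume inside the reference
// distance, gain = refDistance / distance beyond it, and a hard
// cut to silence outside the broadcast radius.
func gain(atDistance d: Float,
          refDistance: Float = 2,    // meters of full volume
          maxDistance: Float = 30) -> Float {
    if d <= refDistance { return 1 }
    if d >= maxDistance { return 0 }
    return refDistance / d
}
```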
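
For the gathering-dust idea, one minimal model maps the time since the last visit to a 0–1 dust level for the speaker's material; the one-week ramp is an arbitrary placeholder.

```swift
import Foundation

// Map neglect to a 0...1 "dust" level: spotless right after a
// visit, fully dusty after a week alone.
func dustLevel(since lastVisit: Date,
               now: Date = Date(),
               fullDustAfterDays: Double = 7) -> Double {
    let days = now.timeIntervalSince(lastVisit) / 86_400
    return min(max(days / fullDustAfterDays, 0), 1)
}
```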