Com-Note – Composer’s Notebook

The composition of music is a complex, creative and collaborative act. This is currently done with a range of tools including the editing of musical notation, the playing, recording and playback of musical phrases, and their verbal discussion. In this project we will bring these activities together in a single ‘composer’s notebook’ app called Com-Note for a smart phone. This will be based on the trial and extension of an existing multimedia narrative app called Com-Phone, during the creation of a new work by Tom Armstrong for trumpet and string quartet. Com-Phone was created on the Community Generated Media project and is part of the Com-Me toolkit.

<center><iframe width="560" height="315" src="https://www.youtube.com/embed/nZWZCFHwlw0" frameborder="0" allowfullscreen></iframe></center>

Existing music composition software focuses on mixing entire compositions on a desktop or laptop computer. This shifts the locus of composition to a particular place or machine, and fails to capture the spontaneous, distributed and collaborative nature of composition and its relation to performance. Our approach is mobile, flexible and collaborative by design, and more in the spirit of a sketchbook than a mixing desk. Musical ‘sketches’, inspirations and ideas will be recordable piecemeal on a smartphone, and passed between the composer and performer for mutual consideration, extension and revision.

The project was a collaboration between the University of Surrey, the trumpet player Simon Desbruslais and the Ligeti Quartet. It was funded by the EPSRC MILES programme at Surrey under grant number EP/I000992/1.

FreeEye photo and video browsing interface

Intuitive interfaces have become increasingly important in multimedia applications, from personal photo collections to professional media management systems. This research presents a novel, intuitive interactive interface for browsing large image and video collections that visualises the underlying structure of the dataset through size and spatial relations. To achieve this, images or video frames are first clustered using an unsupervised graph-based clustering algorithm. By selecting images in a hierarchical layout on the screen, the user can intuitively navigate through the collection. Experimental results demonstrate a significant speed-up in a content search scenario compared to a standard browsing interface, as well as the inherent intuitiveness of the system.
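The two-stage approach described above (unsupervised graph-based clustering, then a representative image per cluster for the top level of the hierarchical layout) can be illustrated with a minimal sketch. This is not the FreeEye implementation; it assumes images are represented by feature vectors, connects pairs whose distance falls below a threshold, and takes connected components of the resulting graph as clusters:

```python
import numpy as np

def cluster_images(features, threshold):
    """Illustrative unsupervised graph-based clustering: link images whose
    feature distance is below `threshold`, then treat connected components
    of the resulting graph as clusters. Returns a cluster label per image."""
    n = len(features)
    # Pairwise Euclidean distances between feature vectors.
    dists = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    adjacency = dists < threshold
    labels = [-1] * n
    current = 0
    for start in range(n):
        if labels[start] != -1:
            continue
        # Depth-first traversal to label one connected component.
        stack = [start]
        while stack:
            i = stack.pop()
            if labels[i] != -1:
                continue
            labels[i] = current
            stack.extend(j for j in range(n)
                         if adjacency[i, j] and labels[j] == -1)
        current += 1
    return labels

def representatives(features, labels):
    """Pick one representative image per cluster (the one closest to the
    cluster centroid) to show at the top level of a hierarchical layout."""
    reps = {}
    for c in set(labels):
        members = [i for i, l in enumerate(labels) if l == c]
        centroid = np.mean([features[i] for i in members], axis=0)
        reps[c] = min(members,
                      key=lambda i: np.linalg.norm(features[i] - centroid))
    return reps
```

Selecting a representative would then descend one level of the hierarchy, repeating the layout over that cluster's members; cluster sizes can drive the on-screen size of each representative, as in the size-and-spatial-relations visualisation described above.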