Can we predict emotional responses to music as we create it? Try out a new melody-making interface and find out.

When we create music, whether for its own sake or as a soundtrack to a film or game, we tend to keep in mind how audiences are likely to perceive its emotional expression: is it likely to make people feel angry, calm, frightened, and so on?

Research at the intersection of music, psychology and software engineering at the University of York, UK, is attempting to answer the question "What if we could predict emotional expressions in music as it is being created?"

One result of this research is a melody-making interface. Why not try it out and see whether the built-in, dynamic emotion sliders match your expressive aims?

Watch our video and then use the melody-making interface to take part.

https://www.youtube.com/watch?v=qtFZAaruHNU

While the beginning of the video is aimed at a slightly older audience, anyone aged 5+ will enjoy trying out the interactive material, which begins at 4'25" into the recording.

Melody-making interface

About the presenter

Dr Tom Collins runs the Music Computing and Psychology Lab in the Music Department at the University of York, UK.


Partners