SMC 2009 will feature several keynote speeches by the following distinguished researchers:
- Bruce Pennycook, PhD (Professor of Music and Radio-Television-Film, University of Texas at Austin)
- José Carlos Príncipe, PhD (Distinguished Professor of Electrical and Biomedical Engineering, University of Florida, Gainesville)
- Atau Tanaka, PhD (Chair of Digital Media, Newcastle University)
"Who will turn the knobs when I die?"
This presentation will focus on issues affecting interactive music composition and performance. The speaker's experience building several "custom devices" over the past twenty years, all now obsolete, makes it evident that interactive music cannot be integrated into the normal flow of "chamber music," in which performers can readily acquire musical materials (scores, parts, fixed media) and develop their own repertoire. With most interactive works, the composer or some technically skilled surrogate has to be present at each and every concert, which obviously cannot be sustained for long. Are interactive works more like art installations with limited life-spans? Or can we imagine a truly stable technology that will permit these works to be played for years to come?
Professor Bruce Pennycook (Doctor of Musical Arts, Stanford, '78) is a composer, new media developer, and media technology specialist. He taught at Queen's University in Kingston, Ontario, and then at McGill University in Montreal, Quebec, where he developed undergraduate and graduate degree programs in Music Technology and held the position of Vice-Principal for Information Systems and Technology. Pennycook moved to Austin in 2002 and was appointed Professor of Music Composition and Professor of Radio-Television-Film in 2007. He has published a wide range of articles on new music and music technology. His compositions include music for video, electroacoustic music, chamber music, and music for large ensembles, and are performed throughout North America, Europe, and Asia.
"Perception as Self Organization in Space Time"
The exquisite abilities developed by the brain to be aware of the surrounding world are often taken for granted. When engineers attempt to build sensory systems in robotics, they realize how difficult the problem really is. It is fair to say that biology has built the most robust sensory systems, yet very little is known about the design principles involved. This talk will discuss the role of time in auditory perception, critically review some of the tools used in audio engineering, and consider how one can design new self-organized systems to quantify the world around us.
Jose C. Principe is Distinguished Professor of Electrical and Biomedical Engineering at the University of Florida, Gainesville, where he teaches advanced signal processing and machine learning. He is BellSouth Professor and Founder and Director of the University of Florida Computational Neuro-Engineering Laboratory (CNEL). He works in biomedical signal processing, in particular Brain-Machine Interfaces and the modeling and applications of cognitive systems. He has authored 4 books, more than 160 publications in refereed journals, and over 350 conference papers, and has directed over 60 Ph.D. dissertations and 61 Master's theses. Dr. Principe is a Fellow of the IEEE and the AIMBE and a recipient of the IEEE Engineering in Medicine and Biology Society Career Achievement Award. He is also a former member of the Scientific Board of the Food and Drug Administration and a member of the Advisory Board of the McKnight Brain Institute at the University of Florida. He is Editor-in-Chief of the IEEE Reviews in Biomedical Engineering, Past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering, Past President of the International Neural Network Society, and former Secretary of the Technical Committee on Neural Networks of the IEEE Signal Processing Society.
"From Mainframes to DIY Culture – Continuous Evolution in Computer Music"
Computer music, in its second epoch spanning the last twenty years, has undergone fundamental shifts: it has gone real-time, become interactive, been miniaturized, and been completely democratized. I'll map out my personal trajectory over this time to look at broader evolutions in the field involving sensors, networks, and mobility. These are not just technological changes, but changes that bring about shifts in musical approaches. Form factors change, analogue is reconciled with digital, and new directions in Open Source and DIY culture continue to challenge our assumptions about what it means to be an artist, composer, performer, or participant in these evolving musical/technological landscapes.
Atau Tanaka bridges the fields of media art, experimental music, and research. He worked at IRCAM, was Artistic Ambassador for Apple France, has been a researcher at Sony Computer Science Laboratory Paris, and was an Artistic Co-Director of STEIM in Amsterdam. Atau creates sensor-based musical instruments for performance and is known for his work with biosignal interfaces. He seeks to harness collective musical creativity in mobile environments, exploring the continued place of the artist in democratized digital forms. His work has been presented at Ars Electronica, SFMOMA, Eyebeam, V2, ICC, and ZKM, and he has been a mentor at NESTA. He is Chair of Digital Media at Newcastle University and Acting Director of Culture Lab.