Thursday, November 1, 2012
The shift in the music paradigm
It's impossible to deny that music has changed over the years, and this constant change seems to accelerate as new technologies become available.
In the past, music was conceived in the mind and then written down on paper; a group of instrumentalists would then perform it live. The only tangible thing was a sheet of paper. Whenever you heard a particular song, how it sounded depended on the performers, since there was no recording that could capture any single performance.
As technology evolved and it became possible to record a specific song at a particular place and time, the paradigm shifted. At that point you could hear any recorded song without needing musicians with instruments in front of you. However, writing and composing music was still something conceived first in the mind; only once the piece was completed was it recorded.
The sound of the first records was crude, with a pinch of distortion; speakers could not reproduce the whole audible frequency spectrum, and you were limited to mono recordings. Over the years the equipment evolved and improved: multitrack stereo recorders became available, along with stereo speakers that could reproduce the full frequency spectrum, effects, and more. With all of this at the musician's disposal, the mixing board, effects, panning, type of recording, choice of microphones, and so on came to be considered an essential part of the music creation process, since sounds could now be processed intentionally without the earlier limitations. (Listen to any of the first stereo recordings and you will notice that everything is panned fully to one speaker or the other.) As new instruments (electric guitars, basses, electric pianos) reached the mass market, the most valued musicians were the extreme virtuosos.
The introduction of computers into the music production process changed everything again. Operations such as copy, cut, and paste became trivially easy, and expensive hardware was slowly replaced by software. The use of synthesis, samplers, and the like changed not only the kinds of sounds with which music was performed but also how it is composed. I like to think of music nowadays as a collaboration between you and the sounds you have. What matters is how sounds work with each other, not how they sound by themselves. No matter how cool you think a sound is on its own, if it doesn't sit well with the others, it becomes useless.
With all of this said, I encourage all of you new music makers to think about these changes in how music is conceived nowadays.
Music is about how a combination of sounds makes you feel and the emotions it triggers inside you. Especially in electronic music made with machines, where the human element is somewhat left aside, you have to work out how to combine those sounds (simple, complicated, loud, quiet, long, short, whatever) into something that sounds good and makes you feel things.