The Technology Institute for Music Educators (TI:ME) offers courses, workshops, and conferences on technology and its use to music educators and students of music and education. TI:ME offers two levels of certification, much as Suzuki, Orff, or Kodály programs certify earned levels of knowledge and expertise. For those already working in the field, TI:ME offers alternate certification for its Level 1 certification. This and the following two articles are papers I wrote for that alternate certification.
First, a little clarification. Back in the day (as my students like to say), music software was divided into three separate and distinct categories: software to manipulate MIDI, audio, and notation were separate purchases. Today, the lines are blurred. Sibelius and Finale, traditionally notation programs, offer some good MIDI and even audio capabilities. Logic, traditionally a MIDI program, offers excellent audio and good notation. And now that Digidesign and Sibelius are owned by the same parent company, Avid, Digidesign’s software, traditionally an audio program, has some good notation and will only get better. Although these programs can be considered “triple threats,” it’s still important to choose software based on your primary needs. Although I can get some good notation out of Logic Studio, I wouldn’t publish music with it. The same can be said for a notation program: it’s not my first choice for composing or teaching music through composition. I know that may be heresy to some, but more on that in another article.
Electronic Instruments and MIDI
Today’s electronic music instruments come in all shapes and sizes: keyboards, drums, guitars, winds, strings, and even instruments you can play on your cell phone! Around 1983, electronic instrument manufacturers got together and created a standard for transmitting musical information. MIDI (Musical Instrument Digital Interface) was born, and a new world of music was created.

MIDI is simply a standard digital format that lets devices communicate with each other. Think of it as the language of electronic music. When you hit a note on a MIDI keyboard, a series of parameters is captured and translated into a digital format (a collection of 1s and 0s). MIDI parameters include pitch (the actual note you want to play), velocity (how hard you hit the note, which affects the sound or tone quality), volume (loud, soft, and everything in between), pan (how much you hear in the right or left ear), and other parameters that can be entered and manipulated by the user.

Remember, MIDI is just digital information; it carries no sounds. You’ll need something to interpret the MIDI information and reproduce the sounds. That’s the sound module, sometimes called the tone bank: a computing device that interprets the digital information and reproduces it as sound. All sound modules support the industry’s General MIDI standard, which includes a collection of 128 preset sounds organized into 16 families (groups of sounds such as keyboards, guitars, basses, strings, reeds, sound effects, drums, and others) along with certain functional requirements, including the ability to interpret velocity and to support polyphony (multiple simultaneous voices). What separates one manufacturer’s keyboard from the next, and what distinguishes entry-level keyboards from more expensive ones, are the proprietary sounds in the sound module and the user’s ability to manipulate those sounds on the instrument.
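To make the “collection of 1s and 0s” idea concrete, here is a minimal Python sketch of what a keyboard actually sends when you strike a key. In the MIDI standard, a Note On event is just three bytes: a status byte (0x90 plus the channel number, 0–15) followed by the pitch (0–127) and the velocity (0–127). The function name below is my own illustration, not part of any particular library.

```python
def note_on(channel, pitch, velocity):
    """Build the three raw bytes of a MIDI Note On message.

    channel:  0-15  (MIDI supports 16 channels)
    pitch:    0-127 (60 = middle C)
    velocity: 0-127 (how hard the key was struck)
    """
    if not (0 <= channel <= 15 and 0 <= pitch <= 127 and 0 <= velocity <= 127):
        raise ValueError("parameter out of MIDI range")
    # 0x90 is the Note On status; the low four bits carry the channel.
    return bytes([0x90 | channel, pitch, velocity])

# Middle C on channel 0, struck fairly hard:
msg = note_on(channel=0, pitch=60, velocity=100)
print(msg.hex())  # -> 903c64
```

Those three bytes are all the keyboard transmits; the sound module on the receiving end is what turns them into an actual piano, string, or drum sound.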