The rise of computers and electronics has profoundly transformed music, affecting not only the way music is created, recorded, and performed but also how it is distributed and consumed. From the earliest electronic sounds to today’s sophisticated digital music systems, this technological revolution has reshaped the entire musical landscape. This essay will explore the history of how computers and electronics revolutionised music, from the initial experiments with electronic sounds to modern developments in digital instruments, production, and performance.
Let’s start right at the beginning….
Taking things way back, you can make the case that the beautiful-sounding Denis d’or (the ‘Golden Dionysius’) was very possibly the first electrical musical instrument. It was ‘built’ by the Czech electrical researcher Václav Prokop Diviš in 1748, who claimed to be able to recreate string and wind sounds with it.
You’ll note that we say ‘claimed’ and ‘very possibly’, so it might be better to move to firmer historical ground, over a century later, when at least some of the claims were justified and recorded. (We’ll also note that other accounts of the time state that the instrument gave its users electric shocks… so we’re not exactly talking plug ‘n’ play!)
For this, we travel forward to the latter half of the 19th century and Matthäus Hipp’s electromechanical piano, which triggered sound by using a keyboard to activate electromagnets.
Beep! Parp!
The First Truly Electronic Sounds
The first significant experiments with electronic sounds began in the early 20th century. In 1897, Thaddeus Cahill invented the **Telharmonium**, an early electromechanical instrument that produced sounds using rotating electromagnetic tone wheels. The instrument was able to broadcast its sounds over telephone lines, marking one of the first uses of electronic signals to transmit music.
Meanwhile, Elisha Gray is also noted as (almost) accidentally inventing the synth oscillator while developing a telephone prototype in the late 19th century. But if you think it’s a complex history in this era, just wait until the next century.
Things get even more complex in the early 20th century, with a range of instruments that could and would lay claim to chunks of electronic music history, not least the electric organ and a couple of standout devices, notably the Helmholtz Sound Synthesiser and the Theremin.
As we progress through the century, Ivor Darreg, in the 1930s and 40s, designed instruments called the Electronic Keyboard Oboe and the Electric Keyboard Drum, which could lay claim to being among the first electric instruments designed specifically for music.
And as we dig deeper, other important instruments included the 1930s Trautonium, built by Friedrich Trautwein and later enhanced by Oskar Sala into the post-Second World War Mixtur-Trautonium. Using a ‘subharmonic’ technique of sound creation, these allowed you to mix, match and modulate signals with controls like envelope shapers and frequency shifters that would later become common in subtractive synths.
By the 1950s, computers entered the scene, marking another leap forward. The **CSIRAC** (Council for Scientific and Industrial Research Automatic Computer), an early Australian computer, was the first to play digital music in 1951. It was a rudimentary series of tones, but it laid the groundwork for future possibilities. Around the same time, the RCA Mark II Sound Synthesizer, developed in the United States at Columbia University, became the first programmable electronic music device.
In the UK in this period, the BBC Radiophonic Workshop (established in 1958) came to prominence, thanks in large measure to its work on the BBC science-fiction series Doctor Who. One of the most influential British electronic artists in this period was Workshop staffer Delia Derbyshire, who is now famous for her 1963 electronic realisation of the iconic Doctor Who theme, composed by Ron Grainer.
Shit gets real…
Computers and Music in the 1970s and 1980s
By the 1970s and 80s, as computing technology advanced, musicians and composers began using computers to compose music and assist in live performances. Early music composition programs such as **MUSIC-N** and its successors allowed musicians to create and manipulate sound in ways that had never been possible before. These programs could sequence notes, adjust tone, pitch, and timing, and even generate entirely new sounds. The **Fairlight CMI (Computer Musical Instrument)**, released in 1979, was one of the first digital sampling synthesizers, allowing musicians to record and play back any sound digitally. This era also saw the rise of computer-assisted music creation, with software enabling composers to edit and perfect their work before performance.
The Creation of MIDI
One of the most significant developments in electronic music was the creation of the **MIDI (Musical Instrument Digital Interface)** standard in 1983. Before MIDI, there was no universal way for electronic instruments and computers to communicate. MIDI changed that by allowing synthesizers, keyboards, and computers to exchange musical information in real-time. This opened up a world of possibilities for musicians, enabling them to control multiple instruments from a single device and synchronize different sound sources easily. MIDI remains a cornerstone of modern digital music production.
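To make the idea concrete, here’s a minimal sketch (in Python, purely for illustration — not any real sequencer’s code) of the three-byte note-on message at the heart of the MIDI 1.0 protocol: one status byte naming the message type and channel, followed by two data bytes for pitch and velocity.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a MIDI 1.0 note-on message (channel 0-15, note/velocity 0-127)."""
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of range for MIDI 1.0")
    status = 0x90 | channel          # 0x9n = "note on" for channel n
    return bytes([status, note, velocity])

# Middle C (note 60) at moderate velocity, on channel 1 (index 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # '903c64'
```

Because every MIDI-speaking synth, keyboard and computer interprets those same three bytes identically, one controller can drive a whole rack of gear — which is exactly why a 1983 standard still underpins studios today.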
Synths, prog and New Romantic revolution
Evolution of Keyboards and Synthesisers
Electronic keyboards and synthesizers have evolved significantly since their invention. Early electronic instruments like the **Moog synthesizer** in the 1960s were analog devices, requiring manual tweaking of knobs and switches to shape sound. However, by the 1980s, digital synthesizers such as the **Yamaha DX7** became prevalent, offering more reliable and versatile sound production. These digital instruments could create a wide array of sounds through synthesis algorithms (the DX7’s famous FM synthesis among them) and, later, stored samples, giving musicians more control over their creative output.
The ability to store sounds digitally allowed for the rise of **sampling technology**, where musicians could record real-world sounds—whether instruments, voices, or ambient noises—and manipulate them electronically. As computers became more powerful, musicians could record, edit, and save samples, integrating them into performances and compositions. The **Akai MPC series**, introduced in the 1980s, became legendary for its role in music production, particularly in hip-hop and electronic music, where producers could layer and arrange samples with precision.
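The core trick those samplers exploit can be sketched in a few lines. This is a toy illustration (an assumed 8 kHz sample rate and simple linear interpolation — not the MPC’s actual firmware): record a waveform once, then re-read it at a different rate to change its pitch.

```python
import math

RATE = 8000  # samples per second (assumed, for illustration)

# "Record" one second of a 440 Hz tone as our sample.
sample = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(RATE)]

def playback(sample, speed):
    """Resample by stepping through the recording at `speed` x normal rate."""
    out, pos = [], 0.0
    while pos < len(sample) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighbouring sample points.
        out.append(sample[i] * (1 - frac) + sample[i + 1] * frac)
        pos += speed
    return out

octave_up = playback(sample, 2.0)  # reads every 2nd value: ~880 Hz, half as long
print(len(octave_up))  # 4000
```

Reading the stored data faster raises the pitch (and shortens the sound), reading slower lowers it — the same principle whether the source is a violin note, a vocal phrase or a drum break.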
Guitarists get in on this digital action…!
Digital Effects for Guitars and Amps
The digital revolution also transformed guitar effects and amplifiers. In the past, guitarists relied on analog effects pedals and tube amplifiers to shape their sound. However, the advent of **digital signal processing (DSP)** in the 1980s allowed for digital emulation of these effects. Companies like **Line 6** introduced **digital modelling amplifiers** that could replicate the sound of famous tube amps and effects pedals digitally. These digital models allowed guitarists to carry fewer physical devices while having a vast range of sounds at their fingertips. Trouble was, mostly they sucked, especially the Spider range!
This transformation changed the way guitarists perform live. Digital pedals and amps allow musicians to switch between sounds instantly and more accurately than analog devices, and entire live rigs can now be controlled from a computer or tablet. This has given musicians more flexibility but has also raised concerns about the loss of the raw, imperfect character of analog gear.
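At its simplest, the kind of DSP these modellers build on can be illustrated with a soft-clipping waveshaper. This is a hypothetical sketch, using tanh to mimic the gradual saturation of an overdriven tube stage; real modellers add filtering, oversampling and speaker-cabinet simulation on top.

```python
import math

def soft_clip(samples, gain=5.0):
    """Apply tanh waveshaping: quiet signals pass nearly unchanged,
    loud ones are smoothly squashed toward +/-1, never beyond."""
    return [math.tanh(gain * s) for s in samples]

# A quiet input is barely touched; a loud one flattens out (distorts).
quiet = soft_clip([0.01])
loud = soft_clip([0.9])
print(round(quiet[0], 3), round(loud[0], 3))  # 0.05 1.0
```

Swapping the clipping curve, gain staging and surrounding filters is, in essence, how one box can impersonate many different amps.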
Even drummers wanted a piece of the revolution
Digital Drums
Digital technology has similarly revolutionized drumming. Early electronic drum kits like the **Simmons SDS-5** in the 1980s allowed drummers to trigger synthesised drum sounds through electronic pads. Over time, these systems evolved, incorporating higher-quality sound samples and improved touch sensitivity, making them feel more like traditional acoustic drums. Modern digital drum kits allow drummers to choose from a wide range of sounds, record directly to digital audio workstations (DAWs), and perform quietly with headphones—a key advantage for home studios and live performances in noise-sensitive environments.
The recording studio goes digital
Perhaps the most profound change in music brought about by the digital revolution has been in the recording studio. In the analog era, music was recorded to magnetic tape or vinyl masters, requiring elaborate equipment and a significant amount of space. With the introduction of **digital audio workstations (DAWs)** like **Pro Tools**, **Logic**, and **Ableton Live**, the entire recording process became digital. Musicians could record, edit, mix, and master music on a computer, storing it on hard drives rather than physical media like tape or vinyl.
While digital recording has allowed for unprecedented flexibility and convenience, there are concerns about the longevity of digital media. Unlike analog recordings, which degrade slowly and predictably over time, digital media can fail suddenly. Hard drives, CDs, and other digital storage devices are vulnerable to corruption, rendering the music irretrievable unless properly backed up. This presents a potential risk for the future preservation of music.
What’s next?
The Future of Music Recording and Performance
As digital technology continues to evolve, the future of music holds both exciting potential and complex challenges. **Artificial intelligence (AI)** is becoming a tool for music composition, allowing computers to create music based on user inputs or even generate entirely original pieces autonomously. Meanwhile, **virtual reality (VR)** and **augmented reality (AR)** offer new possibilities for immersive music performances.
However, one pressing question is whether the digital revolution has diminished the human aspect of music. While digital tools have democratized music production, allowing more people to create music regardless of their technical ability, some argue that the reliance on computers can lead to a loss of spontaneity and emotional depth. The ability to perfect every note and rhythm in the digital realm may take away the imperfections that give human-made music its unique charm.
On the other hand, many musicians see the digital revolution as liberating. Artists are no longer tied to expensive studios or large teams of technicians. They can create music whenever inspiration strikes, from any location, using portable recording equipment or even a smartphone.
The impact of computers and electronics on music has been nothing short of revolutionary. From the early days of electronic sound generation to the modern era of digital instruments and recording, technology has expanded the possibilities for music creation and performance. While there are concerns about the loss of analog warmth and the potential for digital media to degrade, the future of music appears poised to continue evolving with new technological innovations. Rather than destroying the human aspect of music, the digital revolution has enabled more creativity and expression than ever before, offering musicians unprecedented freedom to bring their ideas to life.
Information sourced from Wikipedia, MusicRadar and 120 Years of Electronic Music, and AI