Andy Richards (Part 2)

In Part 1 Andy explained how his skills as a classically-trained session musician and experienced sequencer and synthesizer programmer landed him a job working for Trevor Horn just when the producer’s career was at its peak. In Part 2, Andy describes some of the challenging changes he and others in the music industry have faced over the past decade or so, and talks about his recent work mixing and producing music for film.

Andy Richards

Like every other musician, engineer and producer who has worked in the music industry throughout the last 20 years, Andy has witnessed its steep decline and adapted to the changes. Naturally, he has some insightful thoughts on what has changed and the effects certain trends have had on the health of the industry.

“One of the sad things about technology today,” says Andy, “is that it has created a situation where people work by themselves or just with one or two other people. You can use a screen to drag and drop stuff to make verse, bridge, chorus or middle eights, but you give yourself all sorts of problems when you go to try and play live, whereas what used to happen was you would have a band and a producer would come and work with you, and you’d all sit down and play together, and knock it into shape. The producer might tell you, for example, to cut out those four bars at the end of the chorus and go straight into the second verse. You’d rehearse it as a live band and then go into the studio and record it with whatever overdubs you wanted to do, but the core of it was still a live entity, so if you wanted to go and play it live it was easy to do.

“We are talking about the difference between making music with machines and making music with people who interact. This is why music of the ’60s, ’70s, and ’80s is much more diverse in terms of style and feel than music of the ’90s and the 21st century.

“Also, in the late ’70s and early ’80s, you could play a different pub or venue every night, even in Stoke-on-Trent where I was brought up. Now the opportunities to play live have shrunk to such a ridiculously small number of venues that there is nowhere to play, so for up-and-coming players who want to cut their teeth in a band and learn the process of playing it is becoming more and more difficult.

“There also used to be a much wider variety of recording studios, often with custom-made equipment, so people would go to a particular studio because it had a certain sound. Today everybody uses Pro Tools and most big mix rooms have SSLs or Neve desks.

Less Is More?

Another observation of Andy’s is that there are now so many processing and sample options available that newcomers are likely to spend all their time learning to use complex pieces of equipment and sifting through endless sound banks.

“When you read music technology magazines you see so much stuff in there, so where do you start?” asks Andy. “Once upon a time, you had computers which simply had a sample page and a sequence page. Now you have vast libraries, and huge numbers of ways of processing the samples. The whole thing has got so complex that you no longer have the time to sit down and learn a real skill. How do people coming into it learn something like Logic, which is such a hugely complex program? And then, if you find you’ve got a huge 5GB library of sounds, do you ever listen to them all? Do you ever build a track from scratch and start sampling stuff?

“We used to decide what sort of track we wanted and then we’d go around and dig out the relevant sounds. Those sounds would be the tools we’d use to make the track. It takes time to find stuff and start manipulating sound, which is why all these libraries of preset samples are such a boon, but imagine trying to choose a bass drum for a particular track when you’ve got 3000 to choose from! It is very difficult. You almost start to work to the lowest common denominator rather than the highest.

“I’m not saying that modern music is bad, but more and more often we find that we have singers who look good but can’t sing, and you end up with something which is completely fabricated. Although one has to take one’s hat off to Simon Fuller and Simon Cowell for what they have done, there are people who believe that by doing what they have done they have helped to change the music industry for the worse, and I fear that that may be the case. If you consider how many wonderful singers and musicians there are out there, you realise that, ultimately, things like Pop Idol are about making money and simply not about a love of music.

“In the ’60s and ’70s you had managers who loved their bands and would go to all their gigs, so it was sort of a pursuit of excellence. There were people who were great guitarists, drummers and bassists who became heroes and legends, but now it is basically about mediocre singers who look good on video and in pictures.

“One of my favourite tracks from 2003 was Rachel Stevens’ ‘Sweet Dreams My LA Ex’, but the very first time I heard it I was looking at a television and I saw the video, and once that has been seen you have the video image printed in your mind, and that makes an enormous difference. There are obviously cases where that is not true. Queen’s ‘Bohemian Rhapsody’, for example, had an extraordinary video and I’m sure it helped, but it in no way detracts from the fact that it is an extraordinary piece of music which crossed right across the board. Let me give you an idea of the crossover of that particular track.

“I must have been about 20 or 21 at the time and I was studying for my performance Diploma at the Royal Academy Of Music, which was basically about playing very complex pieces of music, very fast, from memory, in front of music professors who basically say, ‘OK, you are good enough to be a concert pianist’; it was that sort of level. But I came in one day and my professor said ‘Have you heard that new song by Queen? It’s really very, very good, isn’t it?’ And this was a guy who was in his mid to late 60s who had studied classical music all his life, and he was able to appreciate its nuances and complexities. I can’t really imagine anything like that happening today.

“Prior to the mid 1980s, musicians in the business generally had to be able to play. I’m sure that within ten or 15 years, people will just work from a palette of grooves, sounds and parts that a computer will play. GarageBand, which is part of Apple’s iLife package, is already very much like that.”

Out Of Eden

The Eden Project

Andy’s initial solution to the industry changes which caused his session work to dry up was to become a producer, but he did not enjoy the pressures of dealing with artists, musicians, A&R departments and managers while trying to deliver a commercial product within budget. By the end of the 1990s, the decline in record company production budgets caused him to think about his options once again. Recognising that there was a boom in multimedia products like DVD 5.1, and fostering an interest in film score work, Andy decided to set up his own post-production studio, named Out Of Eden Studios, which was housed in the Eden Studios complex in Chiswick, West London. The facility comprised a large control room, machine area and live room and was built by Eden’s technical director, Mike Gardner, leaving Andy responsible for providing all the equipment and furnishings.

“Out Of Eden is deliberately intended to be much more flexible than a traditional mixing room so it’s also great for programming, writing and recording,” explains Andy. “It’s an amazing sounding room, which I think is accurate to 0.1 of a dB from 20Hz to 20kHz! Composer Mike Higham is also a partner in Out Of Eden and he brings his own unique set of skills. I first met Mike when he was doing Pro Tools programming for Trevor, and since then he’s done a lot of film music work, so together we provide a ‘one-stop-shop’ production service which can include supervision, editing, mixing and composing.”

The Fairlight Dream Lives On

Out Of Eden is dominated by Andy’s 96-channel, 48-track Fairlight DREAM digital console/recorder, which is specifically designed to cope with all the demands of 5.1 mixing and film work.

“Fairlight addressed the whole issue of sound to picture way before Pro Tools and probably before any other major platform,” insists Andy. “When you’re working with video, you have to cut and paste stuff around, and the Fairlight’s editing commands are particularly well designed in that respect. In Pro Tools you would generally copy stuff from the front of a particular clip or region, but the Fairlight allows you to match any frame with any audio point. For example, you scroll to the exact frame you want, take your video machine off line, pick a point within the region of audio, hit Copy and the clip is immediately copied relative to the cursor position. Then you hit another button on the console and the audio snaps to the selected video frame. You press enter and it’s done. Very fast!

“The console has pretty much the same flexibility as a large Neve DFC so it’s completely configurable and works in mono, stereo, LCR, LCRS, 5.1, 6.1, Dolby EX, and also 7.1.

“It has a 40-bit floating point system so you have an enormous 3000dB of headroom, and the quality of the converters is astonishingly high, so it’s a remarkable sounding bit of kit.
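The 3000dB figure is plausible simply from the arithmetic of floating-point audio: dynamic range is set almost entirely by the exponent width. A quick back-of-the-envelope sketch (the 9-bit exponent for a 40-bit word is an assumption for illustration; Fairlight’s internal format isn’t specified in the interview):

```python
import math

def float_dynamic_range_db(exponent_bits: int) -> float:
    """Approximate dynamic range (dB) of a binary floating-point
    format from its exponent width alone: the ratio of the largest
    to smallest representable magnitudes is roughly 2^(2^exponent_bits),
    ignoring mantissa detail and denormals. Each doubling is ~6.02dB."""
    exponent_span = 2 ** exponent_bits
    return exponent_span * 20 * math.log10(2)

# 32-bit IEEE float (8-bit exponent): roughly 1500dB
print(round(float_dynamic_range_db(8)))
# A hypothetical 40-bit word with a 9-bit exponent: roughly 3000dB,
# the order of magnitude Andy quotes
print(round(float_dynamic_range_db(9)))
```

Either way, the point stands: headroom in a float-based mix engine is effectively unlimited compared with the ~144dB of 24-bit fixed point.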

“The I/O, DSP cards, and hard disk recorder are racked up in the machine room, so the desk is purely a control surface, which means that there’s no signal path and everything remains pristine. When you EQ something, for example, there is no phase shift at all.

“On the controller you basically have a channel strip with compression, EQ, and 12 aux busses. The desk enables you to take your mono, stereo, 5.1, or whatever feed it is, move it to any part of the room, give it any amount of width, any dimension, and you can rotate the whole balance around the room, all in real time. You have that facility on 96 channels, and the desk has joysticks, a track wheel and a rotary button to do all that. So you have the ability to position any single element anywhere you like, be it an effect return path or audio track.

“Also, when you are mixing in a format greater than stereo you can easily fold the mix down into stereo or mono for monitoring, and there is a reduction bus so that, while you’re mixing in 5.1, you still have completely independent control over things like panning, faders, mutes and effects returns for the stereo mix, the mono mix, or whatever. So, in essence, I can produce a different type of stereo or mono mix all in the same session, just by hitting one button!
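The one-button fold-down Andy describes is, conceptually, the standard 5.1-to-stereo downmix: centre and surrounds are attenuated by 3dB and summed into the front pair, and the LFE is conventionally dropped. A minimal per-sample sketch using the ITU-R BS.775 coefficients (the DREAM’s own fold-down law is adjustable, so treat this as illustrative):

```python
import math

def fold_down_51_to_stereo(l, r, c, lfe, ls, rs):
    """Fold one 5.1 sample frame down to stereo with ITU-R BS.775
    coefficients: centre and surround channels are attenuated by
    3dB (x 1/sqrt(2)) before summing; LFE is discarded."""
    g = 1 / math.sqrt(2)  # -3dB gain
    lo = l + g * c + g * ls
    ro = r + g * c + g * rs
    return lo, ro

# Centre-only content lands equally in both stereo channels,
# which is why a centred vocal survives the fold-down intact.
lo, ro = fold_down_51_to_stereo(0.0, 0.0, 1.0, 0.2, 0.0, 0.0)
```

The reduction bus Andy mentions goes further than a fixed law like this, letting the stereo and mono versions carry their own fader and pan moves.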

“If you want to make fizzy pop records with lots of distortion, this probably isn’t your desk of choice, although there are third-party effects and processing plug-ins available from Creamware which simply slot into another PC at the bottom of the rack.”

The DREAM console

Editing For Film

As a music editor and score mixer, Andy acts as an intermediary between director and composer, looking after all aspects of a film’s musical score, and ultimately mixing the music ready for the dubbing theatre.

“The job is defined by anything that has pitch on it,” explains Andy. “I have to make sure the composer is up to date with any changes that have occurred, and with any music issues that crop up. When the composer is finished, he supplies a multitrack, which can be anything from a few tracks through to about 70. 70 tracks is a lot to mix in 5.1, particularly as you probably have a maximum of five days to mix an hour of music, so you have to be very organised.

“At the start of the process we create what’s called a ‘preview’, which is a temporary score – assembled from a load of musical segments taken from other scores – and that is designed to describe the emotive content for each part of the film. It gives everyone the chance to see if a certain piece of music evokes the right feeling. When the composer’s score comes in we use it to replace the temporary stuff, so it’s a gradual building process.

“I get sent an Avid OMF [Open Media Framework] file which we’ll convert into a Pro Tools file using Digiconverter. A rough cut of the film usually comes to me on a Beta SP video cassette, which I’ll record into both my Macintosh, using my Aurora Design Fuse video capture card, and also into the Fairlight, using a V-Motion Capture card. Doing that means that I can compose to picture on the Mac, and when I’m mixing in the Fairlight I’ve got a very high resolution picture running on the screen.

“When I get the OMFs, the studios are often still editing the film to length, but any changes are sent on a new OMF reel, which I’ll digitise again and use when I edit the music. And I also have to update the cue list for the composer.

“Although the music has dynamics, we don’t really mess with the levels because we don’t have the finished dialogue and effects tracks to reference from. Our job is to get it to a point where everything feels like it’s sitting right. Then the dubbing mixer will ride the music, effects and dialogue. And of course, there are also the sound effects, which are created by a sound designer.

“I provide the dubbing theatre with a full 5.1 mix, which they can use really wide or, if they want, fold down into mono and EQ so it sounds like it is coming out of a radio speaker, for example. Sometimes we provide separate stems because there might be a huge battle scene with loads of explosions at that point, which can mean that even some quite loud parts of the mix simply don’t make it through. In that case, the dubbing theatre has the option of adjusting the levels to suit the effects. However, premixing is still really important because if it didn’t happen the dubbing theatre would have to spend ages sorting it out when they also have the dialogue and sound effects to deal with, and a dubbing theatre can cost four to six hundred pounds an hour!”

5.1 And Beyond

While many engineers and producers view 5.1 as an inevitable, but regrettable, music industry development, Andy has embraced the format and argues that it has much to offer over a traditional stereo system.

“5.1 is designed for cinema, so you always have your vocal, main characters and dialogue coming out of the centre speaker. Everything else, like reverbs and effects, is generally spread around and towards the back, and that works for audio as well as film. But I’ve found that mix engineers who are used to mixing in stereo don’t like putting the artist in the centre because it makes it too easy to isolate and steal the vocal and use it for something else. I argue that the artist now gets their music stolen whenever people download MP3s anyway, so if people want to nick stuff they will.

“Engineers also worry about putting the vocal in the centre because 5.1 system owners might not have the centre speaker set up properly, although I think the one thing people are most likely to have working is the centre speaker, otherwise they wouldn’t hear any dialogue in films. I’ve been to the houses of people who have 5.1 systems, and it is interesting how different they all sound, but even with stereo systems you find people who have the left-hand speaker hidden under a sofa somewhere, so there is no such thing as the perfect listening environment.

“Some engineers also don’t like to use the sub speaker because they don’t believe in the concept of folding audio back into it, even though that is one of the things which makes it sound good. So without the centre speaker and sub you end up with quad mixes. I think it’s a shame because 5.1 allows the performer or soloists to be showcased in a unique way, and I would tell any vocalist who doesn’t want their vocal in the centre that they won’t sound as good in stereo.

“When people used to work in mono the vocal was louder than anything else, and it was right in front of you coming out of one speaker. It was very simple and very effective. In stereo, the vocal is a phantom vocal because it comes equally out of the left and right speakers, which means that it appears to move as you walk from left to right. I could argue that that is a very bizarre concept because you have to be sat right in the middle of the room in order to hear the vocal in the centre.

“I remember Trevor Horn experimenting with stereo a few times by having drums and bass coming out of one speaker and the guitar and vocals out of the other. The Beatles and many other bands of their era used to do that and I think it’s very interesting and actually quite appealing!

“You can either view 5.1 mixing from the perspective of the listener being in the audience, with a group of musicians spread out in front and some ambience and reflection behind, or the listener can be placed on the stage so that they are involved in the sound. Obviously the latter varies depending on where the listener is supposed to be on the stage and where the instruments are distributed. I know a very good engineer who spreads the drums right around the listener, and he swears by it, although personally I think it is very strange. For certain music, like techno and interesting arty stuff, doing that can work very well, but in terms of a traditional band, that approach doesn’t make any sense to me.”

Andy as The Thinker

To The Future

Since it opened at the end of 2000, Out Of Eden has been very successful in obtaining score mixing jobs, and with Mike Higham on board, score writing commissions too. Andy seems to have put himself in the right place at the right time once again but, as ever, he is enthusiastically keeping a careful watch on future industry developments, so that whatever changes are in the industry’s pipeline, he can stay one step ahead.

“SACD is great but it’s not a mass-market product like DVD. I know that by 2004 Dolby had sold over a quarter of a billion 5.1 licences, so DVD has sold faster than any other sound delivery technology.

“There is the debate as to whether or not people in years to come will have record and CD collections. There will always be a core of people who still cherish their vinyl collection and love their hi-fi systems, but even though manufacturers are trying to up the sample rate to 96k, the mass market has moved towards compressed MP3 downloads. For many years I’ve had wireless broadband systems in the studio and at home so I just turn on the screen and I am on-line wherever I am. To all intents and purposes, the Internet is just another application that is always there, and accessing an on-line database is as quick as loading up any host-based program, so that is probably going to be the way most people are going to choose to listen to their music in the future.” TF

Part 1 of our interview can be found here: Part 1

To see the full (and lengthy) list of Andy and Mike’s film credits, visit their site: www.higham-richards.com

July 13th, 2012 | Music Technology