Ivy Music

Course in Modern Film Scoring & Production for Media


MIDI Mock-up & Orchestration

In technical terms, orchestration is the process of setting out, in the form of a musical score, a composition that is already complete melodically, harmonically, and rhythmically. The task is to assign the melody, harmony, and countermelodies to the instruments of the orchestra.

Today, there is broad agreement among composers and orchestrators that orchestration is the process by which the composer’s MIDI mock-up is translated into music that can be performed by live musicians. In practice, this means making sense of something a music producer wrote on a computer with samples and making it work for a real orchestra: taking the MIDI file and translating, refining, and fleshing it out until the score is complete. With the production of detailed demos now standard practice in the industry, orchestration can come close to transcription when the programming is very good.
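
To make this concrete, here is a minimal Python sketch of a first pass over a mock-up. It assumes the mock-up has been exported as a standard MIDI file and that the third-party mido library is installed; it simply lists each track’s note count and pitch range so the orchestrator can start assigning material to instruments. The file name is only a placeholder.

    import mido  # third-party library: pip install mido

    def survey_mockup(path):
        """List each track's note count and pitch range in a MIDI mock-up."""
        mid = mido.MidiFile(path)
        for i, track in enumerate(mid.tracks):
            notes = [msg.note for msg in track
                     if msg.type == 'note_on' and msg.velocity > 0]
            if not notes:
                continue  # skip tempo/marker-only tracks
            print(f"Track {i} ({track.name or 'unnamed'}): "
                  f"{len(notes)} notes, MIDI pitches {min(notes)}-{max(notes)}")

    # survey_mockup('mockup.mid')  # placeholder file name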

The primary route into film-score orchestration usually starts with a solid musical education, although no specific formal schooling is strictly required.

Modular Synth

Simply put, a modular synthesizer is an electronic musical instrument made up of many separate components (modules) that are combined to create electronic sounds. The individual modules are connected via patch cables, switches, sliders, and patch panels, which makes a large number of connection options and permutations possible. It is important to know that the parameters of analog synthesizer modules can be controlled by a control voltage (CV).

A distinction is made between modules that generate sound (VCO = Voltage-Controlled Oscillator), modules that shape the sound, e.g. a filter (VCF = Voltage-Controlled Filter), and modules that control the sound’s behavior over time using a so-called envelope (VCA = Voltage-Controlled Amplifier, typically driven by an EG = Envelope Generator). The envelope usually works according to the ADSR principle (Attack, Decay, Sustain, and Release) and shapes the resulting sound through every stage, from the initial transient to the final decay. With an LFO (Low-Frequency Oscillator), the audio signal is modulated by a control voltage, for example to obtain a vibrato in the simplest case.
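
As a purely illustrative sketch of the ADSR idea (the times and levels below are arbitrary example values, not settings from any particular synth), a linear envelope can be generated in Python like this:

    import numpy as np

    def adsr(attack, decay, sustain, release, note_len, sr=44100):
        """Linear ADSR amplitude envelope; times in seconds, sustain as a 0-1 level."""
        a = np.linspace(0.0, 1.0, int(attack * sr), endpoint=False)        # rise to full level
        d = np.linspace(1.0, sustain, int(decay * sr), endpoint=False)     # fall to sustain
        s = np.full(max(int((note_len - attack - decay) * sr), 0), sustain)  # hold
        r = np.linspace(sustain, 0.0, int(release * sr))                   # fade to silence
        return np.concatenate([a, d, s, r])

    env = adsr(attack=0.01, decay=0.1, sustain=0.7, release=0.3, note_len=1.0)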

However, there are many other modules dedicated to special tasks. All analog modules are operated by control voltage. If, for example, a patch cable connects the CV output of a keyboard or sequencer to the CV input of a tone-generation module (VCO), the control voltage changes when a key is pressed, and the receiving VCO changes its pitch accordingly. A sequencer works the same way, driving the connected tone generator with predetermined pitch intervals and rhythmic patterns.
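
Here is a small sketch of the control-voltage idea, assuming the common 1 V/octave convention (other standards exist): a digital “VCO” whose pitch follows a CV value plus a slow LFO for vibrato.

    import numpy as np

    sr = 44100
    t = np.arange(sr) / sr                  # one second of audio
    base_freq = 261.63                      # pitch at 0 V in this example (C4)
    cv = 1.0                                # +1 V -> one octave up, under 1 V/octave
    lfo = 0.02 * np.sin(2 * np.pi * 5 * t)  # 5 Hz LFO adding a small pitch wobble

    freq = base_freq * 2 ** (cv + lfo)      # exponential volt-to-frequency mapping
    phase = 2 * np.pi * np.cumsum(freq) / sr
    vco_out = np.sin(phase)                 # oscillator output with vibrato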

Trailer music 

Trailer music (a subset of production music) is the background music used for film previews and is not always taken from the film’s soundtrack. The purpose of this music is to complement, support, and integrate the sales messaging of the mini-movie that is a film trailer. Because the score for a movie is usually composed after the film is finished (long after trailers are released), a trailer will often incorporate music from other sources. Sometimes music from other successful films or hit songs is used as a subconscious tie-in. Trailer music is known for its sound-design-driven, hybrid orchestral style. Trailer music tracks can vary greatly in duration, depending on the theme and target of the album.

Art of Mixing

Detailed and comprehensive, the Art of Mixing module is designed to equip you with the skills of an advanced mix engineer and features real multitracks in a range of musical styles. Art of Mixing is taught in Pro Tools; it is packed with classic and cutting-edge mix techniques and contains practical video tutorials, including some very special exclusives in which top industry professionals share their own mixing tips, tricks, and techniques. Balancing, panning, aural perception, EQ, compression, limiting, gates, and effects are covered in detail; a small compression sketch follows the topic list below. You will also learn to listen to music from a critical perspective and understand the science behind the sound.

  • The Mixing Environment & Critical Listening
  • Starting a mix
  • Controlling Dynamics
  • Separation in the mix – EQ
  • Space & Depth – Reverb
  • Delays & related effects
  • Creative Mixing
  • Mixing Vocals
  • The Mix Process
  • Complete Mix Walkthroughs
  • Basic Mastering
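
As a taste of the dynamics material, here is a minimal sketch of a static downward compressor’s gain computer (hard knee, no attack/release smoothing; the threshold and ratio are example values only):

    import numpy as np

    def compress_db(level_db, threshold_db=-18.0, ratio=4.0):
        """Map an input level in dBFS to an output level: levels above the
        threshold are reduced by the ratio, levels below pass unchanged."""
        over = np.maximum(level_db - threshold_db, 0.0)
        return level_db - over * (1.0 - 1.0 / ratio)

    print(compress_db(-6.0))  # -6 dBFS in -> -15 dBFS out with these settings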

Mastering

You will learn to develop and tune your ears, and you will find out about mastering EQ, compression (including multiband), limiting, mid/side processing, and more. You can master your own music as part of this course. This module is taught using a variety of different platforms and plugins, and the skills you learn can be applied to any mastering environment; a short mid/side sketch follows the topic list below.

  • Mastering Process & Sound Theory
  • Digital Audio Theory
  • Monitoring
  • Workflow, Phase & Metering
  • Frequency Spectrum & EQ
  • Dynamics: Compression, Limiting & De-Essing
  • Dynamics: Multiband & Parallel
  • Mastering in Action
  • Audio Restoration & Noise Reduction
  • Level & Tonal Coherence
  • Album Work
  • Mastering for Different Media
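
The mid/side sketch mentioned above: a minimal illustration of encoding stereo into mid and side components, adjusting them independently, and decoding back (the 1.2 side gain is just an example):

    import numpy as np

    def ms_encode(left, right):
        """Split stereo into mid (sum) and side (difference) signals."""
        return 0.5 * (left + right), 0.5 * (left - right)

    def ms_decode(mid, side):
        """Recombine mid/side back into left/right."""
        return mid + side, mid - side

    left, right = np.random.randn(1024), np.random.randn(1024)  # placeholder audio
    mid, side = ms_encode(left, right)
    side *= 1.2                               # e.g. widen the stereo image slightly
    new_left, new_right = ms_decode(mid, side)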

Music Licensing & Sync 

We’ve taught thousands of students on Udemy, and in this course you will learn the basics of music licensing and how we got our music placed in TV shows aired around the world. Some of the topics covered include:

  • What music licensing is and how it works
  • What music publishing is and how it works
  • How to get your music ready for placement opportunities
  • How to increase your chances of getting placements
  • The art of making connections
  • Who and where to submit your music
  • How to build your own professional music catalogue for TV and film

Principles of Sound and Audio Practice

You will be introduced to the concept of sound as a physical phenomenon and shown how to predict its behavior by exploring maths and physics concepts, which will support your learning throughout the rest of the course. You will then focus on the fundamental principles of audio production, including the theory and practical application of EQs, dynamics processors, and effects. This critical foundation will enable you to grasp the concepts of signal routing, microphones, and loudspeakers, providing you with the basic skills to operate in a professional audio environment.
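
As a small worked example of the kind of maths involved, here is how the wavelength of a tone and a change of amplitude in decibels are calculated (the specific values are illustrative):

    import math

    speed_of_sound = 343.0           # m/s in air at roughly 20 °C
    frequency = 440.0                # Hz (concert A)
    wavelength = speed_of_sound / frequency
    gain_db = 20 * math.log10(0.5)   # halving the amplitude

    print(f"wavelength = {wavelength:.2f} m")     # about 0.78 m
    print(f"half amplitude = {gain_db:.1f} dB")   # about -6.0 dB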

Introduction to Academic and Professional Practice 

In this module, we introduce you to proven concepts and routines of academic research, critique, and writing, and nurture these skills to ensure that you apply good study practice and management throughout your studies. We aim to familiarise you with the theories of culture and communication and develop a basic understanding of the creative media practitioner. Together, all of these skills will help you develop transferable career skills to aid your job search upon graduation.

Marketing, Business Planning, and Law

This module aims to impart the essential knowledge, concepts, and analytical tools of business and marketing needed to function effectively in the industry. It will also open your eyes to key legal and ethical issues that underpin practices specifically related to the creative media industries. Upon completion of this module, you will have developed key communication skills and an awareness of how communication shapes the way we, as a society, understand each other in social, cultural, and economic contexts.

Research Practice and Society

Research is key to the successful outcome of any creative project, and this module aims to develop in you an advanced critical understanding of qualitative and quantitative research methodologies and their application to both artistic and scientific research. You will advance your knowledge, planning, and implementation of research-based inquiry to address specific questions, whilst developing an in-depth understanding of the creative media industries and your potential role as a creative media practitioner. Together we will broaden your understanding of the ongoing interplay between science, the history of ideas, culture, and creative media, to give your work the depth it needs to have a profound effect in the marketplace.

Immersive Audio

Immersive audio is a powerful way to fully immerse a user and direct their attention within a 360 video or VR experience through sound. A large part of our attention can be directed with audio cues, but a fully immersive experience requires a detailed spatial audio mix, not just cues added on as an afterthought. Spatial audio makes what we hear a believable auditory experience that matches what we see. For precisely this reason, sound design should be part of the creative brief from the very beginning, as bad or misplaced audio design and cues can undermine an otherwise convincing outcome.

Here we’ll cover the basics of immersive audio and provide some how-tos to help you make an awesome 360 VR experience.

What is Spatial Audio?

The human brain interprets auditory signals in a specific way that allows it to make decisions about its surrounding environment. We use our two ears, in conjunction with the ability to move our heads in space, to make better decisions about the position of an audio signal and the environment the sound is in.

Spatial audio in virtual reality involves the manipulation of audio signals so they mimic acoustic behavior in the real world. An accurate sonic representation of a virtual world is a very powerful way to create a compelling and immersive experience. Spatial audio not only serves as a mechanism to complete the immersion but is also very effective as a UI element: audio cues can call attention to various focal points in a narrative or draw the user into watching specific parts of a 360 video, for example.
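
One of the cues the brain uses is the interaural time difference (ITD): the slight delay between a sound reaching one ear and then the other. Here is a small sketch using the classic Woodworth approximation (the head radius is a typical value, not a measured one):

    import math

    def itd_seconds(azimuth_deg, head_radius=0.0875, speed_of_sound=343.0):
        """Approximate ITD for a far-field source; azimuth 0 deg = straight ahead."""
        theta = math.radians(azimuth_deg)
        return (head_radius / speed_of_sound) * (theta + math.sin(theta))

    itd = itd_seconds(90.0)  # source directly to one side
    print(f"ITD = {itd * 1e6:.0f} microseconds, about {itd * 48000:.1f} samples at 48 kHz")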

Linear vs. Interactive Audio Design

Thanks to the growth in production and consumption of immersive panoramic and VR experiences, developers and infrastructure owners are re-examining the constraints of video container specifications. Better file compression, codec and streaming-infrastructure support for multichannel spatial audio, and support for real-time interactive metadata that influences traditional video playback or social reactions in live video are all examples of a trend in which the worlds of traditional video broadcasting and siloed game-like apps are coming together.

Ambisonics

Ambisonic technology is a method of rendering 3D sound fields in a spherical format around a particular point in space. It is conceptually similar to 360 video, except that the entire spherical sound field is audible and responds to changes in head rotation. There are many ways of rendering an Ambisonic field, but for headphone playback all of them rely on decoding to a binaural stereo output so that the user can perceive the spatial audio effect over a normal pair of headphones.
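
Here is a minimal sketch of encoding a mono source into a first-order Ambisonic (B-format) signal, assuming the AmbiX convention (ACN channel order W, Y, Z, X with SN3D normalisation); other conventions such as FuMa use a different ordering and gains:

    import numpy as np

    def encode_foa(mono, azimuth_deg, elevation_deg):
        """Encode a mono signal into first-order Ambisonics (AmbiX: W, Y, Z, X)."""
        az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
        w = mono * 1.0                      # omnidirectional component
        y = mono * np.sin(az) * np.cos(el)  # left-right component
        z = mono * np.sin(el)               # up-down component
        x = mono * np.cos(az) * np.cos(el)  # front-back component
        return np.stack([w, y, z, x])

    source = np.random.randn(1024)          # placeholder mono signal
    bformat = encode_foa(source, azimuth_deg=45, elevation_deg=0)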

Ambisonics is not the only way to render spatial audio for 360 videos. There are other solutions on the market as well, although the effectiveness, feature set, toolchains, and final render quality vary between techniques:

  • Traditional surround sound such as 5.1, 7.1, etc., which can be decoded over virtual speakers and rendered binaurally over headphones. Depending on the content, the rendered sound field may suffer from ‘holes’ between the speakers and won’t have the same smoothness in spatial accuracy or resolution.
  • Quad-binaural: four pairs of pre-rendered binaural stereo tracks at 0, 90, 180, and 270 degrees. The audio streams are cross-faded based on head rotation.

Dolby Atmos

The Dolby Atmos course provides in-depth training covering many of the aspects you’ll need to know to create great content in Dolby Atmos. This course covers our complete curriculum for Dolby Atmos post-production through text, graphics, and short videos about the key topics. Modules include Dolby Atmos basics, best practices, and how to work with the Dolby Atmos Renderer in various DAWs. There are also advanced post-production topics such as ADM BWF import into Pro Tools.

What you will need for this program

Software Requirements

  • Any DAW (Pro Tools, Logic, etc.); Pro Tools is recommended

Mac Users

  • OS X 10.9 Mavericks or higher
  • Latest version of Google Chrome

Windows Users

  • Windows 7 or higher
  • Latest version of Google Chrome

Hardware Requirements

  • High quality audio interface, with a recommended minimum of 2 inputs (Focusrite, Behringer, Apogee, MOTU, etc.)
  • MIDI controller (M-Audio, Akai, etc.)
  • Professional pair of speakers 
  • Professional pair of headphones 
  • 50 GB free hard drive space
  • Webcam
  • Internet connection with at least 4 Mbps download speed (verify at http://www.speedtest.net or with the Speedtest by Ookla app from your mobile app store)

General Course Requirements

Below are the minimum requirements to access the course environment and participate in live chats.

Please make sure to also check the Prerequisites and Course-Specific Requirements section above, and ensure your computer meets or exceeds the minimum system requirements for all software needed for your course.

All Users

  • Latest version of Google Chrome
  • Zoom meeting software
  • Webcam
  • Speakers or headphones
  • External or internal microphone
  • Broadband Internet connection