AETCA 2025 Content
The topics listed on this page may appear on the Audio Engineering Technical Competency Assessment (AETCA) and reflect the foundational knowledge and practical skills essential for audio engineers. Not every topic will be tested on every administration of the AETCA, but any topic on this list may appear on the assessment; topics not listed will not be tested. The list is presented in the same order as the content areas appear on the assessment.
Audio Physics
An audio engineer should have a solid understanding of the physical principles that underlie sound and govern the way that it behaves. This knowledge is essential for optimizing sound quality, controlling acoustics, and managing audio systems in unique and varying environments.
Properties of Sound
Define and explain frequency, amplitude, wavelength, period, and velocity.
Define the relationship between frequency, wavelength, and velocity and complete basic calculations using this relationship.
Know the approximate speed of sound through air at room temperature in meters per second and feet per millisecond.
Explain the differences in sound velocity through different media.
Define and explain the compression and rarefaction of a medium caused by sound waves.
Relate compression and rarefaction to points on a waveform.
Describe the polarity of sound waves relative to each other.
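As an illustration of the kind of calculation the frequency/wavelength/velocity objective expects, the following sketch applies v = f · λ, assuming the commonly cited 343 m/s speed of sound in air at roughly 20 °C (function names are illustrative, not part of the assessment):

```python
SPEED_OF_SOUND_MS = 343.0  # m/s in air at ~20 degrees C (varies with temperature)

def wavelength_m(frequency_hz: float) -> float:
    """Return the wavelength in meters for a given frequency, from v = f * lambda."""
    return SPEED_OF_SOUND_MS / frequency_hz

def frequency_from_wavelength_hz(wavelength_m: float) -> float:
    """Return the frequency in Hz for a given wavelength in meters."""
    return SPEED_OF_SOUND_MS / wavelength_m

# A 1 kHz tone has a wavelength of about 0.343 m; a 20 Hz tone is over 17 m long.
print(round(wavelength_m(1000.0), 3))
print(round(wavelength_m(20.0), 2))
```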
Wave Behavior & Interactions
Define and explain reflection, refraction, diffraction, and absorption of sound.
Define and explain phase relationships, interference, and phase cancellation.
Understand constructive, destructive, and partially destructive interference.
Define and explain the relationship between wave polarity and phase.
Describe comb filtering and its causes.
Define and explain the Doppler effect.
Define and explain the inverse square law and the effects of distance on sound intensity.
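The inverse square law objective above reduces to a short formula for a point source: level falls by 20 · log₁₀(d₂/d₁) dB, i.e. about 6 dB per doubling of distance. A minimal sketch (function name is illustrative):

```python
import math

def spl_at_distance(spl_ref_db: float, d_ref_m: float, d_new_m: float) -> float:
    """Inverse square law for a point source:
    level drops by 20 * log10(d_new / d_ref) dB, ~6 dB per doubling of distance."""
    return spl_ref_db - 20.0 * math.log10(d_new_m / d_ref_m)

# 100 dB SPL measured at 1 m falls to ~94 dB at 2 m and to 80 dB at 10 m.
print(round(spl_at_distance(100.0, 1.0, 2.0), 1))
print(round(spl_at_distance(100.0, 1.0, 10.0), 1))
```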
Hearing & Loudness
Know the standard range of human hearing.
Understand the measurement and meaning of sound pressure level (SPL).
Approximate the relationship between the decibel scale and perceived loudness.
Understand the logarithmic nature of the decibel scale.
Relate the relative intensities of two sounds based on their decibel levels.
Understand and interpret Fletcher-Munson curves (equal-loudness contours).
Identify regions of relatively high or low sensitivity.
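The decibel objectives above rest on the logarithmic relationship between level difference and intensity ratio: ΔdB = 10 · log₁₀(I₁/I₂). A sketch of the relationship (function names are illustrative):

```python
import math

def intensity_ratio(delta_db: float) -> float:
    """Ratio of two sound intensities given their level difference in dB."""
    return 10.0 ** (delta_db / 10.0)

def level_difference_db(intensity_a: float, intensity_b: float) -> float:
    """Level difference in dB between two intensities."""
    return 10.0 * math.log10(intensity_a / intensity_b)

# A 10 dB increase is 10x the intensity (and is commonly perceived
# as roughly twice as loud); a 3 dB increase is about 2x the intensity.
print(intensity_ratio(10.0))
print(level_difference_db(100.0, 1.0))
```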
Resonance & Acoustics
Define and explain the concept of sound absorption.
Define and explain the absorption coefficient (alpha).
Define the concept of transmission loss.
Describe the relative absorption of various materials.
Describe the mass law and the role of material density in sound absorption.
Describe the difference between sound absorption and diffusion.
Identify and differentiate examples of materials used for sound absorption or diffusion.
Define and explain standing waves and how they relate to room modes.
Identify key points on a standing wave, including nodes and antinodes.
Understand the meaning of RT60 and how it relates to the reverberation of a space.
Perform basic calculations using an RT60 value.
Define and explain the difference between airborne and structure-borne sound.
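The RT60 calculation objective above is typically approached through Sabine's equation; here is a sketch using the metric form, RT60 ≈ 0.161 · V / A (the room dimensions in the example are hypothetical):

```python
def rt60_sabine(volume_m3: float, total_absorption_sabins: float) -> float:
    """Sabine's equation (metric units): RT60 = 0.161 * V / A,
    where A is total absorption in metric sabins (sum of surface area * alpha)."""
    return 0.161 * volume_m3 / total_absorption_sabins

# A 500 m^3 room with 40 metric sabins of total absorption
# has a reverberation time of roughly 2 seconds.
print(round(rt60_sabine(500.0, 40.0), 2))
```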
Psychoacoustics & Perception
Identify and explain the basic anatomy and functions of the ear.
Identify the pinna, auditory canal, tympanic membrane (eardrum), ossicles, inner ear structure, and cochlea.
Describe the concept of critical bands and their relationship to frequency.
Describe the basic concepts behind binaural hearing.
Define and explain the Haas effect (precedence effect) and how it affects auditory localization.
Define and explain spatial hearing and localization cues.
Describe interaural level difference (ILD) and interaural time difference (ITD).
Electronics
Define and explain Ohm’s Law and its application in audio circuits.
Define and explain basic properties of electricity including voltage, current, and resistance.
Define the relationship between voltage, current, and resistance.
Identify basic electrical components including capacitors, inductors, and resistors.
Describe and explain the difference between balanced and unbalanced signals.
Describe phantom power and explain why it is needed for certain microphones.
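The Ohm's Law objectives above amount to V = I · R and the related power relationship P = V · I. A minimal sketch (function names are illustrative):

```python
def voltage_v(current_a: float, resistance_ohm: float) -> float:
    """Ohm's Law: V = I * R."""
    return current_a * resistance_ohm

def power_w(voltage: float, current_a: float) -> float:
    """Electrical power: P = V * I."""
    return voltage * current_a

# 2 A through a 4-ohm load drops 8 V across it and dissipates 16 W.
v = voltage_v(2.0, 4.0)
print(v, power_w(v, 2.0))
```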
System Design
An audio engineer should have the ability to design efficient, safe, and complex audio systems, using the appropriate technology and techniques to ensure clear signal flow, minimal noise interference, and optimal performance for the intended application. This includes selecting the right equipment, managing power distribution, and configuring signal routing to meet the demands of live sound, studio recording, broadcast, or installed audio systems. Audio engineers should also be able to quickly identify common effects processing tools in both digital and analog formats.
Signal Flow & Routing
Identify, map, and follow signal paths in both live and studio environments.
Explain the four different signal levels seen in audio systems (mic, instrument, line, and speaker).
Identify the differences between each, their common sources, and what tools are required to convert one level to another.
Explain the function of preamps.
Analyze and set gain structure to optimize headroom.
Compare and contrast analog and digital signal routing.
Differentiate between insert and send/return configurations and their applications.
Describe the use of direct outs, subgrouping, and summing in signal flow management.
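The signal-level objectives above can be made concrete with the standard dBu-to-voltage relationship, V = 0.775 · 10^(dBu/20). A sketch (the example levels are typical, not specified by the assessment):

```python
import math

DBU_REF_V = 0.775  # 0 dBu reference voltage (volts RMS)

def dbu_to_volts(level_dbu: float) -> float:
    """Convert a level in dBu to volts RMS."""
    return DBU_REF_V * 10.0 ** (level_dbu / 20.0)

def volts_to_dbu(volts: float) -> float:
    """Convert volts RMS to a level in dBu."""
    return 20.0 * math.log10(volts / DBU_REF_V)

# Professional line level (+4 dBu) is about 1.23 V RMS,
# while a -40 dBu mic-level signal is only about 8 mV.
print(round(dbu_to_volts(4.0), 3))
print(round(dbu_to_volts(-40.0), 5))
```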
Microphone Selection & Placement
Select appropriate microphones for various applications based on their characteristics and intended use.
Accurately read microphone specification sheets to determine a microphone's behavior.
Analyze polar patterns and their impact on stage and studio bleed.
Choose and justify microphones for specific situations based on their characteristics.
Apply multi-microphone techniques for capturing instruments and vocals effectively.
Identify multi-microphone patterns commonly used in live and studio environments.
Evaluate phase alignment and mic placement to minimize phase issues.
Compare and implement close-miking and distant-miking strategies for different recording and live sound scenarios.
Speaker System Design
Differentiate between point source and line array speaker systems and their applications.
Compare active and passive crossover networks and explain their roles in sound system design.
Analyze subwoofer placement and phase alignment to optimize low-frequency performance.
Describe and design unique speaker configurations to manipulate the behavior of sound in space.
Calculate basic time-alignment values for main speakers, fills, and delays to ensure coherent sound distribution.
Perform basic SPL calculations.
Identify the causes of audio feedback and describe strategies for prevention and management.
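The time-alignment objective above comes down to converting a path-length difference into a delay time, again assuming the common 343 m/s speed of sound (the distances in the example are hypothetical):

```python
SPEED_OF_SOUND_MS = 343.0  # m/s in air at ~20 degrees C

def delay_time_ms(extra_distance_m: float) -> float:
    """Delay needed to time-align a fill or delay speaker that is
    extra_distance_m closer to the listener than the mains."""
    return extra_distance_m / SPEED_OF_SOUND_MS * 1000.0

# A delay tower 34.3 m downstage of the mains needs ~100 ms of delay;
# a front fill 3 m ahead of the mains needs roughly 8.7 ms.
print(round(delay_time_ms(34.3), 1))
print(round(delay_time_ms(3.0), 1))
```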
Monitor Systems & IEMs
Compare wedge and in-ear monitor (IEM) systems and evaluate their design considerations.
Differentiate between personal monitoring and engineer-controlled monitoring systems.
Identify and mitigate phase issues in monitor systems to ensure accurate sound reproduction.
Describe proper RF coordination for wireless IEM systems to prevent interference and signal dropouts.
Apply equalization strategies to enhance clarity and control feedback in monitoring setups.
Describe adjustments other than volume that can improve the clarity and perceived loudness of monitor feeds.
Mixing Console & Control Systems
Compare analog and digital mixing workflows, highlighting their advantages and limitations.
Explain console architecture, including the roles of groups, VCAs, and DCAs in mix management.
Understand remote mixing and control interfaces for flexibility in live and studio environments.
Explain basic networking protocols for configuring remote mixing applications.
Describe the role of scenes and snapshots in digital consoles.
Interconnects & Infrastructure
Identify and explain the transmission characteristics of cable types, including XLR, TRS, TS, AES/EBU, MADI, and optical/fiber.
Describe and identify proper soldering and cable termination techniques.
Describe the role of networked audio systems such as Dante, AVB, and AES67, and their applications in modern audio workflows.
Identify when the use of a networked audio system is most appropriate.
Analyze latency in digital audio networks and its impact on system performance.
Explain clocking and synchronization principles for maintaining accuracy in digital audio systems.
Networking Basics
Define key networking terms including IP address, subnet mask, and gateway as they relate to audio networks.
Explain the role of DHCP and static IP addressing in configuring audio devices on a network.
Describe the basic functions of network switches and routers in live audio environments.
Understand the basic protocols and best practices when configuring routers and access points for use in live audio.
Know basic network and security configurations as well as proper positioning and cabling.
Discuss the differences between 2.4 GHz and 5 GHz Wi-Fi bands and their impact on wireless audio control.
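A common practical check behind the IP/subnet objectives above is whether two devices can reach each other directly. This sketch uses Python's standard `ipaddress` module; the console and stagebox addresses shown are hypothetical:

```python
import ipaddress

def same_subnet(ip_a: str, ip_b: str, subnet_mask: str) -> bool:
    """True if both hosts fall within the same network for the given mask."""
    net_a = ipaddress.ip_network(f"{ip_a}/{subnet_mask}", strict=False)
    net_b = ipaddress.ip_network(f"{ip_b}/{subnet_mask}", strict=False)
    return net_a == net_b

# With a 255.255.255.0 mask, a console at .1.10 can reach a stagebox at .1.42
# directly, but not one at 192.168.2.42 (that would need a router/gateway).
print(same_subnet("192.168.1.10", "192.168.1.42", "255.255.255.0"))
print(same_subnet("192.168.1.10", "192.168.2.42", "255.255.255.0"))
```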
Power & Grounding
Explain proper power distribution techniques for audio systems to ensure stability and safety.
Identify the causes of ground loops and implement strategies for mitigation.
Explain isolated power circuits for critical audio gear and the advantages that they provide.
Acoustics & System Tuning
Describe and explain how to measure and compensate for room modes to improve sound quality.
Understand system tuning techniques using parametric EQ, FIR filters, and delay to optimize performance.
Explain soundproofing and acoustic treatment strategies for enhancing the listening environment.
Redundancy & Fail-Safes
Explain dual power supplies for critical audio equipment and their role in ensuring reliability.
Explain and design redundant audio paths for critical applications to maintain system operation during failures.
Describe failover strategies for networked audio systems to ensure continuous performance.
Identify backup power considerations, including UPS and generators, for maintaining audio system operation during power interruptions.
Identify when each backup system would be most effective.
Mixing Techniques
An audio engineer should have the ability to sculpt and balance numerous channels to achieve clarity, depth, and cohesion in a mix. This includes using all industry standard processing tools, including applying equalization, dynamics processing, spatial effects, and level adjustments to ensure a polished and professional sound.
Fundamentals of Mixing
Describe techniques for balancing levels to achieve clarity and separation in a mix.
Explain the concepts of headroom and gain staging in maintaining optimal audio levels.
Identify the proper use of faders and trim/gain adjustments for controlling audio levels.
Discuss considerations for mono vs. stereo mixing and their impact on sound.
Explain phase coherence and methods to prevent comb filtering in audio mixing.
Equalization Techniques
Describe subtractive and additive EQ approaches and their applications in audio mixing.
Explain frequency masking and strategies to achieve separation in the frequency spectrum.
Identify the effective use of high-pass and low-pass filters to shape the sound.
Discuss the applications of parametric, graphic, and dynamic EQ in various audio contexts.
Dynamics Processing
Define the different types of compression: VCA, FET, optical, and tube, and their uses in audio processing.
Explain the impact of attack, release, ratio, and threshold settings on different instruments when using common dynamics processors.
Describe sidechain compression and ducking techniques for dynamic control in a mix.
Explain the use of expansion and gating for noise control in audio systems.
Describe parallel compression and New York-style compression and their applications in mixing.
Compare multiband compression with broadband compression and their respective benefits.
Reverb & Spatial Effects
Define the different types of reverb: hall, plate, room, chamber, and convolution, and their applications in audio production.
Explain the effect of pre-delay, decay time, and diffusion settings on the reverb sound.
Describe how to apply reverb for creating depth and dimension in a mix.
Explain the use of delays for stereo widening and the Haas effect to enhance spatial effects.
Identify slapback, ping-pong, and modulated delays and their unique applications in audio production.
Stereo Imaging & Panning
Compare LCR (Left-Center-Right) and full stereo mixing approaches and their impact on sound placement.
Explain Mid/Side (M/S) processing and its use for controlling stereo width.
Describe panning strategies for positioning different instruments within the stereo field.
Discuss the importance of balancing mono compatibility with stereo width in a mix.
Identify phase correlation issues and explain considerations for stereo summing to maintain clarity and coherence.
Additional Processing
Define tape, tube, and transformer saturation effects and their impact on audio quality.
Explain harmonic enhancement techniques for adding warmth and presence to a mix.
Describe the use of exciters and aural enhancers to improve perceived brightness and clarity.
Compare soft clipping and hard clipping distortion, and explain their effects on audio signals.
Define the function of a de-esser and its use in controlling sibilance in vocal recordings.
Explain the effects of chorus, flanger, and phaser, and their applications in creating movement and depth in audio.
Describe the use of tremolo for adding rhythmic variation to the sound.
Identify the octaver effect and explain how it shifts pitch to create harmonic content.
Mix Management
Define the roles of groups, busses, sends, DCAs, and matrices in audio signal routing.
Explain how groups and busses are used to combine multiple signals for processing.
Describe the function of sends for routing audio to external effects or processors.
Identify DCAs (Digitally Controlled Amplifiers) and their use in controlling group levels without affecting the individual channel settings.
Explain the purpose of matrices in routing and summing signals to different outputs in complex audio systems.
Identify strategies to reduce mixing workload during live events.
Monitor & IEM Mixing
Describe how to balance clarity and comfort in individual monitor mixes to meet performer needs.
Explain how to prevent feedback using EQ and proper gain structure in monitor systems.
Discuss the differences between stereo and mono IEM mixes and their impact on spatial awareness for performers.
Describe techniques for handling multiple monitor mixes when console outputs are limited.
Identify strategies for effective communication with musicians to make real-time adjustments during a performance.
Problem Solving
An audio engineer should have the ability to quickly identify, diagnose, and resolve technical issues that arise in live sound environments. The Problem Solving section of the AETCA will present examinees with real-world technical challenges commonly encountered in live sound, ask them to analyze and synthesize the given information, and decide on the most effective solution.
Rather than simply recalling theoretical knowledge, this section emphasizes applied problem-solving skills. This section will require an understanding of systematic troubleshooting approaches, such as using process-of-elimination techniques or interpreting diagnostic tools like metering, RTA displays, signal flow diagrams, or test signals. Questions in this section may cover topics such as signal loss, feedback, ground loops, phase issues, power supply problems, malfunctioning equipment, improper cabling, incorrect gain staging, frequency interference, distorted signals, improper routing, monitoring issues, connectivity problems, system integration failures, and troubleshooting analog and digital systems.
Critical Listening
An audio engineer should have the ability to critically analyze audio signals by accurately identifying various processing techniques and technical issues.
Questions in this section ask the examinee to listen and assess common audio topics such as the appropriate use of EQ, compression, reverb, delay, and other easily detectable processing techniques in a mix or recording. This section will also ask the examinee to identify issues such as frequency imbalances, phase cancellations, distortion/clipping, dynamic range problems, stereo imaging inconsistencies, spatial effects misapplications, noise artifacts, and improper EQ settings.