When using a TL Audio compressor, I sometimes experience some LF distortion when using a fast release time, which disappears as I make the release time longer. Why is this?
This effect tends to occur on signals with significant low-frequency (LF) content while gain reduction is taking place, and it is a direct consequence of how compression works.

As a simple example, imagine feeding a 20Hz sine wave into the C-1 Compressor with the release time set to its fastest position (40ms). A 20Hz signal takes 50ms to complete one full cycle. Because the C-1's fastest release is 40ms, once the source signal falls back below the compression threshold its gain returns to normal in 40ms, which is shorter than one cycle of the source. The envelope of the compressed signal is therefore distorted, because the gain is forced in and out of compression within a single cycle. Increasing the release time beyond 50ms prevents this, which is why moving to a slower release setting makes the distortion disappear. For the same reason, sources with even lower fundamental frequencies need correspondingly longer release times.
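The arithmetic above can be sketched as a short rule-of-thumb check. This is only an illustration of the reasoning in the answer, not anything from TL Audio; the function names are made up for the example.

```python
# Rule of thumb from the explanation above: to avoid LF envelope distortion,
# the release time should exceed one full cycle of the lowest frequency present.

def cycle_time_ms(freq_hz: float) -> float:
    """Time (in ms) for one complete cycle of a signal at freq_hz."""
    return 1000.0 / freq_hz

def release_is_safe(release_ms: float, lowest_freq_hz: float) -> bool:
    """True if the release time is longer than one cycle of the lowest frequency,
    so the gain cannot be modulated within a single cycle."""
    return release_ms > cycle_time_ms(lowest_freq_hz)

# A 20Hz signal completes one cycle in 50ms, so the C-1's fastest
# 40ms release acts within a single cycle and distorts the envelope.
print(cycle_time_ms(20.0))           # 50.0
print(release_is_safe(40.0, 20.0))   # False (distortion likely)
print(release_is_safe(60.0, 20.0))   # True
```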