08 July 2006, 10:45 pm   #7
Serge Smirnoff
Join Date: Jul 2006
Posts: 4

Originally Posted by Shy
A modern, useful test would be based on the idea that:
All codecs define quality levels for you to use and trust, and it's those quality levels which should be tested, not bit rates.
You can “close your eyes” to actual bitrate calculations (FBR) and just compare different codecs at quality settings chosen for whatever reason.

Originally Posted by Shy
The only average bitrate that matters in today's codecs is the one derived from calculating the average bitrate an encoding parameter yields across thousands of albums. The bitrate approximations that Musepack, Vorbis and AAC tools show you are derived from exactly that.
Even in this case, the final bitrate will depend on the type of albums chosen: thousands of classical albums will give lower figures, while hard rock will give higher ones.
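The corpus-average calculation described above is straightforward: total encoded bits divided by total playing time. A minimal sketch, with hypothetical file sizes and durations standing in for real encoded tracks (the function name and sample values are illustrative, not from any encoder's actual tooling):

```python
# Sketch of estimating the corpus-average bitrate a VBR quality setting
# produces. Each entry is (encoded size in bytes, duration in seconds);
# the numbers below are hypothetical illustration values, not measurements.

def avg_bitrate_kbps(files):
    """Corpus-average bitrate: total bits divided by total seconds, in kbps."""
    total_bits = sum(size_bytes * 8 for size_bytes, _ in files)
    total_secs = sum(secs for _, secs in files)
    return total_bits / total_secs / 1000

classical = [(3_600_000, 240), (4_200_000, 300)]  # sparser material, smaller files
hard_rock = [(6_000_000, 240), (7_500_000, 300)]  # denser material, larger files

print(avg_bitrate_kbps(classical))
print(avg_bitrate_kbps(hard_rock))
```

The same quality setting yields a noticeably higher average for the harder material, which is exactly why a published "average bitrate" for a VBR preset only holds for the corpus it was measured on.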
Keep your audio clear! -