Old 08 July 2006, 07:10 pm   #6
Shy
Admin
 
Join Date: Jul 2004
Posts: 372

If that's the bitrate, this combined sample is most likely not good enough to be a "problem sample" that can pose real issues to a codec. In my experience, the vast majority of serious problem samples (samples that clearly expose sound deficiencies) produce a much higher bitrate, averaging around 260kbps at codecs' "standard" settings.

There are very few cases where a ~192kbps audio segment would actually cause a noticeable difference in sound with Musepack.

The one thing this test is likely to show is that MP3 and WMA are not as good as the alternatives (Musepack, Vorbis, AAC), which hardly needs further proof.
For showing which of the better alternatives is best, a 192kbps sample is highly unlikely to be of any use.

I of course don't mean to undermine your efforts, which I appreciate, but I must say I think the entire idea of limiting yourself to a single bitrate for a segment of audio just to have a "192kbps test" is very counterproductive.

A better, though still not very sensible, way to do a modern audio codec test would be to take an entire track, encode it with various codecs at settings that reach a similar bitrate for all of them, then choose the segment you hear posing a problem and encode that segment with the same parameters, paying no attention to the segment's own bitrate, which would most likely be much higher than that of the whole track. A sketch of that bitrate comparison follows below.
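To make that comparison concrete, here is a minimal sketch of how one could check it, assuming the full-track and segment encodes already exist as files and that the Python mutagen library is available; the file names are placeholders of my own, not part of the proposed test:

[code]
# Sketch: compare the average bitrate of full-track encodes with that of
# problem-segment encodes made using the same encoder parameters.
from pathlib import Path
import mutagen  # assumption: mutagen is installed and can read these formats

def bitrate_kbps(path):
    """Average bitrate of an encoded file in kbps."""
    info = mutagen.File(path).info
    if getattr(info, "bitrate", 0):
        return info.bitrate / 1000.0
    # Fall back to file size / duration if the reader reports no bitrate.
    return Path(path).stat().st_size * 8 / info.length / 1000.0

# Placeholder names: one track encoded by three codecs at comparable settings,
# plus the problem segment encoded with the same parameters.
full_track = {"mpc": "track.mpc", "ogg": "track.ogg", "aac": "track.m4a"}
segment    = {"mpc": "segment.mpc", "ogg": "segment.ogg", "aac": "segment.m4a"}

for codec in full_track:
    print(f"{codec}: full track {bitrate_kbps(full_track[codec]):.0f} kbps, "
          f"problem segment {bitrate_kbps(segment[codec]):.0f} kbps")
[/code]

If the segment really is problematic, its bitrate should come out well above the whole track's, which is exactly why the segment's bitrate on its own tells you nothing.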

Another way would be to take known problematic samples and encode them with the various codecs' equivalent parameters, again paying no attention to the bitrate. It's the actual sound artifacts that matter, not the bitrate.

A modern, useful test would be based on one idea:
All codecs define quality levels for you to use and trust, and it's those quality levels that should be tested, not bitrates.

I'm sorry if what I think is not pleasant to hear, but I feel it's important to make clear that such a test method is neither efficient nor suitable for today's audio codec world, which moved past limiting the encoding process to a constant or average bitrate many years ago and is instead based on variable bitrate.

The only average bitrate that matters in today's codecs is the one derived from calculating what an encoding parameter yields across thousands of albums. The bitrate approximations that Musepack, Vorbis, and AAC tools show you are derived from exactly that.
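As a rough illustration of that calculation (my own sketch, not anything from the codec tools themselves): the meaningful average is simply the total encoded bits of a large collection divided by its total playing time. The mutagen library and the directory and pattern names here are assumptions for the example only.

[code]
# Sketch: the "average bitrate" of one encoding setting, measured over a large
# collection, is total encoded bits divided by total playing time.
from pathlib import Path
import mutagen  # assumption: mutagen is installed

def collection_average_kbps(root, pattern="*.mpc"):
    total_bits = 0.0
    total_seconds = 0.0
    for path in Path(root).rglob(pattern):
        audio = mutagen.File(path)
        if audio is None or not audio.info.length:
            continue  # skip files mutagen can't read
        total_bits += path.stat().st_size * 8
        total_seconds += audio.info.length
    return total_bits / total_seconds / 1000.0 if total_seconds else 0.0

# Example with a hypothetical path: average bitrate of a collection
# encoded at a single quality level.
print(f"{collection_average_kbps('/music/mpc'):.0f} kbps")
[/code]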