By way of background, an editor friend of mine produces WMV files from AVCHD footage using Vegas Pro 10, but he changes the audio sample rate in the output WMV to 44.1 kHz from the original 48 kHz off the camcorder. I am not very familiar with audio issues, but I assume that changing the sample rate would actually lower the audio quality somewhat because of the resampling required. Just wondering if I am correct in that thought. Also, will computer sound cards play 44.1 kHz audio without having to resample it again?
I also noticed that the WMV files he produced with Vegas 10 have an audio track about a second shorter than the video track. Could the sample-rate change cause sync issues like that with the video? Or does a sample-rate change only affect the quality of the audio, not its length?
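My rough understanding of the length question is that duration equals sample count divided by sample rate, so an ideal sample-rate conversion scales the number of samples proportionally and the running time stays the same. A quick sketch of that arithmetic (the 60-second clip length here is just a made-up example, assuming ideal resampling with no codec padding or trimming):

```python
# Duration = samples / sample_rate, so a proportional resample
# preserves length (assuming an ideal resampler, no edge trimming).
src_rate = 48_000    # camcorder audio, Hz
dst_rate = 44_100    # Vegas output, Hz
duration_s = 60.0    # hypothetical 1-minute clip

src_samples = round(duration_s * src_rate)              # 2,880,000 samples
dst_samples = round(src_samples * dst_rate / src_rate)  # 2,646,000 samples

print(src_samples / src_rate)  # source duration in seconds
print(dst_samples / dst_rate)  # resampled duration in seconds
```

Both durations come out identical, which suggests the one-second gap he sees is more likely an encoder or muxing quirk than a consequence of the rate change itself.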