Community Forums Archive


Subject: 2 Questions: Normalization & SaveAs...
Posted by: D-Slam
Date: 1/30/2006 8:17:46 AM

Hi TJ -

QUESTION #1
You may have answered this in so many words, but is it possible to take the return values from "file.DataFormat.ToString()" and use them to control the SaveAs parameters? I often have to process a folder of disparate file types and would like to create a set of trimmed masters where the file types are based on the actual masters.

Do I have to create presets for each possible file type and then case out the statements to choose the correct preset? Or is there a way to dynamically specify SampleRate, BitDepth and Channels without presets? Can I change the path and then fool it into "saving" (in a different directory without "SaveAs")...whatever.

QUESTION #2
You mentioned the other day that Normalize is field accessible. I'd like to normalize a large set of files, but unless I scan each file individually, some will be clipped. Is there a way to normalize to 100% *based on a fresh scan*? Or, do I have to UpdateStatistics, and do my own calculation of how much to normalize based on each file's MaxValue?

Thanks again for all the help.

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: _TJ
Date: 1/30/2006 12:28:53 PM

QUESTION #1
You may have answered this in so many words, but is it possible to take the return values from "file.DataFormat.ToString()" and use them to control the SaveAs parameters? I often have to process a folder of disparate file types and would like to create a set of trimmed masters where the file types are based on the actual masters.

Do I have to create presets for each possible file type and then case out the statements to choose the correct preset? Or is there a way to dynamically specify SampleRate, BitDepth and Channels without presets? Can I change the path and then fool it into "saving" (in a different directory without "SaveAs")...whatever.


You can't do this directly, but you can do it indirectly. Whenever you SaveAs using the .WAV format with the template called "Default Template", the file is saved in whatever its current DataFormat() is.

So what you have to do is use DoEffect("Resample"), DoEffect("ChannelConverter") and DoEffect("BitDepth") to convert the file to the desired format, then SaveAs() with "Default Template" to save it.
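Put together, the flow might look something like this. This is only a sketch: it assumes the usual Sound Forge script entry point where app is in scope, and the preset names passed to DoEffect() and the destination path are hypothetical placeholders for presets and folders on your own system. Only the .WAV/"Default Template" behavior comes from TJ's answer; check the exact SaveAs() signature against the script API reference.

```csharp
// Sketch only: convert the open file to the desired format, then save.
// The preset names below ("44,100 Hz", "Stereo to Mono", "16 bit") are
// hypothetical -- substitute presets that actually exist on your system.
ISfFileHost file = app.CurrentFile;

app.DoEffect("Resample", "44,100 Hz", EffectOptions.EffectOnly);
app.DoEffect("ChannelConverter", "Stereo to Mono", EffectOptions.EffectOnly);
app.DoEffect("BitDepth", "16 bit", EffectOptions.EffectOnly);

// With the .WAV format and "Default Template", SaveAs writes the file
// in whatever DataFormat it has right now.
file.SaveAs("D:\\TrimmedMasters\\example.wav", "Default Template");
```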


QUESTION #2
You mentioned the other day that Normalize is field accessible. I'd like to normalize a large set of files, but unless I scan each file individually, some will be clipped. Is there a way to normalize to 100% *based on a fresh scan*? Or, do I have to UpdateStatistics, and do my own calculation of how much to normalize based on each file's MaxValue?

Well, you could do your own scanning using statistics, then make a Normalize preset and call Fields_Normalize.ForceScanResult(...) on it.

Or you could just create a Normalize preset and set Fields_Normalize.UseLastScan = false and it should scan the whole file before normalizing.

tj

Message last edited on 1/30/2006 12:46:05 PM by _TJ.

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: D-Slam
Date: 1/30/2006 1:16:31 PM

I'm creating a preset and calling

app.DoMenu("Edit.SelectAll", true);
fields2.UseLastScan = false;
fields2.NormalizeTo = SfHelpers.dBToRatio(-1);


But the results are not what I was expecting. I assumed this would normalize to -1 dB, but the files vary.

Any ideas?

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: _TJ
Date: 1/30/2006 1:44:27 PM

Please post your script. Put it inside <pre> </pre> tags so that the formatting doesn't get lost.

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: D-Slam
Date: 1/30/2006 2:09:22 PM

without all the extra bits... :-)



//-NORMALIZE ***
//Create Preset
ISfGenericEffect fx1 = app.FindEffect("[Sys] Maximize peak value");
byte[] normData = new byte[72] {
8,0,0,0,0,0,0,0,
0,0,0,0,0,0,240,63,
0,0,0,0,0,0,0,0,
223,30,36,128,147,8,119,63,
0,0,0,0,0,0,73,64,
0,0,0,0,0,0,73,64,
0,0,0,0,0,0,0,0,
0,0,0,0,0,0,240,191,
0,0,0,0,0,0,240,191,
};

// make a preset
ISfGenericPreset presetNorm = new SoundForge.SfGenericPreset("Test Preset", fx1, normData);

// get an object that will let us set fields in the preset.
Fields_Normalize fields2 = Fields_Normalize.FromPreset(presetNorm);

// set values for a few of the fields.
app.DoMenu("Edit.SelectAll", true);
fields2.UseLastScan = false;
fields2.NormalizeTo = SfHelpers.dBToRatio(-1);

// write changes back into the preset object.
fields2.ToPreset(presetNorm);

// use the preset.
app.DoEffect("Normalize", presetNorm, EffectOptions.EffectOnly);
//-END NORMALIZE


Message last edited on 1/30/2006 2:59:00 PM by D-Slam.

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: _TJ
Date: 1/30/2006 2:52:34 PM

Ok, first of all, there's

ISfGenericEffect fx1 = app.FindEffect("[Sys] Maximize peak value");

which should be

ISfGenericEffect fx1 = app.FindEffect("Normalize");


But I think the real problem is that Normalize is assuming that it only needs to scan once (i.e. it assumes that you are normalizing multiple pieces of the same file), and the preset you build from your byte array isn't telling it otherwise.

So you can either change

byte[] abData = new byte[72] {
8,0,0,0,0,0,0,0,
0,0,0,0,0,0,240,63,
0,0,0,0,0,0,0,0,
223,30,36,128,147,8,119,63,
0,0,0,0,0,0,105,64,
0,0,0,0,0,0,105,64,
0,0,0,0,0,0,0,0,
0,0,0,0,0,0,240,191,
0,0,0,0,0,0,240,191,
};
to
byte[] abData = new byte[72] {
8,0,0,0,0,0,0,0,
0,0,0,0,0,0,240,63,
0,0,0,0,0,0,0,0,
223,30,36,128,147,8,119,63,
0,0,0,0,0,0,105,64,
0,0,0,0,0,0,105,64,
0,0,0,0,1,0,0,0, // <---- the change is on this line...
0,0,0,0,0,0,240,191,
0,0,0,0,0,0,240,191,
};

Or you can add this line of code where you change fields2:


fields2.UseLastScan = false;
fields2.ForceScanResult(-1.0, -1.0); // -1.0 triggers a rescan.
fields2.NormalizeTo = SfHelpers.dBToRatio(-1);


By the way, would you please edit your post to make the lines shorter?
thanks,

tj


Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: _TJ
Date: 1/30/2006 3:00:06 PM

Oh, and as a tip: you should use file.DoEffect() rather than app.DoEffect(),
and always have a file.WaitForDoneOrCancel() after every file.DoEffect().

Otherwise, when you cancel, it's going to look like a crash and you won't be able
to exit your script gracefully.

(The reason is that when a file is busy processing, there's an implied file.WaitForDoneOrCancel() at the beginning of each file.DoEffect() or app.DoEffect() call. A cancel at that point can't be reported normally, since DoEffect doesn't return an SfStatus, so we have to treat the cancel as an exception instead.)

Calling file.WaitForDoneOrCancel() explicitly gives you an opportunity to tell the difference between a cancel and a failure.
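A minimal shape for that pattern, as a sketch only: it assumes presetNorm has been built as in the earlier post, and that WaitForDoneOrCancel() returns an SfStatus you can test (TJ mentions SfStatus above; the exact enum values here are an assumption).

```csharp
// Sketch: run the effect on the file object, then wait explicitly so a
// user cancel comes back as a status instead of looking like a crash.
ISfFileHost file = app.CurrentFile;
file.DoEffect("Normalize", presetNorm, EffectOptions.EffectOnly);

SfStatus status = file.WaitForDoneOrCancel();
if (status != SfStatus.Success)
{
    // cancelled (or failed) -- bail out of the script gracefully
    return;
}
```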

tj


Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: D-Slam
Date: 1/30/2006 3:01:33 PM

Thanks. And sorry for the long lines...

In fields2.ForceScanResult(-1.0, -1.0); // -1.0 triggers a rescan

what do the two args represent? One, you've said, retriggers the scan, but which one, and what does the other specify?

best - D

Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: _TJ
Date: 1/30/2006 3:49:26 PM

The first is the peak value of the last scan; the other is the RMS peak value. A value of -1 for the peak value is clearly nonsense, so the code treats that as a signal for "do a scan".

Otherwise, what appears to happen is that setting UseLastScan = false isn't enough to force a re-scan in all cases. If the Normalize effect thinks it has valid cached values from a previous pass, it will use those. Setting UseLastScan = true will FORCE it to use the cached values, but setting it to false doesn't force it not to use them. It's essentially saying "use them if you think they are valid".

And the validation for the cached scan values can be pretty easily fooled by doing a mixture of using Normalize from the menu and also running scripts.

So the solution is to ForceScanResult() with clearly invalid values.

tj



Subject: RE: 2 Questions: Normalization & SaveAs...
Reply by: D-Slam
Date: 1/30/2006 4:37:19 PM

thanks again. clear, helpful and responsive as always.
