
*   Accuracy
The Mass Frontier application uses two parameters associated with the m/z value: Accuracy and Resolution. Resolution is the smallest difference in m/z between two centroid peaks that a detector can distinguish. Accuracy is the estimated difference between the observed and the true m/z value of a peak.
The Mass Frontier application determines these values from the raw data file. If you select the Use User Defined option on the Mass Accuracy page, the application sets both values to the same user-defined value.
Use the accuracy settings to differentiate adjacent peaks (in spectra) and m/z values (in calculations). Because the application processes only centroid spectra, differentiation is defined as the spacing between resolved peaks (spectra) or as the smallest distinguishable difference in m/z values (calculations). No additional parameters, such as peak width, are taken into account.
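The differentiation rule described above can be sketched as follows. This is an illustrative example only, not Mass Frontier code; the function name and list-based grouping are my own. It groups sorted centroid m/z values whose spacing falls within an absolute tolerance ΔM (in amu), using spacing alone, with no peak-width model:

```python
# Illustrative sketch (not part of Mass Frontier): grouping centroid
# peaks whose spacing falls within an absolute accuracy band delta_m
# (in amu). Only the spacing between centroids is considered.

def group_peaks(mz_values, delta_m):
    """Group centroid m/z values; neighbours within delta_m are merged."""
    groups = []
    for mz in sorted(mz_values):
        if groups and mz - groups[-1][-1] <= delta_m:
            groups[-1].append(mz)   # indistinguishable from previous peak
        else:
            groups.append([mz])     # resolved: start a new group
    return groups
```

For example, with a 0.005 amu tolerance, peaks at m/z 100.000 and 100.003 merge into one group, while a peak at m/z 100.050 remains resolved.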
Note  Changing the accuracy in the Tolerance Settings dialog box does not affect spectra that you open in the Chromatogram Processor or fragments that the application has already calculated in the Fragments & Mechanisms and Fragments Comparator modules. When you require spectra or fragments with a new setting, you must reload the spectra or regenerate the fragments.
1. Choose Option > Settings from the Mass Frontier main menu.
The Options dialog box opens.
2. Click Mass/Abundance in the Parameters section.
The Mass Accuracy page in the Tolerance Settings dialog box displays options for specifying accuracy settings.
Mass Accuracy page
3. Select a tolerance type. The application supports these tolerance types:
ppm: 1 000 000 × ΔM/M
Mass Accuracy: ΔM in amu or ΔM in mmu
where:
M = m/z (mass-to-charge ratio)
ΔM = M2 – M1 (M1 and M2 are two adjacent peaks)
Peaks (m/z values) that fall within the ΔM band are considered identical.
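As a numeric illustration of the ppm formula above (the function name is my own, not part of the application):

```python
# ppm tolerance as defined above: ppm = 1_000_000 * dM / M,
# where dM is the spacing between two adjacent peaks.

def ppm_difference(m1, m2):
    """Relative difference between two adjacent peaks, in ppm."""
    return 1_000_000 * abs(m2 - m1) / m1

# Peaks at m/z 400.0000 and 400.0020 differ by roughly 5 ppm.
```

At higher m/z the same ppm tolerance corresponds to a wider absolute ΔM band, which is why ppm and absolute (amu/mmu) tolerance types behave differently across a spectrum.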
4. To set a tolerance for imported spectra or GC/LC/MS chromatograms from a processed file, do one of the following in the Accuracy for Experimental Data area:
Select the Determine from Source option to adopt the tolerance settings saved in a file.
–or–
Select the Use User Defined option to use the same tolerance settings specified for calculations in the Accuracy for Calculated Data area.
You cannot improve the resolution of experimental data by choosing a user-defined tolerance. Avoid setting tolerance values finer than the resolution of the mass spectrometer used for data acquisition. The Use User Defined option can artificially reduce the resolution of experimental data when you work with low- and high-resolution spectra in the same data set, for example in spectra search, target analysis, or classification.
For additional information, see Using the Search Utilities for spectra search, Searching Chromatographic Libraries for target analysis, or Spectra Classifier Module for classification. See also Supported GC/MS and LC/MS Data File Formats.


Related Topics:
  Monoisotopic Mass
  Precision
Copyright 1998 - 2013 HighChem Ltd., Slovakia