Understanding Peak Finding; Data Smoothing - PerkinElmer LabChip GX User Manual

Understanding Peak Finding

Peak finding is the complex data analysis process of converting the raw data signal into a list of meaningful peaks. It involves smoothing the data signal, applying the selected baseline algorithm, identifying the peak baseline, detecting peaks within the smoothed signal, and discarding peaks that fail to meet user-specified thresholds. This process is controlled by the Peak Find analysis parameters selected in the Assay Analysis Window on the Peak Find tab.

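As an illustration of this sequence (not the instrument's own implementation), the following Python sketch smooths a signal, subtracts a crude baseline, detects peaks, and discards those below a height threshold. The function name, the simple percentile baseline, and the default threshold values are assumptions made for the example.

import numpy as np
from scipy.signal import savgol_filter, find_peaks

def find_sample_peaks(raw_signal, sample_rate_hz=60, filter_width_s=0.3,
                      poly_order=2, min_height=10.0):
    """Hypothetical peak-find pipeline mirroring the steps described above."""
    # 1. Smooth the raw signal with a Savitzky-Golay filter
    #    (see "Data Smoothing" below for the kernel-width arithmetic).
    window = int(round(filter_width_s * sample_rate_hz))
    if window % 2 == 0:
        window += 1                      # kernel width must be odd
    smoothed = savgol_filter(raw_signal, window, poly_order)

    # 2. Estimate and subtract a baseline (a simple placeholder here;
    #    the instrument offers several baseline algorithms).
    baseline = np.percentile(smoothed, 5)
    corrected = smoothed - baseline

    # 3. Detect peaks in the smoothed, baseline-corrected signal and
    # 4. discard peaks that fail the user-specified height threshold.
    peak_idx, props = find_peaks(corrected, height=min_height)
    peak_times_s = peak_idx / sample_rate_hz
    return list(zip(peak_times_s, props["peak_heights"]))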
Data Smoothing

The raw data signal is initially smoothed using the Filter Width and Polynomial Order parameters. This filter removes high-frequency noise from the data by performing a local weighted averaging of the data using a Savitzky-Golay convolution kernel. The filter width defines the range over which the averaging takes place: a 0.3 sec filter width averages over 0.3 seconds of data; more precisely, the data is convolved with a weight array that is 0.3 seconds wide. At the typical sampling rate of 60 Hz, this kernel is 0.3 * 60 = 18, plus 1 to give 19 points (the width is forced to be odd by algorithm requirements).
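The conversion from a filter width in seconds to a kernel size in points can be expressed as follows; the helper name and the 60 Hz default are assumptions made for illustration.

def kernel_points(filter_width_s, sample_rate_hz=60):
    """Convert a Filter Width (seconds) to an odd Savitzky-Golay kernel size."""
    n = int(round(filter_width_s * sample_rate_hz))   # 0.3 s * 60 Hz = 18 points
    if n % 2 == 0:
        n += 1                                        # force an odd width -> 19
    return n

print(kernel_points(0.3))   # 19
print(kernel_points(1.0))   # 61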
Using a very large filter width to try to reduce signal noise has an adverse effect on sharp peaks. Typically, over-filtering of sharp peaks causes side-lobe artifacts to appear. The sample shown below has both sharp peaks and signal noise, particularly near the end of the trace. Removing the noise in the broad tail of the upper marker by setting the filter width to 1.0 sec causes the early sharp peaks to lose height and become distorted.
Figure 46. Filtering Examples
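The same effect can be demonstrated numerically. The synthetic trace below (not data from the instrument) compares a 0.3 sec and a 1.0 sec filter width on a narrow peak and shows the height loss caused by the wider kernel.

import numpy as np
from scipy.signal import savgol_filter

rate_hz = 60
t = np.arange(0, 10, 1 / rate_hz)
sharp_peak = np.exp(-0.5 * ((t - 5.0) / 0.1) ** 2)          # narrow synthetic peak
noisy = sharp_peak + np.random.default_rng(0).normal(0, 0.02, t.size)

narrow = savgol_filter(noisy, 19, 2)    # 0.3 sec filter width (19 points)
wide = savgol_filter(noisy, 61, 2)      # 1.0 sec filter width (61 points)

print(f"original peak height: {sharp_peak.max():.2f}")
print(f"0.3 sec filter width: {narrow.max():.2f}")   # modest height loss
print(f"1.0 sec filter width: {wide.max():.2f}")     # peak flattened and distorted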
