Log Resolution vs. Linear Resolution - HP 3563A Operating Manual

Control systems analyzer

Using the Log Resolution Mode

Log Resolution vs. Linear Resolution

Measurements differ between the log resolution and linear resolution modes. Specifically, you should be aware of the following when taking measurements in the log resolution mode:

- Frequency spans are limited to 1, 2, 3, 4 or 5 decades, and the resolution is always 80 lines/decade.
- The minimum start frequency in log resolution is 0.1 Hz.
- Log resolution uses a predefined windowing function for all measurements, so windows are not selectable.
- Overall measurement time is determined by the time required to measure the lowest decade and the number of averages taken.
- Triggering and time averaging are not applicable to log resolution. (Delayed triggering can be used when measuring throughput files from disc; see chapter 9 for more information.)
- Filtered time and filtered spectrum displays are not available.

Log resolution mode is optimized for wide-band frequency response measurements. The trade-off is a loss of accuracy (amplitude and phase) in spectra of discrete frequencies.

As mentioned in chapter 2, the advantages of using the log resolution mode are:

- Proportional-bandwidth measurements that are faster than log swept sine.
- Better resolution at low frequencies and less measurement variance at higher frequencies (when compared to the linear resolution mode).
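To make the fixed 80 lines/decade resolution concrete, the short sketch below generates the log-spaced frequency grid such a measurement would cover. The spacing formula f_k = f_start x 10^(k/80) is an assumption about how a logarithmic grid at 80 lines/decade is laid out, not a detail taken from the manual; the 1-5 decade span limit and 0.1 Hz minimum start frequency come from the constraints listed above.

```python
def log_resolution_grid(f_start, decades):
    """Return a log-spaced frequency grid at 80 lines/decade.

    Illustrative only: mirrors the span limits described for the
    log resolution mode (1-5 decades, start frequency >= 0.1 Hz).
    """
    if not 1 <= decades <= 5:
        raise ValueError("frequency spans are limited to 1-5 decades")
    if f_start < 0.1:
        raise ValueError("minimum start frequency is 0.1 Hz")
    # 80 points per decade, plus one endpoint to close the span
    return [f_start * 10 ** (k / 80) for k in range(80 * decades + 1)]

# A 4-decade span starting at the minimum 0.1 Hz reaches 1 kHz.
grid = log_resolution_grid(0.1, 4)
```

Note how the point density is proportional to frequency: each decade gets the same 80 lines, so the grid is densest (in absolute Hz) at the low end, which matches the manual's claim of better resolution at low frequencies.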