I'm trying to find a method or instrument to measure return loss for a GPS front-end consisting of an LNA that's dynamically controlled with AGC. The front-end of the GPS IC is designed for signals around -130 dBm at L1 (1.57542 GHz). The network analyzer I'm using outputs a signal at -60 dBm.
That's the lowest it can go without
being overwhelmed by noise. The problem is that the S11 impedance on the Smith chart display appears almost as a short. The GPS receiver itself works OK. To me it seems
obvious what's happening
here. The network analyzer's relatively large signal is forcing the LNA into compression. Also, the strong signal is forcing the AGC to switch the receiver into its low gain mode.
The receiver, on start-up, monitors the signal strength and chooses either its low or high gain mode. I want to measure the
return loss in high gain mode.
So with all that said, does anyone have a suggestion as to how I can measure return loss using a stimulus level of around -130 dBm (up to -110 dBm would probably be OK too)?
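For what it's worth, here's the rough back-of-the-envelope I've been doing to convince myself those levels are workable; the 10 Hz IF bandwidth and 10 dB return loss below are just numbers I picked, and it ignores the instrument's own noise figure:

    import math

    # Feasibility sketch for a -130 dBm stimulus (assumed values, not measurements)
    kT_dbm_per_hz = -174.0     # room-temperature thermal noise density
    if_bandwidth_hz = 10.0     # assumed narrow IF/resolution bandwidth
    stimulus_dbm = -130.0      # desired test level at the LNA input
    return_loss_db = 10.0      # return loss I'd like to be able to resolve

    noise_floor_dbm = kT_dbm_per_hz + 10 * math.log10(if_bandwidth_hz)
    reflected_dbm = stimulus_dbm - return_loss_db
    margin_db = reflected_dbm - noise_floor_dbm

    print(f"noise floor:      {noise_floor_dbm:.1f} dBm")  # -164.0 dBm
    print(f"reflected signal: {reflected_dbm:.1f} dBm")    # -140.0 dBm
    print(f"margin:           {margin_db:.1f} dB")         # 24.0 dB

So on paper a receiver with a narrow enough IF bandwidth has some margin even at -130 dBm, which is why I'm hoping an instrument exists that can do it.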
Are there any vector network analyzers out
there that can do this? Is a special test set needed?
How about using a typical spectrum analyzer? I just
need return loss, not necessarily the phase information.
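If the spectrum analyzer route makes sense, the math I have in mind is purely scalar: sample forward and reflected power through a dual directional coupler and take the difference. A quick sketch of that arithmetic (the power numbers are made up, not real readings, and it ignores coupler directivity):

    def return_loss_db(p_forward_dbm: float, p_reflected_dbm: float) -> float:
        """Scalar return loss from forward/reflected power seen at matching coupled ports."""
        # With the same coupling factor on both arms, it cancels in the difference.
        return p_forward_dbm - p_reflected_dbm

    # Hypothetical coupled-port readings off the spectrum analyzer:
    p_fwd = -150.0   # forward sample, dBm
    p_rev = -162.0   # reflected sample, dBm

    rl = return_loss_db(p_fwd, p_rev)
    gamma = 10 ** (-rl / 20)   # magnitude of the reflection coefficient
    print(f"return loss ~ {rl:.1f} dB, |Gamma| ~ {gamma:.3f}")   # 12.0 dB, 0.251

The catch, of course, is whether a typical spectrum analyzer can actually see levels that low, which is part of what I'm asking.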
Any help on this appreciated!