Hello-
My expertise in RF is limited: I write code in C and assembly, design analog circuitry up to 1 MHz or so, and push a lot of paper.
I am trying to calculate the field strength in a TEM cell given a signal generator output in dBm and the septum length of the cell. The signal generator output impedance is 50 ohms, the cell looks like a 50 ohm load, and the test frequency of 930 MHz is well within the passband of the cell. Let's put aside cable loss, adaptor loss, SWR, and the accuracy of my measuring instruments for the time being.
The TEM cell manual gives a method for calculating the field strength in the cell at the test-specimen location:
1. Record the EMF applied at the cell input.
2. Divide by the septum length.
3. The result is in volts per meter.
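As I read it, that's simply E [V/m] = EMF [V] / d [m], where d is the septum length from the cell spec.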
I hook up an HP signal generator to the cell and set the output to -86 dBm.
1. To calculate the EMF applied to the cell, I use the formula P = E^2/R, rearranged to E = sqrt(R*P). I use 50 ohms for R, and I get P in watts by inverting the dBm definition P_dBm = 10*log10(P/1 mW), i.e. P = 1 mW * 10^(P_dBm/10).
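In C terms (my native tongue), here is the sanity-check program I'm using for step 1. One assumption of mine: I'm taking "EMF" to mean the open-circuit source voltage, which for a 50 ohm source driving a matched 50 ohm load is twice the voltage at the load.

    #include <math.h>
    #include <stdio.h>

    /* Invert the dBm definition: P = 1 mW * 10^(dBm/10). */
    static double dbm_to_watts(double dbm)
    {
        return 1e-3 * pow(10.0, dbm / 10.0);
    }

    int main(void)
    {
        double p   = dbm_to_watts(-86.0); /* ~2.5e-12 W                   */
        double v   = sqrt(50.0 * p);      /* matched-load voltage, ~11 uV */
        double emf = 2.0 * v;             /* open-circuit EMF, ~22 uV     */
        printf("P = %.3g W, V = %.3g V, EMF = %.3g V\n", p, v, emf);
        return 0;
    }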
2. I divide by the septum length. I don't have the work in front of me, but I get something close to 40-50 uV/m.
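Sanity check with a made-up dimension, since I don't have the cell spec in front of me: -86 dBm is about 2.5e-12 W, so the matched-load voltage is about 11 uV and the EMF about 22 uV; a hypothetical septum length of 0.5 m would then give roughly 45 uV/m, which lands in that range.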
3. I put a 50 ohm dipole antenna (gain +2.5 dBi) at the test point, but I measure -107 dBm there.
4. Using the formulas from step 1, I do not get 40-50 uV/m... I get something substantially less.
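In case my arithmetic is the problem, here is step 4 the way I actually did it, i.e. treating the -107 dBm reading as power delivered to 50 ohms:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Same formulas as step 1, applied to the measured power. */
        double pr = 1e-3 * pow(10.0, -107.0 / 10.0); /* ~2.0e-14 W */
        double v  = sqrt(50.0 * pr);                 /* ~1.0 uV    */
        printf("Pr = %.3g W, V = %.3g V\n", pr, v);
        return 0;
    }

That gives about 1 uV at the antenna terminals, nowhere near the 40-50 uV/m from step 2, whatever length I divide by.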
What's going on? Should I be using 377 ohms instead of 50 ohms in step 4?
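My guess at where 377 ohms would enter, in case it sharpens the question: the free-space route from received power back to field strength goes through the antenna's effective aperture, Ae = G*lambda^2/(4*pi), and then E = sqrt(eta0 * Pr / Ae) with eta0 = 377 ohms. A sketch, under my assumptions that free-space relations hold inside the cell and that mismatch and the dipole's perturbation of the field can be ignored:

    #include <math.h>
    #include <stdio.h>

    #define C0   3.0e8            /* speed of light, m/s        */
    #define ETA0 377.0            /* free-space impedance, ohms */
    #define PI   3.14159265358979

    /* E = sqrt(eta0 * Pr / Ae), with Ae = G * lambda^2 / (4*pi). */
    static double field_from_power(double pr_w, double gain_dbi, double f_hz)
    {
        double lambda = C0 / f_hz;
        double g      = pow(10.0, gain_dbi / 10.0); /* dBi -> linear */
        double ae     = g * lambda * lambda / (4.0 * PI);
        return sqrt(ETA0 * pr_w / ae);
    }

    int main(void)
    {
        double pr = 1e-3 * pow(10.0, -107.0 / 10.0); /* -107 dBm */
        printf("E = %.3g V/m\n", field_from_power(pr, 2.5, 930e6));
        return 0;
    }

For me that prints about 23 uV/m, at least the right order of magnitude, but I don't know whether it is the right model inside a TEM cell.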
Please help this victim of a lumped-systems education make progress!
Thanks.