Information is needed to finalize a cable assembly
design.
The current design consists of a
coil cord and two stainless steel connectors. The
The EMI shield is Sn/Fe/Cu mesh tape wrapped around the wire bundle with a 50% overlap and terminated to each connector with a stainless steel clamp.
The uncoiled cable length is about 2 meters. Normal braiding is not possible because the cable must coil.
This cable needs to provide at least 40 dB of attenuation at a 400 MHz test frequency. It also needs to meet MIL-STD-461 CE, CS, RE, and RS requirements at frequencies ranging from 10 kHz to 18 GHz.
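As a sanity check on the 40 dB figure (my own back-of-the-envelope in Python, using the standard field-ratio definition of shielding effectiveness, not anything from the spec):

# Convert a shielding-effectiveness spec in dB to the implied field ratio.
# Standard definition: SE(dB) = 20 * log10(E_unshielded / E_shielded).
def field_ratio(se_db):
    return 10 ** (se_db / 20.0)

print(field_ratio(40.0))  # 100.0 -> the shield must reduce the field 100:1 at 400 MHz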
It is acceptable to qualify by design, which avoids expensive swept-frequency tests and expedites the whole process. Normally a shielded cable can qualify by design if micro-ohmmeter measurements show a very low resistance from shell to shell and across the faying surfaces at each end.
The requirement for this cable is <2.5 milliohms from each shell to the overall shield on the other side of the clamp. The cable design meets this limit.
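For reference, here is the arithmetic behind why I would expect a milliohm-class shell-to-shell reading (the 2.5 mOhm figure comes from the requirement above; the shield-run resistance is my assumption):

# Expected end-to-end budget: two clamp terminations plus the shield run.
clamp_each = 2.5e-3   # ohm, per-end termination limit (from the requirement)
shield_run = 20e-3    # ohm, assumed resistance of the shield over ~2 m

expected = 2 * clamp_each + shield_run
print(f"expected shell-to-shell: {expected * 1e3:.0f} mOhm")  # ~25 mOhm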
Shell-to-shell values are the problem. Actual micro-ohmmeter measurements give between 1.25 and 1.35 ohms shell to shell. A braided cable would typically measure in the milliohm range, depending on the weight of shield per meter and the overall length. Because of this difference, it may not be possible to qualify by design based on provable milliohm measurements.
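To put rough numbers on the braid comparison (a sketch only; the 2 mm^2 effective copper cross-section is my assumption, not a measured braid spec):

# DC resistance of a hypothetical copper braid shield: R = rho * L / A.
rho_cu = 1.72e-8   # ohm*m, resistivity of copper
length = 2.0       # m, uncoiled cable length
area = 2.0e-6      # m^2, assumed effective braid cross-section (2 mm^2)

r_braid = rho_cu * length / area
print(f"braid estimate: {r_braid * 1e3:.1f} mOhm")          # ~17 mOhm
print(f"measured mesh tape is ~{1.3 / r_braid:.0f}x that")  # ~76x higher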
My questions are:

01 Is the 1.25 to 1.35 ohm range too high for mesh tape? If so, why, and how can we get into the milliohm range?

02 If a milliohm range is not necessary for DC resistance tests to serve as proof of compliance, what data is available to support that position?

03 Would aluminized Mylar foil help?
EMI testing hasn't been required at our level of manufacture for many years. Usually the odd job that requires it goes to an outside test vendor, but time constraints favor a faster solution.
Any help is greatly appreciated.