...we need currents of at least 10 A to see something noticeable. Do such high currents even flow in an antenna???
I'm just answering this point for now. Most antennas are designed to present a resistive feed-point impedance R of about 50 ohms. A basic ham radio installation runs 100 W, often followed by an amplifier in the kW range. The current is easy to calculate: P = R * I², so I = sqrt(P/R).

For 100 W: I = 1.41 A
For 1 kW: I = 4.47 A

This is the current that actually flows at the antenna feed point. To reach 10 A you would need 5 kW of power, which is getting into quasi-professional equipment.

But in an antenna whose size is not negligible compared to the wavelength, or even to the quarter-wave, the current is not constant along the conductor. If the antenna is 1/4 wavelength long, the current is at the maximum supplied by the transmitter at the feed point and drops to zero at the end of the element. This video shows the current distribution in a dipole antenna as a function of its length: https://www.youtube.com/watch?v=edyFGAT_87o

To get a higher current without increasing the power, you would have to design an antenna with a lower impedance. With a 10-ohm antenna and 1 kW you would have 10 A, at least in part of the antenna. Since commercial transmitters have a 50-ohm output impedance, a ferrite transformer or an antenna tuner will be needed to do the impedance matching.

This may also answer your question about ferrites: yes, they can handle this kind of high RF current, with a proportional magnetic flux, but you have to choose the right size and type of ferrite for the frequencies used. This is a difficult project.
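If you want to play with the numbers yourself, here is a minimal Python sketch of the arithmetic above. The function names are just illustrative, and it assumes a purely resistive feed-point impedance.

import math

# Feed current for a given power into a purely resistive impedance R.
# P = R * I^2  =>  I = sqrt(P / R)
def feed_current(power_w, r_ohms=50.0):
    return math.sqrt(power_w / r_ohms)

# Power needed to push a target current through R: P = R * I^2
def power_for_current(current_a, r_ohms=50.0):
    return r_ohms * current_a ** 2

print(round(feed_current(100), 2))     # 1.41 A for 100 W into 50 ohms
print(round(feed_current(1000), 2))    # 4.47 A for 1 kW into 50 ohms
print(power_for_current(10))           # 5000 W for 10 A into 50 ohms
print(feed_current(1000, 10))          # 10.0 A for 1 kW into a 10-ohm antenna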
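And a rough sketch of the current taper along a quarter-wave element, using the usual sinusoidal standing-wave approximation (maximum at the feed, zero at the tip). The lengths and step count are just an example, not a real design.

import math

def current_along_element(i_max_a, length_m, wavelength_m, steps=10):
    # Standing-wave approximation: I(z) = I_max * sin(k * (L - z)).
    # For a quarter-wave element (L = lambda/4) the maximum sits at the
    # feed point and the current falls to zero at the tip.
    k = 2 * math.pi / wavelength_m          # wavenumber
    for n in range(steps + 1):
        z = length_m * n / steps            # distance from the feed point
        i_z = i_max_a * math.sin(k * (length_m - z))
        print(f"z = {z:4.1f} m   I = {i_z:4.2f} A")

# Example: 20 m band, quarter-wave element ~5 m, 4.47 A at the feed (1 kW into 50 ohms)
current_along_element(4.47, 5.0, 20.0)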
---------------------------
"Open your mind, but not like a trash bin"