Engineering:Radar mile


Radar mile or radar nautical mile is an auxiliary constant for converting a (delay) time to the corresponding scale distance on the radar display.[1] Radar timing is usually expressed in microseconds. To relate radar timing to the distance traveled by radar energy, note that the energy radiated by a radar set travels at approximately 984 feet per microsecond, roughly the speed of electromagnetic waves in a vacuum. With a nautical mile being approximately 6,080 feet, the approximate time required for radar energy to travel to a target one nautical mile away and return can be found with the following calculation:

[math]\displaystyle{ 1~\text{radar mile} = \frac{2 \cdot 6080~\mathrm{ft}}{984~\mathrm{ft~per~microsecond}} \approx 12.35~\text{µs} }[/math]
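The same calculation, written as a short Python sketch; the constants are the rounded figures used above, and the names are illustrative rather than taken from any radar library:

<syntaxhighlight lang="python">
# Approximate constants from the text
FEET_PER_NAUTICAL_MILE = 6080.0   # ft per nautical mile (rounded)
FEET_PER_MICROSECOND = 984.0      # propagation speed, ~speed of light in vacuum

# Two-way travel: the pulse must reach the target and the echo must return,
# so the path length is twice the range.
radar_mile_us = 2 * FEET_PER_NAUTICAL_MILE / FEET_PER_MICROSECOND
print(f"1 radar mile ≈ {radar_mile_us:.2f} µs")   # ≈ 12.36 µs with these rounded values
</syntaxhighlight>

With these rounded constants the result is about 12.36 µs; the commonly quoted 12.35 µs follows from slightly more precise values (about 6,076 ft per nautical mile and 983.6 ft/µs).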
A certain amount of time elapses between transmission of the sounding pulse and reception of its echo; if the target is exactly one nautical mile away, that time is one radar mile.

A pulse-type radar set transmits a short burst of electromagnetic energy. The target range is determined by measuring the time that elapses while the pulse travels to the target and returns. Because two-way travel is involved, a total time of 12.35 microseconds per nautical mile elapses between the start of the pulse from the antenna and its return to the antenna from a target at a range of 1 nautical mile. In equation form, this is:

[math]\displaystyle{ \text{range} = \frac{\mathrm{elapsed~time}}{\mathrm{radar~mile}} = \frac{t_d}{12.35~\text{µs}} }[/math][2]
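As a minimal sketch of this conversion, assuming the delay [math]\displaystyle{ t_d }[/math] is measured in microseconds (function and variable names are illustrative):

<syntaxhighlight lang="python">
RADAR_MILE_US = 12.35   # two-way delay per nautical mile of range, from the text

def range_nautical_miles(elapsed_time_us: float) -> float:
    """Convert a measured echo delay (µs) to target range (nautical miles)."""
    return elapsed_time_us / RADAR_MILE_US

# Example: an echo received 123.5 µs after the pulse left the antenna
print(range_nautical_miles(123.5))   # 10.0 nautical miles
</syntaxhighlight>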


References