Radar mile

Radar mile or radar nautical mile is an auxiliary constant for converting an echo delay time to the corresponding scale distance on the radar display.[1]

Radar timing is usually expressed in microseconds. To relate radar timing to the distances travelled by radar energy, note that energy radiated from a radar set travels at approximately 984 feet per microsecond, roughly the speed of electromagnetic waves in a vacuum. Since a nautical mile is approximately 6,080 feet, the approximate time required for radar energy to travel one nautical mile is:

6,080 ft ÷ 984 ft/µs ≈ 6.18 microseconds

A certain amount of time elapses between the transmission of the sounding pulse and the reception of its echo; if the target is exactly one nautical mile away, that round-trip time is one radar mile.

A pulse-type radar set transmits a short burst of electromagnetic energy. The target range is determined by measuring the time that elapses while the pulse travels to the target and returns. Because two-way travel is involved, a total time of 12.35 microseconds per nautical mile elapses between the start of the pulse from the antenna and its return to the antenna from a target at a range of 1 nautical mile. In equation form, this is:

1 radar mile = (2 × 6,080 ft) ÷ (984 ft/µs) ≈ 12.35 microseconds[2]
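The conversion above can be sketched in a few lines of Python. This is only an illustration of the arithmetic, computed from the exact speed of light and the international nautical mile (1,852 m) rather than the rounded 984 ft/µs figure; the function names `radar_mile_us` and `range_nm` are ours, not from any radar library.

```python
# Round-trip travel time for one nautical mile of radar range.
C = 299_792_458.0        # speed of light in vacuum, m/s
NAUTICAL_MILE = 1852.0   # international nautical mile, metres

def radar_mile_us():
    """Two-way travel time to a target 1 NM away, in microseconds."""
    return 2 * NAUTICAL_MILE / C * 1e6

def range_nm(echo_delay_us):
    """Convert a measured echo delay (in microseconds) to range in NM."""
    return echo_delay_us / radar_mile_us()

print(round(radar_mile_us(), 2))   # one radar mile, ~12.36 us
print(round(range_nm(123.55), 1))  # a 123.55 us echo corresponds to ~10 NM
```

Dividing any measured echo delay by the radar mile directly yields the target range in nautical miles, which is how the constant is used to scale the display.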


References

  1. "NEETS - Naval Electrical Engineering Training Series". Retrieved 2020-12-31.
  2. "Radartutorial". C. Wolff. November 1998. Retrieved 2021-01-01.


This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.