
RADIO WAVE

Radio waves are a type of electromagnetic radiation with wavelengths in the electromagnetic spectrum longer than infrared light. Radio waves have frequencies ranging from as high as 300 gigahertz (GHz) down to as low as 30 hertz (Hz).[1] At 300 GHz, the corresponding wavelength is 1 mm, and at 30 Hz it is 10,000 km. Like all other electromagnetic waves, radio waves travel at the speed of light in vacuum. They are generated by electric charges undergoing acceleration, such as time-varying electric currents.[2] Naturally occurring radio waves are emitted by lightning and by astronomical objects.
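
As a minimal sketch of the wavelength-frequency relation used above (λ = c/f, assuming the standard vacuum speed of light; this snippet is illustrative, not from the source), the band edges quoted for radio waves can be checked directly:

    C = 299_792_458  # speed of light in vacuum, m/s (assumed standard value)

    def wavelength_m(frequency_hz: float) -> float:
        """Return the free-space wavelength in metres for a frequency in hertz."""
        return C / frequency_hz

    print(wavelength_m(300e9))  # 300 GHz -> ~0.001 m, i.e. about 1 mm
    print(wavelength_m(30))     # 30 Hz   -> ~9,993,000 m, i.e. roughly 10,000 km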

Radio waves are generated artificially by transmitters and received by radio receivers, using antennas. Radio waves are very widely used in modern technology for fixed and mobile radio communication, broadcasting, radar and radio navigation systems, communications satellites, wireless computer networks and many other applications. Different frequencies of radio waves have different propagation characteristics in the Earth's atmosphere; long waves can diffract around obstacles like mountains and follow the contour of the earth (ground waves), shorter waves can reflect off the ionosphere and return to earth beyond the horizon (sky waves), while much shorter wavelengths bend or diffract very little and travel on a line of sight, so their propagation distances are limited to the visual horizon.

To prevent interference between different users, the artificial generation and use of radio waves is strictly regulated by law, coordinated by an international body called the International Telecommunication Union (ITU), which defines radio waves as "electromagnetic waves of frequencies arbitrarily lower than 3 000 GHz, propagated in space without artificial guide".[3] The radio spectrum is divided into a number of radio bands on the basis of frequency, allocated to different uses.
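
The band division mentioned above can be illustrated with the nominal ITU "decade" designations; the specific names and limits below are assumed for illustration and are not given in the text:

    # Nominal ITU decade band designations in hertz (assumed, illustrative only).
    ITU_BANDS = [
        ("ELF", 3, 30),
        ("SLF", 30, 300),
        ("ULF", 300, 3e3),
        ("VLF", 3e3, 30e3),
        ("LF", 30e3, 300e3),
        ("MF", 300e3, 3e6),
        ("HF", 3e6, 30e6),
        ("VHF", 30e6, 300e6),
        ("UHF", 300e6, 3e9),
        ("SHF", 3e9, 30e9),
        ("EHF", 30e9, 300e9),
    ]

    def band_of(frequency_hz: float) -> str:
        """Return the nominal ITU band name containing the given frequency."""
        for name, lo, hi in ITU_BANDS:
            if lo <= frequency_hz < hi:
                return name
        return "outside the 3 Hz - 300 GHz range"

    print(band_of(100e6))  # a 100 MHz FM broadcast frequency falls in the VHF band
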
MICROWAVE

Microwaves are a form of electromagnetic radiation with wavelengths ranging from about one meter to one millimeter, with frequencies between 300 MHz (1 m) and 300 GHz (1 mm).[1][2][3][4][5] Different sources define different frequency ranges as microwaves; the above broad definition includes both UHF and EHF (millimeter wave) bands. A more common definition in radio-frequency engineering is the range between 1 and 100 GHz (wavelengths between 0.3 m and 3 mm).[2] In all cases, microwaves include the entire SHF band (3 to 30 GHz, or 10 to 1 cm) at minimum. Frequencies in the microwave range are often referred to by their IEEE radar band designations: S, C, X, Ku, K, or Ka band, or by similar NATO or EU designations.
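
As a rough illustration of the letter designations just mentioned, the sketch below assumes the nominal IEEE Std 521 band limits; treat the exact figures as assumptions rather than definitions from the text:

    # Nominal IEEE radar-band limits in GHz (assumed for illustration; the text
    # above only names the letters S, C, X, Ku, K, Ka).
    IEEE_RADAR_BANDS_GHZ = {
        "S":  (2, 4),
        "C":  (4, 8),
        "X":  (8, 12),
        "Ku": (12, 18),
        "K":  (18, 27),
        "Ka": (27, 40),
    }

    def radar_band(frequency_ghz: float) -> str:
        """Return the IEEE letter designation for a microwave frequency, if any."""
        for name, (lo, hi) in IEEE_RADAR_BANDS_GHZ.items():
            if lo <= frequency_ghz < hi:
                return name
        return "no listed designation"

    print(radar_band(10))  # a 10 GHz signal lies in the X band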

The prefix micro- in microwave is not meant to suggest a wavelength in the micrometer range. Rather, it indicates that microwaves are "small" (having shorter wavelengths) compared to the radio waves used prior to microwave technology. The boundaries between far infrared, terahertz radiation, microwaves, and ultra-high-frequency radio waves are fairly arbitrary and are used variously between different fields of study.

Microwaves travel by line-of-sight; unlike lower frequency radio waves they do not
diffract around hills, follow the earth's surface as ground waves, or reflect from the ionosphere, so
terrestrial microwave communication links are limited by the visual horizon to about 40 miles (64
km). At the high end of the band they are absorbed by gases in the atmosphere, limiting practical
communication distances to around a kilometer. Microwaves are widely used in modern technology,
for example in point-to-point communication links, wireless networks, microwave radio relay
networks, radar, satellite and spacecraft communication, medical diathermy and cancer treatment,
remote sensing, radio astronomy, particle accelerators, spectroscopy, industrial heating, collision
avoidance systems, garage door openers and keyless entry systems, and for cooking food in
microwave ovens.
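
The roughly 40-mile limit quoted above can be estimated from simple geometry; the sketch below assumes the standard geometric-horizon approximation d ≈ √(2Rh) and a mean Earth radius, neither of which is stated in the text:

    import math

    EARTH_RADIUS_M = 6_371_000  # mean Earth radius, assumed value

    def horizon_km(antenna_height_m: float) -> float:
        """Approximate distance in km to the geometric horizon from a given antenna height."""
        return math.sqrt(2 * EARTH_RADIUS_M * antenna_height_m) / 1000

    # Two 50 m towers can "see" each other out to roughly twice this distance:
    print(2 * horizon_km(50))  # ~50 km, the same order as the ~64 km quoted above
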
INFRARED WAVE

Infrared radiation (IR), sometimes called infrared light, is electromagnetic radiation (EMR) with wavelengths longer than those of visible light. It is therefore generally invisible to the human eye, although IR at wavelengths up to 1050 nanometers (nm) from specially pulsed lasers can be seen by humans under certain conditions.[1][2][3][4] IR wavelengths extend from the nominal red edge of the visible spectrum at 700 nanometers (frequency 430 THz) to 1 millimeter (300 GHz).[5] Most of the thermal radiation emitted by objects near room temperature is infrared. As with all EMR, IR carries radiant energy and behaves both like a wave and like its quantum particle, the photon.
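
To illustrate the photon picture mentioned above, the sketch below evaluates E = hc/λ at the two edges of the infrared range; the constants are standard values assumed here, not taken from the text:

    H = 6.626e-34   # Planck constant, J*s (assumed standard value)
    C = 2.998e8     # speed of light, m/s
    EV = 1.602e-19  # joules per electronvolt

    def photon_energy_ev(wavelength_m: float) -> float:
        """Photon energy in electronvolts for a given wavelength in metres."""
        return H * C / wavelength_m / EV

    print(photon_energy_ev(700e-9))  # red edge, 700 nm -> ~1.8 eV
    print(photon_energy_ev(1e-3))    # 1 mm (300 GHz)   -> ~1.2e-3 eV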

Infrared radiation was discovered in 1800 by astronomer Sir William Herschel, who found a type of invisible radiation in the spectrum lower in energy than red light, by means of its effect on a thermometer.[6] Slightly more than half of the total energy from the Sun was eventually found to arrive on Earth in the form of infrared. The balance between absorbed and emitted infrared radiation has a critical effect on Earth's climate.

Infrared radiation is emitted or absorbed by molecules when they change their rotational-vibrational movements. It excites vibrational modes in a molecule through a change in the dipole moment, making it a useful frequency range for study of these energy states for molecules of the proper symmetry. Infrared spectroscopy examines absorption and transmission of photons in the infrared range.
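
In infrared spectroscopy, band positions are customarily quoted as wavenumbers in cm⁻¹; the sketch below assumes that convention (it is not stated in the text) and converts a hypothetical band position to wavelength and photon energy:

    H = 6.626e-34   # Planck constant, J*s (assumed standard value)
    C = 2.998e8     # speed of light, m/s
    EV = 1.602e-19  # joules per electronvolt

    def wavenumber_to_wavelength_um(wavenumber_cm: float) -> float:
        """Wavelength in micrometres for a band position given in cm^-1."""
        return 1e4 / wavenumber_cm

    def wavenumber_to_energy_ev(wavenumber_cm: float) -> float:
        """Photon energy in eV for a band position given in cm^-1."""
        return H * C * (wavenumber_cm * 100) / EV

    # A hypothetical vibrational band near 1700 cm^-1:
    print(wavenumber_to_wavelength_um(1700))  # ~5.9 um, in the mid-infrared
    print(wavenumber_to_energy_ev(1700))      # ~0.21 eV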

Infrared radiation is used in industrial, scientific, military, law enforcement, and medical applications. Night-vision devices using active near-infrared illumination allow people or animals to be observed without the observer being detected. Infrared astronomy uses sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds, to detect objects such as planets, and to view highly red-shifted objects from the early days of the universe.[8] Infrared thermal-imaging cameras are used to detect heat loss in insulated systems, to observe changing blood flow in the skin, and to detect overheating of electrical apparatus.

Extensive uses for military and civilian applications include target acquisition,
surveillance, night vision, homing, and tracking. Humans at normal body temperature radiate chiefly
at wavelengths around 10 μm (micrometers). Non-military uses include thermal efficiency analysis,
environmental monitoring, industrial facility inspections, detection of grow-ops, remote
temperature sensing, short-range wireless communication, spectroscopy, and weather forecasting.
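
The ~10 μm figure quoted above can be checked with Wien's displacement law, which is assumed here and not stated in the text:

    WIEN_B = 2.898e-3  # Wien displacement constant, m*K (assumed standard value)

    def peak_wavelength_um(temperature_k: float) -> float:
        """Peak wavelength in micrometres of blackbody emission at a given temperature."""
        return WIEN_B / temperature_k * 1e6

    print(peak_wavelength_um(310))  # skin at ~310 K -> ~9.3 um, i.e. around 10 um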

VISIBLE LIGHT WAVE

The visible spectrum is the portion of the electromagnetic spectrum that is visible to the
human eye. Electromagnetic radiation in this range of wavelengths is called visible light or simply
light. A typical human eye will respond to wavelengths from about 380 to 740 nanometers.[1] In
terms of frequency, this corresponds to a band in the vicinity of 430–770 THz.

The spectrum does not contain all the colors that the human eyes and brain can distinguish.
Unsaturated colors such as pink, or purple variations like magenta, for example, are absent because
they can only be made from a mix of multiple wavelengths. Colors containing only one wavelength
are also called pure colors or spectral colors.

Visible wavelengths pass largely unattenuated through the Earth's atmosphere via the
"optical window" region of the electromagnetic spectrum. An example of this phenomenon is when
clean air scatters blue light more than red light, and so the midday sky appears blue (apart from the
area around the sun which appears white because the light is not scattered as much). The optical
window is also referred to as the "visible window" because it overlaps the human visible response
spectrum. The near infrared (NIR) window lies just outside the range of human vision, as do the medium wavelength infrared (MWIR) window and the long wavelength or far infrared (LWIR or FIR) window, although other animals may perceive them.
ULTRAVIOLET WAVE

Ultraviolet (UV) is electromagnetic radiation with wavelengths from 10 nm to 400 nm, shorter
than those of visible light but longer than X-rays. UV radiation is present in sunlight, and constitutes
about 10% of the total electromagnetic radiation output from the Sun. It is also produced by electric
arcs and specialized lights, such as mercury-vapor lamps, tanning lamps, and black lights. Although
long-wavelength ultraviolet is not considered an ionizing radiation because its photons lack the
energy to ionize atoms, it can cause chemical reactions and causes many substances to glow or
fluoresce. Consequently, the chemical and biological effects of UV are greater than simple heating
effects, and many practical applications of UV radiation derive from its interactions with organic
molecules.
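
To illustrate why long-wavelength UV is described above as non-ionizing, the sketch below compares photon energies at the two UV band edges with the ionization energy of hydrogen; the constants and the 13.6 eV figure are assumptions, not taken from the text:

    H = 6.626e-34   # Planck constant, J*s (assumed standard value)
    C = 2.998e8     # speed of light, m/s
    EV = 1.602e-19  # joules per electronvolt

    def photon_energy_ev(wavelength_nm: float) -> float:
        """Photon energy in eV for a wavelength given in nanometres."""
        return H * C / (wavelength_nm * 1e-9) / EV

    print(photon_energy_ev(400))  # long-wavelength UV edge  -> ~3.1 eV
    print(photon_energy_ev(10))   # short-wavelength UV edge -> ~124 eV
    # Compared with the ~13.6 eV needed to ionize hydrogen, photons near 400 nm
    # fall far short, while extreme-UV photons exceed it comfortably.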

Short-wave ultraviolet light damages DNA and sterilizes surfaces with which it comes into
contact. For humans, suntan and sunburn are familiar effects of exposure of the skin to UV light,
along with an increased risk of skin cancer. The amount of UV light produced by the Sun means that
the Earth would not be able to sustain life on dry land if most of that light were not filtered out by
the atmosphere.[1] More energetic, shorter-wavelength "extreme" UV below 121 nm ionizes air so
strongly that it is absorbed before it reaches the ground.[2] However, ultraviolet light is also
responsible for the formation of bone-strengthening vitamin D in most land vertebrates, including
humans (specifically, UVB).[3] The UV spectrum thus has effects both beneficial and harmful to life.

The lower wavelength limit of human vision is conventionally taken as 400 nm, so ultraviolet
rays are invisible to humans, although some people can perceive light at slightly shorter wavelengths
than this. Insects, birds, and some mammals can see near-UV (i.e. slightly shorter wavelengths than
humans can see).
X-RAY

X-rays make up X-radiation, a form of high-energy electromagnetic radiation. Most X-rays have a wavelength ranging from 0.01 to 10 nanometres, corresponding to frequencies in the range 30 petahertz to 30 exahertz (3×10¹⁶ Hz to 3×10¹⁹ Hz) and energies in the range 100 eV to 200 keV. X-ray wavelengths are shorter than those of UV rays and typically longer than those of gamma rays. In many languages, X-radiation is referred to as Röntgen radiation, after the German scientist Wilhelm Röntgen, who discovered it on November 8, 1895.[1] He named it X-radiation to signify an unknown type of radiation.[2] Spelling of X-ray(s) in the English language includes the variants x-ray(s), xray(s), and X ray(s).[3]
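
As a quick consistency check of the frequency and energy ranges quoted above, the sketch below evaluates E = hf at the two frequency limits; the Planck constant value is assumed, not taken from the text:

    H_EV = 4.136e-15  # Planck constant in eV*s (assumed standard value)

    def photon_energy_ev(frequency_hz: float) -> float:
        """Photon energy in electronvolts for a given frequency in hertz."""
        return H_EV * frequency_hz

    print(photon_energy_ev(3e16))  # 30 PHz -> ~124 eV
    print(photon_energy_ev(3e19))  # 30 EHz -> ~124 keV, within the quoted 100 eV to 200 keV range
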
GAMMA RAY

A gamma ray, or gamma radiation (symbol γ), is a penetrating electromagnetic radiation arising from the radioactive decay of atomic nuclei. It consists of the shortest-wavelength electromagnetic waves and so imparts the highest photon energy. Paul Villard, a French chemist and physicist, discovered gamma radiation in 1900 while studying radiation emitted by radium. In 1903, Ernest Rutherford named this radiation gamma rays based on their relatively strong penetration of matter; in 1900 he had already named two less penetrating types of decay radiation (discovered by Henri Becquerel) alpha rays and beta rays, in ascending order of penetrating power.

Gamma rays from radioactive decay are in the energy range from a few kiloelectronvolts
(keV) to approximately 8 megaelectronvolts (~8 MeV), corresponding to the typical energy levels in
nuclei with reasonably long lifetimes. The energy spectrum of gamma rays can be used to identify
the decaying radionuclides using gamma spectroscopy. Very-high-energy gamma rays in the 100–
1000 teraelectronvolt (TeV) range have been observed from sources such as the Cygnus X-3
microquasar.
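
To relate the energies quoted above back to wavelength, the sketch below evaluates λ = hc/E for a decay-scale and a very-high-energy gamma ray; the constants are assumed values, not from the text:

    H = 6.626e-34    # Planck constant, J*s (assumed standard value)
    C = 2.998e8      # speed of light, m/s
    MEV = 1.602e-13  # joules per megaelectronvolt

    def wavelength_m(energy_mev: float) -> float:
        """Wavelength in metres of a photon with the given energy in MeV."""
        return H * C / (energy_mev * MEV)

    print(wavelength_m(8))    # ~8 MeV decay gamma -> ~1.6e-13 m, far shorter than X-rays
    print(wavelength_m(1e6))  # 1 TeV gamma ray    -> ~1.2e-18 m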

Natural sources of gamma rays originating on Earth are mostly a result of radioactive
decay and secondary radiation from atmospheric interactions with cosmic ray particles. However,
there are other rare natural sources, such as terrestrial gamma-ray flashes, which produce gamma
rays from electron action upon the nucleus. Notable artificial sources of gamma rays include fission,
such as that which occurs in nuclear reactors, and high energy physics experiments, such as neutral
pion decay and nuclear fusion.

Gamma rays and X-rays are both electromagnetic radiation, and since they overlap in the
electromagnetic spectrum, the terminology varies between scientific disciplines. In some fields of
physics, they are distinguished by their origin: Gamma rays are created by nuclear decay, while in
the case of X-rays, the origin is outside the nucleus. In astrophysics, gamma rays are conventionally
defined as having photon energies above 100 keV and are the subject of gamma ray astronomy,
while radiation below 100 keV is classified as X-rays and is the subject of X-ray astronomy. This
convention stems from the early man-made X-rays, which had energies only up to 100 keV, whereas
many gamma rays could go to higher energies. A large fraction of astronomical gamma rays are
screened by Earth's atmosphere.

Gamma rays are ionizing radiation and are thus biologically hazardous. Due to their high
penetration power, they can damage bone marrow and internal organs. Unlike alpha and beta rays,
they pass easily through the body and thus pose a formidable radiation protection challenge,
requiring shielding made from dense materials such as lead or concrete.
