
August 6, 2012

WHAT IS LIDAR

What is LIDAR
LIDAR is an acronym for Light Detection And Ranging.

What can you do with LIDAR?


With LIDAR you can measure the distance, speed, rotation, and chemical composition and concentration of a remote target, where the target can be a clearly defined object, such as a vehicle, or a diffuse object, such as a smoke plume or clouds.

Introduction
The recently emerged technique of airborne altimetric LiDAR (Light Detection and Ranging) provides accurate topographic data at high speed. This technology offers several advantages over conventional methods of topographic data collection: higher point density, higher accuracy, less time for data collection and processing, a largely automatic system, independence from weather and daylight, minimal ground control requirements, and data available in digital format from the outset. Due to these characteristics, LiDAR is complementing conventional techniques in some applications while completely replacing them in several others. Applications where LiDAR data are being used include flood hazard zoning, improved flood modelling, coastal erosion modelling and monitoring, bathymetry, geomorphology, glacier and avalanche studies, forest biomass mapping and forest DEM (Digital Elevation Model) generation, route/corridor mapping and monitoring, and cellular network planning. The typical characteristics of LiDAR have also enabled several applications that were hitherto not deemed feasible with conventional techniques, such as mapping of transmission lines and their adjoining corridors, and change detection to assess damage (e.g. to buildings) after a disaster. This chapter describes the various aspects of this technology: physical principle, data collection issues, data processing, and applications.

LIDAR Overview
What is LIDAR?


Lidar uses laser light to measure distances. It is used in many ways, from estimating atmospheric aerosols by shooting a laser skyward to catching speeders in freeway traffic with a handheld laser-speed detector. Airborne laser-scanning technology is a specialized, aircraft-based type of lidar that provides extremely accurate, detailed 3-D measurements of the ground, vegetation, and buildings. Developed in just the last 15 years, one of lidar's first commercial uses in the United States was to survey powerline corridors to identify encroaching vegetation. Additional uses include mapping landforms and coastal areas.

In open, flat areas, ground contours can be recorded from an aircraft flying overhead with accuracy within 6 inches of actual elevation. In steep, forested areas accuracy is typically in the range of 1 to 2 feet and depends on many factors, including the density of canopy cover and the spacing of laser shots. The speed and accuracy of lidar made it feasible to map large areas with the kind of detail that before had only been possible with time-consuming and expensive ground survey crews.

Federal agencies such as the Federal Emergency Management Agency (FEMA) and the U.S. Geological Survey (USGS), along with county and state agencies, began using lidar to map the terrain in flood plains and earthquake hazard zones. The Puget Sound Lidar Consortium, an informal group of agencies, used lidar in the Puget Sound area and found previously undetected earthquake faults and large, deep-seated, old landslides. In other parts of the country, lidar was used to map highly detailed contours across large flood plains, which could be used to pinpoint areas of high risk. In some areas, entire states have been flown with lidar to produce more accurate digital terrain data for emergency planning and response.

Lidar mapping of terrain uses a technique called bare-earth filtering.
Laser scan data about trees and buildings are stripped away, leaving just the bare-ground data. Steve Reutebuch, team leader for Silviculture and Forest Models at PNW Research Station, first began his lidar research in forests in 1997 to find out how much accuracy was lost in lidar flights over areas with heavy forest cover. He wanted to better understand the level of error in lidar mapping of the ground through forest canopy, for use in analyzing terrain maps in forested areas. He and his University of Washington collaborators found that the data thrown away by geologists were a rich source of information for foresters, a finding that has since been well corroborated by lidar forestry research groups around the world.
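The bare-earth idea described above can be sketched very simply: keep only the lowest return in each grid cell, so hits on canopy and rooftops fall away. This is only an illustrative toy (the cell size and points below are made up); production filters use far more sophisticated methods such as progressive TIN densification.

```python
# Toy "bare-earth" filter: keep only the lowest lidar return per grid cell.
# Cell size and point coordinates are hypothetical, for illustration only.

def bare_earth(points, cell=10.0):
    """points: iterable of (x, y, z); returns dict (col, row) -> lowest z."""
    ground = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in ground or z < ground[key]:
            ground[key] = z
    return ground

# Two returns over the same 10 m cell: a treetop hit (25 m) and a ground
# hit (2 m); only the ground hit survives.
pts = [(3.0, 4.0, 25.0), (5.0, 6.0, 2.0), (14.0, 4.0, 3.1)]
print(bare_earth(pts))  # {(0, 0): 2.0, (1, 0): 3.1}
```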


How Does LIDAR work?


The use of lasers has become commonplace, from laser printers to laser surgery. In airborne-laser-mapping lidar, lasers are taken into the sky. Instruments are mounted on a single- or twin-engine plane or a helicopter. Airborne lidar technology uses four major pieces of equipment (see figure below): a laser emitter-receiver scanning unit attached to the aircraft; global positioning system (GPS) units on the aircraft and on the ground; an inertial measurement unit (IMU) attached to the scanner, which measures the roll, pitch, and yaw of the aircraft; and a computer to control the system and store data.

Several types of airborne lidar systems have been developed; the commercial systems commonly used in forestry are discrete-return, small-footprint systems. "Small footprint" means that the laser beam diameter at ground level is typically in the range of 6 inches to 3 feet. The laser scanner on the aircraft sends up to 100,000 pulses of light per second to the ground and measures how long it takes each pulse to reflect back to the unit. These times are used to compute the distance each pulse traveled from scanner to ground. The GPS and IMU units determine the precise location and attitude of the laser scanner as the pulses are emitted, and an exact coordinate is calculated for each point.
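The core timing-to-distance step described above can be sketched in a few lines: the scanner times each pulse's round trip, and the one-way range is half the round-trip time multiplied by the speed of light. The example pulse time below is hypothetical.

```python
# Sketch of the range computation at the heart of lidar: convert a
# measured round-trip pulse time into a one-way scanner-to-ground distance.
C = 299_792_458.0  # speed of light in vacuum, m/s

def pulse_range(round_trip_s):
    """One-way distance from the round-trip travel time of a pulse."""
    return C * round_trip_s / 2.0

# A pulse returning after about 4 microseconds came from roughly 600 m away.
print(round(pulse_range(4.0e-6), 1))  # 599.6
```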


The laser scanner uses an oscillating mirror or rotating prism (depending on the sensor model), so that the light pulses sweep across a swath of landscape below the aircraft. Large areas are surveyed with a series of parallel flight lines. The laser pulses used are safe for people and all living things. Because the system emits its own light, flights can be done day or night, as long as the skies are clear.

Thus, with distance and location information accurately determined, the laser pulses yield direct, 3-D measurements of the ground surface, vegetation, roads, and buildings. Millions of data points are recorded, so many that lidar creates a 3-D data cloud. After the flight, software calculates the final data points by using the location information and laser data. Final results are typically produced in weeks, whereas traditional ground-based mapping methods took months or years.

The first acre of a lidar flight is expensive, owing to the costs of the aircraft, equipment, and personnel. But when large areas are covered, the costs can drop to about $1 to $2 per acre. The technology is commercially available through a number of sources.

Lidar remote sensing


The basis for lidar remote sensing lies in the interaction of light with gas molecules and particulate matter in suspension in the atmosphere (aerosols). More particularly, a lidar uses a laser (emitter) to send a pulse of light into the atmosphere and a telescope (receiver) to measure the intensity scattered back (backscattered) to the lidar. By measuring the scattering and attenuation experienced by the incident pulse of light, one can investigate the properties of the scatterers (concentration of gaseous species, aerosol distribution and optical properties, cloud height) located in the atmosphere.

The light scattered back to the detector comes from various distances, or ranges, with respect to the lidar. Because the light takes longer to return to the receiver from targets located farther away, the


time delay of the return is converted into a distance (range) between the scatterers and the lidar, since the speed of light is a well-known quantity. By pointing the laser beam in various directions and at various angles with respect to the ground surface (scanning), a ground-based lidar system can gather information about the three-dimensional distribution of aerosols in the atmosphere.
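The scanning geometry described above amounts to a spherical-to-Cartesian conversion: combining the measured range with the beam's pointing direction gives a 3-D position for each scatterer. The angle convention below (elevation up from the horizon, azimuth clockwise from north) and the sample numbers are assumptions for illustration.

```python
# Sketch: convert a measured range plus pointing angles of a ground-based
# scanning lidar into a 3-D position relative to the instrument.
# Assumed convention: elevation from the horizon, azimuth clockwise from north.
import math

def to_xyz(range_m, azimuth_deg, elevation_deg):
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    horiz = range_m * math.cos(el)          # horizontal distance
    return (horiz * math.sin(az),           # east
            horiz * math.cos(az),           # north
            range_m * math.sin(el))         # height above the lidar

# A scatterer 2000 m out at 30 degrees elevation sits 1000 m above the lidar.
x, y, z = to_xyz(2000.0, 90.0, 30.0)
print(round(x, 1), round(y, 1), round(z, 1))
```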

from: www.nsf.gov/geo/egch/solar/gc_solar_cedar.html


The backscattered radiation detected by a lidar is described by the lidar equation. In general terms, the received power is expressed as a function of range R. For a simple backscatter lidar (measuring backscattered light at the same wavelength as the laser wavelength), the lidar equation is written as:

    Pr(λ0, R) = C · O(R) / R² · (h / 2) · β(λ0, R) · exp[ −2 ∫₀ᴿ α(λ0, r) dr ]        (1)

where Pr is the power returned to the lidar at the laser wavelength λ0, C is the lidar constant, R is the range, and h = c·tp, where tp is the pulse duration and c the speed of light. The term O(R) describes the overlap between the laser beam and the receiver field of view; it is equal to 1 for ranges where there is complete overlap of the laser beam and the receiver's field of view. Here, β(λ0, R) and α(λ0, R) are the combined aerosol and molecular backscatter and extinction coefficients,


respectively, at the laser wavelength. The combined backscattering coefficient can be re-written as the sum of molecular and aerosol backscattering (β = β_mol + β_aer). For an elastic backscatter (one-wavelength) lidar, this combined backscattering can be obtained by solving the lidar equation following the method suggested by Fernald (1984). With a Raman lidar, more information is available: independent retrievals of aerosol backscatter and extinction can be obtained (Ansmann et al., 1990; Ansmann et al., 1992). A Raman lidar is able to detect specific gaseous species (O2, N2 or H2O) by measuring the wavelength-shifted radiation returned to the lidar due to inelastic scattering by the gas molecules. The lidar equation describing the return at the Raman-shifted wavelength λR is written as:

    Pr(λR, R) = C · O(R) / R² · (h / 2) · βR(λ0, R) · exp[ −∫₀ᴿ ( α(λ0, r) + α(λR, r) ) dr ]        (2)

The first term in the exponential describes the extinction of the laser beam (at the laser wavelength) going up toward the target, while the second term describes the extinction of the return signal back toward the lidar (at the Raman-shifted wavelength). The inelastic Raman backscattering coefficient βR(λ0, R) is associated only with inelastic molecular scattering and is not affected by aerosol scattering. Returns at the laser wavelength and at the Raman-shifted wavelengths can be combined in various ways to obtain information about the aerosols and the water vapor content of the atmosphere. More details about the products available from the CART Raman lidar, as they pertain to this study, and how parameters are derived, are provided hereafter.

A Raman lidar designed for 24-hour automated operations has been making measurements at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Clouds and Radiation Testbed (CART) near Lamont, Oklahoma for a few years now. It is a vertically pointing (non-scanning) lidar, so it provides vertical profiles of various aerosol optical properties over the site, as well as profiles of water vapor mixing ratio (Turner et al., 2002). The CART Raman lidar uses a frequency-tripled Nd:YAG (neodymium-doped yttrium aluminium garnet) laser transmitting 350 mJ pulses of 355 nm light at 30 Hz. The backscattered light is collected with a 61-cm telescope. The system measures


backscattered light at the laser wavelength (355 nm), as well as at the 387 and 408 nm wavelengths. These correspond to the Raman-shifted nitrogen (N2) and water vapor (H2O) wavelengths, respectively.

from: www.arm.gov/general/photolibrary/ramanlidar.html

Lidar at Otlica observatory

The Centre for Atmospheric Research at the University of Nova Gorica includes a lidar observatory at Otlica above Ajdovščina. An elastic lidar, developed by the Laboratory for Astroparticle Physics, has been installed at the observatory since June 2005. The lidar system at Otlica enables remote sensing of aerosols and estimation of their impact on some optical properties of the atmosphere. Its transmitter, a frequency-tripled Nd:YAG laser, emits 5 ns long UV pulses into the atmosphere at a wavelength of 355 nm and an energy of 120 mJ. The more aerosols are present in the atmosphere, the stronger the scattering of the emitted light. As aerosols scatter the light in all directions, a part of the scattered light is directed towards the lidar's receiver. The receiver, a parabolic mirror with a focal length of 41 cm, a diameter of 80 cm, and an effective area of 1.5 m2, collects the received light and focuses it onto a sensitive detector, a photomultiplier. The photomultiplier detects and amplifies the received light and transforms it into a measurable electrical signal. The signal is related to the power of the


received scattered light. From the time delay of the received signal relative to the triggering time of the laser, the distance to the aerosol layer is calculated, and from the intensity of the backscattered light, the density of the aerosol layer. Digitization of the lidar measurements is performed by an analog/digital converter connected to the computer responsible for data acquisition and analysis. The laser, parabolic mirror, and photomultiplier are attached to a metal mount that enables two-dimensional scanning within the lidar's field of view.

As aerosols are commonly related to air pollution, lidar measurements at the Otlica observatory provide information about air pollution across an extensive area covering the Trnovski gozd forest, the Vipava valley, Kras, and the Slovenian coastal area, up to an altitude of a few tens of km. Simultaneously used meteorological models enable the estimation of back-trajectories for air particles, and consequently the path of air pollution for the time before it reaches the area of Slovenia. Thus lidar measurements together with meteorological models can answer the question of how intense the air pollution transported into Slovenia is in typical synoptic situations, and where this pollution originates. One such example is pollution emitted in the highly industrialized Po lowland, which is transported by SW winds towards Slovenia, where it reaches the first substantial mountain barrier, the Trnovski gozd forest. On reaching the barrier, a part of the pollution, especially in the lower layers of the atmosphere, is trapped and deposited. A well-known example of pollution transport is also Saharan dust, lifted into the atmosphere by desert storms, which can travel thousands of km before reaching Slovenia.


Principles of a lidar system: emitted laser light scatters in all directions off particles and molecules in the atmosphere. Higher concentrations of scatterers result in more intense scattering, and more backscattered light reaches the lidar's receiver. It has to be taken into consideration that the power density of the irradiance decreases with the square of the distance from the scatterer.


Measurements of the intensity of laser light backscattered from aerosols, taken with the lidar installed at Otlica since June 2005 (this measurement was taken at Ljubljana on 8 May 2003). A polluted boundary layer can be seen in the atmosphere up to 1.5 km, between 1.5 and 4.5 km a thick layer of Saharan dust is evident, and above 8 km the development of high clouds occurs.

LIDAR
The term "lidar" is an acronym for Light Detection And Ranging, and lidar works in a similar manner to its earlier cousins RADAR (Radio Detection And Ranging) and SONAR (Sound Navigation And Ranging). Similar to sonar, a pulse of light (equivalent to a sonar ping) is directed straight up. Any molecules, dust or clouds (whales or submarines for sonar) cause the pulse to bounce back to the detector. In a lidar system, pulses of light are produced by a laser (1). When the light pulse encounters dust, fog, clouds and/or molecules, scattering occurs in all directions (2), and some of the scattered light is redirected back towards the ground (3), where it is gathered by a telescope, electronically detected and digitized. The time that it takes to travel from the laser to the cloud, dust, or molecules and back to the telescope can


be converted into a distance using the speed of light (Distance = time traveled × speed of light × ½).

Signals, such as those below collected on Earth at the Atmospheric Optics Laboratory at Dalhousie, are representative of what we might observe on Mars. Cloud layers would produce an increased signal, and we could measure how thick and at what altitude they occur. Observing them over time will reveal how Martian weather unfolds over both a single sol (also known as a diurnal cycle) and over many sols.


Contour plot of the scattering ratio for Aug 25, 2005 at Dalhousie. The scattering ratio is the ratio of the amount of light scattered by aerosols (clouds and pollution) to the amount of light scattered by molecules in the atmosphere (red areas indicate a high degree of aerosol scattering, blue/purple a low degree of scattering). This plot shows a cloud rapidly descending from 9-12 km to 5 km and then rapidly ascending; it also shows a persistent aerosol layer, likely pollution, from 0-4 km all evening long. The white area (centre) represents the laser light being completely extinguished by the cloud.
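The quantity plotted above is straightforward to compute once aerosol and molecular backscatter have been separated: total backscatter divided by molecular backscatter. The backscatter values below are hypothetical; a ratio near 1 indicates clear, aerosol-free air.

```python
# Sketch of the scattering ratio: (aerosol + molecular) backscatter
# divided by molecular backscatter. Input values are illustrative only.

def scattering_ratio(beta_aerosol, beta_molecular):
    return (beta_aerosol + beta_molecular) / beta_molecular

print(scattering_ratio(0.0, 2.0e-6))     # clear air -> ratio of 1.0
print(scattering_ratio(6.0e-6, 2.0e-6))  # aerosol layer -> ratio of 4.0
```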

The information gathered by the Phoenix lidar will allow scientists back on Earth to understand how Martian weather evolves. Similar processes occur on Earth, and through comparison and contrast, projects such as the Phoenix Lander will allow us to better understand how weather on Earth develops.

Specifications

For Phoenix, the lidar system was designed, built and tested by the Canadian companies MDA and Optech. MDA Space Missions (a subsidiary of MDA) is an international leader in robotic space technology (they built the Canadarm), while Optech is the world's largest corporate developer of lidar systems. The laser was manufactured by Passat.


Lidar (light detection and ranging) is an instrument used to generate topographic data from a remote platform, usually a fixed-wing light aircraft or a helicopter. 3D topographic data are generated by measuring the distance between the sensor and the ground from a known position and in a known direction. To accomplish this, three


measurements are repeatedly taken from the platform: (i) the laser range distance to the ground; (ii) the Global Positioning System (GPS) location of the platform; and (iii) the Inertial Navigation System (INS) attitude of the instrument. The instrument also measures the return intensity of the laser pulse, which produces a greyscale image of the surface. These data are combined in post-processing to produce a digital elevation model (DEM).

Currently, most lidar post-processing requires coincident GPS base-station data within 30 km to correct the position of the platform during flight. This is called short-baseline processing. One of the aims of SLICES is to develop a long-baseline solution that will enable the production of a high-quality DEM data set where GPS base-station data are available only over long baselines (i.e. >30 km).

The lidar data collected for SLICES will be used for several purposes. The first is to extract ground control for use in controlling historical aerial photography that has been collected in Svalbard by the Norwegian Polar Institute. The second is to assess the quality of the photogrammetric DEMs that will be derived from the historical aerial photographs. The instrument used in the SLICES campaigns is an Optech ALTM 3033 deployed on the ARSF's Dornier 228-101 research aircraft. For more information about the lidar instruments, please consult the ARSF Website.

Campaign Report
Because of a malfunctioning lidar instrument in 2004, lidar data were only collected in Svalbard during the 2003 and 2005 campaigns. The following table summarises the lidar data that were acquired for each of the SLICES field sites. Because the data collected in 2005 had not yet been delivered, the 2005 data reported below are unconfirmed.

Glacier Site                    2003  2004  2005
Midre Lovénbreen                 x           x
Austre Brøggerbreen              x           x
Slakbreen                        x
Fridtjovbreen/Grønfjordbreen     x
Finsterwalderbreen
Hansbreen
Tunabreen
Nordenskiöldbreen


Kongsvegen
Albrechtbreen
Koristkabreen
Gullfaksebreen


Data Samples
The lidar data samples that follow were flown during the 2003 campaign. Data for two sites were acquired in 2003, Midre Lovénbreen and Slakbreen. The data are provided by the ARSF as a data cloud in ASCII table format, with each point accompanied by a return intensity value on an 8-bit scale. The data samples below are provided both in JPG format and also in QT Modeler interpolated and point cloud format. With the QT Modeler files the user can interact with the model, place markers, and extract positional and elevation data. To download the free QT Modeler Reader, visit the Applied Imagery Download Page or click here to download directly. To see the data more clearly, we recommend turning on the Show Sky option under Options.

Full site 20 m interpolated model of Midre Lovénbreen, QT Modeler or JPG. This data set has been interpolated from the ASCII point cloud to a 20 m grid. In this case, the intensity data overlay has been replaced with a shaded-relief overlay. The JPG image has been captured looking up-glacier using a shaded-relief overlay with height colouration.

Single flight line point cloud of Midre Lovénbreen, QT Modeler or JPG. The data in this file were imported from the raw data using only 1 out of every 10 points. The JPG image shows the flight line in a perspective view.

Full site 20 m interpolated model of Slakbreen, QT Modeler or JPGa and JPGb. This data set has also been interpolated from the ASCII point cloud to a 20 m grid. In this case, the intensity data overlay is being used. The JPG images show an up-glacier view using: (a) a shaded-relief; and (b) an intensity overlay.

Interpolated close-up of Slakbreen, JPG. This image shows a good close-up of one of Slakbreen's tributaries. This was cropped from the 20 m DEM above using an intensity overlay.


If you have questions about this data, please contact Tim James.

About LIDAR Data


What Are LIDAR Data?
How LIDAR Data Are Collected
Interpreting LIDAR Elevation Maps

What Are LIDAR Data?


Light Detection and Ranging (LIDAR) is a remote sensing system used to collect topographic data. This technology is being used by National Oceanic and Atmospheric Administration (NOAA) and NASA scientists to document topographic changes along shorelines. These data are collected with aircraft-mounted lasers capable of recording elevation measurements at a rate of 2,000 to 5,000 pulses per second, and have a vertical precision of 15 centimeters (6 inches). After a baseline data set has been created, follow-up flights can be used to detect shoreline changes.


How LIDAR Data Are Collected


For the South Carolina project, a LIDAR sensor was mounted onboard a NOAA DeHavilland Twin Otter aircraft, pictured below. Once in flight, the aircraft travels over the beach at approximately 60 meters per second. During the flight, the LIDAR sensor pulses a narrow, high-frequency laser beam toward the earth through a port opening in the bottom of the aircraft's fuselage. The LIDAR sensor records the time difference between the emission of the laser beam and the return of the reflected laser signal to the aircraft.

NOAA Twin Otter Aircraft

The LIDAR transceiver is rigidly fastened to the aircraft and does not move. However, a scan mirror assembly is mounted beneath the transceiver. A 45-degree folding mirror reflects the laser pulses onto a moving mirror, which directs the laser pulses to the earth. The reflected laser light from the ground follows the reverse optical path and is directed into a small Cassegrain telescope. The moving mirror produces a conical sampling pattern beneath the aircraft over a 30-degree-wide swath, thus permitting the collection of topographic information over a strip approximately 300 meters (about 1000 feet) in width from the nominal 600 meter (2000 feet) data collection altitude.
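The swath figures quoted above follow directly from the scan geometry: the strip width is twice the flying height times the tangent of half the full scan angle. A quick check, under the simplifying assumption of flat terrain directly below the aircraft:

```python
# Checking the quoted swath geometry: a 30-degree full scan angle from
# 600 m altitude sweeps a strip roughly 300 m wide (flat-terrain assumption).
import math

def swath_width(altitude_m, full_scan_angle_deg):
    half = math.radians(full_scan_angle_deg / 2.0)
    return 2.0 * altitude_m * math.tan(half)

print(round(swath_width(600.0, 30.0)))  # 322, i.e. "approximately 300 meters"
```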


Illustration of How the LIDAR Sensing Instrument Captures Elevation Points.

The LIDAR instruments only collect elevation data. To make these data spatially relevant, the positions of the data points must be known. A high-precision global positioning system (GPS) antenna is mounted on the upper aircraft fuselage. As the LIDAR sensor collects data points, the location of the data is simultaneously recorded by the GPS sensor. After the flight, the data are downloaded and processed using specially designed computer software. The end product is accurate, geographically registered longitude, latitude, and elevation (x,y,z) positions for every data point. These "x,y,z" data points allow the generation of a digital elevation model (DEM) of the ground surface.

LIDAR data sets on this CD-ROM cover an area from the low water line to the landward base of the sand dunes. Flights are planned to maximize the number of elevation points collected at the lowest tide for the largest area possible. The aircraft flight path is always parallel to the beach. Four passes are flown over each section of the beach. Two of these passes are flown so the center of the swath is over the sand/water interface. The other two passes are flown over the center of the sand/development interface. Flights generally last four hours.

Weather conditions must be monitored. The flights cannot be flown during times of rain or fog, as the water vapor in the air could cause the laser beams to scatter and give a false reading. Additionally, the plane cannot fly during times of high winds, as the returned laser pulse will not be recorded correctly.
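Turning the "x,y,z" points described above into a DEM can be sketched as a simple gridding step: average the elevations that fall in each cell. This is a toy illustration (cell size and points are invented); production software uses proper interpolation such as TIN, inverse-distance weighting, or kriging.

```python
# Minimal sketch of gridding x,y,z lidar points into a DEM by averaging
# the elevations in each cell. Cell size and points are hypothetical.

def grid_dem(points, cell=1.0):
    """points: iterable of (x, y, z); returns dict (col, row) -> mean z."""
    sums, counts = {}, {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        sums[key] = sums.get(key, 0.0) + z
        counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}

pts = [(0.2, 0.3, 1.0), (0.8, 0.9, 3.0), (1.5, 0.1, 2.0)]
print(grid_dem(pts))  # {(0, 0): 2.0, (1, 0): 2.0}
```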


Interpreting LIDAR Elevation Maps


In remote sensing, false color images such as LIDAR elevation maps are common. They serve as an effective means for visualizing data. The term "false color" refers to the fact that these images are not photographs. Rather, they are digital images in which each image pixel represents a data point that is colored according to its value. The purpose of this section is to aid users in interpreting false color images.

LIDAR beach mapping data are composed of elevation measurements of the beach surface and are acquired through aerial topographic surveys. The file format used to capture and store LIDAR data is a simple text file referred to as "x,y,z," where x is longitude, y is latitude, and z is elevation. Using these elevation "points," LIDAR data may be used to create detailed topographic beach maps.

In the three images shown below, the legend in the bottom right corner of the image has a range of numbers from -3 meters to +5 meters. The numbers indicate the relationship between the colors on the legend and the elevations depicted on the map. For example, in the Huntington Beach map, the deep blue color represents land approximately at sea level or zero elevation. The cyan (light blue) features, like the jetty, represent elevations around 1 meter, or about 3 feet above sea level.


LIDAR data become easier to interpret when examined in conjunction with additional data such as aerial photography. In the example below, a LIDAR elevation map is compared with an orthophotograph. This small area on Kiawah Island provides a variety of interesting features. Comparing the orthophoto to the LIDAR data, it becomes easier to identify features such as houses, roads, the vegetated dune area, and irrigation ponds.

Comparing Features Found in an Orthophotograph to LIDAR Data

Along the South Carolina coast, beach features tend to be less than 5 meters (16 feet) in height. As a result, the scale of the color bar was chosen to highlight relatively narrow variations in elevation. This legend can be readily viewed in the PDF maps located in the pdf/islands directory on this CD-ROM. Additionally, this legend has been provided for use in ArcView and is located at: data/lidar/avelev.shp.

In this second example, an additional vector base map was overlaid on both the orthophoto and the LIDAR elevation map. The base map, created in 1993, includes digitized building footprints, dune walkovers, and roads. A detailed base map can assist in confirming features detected by LIDAR elevation measurements. For example, when houses are surrounded by tall vegetation, LIDAR elevation data do not distinguish between roof top and tree top. Without the vector base


map, it would be very difficult to determine boundaries between roofs and trees. Often ancillary data do not provide sufficient detail or are not available. In these cases, the user must obtain ground reference information using either local knowledge or by visiting the area to accurately confirm landmarks.

Example of How Vector Data Can be Useful in Identifying Features in LIDAR Data

Users can also view LIDAR data by creating a plot or profile of the data. In the profile below, beach features including the dune crest, beach face, and water line can be identified. Users who have the add-on ArcView Spatial Analyst module can use the LIDAR Data Handler Extension, provided on this CD-ROM, to create similar profiles. For more information about this tool, see the Data Tools section.
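Profile extraction like that described above can be sketched as selecting the x,y,z points falling in a narrow strip around a transect and ordering them by distance along it. The transect here is assumed to lie along the x axis, and the strip width and points are invented for illustration.

```python
# Sketch of extracting a beach profile: keep points within a narrow strip
# around a transect (assumed here to be the x axis), sorted along it.

def profile(points, half_width=1.0):
    """points: iterable of (x, y, z); keep |y| <= half_width, sorted by x."""
    strip = [(x, z) for x, y, z in points if abs(y) <= half_width]
    return sorted(strip)

pts = [(5.0, 0.2, 4.0), (1.0, -0.5, 0.5), (3.0, 9.0, 2.0)]
print(profile(pts))  # the off-strip point (y = 9.0) is dropped
```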


LIDAR Data Viewed as a Profile


What is Lidar?

Light Detection and Ranging (LIDAR) is a technology similar to RADAR that can be used to create high-resolution digital elevation models (DEMs) with vertical accuracy as good as 10 cm. LIDAR equipment, which includes a laser scanner, a Global Positioning System (GPS), and an Inertial Navigation System (INS), is generally mounted on a small aircraft. The laser scanner transmits brief laser pulses to the ground surface, from which they are reflected or scattered back to the laser scanner. Detecting the returning pulses, the equipment records the time that it took for them to go from the laser scanner to the ground and back. The distance between the laser scanner and the ground is then calculated from the speed of light.

While flying, the airplane's position is determined using GPS, and the direction of the laser pulses is determined using the INS. Because one laser pulse may reflect back from multiple surfaces, such as the top of a tree, a house, and the ground surface, there are multiple returns from each pulse that can be used to map such things as the top of the tree canopy, buildings, and the ground. Post-processing is used to differentiate between these multiple returns to determine the bare-earth surface. Using the combined information from the laser scanner, the GPS, and the INS, very accurate, closely spaced (typically 1 per square meter) X, Y, Z


coordinates are determined, from which a DEM can be made. -- Linda Mark, USGS/Cascades Volcano Observatory, October 2004

Press release, October 25, 2004 -- [PDF format, 32K]

Preliminary color map of elevation change at Mount St. Helens, September 2003 to October 4, 2004 [PDF format, 2.4M]. Area shown is 5.2 km x 5.2 km, or about 3.25 miles square. Please note that small elevation differences, between -0.4 and +1.4 meters, are not shown, as most such small differences reflect differing geodetic frameworks for the September 2003 and October 4, 2004 surveys. These differences decrease with further analysis and processing. Shaded relief images of Mount St. Helens showing volcanic unrest-related changes from September 2003 through November 20, 2004, including both still and animated versions.

LIDAR Images of Mount St. Helens, Washington


September 2003 through November 20, 2004, including animation


Vertical view of new dome area Mount St. Helens crater, September 2003.

Vertical view of new dome area Mount St. Helens crater, September 24, 2004.

Vertical view of new dome area Mount St. Helens crater, September 30, 2004.



Vertical view of new dome area Mount St. Helens crater, October 4, 2004.

Vertical view of new dome area Mount St. Helens crater, October 14, 2004.

Vertical view of new dome area Mount St. Helens crater, November 20, 2004.

Southern Mapping Company's airborne laser survey equipment consists of a specialised laser terrain mapper (Lidar), a high-resolution digital photo camera, a video camera, and various controllers.

Our Methodology


We consider high mobility of the laser survey equipment to be of utmost importance, as it gives us the opportunity to operate our system worldwide. The way Lidar operates has much in common with a typical radar system. Lidar pulses a laser beam onto a periodically oscillating mirror, which projects it downward. The laser beam hits an object and reflects back to the mirror, so the scanner receives reflections from all the objects within the area to be surveyed. High-speed counters measure the time interval between the pulse leaving the airborne platform and its return to the Lidar sensor. These time-interval measurements are converted to distances and correlated with the information recorded by the GPS receiver, the Inertial Measurement Unit (IMU), and ground-based GPS stations. A 3-dimensional GPS solution is used to position the laser scanner every second or half second, while the IMU data is used to determine the system's orientation. Our system is capable of discriminating among multiple returns from each pulse, measuring not only the position and geometry of the surface below the aircraft, but also the terrain under the vegetation cover. Our methodology consists of three phases: a Registering Phase, an Intermediate Phase, and a Final Phase.

By using these phases, Southern Mapping has the distinct advantage of being able to identify and measure the position of on-ground objects, map the land surface under the vegetation layer, and obtain a spatial model of the vegetation. This important feature of the system is used with data processing algorithms to create a digital terrain model not affected by the vegetation. We perform a calibration flight to ensure the system meets specifications. Preliminary data processing is conducted to ensure completeness and integrity. This acts as a quality control process, ensuring that coverage is complete and that the equipment was functioning well. Should a gap or unacceptable data quality be detected, the crew will isolate the problem areas and re-fly the affected sections before demobilising. The GPS and inertial data are processed together to achieve the best positional result. Once the position and attitude of the aircraft are known at each period (1 or 0.5-second intervals), this data is integrated with the laser reflections to provide a position for each point on the ground. The data is processed using a proprietary laser-processing software suite to produce an ASCII file of (x, y, z) coordinates. Towards the final product output, various operations are performed on the collected survey data. These include the initial laser point computation, laser data point classification, object recognition, digital photograph processing, thematic object processing, and others in accordance with the client's requirements. All operations are performed in conjunction with a stringent quality assurance plan, which forms part of the contract. It is the final product of our services that sets us apart from other lidar companies. We dedicate our efforts first to ensuring that we fully understand our client's needs and expectations. In so doing we deliver quality solutions that assist you in maintaining the efficiency of your business.
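The georeferencing step described above (laser range + GPS position + IMU attitude, combined into a ground coordinate) can be sketched as follows. This toy version is an illustration only: it handles just the scan angle and aircraft heading in a flat local frame, ignoring pitch, roll, lever arms, and Earth curvature, all of which production processing software must model. The function and its parameter names are hypothetical.

```python
import math

# Toy direct georeferencing: combine one laser range with the platform's
# GPS position and (simplified) attitude to get a ground coordinate.
# Assumes a flat local frame; pitch/roll and Earth curvature are ignored.

def ground_point(ac_x, ac_y, ac_alt, heading_deg, scan_deg, slant_range):
    """Return (x, y, z) of a laser return in a local metric frame."""
    scan = math.radians(scan_deg)         # off-nadir scan angle, + = right of track
    hdg = math.radians(heading_deg)       # aircraft heading, 0 = +y axis
    horiz = slant_range * math.sin(scan)  # horizontal offset from nadir
    down = slant_range * math.cos(scan)   # vertical component of the range
    # the scan offset is perpendicular to the flight direction
    x = ac_x + horiz * math.cos(hdg)
    y = ac_y - horiz * math.sin(hdg)
    z = ac_alt - down
    return x, y, z

# Aircraft at 1000 m altitude, pulse fired straight down (0 deg scan):
print(ground_point(0.0, 0.0, 1000.0, 0.0, 0.0, 1000.0))  # (0.0, 0.0, 0.0)
```

At a 25-degree scan angle (the swath edge mentioned later in this document), the same 1000 m range would land the point over 400 m to the side of the flight line, which is why accurate attitude from the IMU matters so much.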


Lidar Technology

Lidar (Light Detection and Ranging) technology uses laser distance measurement to conduct topographic mapping. Unlike some other technologies, lidar beams transmitted from the aircraft are able to function in overcast and cloudy conditions and can penetrate through dense vegetation. Lidar provides high accuracy at much higher speeds. The lidar system measures distances directly from the aircraft to the ground during flight, using laser ranging. When combined with a digital camera, high-resolution, full-colour imagery is created. During aerial surveys the laser fires 100,000 laser pulses per second, at up to 25 degrees left and right of the aircraft's direction of flight. Simultaneously the camera takes photos of the earth below, which are then draped in their precise location on the 3D model. The aircraft's trajectory and movements are kept carefully in check by means of a GPS and an Inertial Navigation System (INS).

Advantages of lidar technology

- Vegetation penetration: Lidar has a very narrow beam which is emitted from the aircraft. This beam can penetrate dense foliage to reflect off the ground and return to the aircraft.
- Accuracy: A terrain model of extremely high accuracy can be delivered, due to the sheer density of the laser points.
- Ground control: No ground control is required because of the accurate measuring equipment in the aircraft, i.e. the GPS and IMU. Hence the entire project can be calculated from airborne sensors. The ground survey equipment of a lidar project is far simpler than a conventional survey; hence it is faster.


- Weather conditions: Lidar is an active sensor and therefore does not require ambient light to function. Many projects have been successfully completed in overcast or partly cloudy conditions.
- Digital workflow: The lidar and camera both deliver their raw observations to computer disc in the aircraft. Once the sortie is completed, the air operator delivers the removable disc drives to the processor, who immediately begins data processing. Within hours, verification of the day's mission is completed. The entire project is usually processed, verified and delivered to our clients within 30 days, which is by far faster than any other technology.


Feature - Placing Kepler at the center of your computing system


Scientists want to concentrate on science. They appreciate handy tools that simplify the process of analyzing data, especially when the data are stored in a complicated variety of systems and formats. The open source Kepler workflow system, for which version 1.0 was just released, is just such a tool; it helps scientists from a wide range of disciplines design scientific workflows and execute them efficiently. Kepler's native support of parallel processing allows these workflows to leverage the compute power of grid technologies. Kepler has attracted collaborators and users from ecology, molecular biology, genetics, physics, chemistry, conservation science, oceanography, hydrology, library science, and computer science. Scientists have adopted earlier Kepler versions to study the effect of climate change on species distribution, simulate supernova explosions, and perform complex statistical analyses. Researchers from the National Center for Ecological Analysis and Synthesis (NCEAS) at UC Santa Barbara, the San Diego Supercomputer Center at UC San Diego, and the Science Environment for Ecological Knowledge (SEEK) and Scientific Data Management (SDM) projects at UC Davis founded the Kepler Project in 2002. Kepler extends Ptolemy II, a system developed at UC Berkeley for modeling, simulation and design of concurrent, real-time embedded systems. "We wanted to create an open, customizable, extensible and robust scientific workflow environment for solving scientific problems, and Kepler 1.0 realizes our goal," said Ilkay Altintas, lab director of the Scientific Workflow Automation Technologies (SWAT) group at SDSC. "It provides access to diverse technologies, and also furnishes a basis for many exciting features in the works." Kepler 1.0 comes with a searchable library containing more than 350 ready-to-use processing components that can be customized and operated from a desktop environment to perform analysis, automate data management, and integrate applications efficiently.

A scientific workflow describes a series of structured computations that arise in scientific problem-solving. Typically a sequence of analysis tools is invoked in a routine manner. Workflows often include sequences of format translations that ensure that the tools can process each other's outputs, and perform routine verification and validation of the data and the outputs to ensure that the computation as a whole remains on track. (Adapted from Munindar P. Singh and Mladen A. Vouk.) The LiDAR workflow communicates with both the portal and the Grid layers. This central workflow layer, controlled by the Kepler workflow manager, coordinates the multiple distributed Grid components in a single environment as a data analysis pipeline. It submits and monitors jobs on the Grid, and handles third-party transfer of derived intermediate products among consecutive compute clusters, as defined by the workflow description. In addition, it sends control information to the portal client about the overall execution of the process. Image courtesy of LiDAR.

Artificially illuminated digital elevation model derived from LiDAR data hosted by GEON and produced by the GEON LiDAR system which uses the Kepler workflow. The image, in greyscale, is rendered in Google Earth with some transparency to show fusion with the imagery. The location is a famous site on the San Andreas Fault in central California known as Wallace Creek where stream channels have been offset by repeated earthquakes. View is to the south, south-east. Image courtesy of LiDAR

-Anne Heavey

The Kepler project is funded through various grants from the National Science Foundation and the Department of Energy. To download the Kepler application or learn more about the project, see http://www.kepler-project.org

LIDAR
Contacts
Valentin Simeonov

Description
How does a LIDAR work?

A LIDAR (short for LIght Detection And Ranging) is an optical instrument which allows remote measurement of different parameters of the atmosphere, such as humidity, temperature, particle or gas content, wind, etc. The basic LIDAR operational principle is illustrated schematically in the figure below. Light pulses produced by a laser are transmitted into the atmosphere. The laser light is scattered and attenuated during its propagation as it encounters various molecules and particles in the air. A small fraction of the scattered light is directed backwards to the lidar. This backscattered light, which contains information on atmospheric properties, is collected by a telescope and analyzed.


Fig. 1 - LIDAR principle. Since the laser pulses are very short, typically a few billionths of a second, only a certain atmospheric volume gives a response at a given time. The measurements thus have a spatial resolution defined mostly by the laser pulse duration. The energy of the backscattered light E(R) is a function of the distance R and depends on instrumental parameters such as the energy of the transmitted laser pulse EL, the telescope surface A, and the overall efficiency of the lidar η.
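The formula this paragraph refers to appears in the original only as a figure. A commonly quoted single-scattering form, consistent with the symbols EL, A, η, β(R) and α(R) used in the surrounding text, is reconstructed below; the cτ/2 range-bin factor (τ being the pulse duration) is part of the standard equation and is added here for completeness:

```latex
% Standard single-scattering lidar equation (reconstructed; the original
% figure containing the formula is not reproduced in this document).
E(R) = E_L \,\eta\, \frac{A}{R^{2}}\, \frac{c\,\tau}{2}\,
       \beta(R)\, \exp\!\left(-2\int_{0}^{R} \alpha(r)\,dr\right)
```

The 1/R² term is the geometric falloff of the collected solid angle, and the exponential describes two-way attenuation along the path, which is why the backscatter β(R) and extinction α(R) appear together in every retrieval discussed below.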

The backscatter β(R) and extinction α(R) coefficients contain information on the atmospheric properties. Different types of lidar have been developed depending on the measured parameter and the scattering mechanism employed, some of which are presented briefly below.


Fig. 2 - Jungfraujoch lidar, 3600 m above sea level. Picture by Alain Herzog.

Elastic lidar

The predominant part of the laser light is scattered by air molecules and suspended particles (aerosol) in a process known as elastic scattering. Elastically scattered light has the same wavelength (color) as the laser light. Since both backscatter and extinction depend on the aerosol content, elastic lidars do not allow precise measurement of the backscatter or extinction coefficients. Therefore, elastic lidars are most often used to measure the distance to clouds, smoke or dust layers. Assuming a relationship between backscatter and extinction makes it possible to derive these parameters, but the values will to a great extent depend on the assumed relationship.

Differential absorption lidar (DIAL)

Atmospheric concentrations of water vapor or pollutant gases, such as ozone, are measured by the DIfferential Absorption Lidar (DIAL) technique. A DIAL usually uses elastic signals at two closely spaced wavelengths that differ considerably in the absorption of the gas of interest, from which its concentration is derived. The close wavelength spacing of the two signals makes it possible to eliminate extinction caused by factors other than the gas absorption (i.e. aerosol) and to neglect the differences in the backscatter.

Raman lidar
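The DIAL retrieval described above is commonly written as n(R) = [d/dR ln(P_off/P_on)] / (2Δσ), where Δσ is the differential absorption cross-section of the gas between the "on" and "off" wavelengths. A sketch with purely synthetic signals (all values illustrative, not instrument data):

```python
import numpy as np

# DIAL number-density retrieval: the gas density is the range derivative of
# the log ratio of the off- and on-wavelength signals, divided by twice the
# differential absorption cross-section.

def dial_number_density(r, p_on, p_off, delta_sigma):
    """n(R) = d/dR[ln(P_off / P_on)] / (2 * delta_sigma)."""
    return np.gradient(np.log(p_off / p_on), r) / (2.0 * delta_sigma)

# Synthetic check: a uniform gas layer of density n0 should be recovered.
r = np.linspace(100.0, 2000.0, 50)               # range gates, m
n0, dsig = 1.0e18, 1.0e-22                       # molecules/m^3, m^2
p_off = 1.0 / r**2                               # geometric falloff only
p_on = p_off * np.exp(-2.0 * n0 * dsig * r)      # extra on-line absorption
n = dial_number_density(r, p_on, p_off, dsig)
print(np.allclose(n, n0))  # True
```

Taking the ratio cancels the geometric and aerosol terms that the two closely spaced wavelengths share, which is exactly the cancellation argument made in the paragraph above.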

A very tiny part of the scattered light has a significantly different wavelength because of the so-called Raman effect. The change in the wavelength is due to an exchange of energy between the laser light and the scattering molecule. Depending on whether rotation or vibration of the molecule is at the origin of the energy exchange, the scattering is called rotational or vibrational accordingly. A vibrational Raman wavelength is the fingerprint of the molecule and, since the backscatter coefficient is directly proportional to the scattering molecule concentration, vibrational Raman is used to measure gas concentrations (e.g. water vapor) with high selectivity and accuracy. The well expressed temperature variation of pure-rotational Raman spectra gives us a convenient way of measuring air temperature. Since the backscatter from air molecules is known, Raman lidars are also used for precise measurements of the aerosol extinction and backscatter.

Doppler (wind) lidar

Wind speed and direction can also be measured by a lidar. In this case, the lidar measures very small changes in the wavelength of the backscattered light caused by atmospheric motion (the Doppler shift).

EPFL lidars
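The underlying Doppler relation is standard: a scatterer moving at line-of-sight speed v shifts the backscattered wavelength by Δλ = 2vλ/c, the factor 2 arising because the shift applies on both the outgoing and return paths. A toy calculation, not an instrument algorithm:

```python
# Line-of-sight wind speed from the Doppler wavelength shift of the
# backscattered light: v = c * d_lambda / (2 * lambda).

C = 299_792_458.0  # speed of light, m/s

def radial_wind_speed(wavelength_m: float, shift_m: float) -> float:
    """Line-of-sight wind speed from the measured wavelength shift."""
    return C * shift_m / (2.0 * wavelength_m)

# A 10 m/s wind at a 532 nm laser wavelength shifts the return by only
# ~3.5e-14 m, which is why Doppler lidars need extremely sensitive
# spectral analysis of the backscattered light.
shift = 2.0 * 10.0 * 532e-9 / C
print(round(radial_wind_speed(532e-9, shift), 6))  # 10.0
```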

A number of lidars have been developed at EPFL and used in different field campaigns and for regular measurements. Some of these lidars are listed below.


- High-resolution water vapor and temperature Raman lidar
- Operational Raman lidar for water vapor, aerosol, and temperature vertical profiling (RALMO)
- Multiwavelength aerosol, water vapor, temperature elastic/Raman and UV DIAL system (Jungfraujoch)
- Mobile UV ozone DIAL
Further reading
1. E.D. Hinkley, 1976: Laser Monitoring of the Atmosphere, Springer-Verlag.
2. R. Measures, 1992: Laser Remote Sensing: Fundamentals and Applications, Krieger Publishing Company.
3. C. Weitkamp, ed., 2005: Lidar: Range-Resolved Optical Remote Sensing of the Atmosphere, Springer.
4. T. Fujii and T. Fukuchi, eds., 2005: Laser Remote Sensing, CRC Press.
5. V. Kovalev and W. Eichinger, 2004: Elastic Lidar: Theory, Practice and Analysis Methods, Wiley.


Light Detection and Ranging (LIDAR)


What is Lidar? How Does it Work?

Lidar (LIght Detection And Ranging) is often referred to as laser radar. Light transmitted by a laser is scattered by atmospheric constituents and detected by an optical telescope. The properties of the incoming light enable certain properties of the targets to be determined, and the timing of the measurement determines the altitude of the targets.

The NOAA ESRL GMD Lidar Network

The National Oceanic and Atmospheric Administration (NOAA) Earth System Research Laboratory (ESRL), Global Monitoring Division (GMD) operates lidars at four locations.

- Mauna Loa Observatory, Hawaii (MLO)
- Boulder, Colorado
- Pago Pago, American Samoa (SMO)
- Trinidad Head, California (THD)

At Mauna Loa, Boulder, and Samoa, the lidars are part of the Network for the Detection of Atmospheric Composition Change (NDACC). At Trinidad Head the lidar is part of the NASA Micro-Pulse Lidar Network (MPLNET).

What is Camera Lidar? How is it different from Lidar?

CLidar (Camera Lidar) is an innovation on the standard lidar, developed at the ESRL Mauna Loa Observatory to measure boundary layer aerosols. CLidar uses a charge-coupled device (CCD) camera to image the entire laser beam from a few hundred meters away.

About LIDAR
LIDAR (LIght Distance And Ranging, also known as Airborne Laser Swath Mapping or ALSM) is a relatively new technology that employs an airborne scanning laser rangefinder to produce accurate topographic surveys of unparalleled detail. Compare high-resolution LIDAR topography with conventional 1:24,000-scale contour topography. Follow the links below for further explanations of LIDAR:

- Laser altimetry in brief (42 KB PDF file, Dave Harding, March 17, 2000)
- Bill Krabill's Airborne Topographic Mapper page (leaves this web site)
- PowerPoint presentations by PSLC participants:
  - Finding faults with LIDAR in the Puget Lowland (PowerPoint, 5 MB file, presented at the Seismological Society of America meeting, San Francisco, CA, April 24, 2001, by Ralph Haugerud, Craig Weaver, and Jerry Harless)
  - Seeing through the trees: LIDAR for the Puget Lowland (PowerPoint, 5 MB file, presented to Seattle-area contingency planners and risk managers, May 16, 2001, by Ralph Haugerud, Craig Weaver, and Jerry Harless)
- Article on Bainbridge LIDAR survey (217 KB PDF file, or 4.8 MB DOC file with better-quality figures, presented at the ASPRS conference, May 2000, by Dave Harding and Greg Berghoff)


- Article on virtual deforestation (LIDAR post-processing) (541 KB PDF file), presented at the ISPRS workshop in Annapolis, MD, October 2001, by Ralph Haugerud and Dave Harding

30 August 2002

The LIDAR beam probes the night skies above Davis during auroral activity. Photograph by M. Lambert.

What do two physicists, a chef, an electrician, a diesel mechanic and a communications technician have in common? If you had been at Davis in the last couple of weeks (mid-July 2002), you would have answered "A lot!" When the LIDAR stopped working recently and the trouble was diagnosed as a problem capacitor, the space and atmospheric scientists were not too happy. They had no spares, and the first ship which could bring the necessary parts was not expected for many more months! The Davis LIDAR (Light Detection and Ranging instrument) is one of only three such instruments in the southern hemisphere. The problem capacitor should charge up and direct energy into the flash lamp, starting the high-speed laser process. The resulting high-speed, luminous green laser beam probes the skies at high altitudes, investigating climate change.


The capacitor newly modified by the comms technician. Photograph by B. Chester Determined to find a solution, the two wintering scientists put out a plea for assistance. Although capacitors are common components in a range of electronic equipment, the voltage of this particular one was unusual. The comms tech and electrician rallied to the call and found similar parts, although they were unusable without modification. The comms officer concentrated on ways in which to modify the parts to suit the LIDAR's needs. Meanwhile, the electrician and station diesel mechanic decided to build a new one from scratch - of necessity from non-standard components. Enter the chef, who donated generously from the station kitchen supplies of plastic cling wrap and aluminium foil!

Capacitor testing. Photograph by B. Chester.

As a team, the men joined forces with the two physicists and built a machine to wind up the plastic and foil capacitor.


By this time, the comms tech's modified component was ready for testing. Although it worked to some degree, added power was needed, and the addition of the home made version to the power supply seemed to do the trick! As the values of the home-made products are lower than the commercial brands, the creative team got together to produce a few more. The first four capacitors have now rumbled off the production line and individual tests show success in maintaining voltage over a two-hour period. More capacitors are being made and the real test will be when they are linked in series and the characteristic green light re-appears in the sky at Davis. Without the ingenuity and team work of the scientists and trades people on station, the LIDAR would have lain idle for the rest of the winter. As it is, it has passed all the preliminary testing and should soon be shining its characteristic green light into the atmosphere. UPDATE - August 2002: The LIDAR has been functioning well since the addition of the improvised capacitors. New "standard" components will be installed this (Austral) summer.

The LIDAR operating at Davis. Photograph by M. Lambert




LIDAR

LIDAR has become a major portion of my geomatics career and was the main focus of my graduate research project that I completed at the Applied Geomatics Research Group and the Centre of Geographic Sciences in Nova Scotia. The data that I processed and products that I generated are helping Environment Canada and other Government Organizations to create adaptation strategies to coastal flooding problems in the Maritime Provinces. This site is meant to provide a brief understanding of the technology while demonstrating some of the highlights of my projects.

Basic overview of LIDAR


The aircraft uses a high-precision Global Positioning System (GPS) and an Inertial Measurement Unit (IMU) to determine the location and measure the attitude, so that the ground location of the return pulse can be accurately determined.

The LIDAR sensor produces a series of point measurements consisting of the geographic location (X & Y) and height (Z) of both natural and man-made features; these can be further processed to produce several different products and integrated into a Geographic Information System (GIS).

The data produced from a LIDAR sensor, in its most common form, is represented as a series of spatial coordinates in an American Standard Code for Information Interchange (ASCII) file. The data in the file is recorded in a tabular format where each line has coordinate information separated by a common delimiter. The file can include other attribute information for each point as well. There are additional ways to represent LIDAR data, such as the LAS format, which is an alternative to the generic ASCII file format used by many companies. Resultant LIDAR data is usually a very dense network of coordinate points and can often contain millions of measurements for a given area. This can result in large file sizes, depending on the collection area and data resolution, which have been known to be difficult to handle with the majority of common off-the-shelf software packages. Continue to my 2004 LIDAR research project.
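The delimited ASCII point format described above can be read with a few lines of code. The sample records and the four-column layout (X, Y, Z, plus an extra attribute such as intensity) are illustrative only, since real deliveries vary by vendor:

```python
import csv
import io

# Minimal reader for delimited ASCII lidar records: each line carries X, Y, Z
# coordinates plus optional extra attributes. Sample data is illustrative.

sample = """\
448550.25,5412830.10,14.72,120
448550.80,5412830.95,14.69,98
448551.33,5412831.77,21.04,35
"""

def read_xyz(text, delimiter=","):
    """Yield (x, y, z) tuples from delimited ASCII lidar records."""
    for row in csv.reader(io.StringIO(text), delimiter=delimiter):
        if len(row) >= 3:  # skip blank or malformed lines
            yield float(row[0]), float(row[1]), float(row[2])

points = list(read_xyz(sample))
print(len(points), min(p[2] for p in points), max(p[2] for p in points))
# 3 points; Z ranges from 14.69 to 21.04
```

For real multi-million-point files of the size the text mentions, a streaming reader like this (or the binary LAS format) avoids holding the whole file in memory at once.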

LIDAR LINKS
- 2004 LiDAR Industry Directory
- The Lowdown on LIDAR, by Robert A. Fowler, EOM Archives
- LIDAR for Flood Mapping, by Robert A. Fowler, EOM Current Issues


LIDAR
Implementation of an operational water vapor and aerosol LIDAR system
The goal of this project is the realization of an operational water vapor and aerosol lidar system for continuous profiling of the water vapor mixing ratio and the aerosol backscattering properties. The project is co-funded by MeteoSwiss (54%), the Swiss Federal Institute of Technology in Lausanne, EPFL (39%), and the Swiss National Science Foundation (7%). The agreement between MeteoSwiss and EPFL on the project was signed in summer 2004. The lidar system is built at EPFL by the LIDAR research team of Dr. V. Simeonov, part of the Air and Soil Laboratory of Prof. H. van den Bergh. It will be pre-operational for MeteoSwiss in 2007, and operational in 2008, after several calibration and intercomparison periods (comparison with the soundings of Payerne, and an intercomparison campaign in Lindenberg scheduled for spring 2007).

The MeteoSwiss-EPFL-LIDAR will supply data with high spatial (150 to 600 m vertical resolution) and temporal (typ. time steps of 15 to 60 min) resolution, thus helping to improve the database for direct meteorological applications by adding real time water vapor and aerosol backscatter profiling. It will provide MeteoSwiss and EPFL with a database of great importance for climate change modelling and trend analysis in aerosol and water vapor content in the troposphere. Moreover, LIDAR information together with conventional data will be applied for comparison, validation and further development of the prognostic model aLMo. The resulting "combined Swiss Alpine site" (Payerne, Arosa and Jungfraujoch), will enable MeteoSwiss to fulfil the requirements defined in the Global Water Vapor Project (GVaP) of WMO-GEWEX for a level-1 reference station. The project will provide EPFL with:


A "spin-off" of its R&D in the field of lidar and atmospheric sciences.

As long term objectives, it will bring information and studies of interest for numerical research and modeling activities, possibilities for further work with MeteoSwiss, and also for participation in different national and international scientific programs.

Results for the water vapor profile and for the temperature profile Lidar_results.pdf, 143 KB

Installation of the LIDAR


The system will be installed in a cabin, divided into two distinct rooms: a "white" room containing the system itself, with a hole in the roof allowing the Lidar beam to be sent and the backscattered light to be collected, and a computer room, from which the system will be operated (see Figure 1).


Figure 1: Drawing of the Lidar housing, with separation into two distinct rooms (the computer room for operation and acquisition on the right, and the Lidar itself on the left).

Results
The lidar system has been running in a series of operation periods since early 2006. Recently obtained vertical profiles are shown in Figure 2.

Figure 2: Vertical profiles of water vapor mixing ratio [g/kg] from 14 to 24 UTC on 7 April 2006, and comparison with the Payerne soundings at 12 UTC (left) and at 24 UTC (right).

Key publications
- Raman Frequency Shifting in CH4:H2:Ar Mixture Pumped by the 4th Harmonic of Nd:YAG (V. Simeonov, V. Mitev, H. van den Bergh, and B. Calpini), Appl. Opt., Vol. 37, No. 30, pp. 7112-7115, 1998.
- A Raman Differential Absorption Lidar for Ozone and Water Vapor Measurement in the Lower Troposphere (B. Lazzarotto, V. Simeonov, P. Quaglia, G. Larchevêque, H. van den Bergh, and B. Calpini), Int. J. Env. Analytical Chem., 74, pp. 255-261, 1999.
- Experimental Investigation of High-Power Single-Pass Raman Shifters in the Ultraviolet with Nd:YAG and KrF Lasers (L. Schoulepnikoff, V. Mitev, V. Simeonov, B. Calpini, and H. van den Bergh), Appl. Opt., Vol. 36, No. 21, pp. 5026-5043, 1997.


Water Vapor Upper Air Observations at MeteoSwiss Payerne (Bertrand Calpini, Vincent Hentsch, Pierre Huguenin, Dominique Ruffieux, Todor Dinoev, Valentin Simeonov), ISTP2009, Delft. Ext_abstract_Calpini_ISTP_2009vf.pdf, 418 KB

Contacts
LIDAR-EPFL: Dr. Valentin Simeonov EPFL ENAC ISTE LPAS, CH C2 392, Station 6, CH-1015 Lausanne

MeteoSwiss: Prof. Bertrand Calpini, project leader, or Dr. Yves-Alain Roulet, project coordination, Aerological Station, P.O. Box 316, CH-1530 Payerne, Switzerland

In Situ Cloud Lidar 2004 Flight Results


Engineering flights were undertaken on November 3, 4, and 17, and the operation of the electronics was modified as a result. A science flight was taken on the evening of December 1, 2004 (2004-12-02 UTC) off the coast of south Texas.


The in situ lidar detector housing on the left wingtip tank of the SPEC-operated Learjet.


The detector module with electronics boards and detector head. The electronics outputs high-gain and low-gain channels at a 50 MHz sampling rate and implements amplifier gain switching to provide up to 6 orders of magnitude of dynamic range. The electronics also automatically adjusts the photomultiplier tube (PMT) gain.


The detector head with upward- and downward-viewing PMT detectors. Each detector has wide field-of-view (30° half-angle) optics for nighttime operation and narrow-FOV (3°) optics with a 0.37 nm (FWHM) solar-blocking filter for daytime operation.


The laser in the Learjet cabin. The 532 nm Nd:YAG laser fires 180 mJ per pulse at 10 Hz through an optical flat in a cabin window on the right side of the plane.


An example lidar signal in dense cloud on the first engineering flight during daytime. The calibrated lidar data from the low- and high-gain channels are shown with the red and blue lines. The green line shows the data from the two channels merged, averaged over log-spaced time bins, and with the solar background signal subtracted. The error bars show the standard error of the photon fraction from the variability in each time bin. In this example with the daytime optics, the peak lidar signal is about 40 times above the solar background. With background subtraction, the dynamic range is more than three orders of magnitude and the usable signal extends beyond 15 microseconds. The solar zenith angle was 76° at this time.
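The channel-merging and background-subtraction steps described above can be sketched as follows. This is an illustrative reconstruction, not the actual SPEC processing code; the gain ratio, saturation threshold, and background-window length are invented placeholder values.

```java
// Sketch of two-channel gain merging with solar-background subtraction.
// GAIN_RATIO and SATURATION are illustrative values only, not those of
// the in situ lidar electronics described in the text.
public class GainMerge {
    static final double GAIN_RATIO = 100.0;   // assumed high/low gain ratio
    static final double SATURATION = 4000.0;  // assumed saturation level

    /** Merge high- and low-gain samples into one trace. */
    public static double[] merge(double[] high, double[] low) {
        double[] out = new double[high.length];
        for (int i = 0; i < high.length; i++) {
            out[i] = (high[i] < SATURATION)
                   ? high[i]               // high-gain channel still linear
                   : low[i] * GAIN_RATIO;  // fall back to scaled low gain
        }
        return out;
    }

    /** Subtract the mean of the last n samples as the background estimate. */
    public static double[] subtractBackground(double[] signal, int n) {
        double bg = 0.0;
        for (int i = signal.length - n; i < signal.length; i++) bg += signal[i];
        bg /= n;
        double[] out = new double[signal.length];
        for (int i = 0; i < signal.length; i++) out[i] = signal[i] - bg;
        return out;
    }
}
```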


The path of the cloud portion of the science flight on 2004-12-02 UTC off the coast of south Texas. The dots along the line indicate when the aircraft was in cloud, as indicated by an FSSP concentration of more than 10 cm^-3. The numbers along the flight track are the times in UTC hours.


An example of the calibrated, merged, and time-bin-averaged lidar signals for the up and down detectors from the science flight. The fits of the function log[p(t)] = a - b*log(t) - c*t are shown, and the coefficients are listed in the legend. These three fit coefficients for each detector are input to a neural net (trained on simulated in situ lidar signals in stochastic stratocumulus clouds) to retrieve extinction at four averaging volume sizes, cloud thickness, and cloud-relative aircraft altitude.
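The fitted model can be evaluated directly; the sketch below simply encodes the three-coefficient function named above. Any coefficient values passed to it here would be placeholders, not flight-derived values.

```java
// The diffusion-fit model used for the in situ lidar returns:
//   log[p(t)] = a - b*log(t) - c*t
// so the photon arrival probability density is the exponential of that.
public class DiffusionFit {
    /** Evaluate p(t) for photon time t > 0 and fit coefficients a, b, c. */
    public static double p(double a, double b, double c, double t) {
        return Math.exp(a - b * Math.log(t) - c * t);
    }
}
```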


Examples of the lidar signal at five selected photon times from 0.5 µs to 8.0 µs as a function of laser shot time for the up and down detectors. The signal sampled at later photon times changes much more slowly with the aircraft travel than the signal at shorter photon times, due to the larger volume sensed by the diffusing photons.

A scatter plot of the lidar-retrieved extinction versus the FSSP-derived extinction. The 1:1 and 1.8:1 lines are shown. The linear correlation in log extinction of the points is 0.915. This correlation is quite high considering that the lidar is measuring a cloud volume more than 10^10 times larger than the FSSP probe samples, in an inhomogeneous cloud. The offset of the lidar extinction from the FSSP extinction is a factor of 1.8, which is mostly due to the uncertainty in the lidar calibration.


Cloud base and top altitudes derived from the lidar-retrieved cloud thickness, the cloud-relative altitude, and the aircraft altitude. Also shown are the aircraft altitude and the top and base altitudes obtained from cloud boundaries derived from the FSSP probe. The vertical dotted lines indicate the times of furthest south or north latitude. The lidar-retrieved cloud base and top altitudes show a consistent trend to lower altitudes to the south and higher altitudes to the north. There is reasonably good agreement between the lidar-retrieved and aircraft-derived cloud boundaries, though there is considerable variation in the aircraft-derived boundary altitudes.

Vision

For over forty years, computation has centered about machines, not people. We have catered to them, pampering them in air-conditioned rooms or carrying them around with us. Purporting to serve us, they have actually forced us to serve them. They have been difficult to use. They have required us to interact with them on their terms, speaking their languages and manipulating their keyboards or mice. They have not been aware of our needs or even of whether we were in the room with them. Virtual reality only makes matters worse: with it, we not only serve computers, but also live in a reality they create.

In the future, computation will be human-centered. It will be freely available everywhere, like batteries and power sockets, or oxygen in the air we breathe. It will enter the human world, handling our goals and needs and helping us to do more while doing less. We will not need to carry our own devices around with us. Instead, configurable generic devices, either handheld or embedded in the environment, will bring computation to us whenever we need it and wherever we might be. As we interact with these "anonymous" devices, they will adopt our information personalities. They will respect our desires for privacy and security. We won't have to type, click, or learn new computer jargon. Instead, we'll communicate naturally, using speech and gestures that describe our intent ("send this to Hari" or "print that picture on the nearest color printer"), and leave it to the computer to carry out our will.

New systems will boost our productivity. They will help us automate repetitive human tasks, control physical devices in the environment, find the information we need (when we need it, rather than forcing us to examine thousands of search-engine hits), and enable us to work together with other people.

Challenges

To support highly dynamic and varied human activities, the Oxygen system must master many technical challenges. It must be:

pervasive: it must be everywhere, with every portal reaching into the same information base;
embedded: it must live in our world, sensing and affecting it;
nomadic: it must allow users and computations to move around freely, according to their needs;
adaptable: it must provide flexibility and spontaneity, in response to changes in user requirements and operating conditions;
powerful, yet efficient: it must free itself from constraints imposed by bounded hardware resources, addressing instead constraints imposed by user demands and available power or communication bandwidth;
intentional: it must enable people to name services and software objects by intent, for example "the nearest printer", as opposed to by address;
eternal: it must never shut down or reboot; components may come and go in response to demand, errors, and upgrades, but Oxygen as a whole must be available all the time.

Approach

Oxygen enables pervasive, human-centered computing through a combination of user and system technologies. Oxygen's user technologies directly address human needs. Speech and vision technologies enable us to communicate with Oxygen as if we're interacting with another person, saving much time and effort. Automation, individualized knowledge access, and collaboration technologies help us perform a wide variety of tasks that we want to do in the ways we like to do them.

Oxygen's device, network, and software technologies dramatically extend our reach by delivering user technologies to us at home, at work, or on the go. Computational devices, called Enviro21s (E21s), embedded in our homes, offices, and cars sense and affect our immediate environment. Handheld devices, called Handy21s (H21s), empower us to communicate and compute no matter where we are. Dynamic, self-configuring networks (N21s) help our machines locate each other as well as the people, services, and resources we want to reach. Software (O2S) that adapts to changes in the environment or in user requirements helps us do what we want when we want to do it.

Oxygen device

Devices in Oxygen supply power for computation, communication, and perception in much the same way that batteries and wall outlets supply power for electrical appliances. Both mobile and stationary devices are universal communication and computation appliances. They are also anonymous: they do not store configurations customized to any particular user. As with batteries and power outlets, the primary difference among devices is the amount of energy they supply.

Collections of embedded devices, called E21s, create intelligent spaces inside offices, homes, and vehicles. E21s provide large amounts of embedded computation, as well as interfaces to camera and microphone arrays, large-area displays, and other devices. Users communicate naturally in the spaces created by E21s, using speech and vision, without being aware of any particular point of interaction.


Handheld devices, called H21s, provide mobile access points for users both within and outside the spaces controlled by E21s. H21s accept speech and visual input, and they can reconfigure themselves to support multiple communication protocols or to perform a wide variety of useful functions (e.g., to serve as phones, beepers, radios, televisions, geographical positioning systems, cameras, or personal digital assistants). H21s can conserve power by offloading communication and computation onto nearby E21s.

Initial prototypes for the Oxygen device technologies are based on commodity hardware. Eventually, these technologies will use Raw computational fabrics to increase performance for streaming media and signal processing, and to make more efficient use of power.
Oxygen network

Networks, called N21s, connect dynamically changing configurations of self-identifying mobile and stationary devices to form collaborative regions. N21s support multiple communication protocols for low-power point-to-point, building-wide, and campus-wide communication. N21s also provide completely decentralized mechanisms for naming, location, and resource discovery, and secure information access.
Oxygen software

The Oxygen software environment is built to support change, which is inevitable if Oxygen is to be pervasive and adaptable, let alone eternal. Change is occasioned by anonymous devices customizing themselves to users' requests, by the needs of applications and their components, by current operating conditions, by the introduction of new software and upgrades, by failures, or by any number of other causes. Oxygen's software architecture relies on control and planning abstractions that provide mechanisms for change, on specifications that describe what mechanisms to use, and on persistent object stores with transactional semantics to provide operational support for change.
Oxygen perceptual

Speech and vision, rather than keyboards and mice, provide the main modes of interaction with Oxygen. Multimodal integration increases the effectiveness of these perceptual technologies, for example by improving speech understanding through recognition of facial expressions, lip movement, and gaze. Perceptual technologies lie at the core of Oxygen; they are not just afterthoughts or interfaces to separate applications. Oxygen deploys preliminary versions of these technologies quickly to make human-machine interaction easy and natural, and graceful context switching supports seamless integration of applications.
Oxygen user

Several user technologies harness Oxygen's massive computational, communication, and storage capabilities; they both exploit the capacity of Oxygen's system technologies for change in support of human needs and help provide Oxygen's system technologies with that capacity. Oxygen user technologies include:

Automation technologies, which offer natural, easy-to-use, customizable, and adaptive mechanisms for automating repetitive information and control tasks. For example, they allow users to create scripts that control devices in their environment according to their tastes.

Collaboration technologies, which enable the formation of spontaneous collaborative regions that accommodate mobile people and computations. They also provide support for recording and archiving speech and video fragments from meetings, linking these fragments to issues, summaries, keywords, and annotations.


Knowledge access technologies, which offer greatly improved access to information, customized to the needs of people, applications, and software systems. They allow users to access their own knowledge bases, the knowledge bases of friends and associates, and the collective knowledge of the web. They facilitate this access through semantic connection nets.

The following scenarios illustrate how Oxygen's integrated technologies make it easier for people to do more by doing less, wherever they may be.
Applications

Business

Hélène calls Ralph in New York from their company's home office in Paris. Ralph's E21 recognizes Hélène's telephone number; it answers in her native French, explains that Ralph is on vacation, and asks if her call is urgent. The E21's multilingual speech and automation systems, which Ralph has scripted to handle urgent calls from people such as Hélène, recognize the word "urgent" and transfer the call to Ralph's H21 in his hotel. When Ralph speaks with Hélène, he decides to bring George, a colleague vacationing at his home in London, into the conversation.

All three decide to meet next week in Paris. Conversing with their E21s, they ask their automated calendars to compare their schedules and check the availability of flights from New York and London. Tuesday at 11am looks good. All three say "OK," and their automation systems make the necessary arrangements.

Ralph and George arrive at Paris headquarters. At the front desk, they pick up H21s, which recognize them and connect to their E21s in New York and London. Ralph asks his H21 where they are to meet; the meeting is in a building across the street, and the H21 provides an indoor/outdoor navigation system to guide them there. Ralph also asks for "last week's technical drawings," which he forgot to bring. The H21 finds and fetches them in time for the meeting with Hélène.
Guardian

Jane and her husband Tom live in suburban Boston and cherish their independence. As they have grown older, they have acquired a growing number of devices and appliances, which they have connected to their E21. They no longer miss calls or visitors because they cannot get to the telephone or door in time; microphones and speakers in the walls enable them to answer either at any time. Sensors and actuators in the bathroom ensure that the bathtub does not overflow and that the water temperature is neither too hot nor too cold. A reminder system keeps track of which television programs they have enjoyed and alerts them when similar programs are being shown.

Just before their children moved away from the area, Jane and Tom enhanced their E21 to provide more help. Tom uses the system now to jog his memory by asking simple questions, such as "What did I do today?" or "Where did I put my glasses?" The E21's vision system, using cameras in the walls, detects patterns in Tom's motion. When Tom visits his doctor, he can bring along the vision system's observations to check whether there are changes in his gait that might indicate the onset of medical problems. Jane and Tom have also scripted the system to contact medical personnel in case one of them falls down when alone. By delivering such services, the E21 affords peace of mind to both parents and children.

Oxygen technologies are entering our everyday lives. Following are some of the technologies under development at MIT and by the Oxygen industry partners.


Oxygen device

A prototype H21 is equipped with a microphone, speaker, camera, and accelerometer to support perceptual interfaces. RAW and Scale expose hardware to compilers, which optimize for performance and power. StreamIt provides a language and optimizing compiler for streaming applications.
Oxygen network

The Cricket location support system provides an indoor analog of GPS. The Intentional Naming System (INS) supports resource discovery based on what services do, rather than on where they are located. The Self-certifying (SFS) and Cooperative (CFS) File Systems provide secure access to data over untrusted networks, with decentralized access control.

Trusted software proxies provide secure, private, and efficient access to networked devices, services, and people. Decentralization in Oxygen aids privacy: users can locate what they need without revealing their own location.
Oxygen software

GOALS is an architecture that enables software to adapt to changes in user location and needs, respond to component failures and newly available resources, and maintain continuity of service as the environment evolves. GOALS is motivated, in part, by experience gained with MetaGlue, a robust architecture for multi-agent systems.
Oxygen perceptual

Multimodal systems enhance recognition of both speech and vision. Multilingual systems support conversations among participants speaking different languages. The SpeechBuilder utility supports development of speech-based applications. Person tracking and face, gaze, and gesture recognition utilities support development of vision-based applications. Systems that understand sketching on whiteboards provide more natural interfaces to traditional design tools.
Oxygen user

Haystack and the Semantic Web support personalized information management and collaborative knowledge management and manipulation. ASSIST helps extract design rationales from early design sketches.

The Intelligent Room is a highly interactive environment that uses embedded computation to observe and participate in normal, everyday events, such as collaborative meetings.

True innovation in Oxygen comes from MIT students, researchers, and others using Oxygen technologies and systems for their daily work. Hence Project Oxygen is building a system to use, and using the system it builds.

PROJECT OXYGEN

As the Internet has exploded over the last couple of years, it has faced a massive shortage of bandwidth. Project Oxygen is a massive project that plans to make gigabits of bandwidth available to Net users. The planned capacity is 2,560 Gbit/s, which will be available in 75 countries via 96 landing points. The project proposes to provide these services by laying thousands of miles of undersea cables.

The need for bandwidth is acute: existing networks are overloaded with voice, fax, and data calls. Over the years, voice calls on international lines have increased dramatically, and fax calls have also registered significant growth. Project Oxygen is not confined to data and will service these areas as well. But the main driving force behind the entire idea has been the mother of all networks, the Internet. A couple of years back the Internet was barely heard of, and whatever access existed was largely text-based with limited file transfers. With faster modems, PPP/SLIP, and the advent of graphical access through Mosaic and Netscape, the Internet simply ran out of bandwidth. Even IP addresses began to run out, and another project, IPv6, was born.

Project Oxygen will be the world's first truly global telecommunications network, with its 96 landing points in 75 countries. It is an extremely flexible network with a built-in switching design. Using the Lucent Bandwidth Manager platform, the network will be able to handle any voice or data traffic easily. Oxygen is also an extremely reliable network, which is essential for a network of its size. It uses the SDH ring architecture, which enables traffic to be rerouted within 300 milliseconds of a fault in the ring. If this does not work, network management centers will be able to manage the existing network resources. The Oxygen network has a significant advantage over traditional cable systems, which do not have unified management of all the available resources.

An interesting thought for the day: the planned size of the entire cabling system is 168,000 km (largely undersea, the rest over ground), and the estimated cost for the first phase of the project is US$ 10 billion.


What is a component?
A software component is a re-useable, related, self-contained "chunk" of an application. A component can be installed, started, stopped, uninstalled, and updated independently of other components in an application or system. Components are thus intended to make a system easier to maintain: new or updated components can easily be swapped in for existing components to introduce bug fixes or enhanced functionality.

In Oxygen a component is simply all the Java classes and resources in a JAR file. Oxygen attempts to find a human-readable name for the component in the JAR manifest under the Component-Name attribute. If no name (or manifest) is found, Oxygen names the component after the JAR filename. This name only appears in debug and user interfaces and makes debugging easier.

Two slightly different types of component can be written:

Library components normally supply re-useable chunks of code (class libraries). They do not share instances of classes (objects). A library component is just a plain JAR file; nothing special is required.

Dynamic components share software services (see below); that is, dynamic components provide and consume object instances with each other. A dynamic component needs a class that is used by the Oxygen framework to control the dynamic behaviour of the component. Such a component must declare a class that implements ComponentActivator and has a no-argument constructor (allowing an instance to be created via the java.lang.Class.newInstance() method). The class that activates the component is declared in the Component-Activator manifest entry of the JAR file.

By default, all classes in a component (JAR) are made available to all other components operating in Oxygen, as if they were on the regular Java class path. However, a component can stop certain named packages from being made available to other components by supplying a Package-Excludes manifest attribute listing the packages that the JAR is NOT to share.
Wildcards such as com.acme.secrets.* are acceptable as well as individual class names.
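As a sketch, the manifest of a dynamic component might look like the fragment below. The component name, activator class, and excluded package are invented for illustration; the attribute names are those described above (with Component-Activator spelled as in the "Components vs. Bundles" section later in this text).

```text
Manifest-Version: 1.0
Component-Name: Random Number Provider
Component-Activator: com.example.random.Activator
Package-Excludes: com.example.random.internal.*
```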

What is a service?

A software service is an interface-driven contract between two parties describing some runtime functionality that one will provide and the other will consume. A service is normally defined by some abstract Java interface; providers implement this interface and register their service implementation with some central registry under the service name (normally the fully qualified name of the service interface).

For example, imagine a service that is used to generate random numbers, defined by the following interface:

    public interface RandomService
    {
        /**
         * The service identifier used when registering/finding the service
         */
        public static final String SERVICE_ID = RandomService.class.getName();

        /**
         * Get a random integer number
         * @return A random integer
         */
        public int getRandom();
    }

A component can be written that contains a class implementing that interface, as follows:

    public class RandomImpl implements RandomService
    {
        /** The range of random numbers generated */
        private int range;

        public RandomImpl(int range)
        {
            this.range = range;
        }

        /**
         * @see RandomService#getRandom()
         */
        public int getRandom()
        {
            return (int) (Math.random() * range);
        }
    }

Service consumers are built against the service interface, never knowing (or caring) which component provides an implementation of the service. Service consumers acquire a reference to the service implementation at runtime by registering an interest with the Oxygen framework (a ServiceManager) for any implementations registered under the service name. This provides strong de-coupling between service providers and consumers and allows the two to be "switched" without the other caring.
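The registry pattern just described can be miniaturized as follows. This is not the Oxygen ServiceManager API (whose exact signatures are not given in this text); it is a self-contained toy registry that demonstrates how providers register under a service identifier and consumers are called back, without either side knowing the other.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// A self-contained miniature of the provider/consumer pattern described
// above. NOT the Oxygen API; it only mirrors the idea of a central
// registry keyed by service identifier, with consumer callbacks.
public class MiniRegistry {
    private final Map<String, Object> services = new HashMap<>();
    private final Map<String, List<Consumer<Object>>> interests = new HashMap<>();

    /** Provider side: register an implementation under the service id. */
    public void register(String serviceId, Object impl) {
        services.put(serviceId, impl);
        // Notify every consumer that already declared an interest.
        for (Consumer<Object> c : interests.getOrDefault(serviceId, List.of()))
            c.accept(impl);
    }

    /** Consumer side: be called back when a matching service exists. */
    public void addInterest(String serviceId, Consumer<Object> callback) {
        interests.computeIfAbsent(serviceId, k -> new ArrayList<>()).add(callback);
        Object existing = services.get(serviceId);
        if (existing != null) callback.accept(existing); // already registered
    }
}
```

Note that the consumer is notified whichever side arrives first, which mirrors the "initial call to serviceFound for services already registered" behaviour described later.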

OSGi Compatibility
Oxygen is in many ways similar to the OSGi framework and should not be too difficult for those familiar with OSGi to pick up. Oxygen differs from OSGi in several ways, which are discussed below.

Components vs. Bundles


In Oxygen the single distributable unit of application, embodied by a JAR file, is known as a "component"; this is for all intents and purposes the same as a "Bundle" in OSGi.

In OSGi, dynamic components are controlled (started and stopped) by the framework using an activator. This is an instance of a class that implements the BundleActivator interface and has a public no-argument constructor. The class that performs this function is indicated to the OSGi framework by the Bundle-Activator manifest entry. In Oxygen, dynamic components employ the same model of an activator class, only the class implements the ComponentActivator interface, and the class name is resolved by the Oxygen framework from the Component-Activator manifest entry. This choice of different names is deliberate, to allow your component to operate in both OSGi frameworks and Oxygen frameworks.

Lastly, unlike BundleActivator, ComponentActivator does not supply a "context object" to the start and stop methods. In Oxygen the singleton context classes are discovered statically using a Factory object: ServiceManager is available from ServiceManagerFactory, and ComponentManager is available from ComponentManagerFactory.
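A minimal activator might look like the sketch below. The ComponentActivator interface is stubbed here with assumed no-argument start()/stop() methods, consistent with the statement that no context object is passed; the real Oxygen interface may differ, so check the framework's own sources.

```java
// Stub of the activator interface, assumed shape only: the text says no
// context object is passed, so start() and stop() take no arguments.
interface ComponentActivator {
    void start();
    void stop();
}

// A dynamic component's activator, named in the Component-Activator
// manifest entry and instantiated reflectively by the framework.
public class RandomActivator implements ComponentActivator {
    // Required: a public no-argument constructor, so the framework can
    // create an instance from the class name alone.
    public RandomActivator() {}

    public void start() {
        // Typically: obtain the ServiceManager from its factory and
        // register this component's service implementations.
        System.out.println("component started");
    }

    public void stop() {
        // Typically: unregister services and release resources.
        System.out.println("component stopped");
    }
}
```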


Class Loading
In OSGi, class loading between bundles is controlled by entries in the manifest file. The Import-Package and Export-Package entries of the manifest define (statically) the names of packages that are to be imported and exported by different bundles. Our experience was that this is a major nightmare for developing larger, richer applications: the problem of maintaining these inter-JAR dependencies grows quickly with the number of JARs in the system. It seems the only reasons for these manifest entries are to prevent sharing "private" classes in a bundle with other bundles and to resolve dependent bundles statically.

Oxygen does not place any constraints on class loading above conventional Java. Preventing classes from being exposed to the outside world can be done by not making those classes public. But to add some compatibility with OSGi we provide a manifest attribute, ExportExclude; this is a comma-separated list of the packages and/or classes of this JAR which will not be shared with other components. All other packages in the JAR will be shared with all other components.

Here are a few important notes on export exclusion and component class loading:

The "*" wildcard is allowed in excluded package names and signifies that the package and all its sub-packages will be excluded. Partial package names are not allowed (at present), so "com.acme.secrets.*" is allowed, but not "com.ac*".

If a component excludes from export a package containing the class files that define a service interface, that service will no longer be available to any other components (assuming no other components export it). The consumer components will receive class-not-found errors when they attempt to load the interface class(es) at the point of use.

If a component excludes a package that contains an implementation of a service interface, but not the service interface itself, the instance of the service registered will still be available to consumers.
This is because the class that implements the service was loaded by the provider component's class loader (which excluded it from others), not by the consumer component's class loader.

If a class or package is contained in two or more components/JARs and is excluded from export by one of those components, it is not thereby excluded by the other components. In other words, every component must define for itself which classes are to be excluded from export.

The precise search path for all classes in a component occurs in the following order:

1. The system class path (defined when launching the framework)
2. The component's own JAR file
3. The other components installed in the framework (at that time), in the order they were installed

The safest option is not to exclude any classes from export until you need to.
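The three-step search order can be sketched as a simple resolution function. The maps below stand in for class sources; this mirrors only the documented order and is not Oxygen's actual class-loader code.

```java
import java.util.List;
import java.util.Map;

// Sketch of the component class-lookup order: system class path first,
// then the component's own JAR, then other components in install order.
public class LookupOrder {
    public static String resolve(String name,
                                 Map<String, String> systemPath,
                                 Map<String, String> ownJar,
                                 List<Map<String, String>> otherComponents) {
        if (systemPath.containsKey(name)) return systemPath.get(name); // 1. system class path
        if (ownJar.containsKey(name)) return ownJar.get(name);         // 2. own JAR
        for (Map<String, String> comp : otherComponents)               // 3. others, install order
            if (comp.containsKey(name)) return comp.get(name);
        return null; // would be a ClassNotFoundException in the real framework
    }
}
```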

Service Consumption and Release


In OSGi, a service-consuming bundle is required to access the services it uses from a BundleContext using the getServiceReference(String) method. If no services are registered this method returns nothing, so it is more common to attach a ServiceListener, with an (LDAP-style) filter that matches the service sought by the client application, using the addServiceListener method. Once this listener gets a callback from the framework for a matched service, the actual service implementation object is acquired via the getService(ServiceReference) method. The OSGi framework tracks a count of service users, so when finished the service must be released by calling ungetService(ServiceReference).

Oxygen still uses a ServiceReference class to represent a service registered with the ServiceManager. However, this service reference makes the service object available directly to client code via the getServiceObject() method. Also, the reference does not require client code to "unget" the service when the service is no longer being used; you can treat the object as an ordinary Java reference.

The major difference in service acquisition is that Oxygen service consumers implement the ServiceConsumer interface and register a ServiceConsumer for a ServiceInterest with the ServiceManager. A ServiceInterest is an object containing a service identifier (a String, normally the fully qualified name of the service interface) and an optional set of ServiceAttributes (describing desired properties of the service). The consumer will then be called back for each service that is found or lost that matches the ServiceInterest; this includes an initial call to the serviceFound method for each matching service already registered with the manager before the consumer was registered. In this way the consumer will be notified of all "matching" services.
Similarly, when services are unregistered, or when the consumer is unregistered, the serviceLost() method is called for each matching service the consumer can no longer use. It is normally safe to use the service one last time during the serviceLost event, but once this method completes and returns to the ServiceManager, the component should no longer use the service object.
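A consumer following this contract might be sketched as below. The ServiceConsumer and ServiceReference types are stubbed with shapes inferred from the method names mentioned in the text (serviceFound, serviceLost, getServiceObject); the real Oxygen signatures may differ.

```java
// Stubs with assumed shapes, inferred from the method names in the text.
interface ServiceReference { Object getServiceObject(); }

interface ServiceConsumer {
    void serviceFound(ServiceReference ref);
    void serviceLost(ServiceReference ref);
}

// A consumer that holds the service as an ordinary Java reference:
// no "unget" call is required, per the Oxygen model described above.
public class RandomConsumer implements ServiceConsumer {
    private Object service;

    public void serviceFound(ServiceReference ref) {
        service = ref.getServiceObject(); // use directly from now on
    }

    public void serviceLost(ServiceReference ref) {
        service = null; // stop using the object once this returns
    }

    public boolean hasService() { return service != null; }
}
```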


It is possible to get a list of registered services in a service consumer by calling the getServiceReference(s)() methods of ServiceManager. However, this is generally a bad idea: since services can come and go at any time, those consuming a service normally must know when a service is unregistered, or must wait until a service is registered (if none presently is) before continuing. The ServiceConsumer model fits most scenarios better than "manually" acquiring services.

History
The Oxygen project was originally established to produce an implementation of the Open Services Gateway Initiative (OSGi) platform capable of running on the Java 2 Micro Edition Connected Limited Device Configuration (J2ME CLDC). This required some changes to the OSGi platform (the interfaces used) for compatibility. The aim of this portability is to allow J2ME clients to run OSGi bundles without having to recompile or redesign the bundles. The CLDC Java runtime does not support dynamic class loading, and these features of OSGi were not intended to be made available. Since those early days, the authors have identified a niche for a very lightweight OSGi-like service discovery framework. Within the Oxygen project we intend to make available an open-source framework for dynamic service discovery that is:
Lightweight: a smaller footprint and memory allocation compared to the bloated OSGi implementations; back to the thin-client roots of OSGi.

Simple: eradicating the complex and error-prone areas of the OSGi model (i.e. manifest files and service/package dependency resolution).

OSGi compatible (if possible): services written for Oxygen should be able (with the help of tools and libraries) to be simply converted to work in existing OSGi frameworks.

Design choices of Oxygen


Oxygen will not use manifest files to declare dependency information between components. This idea is flawed: Java is an inherently late-bound, flexible language. Problems arise (particularly when systems like serialization are used) when trying to define a fixed set of classes (or packages) that a component will be required to load in its life, since new classes and packages can be introduced after the class was built and can still operate through the interface known to the original class, provided that interface changes only in a "compatible" way.


Oxygen will not use objects to represent a handle to services, and Oxygen will not track service use. Systems like OSGi do this, requiring clients to specifically (with lots of repeated code) acquire and release handles to service objects; the references obtained are also to some object that must be further de-referenced. Oxygen will operate with simple, conventional service references. OSGi uses its model for two reasons: to encourage good housekeeping in service consumers, "tidying up references" at the end of their lifecycle, and as a mechanism to control services being unregistered. In Oxygen, service consumers need not be components: any Java code can acquire a reference to the service manager.

