
GIS and Imagery • Real World Gaming
GeoSAR • NEXTMap USA


Magazine for Surveying, Mapping & GIS Professionals
June 2010
Volume 13, Issue 4
Integrate success into your .
Success has a secret ingredient.
Size is everything. Our new OEMV-1DF is the world's
smallest dual-frequency RTK receiver. That's pretty big
news for engineers. Smaller means it uses less power.
Smaller means it weighs less. Smaller means it can fit
in a lot more places. Smaller means it's easier to hide
your advantage from competitors. And when your
advantage is NovAtel, that's a big deal. To find out
more, visit novatel.com or call you-know-who.
You have something new to hide.

A Magical Mystery
Augmented Reality Tour
Recently I had the opportunity to visit the Location Business Summit in Amsterdam.
During this two-day conference there were some interesting reflections on the development
of location based services. Of particular interest was a presentation by Gary Gale from Yahoo!
Geo Technologies, called 'Taking the hype out of location based services'.
He came up with some interesting thoughts. Not only did he mention that smoke signals
could be regarded as location based services avant la lettre, he also showed that because we are
gathering information all the time, we lose perspective. In his own words: we lose the 'when'
in order to get the 'now'. The history of maps is lost by mapping only the present, which changes
all the time. He also used the term 'Geobabel' to point out how people think they are talking
about the same location when in fact, without realizing it, they are not. The same place
may mean something else to everyone.
In short, with new technologies such as location based services, the concepts of location and
place are redefined. Context is important here. A point of interest or location could be anything,
depending on the context. Humankind's relation with place is complex, and geographers
use psychological theories to understand this relation. Social media in combination with
location will surely pave the way for redefining place, both virtual and physical. A Magical
Mystery Augmented Reality Tour, for instance: Layar created one and I'm excited about it,
even though I'm not a Beatles fan myself.
Enjoy your reading!
Eric van Rees
evanrees@geoinformatics.com
June 2010
GeoInformatics provides coverage, analysis and
commentary with respect to the international surveying,
mapping and GIS industry.
Publisher
Ruud Groothuis
rgroothuis@geoinformatics.com
Editor-in-chief
Eric van Rees
evanrees@geoinformatics.com
Editors
Frank Arts
fartes@geoinformatics.com
Florian Fischer
ffischer@geoinformatics.com
Job van Haaften
jvanhaaften@geoinformatics.com
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com
Remco Takken
rtakken@geoinformatics.com
Joc Triglav
jtriglav@geoinformatics.com
Contributing Writers
Angus W. Stocking
Lawrie Jordan
Karel Sukup
Florian Fischer
Ken Goering
Philip Cheng
Chuck Chaapel
Kevin P. Corbley
Matthew DeMeritt
Account Manager
Wilfred Westerhof
wwesterhof@geoinformatics.com
Subscriptions
GeoInformatics is available on a yearly
subscription rate (8 issues) of € 89.00.
To subscribe, fill in and return the electronic reply
card on our website or contact Janneke Bijleveld at
services@geoinformatics.com
Advertising/Reprints
All enquiries should be submitted to
Ruud Groothuis rgroothuis@geoinformatics.com
World Wide Web
GeoInformatics can be found at:
www.geoinformatics.com
Graphic Design
Sander van der Kolk
svanderkolk@geoinformatics.com
ISSN 1387-0858
Copyright © 2010. GeoInformatics: no material may
be reproduced without written permission.
GeoInformatics is published by
CMedia Productions BV
Postal address:
P.O. Box 231
8300 AE
Emmeloord
The Netherlands
Tel.: +31 (0) 527 619 000
Fax: +31 (0) 527 620 989
E-mail: mailbox@geoinformatics.com
Corporate
Member
Sustaining
Member
Building a Modern GIS
Founded by Romans in 34 BC and with a current population of 92,000,
Cáceres is one of Europe's oldest cities. Recently, a team of three city
planners working with a modest budget was able to implement a world-
class municipal GIS using existing digital cartography and a variety of
existing databases. Many tasks that were slow and tedious are now
automated, freeing professionals for more productive activities.
Content
June 2010
Articles
Building a Modern GIS
For an Ancient City 6
GIS and Imagery
How They Became Pals 10
Moving Forward
Image Data Acquisition and Processing
of Clustered Cameras 14
Real World Gaming with GPS-Mission
Business Perspectives of Location
Based Entertainment 20
NEXTMap USA
A GPS Coordinate for Everything in the United States 26
Pan-sharpening and Geometric Correction
WorldView-2 Satellite 30
A Collaborative Project
The Archaeological Potential for Shipwrecks 42
Making Mapping the Impossible Possible
GeoSAR 44
Interviews
Spatial Technology
For Utilities, Public Safety and Security Solutions 24
The Data Exchange Company
Snowflake Software 36
Translate, Transform, Integrate and Deliver Data
Moving Data with FME 40
President of ERDAS
Joel Campbell 50
Conferences and Meetings
Thriving on Energy of Shared Innovation
2010 ESRI Developer Summit 46
Are We There Yet?
The Location Business Summit 34
Page 6
GIS and Imagery: How They Became Pals
Historically, imagery and GIS have occupied two separate worlds.
Imagery had its own methodologies, its own language, and its own set
of distinct instruments. In the same way, GIS had its own tools,
technicians, and geek speak. Although ESRI added support for
imagery and rasters into its software as early as 1982, everyone on
both sides knew technology had to evolve before GIS and imagery could
converge in a completely unified environment.
Page 10
On the Cover:
GeoSAR P-band DEM and orthorectified radar image highlight intricate
geomorphological and textural details on the Galeras volcano (Colombia)
and adjacent agricultural features on the fertile slopes of the active volcano
and surrounding the city of Pasto. See article on page 44.
Business Perspectives of Location Based
Entertainment
Location-Based Entertainment seems to be coming of age slowly but surely.
Smartphones and reasonable mobile internet fares establish a framework
that opens gaming up to a broad public. The International Mobile Gaming
Award introduced the category of real world games just last year, and
experts anticipate good business prospects for location-based games in
marketing, tourism and education.
GeoSAR
In less than a decade of commercial operations, Fugro EarthData's GeoSAR
system has earned a reputation for mapping the impossible. GeoSAR is a
dual-band airborne interferometric radar system that is capable of rapidly
mapping large areas in any weather conditions. In 2009 Fugro EarthData,
which integrated and operates the system commercially, used GeoSAR to
complete one of the most challenging terrestrial mapping projects the firm
had ever attempted.
Page 44
Calendar 54
Advertisers Index 54
Page 20
Page 44
Building a Modern GIS
For an Ancient City
Founded by Romans in 34 BC and with a current population of 92,000, Cáceres is one of Europe's oldest cities. Recently,
a team of three city planners working with a modest budget was able to implement a world-class municipal GIS using
existing digital cartography and a variety of existing databases. Many tasks that were slow and tedious are now
automated, freeing professionals for more productive activities.
By Angus W. Stocking, L.S.
Cáceres, Spain, is a UNESCO World Heritage
City renowned for its blend of Roman, Islamic,
Jewish, and Christian cultures and medieval
architecture, all of which have left their traces
on the city. Founded by Romans in 34 BC and
with a current population of 92,000, Cáceres
is one of Europe's oldest cities.
But Cáceres is a modern city as well, and its
city servants, like their counterparts around
the world, struggle to serve citizens efficiently.
Recently, a team of three city planners working
with a modest budget was able to implement
a world-class municipal GIS using existing
digital cartography and a variety of existing
databases. "The GIS was quickly adopted by
the public and has become a daily timesaver
for city offices," said GIS Department Director
Luis Antonio Álvarez Llorente.
Since there was no budget for outside consultants,
the city's planning staff had to develop
the GIS on their own. And the databases
and cartography that existed had not been
designed with a GIS in mind.
Álvarez continued, "Everything we had, mapping
and alphanumeric information, was prepared
internally. When the project started in
1999, we had some digital cartography that
was inconveniently formatted, a lot of paper
maps and documentation, and databases in
different formats scattered across several city
departments. Also, we're very busy so we
couldn't assign a lot of staff to this; there
were only two technical staff assigned to the
project permanently, and occasionally we'd
form small, temporary teams for particular
phases."
Accessible via the Internet
But if the project's challenges were big, so were
its goals. Planners wanted to give all city
employees access to the GIS, they wanted it to
incorporate all existing databases (along with
information from utilities, railways, and highway
departments), and they wanted the GIS to
be easily accessible to the public via the
Internet. To accomplish all this, they broke the
project down into phases.
The first phase was to design and organize the
GIS. One early decision was to build the new
system with Bentley software to take advantage
of the staff's familiarity with it. MicroStation,
MicroStation GeoGraphics, and Descartes were
heavily used to assemble the cartographic layers.
"We had a lot of our urban planning information
on paper, so we scanned that for a raster
layer and then compared that to digital mapping
that we were able to import. We adapted and
drafted as needed to create base mapping,
which gave us a high-quality end product,"
explained Faustino Cordero, GIS department
assistant.
The Cáceres team also turned to dozens of outside
sources for cartographic information,
including the National Geographic Institute, the
Geographic Army Service, historic maps on file
at the Cáceres Library, and existing street maps.
Most of these were paper-based and required
digitizing.
Utility Companies
This base mapping was made available to city
staff, and immediately proved useful. The success
of this phase encouraged planners, and
work continued on base layers. Urban and rural
cadastral mapping was imported to aid assessors,
and orthophotos were adapted and tied
to the GIS coordinate scheme.
The next phase involved consolidating alphanumeric
information, on paper and in databases,
in the GIS. Bentley tools were able to work
A sampling of infrastructure maps managed by the GIS of Cáceres (Photo credit: Ayuntamiento de Cáceres)
with the various data formats, and staff was
able to import paper-based info. Once again,
work at this phase was made available as completed
and immediately found eager users.
"Thanks to the versatility of the software, the
available maps and data were easy to consolidate,
and we've seen a big return on our investment,"
noted Álvarez.
With the basic format created and most available
city information included, the GIS planners
turned to outside sources to increase usefulness.
Cáceres was able to reach data-sharing
agreements with all the utility companies that
serve Cáceres, including water, wastewater, gas,
and electrical. Cáceres was also able to get digital
information about the road and rail networks,
which consisted of a total length of
A sampling of infrastructure maps managed by the GIS of Cáceres (Photo credit: Ayuntamiento de Cáceres)
Wireframe 3D model of the old Cáceres city
(Photo credit: Ayuntamiento de Cáceres)
3,000 kilometers of unpaved roads, and has
integrated everything into the GIS.
Seeking the most complete and useful informa-
tion possible, planners continued to add to the
GIS, and found ways to import and reference
historical cartography, livestock paths, public
transportation routes, tourist-oriented street
maps, and other information resources. All city
buildings are identified, with addresses, useful
information like hours of operation, and more
than 15,000 total pictures of buildings. Other
buildings available for search include pharma-
cies, health centers, and schools.
Internet Publication
To get this resource on the Internet, the Cáceres
team used Geo Web Publisher. "Geo Web
Publisher made Internet publication very easy,
because we didn't have to transform or adapt
anything; we could just use it as we created
it," explained Álvarez. But the team did put considerable
work into the web interface. VBA and
JavaScript were used to add functionality like
parcel shading and annotation localizing.
Button bars were also created to make the
interface readily usable by the public and city
employees. In all, 30 VBA modules with a total
of more than 5,000 lines of code were built.
Designers have consistently updated, expanded,
and improved the Cáceres GIS. Álvarez
explained that it's a living thing, currently managing
42,000 archives with more than 50 gigabytes
of data and 50 workstations for city use
distributed throughout the city's departments.
"All the information is centralized and accessible
to all departments," noted Cordero. "That
way, the changes, updates, or improvements
we make are immediately available, not only
for the use of public servants, but for the public
as well. The power and versatility of this tool
is evident from the large volume of data we're
able to manage and make accessible."
Álvarez is effusive when speaking of the benefits
of the GIS. "We have better control of tax
collection and much more ability to answer
planning questions. Our census information is
much more accurate, and we're able to do more
with it. And we can do a lot more for the citizens
of Cáceres; for example, we've easily produced
more than 50,000 street maps, tourist
maps, and public transportation maps," said
Álvarez. He added that many tasks that were
slow and tedious are now automated, freeing
public servants for more productive activities.
The system is also a hit with the public, and
more than 150 Cáceres residents use it each
day.
Cáceres spent 10 years and 1.3 million euros on
the GIS project, when all the staff hours, software,
workstations, and training hours are
taken into account. Several constituencies agree
that it was money well spent: the city can
accomplish vital tasks more quickly and effectively
and take on some chores that were previously
impossible, and residents have a
resource they can turn to again and again for
information.
Angus W. Stocking, L.S. is a licensed land surveyor
who writes about infrastructure projects around
the world. He can be contacted at
angusstocking@gmail.com.
Main face of the street map printed on paper (Photo credit: Ayuntamiento de Cáceres)
I believe in precision.
Leica Geosystems AG
Switzerland
www.leica-geosystems.com
The new Leica ScanStation C10: this high-definition
3D laser scanner for civil engineering and plant
surveying is a fine example of our uncompromising
dedication to your needs. Precision: yet another
reason to trust Leica Geosystems.
Precision is more than an asset when your
reputation is at stake, it's an absolute necessity.
Zero tolerance is the best mindset when others need to rely on
your data. That's why precision comes first at Leica Geosystems.
Our comprehensive spectrum of solutions covers all your measure-
ment needs for surveying, engineering and geospatial applications.
And they are all backed with world-class service and support
that delivers answers to your questions. When it matters most.
When you are in the field. When it has to be right.
You can count on Leica Geosystems to provide a highly precise
solution for every facet of your job.
GIS and Imagery
How They Became Pals
Historically, imagery and GIS have occupied two separate worlds. Imagery had its own methodologies, its own language,
and its own set of distinct instruments. In the same way, GIS had its own tools, technicians, and geek speak.
Although ESRI added support for imagery and rasters into its software as early as 1982, everyone on both sides knew
technology had to evolve before GIS and imagery could converge in a completely unified environment.
By Lawrie Jordan
Moore's Law
Today, thanks in large part to enabling technologies and Moore's law,
GIS and imagery have combined on the desktop. The result is that the
long-imagined symbiosis between imagery and GIS is here. The challenge
is to demonstrate that symbiosis to those who can most benefit from it.
Thankfully, this job is easy. IT is filled with examples of technological
symbiosis. It's not hard, for example, to explain how weather satellite
technology informs meteorological science and, conversely, how meteo-
rological science informs weather satellite technology. The imagery from
sensors complements atmospheric science because it contains valuable
data. That's similar to how GIS and imagery inform each other.
Photographs of the earth are inherently spatial. GIS extracts the spatial
data inherent in the photographs then processes it, analyzes it, and man-
ages it all on the same platform. That's easy to convey to this audience
because it is common sense.
Universally Understood Principle
Users of spatial information all have a common
objective: they all want to produce successful pro-
jects in increasingly shorter time frames. At some
point in the evolution of software, almost every-
body in the software business realized that meet-
ing that objective requires the consolidation of
tasks in a workflow. Complicated processes could
be automated. Moore's law enabled CPUs to perform
a number of concurrent operations without
frying circuits. The software suite was born from
that novel development. The creation of ArcGIS
exemplifies that bundling of functionality. It
couldn't do everything at first, but it did a lot.
GeoEye 1 high resolution satellite imagery over
Queenstown, New Zealand, with local government par-
cel basemap.
Lawrie Jordan
Killing two birds with one stone is an age-old, universally understood prin-
ciple. If dinner can be had with the least expenditure of energy, that con-
serves time and calories for other equally important tasks. Companies and
governments operate the same way on a macro scale. Only in their case,
time and calories represent their goal to always run at optimal efficiency.
So what does this have to do with earth views and cartography? With
imagery and geoprocessing tools in a single interface, GIS technicians no
longer have to open an image-processing package to modify imagery data,
nor do they have to deal with separate licensing. At 9.3.1, ArcGIS combined
imagery and GIS analysis in one integrated environment that immediately
improved workflow. By availing themselves of that merger, organizations
maximized the value of their imagery data. Many other benefits loom on
the horizon with ArcGIS 10.
The next release of ArcGIS includes a new Image Analysis window in the
user interface, which enables quick access to a range of tools that those
who work with imagery typically require. That integration paves a more
direct path to results. Users can also now create catalogs of all the rasters
in their organization as well as define metadata and processing to be per-
formed. Access has been beefed up, as well. Image services open the door
to huge imagery holdings like ArcGIS Online, Bing, and the forthcoming
ArcGIS.com. The surplus of quality imagery data is ever-growing.
On-the-Fly Processing
Moore's law told us one day these two disciplines would marry, and indeed
they have. That is evident in ESRI's on-the-fly processing and dynamic
mosaicking. These entail going back to the original source pixels to ren-
der hundreds of thousands of images that instantly display on the screen.
This is tremendously powerful, and ESRI's use of it is unique in the indus-
try. It means if there's an organization that wants to host a dataset, say
an image mosaic of the world (or any other dataset), it could easily
accommodate tens of thousands or hundreds of thousands of users who
want to view it at the same time. Tiled caches are invaluable for that scale
of image delivery.
Many common business needs are easily met thanks to this performance
gain. Not only do on-the-fly processing and dynamic mosaicking save terabytes
of intermediate file storage, the results also return accurately and instantly.
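One concrete reason tiled caches scale so well is that a map request can be turned into a handful of pre-rendered tiles by nothing more than arithmetic. The minimal sketch below shows the standard Web Mercator tile-index calculation used by common tiling schemes; it is a generic illustration, not a description of ESRI's specific cache format.

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Return the (x, y) index of the Web Mercator map tile containing a WGS84 point."""
    n = 2 ** zoom                                   # tiles along one axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

# Example: the tile covering Queenstown, New Zealand (approx. 45.03 S, 168.66 E) at zoom 12
print(latlon_to_tile(-45.03, 168.66, 12))
```

Because every client viewing the same area asks for the same tile indices, the server renders each tile once and reuses it for any number of simultaneous viewers.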
Mosaic Datasets and Multiple Sensor Models
At ArcGIS 10, ESRI decided to combine GIS and imagery into a single com-
prehensive data model stored within the geodatabase called the Mosaic
Dataset. The enhanced scalability enables massive volumes of imagery to
be quickly and easily cataloged from within ArcGIS Desktop or automated
using the geoprocessing tools. Mosaic datasets not only catalog the data;
they enable definition of extensive metadata and processing to be per-
formed on the imagery. This processing can include simple aspects, such
as clipping and enhancement, to more detailed orthorectification, pan-
sharpening, pixel-based classification, and color correction.
Additionally, Mosaic Datasets can be deployed as image services, making
them quickly accessible to a large number of different users both over
local networks and the Web. The Mosaic Dataset is the implementation of
image serving technology directly into the core of GIS. Soon, Mosaic
GeoEye 1 image of Queenstown Airport, on-the-fly sharpening applied.
Datasets will become the de facto method of managing and using large
collections of imagery and other raster datasets that our users continue
to acquire.
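Since a mosaic dataset can be built either interactively in ArcGIS Desktop or automated with geoprocessing tools, the cataloguing step is easy to script. The sketch below is a minimal example based on the ArcGIS 10 geoprocessing tools for creating a mosaic dataset and loading a folder of rasters into it; the workspace path, dataset name and spatial reference are illustrative assumptions, and constructor details may differ slightly between ArcGIS versions.

```python
import arcpy

gdb = r"C:\data\imagery.gdb"          # existing file geodatabase (illustrative path)
rasters = r"C:\data\source_imagery"   # folder of source rasters (illustrative path)

# Spatial reference for the mosaic dataset; depending on the ArcGIS version this can
# also be supplied as a factory code or a .prj file.
sr = arcpy.SpatialReference("WGS 1984")

# Create an empty mosaic dataset inside the geodatabase
arcpy.CreateMosaicDataset_management(gdb, "Imagery", sr)

# Catalog every raster dataset found under the source folder into the mosaic dataset
arcpy.AddRastersToMosaicDataset_management(gdb + "\\Imagery", "Raster Dataset", rasters)
```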
GIS also handles new higher-resolution, higher-precision data types. In
version ArcGIS 10, a start has been made to integrate rigorous sensor
models into the software. A sensor model is a precise way to get 3D coor-
dinate positions on the ground. Traditionally, simple approaches that merely
approximate where a pixel falls on the ground use low-order mathematical
equations. Sensor models are more sophis-
ticated. A sensor model implementation knows all about the optics of the
system and calculates a precise math model that locates the pixel in three-
dimensional coordinate space. ESRI implements several sensor models
into ArcGIS in full cooperation with all of our partners.
Inevitable Transition
These groundbreaking developments in GIS and imagery are exciting
to watch. Granted, Moore's Law will always create such partnerships,
but that doesn't make it any less gratifying to witness. Anyone inter-
ested in these fields is encouraged to investigate the merger of GIS
and imagery. See what it can do for your organization.
Lawrie Jordan, Director of
Imagery Enterprise Solutions, ESRI.
GeoEye 1 image of Queenstown Airport, on-the-fly terrain hillshade processing
Interactive supervised classification of a DigitalGlobe WorldView-2 8-band image.
Complete Solutions for
Mobile Surveying
WELCOME TO THE REVOLUTION
The Next Leap in Lidar Evolution
Optech Incorporated
300 Interchange Way, Vaughan, ON, Canada L4K 5L8
Tel: +1 905 660 0808   Fax: +1 905 660 0829
www.optech.ca
The Lynx Mobile Mapper defines the state-of-the-art in mobile mapping technology, generating rich survey-grade lidar and image data
from almost any moving vehicle at highway speeds. At the forefront of this technology, Lynx integrates the latest innovation in lidar
sensors with multiple perspective lidar imaging and best-in-class imaging, navigation, product warranty and support. Lynx Mobile Mapper:
the definitive answer to your large-area engineering and survey work.
Join Optech clients and expert users at the 5th International Terrestrial Laser Scanning User Meeting in Prague, Czech Republic, June 8-9, 2010.
For information or registration, visit www.optech.ca/i3dugm
Moving Forward
Image Data Acquisition and
Processing of Clustered Cameras
GEODIS is a European company in the fields of geodesy, photogrammetry and
remote sensing. The following article focuses on how the company is involved
in image data acquisition and processing of clustered cameras. Topics discussed
are development of the digital technology usage, application of clustered cam-
eras data, image processing using automatic aerotriangulation, among others.
The article concludes with a look into the future of digital photogrammetry.
By Karel Sukup
Introduction
Continuously increasing the resolution of com-
mercially-produced large-format digital cam-
eras or standalone medium-format digital
camera backs has brought a number of
changes in technological methods. One of the
application areas of these digital sensors is
in the field of applied photogrammetry and
image interpretation. GEODIS purchased its
first digital camera with a resolution of 6
megapixels about 10 years ago. The company
was excited about its features, image quality
and PC connectivity support, offering astound-
ing image processing options compared to
classic aerial film cameras. The only flaw in
this type of technology was the relatively low
resolution of its sensors. Compared to an RMK
TOP, the camera used by the company at that
time, the area captured in a single digital
image was negligible. However, the digital
camera's flexibility, its ability to capture quali-
ty images, even in rather poor lighting condi-
tions, and the versatility of its use was
remarkable (the camera could be held in
hand, with vertical or horizontal image axis
orientation, could be used in an aircraft or
car). Amazingly this first digital toy cost the
same as a current 39-megapixel digital cam-
era. And this is not the largest resolution
available on the market: there are now 50-
megapixel and 60-megapixel solutions com-
mercially available as standard.
Development of the Digital Technology Usage
The versatility and easy-to-use characteristics
of digital sensors for photogrammetric pur-
poses caused a wide range of camera systems
to appear on the market. The problem of low
individual chip resolution led developers,
through necessity, to combine the chips into
larger units, resulting in a bigger image size.
Today's digital camera image sizes are there-
fore close to the classic large-format film
cameras. Although GEODIS, as a specialized
digital photogrammetry processing company,
was linked to the technologies of Intergraph,
they had to migrate to Vexcel solutions when
facing the decision of which digital camera to
purchase. Sensors from this company were
being developed dynamically and it is worth
noting that efforts at Vexcel have not
dropped. UltraCamD, criticized by many pro-
fessionals for its construction,
instability etc., was relatively
close to GEODIS because the
construction philosophy was
similar to the kit used for build-
ing the company's own camera
systems.
Although GEODIS bought the
first UltraCam back in 2007 and
now have three cameras in total,
they purchased the first 39-
megapixel camera in 2005 and
started experimenting with it,
developing their own solution,
the GbCam digital camera. Their
activities first involved the use
of a single camera but a digital
twin followed in 2006, a three-camera set in
2007 and since 2008 this system has been
used as the five-camera GbCam system (Fig.
1) for capturing vertical and oblique images.
The system is suitable for both aerial and ter-
restrial digital image data acquisition applica-
tions.
Over the years, GEODIS managed to fine-tune
the controlling electronics and software of the
Fig. 1 Five-camera GbCam
Fig. 2 Orientation system of a cluster camera with two strips
captured with opposite flight heading.
system. However, there was also development
in the digital image processing field, with soft-
ware for stitching generally oriented images,
calculating interior and exterior orientation
parameters, dependent and independent ori-
entation of image pairs, triples, and quintu-
ples, right up to bundle adjustment of whole
image sets. The solution included develop-
ment of software for simple viewing and mea-
suring of images in a single-image mode and
the transition to GEODIS own stereo-viewing
and stereoplotting solution this past year.
Several other specialized companies engaged
solely in image capture hardware develop-
ment followed a similar scenario. Through
development of various versions of dual and
quarto systems, the technology reached the
stage recently when four- or five-camera sys-
tems were developed for capturing generally
oriented images, with one camera usually
pointed vertically and the remaining four cam-
eras tiltable as needed.
Application of Clustered Cameras Data
Clustered cameras are developed mainly for the
purpose of acquiring area survey/recon-
naissance images. At the beginning this mainly
involved development for military purposes but
civil applications have since followed. Images
are usually visualized using special software
developed specifically for their processing. This
software enables basic measuring information
within the images such as lengths, widths,
heights, surface areas, point coordinates, etc.
There is relatively little discussion about options
for using these generally oriented images for
further photogrammetric applications such as
mapping, orthophotomap production, genera-
tion of better 3D models based on data textur-
ing and others.
Fig. 3 PixoView Application Workspace
Fig. 4 Options for generating DTM and DSM using clustered cameras
The primary problem in processing of generally
oriented images from clustered cameras is their
correct geo-referencing. Since there are a high
number of image files generated during the
photographic mission, perfect data manage-
ment is needed. Compared to large-format digi-
tal cameras, commonly used medium-format
cameras generate many more images even if
vertical capture only is performed. If there are
five such cameras mounted to the holder, sev-
eral hours of imaging can result in tens or even
hundreds of thousands of images. Proper orga-
nization of this data and simultaneous assign-
ment of appropriate meta-information during
the flight is a relatively difficult task, the suc-
cessful performance of which significantly ben-
efits subsequent data post-processing. If every
image has at least the information on GPS time
and/or basic GPS/INS image orientation infor-
mation assigned, a considerable amount of
effort can be saved later when organizing these
data sets for further production.
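A minimal sketch of that bookkeeping step is shown below: it pairs each image with the nearest GPS/INS navigation record by GPS time, so that every frame carries at least an approximate position and attitude into post-processing. The record layout and field names are assumptions for illustration, not GEODIS's actual log format.

```python
import bisect

def attach_navigation(images, nav_records):
    """Attach the closest GPS/INS record (by time) to each image.

    images      -- list of dicts such as {"file": "IMG_0001.tif", "gps_time": 402132.48}
    nav_records -- list of dicts sorted by "time", e.g.
                   {"time": 402132.50, "X": 612340.1, "Y": 5432100.7, "Z": 1450.3,
                    "omega": 0.4, "phi": -0.2, "kappa": 182.7}
    """
    times = [rec["time"] for rec in nav_records]
    for img in images:
        i = bisect.bisect_left(times, img["gps_time"])
        # consider the two neighbouring records and keep the one closest in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(nav_records)]
        best = min(candidates, key=lambda j: abs(times[j] - img["gps_time"]))
        img["navigation"] = nav_records[best]
    return images
```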
If images are only used for monitoring an area
from several different perspectives, the directly
registered GPS/INS data is usually sufficient to
determine orientation of the images with suffi-
cient accuracy. In fact, with these tasks it is only
necessary to download a set of matching gen-
erally oriented images that see the selected
ground objects from various directions after a
viewpoint is selected on a vertical image or
map. If the directly measured image orientation
elements are merely approximate or determined
with lower accuracy, this often poses no prob-
lem for this type of application.
If more accurate image orientation is required,
there are usually two methods available, in
addition to the more accurate GPS/INS system.
The first method is to make a cluster adjust-
ment based on GPS/INS measurement only
without ground control points supplied, which
considerably increases the relative ties of
images. The second option is to perform full
cluster adjustment by means of classic aerotri-
angulation (AT).
Image Processing Using Automatic
Aerotriangulation
Fig. 5 Example of a color orthophotomap generated using images
acquired with the GbCam camera
When processing oblique imagery using soft-
ware solutions that are currently available,
serious issues occur in the functionality of
these systems when processing non-standard
configurations and orientations. It is usually
necessary therefore to process blocks of
images in several passes so that the existing
software can handle these images. At GEODIS
BRNO, there are three types of automatic
aerotriangulation processing software avail-
able: a solution from Intergraph (ISAT), Inpho
(Match AT) and Vexcel (Ultramap AT with
adjustment in Bingo). For processing oblique
images there were two applications under
test, ISAT and Match AT, and our experiences
in 2009 varied. The company was able to use
both applications for calculation with differ-
ent results, relating mainly to the degree of
oblique image used. The problems the com-
pany encountered were discussed with both
software producers. AT input involved individ-
ual images with interior orientation parame-
ters determined by field calibration while exte-
rior orientation parameter calculations were
carried out mostly using Orient software
developed at TU Vienna (adjustment was
done at the Brno University of Technology)
and later using the Bingo system.
The automatic correlation had difficulties tying
appropriate images together. The software
was more stable if overlapping of vertical
strips was ensured. Oblique images correlat-
ed only if taken in the same direction. Images
from strips captured by cameras oriented in
different directions did not produce correla-
tion and considerable dropouts occurred in
mutual ties of the strips. Later, the triangula-
tion blocks were divided into sub-blocks with
the same camera orientation, which substan-
tially increased the stability of the calcula-
tions. The correlated sub-blocks were again
merged into a single block and the final
adjustment was performed using the least
squares method. The complexity of the mutu-
al position of images in strips with opposite
orientation is illustrated in Fig. 2.
Figure 2 shows that when performing image
capture it is better to set up the flight in such
a way that mutual overlapping of central ver-
tical images is ensured (preferably large). This
is given by the current automatic AT process-
ing development level. Although the overlap
between the strips can be selected as need-
ed, at least 40% overlap proved to be useful.
In urban areas, it is better to ensure at least
50% or 60% overlap due to the related tech-
nologies, e.g. the possibility of performing higher-
quality DSM correlation, while maintaining the
overlap of 60% between the images in a par-
ticular strip.
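The effect of these overlap choices on data volume is simple arithmetic. The sketch below estimates the number of strips and exposures for a rectangular block flown with a single vertical camera; the footprint and block dimensions are illustrative values, and with a five-camera cluster such as GbCam the exposure count is multiplied accordingly.

```python
import math

def block_image_count(block_w, block_l, foot_w, foot_l, side_overlap, forward_overlap):
    """Rough strip and exposure count for a rectangular block.

    block_w, block_l -- block width (across strips) and length (along strips), in metres
    foot_w, foot_l   -- image ground footprint across / along track, in metres
    side_overlap, forward_overlap -- fractions, e.g. 0.40 and 0.60
    """
    strip_spacing = foot_w * (1.0 - side_overlap)   # distance between neighbouring strips
    base = foot_l * (1.0 - forward_overlap)         # distance between consecutive exposures
    n_strips = math.ceil(block_w / strip_spacing) + 1
    n_per_strip = math.ceil(block_l / base) + 1
    return n_strips, n_strips * n_per_strip

# Example: a 5 km x 8 km block, 600 m x 400 m footprint, 40% side and 60% forward overlap
print(block_image_count(5000, 8000, 600, 400, 0.40, 0.60))   # -> (15, 765)
```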
Examples of issues bound to automatic AT processing using current software systems:
- Serious correlation problem in ISAT: correlation sequences are selected chaotically, especially if multiple overlapping exists; often there is no connection achieved.
- ISAT cannot handle correlation of oblique images if they are not oriented in the same direction.
- Solution: per partes ISAT correlation, i.e. standalone correlation for various combinations of strips and cameras with subsequent merging into a single block and final adjustment. It is not possible to determine in advance which combinations will deliver the best result. However, we know for sure that the following camera combinations are required (see Fig. 2): 1+3, 2+3, 3 and 3+4+5. If problems persist, additional special combinations are needed, such as 3 + all images facing south (north, west, east).
- Computing times needed for the ISAT correlation in individual combinations are relatively low (20-35 seconds per image). In total the times range from 45 to 60 seconds per image depending on the number of strip combinations.
- Inpho Match AT correlates all with all, which results in longer correlation times (3.5 minutes per image). If the number of observations is optimized for a single point and the maximum number of points is limited for a single image, the times are lower, comparable to (or even shorter than) times in the ISAT software. In some cases, however, images suffer from an unacceptable decrease in the number of automatically generated points and the optimizing settings need to be re-adjusted, which often leads to higher correlation times again.
Options for Using Clustered Cameras
for Mapping and 3D Measurements
The use of clustered cameras is most fre-
quently discussed in connection with image
acquisition for area documentation purposes,
e.g. for construction, traffic, urban planning,
police, integrated rescue system etc.
However, the oblique images acquired can be
used for mapping too. The procedure suitable
for this purpose is single-image mapping. This
can be applied when obtaining location-spe-
cific details of public areas or performing sim-
ple mapping of buildings and other objects
(see Fig. 3). This kind of mapping can be per-
formed using specialized software, such as
PixoView developed for these applications by
GEODIS BRNO.
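The geometric core of single-image mapping is the intersection of a measured image ray with the terrain. The sketch below shows that step for a horizontal terrain plane, assuming the exterior orientation is given as a projection centre and a camera-to-ground rotation matrix; it is a generic illustration of the geometry rather than PixoView's implementation, and in practice the terrain height would come from a DTM and the intersection would be iterated.

```python
import numpy as np

def single_image_ground_point(xy_image_mm, focal_mm, C, R, terrain_z):
    """Intersect the image ray of one measured point with a horizontal terrain plane.

    xy_image_mm -- (x, y) image coordinates relative to the principal point, in mm
    focal_mm    -- camera focal length, in mm
    C           -- projection centre (X0, Y0, Z0) in ground coordinates, in metres
    R           -- 3x3 rotation matrix from camera frame to ground frame
    terrain_z   -- assumed terrain height of the target point, in metres
    """
    ray_camera = np.array([xy_image_mm[0], xy_image_mm[1], -focal_mm])  # ray in camera frame
    d = R @ ray_camera                                                  # ray in ground frame
    t = (terrain_z - C[2]) / d[2]                                       # scale factor to the plane
    return C[0] + t * d[0], C[1] + t * d[1], terrain_z
```

The accuracy of such points depends directly on the quality of the AT-derived orientations, which is why the article stresses calibration and proper AT adjustment.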
However, using oblique images for stereo-
scopic measurements can be far more inter-
esting. The well-known problem of handling
roof overlaps could also be solved using this
image acquisition method. The stereoscopic
shadow issue, occurring commonly when
using vertical images, could be considerably
eliminated as well. Although the current AT
results are not optimal for use in accurate
mapping, it is merely a matter of better sys-
tem calibration (field conditions are not per-
fect for most types of clustered cameras) and
proper AT adjustment of the entire set of
images to receive accurately geo-referenced
stereo pairs for all directions. For stereo resti-
tution the company has tested the Intergraph
and Inpho systems and our own stereo work-
station. All systems delivered great stereo-
scopic perception with vertical and oblique
images. When using oblique images acquired
in multiple directions, it will be necessary
however, to develop an image manager to
support stereo plotting that will enable
instant replacement of the oblique stereo pair
needed for measuring a situation covered in
one direction.
Options for Using Clustered Cameras
for DTM and DSM Preparation
Current experiences indicate the option of
using clustered cameras for generating DTM
and DSM. For example, Match-T DSM from
Inpho can be used to create a higher-quality
DSM on the assumption that there is at least
60% overlap of images and strips. Such an
overlap results in a high-quality DSM when
digital images are used. Despite this, even
these calculations have to deal with the issue
of hidden image areas or problems with deter-
mining real terrain, especially close to large
objects, such as buildings. Although these
problems have been minimized in recent
years, the use of oblique images still provides
considerably greater options for obtaining cor-
rect correlation of images and calculating DSM
in locations that proved problematic before.
For now, existing software cannot be fully
used for DSM calculations using oblique
images but it is possible to assume that a
combination of vertical and oblique images
will be beneficial for these calculations.
Available information also suggests that Inpho
has been working intensively on this issue,
also using GbCam data. If one takes into
account the option to calculate surface points
on building façades, the company could gen-
erate a high-accuracy surface model, includ-
ing various types of façade details. Usability
of the above methods for processing oblique
images acquired from an aircraft or mobile
mapping system would certainly represent an
excellent opportunity to calculate accurate
surface models of all buildings around com-
munications for example. Fig. 4 provides sam-
ples of DTM and DSM data generated using
GbCam imagery.
Use of Clustered Cameras for
Creating Orthophotomaps
The existing digital rectification technologies
enable the use of oblique mutually overlap-
ping images for creating orthophotomaps. The
modified true orthophotomap creation tech-
nology allows for the efficient patching of
shaded areas of vertical images. This is done
with image information obtained using math-
ematical searches to identify the missing sec-
tion in a suitable oblique image. A similar
method can be applied when performing
automatic building texturing. This is likely to
open a future path to 3D image databases
that will contain all information not only on
the terrain features but also the pixel image
information for all surfaces of the given 3D
object in database systems such as Oracle.
An example of a color orthophotomap pro-
duced using the GbCam system is provided
in Fig. 5 and an example of an automatically
textured building in Fig. 6.
Conclusion
The era of digital photogrammetry will bring
dynamic changes in acquisition and process-
ing of not only classic vertical images but also
oblique images. The software interconnection
of generally oriented images, captured from
an aircraft or ground-based mobile mapping
system, provides opportunities for the gradu-
al development of automated image data pro-
cessing in the sector of geo-informatics,
focused on applications related to image mea-
surement and semantic processing. Generally
oriented images will be stored in 3D databas-
es with the option of further use for various
types of 3D object measurement and surface
texturing. In connection with possible
improvement of image correlation options or
rotating laser scanners, it will be possible to
create extensive 3D databases of selected
areas comprising individual pixels with prop-
er geo-spatial and spectral information.
Karel Sukup, Managing Director and CEO of
Geoinformatics Division of GEODIS BRNO, and
Patrik Meixner, Production Manager of
Geoinformatics Division of GEODIS BRNO
Many thanks to Ing. Eva Paseková,
Marketing & Sales Department
Geoinformatics Division
GEODIS BRNO, spol. s r.o.
Internet: www.geodis.cz
Fig. 6 Example of building automatically
textured using images acquired with the
GbCam camera
© 2010 Spectra Precision. All rights reserved. All other trademarks are property of their respective owners.
Simply Powerful
www.spectraprecision.com/FOCUS30
The latest and greatest in robotic technology from Spectra Precision.
FOCUS 30 ROBOTIC
- StepDrive high-speed motion technology
- LockNGo advanced tracking technology
- Spectra Precision Survey Pro field software
- GeoLock GPS assist technology
- 2", 3" and 5" accuracy models
- Windows CE touchscreen
- 2.4 GHz interference-free radio
- Ultra lightweight at only 5 kg (11 lbs)
Contact your Spectra Precision dealer for a demo today. www.spectraprecision.com/dealers
Real World Gaming with GPS-Mission
Business Perspectives of Location
Based Entertainment
Location-Based Entertainment seems to have slowly but surely come of age. Smartphones and reasonable mobile internet
fares have established a framework to enable a broad public market for gaming. The International Mobile Gaming Award
introduced the category of real world games last year, and experts anticipate good business prospects for location-based
games in marketing, tourism and education. Florian Fischer talked with Georg Broxtermann from Orbster about the
promise and prospects of location-based gaming. Orbster is the location-based entertainment company that developed
the highly successful game GPS-Mission.
By Florian Fischer
From the Pursuit of Coordinates to Mixed-Reality
May 1st 2000 is a memorable date for many geo-cachers. It was the day
when the White House announced it was going to stop degrading the
Global Positioning System accuracy and GPS users received an instant
upgrade of their devices' accuracy. It has been an enabler for the very
popular leisure activity of geo-caching, which today is widespread all over
the world. People use GPS devices to search for hidden treasures, often
among historical or nature-relevant places, that are only described by their
coordinates. It has become a popular representative for a new paradigm
of leisure and entertainment-based activities, characterised by the conver-
gence of mobile information, communication technology and location ser-
vices to link up material space with media-space. They connect space and
entertainment in a way that makes people discover their environment
beyond their ordinary action space, solve problems, compete with others
and learn about spatial phenomena or history. It is often described by
terms such as 'pervasive', 'mixed-reality' or 'augmented-reality' and is
mostly dedicated to location-based entertainment like gaming or story-
telling. In 2010 location-based gaming seems to be a rising star in the
entertainment market.
Linking Material and Media-Space with Geospatial
Technology
Linking physical and virtual space holds the possibility of reclaiming social
and physical aspects of space in a playful way, and creates new and revo-
lutionary forms of spatial experience. Location-based games require sensi-
tivity for spatial contexts and interaction during the course of play which
is established by the application of localisation and mapping technolo-
gies. A starting point of their success has been the recent development
and convergence of mobile internet and geospatial technology. Both the
Microsoft and Google geo-browsing platforms ensure free availability of
maps even on mobile phones. Many communication providers offer fair
mobile internet rates and cell phone producers commonly integrate GPS
chips nowadays. Thus costs for play and provision of location-based games
are reduced, which helps them gain more and more attention in the enter-
tainment and leisure industries.
Up to now a great variety of different games exists, as the Location-Based
Games Database project of the Chair for Computing in the Cultural Sciences
at Bamberg University proves. It contains 135 entries on different games.
While most are prototypes from research institutions, some commercial
projects are listed as well. GPS-Mission (www.gpsmission.com) is one of
these and at the moment one of the most successful in the world.
GPS-Mission: The World is Your Playground
GPS-Mission is a treasure hunt game that offers numerous missions world-
wide with each mission adapted to a specific urban environment. After
having downloaded the GPS-Mission client on a mobile phone, the player
can log-in and start playing. During the game a mobile internet connec-
tion is necessary to re-load maps and update the players position on the
GPS-Mission - mixed-reality treasure hunt
server of GPS-Mission. That is to say, the course of the game is recorded
and can be reviewed later. In addition, other players in the community of
GPS-Mission can follow the game in real-time. After having selected a mis-
sion, the player's mobile phone shows checkpoints he has to reach and
challenges he has to fulfill. Checkpoints are points within walking dis-
tance. When reaching a checkpoint, sometimes a question related to the
place has to be answered. After reaching the last checkpoint the player
has solved the mission.
There is virtual gold everywhere in the world of GPS-Mission. All the gold
a player collects while playing a mission is available for him on his account.
Furthermore he is awarded gold for completing missions and can earn
gold for creating successful missions played by other players. Gold is the
in-game currency and can be used to buy trophies for every mission which
has been completed. The trophies are virtual collectibles similar to the
popular hiking-medals for alpine wanderers. Players can also buy power-
ups that improve the play.
Creating your own Missions
The missions are created by the GPS-Mission community, which is
assumed to be the community of players of the game as well. Thus every
player in the GPS-Mission community is invited to create their own missions
for the community and share their knowledge of interesting places,
challenge other players and make them walk. A mission designer is
therefore provided as an easy-to-use web-based tool to create missions online.
After publishing a mission, it is instantaneously available for all players in
the area. The creator of a mission will be rewarded with 50 Gold for every
user that completes his mission successfully. In addition to managing the
mission, the mission designer utilizes a geo-browser (optionally Bing
Maps, OSM or Google Maps) to create checkpoints, add local riddles,
gold and photo spots. As soon as a newly designed mission is ready to
be played, it can be published online and is visible for everyone in the
community after just a few seconds.
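Viewed from the data side, a mission of this kind is essentially an ordered list of geo-referenced checkpoints with optional riddles, gold and photo spots. The structure below is a purely hypothetical illustration of such a mission; all field names are invented and do not represent Orbster's actual file format or API.

```python
# Hypothetical mission description; all field names are invented for illustration.
mission = {
    "title": "Old Harbour Walk",
    "creator_reward_gold": 50,              # paid to the creator per completed play-through
    "checkpoints": [
        {"lat": 52.3731, "lon": 4.8922,
         "riddle": "In which year was this warehouse built?",
         "answer": "1642",
         "gold": 10},
        {"lat": 52.3744, "lon": 4.8956,
         "photo_spot": True,                # the player is asked to take a photo here
         "gold": 5},
    ],
}

def collectable_gold(m):
    """Total gold a player can pick up by reaching every checkpoint."""
    return sum(cp.get("gold", 0) for cp in m["checkpoints"])

print(collectable_gold(mission))  # -> 15
```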
Quality Assessment
Georg Broxtermann believes that the quality of GPS-Mission largely
depends on the activities in the community. "This is also the reason why we
leave the quality management mainly to the players. However, a tool in
the mission designer checks every mission for its rough playability, and it
is up to the players in the community to review the mission with stars and
comments." According to Broxtermann these players are aged mainly from
14 to 40 years, but sporadically up to 65. He smiles as he admits
that the best mission on GPS-Mission was created by a 66-year-old
teacher from Amsterdam. This might indicate that the most active mem-
bers, in terms of high-quality contribution, are in the older age range, a
trend similarly observed on
OpenStreetMap and other popular plat-
forms for Volunteered Geographic
Information (VGI). In fact Broxtermann
argues that the authors of missions are
driven by a motivation similar to partici-
pating on YouTube. While he wanted to
focus on entertainment as motivation, I
rather believe in a whole bunch of moti-
vations for creating missions, ranging from
entertainment and education to earning
money, and developing a kind of profes-
sionalism in location-based entertainment.
A Multi-branched Business Model
Still the company Orbster wants to earn some money with GPS-Mission.
Georg Broxtermann explained the various branches of their business
model. Basically a premium client can be purchased on Apple's App Store
or Nokia's Ovi Store, and advertisements on the website of GPS-Mission
generate some revenue for Orbster. But Broxtermann emphasizes that their
interest is in partner-events and the re-use of the GPS-Mission platform
for white-label productions and brand-marketing. There are three levels of
branding which can be incorporated in GPS-Mission. Firstly, the branding
of single missions, which ranges from a special design for checkpoints and
a branded story, to the checkpoints that guide the player to points-of-
interest for that particular brand. Secondly, Orbster can build a new game
which is integrated on its platform, and thirdly, create a whole new and
independent game for its customers.
Bright Prospects
While location-based entertainment can be part of a branding-strategy in
the opinion of Orbster, it also has opportunities in the tourism and leisure
industries as well as in education. Location-based games are often
described as new leisure activities combining outdoor activities with gam-
ing experience as they generate a great post-work reward for the players.
As such they have strong connotations with life-style trends, self expres-
sion and fashion issues and compete with personal fashion items and
activities, such as having a coffee with friends rather than watching a
movie. Thus it might be assigned a valuable component in the tourism
and leisure industry in the future rather than in the entertainment domain.
Touristic performances are strongly concerned with play. They are about
taking on new roles and trying different patterns of action. The experi-
ence of difference away from everyday life's spaces is considered the
main driving force for leisure activities and travelling. Location-based
games provide a playful and different experience in everyday spaces, and
they help players transcend urban life by inscribing the game and their
interactions with it. While the game, rather than his personal everyday
habits, directs the player in space, he gains a new perspective on space
and a chance to reflect on daily spatial habits and configurations. At the
same time, he experiments with new tactics of space appropriation while
he moves through space by conducting the games rules, interacting with
other players and executing strategies to succeed in the game. The change
of perspectives is a basic principle to experience difference and gain an
awareness of other concepts of space. Other-awareness means an imagi-
native takeover of other points of perception while one's own points-of-
view are temporarily suspended. Perspective-taking is an important com-
What the player can
do in GPS-Mission
ponent of a successful learning environment. Thus, loca-
tion-based games might be interesting components of
education-focussed leisure activities as well as for school
excursions and study trips. Georg Broxtermann
affirms that education is a fascinating
domain for location-based entertainment: "Teachers can
easily use the mission designer to create attractive missions
for their students. There are already many examples
of that." He also mentions a teacher in Munich,
Bavaria who has even been assigned to the municipal
school authority to create missions for learners.
The Future of Location-Based Gaming
It seems that location-based entertainment has some
very bright prospects to be used for the branding of
products as well as becoming a popular leisure and
tourist activity or even utilized as a learning environ-
ment. The fusion of location-based gaming with local
search and geo-social networking is expanding. The ever
popular mobile applications like Foursquare and Gowalla unite mobile
gaming with local search. They reward their users with virtual commodi-
ties when they check-in at a place. Those commodities can be collected,
changed and dropped again. Furthermore, players are rewarded with spe-
cial badges if they create new places. Checking-in and the maintenance of
virtual places assure the reception of virtual commodities. In the local
search game MyTown, the player, if he owns the virtual place, can even
collect rent from subsequent visitors to the place. Embedding
the community of players seems to be a central topic
of future location-based entertainment and its applica-
tion in the leisure, education and marketing domains.
However, we shall keep our eyes peeled to see what
fusions emerge with other kinds of mobile services.
Florian Fischer, GIS Editor and Research Assistant at the
Austrian Academy of Sciences, Institute for GIScience in
Salzburg, Austria. He has a blog with small essays on the
Geographic Information Society, Locative Media,
Geobrowsers and the like: www.ThePointOfInterest.net.
Links
Orbster: www.orbster.com
GPS-Mission: www.gps-mission.com
Location-Based Games Database: www.kinf.wiai.uni-bam-
berg.de/lbgdb/
Gowalla: www.gowalla.com
Foursquare: www.foursquare.com
MyTown: www.booyah.com
Game display in GPS-Mission
UltraCam technology creates the most advanced aerial mapping products for some of the
world's most sophisticated projects, as well as small, single-craft operations.
To streamline the photogrammetric workflow process, each UltraCam is compatible with
the new UltraMap 2.0 software. This software provides a powerful way to efficiently
manage large volumes of UltraCam imagery, and now includes additional features such as
Monolithic Stitching to significantly improve geometric image accuracy for unstructured
terrain, and Monolithic Radiometry for single CCD radiometric images.
If you are looking for a cost-effective option to upgrade or expand your current hardware,
visit microsoft.com/ultracam/gif.
Serious tools for serious mapping.
© 200 Microsoft Corp. All rights reserved. Microsoft, Vexcel Imaging GmbH, UltraCamXp, and UltraCamL are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.
The data you deliver is only as good
as the technology behind it.
The largest PAN image footprint in the industry,
fewer flight lines required.
UltraCamXp
Same impressive footprint at lower altitude with
a new wide-angle lens.
UltraCamXp Wide Angle
The largest PAN image footprint from any medium-format mapping
camera, ideal for smaller craft.
UltraCamL
Spatial Technology
For Utilities, Public Safety and
Security Solutions
Dr. Horst Harbauer, SG&I Senior Vice President for EMEA at Intergraph, talks about the company's software solutions for the utilities industry and for public safety and security. He also addresses the distinction between GIS and security, and how Intergraph is in a unique position to deliver critical infrastructure protection to different but related markets. Lastly, Harbauer speaks about integrating real-time sensor feeds with maps and how that experience leads towards new innovations.
By the editors
How does Intergraph
support the Smart Grid
needs of the utilities
industry?
The term smart
grid means the availability of
intelligent and flexible grids. More
and more power is being gener-
ated by decentralized power
sources (photovoltaics, wind
power). This leads to higher grid
structure requirements with regard
to load distribution and grid sta-
bility, which can be secured by
intelligent and flexible grids.
Contrary to conventional power plants, photovoltaic plants feed directly into medium- and low-voltage networks, which significantly increases the effort required for network analysis. Wide-area power generation equally broadens the volume of requests for network analysis software solutions (e.g. voltage drop and R&X calculation), not only from the headquarters and the power plant, but also from some of the subsidiaries of the regional supplier and municipal utilities.
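To give a feel for the kind of calculation such network analysis tools automate, here is a minimal, purely illustrative sketch in Python of an approximate feeder voltage drop. The formula is the standard approximation dV = I(R cos phi + X sin phi)L; the parameter values and function name are invented for illustration and are not taken from Intergraph's G/Technology or any other product.

```python
import math

def voltage_drop(current_a, r_ohm_per_km, x_ohm_per_km, length_km, power_factor):
    """Approximate feeder voltage drop: dV = I * (R*cos(phi) + X*sin(phi)) * L."""
    phi = math.acos(power_factor)
    return current_a * (r_ohm_per_km * math.cos(phi) +
                        x_ohm_per_km * math.sin(phi)) * length_km

# Example (assumed values): 120 A over 2.5 km of cable, R = 0.32 ohm/km,
# X = 0.08 ohm/km, power factor 0.95 lagging
drop = voltage_drop(120, 0.32, 0.08, 2.5, 0.95)
print(f"Approximate voltage drop: {drop:.1f} V")  # roughly 99 V, about 1% on a 10 kV feeder
```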
G/Technology is Intergraphs
focused application for utility and communications customers. It was
developed from the foundation of our GeoMedia technology to provide
advanced workflows that meet the data capture, maintenance, analysis
and reporting requirements of utility and communications companies. To
provide maximum openness, flexibility and scalability, both applications
support native Oracle Spatial. Previous versions of G/Technology initially
remained on Oracles relational spatial data model when GeoMedia
upgraded to the object data
model. Today, both G/Technology
and GeoMedia utilise Oracles
object data model. For earlier ver-
sions, customers made use of
Oracle stored procedures to simul-
taneously populate both geome-
try types, allowing both applica-
tions to access common records.
In Europe, when perform-
ing disaster management
simulations, the heavy
security at government
institutions impedes the
exchange of (geo)data.
The real problem seems to
be massive firewalls. In
what way can Intergraph
help government agencies
with this issue?
This is really a
matter of approaching the require-
ment from the correct direction.
Major events (whether natural dis-
asters, acts of terrorism or sport-
ing events of the scale of the
Olympics) are unparalleled in their
operational and organisational
complexity. Their safe and effective
management requires timely and
well informed decision making coupled with the ability to communicate
and coordinate across geographically dispersed locations and a bewilder-
ing range of diverse organisations. These can involve critical responders
and resources from emergency services, national government, municipal
and regional government, the private sector (such as utility operators,
communications companies, transport operators, etc.), the military, securi-
ty services and the voluntary sector, amongst others.
Dr. Horst Harbauer
To achieve this requires a significant degree of coordination, control and
resilience. In the absence of secure, reliable and predictable process and
access control, data sharing invariably becomes reduced to non-sensitive
themes that can be exploited by organisations downloading data from
portals for use in their local projects. The overheads, hinted at in the
question, and the lack of real-time interaction, tend to limit the applica-
tion of GIS to the planning and recovery phases of disaster management.
Intergraph has drawn on its experience as the leading provider of map-
based public safety and security solutions to develop a robust, collabo-
rative, process-driven emergency planning and response suite that fuses
workflow, real-time data integration, secure role-based access and
advanced geospatial functionality. The security and coordination provid-
ed by this platform enables users from different organisations to use
data directly from the source, avoiding the overhead and disconnect
caused by downloading datasets. This platform has already helped man-
age major events successfully, including the recent G8 Summit in L'Aquila,
Italy, and is being deployed for regional civil protection centres across
Europe.
The same question as the one before, but with a focus on security and infrastructure.
its knowledge of the energy and utilities infrastructure
industries to direct its expertise toward security concerns?
And because security in government agencies and energy
companies is not in the same hands as GIS, is there any
contact at all between both divisions and what is
Intergraphs strategy to enter these divisions?
In a perfect world, the GIS/security distinction would not
exist. However, some GIS technologies are harder to integrate with real-
time information and operational business systems. Intergraph is in a
unique position, having experience and products in the three prerequisite
areas of capability necessary to deliver critical infrastructure protection.
Intergraph offers core geospatial technology, as well as integrated security
platforms and industry solutions for infrastructure design and manage-
ment.
Today, Intergraph solutions are providing integrated security for airports,
ports, mass transit systems, rail, national borders and nuclear power
plants. Besides SG&I (Security, Government & Infrastructure), Intergraph Process, Power and Marine (PP&M), which is Intergraph Corporation's second division, is the world's leading provider of enterprise engineering soft-
ware for the design, construction and operation of process and power
plants. Our close relationship with and insight into the energy sector means
we work with clients wishing to protect next generation nuclear, petro-
chemical plants and oil production facilities.
The utilities industry is under considerable pressure to reduce its operating costs. What solutions can Intergraph provide to achieve this goal?
The German Federal Grid Agency has asked the utility industry to reduce its operating costs and at the same time to compensate for the power losses that occur during transmission. To achieve this, many power suppliers focus on status-oriented maintenance. Intergraph's G!NIUS solution provides all the methods and functions needed to collect and document the status of the production equipment. This covers the full workflow, from production equipment data into the grid, via a graphical user interface for result entry in the field, to the recirculation of the collected data back into the office. The allocation of funds is then based on the results of the status-oriented maintenance plan. Furthermore, Intergraph returns the result data to the central SAP ERP system, where cost calculations can be done.
The placement of safety cameras with known positions and pixel-recognition capabilities is rapidly bringing digital camera technology into the spatial domain. What can be expected from Intergraph in the field of cameras and location, pixel recognition and the real-time monitoring of suspicious movements with multiple cameras?
While this is bleeding edge technology for conven-
tional GIS vendors, Intergraph has a long history of working with video,
and the company holds a number of patents in this space. We first
integrated camera feeds with our emergency management environment
over a decade ago and also produce a forensic video enhancement and
analysis product. This experience has enabled us to lead innovation in
a number of directions.
The security and public safety markets have driven the need to inte-
grate real-time sensor feeds with maps to maintain a clear picture of
the situation on the ground and as a way to manage and make sense
of the ballooning and bewildering range of real time data feeds like
intelligent CCTV, radar, access control and UAVs.
The spatial framework also helps the operator understand situations
more quickly by showing the context of an alarm with clear links to
supplementary information that can help them determine whether action
is required. For example, when an alarm is raised by an access control
system or a sensor, the operator is shown its location along with CCTV
that covers the area in question and the location and status of nearby
personnel. Video footage 10 seconds from either side of the alarm can
be accessed by clicking a camera location. Similarly, a patrol can be
dispatched to investigate and CCTV cameras can be panned and
zoomed by simply clicking their icon within the map. Intelligent CCTV
enhances this process by continuously monitoring multiple feeds for
conditions that fall outside acceptable parameters. When an exception
is detected, an operator is shown the video sequence and location of
the event on a map display, providing direct access to all of the sup-
plementary information to assess the alarm and deploy the most effec-
tive response. These capabilities are used extensively in critical infras-
tructure protection and border security.
Intergraph has also just launched GeoMedia Motion Video Analyst to enable wider and more effective exploitation of the terabytes of data produced by the hundreds of thousands of hours of video captured annually by UAV flights. Motion video exploitation combines
video feeds from aerial platforms directly with mapping, enabling live
video to be viewed in its geographic context and in combination with
other data for enhanced situational awareness during operations. It
also unlocks valuable information in archived footage by providing a
simple and reliable means of searching by location as well as date and
time.
For more information, have a look
at www.intergraph.com
NEXTMap USA
A GPS Coordinate for Everything
in the United States
The contiguous United States, comprising more than 8 million km², extends westward from a Maine beach on the Atlantic
Ocean to the state of Washingtons Pacific coastline. With Canada on its northern border and Mexico on the south, the
countrys landforms range from deserts to mountaintops and from grassland prairies to marshland. Each of those 8 million
square kilometers of diverse terrain is now part of NEXTMap USA, a high-resolution 3D digital elevation dataset from
Intermap Technologies. NEXTMap USA, which also includes the island state of Hawaii, is a companion dataset to NEXTMap
Europe, Intermap's collection of 2.4 million km² of digital elevation data for all of Western Europe that was made commer-
cially available in May 2009.
By Ken Goering
Like those in NEXTMap Europe, the datasets
within NEXTMap USA which include digital
surface models, digital terrain models, and
orthorectified radar images are unprece-
dented in their uniform accuracy and have
already been put to use in extraordinarily
diverse markets and industries. County gov-
ernments use the elevation models and
images for projects such as water manage-
ment planning, and U.S. federal government
agencies have leveraged the countrywide uni-
formity of the data, which is of the same accu-
racy specification from coast to coast and
from border to border. In addition, the data
is used in an enormous array of geospatial-
enabled products and services; in the auto-
motive industry alone, NEXTMap data will be
used in 3D in-dash visualization applications,
while Intermaps 3D Roads product, derived
from NEXTMap data, supports energy man-
agement and safety/advanced driver assis-
tance systems (ADAS) applications.
NEXTMap USA is a remarkable database,
said Brian Bullock, Intermap president and
CEO. Every building, road, and even large
rock in the United States now has a GPS
address, if you will, and we know its position
within 2 meters horizontally and 1 meter ver-
tically. Each square kilometer in the database
includes 40,000 individual elevation postings
and 640,000 image pixels, equating to over
600 billion elevation measurements and five
trillion image pixels for the nation.
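As a back-of-the-envelope check of the densities quoted above (derived here from the article's own figures, not from an Intermap specification), 40,000 postings and 640,000 pixels per square kilometre correspond to roughly a 5 m elevation grid and 1.25 m image pixels:

```python
import math

AREA_KM2 = 8_000_000          # contiguous United States, as quoted above
POSTINGS_PER_KM2 = 40_000     # elevation postings per square kilometre
PIXELS_PER_KM2 = 640_000      # image pixels per square kilometre

# A 1 km x 1 km cell divided into N regularly spaced samples implies a
# spacing of 1000 m / sqrt(N) between samples.
posting_spacing_m = 1000 / math.sqrt(POSTINGS_PER_KM2)   # 5.0 m
pixel_size_m = 1000 / math.sqrt(PIXELS_PER_KM2)           # 1.25 m

total_pixels = AREA_KM2 * PIXELS_PER_KM2                   # about 5.1 trillion
print(posting_spacing_m, pixel_size_m, f"{total_pixels:.2e}")
```

Multiplying the pixel density by the roughly 8 million km² of the contiguous United States indeed lands at about five trillion pixels, in line with the figure quoted above.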
The privately funded NEXTMap program devel-
oped from Intermaps recognition that map-
ping resources for first-world countries could
be dramatically improved. In 1998, after ana-
lyzing the United Kingdom, Germany, and the
United States, we concluded that the first
world was not well-mapped, said Bullock.
Rather, what existed was an accumulation of
decades and decades of maps, with varying
degrees of accuracy, all cobbled together.
Britain Serves as Prototype
By 2002, Intermap was ready to initiate its
first whole-country mapping project and chose
Great Britain as a prototype. Intermap collects
its data with interferometric synthetic aper-
ture radar (IFSAR) mounted on an aircraft fleet
which includes Learjets and King Airs that col-
lect data in swaths up to 10 kilometers wide.
The method results in digital elevation
databases with sub-meter vertical accuracy.
One particular advantage of IFSAR is the abil-
ity to collect data in cloudy or dark condi-
tions, which allows the jets to fly without wor-
rying about cloud belts or overcast days.
England and Wales were completed in 2002,
and Scotland was added to NEXTMap Britain
in 2003. We were able to meet the technical specifications and also prove the business model, Bullock said. The big challenge was to scale that up 50 times and significantly reduce the costs.

This is a NEXTMap USA colorized shaded-relief digital terrain model (DTM) of the Grand Canyon, which is located in northern Arizona in the southwest United States. The canyon is 446 km long and varies in width from 8 km to 29 km. Grand Canyon National Park was one of the first U.S. national parks; the Colorado River began carving the canyon at least 17 million years ago.
Bullock said that Intermap wanted to devel-
op a digital database for the United States
that was much more accurate than what was
available at the time. It took the U.S. gov-
ernment 60 years and $2 billion to map the
United States the first time, and we were set-
ting out to do it at a thousand times more
density, and at least ten times more accuracy,
and we were going to do it in four or five
years with private funding, he said.
NEXTMap USA Begins with California
Based on market demand, NEXTMap USA
began with remapping the state of California.
However, remapping this single state was a
significantly larger project than NEXTMap
Britain had been: at nearly 424,000 km²,
California is almost twice the size of England,
Wales, and Scotland combined.
Intermap developed a 150-page project plan
that guided the company through this
along the southern and northern borders of
the United States for the mutual benefit of the
North American governments to manage bor-
der and security issues.
With the northern and southern borders com-
pleted, the rest of the United States was com-
pleted with maximum efficiency as dictated
by cooperative weather patterns and the sea-
sons. Coordinating the flights, which could
change on a moments notice depending on
extreme weather, took a huge effort from
Intermap personnel. There were times, espe-
cially during the winter, when we couldnt fly
anywhere in the country, said Ivan Maddox,
Intermap director of data acquisition and
planning.
Coordinating governmental clearance for the
NEXTMap USA flights was, compared to data
collection for NEXTMap Europe, relatively
straightforward: there is only one civil air
authority for the country, instead of different
agencies for each of the European countries.
Still, the flight planning had to be thorough.
Each flight had a standard 12-page briefing
that included the precise times of every sin-
gle turn, said Maddox.
For NEXTMap USA, Intermap aircraft flew a
total of 2,530 sorties, equating to 10,324
hours of airtime or a total of nearly five years
working aloft.
Improving Efficiencies
Throughout data collection operations for
NEXTMap USA and NEXTMap Europe,
Intermap was taking significant steps forward
in both its technology and methodology.
When data for NEXTMap Britain was collect-
ed, the aircraft flew in lines of only 200 km
in length. To maintain the absolutely straight
lines needed for accurate data collection, the
pilots must continuously adjust the aircraft
heading during the flight because of chang-
ing winds aloft which also reorients the
antennae mounted on the jets and changes
the look angle of the antennae. The radar
would have to be taken offline so that it could
be manually reoriented to correct the look
angle, and the aircraft would have to make a
turn in order to start collecting data where it
had left off before. During those periods, the
radar wasnt collecting data, but the aircraft
was still using fuel and time both of which
are expensive resources.
Through intense research and testing, the
companys engineers developed a method of
automatically reorienting the IFSAR antennae
pedestal to account for changing wind direc-
tions while continuing to collect data. This
advancement allowed the Learjets to fly ultra-long lines: 1,200 km flight lines that were restricted to that length only by the fuel
capacity of the aircraft. By the end, said
unprecedented project. The plan addressed,
in part, ways in which to ensure that the data
was collected as accurately as possible.
Intermaps aircraft collect data by flying abso-
lutely straight lines, and subsequently control
the data with reflective ground control points
(GCPs) placed by field staff using GPS coordi-
nates and, for a state the size of California,
GCP placement was no easy task. However,
as massive as California is, it's only the second-largest state in the contiguous United States (Texas is nearly 700,000 km²):
Intermap was definitely headed into new ter-
ritory with NEXTMap USA.
Data collection for California was completed
in September 2005. The early sales success-
es of the dataset including use for flood-
plain mapping, high-speed rail line planning,
and water resource planning, among many
other projects in the state convinced
Intermap to continue the initiative of remap-
ping the entire United States.
The expansion began with the states of
Mississippi and Florida in the southeastern
United States. Next, Intermap collected data
AccuTerra by Intermap is one of the many applications enabled by NEXTMap data. The application, available
for Apples iPhone as well as dedicated GPS devices, allows users to plan, record, and share their outdoor
recreational experiences, like hiking and skiing.
agement and application interoperability.
Instead of storing and managing large
datasets locally, many users now prefer cost-
effective Internet-hosted solutions that are
compatible with both their existing appli-
cation environment and data access
requirements.
In response, Intermap's Web services portal, called TerrainOnDemand, is an Open Geospatial Consortium (OGC) data-as-a-service platform that natively supports the acquisition, analysis, and delivery of the company's NEXTMap data.
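Intermap does not document the TerrainOnDemand interfaces in this article, so the following is only a generic illustration of what an OGC-style elevation request looks like. The endpoint URL, coverage name and bounding box are hypothetical placeholders; only the parameter names follow the WCS 1.0.0 standard.

```python
from urllib.parse import urlencode

# Hypothetical endpoint; the real TerrainOnDemand URL and layer names are not given here.
ENDPOINT = "https://example.com/terrain/wcs"

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "nextmap_dtm",              # hypothetical coverage name
    "crs": "EPSG:4326",
    "bbox": "-105.3,39.6,-105.1,39.8",      # lon/lat window near Morrison, Colorado
    "width": 1024,
    "height": 1024,
    "format": "GeoTIFF",
}

print(f"{ENDPOINT}?{urlencode(params)}")     # URL a WCS 1.0.0 client would issue
```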
Automotive Applications
Abound
NEXTMap data is being evaluated
extensively in the automotive indus-
try: Intermaps 3D Roads product is an accu-
rate and homogeneous geometric representa-
tion of all roads in a country, based on
NEXTMap data. Key vehicle energy manage-
ment applications enabled by 3D Roads
include Eco-routing, which helps plan more
fuel-efficient routes (and reducing carbon
emissions), and Electric Vehicle Range
Prediction, which accurately informs electric
or hybrid electric vehicle drivers how far they
can proceed on their current charge. Also in
the automotive sphere, NEXTMap data is
enabling applications such as Predictive Front
Lighting, which automatically adjusts a vehi-
cles headlights to illuminate curves in the
road, and Curve Speed Warning, which alerts
a driver if the vehicle is traveling at an unsafe
speed for an approaching curve.
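To see why the road geometry in such data matters for energy management, here is a simplified physics sketch (not Intermap's 3D Roads algorithm; the vehicle parameters are assumed values) comparing a flat kilometre of road with the same kilometre climbing a 5% grade:

```python
def segment_energy_wh(distance_m, elevation_gain_m, mass_kg=1600,
                      crr=0.01, cda=0.6, speed_ms=25.0, air_density=1.2):
    """Rough energy demand for one road segment: rolling + aerodynamic + climbing."""
    g = 9.81
    rolling = crr * mass_kg * g * distance_m
    aero = 0.5 * air_density * cda * speed_ms**2 * distance_m
    climb = mass_kg * g * max(elevation_gain_m, 0.0)   # ignore regeneration downhill
    return (rolling + aero + climb) / 3600.0           # joules -> watt-hours

# Flat kilometre vs. the same kilometre climbing 50 m (a 5% grade)
print(segment_energy_wh(1000, 0))    # roughly 106 Wh
print(segment_energy_wh(1000, 50))   # roughly 324 Wh
```

Under these assumptions the climbing kilometre costs roughly three times the energy of the flat one, which is exactly the kind of difference Eco-routing and range prediction need to account for.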
Recreational Uses As Well
NEXTMap USA data is the foundation for
Intermaps AccuTerra product, which is a map
database for smartphones and dedicated GPS
units used by outdoor enthusiasts to plan,
record, and share their hiking, skiing, and
other outdoor recreational pursuits. Earlier
this year, The New York Times used NEXTMap
data to create highly detailed and interactive
maps of the Winter Olympic venues in British
Columbia, Canada, for its Web site.
Ken Goering, Senior Writer at
Intermap Technologies
For more information on NEXTMap USA and
NEXTMap Europe, visit www.intermap.com.
Maddox, we had collected an area about four
times the size of California in the same
amount of time it took to collect that state.
Collection of the data for NEXTMap USA was
completed on March 16, 2009, six percent under budget and nine months ahead of schedule. The data
was continuously being processed
and verified in several of
Intermaps offices around the
world, necessitating tremendous
upgrades in computing power
and storage capabilities, as
well as significant additions
to staff.
NEXTMap USA required
1,300 GCPs, each placed by
an Intermap employee who
had to ask landowner per-
mission prior to its place-
ment. The field staff would
regularly drive up to 25,000
miles in one month. For
NEXTMap USA, to initially place
and then return to pick up the
reflectors, our GCP crew drove the
equivalent of two return trips to the
moon, said Maddox. A total of 160 field
staff from Intermap worked on the data
collection phase of NEXTMap USA. Various
project teams, including the GCP crews,
spent a total of 24,463 days (67.5 work-
ing years) in the field.
Perhaps most stunning of all of the numbers
regarding the NEXTMap program is: two. For
a significant length of time, data collection
(and all of the operations that occurred to
support it) and processing was taking place
on the two continents of America and Europe
simultaneously so that NEXTMap USA and
NEXTMap Europe could both be completed as
quickly as possible. The end result: more than
10 million square kilometers of datasets pro-
viding uniformly accurate coverage for the
contiguous United States and Western Europe.
Putting NEXTMap USA to Work
While Intermap continues to collect and pro-
cess data under its NEXTMap program around
the world, the company has also transformed
itself from a data collection and processing
entity into one that is creating geospatial
products and services based on the NEXTMap
database and driven by the varying needs of
its customers worldwide.
Beyond traditional GIS-based uses for digital
elevation data and images, NEXTMap is also
used in a wide variety of geospatial-enabled
products and services. This year, Intermap is
introducing an online risk assessment portal
with which insurance companies can accu-
rately gauge their property portfolios risk of
flood damage; the accuracy of NEXTMap data
will allow that to happen at the level of a spe-
cific property address.
The company is also launching an online terrain profile application for microwave link planning, which allows
telecommunications companies in the plan-
ning phase of building or extending a network
to ensure that their transmission towers will
have a clear line of sight without expensive
field verifications. The application has exten-
sive benefits to industries that use transmis-
sion lines of any type, such as water, and oil
and natural gas.
On-demand Data Delivery
The quality, resolution, size, and complexity
of geospatial data is increasing exponentially,
driving the need for more effective data man-
This is a digital surface model (DSM) of Germany
from NEXTMap Europe, which reflects the whole-
country mapping concept underlying the
NEXTMap program. Intermap is leveraging that
concept to develop a number of geospatial-enabled
products and services using NEXTMap data, includ-
ing automotive applications that help increase
vehicle fuel efficiency and reduce carbon emis-
sions.
Pan-sharpening and Geometric Correction
The successful operation of DigitalGlobe WorldView-2 has created another milestone for high-resolution satellites.
The high-resolution panchromatic sensor, the four previously unavailable multispectral bands at 1.8 meter resolution and
WorldView-2s advanced geopositional capability have provided a range of benefits to different applications.
By Philip Cheng and Chuck Chaapel
On October 8, 2009, WorldView-2 joined its sister satellites,
WorldView-1 and QuickBird, in orbit. WorldView-2 is a remote-sensing
satellite principally used to capture high-resolution images of the earth.
The images provided by the satellite can be used for applications such
as mapping, land planning, disaster relief, exploration, defense and
intelligence, visualization and simulation of environments, and classifi-
cation.
WorldView-2 was designed and developed by Ball Aerospace & Technologies Corp., US, and is operated by DigitalGlobe Corporation (DigitalGlobe). The satellite can swing rapidly from one target to another, allowing broad imaging of many targets. It is built on the Ball Commercial Platform (BCP) 5000 spacecraft bus and was launched on a Boeing Delta II rocket on 8 October 2009. WorldView-2 operates at an altitude of 770 km with an inclination of 97.2 degrees and an orbital period of about 100 minutes. WorldView-2 is the third satellite in orbit in DigitalGlobe's constellation, joining its forerunners WorldView-1 (launched in 2007) and QuickBird (launched in 2001). The satellite has been designed to have a lifespan of 7.5 years.
WorldView-2s large-area collection capabilities and rapid retargeting
are two important features of the satellite. Enabled by the combination
of the satellites 770km orbiting altitude, its state-of-the-art Control
Moment Gyroscopes (CMGs) and bi-directional push-broom sensors,
WorldView-2s enhanced agility and bi-directional scanning allows for
the collection of over 10,000 sq km in a single overhead pass, plus effi-
cient in-track stereo collections of over 5,000 sq km. WorldView-2s
advanced geopositional technology provides significant improvements
in accuracy. The accuracy specification has been tightened to 6.5m CE90
directly right off the satellite, meaning no processing, no elevation
model and no ground control, and measured accuracy is expected to
be approximately 4m CE90.
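For readers who think in RMS terms, a rough rule-of-thumb conversion is sketched below; it assumes a circular normal error distribution with equal standard deviations in X and Y, which is an assumption of this illustration rather than part of the DigitalGlobe specification.

```python
# CE90 is the radius containing 90% of horizontal errors. For a circular
# normal distribution with sigma_x = sigma_y = sigma, CE90 is about 2.146 * sigma.
CE90_TO_SIGMA = 1 / 2.146

for ce90 in (6.5, 4.0):
    sigma = ce90 * CE90_TO_SIGMA
    print(f"CE90 {ce90:.1f} m  ->  roughly {sigma:.1f} m one-sigma per-axis error")
```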
WorldView-2 panchromatic resolution is 46cm and multispectral resolu-
tion is 1.8m. Distribution and use of imagery better than 0.50m GSD
pan and 2.0m GSD multispectral is subject to prior approval by the U.S.
Government. As the first high-resolution commercial satellite to provide
eight spectral bands, WorldView-2 offers imagery with a high degree of
detail, unlocking a finer level of analytical discernment that enables
improved decision-making. In addition to industry-standard blue, green,
red and near-infrared, WorldView-2 includes four previously unavailable
bands, collected at 1.8m resolution: coastal blue, yellow, red edge and
near-infrared 2. These bands offer a range of benefits to analysts, who
30
Ar t i cl e
June 2010
[a]
[b]
WorldView-2 Satellite
will be able to identify a broader range of classification, (e.g. more vari-
eties of vegetation or water-penetrated objects), to extract more fea-
tures (e.g. cotton-based camouflage from natural ground cover), to view
a truer representation of colors that match natural human vision, and
to track coastal changes and infractions.
This article will examine different aspects of the WorldView-2 satellite image data.
Firstly, we will test pan-sharpening using WorldView-2 panchromatic
and multispectral data. Secondly, the geometric correction method and
accuracy of the WorldView-2 data will be examined. Given that the
WorldView-2 is equipped with state-of-the-art geo-location accuracy, it
would be useful to find out the geometric model accuracy of the
WorldView-2 data with and without ground control points (GCPs). Lastly,
we will test the geometric correction of WorldView-2 data using Google
Earth as a source of GCPs.
WorldView-2 Data
Similar to the QuickBird and WorldView-1 satellite data, WorldView-2
data is distributed in five different levels, i.e., Basic 1B, Basic Stereo
Pairs, Standard 2A, Ortho-Ready Standard (OR2A), and Orthorectified.
For custom orthorectification the Standard 2A and Orthorectified prod-
ucts are not recommended. Standard 2A product is not recommended
because of the coarse DEM correction already applied to the image
data.
Basic Imagery products are the least processed of the WorldView-2
imagery products. Each strip in a Basic Imagery order is processed indi-
vidually and therefore, multi-strip Basic Imagery products are not
mosaicked. Basic Imagery products are radiometrically corrected and
sensor corrected, but not projected to a plane using a map projection
or datum. The sensor correction blends all pixels from all detectors into
the synthetic array to form a single image. The resulting GSD varies
over the entire product because the attitude and ephemeris slowly
change during the imaging process. Basic Stereo Pairs are supplied as
two full scenes with overlap, designed for the creation of digital eleva-
tion models (DEMs) and derived GCPs.
OR2A has no topographic relief applied, making it suitable for custom
orthorectification. OR2A is projected to an average elevation, either cal-
culated from a terrain elevation model or supplied by the customer. It
can be ordered from a minimum of 25 km² from the library, or from 64 km² for new tasking.
For this article three sets of WorldView-2 OR2A data were obtained from
DigitalGlobe. The data include Morrison and Phoenix, USA and Beijing,
China. OR2A products are recommended for geometric correction
because the panchromatic and multispectral data are resampled to
exactly the same geographic extents; hence, it is possible to perform
pan-sharpening of the data before geometric correction if a pan-sharp-
ened orthorectified image is desired. This method works for most areas
with gentle terrain. Performing pan-sharpening after geometric correc-
tion of the panchromatic and multispectral data separately often requires dealing with small misalignments between orthorectified panchromatic and multispectral data, due to the accuracy of the GCPs and DEM used in the orthorectification process.
Pan-sharpening
The availability of a WorldView-2 0.5m panchromatic band, in conjunc-
tion with the 2m multispectral bands, provides the opportunity to cre-
ate a 0.5m multispectral pan-sharpened image by fusing these images.
Based on the thorough study and analysis of existing pan-sharpening
algorithms and their fusion effects, an automatic pan-sharpening algo-
rithm has been developed by Dr. Yun Zhang at the University of New
Brunswick, in New Brunswick, Canada. This technique solved the two major problems in pan-sharpening: color distortion and operator dependency. A method based on least squares was employed for a best
approximation of the grey level value relationship between the original
multispectral, panchromatic, and the pan-sharpened image bands for a
best color representation. A statistical approach was applied to the pan-
sharpening process for standardizing and automating the pan-sharpen-
ing process. This new algorithm is commercially available within the
PCI Geomatics software.
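The UNB/PCI algorithm itself is proprietary, so the snippet below is only a toy illustration of the least-squares idea described above: fit a synthetic panchromatic band from the multispectral bands, then use the ratio between the real and synthetic panchromatic images to inject spatial detail into every band. The array shapes and test data are invented.

```python
import numpy as np

def pansharpen_ls(ms, pan):
    """Toy least-squares pan-sharpening.

    ms  : (bands, H, W) multispectral array already resampled to the pan grid
    pan : (H, W) panchromatic array
    """
    bands, h, w = ms.shape
    A = ms.reshape(bands, -1).T                      # one row per pixel
    w_ls, *_ = np.linalg.lstsq(A, pan.ravel(), rcond=None)
    synthetic_pan = (A @ w_ls).reshape(h, w)         # best MS approximation of pan
    ratio = pan / np.maximum(synthetic_pan, 1e-6)    # spatial detail to inject
    return ms * ratio                                # scale every band by the ratio

# Tiny random example standing in for co-registered WorldView-2 products
rng = np.random.default_rng(0)
ms = rng.uniform(50, 200, size=(8, 64, 64))
pan = ms.mean(axis=0) + rng.normal(0, 2, size=(64, 64))
print(pansharpen_ls(ms, pan).shape)   # (8, 64, 64)
```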
In figures 1a, 1b and 1c, examples of the WorldView-2 panchromatic,
multispectral and pan-sharpened images of Phoenix, USA, are provided. Figures 2a, 2b and 2c show examples of the WorldView-2 panchro-
matic, multispectral and pan-sharpened images of Beijing, China.
Figure 1a: Panchromatic image of Phoenix, USA
Figure 1b: Multispectral image of Phoenix, USA
Figure 1c: Pan-sharpened image of Phoenix, USA
Geometric Correction Method and Software
In order to leverage the WorldView-2 images for applications such as
GIS, it is necessary to orthorectify the imagery. In order to perform the
orthorectification process, a geometric model, GCPs and DEMs are
required. The Rational Polynomial Coefficient (RPC) model has been the
most popular method in orthorectifying high-resolution images because
it allows the user to correct an image using no GCP or a few GCPs.
More details about the RPC model can be found in the paper by Grodecki and Dial, Block Adjustment of High-Resolution Satellite Images Described by Rational Functions (PE&RS, January 2003).
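The zero-order adjustment used in the tests below amounts to estimating a constant column/row offset from one or more GCPs and applying it to the vendor RPC projection. The sketch below illustrates the idea; rpc_project is a hypothetical stand-in, since a real RPC evaluation uses the ratio-of-cubic-polynomials coefficients delivered with the image.

```python
def rpc_project(lon, lat, height):
    """Placeholder for the vendor RPC model: ground coordinates -> image (col, row).

    A real implementation evaluates two ratios of cubic polynomials using the
    coefficients delivered with the image; a dummy affine stands in here.
    """
    return 2000.0 * (lon + 112.0), 2000.0 * (33.5 - lat)

def zero_order_adjustment(gcp_ground, gcp_image):
    """Estimate a constant (col, row) offset from a single GCP."""
    pred_col, pred_row = rpc_project(*gcp_ground)
    meas_col, meas_row = gcp_image
    return meas_col - pred_col, meas_row - pred_row

def project_adjusted(lon, lat, height, offset):
    col, row = rpc_project(lon, lat, height)
    return col + offset[0], row + offset[1]

# One accurate GCP is enough to remove most of the bias, as reported below.
offset = zero_order_adjustment((-111.95, 33.45, 350.0), (101.2, 99.4))
print(project_adjusted(-111.90, 33.40, 340.0, offset))
```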
For the purposes of testing, the latest version of PCI Geomatics
OrthoEngine software was used. This software supports reading of the
raw data, manual or automatic GCP/tie point (TP) collection, geometric
modeling of different satellites using Toutins rigorous model or the RPC
model, automatic DEM generation and editing, orthorectification, and
either manual or automatic mosaicking. OrthoEngines RPC model is
based on the block adjustment method developed by Grodecki and
Dial and was certified by Space Imaging
(http://www.pcigeomatics.com/support_center/tech_papers/rpc_pci_cert.pdf).
Morrison Test Results using Survey Points
A total of 13 surveyed independent check points (ICPs) with sub-meter accu-
racy were collected from six OR2A datasets. A zero order polynomial RPC
adjustment was used. The ICP root mean square (RMS) errors were 2.6m
in X and 1.3m in Y with maximum errors of 5.7m in X and 3.1m in Y.
When one GCP was collected from each image, the ICP RMS errors were
0.7m in X and 1.0m in Y with maximum errors of 1.4m in X and 1.4m in
Y. Therefore, it is possible to achieve within 1m RMS accuracy with only
one accurate GCP per image using the RPC method. Figure 3 shows an
orthorectified image of the Morrison dataset overlaid with Google Earth.
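The RMS and maximum errors reported here follow the usual per-axis definitions; a small sketch with made-up residuals (not the actual Morrison check-point measurements) shows the computation:

```python
import math

def rms(values):
    return math.sqrt(sum(v * v for v in values) / len(values))

# Hypothetical per-axis residuals (metres) at independent check points
dx = [1.8, -2.1, 3.4, -0.9, 2.6]
dy = [0.7, -1.2, 1.5, -0.4, 1.1]

print(f"RMS X = {rms(dx):.1f} m, max |X| = {max(abs(v) for v in dx):.1f} m")
print(f"RMS Y = {rms(dy):.1f} m, max |Y| = {max(abs(v) for v in dy):.1f} m")
```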
Phoenix Test Results using Google Earth
In recent years, the launch of Google Earth has provided users with ref-
erence imagery which can be used as a source of GCPs anywhere in the
world. For most cities high-resolution data such as GeoEye, QuickBird or
airphotos are available. By checking the Google Earth imagery with
known survey points, it was found that the accuracy of the Google Earth
imagery is approximately within 2m in X, Y and Z directions in most
cities in North America. Accuracy outside of North America has not been
checked at this time. To test using Google Earth imagery as a source of
GCPs, the Phoenix dataset was used. A total of eight Phoenix WorldView-2 OR2A datasets were used. Forty ICPs were collected from the Google Earth
imagery and the RMS errors were 0.9m in X and 0.7m in Y with maxi-
mum errors of 1.8m in X and 1.6m in Y. When using one GCP per image,
the ICP RMS errors are 1.2m in X and 0.7m in Y with maximum errors of
2.2m in X and 1.6m in Y. Figure 4 shows the pan-sharpened orthorecti-
fied Phoenix image overlaid with Google Earth. Therefore, it can be con-
cluded that it is possible to use Google Earth as reference imagery to
collect GCPs for near-nadir acquisition angle imagery. For off-nadir acqui-
sition angle imagery, more accurate GCPs should be used.
Figure 2a: Panchromatic image of Beijing, China
Figure 2b: Multispectral image of Beijing, China
Figure 3: Pan-sharpened orthorectified Morrison
image overlaid with Google Earth
Figure 4: Pan-sharpened Phoenix image overlaid
with Google Earth
Figure 5: Beijing pan-sharpened ortho image over-
laid with Google Earth
Beijing Test Results using Google Earth
A similar test was performed on an OR2A data set of Beijing, China.
Five points were collected using Google Earth imagery as the reference
image. The ICP RMS errors were 2.5m in X and 9.1m in Y with maxi-
mum errors of 3.7m in X and 8.9m in Y. When one GCP was collected
from the imagery, the ICP RMS errors were 2.9m in X and 1.2m in Y
with maximum errors of 3.6m in X and 1.9m in Y. As previously men-
tioned the Google Earth imagery may not be very accurate outside of
North America; however, it is still a useful tool if one just intends to
update an area of Google Earth. Figure 5 shows the pan-sharpened
Beijing image overlaid with Google Earth.
Conclusions
This article examines different aspects of the WorldView-2 data. Pan-
sharpening of WorldView-2 data can be performed by using OR2A
panchromatic and multispectral products before geometric correction.
The RPC model with zero order polynomial adjustment can be used as
the geometric model to orthorectify WorldView-2 data. Similar to
WorldView-1 data, it is possible to achieve RPC model accuracy within
1m RMS with a minimum of one accurate GCP for WorldView-2 data.
For areas without accurate GCPs, Google Earth can be used as a source
of GCPs.
Dr. Philip Cheng (cheng@pcigeomatics.com) is a Senior Scientist at PCI Geomatics.
Mr. Chuck Chaapel (cchaapel@digitalglobe.com) is a Senior Geospatial Engineer at DigitalGlobe.
Figure 2c: Pan-sharpened image of Beijing, China
Location based services have come a long way. Part of their success has to do with technology, part with data providers and part with companies that use location as a way of displaying data. In addition, that data is free for everyone to use wherever they want. But what is the next step? Who will lead the way in location based systems and decide what others will do? What are the challenges ahead and how can they be tackled? What lessons are there to be learned from geospatial parties that deal with location every day? These questions and more were addressed during the Location Business Summit in Amsterdam, April 28-29.
As was to be expected, this was not so much a technological conference, but one where different groups of people met, discussing their thoughts and learning from each other. Familiar parties such as Google, Yahoo, Layar, OpenStreetMap and TeleAtlas were present, but also marketing agencies, telecom companies and major hardware and software companies like Microsoft and Dell Computers.
Where is the Money?
The primary questions of the conference were addressed by David Gordon,
Director of Strategic Planning at Intel. One of the main questions was "Where is the money?", meaning: how can money be made with location based services, in other words with advertising? This question came up during almost every presentation. It's easy to see why: with Google and Nokia offering free map services on mobile devices, mobile system providers are asking themselves how to respond to this move and how to make money with mobile and location-enabled advertising. Considering the diver-
sity of players in this market and the fact that the sales of GPS smart phones
are still increasing, all parties are eager to take their part of the cake.
Googles Geospatial Technologist Ed Parsons followed Gordons short open-
ing presentation with a talk that focused on data rather than the services
around the data. Parsons argued that without context, data itself is irrele-
vant, because place equals points of interest and people. He made this
clear with an example that illustrated how the location where information is
shown is just as important as the information itself. Context determines if a
message comes through. This message was repeated in other presenta-
tions: everybody seemed to agree that theres a need to personalize loca-
tion based information for the user. The question is how and by what means.
About personalizing location based information, Parsons argued that to be
able to personalize content to the individual user, the service should have
information about the user so it can give better search results. Google is
already doing this, and some speakers agreed that Google is in the drivers
seat in the location business market. Everyone was eager to hear Googles
presentation during the second conference day on mobile local advertising.
One of Googles new initiatives in this field is Google Local Shopping, where
inventories of shops are made searchable for mobile users through Google.
The other way around is also possible: take for instance geofencing, where
mobile users receive text messages about discounts offered by the shop
where they are at that very moment. Although research has shown that
geofencing can be quite effective as a marketing tool, it remains to be seen
if people are in favor of these marketing tools, as they may not be person-
alized and could be considered intrusive.
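Mechanically, a geofenced offer of this kind boils down to a distance test against a set of shop locations. The sketch below uses invented coordinates and offer texts and does not represent any particular vendor's service:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical shop geofences: (lat, lon, radius in metres, offer text)
GEOFENCES = [
    (52.3430, 4.8910, 150, "10% off at the Okura coffee bar"),
    (52.3731, 4.8922, 200, "2-for-1 canal tour tickets"),
]

def offers_for(lat, lon):
    return [text for flat, flon, radius, text in GEOFENCES
            if haversine_m(lat, lon, flat, flon) <= radius]

print(offers_for(52.3432, 4.8912))   # user standing near the first shop
```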
Verdict
The target audience at the conference was notably different from that found at your normal geospatial event. This was not a technical conference, which had its strengths and weaknesses. I for one learned a lot
more on how business can use location based services and make a profit
with it, but honestly there was not much new to be learnt. There were no
big announcements or exciting new products. Augmented reality was only
mentioned in one presentation, but this topic certainly deserved wider
attention. Layar wanted to keep their new product announcements to
themselves but revealed an upcoming Layar event in June.
From a geospatial perspective, I was surprised how non-geospatial peo-
ple, like the majority of those at the conference, take maps for granted.
Or mapping, for that matter, or data quality. The big discussions between
crowd sourcing (OSM) or a blend of traditional mapping and crowd sourc-
ing (used by Navteq) seemed to go over the heads of most attendees. Ed Parsons remarked that people have problems with maps and that mapping is not that easy, and gave the example that perfect circles on a map in a Mercator projection should be read with suspicion, a sign that something is wrong.
But the attendees noticed other barriers preventing location based sys-
tems from fully taking off. Roaming costs and battery power are still big
obstacles for mobile users using location based systems. To answer the
question are we there yet?, I think the answer should be: no, not yet.
For more information, have a look at
www.thewherebusiness.com/locationsummit
Are We There Yet?
The Location Business Summit
'Location based services: are we there yet? And how is there (more) money to be made with location business systems?' These questions were central at the Location Business Summit in Amsterdam, April 28-29. With the expected further growth of smartphones equipped with GPS, there is certainly room for more location based systems and thus money to be made. The question is how and by whom. Also, what lessons are there to be learned from the geospatial web for driving profits? Over two days, more than 50 speakers from the industry gathered in the Okura Hotel in Amsterdam to share their thoughts on these matters.
By Eric van Rees
The Data Exchange Company
Snowflake Software is a UK-based provider of data exchange solutions. The company combines off-the-shelf software products, consultancy services and an extensive training portfolio to build a complete data exchange for its customers in markets such as Defence, Aviation, INSPIRE and Data Providers. GeoInformatics talked with Snowflake Software's Melissa Burns (Marketing Manager) and Eddie Curtis (Chief Technology Officer) about the company's decision to focus more on data exchange, as well as the implementation of GML. They also explain the roles that their GO Loader and GO Publisher products fulfil.
By Eric van Rees
Question: Last year the company made a change by
becoming a Data eXchange Company. Could you explain
to our readers what caused this change and how this
works out in practice, in terms of products, solutions and
especially new markets, such as aviation and defence?
Since Snowflake started (back in 2001), weve noticed a
transformation in the needs of geo-spatial data users. When we first
started, the industry was learning. And we were on the learning curve
with them.
Over the last couple of years, and much like the industry itself, weve
grown up. For a start we have a common set of data exchange stan-
dards (that would be GML then!). However, many users werent neces-
sarily able to get the most out of their data because many of us (tech-
nology vendors) were not servicing their needs; we were too focused
on the technology and the data. We needed to focus on the objective: clear and effective communication.
Users were struggling. And we quickly realised that simply having good
technical standards is not enough. People need to be able to use those
standards without spending months to become experts in them first.
They need to apply them to their existing business and therefore their
existing systems.
Step forth Snowflake Software, the Data Exchange Company.
Today, users need to be able to load, model, manage, transform, trans-
late, publish, visualise and share their data in a way that isnt inhibited
by existing infrastructure (needing new investment), and in a way that
seamlessly integrates with other data sets to really get the most out of
them. They need to be able to exchange data between internal depart-
ments, between external companies, between legacy and next genera-
tion infrastructure and between schemas and frameworks.
Snowflakes data exchange approach means that Open Standards are
always at the core of our products. The user can quickly benefit from
clear and accessible information to make critical business decisions,
whether they need to load or publish, or even do both.
Whilst our GO Loader and GO Publisher products complete the cycle of
open data exchange, the addition of our training and consultancy real-
ly supports our customers in being able to get up and running with
open data exchange and fully realise the value.
Eddie Curtis Melissa Burns
Snowflake Software
In terms of our penetration into the
Aviation and Defence markets,
thats just a sign of how much the
industry has grown up. Now that
we have GML, we are finding other
industries that have adopted the
use of it have exactly the same
data exchange challenges as the
rest of us. And because the data is
often relied upon in those indus-
tries for critical and timely decision
making, the need to solve the data
exchange challenges by using tools
like GO Loader and GO Publisher
to open the data for analysis, man-
agement and quick decision mak-
ing was key.
Q: Snowflake has been
involved in the rollout of
GML maps by Ordnance Survey. Have you already seen
everything, or does every country experience its own
specific problems with the implementation of GML?
We certainly havent seen everything yet.
GML is much more than just a format for distributing maps. It gets used
in many ways for many purposes. However, since weve been helping
people use GML in a wide variety of sectors, I think we have seen quite
a lot of the challenges.
With OS MasterMap, the issue was how to distribute a very large data
set (around 500 million features) to hundreds of customers. That is
very different from the Aviation world, where GML is being used to send
notices about changes to airports and airspace on a minute by minute
basis and temporal aspects of the data are critical. In meteorology the
size and complexity of each individual weather forecast can present
challenges, but it has many similarities with air quality monitoring.
Within a sector, the issues are often similar when moving from one
country to another. For example, land parcel information in one coun-
try tends to have many similarities to land parcels from other coun-
tries. But there are always variations too, since each country has differ-
ent legal processes and administrative structures.
The results of implementing GML as a common standard far outweigh the challenges associated with the implementation.
Q: GML is being implemented step by step as a basis for
object oriented map material in Europe. Use is made obli-
gatory by the government, but users dont seem to be
ready for it yet. Platform suppliers are also sending out
mixed signals. What is so difficult about a file format that
only seems to be supported by smaller organizations
such as Safe and Snowflake?
One of the good things about having a mature Open
Standard like GML is that it creates opportunities for people to share
information that wasnt previously feasible to exchange. As a result, in
many situations where GML is being promoted, it is in the context of
wider business change. Quite often, data exchange where GML is to be
used is a completely new business process. That kind of change is
always hard work because it involves business change as well as
changes in technology.
The main reason Open Standards are considered a good thing is that
they allow you to choose best-of-breed components for each task and
integrate components from different suppliers. It should not be surpris-
ing to see small companies getting involved to complement established
platforms with extra functionality since this is very much the ethos of
the Open Standards approach. Data exchange is an area of expertise
just as survey or GIS analysis is, so it makes a lot of sense to use spe-
cialist data exchange tools for these new processes but integrate them
with the existing platforms so that you can maintain business as usual
for established processes.
GML isnt a difficult technology to work with. If you are implementing
open data exchange for the first time, you are bound to face a bit of a
learning curve (which is one reason our training courses are so popu-
lar!). One important thing to remember is that exchanging data with
someone who does the same job as you is very different from commu-
nicating with someone in a different walk of life. You cant rely on them
to understand your terminology or your data structures.
Before you can exchange information you need to agree a common lan-
guage that you both understand. GML provides a toolkit for you to
define that language, which is known as an application schema in GML
terminology. Once you have agreed, you will find that its not the lan-
guage you use in your organisation, nor is it the language of the per-
son receiving the data. It is common ground between the two, a lingua franca for the community sharing data. This means that the
provider of the data will have to do some translation to turn their data
into something the recipient can understand. The recipient will have to
do some further translation to convert the data into a form their appli-
cations and business processes can use. Thus GML becomes everyone's second language and allows people to talk to people they couldn't previously. That does take a little work to think through and configure, but
the benefits of effective communication are immense.
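To make the notion of an application schema concrete, the snippet below builds a single, entirely hypothetical GML 3.2 feature with Python and lxml; the namespace, feature type and property names are invented for illustration and are not Ordnance Survey's schema nor any INSPIRE theme.

```python
from lxml import etree

GML = "http://www.opengis.net/gml/3.2"
APP = "http://example.com/landparcels"        # hypothetical application schema namespace

def land_parcel(parcel_id, owner, pos_list):
    """Build one hypothetical LandParcel feature as a GML 3.2 fragment."""
    nsmap = {"gml": GML, "lp": APP}
    feat = etree.Element(f"{{{APP}}}LandParcel", nsmap=nsmap)
    feat.set(f"{{{GML}}}id", parcel_id)
    etree.SubElement(feat, f"{{{APP}}}owner").text = owner
    geom = etree.SubElement(feat, f"{{{APP}}}extent")
    poly = etree.SubElement(geom, f"{{{GML}}}Polygon", srsName="EPSG:27700")
    ext = etree.SubElement(poly, f"{{{GML}}}exterior")
    ring = etree.SubElement(ext, f"{{{GML}}}LinearRing")
    etree.SubElement(ring, f"{{{GML}}}posList").text = pos_list
    return feat

parcel = land_parcel("p1", "Example Estates Ltd",
                     "400000 100000 400100 100000 400100 100100 400000 100000")
print(etree.tostring(parcel, pretty_print=True).decode())
```

A data provider and a data consumer who have agreed on such a schema can each translate to and from it, which is the lingua franca role described above.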
Q: How would you explain the difference between FME
(supports 200 file formats) and the GML Viewer, which is
used mostly for GML display? Are they competing plug-
ins?
This is a question we get asked a lot!
Our GML Viewer is supplementary to our GO Loader and GO Publisher
products in the same way that the FME Universal Viewer is a compo-
nent part of FME. The confusion between Snowflake and FME usually
comes in with our loading and publishing tools and the difference
between a data format and a data exchange standard or application schema.

GO Loader

At first glance FME and Snowflakes products might appear to
be competitors, but when you look a little closer you will see that they
fulfill different roles. In fact, nearly all Snowflakes customers also use
FME, but to solve a different problem. What we are focused on at
Snowflake is how to facilitate data flows to and from enterprise IT sys-
tems. For us, the creation of Open Standards for spatial data was the
big opportunity to help people open up their data. (It is no coincidence
that Snowflake was set up at the same time as the first large scale
adoption of GML).
With hundreds of data formats, FME is a fantastic Swiss-army knife for
communicating between geographical systems. GO Publisher lets you
reach out by connecting your corporate data via web standards. GO
Loader lets you consume information from outside organisations into
your internal, corporate systems. There are two things that make those
lines of communication work: data model translation, and web services.
Giving clients on-the-fly self-service access to data in a choice of data
models means that information can move from where it is created and
into web applications, mash-ups and other corporate data stores, not
just into the hands of other professional geographers. At Snowflake we
work with standards for business process orchestration, web services
and data modeling on a daily basis not just geo standards.
Whilst that is the case, people do still ask the question 'How do you compare to FME?', so here's a basic overview of how we're different:
1. GO Loader
Whilst FME is a great tool for transforming to different file formats, GO
Loader goes a step further and translates the data between different
XML and GML schemas. This generic approach means that GO Loader is a future-proof technology, based on Open Standards, and ensures interoperability.
GO Loader offers more than just a loading tool and has several fea-
tures and functions for managing the data within your database, which
means you can get more out of the data. It comes with plug-ins and
project packs that add specific support for different datasets such as
OS MasterMap in the UK, NEN3610 in the Netherlands or AIXM5.1 for
Aviation (to name but a few). This means you can really get the most
out of the data at no additional cost.
With its unique schema-aware technology, when new application schemas and datasets are released, GO Loader can immediately handle the data without the need for software development to catch up with the new technology. It can even handle schemas that haven't yet been created! A great example of this is in the UK, where our OS MasterMap customers could immediately manage a brand new dataset, OS VectorMap, all because the datasets were released in the data exchange standard, GML.
2. GO Publisher
Whilst FME offers a transformation
tool, GO Publisher offers a translation
tool that can serve highly complex, on-
the-fly model translations. This means
that you dont end up with an extra
database or intermediate file. GO
Publisher can support multiple output
translations easily and cost effectively.
With its unique schema mapping tech-
nology, when a new GML schema is
created for use, GO Publisher can
immediately handle the data and
translate to the new schema without the need for software develop-
ment to catch up with the new technology. And because so many organisations are adopting Open Standards, it means you can get even more functionality out of your data, such as publishing to Google Earth for visualisation (because you can translate your data and publish out to KML, the data visualisation standard).
GO Publisher was the first commercial product to be awarded the Open Geospatial Consortium's (OGC) Compliance Certificate for its support of the WFS 1.0.0 and WFS 1.1.0 Open Standards, and we're currently being tested for WFS 2.0.0 compliance.
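For readers unfamiliar with WFS, a GetFeature request is just a URL with standardised key-value parameters. In the sketch below only the parameter names come from the WFS 1.1.0 standard; the endpoint and feature type name are hypothetical placeholders, not a real GO Publisher service.

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint and feature type
ENDPOINT = "https://example.com/wfs"

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "lp:LandParcel",                      # hypothetical feature type
    "srsname": "EPSG:27700",
    "bbox": "400000,100000,401000,101000,EPSG:27700",
    "maxfeatures": 100,
}

print(f"{ENDPOINT}?{urlencode(params)}")
```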
Q: What's the difference?
FME: think formats and think existing data; over 250 supported formats.
Snowflake: think standards and think past, present and future; infinite numbers supported.
We view Safe Software as industry role models and we've got a great working relationship with them. We hold Don & Dale in high regard. Ian and I often chat about their success over a beer on a Friday night. If Snowflake could achieve what Safe have, we'd be very happy.
Q: Snowflake Software offers a hands-on INSPIRE train-
ing course. Do you see a difference in demand for cours-
es like this, geographically speaking? If so, how do you
explain this difference? Where did the need for this spe-
cific training course originate?
Our INSPIRE training course was a recent addition to
our GML and XML training program and was based on customer
demand. Because of that weve been fully booked for almost every
course weve run.
INSPIRE isnt going away. Government departments and businesses
responsible for providing public geo-spatial data are starting to really
think about what INSPIRE means for them and how they can imple-
ment it.
But, because were still in the fairly early stages of implementation in
terms of the INSPIRE dates, were finding a lot of our customers are
really confused about exactly what is, and more importantly what isnt
necessary for INSPIRE compliance.
GO Publisher
Here at Snowflake we're blessed with the brains of several INSPIRE experts who run and support the INSPIRE training. Debbie Wilson, who runs the training course, worked at DEFRA as part of the UK Location Programme and offered her expertise on INSPIRE implementation. Additionally, our very own Ian Painter (MD) is on the AGI INSPIRE Working Group and helped compile the training.
Geographically, we're seeing a surge of interest from Eastern European countries such as Croatia, Slovenia and Poland. We're also seeing massive interest and movement in The Netherlands and Germany, where INSPIRE is really beginning to be taken seriously and they're ahead of the game in terms of implementation considerations.
Here in the UK, we're slightly behind. We're talking about it, but we're not doing it. Our national mapping agency, Ordnance Survey GB, still publishes data in GML 2 rather than INSPIRE-compliant GML 3.2.1. At the time of writing, possibly due to our political situation with a national vote due soon, people are stalling. But once the situation settles down and we can see the overall vision for the next couple of years, we expect INSPIRE to really take hold. It's beginning already, but not as fast as in some other European countries.
INSPIRE is complicated. It's time-consuming. It's confusing. There's so much information available about INSPIRE, but it's often product-specific, and it is difficult to pick the really important nuggets of information out of the noise. Organisations with commitments to publish to INSPIRE standards have been screaming out for some simple, practical advice. And that's what our training offers. We try to give you only the information you need to make INSPIRE easily understandable (and without the product spin). We aim to simplify it so you know exactly what you need to do in terms of implementation, and when. We also make it black and white in terms of what you do and don't need to do.
You can get in touch with Snowflake Software by:
Visiting their website: www.snowflakesoftware.com
Reading their blog: http://blogs.snowflakesoftware.com/news/
Following them on Twitter: www.twitter.com/sflakesoftware
Joining their GML discussion group on LinkedIn
(just search GML in www.LinkedIn.com).
Nested Data Structures
Translate, Transform, Integrate and Deliver Data
With a new version of Safe Software's FME Desktop and Server products, Don Murray (President of Safe Software) and Dale Lutz (VP Development) are travelling the globe and meeting users of their products. But their travels are designed not only to discuss new technology, but also to hear the needs of the user community. During the FMEdays in Münster, Germany, Don Murray and Dale Lutz discussed some of the recent developments in the geospatial domain and their relationship with Safe Software's market view. Other topics discussed were cloud computing, 3D data, meeting users' needs, and partnerships with GIS vendors.
By Eric van Rees
Before Don Murray and Dale Lutz left for their North American 10-city tour, called '2010: An FME Odyssey', they visited Europe for a similar series of user meetings. They started off in Münster, Germany, where
during the course of three days a program was put together with user
presentations, product presentations and training sessions. During their
opening session, Murray and Lutz discussed the new release of FME
2010, which includes a new version of FME Desktop, as well as FME
Server. Since an in-depth analysis of FME 2010 has already been covered in this magazine (please refer to GeoInformatics issue 2, March 2010), this interview focuses on Safe Software's product strategy, technology developments such as cloud computing, and how the company continues to meet the needs of its users.
Usage Scenarios
What is striking about FME user meetings is that Murray and Lutz know their users, and their needs, really well; attendees and their work are addressed personally during the presentations. The need for better data access through FME technology keeps growing: at the moment there are more than 7,500 users of FME technology across 116 countries.
Murray describes several common uses for FME today: CAD to GIS is a
very common scenario that Safe Software has been helping address
since 1993. A technical person uses our authoring environment, FME
Workbench, to define what needs to happen with their data; specifical-
ly, how to read features and attributes from their CAD data and restruc-
ture them into a useable dataset for the applications or end users who
need access to it.
Data migration is another use of FME: We're also seeing organizations that are migrating their technology to spatial databases, and in many cases FME is used to support the legacy applications they used to run, in order to push the data back out.
As far as FME Server goes, the main use case is data distribution: An
organization wants to make their data available to internal or external
stakeholders. Now, with FME Server, the technical user who understands
the input-output data model can make this knowledge available through
FME workspaces on the web. Before FME Server, somebody would have
to find the data or hire somebody to actually do that.
Partnerships
Over the years, GIS vendors have embraced FME technology in their
own way. For example, Bentley Systems and Trimble both announced
their FME integration some time ago. Judging by the number of GIS ven-
dors that have embraced FME technology, it becomes clear that Safe Software is complementary to all the vendors who build GIS. Don Murray: Today, they recognize that, more and more, they're not just a pure one-vendor
solution. Their customers need to be able to share data among applica-
tions and these vendors are happy to have us do that task by enabling
them to leverage our technology. It so happens that a number of ven-
dors are licensing FME technology, such as Autodesk, MapInfo, ESRI,
Intergraph, ERDAS, rather than reinventing it themselves.
Data Quality
Indeed, moving data is the heart of what FME technology does. Through
workspaces, a user of FME Desktop can set up a set of rules to translate
input data to another format, transform data into a specific data model,
or integrate different data types all at once. The end result can be deliv-
ered to end users in the structure and format they desire. But as with any
data, there can be bad data that will cause problems when used or brought
together. To make people aware of the quality of their input data, FME includes a data viewer that enables users to view their data before, during and after the conversion process.
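As a rough illustration of the workspace idea described above (and emphatically not Safe Software's actual API), the sketch below shows the read-transform-write pattern an FME workspace encapsulates: read features from a source, apply restructuring rules, and deliver them in the target model. The file names and attribute names are invented placeholders.

```python
# A deliberately simplified sketch of the read-transform-write pattern behind a
# translation workspace; it is not Safe Software's API, just an illustration.
import csv
import json

def read_features(path):
    """Reader: yield each source record as a plain dictionary."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(feature):
    """Transformer: rename attributes and coerce types to the target model."""
    return {
        "id": int(feature["OBJECTID"]),           # placeholder attribute names
        "name": feature["NAME"].strip().title(),
        "area_m2": float(feature["AREA"]),
    }

def write_records(features, path):
    """Writer: deliver the restructured records in the target format (JSON here)."""
    with open(path, "w") as f:
        json.dump(list(features), f, indent=2)

if __name__ == "__main__":
    source = read_features("parcels.csv")                  # hypothetical input file
    write_records(map(transform, source), "parcels.json")  # hypothetical output file
```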
Murray: When solving a data moving challenge, users sometimes think
their source data is better than it really is. Our data inspection tool, FME
Viewer, helps users see exactly what they have in their source data.
Since FME is a technical product, being able to understand the input data
is indispensable. Murray: You have to know what you're working with. For instance, with 3D, people need to be able to inspect their data in 3D, and for this we have created the next generation of our data inspector. PDF is a very efficient way of sharing data with people who don't have a 3D visualization tool. Everybody seems to have an Acrobat Reader; it's sort of a de facto standard.
3D Data Format Support
With FME 2010, Safe Software has furthered its focus on the 3D realm by
making it easier for users to access and visualize data in more 3D for-
mats. Murray: We're taking non-3D building blocks, like CAD drawings or building footprints, and then dropping on textures and digital elevation models. If you have to do all that stuff by hand, it would take you a great deal of time, so we're able to do that and load it into PDF or other 3D things.
The explanation for the large number of different 3D file formats is comparable to what happened earlier in the 2D world, says Murray: 3D is like the 2D story all over again. There are vendor versions and there are standards. For example, there's IFC, which is a standard, and then you have Revit, which is a vendor product. Then you have things like Google SketchUp, COLLADA or Presagis OpenFlight.
Not only are there many different data types that cause interoperability
problems, but there is also another cause for the new demands in 3D
solutions, namely legacy. Dale Lutz: An example of this is a common 3D
format for exchange called OBJ. There isn't any product left that uses OBJ as a native format, but there's a legacy of using OBJ as a convenient exchange format. The challenge there is that there's really no reference
application for OBJ so we have to cope with de facto interpretations of
this legacy file format.
Lastly, within a standard there can be many variants, says Lutz: For
instance, for CityGML there's a noise abatement extension and there are
other extensions as well. All this causes us to need a small army of devel-
opers at Safe.
Cloud Computing
Looking ahead, there's a lot of talk these days about moving into the cloud.
In discussing the recent partnership with WeoGeo, a North American com-
pany that manages and serves maps and mapping data in the cloud, Lutz
and Murray both think this will eventually take place and are confident
about the possibilities, but also see the challenges that are crucial for suc-
cess.
Murray: It's a new way of deploying, but there are challenges of course: you have to get the data into the cloud, and many organizations at this point are concerned with that. You have to understand what that means: are we giving others rights to the data or not? You also have to move your data into the cloud and out of the cloud, because that is probably the biggest cost. And not so much moving it out, but moving it up.
Murray sees no problem in deploying a desktop application, but this changes when things get bigger: if you are going to deploy a big application in the cloud, you need a significant investment in hardware to be able to deploy it. Now, with Amazon Web Services, you can deploy small, and as your application grows, it just automatically ramps up and down. Your cost can be incremental as the demands for your services grow. In the cloud, you're only paying for what you use, as opposed to hardware infrastructure.
CAD-GIS Integration
For a company that focuses on moving data, data integration is always a
concern. Software vendors have all tried to meet their users needs when
it comes to this issue. But with the ever increasing amount of data types,
data models, standards, products and the like, life certainly hasn't been
made any easier. Dale Lutz has a clear view on the possibility and use of
data integration: It seems to me that the world is complex enough and
that any format which could satisfy the needs of all possible applications
would itself be so unbelievably complex that no one could ever use it.
And any format that tries quickly suffers from that problem.
An explanation for this is that the needs of both worlds are different, and
the tools reflect that. Lutz: For example, consider building information
model (BIM) and GIS. These worlds are very different and their tools reflect
that. Ultimately a GIS user doesn't have the same needs as the person who's building the place.
However, this does not mean that the two universes cannot complement
each other. Lutz: There's a need to take over only the relevant information. Typically, the information is going to come out of the architects and
be dumbed down for the GIS guy. Rarely is it going to go in the other
direction.
User Community
During the FMEdays in Münster, a lot of requests were made for data
types to be included in later versions. Both Lutz and Murray were amazed
by the fact that things keep changing all the time, causing Safe Software
to take notice and incorporate those changes into their products. Lutz:
The question is how to gather all the information and distill it into
actions, because we can't do everything. You have to figure out all of the
things that are going on, and what things are going to be the most valu-
able to people. We try to make good guesses, but to be honest we know
we may have hits, and we may have misses.
And there are those hits that have been a complete surprise, apparently.
Both give a lot of credit to the user community, and it is true that the
interaction between the community and the company is quite inspired.
Lutz: Part of why Don and I love to come to conferences is that it gives
us a great opportunity to learn what's most important to our users and
how we can make our products and company even better.
Eric van Rees is editor-in-chief of GeoInformatics. For more data insights from Don and Dale, have a look at Safe's 'It's All About Data' blog at http://blog.safe.com.
http://blog.safe.com/2010/04/
working-with-xml-and-loving-it/
http://fmepedia.com/index.php/Converting_
Relational_Datasets_to_XML
A Collaborative Project
The Archaeological Potential for
Shipwrecks
The AMAP2 - Characterising the Potential for Wrecks project (AMAP2), commissioned by English Heritage in October
2009, is a collaborative project between SeaZone and the University of Southampton (UoS) which seeks to improve the
management of the marine historic environment by enhancing our understanding of the relationship between shipwrecks
and their surrounding environment. This will be sought through the refinement of baseline data for marine spatial
planning and the development of a characterisation of the environmental variables affecting the potential for wrecks
to survive on the seabed. The project will provide an evidence base for the assessment of the potential for different
marine environments to harbour unrecorded wrecks.
By Olivia Merritt
The aim of the AMAP2 project is to study statistical relationships between
the physical nature of shipwrecks and their surrounding natural environ-
ment. The results will be used to develop a characterisation map of Areas
of Maritime Archaeological Potential (AMAP) based on the environmental
parameters affecting the survival of wrecks in seabed sediments. Improving
the understanding of the relationships between wrecks and their environ-
ment, coupled with the results of seabed modelling undertaken by the
University of Southampton (UoS), will provide a firm basis for interpreting
the variables which affect the potential for wrecks to survive in different
marine environments.
The term Archaeological Potential describes areas of land or seabed where
it is anticipated that previously unrecorded archaeology is likely to exist
and survive. The project seeks to encourage a considered interpretation of
the variables affecting potential in the marine environment by demonstrat-
ing relationships between wrecks and their environment. It will not, howev-
er, attempt to generate a predictive model that would allow the estimation
of the number of wrecks that might be found, or their spatial distribution.
Background
A pilot project completed in 2008 by Bournemouth University demonstrat-
ed the potential for correlations to exist within the Eastern English Channel,
leading to the commissioning of AMAP2 to further investigate and quanti-
fy these relationships across a much larger area encompassing all of
England's territorial waters.
Key trends identified during AMAP1 included a strong bias in known wrecks
towards the 20th century, with few iron or steel vessels reported lost but
remaining unidentified. Iron and steel wrecks were found to cluster in areas
of shallow sediments and dynamic seabed, irrespective of their condition,
while wooden vessels tended to be concentrated closer inshore.
Correlations were suggested between wrecks recorded as buried or partly
buried and areas of shallow but dynamic seabed. Relationships were also
identified between the materials ships were built of and their distribution,
burial and location methods.
Making the Most of Wreck Data
The information on wrecks will be sourced from two distinct databases:
the Wrecks Database managed by the UK Hydrographic Office (UKHO) and
licensed through SeaZone, and the National Monument Record (NMR) man-
aged by English Heritage. There are, however, two issues to address before
the information held in these databases can be taken forward for analy-
sis. First, there are overlaps between the databases which must be identi-
fied and removed so that the project has a single source of wreck infor-
mation to work from. Second, much of the data which will be useful to
the project is held in lengthy descriptive text fields.
Therefore, the AMAP2 project will initially seek to compare and identify
matching records within the databases (Figure 2), to enable the best use
to be made of available physical and circumstantial information on each
wreck site. During this process, the project seeks to further develop inter-
operability between the wreck data published by the UKHO and historical
data available from the NMR, thereby enhancing the usefulness and acces-
sibility of both datasets beyond the scope of this project.
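As a toy illustration of what matching records between two wreck databases can involve, the sketch below pairs records on spatial proximity plus a normalised name comparison. The field names, coordinates and threshold are invented placeholders and do not represent the actual AMAP2 methodology.

```python
# Toy sketch of matching wreck records from two databases on position and name.
# Field names, coordinates, the 100 m threshold and the matching rule are
# illustrative assumptions, not the AMAP2 project's actual method.
from math import hypot

ukho = [{"id": "U1", "name": "SS Example", "x": 651200.0, "y": 5651900.0}]
nmr  = [{"id": "N7", "name": "S.S. EXAMPLE", "x": 651235.0, "y": 5651910.0}]

def normalise(name):
    """Strip punctuation and case so 'S.S. EXAMPLE' matches 'SS Example'."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def match(a_records, b_records, max_dist=100.0):
    """Pair records lying within max_dist metres that share a normalised name."""
    pairs = []
    for a in a_records:
        for b in b_records:
            close = hypot(a["x"] - b["x"], a["y"] - b["y"]) <= max_dist
            if close and normalise(a["name"]) == normalise(b["name"]):
                pairs.append((a["id"], b["id"]))
    return pairs

print(match(ukho, nmr))   # [('U1', 'N7')]
```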
Data significant to understanding trends in the condition of wrecks on the seabed, such as age, construction materials, distribution on the seabed and burial environment, are being extracted to produce an enhanced database of environmental shipwreck characteristics.
Figure 1: Shipwreck
Modelling Sediment
Dynamics
The AMAP2 project is being run
in collaboration with UoS with an
aim to integrate the results of
sediment dynamics modelling
techniques conducted for the EU
funded MACHU Programme
(www.machuproject.eu) with the
model built to generate the
AMAP2 environmental characteri-
sation.
An essential component for effec-
tive underwater cultural heritage
management is a clear under-
standing of the sediment dynam-
ics of both regional areas encom-
passing numerous archaeological
sites and individual sites them-
selves. The successfully devel-
oped approach for MACHU has
demonstrated the clear capability
of such numerical models to
identify areas of erosion, accumu-
lation and nil change under a
range of ambient and extreme
conditions. Further, they are able
to identify direction and order of
magnitude of sediment transport.
MACHU has focused primarily on
the development of a robust, coarse resolution, numerical model for the
North-west European Shelf, with a higher resolution nested domain cen-
tred on the Goodwin Sands, in the Dover Straits.
The AMAP2 project builds on the work undertaken for the MACHU pro-
gramme, using high resolution bathymetric modelling generated by
SeaZone to build a sediment transport model of the Thames Estuary and
Goodwin Sands for use in the development of
the AMAP characterisation across a test area.
The final outputs of the MACHU model are a
description of the net sediment transport path-
ways and the nature of gross and/or sudden
changes in seabed level (erosion or accumula-
tion) as a response from either ambient tidal
and wave conditions or extreme conditions (the
passage of a storm through the area), as well
as information of the direction and magnitude
of sediment transport (e.g. Figure 3).
In addition to the sediment dynamics model, a
wide range of environmental data considered
relevant to the formation of wreck sites and
their survival in the marine environment has
been collated, with particular focus on datasets
available on a national scale. These include
best available data relating to superficial sedi-
ment type and depth, seabed
morphology, water depth and sediment mobility.
Building a Characterisation
of Potential
The characterisation will be con-
structed using the results of
mapping of wreck characteristics
to environmental conditions
using statistical and spatial anal-
ysis. The project draws close par-
allels with the requirements for
generating marine habitat maps
and is therefore seeking to adopt
or adapt, where possible, the
statistical techniques employed
for this purpose.
The development of a character-
isation map of the environmen-
tal variables and trends in wreck
data which determine the poten-
tial for archaeological materials
to survive in different marine
environments will encourage a
more justified assessment of the
archaeological potential for a
particular marine environment to
harbour shipwrecks during the
process of marine planning. The
creation of a digital characterisa-
tion map of archaeological
potential and the consequent
enhancement of data, core to the
aggregate licensing process, will
enhance the approach to marine spatial planning and benefit the marine
industry as a whole.
SeaZone and UoS believe AMAP2 will help build firm foundations for future
marine planning and research, promoting consistency, and encouraging
the enhancement and interoperability of digital marine data.
Olivia Merritt, Heritage/GIS Consultant, SeaZone
Email: Olivia.Merritt@SeaZone.com
Internet: www.SeaZone.com
Figure 2: Spatial discrepancies between wreck geometries in the UKHO and NMR databases.
Figure 3: Bed level change and sediment transport magnitude and direction for the Goodwin Sands.
Making Mapping the Impossible Possible
In less than a decade of commercial operations, Fugro EarthData's GeoSAR system has earned a reputation for mapping
the impossible. GeoSAR is a dual-band airborne interferometric radar system that is capable of rapidly mapping large
areas in any weather conditions. In 2009 Fugro EarthData, which integrated and operates the system commercially, used
GeoSAR to complete one of the most challenging terrestrial mapping projects the firm had ever attempted.
By Kevin P. Corbley
The tropical region of Australasia has been
a challenge to every remote sensing platform
out there, said L.G. (Jake) Jenkins, Fugro
EarthData's Senior Vice President. GeoSAR
took it on and successfully mapped it.
Jenkins explained that tropical areas in
Australasia embody nearly every geographic
and topographic trait that make mapping dif-
ficult. Located just south of the equator, the
region is characterized by almost constant
cloud cover that renders optical airborne and
satellite imaging systems impractical. And
even when an optical image can be captured,
the dense tropical rain forests which carpet
much of the land mass keep the surface ter-
rain hidden from view. For decades, the area's
treacherous topography has thwarted attempts
at performing detailed ground surveys.
With the search for natural resources acceler-
ating around the world, this region has generated considerable interest in
recent years. For the project, the government
of Australia sought a mapping system that
could provide images and terrain models of
land concealed below the forest canopy.
Australia identified the airborne GeoSAR sys-
tem as the only platform that could penetrate
both the clouds and the jungle to return accu-
rate surface data over the entire area in a rea-
sonable period of time.
Topographic Mapping with IFSAR
As the system integrator for GeoSAR, Fugro
worked closely with the U.S. government to
develop a commercially viable interferometric
synthetic aperture radar (SAR) platform using
custom and off-the-shelf components. For map-
ping purposes, SAR offers numerous advan-
tages, most notably the ability of radar sig-
nals to pass through clouds, many types of
vegetation and even unconsolidated surface
materials to return images of what lies
beneath.
Radar operates day or night in most weather conditions, enabling it to maintain aggressive
mapping schedules with few interruptions,
said Roy Hill, Project Manager at Fugro.
Interferometric SAR, or IFSAR, is a variation on
the synthetic aperture radar technology. It uses
two antennas separated by a precise distance
on the aircraft to send and receive the radar
pulses that are emitted from the system,
bounce off the Earth's surface, and return to the sensor. By measuring the phase difference between the reflected signals received at the two antennas, a processor can calculate extremely accurate surface elevation values.
The GeoSAR aircraft, a modified Gulfstream-II jet aircraft. P-band antennas are installed in the fairings on each wingtip, while the X-band antennas are in the fairings under the wings near the fuselage.
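The relationship the article alludes to can be written down compactly. The formulas below are the standard textbook across-track InSAR geometry, not Fugro's proprietary processing chain; the symbols are defined in the comments.

```latex
% Textbook across-track InSAR relations (standard mode: one antenna transmits,
% both receive; a factor of 2 applies in ping-pong mode). Not GeoSAR-specific.
%   \Delta\phi : measured phase difference    \lambda : radar wavelength
%   B : baseline (antenna separation)         \alpha  : baseline tilt angle
%   \theta : look angle from nadir            H : aircraft altitude
%   r : slant range to the ground point       h : terrain height
\begin{align}
  \Delta\phi &= \frac{2\pi}{\lambda}\, B \sin(\theta - \alpha), \\
  h &= H - r\cos\theta .
\end{align}
% Solving the first relation for \theta and substituting it into the second
% turns each measured phase difference into a surface elevation estimate.
```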
The GeoSAR development team took the
IFSAR technology several steps further to
expand its mapping capabilities. The decision
was made to operate the IFSAR in two sepa-
rate radar bands, X and P, giving it the ability
to map in both short and long wavelengths
simultaneously. The X-band operates at a fre-
quency of 9630-9790 MHz with a relatively
short 3-centimeter wavelength that passes
through clouds but provides a return signal
from the first reflective surface it encounters,
such as tree canopies, man-made structures
and solid ground. The P-band, however, func-
tions at 270-430 MHz and has a one-meter-
long wavelength that penetrates dense vege-
tation and the top layers of soil and sand in
very arid regions.
This dual-band collection enables GeoSAR to
simultaneously collect first-surface elevation val-
ues with the X-band and bare-Earth elevation
models with the P-band, generating two valu-
able DEMs from a single collect.
GeoSAR's dual-band IFSAR generates a set of
black-and-white scenes called magnitude
images from the X and P bands, each reveal-
ing different information about the topography.
The X-band image provides details of surface
features covering the ground. In projects com-
pleted in South America and Australasia, this
surface information was mainly limited to the
vegetative canopy and the few rock outcrops
and water bodies not obscured by the jungle.
By comparison, the P-band peers through
the canopy to image bare-Earth features
below, including buildings, roads, paths,
fences, field delineations, rivers and streams: objects that are often completely invis-
ible in the X-band magnitude image.
Together, the X and P bands captured a
comprehensive view of the topographic
and natural and man-made features across
the region. From these datasets, users are
able to extract 3D features and generate
maps at 1:25,000 and 1:50,000 scale.
Dual-Band, Dual-Side
Several modifications to the configuration of
the IFSAR system on the aircraft have con-
tributed significantly to the accuracy of the ele-
vation data it captures. Among these was the
selection of the P band instead of other radar
frequencies. P band was chosen because it
was the longest wavelength that could func-
tion on a cost-effective aircraft without com-
promising interferometric performance.
Separating the antenna pairs by a distance of
multiple wavelengths maximizes the phase dif-
ferences between emitted and reflected sig-
nals, yielding a more accurate elevation mea-
surement.
Data layers extracted from dual-band
GeoSAR data.
GeoSAR collects X- and P-band IFSAR data simultaneously and on both sides of the aircraft.
At the time of integration, the Gulfstream II
had the longest wingspan, 20 meters, of any civilian jet that could be used affordably for
mapping, and the P-band antenna pairs are
located in pods on its wingtips. The X-band
pods do not need as much spacing and are
positioned much closer together under the
wings near the fuselage.
The other major structural configuration con-
tributing to the quality of GeoSAR's elevation mapping capabilities is the system's dual-side-
looking design. In this configuration, each
antenna pod contains twin pairs of antennas
pointing to different sides of the aircraft with
some overlap in between. As a result, when
the aircraft is flown with a standard 30 per-
cent overlap in its flight lines, the dual-sided
IFSAR simultaneously collects multiple look
angles of every point on the ground with both
its X and P bands.
This dual-side-look IFSAR capability gives the
ground processing system a greater number
of signal reflectance measurements to use in
calculating elevation values. The result is
more accurate X-band surface and P-band
bare-Earth digital elevation models than could
be generated with a single-look system.
GeoSAR in Action
In mapping projects completed in recent years,
the unique capabilities of GeoSAR, such as its
dual look and dual band designs, have proven
advantageous to commercial operations.
The dual-side looking capability really pays
off in tropical areas where the terrain is
rugged and has extreme elevation changes in
small areas, said Hill.
At the standard GeoSAR operating altitude of about 12,000 meters, the typical swath
width of each antenna pair is about 20 kilo-
meters. Even in relatively rugged terrain, this
swath width provides sufficient overlap
between the two sets of antennas to capture
a minimum of two to four redundant points
for each spot on the ground. However, when
the topography is accentuated by deep
ravines or steep cliff faces, such redundancy
might be reduced or, worse, a ground point
may be completely blocked from view of the
sensor by the terrain.
To compensate for the extreme terrain, we
tightened up our flight lines for greater over-
lap over rugged areas to ensure we captured
at least the minimum number of looks at
every ground point, said Hill. This allowed
us to achieve consistent elevation mapping
throughout the project area, totaling 388,000
square kilometers.
In addition to capturing highly accurate ele-
vation measurements, the dual-side-look
enables GeoSAR to map large areas in short
periods of time, covering twice as much
ground in one flight line as a single-look SAR
can. This capability combined with the alti-
tude and speed of the Gulfstream II allows
the system to collect image and elevation
data at a rate of 300 square kilometers per
minute.
GeoSAR maps Mt. Huila through the clouds.
Colorized land classification generated using X-band and P-band imagery.
Fast turnaround time was the primary selling
point in a 2009 project that Fugro EarthData
completed for a European oil company work-
ing in South America. The company was under
strict time constraints to begin construction of
a pipeline from an exploration block in north-
west Peru to a terminal located 200 kilometers
to the south. Extremely treacherous terrain lay
in between, and the operator wanted to map
the landscape to select the most cost-effective
route for the pipeline. The timeline, cloud cover,
dense vegetation and topography eliminated
detailed ground surveys and optical airborne
imaging as mapping options.
GeoSAR covered the 3,000-square-mile project
area, which was divided into multiple potential
corridors, in less than two days, said Caroline
Tyra, Fugro Client Program Manager. We deliv-
ered the final end products in a month.
From the oil company's perspective, the value
of quick acquisition by GeoSAR was doubly
appealing because the system ultimately gen-
erated the two map products required for com-
paring construction costs in the proposed cor-
ridors. The elevation models provided the slope
measurements and topographic details as to
which route had the least variation in terrain (a desirable construction trait), while the X- and
P-band magnitude images delineated wetlands
and other land cover features that would have
to be traversed or avoided with potentially neg-
ative impacts on construction costs.
Prior to the flight, the oil company believed it had selected the best route for the pipeline, but after examining the IFSAR data in three dimensions, the company changed 90 percent of the proposed corridor, said Tyra. The new route will save them millions of dollars in construction and operating costs.
Working in collaboration with commercial customers or on independent experimental projects, Fugro's team of IFSAR scientists and technicians focuses on enhancing the technology
and its applications. Among these have been
pilot projects in both polar areas and arid
regions, in addition to the equatorial regions
that GeoSAR was originally designed for.
At the poles, GeoSAR has shown great poten-
tial for penetrating snow and ice with its P-band
to measure their thickness. Such data could be
valuable in determining the pace of glacial melt
in climate change research. For more commer-
cial activities in arid zones, the same P band is
demonstrating the ability to peer through sev-
eral meters of sand and unconsolidated over-
burden to find buried pipes, underground water
sources and caves.
A major new IFSAR application now under
development by the Fugro science team is the
use of GeoSAR for biomass calculation in sup-
port of climate change projects in tropical jun-
gles. Calculating biomass from remotely sensed
data is a complicated process because the
species of tree must be determined along with
the location of the trunk. Fugro has accom-
plished this by combining ground truthing with
IFSAR classification techniques. Once the
species and trunk locations have been deter-
mined, the P- and X-band signals can be inte-
grated to calculate biomass, which in turn can
be used to quantify the carbon sequestration
capacity of the jungle area.
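The arithmetic at the core of that integration can be illustrated simply: the P-band bare-earth surface subtracted from the X-band first-return surface gives a canopy height layer, which an allometric relation can convert into a biomass estimate. The grids and coefficients below are invented placeholders, not Fugro's calibrated model.

```python
# Illustrative only: canopy height as X-band DSM minus P-band DTM, then a
# made-up allometric relation biomass = a * height ** b.  The grids and the
# coefficients are placeholders, not Fugro's calibrated model.
import numpy as np

x_band_dsm = np.array([[32.0, 35.5], [30.2, 28.9]])   # first-surface heights (m)
p_band_dtm = np.array([[ 4.0,  5.5], [ 3.2,  2.9]])   # bare-earth heights (m)

canopy_height = np.clip(x_band_dsm - p_band_dtm, 0.0, None)   # metres
a, b = 0.5, 1.6                                                # placeholder coefficients
biomass_t_per_ha = a * canopy_height ** b

print(canopy_height)
print(biomass_t_per_ha.round(1))
```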
Fugro has also worked with ESRI Canada to cre-
ate IFSAR viewing tools with the ESRI PurVIEW
ArcGIS extension. This commercially available
product greatly simplifies feature extraction
from remotely sensed imagery by converting
data to 3D with the click of a mouse. This capa-
bility is now available for GeoSAR data, allow-
ing end users to generate synthetic stereo pairs
from both the X- and P-band data sets to iden-
tify and extract features to produce topograph-
ic maps for a wide variety of applications.
Map Once, Use Many Times
While successful projects in Australasia, South America and other tropical regions have established GeoSAR as the airborne mapping system of choice in cloud- and jungle-covered areas, Fugro's Jenkins observed that interest in the platform has now expanded to other parts of the world and to other end-user applications. The appeal is the speed with which GeoSAR can map large states or entire countries to generate 1:25,000 and 1:50,000-scale framework data layers that are the basis for a broad range of applications, from natural resources management to economic development.
The biggest financial risk to a large-area mapping project is a delay caused by weather, said Jenkins. With GeoSAR, that risk is eliminated and an entire country can be mapped in a few months.
To continue expansion of the system, Fugro has invested in assembling a team of IFSAR scientists and technicians in its Frederick, Maryland, facility, where the GeoSAR aircraft is based, to develop other unique uses of GeoSAR data.
GeoSAR is at the cutting edge of large-area mapping, but we believe we have just scratched the surface of its mapping potential, said Jenkins.
Kevin Corbley is a business consultant located in Denver, Colorado. He may be reached at www.corbleycommunications.com.
GeoSAR-derived 1:50,000 scale topographic map
Thriving on Energy of Shared Innovation
2010 ESRI Developer Summit
If success is measured by the din of collaborative exchange, then the 2010 ESRI Developer Summit was a triumph.
One could barely walk 10 feet without overhearing attendees exchanging ideas about how they improved their work
with GIS tools. The chatter was strong evidence that GIS is an inherently interpersonal discipline that thrives on the
energy of shared innovation.
By Matthew DeMeritt
That collaborative spirit was enhanced last
year with the addition of user presentations to
the Developer Summit. With twice as many pre-
sentations this year compared to last, develop-
ers had a wide range of topics to learn about
and apply to their work environment. Many of
the presentations packed the largest rooms at
the Palm Springs Convention Center and were
followed by spirited and informative Q&A. More
often than not, Q&A sessions spilled into the
lobby of the venue and took on a life of their
own.
ESRI selected user presentations based on their
usefulness in tackling everyday problems. A
standout among user presenters, Timmons
Group's Vish Uma gave two presentations on
common obstacles that developers face. His
first presentation on continuous integration (CI)
highlighted the commonality in workflows
across a dauntingly wide spectrum of software
and organizations. Uma shared best practices
for overcoming the challenges posed by the GIS
software development process and explained
how he optimized his workflow through
automation. One of my big revelations with
the last release of ArcGIS was how many tasks
I could eliminate from my schedule by simply
automating them, said Uma. All of a sudden, building and deploying solutions became
more fun and opened new avenues of inspira-
DevSum attendees could meet
with ESRI development staff to discuss
solutions and get their questions answered.
tion. That's why CI is so important. One
attendee commented, I had no idea how much
time I wasted with manual configuration. Now I
see how I can automate key aspects of my
work.
In his second presentation, Uma demonstrated
how he overcame the problem of limited Web
client printing solutions. In it, he shared the
architecture behind a printing service he built
for Timmons. The service was invaluable in
overcoming well-known limitations, such as the
inability to print multiple map services over a
basemap. Talking to ESRI technicians over the
years has helped me find workable solutions
to real problems, says Uma. It's rewarding to be able to pay ESRI back for their assistance by helping my peers. It's what Jack Dangermond means by GIS community.
APIs and Mashup Challenge
Presentations on APIs were a top draw. More
than half of all user presentations mentioned their
applicability in a multitude of scenarios.
Brenden Collins and Steven Andari from Blue Raster software presented on their ArcGIS Server/Flex super-mashup for the Southern Forest for the Future Project. Their project integrated YouTube and Flickr, as well as WMS and KML, into a rich Web application that uses dynamic map-caching to create a time series of urban sprawl. The service raises awareness about the invaluable resources that forests provide, like fresh water, timber, and recreation, said Collins. At his first ESRI Developer Summit, Collins was overwhelmed by the availability of ESRI staff on the showcase floor. I found myself gravitating to the Flex and geoprocessing teams with all my nagging questions and getting immediate answers, he adds. That was enormously helpful and a pleasant surprise.
Developers responded with many novel entries in the 2010 Mashup Challenge, making it difficult to select a winner. The guidelines were simple: build a mashup using ESRI's ArcGIS Online content and Web APIs, publish the related URL, and post a video describing the application on YouTube. After careful evaluation, Dave Bouwman, CTO and lead software architect at DTSAgile of Fort Collins, Colorado, took the top prize of $10,000. His Executive Compensation Mashup compared top U.S. executive salaries with the total income for selected counties in the United States.
Bouwman and his DTSAgile colleague Brian Noyle also received some of the best attendance of all the user presentations. One of Bouwman's presentations, Ruby-fu: Using ArcGIS Server with Rails, explained how Ruby on Rails, a popular Web development platform that powers Twitter, Hulu, and Basecamp, can be configured to work with ArcGIS Server. One of Noyle's presentations covered the hot issue of iPhone and Android app writing. Noyle demonstrated the design and implementation of a geocoding-enabled site for location-based feedback within a user's local community.
ArcGIS.com
Many developers made a beeline for the tech-
nical sessions, especially the ones that pre-
viewed the capabilities of ArcGIS 10. ESRI's Jeremy Bartley and Keyur Shah demonstrated the REST API's ability to create maps; execute
queries, geoprocessing, geocoding, and geom-
etry operations; access tiles; and generate KML.
Attendees learned how to use the REST API in
a variety of mashup environments including
JavaScript, HTML, Google Earth, Python, and
other Web technologies. I noticed that the real-
ly popular sessions focused on extending
servers through .NET and Java, says Bartley.
One of the exciting developments at ArcGIS 10
is that it supports writing custom server object
extensions that can be consumed in both SOAP
and REST services, which is hugely significant
to developers.
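To give a flavour of what the REST API exposes, the sketch below queries a map service layer over plain HTTP; the host, service, layer and attribute names are invented placeholders, while the query parameters follow the publicly documented ArcGIS Server REST pattern.

```python
# Querying a (hypothetical) ArcGIS Server map service layer over the REST API.
# Host, service, layer and attribute names are placeholders; where/outFields/f
# follow the publicly documented query endpoint pattern.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

base = "https://example.com/arcgis/rest/services/Demo/MapServer/0/query"
params = urlencode({
    "where": "POP2010 > 100000",   # placeholder attribute filter
    "outFields": "NAME,POP2010",
    "returnGeometry": "false",
    "f": "json",
})
with urlopen(base + "?" + params) as resp:
    result = json.load(resp)

for feature in result.get("features", []):
    print(feature["attributes"])
```

The same endpoint can return results as HTML or KML simply by changing the `f` parameter, which is what makes it convenient in the mashup environments listed above.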
Bartley also kicked off the plenary session with
the unveiling and demonstration of ArcGIS.com,
an online resource for finding and using geo-
graphic content from ESRI and many other
sources.
Keynote speaker David Chappell, principal of
Chappell and Associates, delivered an engag-
ing address on cloud platform development. He
defined cloud computing, discussed new and
future trends, and explained what it means to
GIS developers building with ArcGIS and deploy-
ing it in the cloud.
ESRI's twin mission for the Developer Summit is
to engage developer discourse and gather feed-
back from its users. Widening those lines of
communication inevitably results in a kind of
symbiosis, ensuring that ideas remain fresh and
tools and best practices continue to be refined.
For more information, have a look at
www.esri.com/events/devsummit/index.html
Jack Dangermond addresses
the crowd at the Plenary session.
Many peer-to-peer discussions took place before and after the user and technical sessions.
President of ERDAS Joel Campbell
Emphasis on Understanding
Last year, Joel Campbell joined ERDAS as the new President. With over 20 years of experience in the geospatial industry, Campbell is a well-known and highly regarded speaker, lecturer and trainer throughout the world. During more than a decade with ESRI, he held chief leadership and management positions in the U.S. sales operation. GeoInformatics asked Mr. Campbell about his views on the past, present and future of the geospatial industry, and new business models and techniques such as radar.
By the editors
Joel Campbell, President of ERDAS
Congratulations on your new
position as President of ERDAS.
You bring with you over 20 years of
experience in the geospatial indus-
try, in a variety of senior roles
including sales, business develop-
ment and product management.
With all this experience, what can
we expect from ERDAS and its prod-
uct line and market strategy for the
coming years?
During the first 30 years, com-
panies within our industry focused on methods
of creating and generating geographic data.
Collecting current imagery and using this as the
source of all geospatial data is still important,
but now there is also an emphasis on under-
standing and thoroughly analyzing the changes
in data. End-users need more than just vector,
raster and terrain data. They need tools that
allow them to author, manage, connect and
deliver geospatial information. The industry is
now focused on producing powerful, yet intu-
itive tools for detecting change and under-
standing its implications. These tools enable
users to deliver the timely geospatial informa-
tion required.
As the industry has progressed, realigning its
focus from data collection to data analysis, my
career has followed a similar path. My time at
ESRI, Definiens and GeoEye all served as sig-
nificant milestones directing me towards data
analysis, and using imagery to exploit change.
Throughout my career, I regularly encountered
ERDAS, and had a deep respect for the compa-
ny's rich history, leadership and innovation.
At ERDAS, we develop all of our plans and
products around fulfilling customer needs. Our
purpose is to make the customer successful,
and we are constantly improving our products
to ensure this happens. As the new President,
I will continue to lead this organization with a
customer focus. I believe we are entering into
the most exciting time of our industry's history, where geospatial is no longer a niche industry, but has broad-reaching relevance. ERDAS con-
tinues to be a leader, as the industry experts
in handling all forms of imagery and image
analysis, enabling users to easily gather the
geospatial information they need.
Looking back at the last two
decades in the geospatial industry,
what do you think have been the
most revolutionary trends that have
happened and how do you value the
industry and imagery market? What
will be the challenges ahead for the
industry itself?
Both satellite imagery and
remote sensing continue to revolutionize the
way we see and interpret the world. In the
1980s, remote sensing and satellite imagery
were oversold. Our industry was one of special-
ist niche applications for image classification
enables users to run on-demand change detec-
tion, enabling users to update existing features
and perform primary feature extraction.
Moving forward, we're introducing time and other business-critical information to make 4D and 5D systems. These are de facto parts of the overall system; people expect not just a 2D
view of the world, but a dynamic 4D and 5D
view of the world that ties business informa-
tion to location.
Data is always a huge problem. We essentially
allow customers to get more than simply static
maps; we allow them to get answers to things like: how
much green space has changed over the course
of the last three years, or the percentage of
impervious surface on my land. Additionally,
where can I land my helicopter in an E911 situ-
ation? Where is the slope of the land less than
4%?
Customers want live and up-to-date information; they don't want to look at an elevation model that was created fifteen years ago with a 15 m resolution. They want and expect more.
The update of spatial information on the fly is now possible, and we're doing this with the geoprocessing available in ERDAS APOLLO.
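To make the slope example concrete, the sketch below flags the cells of a small elevation grid whose slope is under 4 percent, using plain numpy gradients as a stand-in for the server-side geoprocessing described; the DEM values and the 10 m cell size are invented.

```python
# Minimal sketch: flag cells of an elevation grid whose slope is below 4%.
# Plain numpy gradients are used here as a stand-in for ERDAS APOLLO's own
# geoprocessing services; the DEM values and cell size are invented.
import numpy as np

dem = np.array([[100.0, 100.4, 101.0],
                [100.2, 100.6, 101.4],
                [100.5, 101.1, 102.2]])   # elevations in metres
cell_size = 10.0                          # metres per cell

dz_dy, dz_dx = np.gradient(dem, cell_size)
slope_percent = np.hypot(dz_dx, dz_dy) * 100.0

flat_enough = slope_percent < 4.0         # candidate landing cells
print(slope_percent.round(1))
print(flat_enough)
```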
If intelligence is added right after
image capture, doesn't this take
away some of the work GIS techni-
cians are currently doing? Or are
you helping them by making their
work easier?
Our tools do not eliminate the
need for technicians; the human interaction
continues to be important throughout geospa-
tial analysis. Therefore, GIS technicians will
always be needed. However, their responsibili-
ties will change as processes continue to be
automated, enabling these individuals to be
more efficient in their jobs, taking on addition-
al responsibilities and quality control through-
out each project's workflow.
These days everybody talks about
the cloud. What is ERDAS' answer to initiatives such as WeoGeo, which presents itself as an 'iTunes for Maps' in the cloud?
This is a significant area of
research focus for ERDAS. Within cloud comput-
ing, there are two areas where there is a great
deal of interest.
One is the ability to leverage the cloud for the
scalable processing power that you need. If I
received all new imagery, and wanted to create
a whole new orthomosaic of my entire area,
that's a compute-intensive process. Maybe I
and heads-up digitizing. Tools during this time
were often complex and user-intensive and
were not a part of a standardized workflow.
Later, ortho images were used to give context
to GIS data. These were used extensively in the
Defense realm. Traditional (albeit digital)
human-intensive photo-interpretation was used rather than automated software analysis. Weather
satellites were one of the few operational exam-
ples of using satellite imagery.
In the 1990s and early 2000s, companies like
Microsoft, Google and Oracle moved into the
geospatial space, benefiting everyone involved
by raising our overall awareness. However,
these new users were mainly just using imagery
as a backdrop for other data sources. As a
whole, change detection and exploitation was
still largely confined to the expert realm.
Since then, there has been greater acceptance
of geospatial technologies outside the tradi-
tional customer base. In the future, this demand
will continue to grow as businesses recognize
the importance of thoroughly understanding
change. We see an increasing demand for serv-
er technologies, as well as desktop applications
that provide automation such as georeferenc-
ing (using tools like IMAGINE AutoSync). These
kinds of tools simplify and streamline the pro-
cess when existing imagery doesn't automatically line up. An operator doesn't need to be
an expert to orthorectify. These types of tools
are not simply for imagery, but for other types
of geospatial information as well. At ERDAS, we
are connecting and integrating the strengths of
these types of desktop applications into our
server technologies. For example, ERDAS APOLLO
Apollo Feature Interoperability
would have to have a large server system to
crunch through that in a meaningful timeframe,
but I would only need to do that maybe once
per year. The ability to access that power out-
side my organization when needed is something
that is really interesting to our customers, and
we're working toward doing that.
The second part of the cloud that is equally inter-
esting to us is the software-as-a-service (SaaS)
that a cloud environment could offer. If you were
an engineering company doing a feasibility study
for a new highway, maybe it's only a 90-day pro-
ject, and you have 10 engineers working on it
for 90 days. You may not want to buy a lot of
software for a 90-day project, but you could use
software as a service through the cloud, with the
full robustness of the product, but only pay for
it on a monthly basis. There's an interesting busi-
ness model there that the technology allows us
to engage in.
I hear a lot of exciting stories on the
use of radar data. Can you talk
about the possibilities of this tech-
nology, the client base and applica-
tions that ERDAS offers for radar
data?
With over 15 years of experience
in radar image processing, ERDAS provides
leading radar mapping solutions, with special-
ized tools for processing radar data in a stan-
dard remote sensing or GIS environment.
ERDAS continues to focus on operational soft-
As the Earth changes, data orga-
nizations also have to be agile and innovative
to accommodate the growing volumes of data.
ERDAS APOLLO helps organize geographic informa-
tion, enabling users inside and outside an orga-
nization to have the ability to find, view and direct-
ly use the geographic information. Based on a
100% Service Oriented Architecture (SOA), ERDAS
APOLLO provides a Spatial Data Infrastructure
(SDI) that manages and delivers TBs of GIS
data, imagery and terrain information to cus-
tomers. The heart of the SDI is the catalog and
the key component of the catalog is metadata.
To achieve a vision of a connected Digital Earth,
an interoperable and open catalog is critical.
ERDAS APOLLO provides an out-of-the box environ-
ment for cataloguing data and services.
Once data is organized and managed, the next
step is to get the geographic information to
users. It's one thing to deliver data to users as web services. But it's another thing to deliver
on-demand geographic information products to
a community. This is possible with the delivery
of on-demand geoprocessing capabilities.
Together, ERDAS IMAGINE and ERDAS APOLLO support
an end-to-end workflow for desktop to enter-
prise geoprocessing. Connecting to a catalog,
users have the ability to publish spatial mod-
els that can then be extended to everyone
through the internet so that information prod-
ucts can be requested, visualized and used.
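At the wire level, delivering data to users as web services typically means answering standard OGC requests. The sketch below composes a generic WMS 1.3.0 GetMap call in Python; the endpoint and layer name are placeholders rather than a live ERDAS APOLLO server.

```python
# Generic OGC WMS 1.3.0 GetMap request; the endpoint and layer are placeholders,
# not a live ERDAS APOLLO service.
from urllib.parse import urlencode
from urllib.request import urlopen

params = urlencode({
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "imagery:orthomosaic",     # placeholder layer name
    "CRS": "EPSG:4326",
    "BBOX": "51.0,4.0,52.0,5.0",         # minLat,minLon,maxLat,maxLon for EPSG:4326
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
})
with urlopen("https://example.com/wms?" + params) as resp:
    png_bytes = resp.read()
with open("map.png", "wb") as f:
    f.write(png_bytes)
```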
For more information, have a look at
www.erdas.com
ware, taking the best of developmental algo-
rithms and implementing these into a user-
friendly environment, easily accessible for both
the novice and the expert user.
With tools for georectifying, filtering and cali-
brating radar images, analysts can derive ele-
vation information regardless of cloud cover,
day or night, from stereo or interferometric
image-pairs using the IMAGINE Radar Mapping
Suite. Analysts can save their radar data in any
raster format, create color images to emphasize
the magnitude of change, derive binary images
to detect the most dramatic changes, create
shapefiles for GIS applications and more.
Utilizing the power of ERDAS IMAGINE, these tools
are interoperable, supporting all raster file for-
mats, with the ability to seamlessly connect
radar data to any of ERDAS' portfolio of solu-
tions and many other geospatial products.
ERDAS radar mapping products support a grow-
ing number of satellite sensors, including ERS-
1, 2 and EnviSat, RADARSAT-1 and 2, TerraSAR-
X (with one meter spotlight modes),
COSMO-SkyMed, ALOS PALSAR and more. Data
from other radar sensors are supported by a
generic import interface.
ERDAS is well aware of the speed at which users want data and also how they want it. What is your solution to the possibility of too much data, since data is captured and updated all over the world, 24/7, and stored in data silos everywhere?
Web Client
Calendar 2010
Advertisers Index
Advertiser Page
GEODIS www.geodis.cz 22
GEOMAX www.geomax-positioning.com 29
Intergeo www.intergeo.de 35
ITC www.itc.nl 52
LEICA Geosystems www.leica-geosystems.com 9
NovAtel www.novatel.ca 2
Optech www.optech.ca 13
RACURS www.racurs.ru 33, 39
Sokkia www.sokkia.net 56
Spectra Precision www.spectraprecision.com 19
SPOT Image www.spotimage.com 15, 17
Topcon Europe www.topcon-positioning.eu 55
VEXCEL Imaging www.microsoft.com/ultracam 23
June
02-04 June ISPRS Commission VI Mid-Term Symposium: "Cross-Border Education for Global Geo-information"
Enschede, ITC, The Netherlands
E-mail: isprscom6@itc.nl
Internet: www.itc.nl/isprscom6/symposium2010
02-05 June ACSM 2010
Baltimore, MD, U.S.A.
Tel: +1 317 637 9200 x141
E-mail: dhamilton@acsm.org
Internet: www.acsm.org/AM/Template.cfm?Section=Conferences
03 June COMPASS10 Annual Conference
Dublin, Ireland
Internet: www.compass.ie
07-09 June Sensors Expo & Conference
Rosemont, IL, Donald E. Stephens Convention Center, U.S.A.
Tel: +1 (617) 219 8330
E-mail: cgroton@questex.com
Internet: www.sensorsexpo.com
07-10 June 2010 Joint Navigation Conference
Orlando, FL, Wyndham Orlando Resort, U.S.A.
Tel: +1 (703) 383-9688
E-mail: membership@ion.org
Internet: www.jointnavigation.org
08-10 June 58th German Cartographers Day 2010
Berlin and Potsdam, Germany
E-mail: kartographentag2010@dgfk.net
Internet: http://dkt2010.dgfk.net
12-14 June Digital Earth Summit
Nessebar, Bulgaria
Tel: +359 (887) 83 27 02
Fax: +359 (2) 866 22 01
E-mail: cartography@abv.bg
Internet: www.cartography-gis.com/digitalearth
14-16 June 2nd Workshop on Hyperspectral Image and Signal Processing
Reykjavik, Iceland
Tel: +354 525 4047
Fax: +354 525 4038
E-mail: info@ieee-whispers.com
Internet: www.ieee-whispers.com
14-17 June Intergraph 2010
Nashville, TN, U.S.A.
Internet: www.intergraph2010.com/schedule/aag.aspx
14-18 June 8th Annual Summer Institute on Geographic Information Science "Interfacing social and environmental modeling"
Florence (Firenze), Italy
E-mail: info@vespucci.org
Internet: www.vespucci.org
15-18 June Canadian Geomatics Conference
Calgary, AB, Canada
E-mail: exdircig@magma.ca
Internet: www.geoconf.ca
15-20 June 3rd International Conference on Cartography and GIS
Nessebar, Bulgaria
Tel: +359 (887) 83 27 02
Fax: +359 (2) 866 22 01
E-mail: cartography@abv.bg
Internet: www.cartography-gis.com
17 June 7th ALLSAT OPEN - GNSS-Reference Network - Quo Vadis
Hannover, Germany
E-mail: open@allsat.de
Internet: www.allsat.de/en/news/allsat_open/2010.html
20-25 June 10th International Multidisciplinary Scientific Geo-Conference and Expo SGEM 2010 (Surveying Geology & Mining Ecology Management)
Albena sea-side and SPA resort, Congress Centre Flamingo Grand, Bulgaria
E-mail: sgem@sgem.org
Internet: www.sgem.org
21-22 June 2nd Open Source GIS UK Conference
Nottingham, University of Nottingham, U.K.
Internet: www.opensourcegis.org.uk
21-23 June COM.Geo 2010
Washington, DC, U.S.A.
Internet: www.com-geo.org
22-24 June Mid-Term Symposium of ISPRS Commission V: Close range image measurement techniques
Newcastle upon Tyne, Newcastle University, U.K.
E-mail: j.p.mills@newcastle.ac.uk
Internet: www.isprs-newcastle2010.org
23-25 June INSPIRE Conference 2010
Krakow, Poland
Internet: http://inspire.jrc.ec.europa.eu/events/conferences/inspire_2010
28-30 June ISVD 2010
Quebec City, Canada
E-mail: ISVD2010@scg.ulaval.ca
Internet: http://isvd2010.scg.ulaval.ca
29 June-02 July GEOBIA 2010
Ghent, Belgium
Internet: http://geobia.ugent.be
29 June-09 July Bridging GIS, Landscape Ecology and Remote Sensing for Landscape Planning (GISLERS)
Salzburg, Austria
E-mail: gislers2010@edu-zgis.net
Internet: www.edu-zgis.net/ss/gislers2010
29 June-09 July Spatial Data Infrastructure for environmental datasets (EnviSDI)
Salzburg, Austria
E-mail: envisdi2010@edu-zgis.net
Internet: www.edu-zgis.net/ss/envisdi2010

July
01-03 July German-Austrian-Swiss conference for photogrammetry, remote sensing, and spatial information science
04 July ISPRS Centenary Celebration
05-07 July ISPRS TC VII Symposium '100 Years ISPRS - Advancing Remote Sensing Science'
Vienna, Austria
Internet: www.isprs100vienna.org
03-04 July InterCarto - InterGIS 16 Cartography and Geoinformation for Sustainable Development
Rostov (Don), Russia
Internet: http://intercarto16.net
04-10 July 20th Anniversary Meeting on Cognitive and Linguistic Aspects of Geographic Space
Las Navas, Spain
E-mail: lasnavas@geoinfo.tuwien.ac.at
Internet: www.geoinfo.tuwien.ac.at/lasnavas2010
06-07 July InterCarto - InterGIS 16 Cartography and Geoinformation for Sustainable Development
Salzburg, Austria
Internet: http://intercarto16.net
06-09 July GI_Forum 2010
Salzburg, Austria
E-mail: office@gi-forum.org
Internet: www.gi-forum.org
10-13 July 2010 ESRI Education User Conference
San Diego, CA, U.S.A.
Tel: +1 909-793-2853, ext. 3743
E-mail: educ@esri.com
Internet: www.esri.com/educ
10-13 July 2010 ESRI Survey & Engineering GIS Summit
San Diego, CA, U.S.A.
Tel: +1 909-793-2853, ext. 4347
E-mail: segsummit@esri.com
Internet: www.esri.com/segsummit
10-13 July 2010 ESRI Homeland Security GIS Summit
San Diego, CA, U.S.A.
Tel: +1 909-793-2853, ext. 2421
E-mail: hssumit@esri.com
Internet: www.esri.com/hssummit
11-12 July 2010 ESRI Business GIS Summit
San Diego, CA, U.S.A.
Tel: +1 909-793-2853, ext. 2371
E-mail: bizsummit@esri.com
Internet: www.esri.com/bizsummit
12-16 July 2010 ESRI International User Conference
San Diego, CA, U.S.A.
Tel: +1 909-793-2853, ext. 2894
E-mail: uc@esri.com
Internet: www.esri.com/uc
18-25 July COSPAR 2010
Bremen, Germany
Tel: +49 (0)421 218-2940
E-mail: chairman@cospar2010.org
Internet: www.cospar2010.org
20-23 July Accuracy 2010
Leicester, U.K.
Internet: www.spatial-accuracy.org/Accuracy2010
26-30 July GeoWeb 2010
Vancouver, Canada
E-mail: info@geowebconference.org
Internet: www.geoweb.org
29 July-02 August MAPPS 2010 Summer Conference
Incline Village, NV, U.S.A.
Internet: www.mapps.org/events/index.cfm

August
01-05 August SPIE Optics + Photonics 2010
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x30491.xml
01-05 August SPIE Photonics Devices + Applications
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x13192.xml
01-05 August SPIE Optical Engineering + Applications
San Diego, CA, San Diego Convention Center, U.S.A.
Internet: http://spie.org/x13188.xml
07-12 August GIslands 2010
Ponta Delgada, Azores Islands, Portugal
E-mail: gislands2010@uac.pt
Internet: www.gislands.org
09-12 August ISPRS Technical Commission VIII Symposium
Kyoto, ICC Kyoto, Japan
Internet: www.isprscom8.org/index.html
16-18 August 2010 URISA/NENA Addressing Conference
Charlotte, NC, U.S.A.
Internet: www.urisa.org/conferences/addressing/info

September
01-03 September RSPSoc 2010 From the Sea-bed to the Cloud-tops
Cork, Ireland
E-mail: rspsoc2010@ucc.ie
Internet: www.rspsoc2010.org
01-03 September PHOTOGRAMMETRIC COMPUTER VISION and IMAGE ANALYSIS Conference - ISPRS Technical Commission III Symposium
Paris, France
Internet: http://pcv2010.ign.fr
02-03 September COBRA 2010 - RICS Research Conference
Paris, France
E-mail: p.chynoweth@salford.ac.uk or r.w.craig@lboro.ac.uk
Internet: www.cobra2010.com

Please feel free to e-mail your calendar notices to: calendar@geoinformatics.com
SURVEY AT SPEED
Capture geo-referenced 360 degree images and point clouds with any car in your fleet.
www.topcon.eu
GNSS Receiver
ULTIMATE VERSATILITY
The entirely new Sokkia GNSS system provides unsurpassed versatility and usability for RTK, network RTK and static survey, enhancing efficiency in all types of field work.
Scalable - Affordable - Triple Wireless Technologies
www.sokkia.net