
11th GSDI Conference | Spot Image International Conference 2009 | Intergraph and Smart Grids | MapInfo Professional v10.0
Magazine for Surveying, Mapping & GIS Professionals
July/Aug 2009, Volume 12, No. 5
ArcGIS 9.3.1
Use Fast, Intuitive Web Maps to Share Your Geographic Knowledge
With ArcGIS, you can create applications that meet today's high expectations for Web mapping. By making your authoritative data available to people both inside and outside your organization via fast, effective Web maps, you give them the spatial intelligence they need to make decisions. To learn how ArcGIS 9.3.1 can help you deploy modern Web maps that are relevant to your entire enterprise, visit www.esri.com/whatsnew.
Users can easily access and leverage your GIS with clients built on Flex | Silverlight | JavaScript | ArcGIS Explorer
Copyright 2009 ESRI. All rights reserved. The ESRI globe logo, ESRI, ArcGIS, and www.esri.com are trademarks, registered trademarks, or service marks of ESRI in the United States, the European Community,
or certain other jurisdictions. Other companies and products mentioned herein may be trademarks or registered trademarks of their respective trademark owners.
For ESRI locations worldwide, visit www.esri.com/distributors.
Czech Republic
www.arcdata.cz
Denmark
www.informi.dk
Estonia, Latvia,
and Lithuania
www.hnit-baltic.lt
Finland
www.esri-finland.com
France
www.esrifrance.fr
F.Y.R.O.M.
www.gisdata.hr
Germany
www.esri-germany.de
Georgia
www.geographic.ge
Greece and Cyprus
www.marathondata.gr
Austria
www.synergis.co.at
Belgium and Luxembourg
www.esribelux.com
Bosnia and Herzegovina
www.gisdata.hr
Bulgaria
www.esribulgaria.com
Croatia
www.gisdata.hr
Hungary
www.esrihu.hu
Iceland
www.samsyn.is
Israel
www.systematics.co.il
Italy
www.esriitalia.it
Malta
www.geosys.com.mt
Moldova
www.trimetrica.com
The Netherlands
www.esrinl.com
Norway
www.geodata.no
Poland
www.esripolska.com.pl
Portugal
www.esri-portugal.pt
Romania
www.esriro.ro
Russia
www.dataplus.ru
Slovak Republic
www.arcgeo.sk
Slovenia
www.gisdata.hr
Spain
www.esri-es.com
Sweden
www.esri-sgroup.se
Switzerland
www.esri-suisse.ch
Turkey
www.esriturkey.com.tr
Ukraine
www.ecomm.kiev.ua
UK/Ireland
www.esriuk.com
Where is my SDI?
For a long time, an SDI appeared to me to be a phenomenon like, let's say, a 78 RPM record by The Beatles: it's something that everyone is talking about, but no one has ever actually seen one, or heard one for that matter. Of course, this comparison is quite far-fetched, but my skepticism was unyielding every time I heard people speak about the creation of nationwide SDIs or huge programs like INSPIRE. To me, it seemed that such long-term projects always take more time (and money) than one thinks. And because there is a political link between INSPIRE and the EU, it's easy to place INSPIRE under widespread EU skepticism.
So I was quite curious when I read that at the 11th Global Spatial Data Infrastructures (GSDI) Conference (held last June in Rotterdam, the Netherlands) the industry would meet with scientists and the public sector to discuss SDIs, INSPIRE and related matters. What were the lessons to be learned from each other? Are people from different fields speaking the same language, or reinventing each other's wheels without knowing it?
On the exhibition floor I heard some grumbling comments from various captains of industry that it was time to act and stop talking. I also heard of interesting new user platforms from which government agencies can benefit greatly, if they are willing to listen to these user communities. There are lessons to be learned from the user communities and companies who make money out of the donation of volunteered geographic information. As is often the case, governments are slow to respond in comparison to the market. But it is actually a good thing that politicians take notice of public initiatives and start to ask themselves "how can we jump on the bandwagon?"
In the end, as in the title of one Beatles song, "Here, There and Everywhere", the GSDI Conference proved to be quite inspiring on this point.
Enjoy your reading,
Eric van Rees
evanrees@geoinformatics.com
GeoInformatics provides coverage, analysis and
commentary with respect to the international surveying,
mapping and GIS industry.
Publisher
Ruud Groothuis
rgroothuis@geoinformatics.com
Editor-in-chief
Eric van Rees
evanrees@geoinformatics.com
Editors
Frank Arts
fartes@geoinformatics.com
Florian Fischer
ffischer@geoinformatics.com
Job van Haaften
jvanhaaften@geoinformatics.com
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com
Remco Takken
rtakken@geoinformatics.com
Joc Triglav
jtriglav@geoinformatics.com
Columnists
James Fee
John Trinder
Contributing Writers
Gordon Petrie
Florian Fischer
Tom Probert
Menno-Jan Kraak
Jan Sukup
Patrik Meixner
Karel Sukup
Huibert-Jan Lekkerkerk
Carmela Burns
Nelson de Jesus Parada
Ulfh Walter Palme
Jason San Souci
Philip Cheng
Remco Takken
Account Manager
Wilfred Westerhof
wwesterhof@geoinformatics.com
Subscriptions
GeoInformatics is available against a yearly
subscription rate (8 issues) of € 89,00.
To subscribe, fill in and return the electronic reply
card on our website or contact Janneke Bijleveld at
services@geoinformatics.com
Advertising/Reprints
All enquiries should be submitted to
Ruud Groothuis rgroothuis@geoinformatics.com
World Wide Web
GeoInformatics can be found at:
www.geoinformatics.com
Graphic Design
Sander van der Kolk
svanderkolk@geoinformatics.com
ISSN 1387-0858
Copyright 2008. GeoInformatics: no material may
be reproduced without written permission.
GeoInformatics is published by
CMedia Productions BV
Postal address:
P.O. Box 231
8300 AE
Emmeloord
The Netherlands
Tel.: +31 (0) 527 619 000
Fax: +31 (0) 527 620 989
E-mail: mailbox@geoinformatics.com
Corporate
Member
Sustaining
Member
Pitney Bowes Business Insight's MapInfo Professional v10.0
This month, Pitney Bowes Business Insight launched MapInfo Professional v10.0, the latest version of the company's flagship application for business mapping and analysis. Designed in direct response to valued feedback from the worldwide MapInfo Professional customer base, this latest, landmark upgrade offers unprecedented new capabilities and equips organisations to make better, faster and more insightful business decisions.
Content
July/August 2009
Articles
Rethinking the Geo-information
Economy with Neogeography
Donate your Geo Data! 12
Covering Large Areas in Short Time
A High Resolution Orthomosaic in Brazil 22
Retooling for the Digital Data Revolution
Geospatial and GIS Technologies 28
Using PixoView Technology
Testing Measurement Accuracy in Oblique
Photography 36
Under the Sea
Ocean Depths 48
Product Reviews
The Power of Ten
Pitney Bowes Business Insight's MapInfo
Professional v10.0 16
Map Reading and Map Analysis
ESRI Book on Map Use 20
Useful Information for Everyday Geodetic Life
Datums and Map Projections Book 27
Forensics versus Research
Geoforensics 44
Interviews
An Interview with Ken Spratlin
Trimble's New GeoSpatial Division 6
Achieving a More Reliable Delivery
for Growing Energy Needs
Smart Grids around the World 50
Column
Geospatial Portals Keys to Success 34
By James Fee
Machine Learning as a Tool for Image
Analysis and Information Extraction 46
By John Trinder
Page 16
ESRI Book on Map Use
Menno-Jan Kraak reviews the sixth edition of the book Map Use: Reading and Analysis, published by ESRI Press. The book is meant as a comprehensive, philosophical, and practical treatment of map appreciation.
Page 20
On the Cover:
Pléiades image of Cannes. Photo credit: CNES/distribution Spot Image. See
article on the Spot Image International Conference 2009 on page 42.
Global Spatial Data Infrastructure
Conference 2009
The eleventh edition of the annual Global Spatial Data Infrastructure (GSDI)
conference was held in Rotterdam, the Netherlands, from June 15 to 19. At
this conference the GSDI Association, an inclusive body of organizations,
agencies, firms, and individuals from around the world, promotes interna-
tional co-operation and collaboration in support of local, national and
international spatial data infrastructure developments.
Smart Grids around the World
"Smart grid is gaining traction throughout the world as a means to conserve resources, lessen pollution and increase the security and resiliency of power grids," states Tony DiMarco, Intergraph Director of Global Utilities and Communications. For a number of years, Intergraph has been active in the field of smart grids. But what exactly is it that smart grids do, and what is their link with geospatial technology? Read all about it.
Page 50
Events
Public Sector meets Science and Industry
Global Spatial Data Infrastructure Conference 2009 32
New Partnerships, Satellites, Products and Strategies
Spot Image International Conference 2009 42
Calendar 54
Advertisers Index 54
Page 32
Page 42
An Interview with Ken Spratlin
Trimble's New GeoSpatial Division
Since 2007, Trimble has acquired four companies (INPHO, Geo-3D, RolleiMetric and TopoSys) which now form the company's GeoSpatial Division. Ken Spratlin, who is the general manager of this new division, is asked about the strategy that lay behind these recent acquisitions and the paths that he sees this newly formed division following in the future.
By Gordon Petrie
Introduction
Ken Spratlin received his education at two of the U.S.'s most prestigious technological universities, the Georgia Institute of Technology (Georgia Tech) and the Massachusetts Institute of Technology (MIT), obtaining his Master's degree at the latter in 1987. He was employed first as an engineer and then as a section chief at Draper Laboratory, which originally was part of MIT and is famous for its research and developments in navigation, guidance and advanced control systems, including integrated GPS/INS systems. He then joined Trimble, where he has held a number of senior managerial positions, including serving as general manager of the company's Military & Advanced Systems Division, where he patented several new developments in GPS technology. He then served as Chief Operating Officer of Nikon-Trimble, a joint venture of the two companies within the field of surveying instrumentation. Following that, he became Trimble's director for new market development, during which time Trimble acquired the four companies (INPHO, Geo-3D, RolleiMetric and TopoSys) that form the basis of its new GeoSpatial Division. Now Ken Spratlin has been appointed as the general manager of the GeoSpatial Division, charged with the responsibility of ensuring that it becomes a leader in the area of geospatial imaging and a commercial success.
GP Please could you outline the thinking that lay behind Trimble's acquisition of the four companies and the formation of the GeoSpatial Division? How does the new GeoSpatial Division fit into Trimble's overall (global) business strategy?
- Trimble focuses on four major market
segments: Engineering and Construction;
Precision Agriculture; Mobile Resource Management (fleet management and mobile workers); and Advanced Devices (GNSS chipsets,
boards, and technology licensing). We see a sig-
nificant opportunity to apply geospatial imag-
ing to the first three of these markets, where
the use of imagery is largely under-penetrated
at the present time. The two most significant
hurdles to its adoption today are: (1) cost of
the systems or data, and (2) the age of the
information, since typically it can take months
from initiating data collection to the delivery of
the information. Trimble intends to address
both of these hurdles with purpose-built sys-
tems for these markets.
Trimble expects the convergence of the land
survey, mapping and GIS, and aerial mapping
segments to accelerate, and is one of the
drivers of this trend. Imaging, largely a tool for
the aerial mapping segment in the past, is
increasingly a part of land survey and GIS solu-
tions today. With this in mind, Trimble's Connected Site solutions foster this convergence now and offer a vision for the future.
The Connected Site creates seamless working
relationships among Trimble products, tech-
nologies, services and their end users. It
enables, for example, surveyors to choose from
a broad range of options, including surveying
techniques, communications channels and facil-
itating services such as GNSS infrastructure,
within a single fully-integrated and interopera-
ble solution. Surveyors benefit from data com-
patibility and transfer with field and office soft-
ware; increased flexibility in using the best tools
and techniques for the job; the adaptation of
specialized technologies to fit the ideal survey
workflow; and localized solutions to address
specific market needs globally.
For example, the Trimble VX Spatial Station
combines optical, scanning and metric camera
capabilities to measure objects in 3D and pro-
duce 2D and 3D data sets for spatial imaging
and traditional surveying projects. With recent
advances in the geospatial information
industry, more opportunities for spatial imaging data are being identified
for transportation and civil engineering, utilities and communications, nat-
ural resources management and government. Many applications use air-
borne information, but can also benefit from ground-based positioning
and imaging.

Fig. 1 Ken Spratlin, the general manager of Trimble's GeoSpatial Division, on the left, and Eric McCuaig of 3D-Geo on the right, at the Intergeo trade fair held in Bremen.
Trimble has participated in the aerial mapping segment since the mid-
1980s, providing GPS receivers to georeference aerial imagery. In 2003,
Trimble acquired Applanix, extending our georeferencing capability with
GPS/INS systems, and later the Applanix Digital Sensor System (DSS). The
acquisition of these four new companies represents a significant expan-
sion of Trimble's commitment to the mobile mapping segment, and a nat-
ural progression of our strategy, given our intent to continue driving the
convergence of the three segments that I mentioned previously.
GP Please explain to readers how the new GeoSpatial Division
is being organised and structured internally.
(i) Where is the head office of the Division located and who do
you report to as the general manager of the GeoSpatial
Division?
(ii) Who are the persons that are responsible for the day-to-day
running of the four formerly independent companies that now
make up the Division?
- Trimble's organizational philosophy is centered on the concept of the division. The functions relating to business strategy, market planning,
product development, sales, and financial management are all functions
typically embedded within the division. This philosophy is applied as well
to the recently formed GeoSpatial Division.
Internally, the divisions are then organized into six sectors, with the seventh sector being the company's corporate strategy and business development function. These sectors are managed by vice presidents, and report
to the CEO. Multiple divisions and sectors address the four major markets.
Mark Harrington, Sector Vice President, manages the sector comprised of
the following divisions: Agriculture; Mapping and GIS; GeoSpatial; Mobile
Computing; Applanix; Infrastructure; Advanced Public Safety / Visual
Statement; Power, Process and Plant; and Trimble Outdoors. I am located
in the Trimble Rockies office located in Westminster, Colorado, near Denver,
and report directly to Mark.
The internal organization of the GeoSpatial Division actually changed in
mid-May to integrate and leverage our full capabilities. When we acquired
INPHO and then Geo-3D, we left these entities largely to operate as-is while
we focused on completing the other acquisitions. With the completion of
the RolleiMetric and TopoSys acquisitions in the fall of 2008, we began to
plan for the GeoSpatial Division to function as an integrated entity. As of
mid-May, GeoSpatial is now organized internally by function: marketing,
engineering, operations, sales, and customer support. The transformation
is not yet complete, but the direction is clear.
GP INPHO was the first of the companies (acquired in February
2007) that now make up the GeoSpatial Division. It is already
well known for its digital photogrammetric and terrain modelling
software products and is the only one of the four acquisitions
that does not develop and sell hardware systems and solutions.
(i) How do INPHO's software products fit into Trimble's spatial
imaging initiative and in which direction(s) can we expect them
to develop in the future?
(ii) Will INPHO continue to offer the Summit Evolution DPW
which it sources from DAT/EM in Alaska?
- INPHO has earned an excellent reputation internationally for devel-
oping highly accurate and precise aerial photogrammetry solutions, work-
ing closely with users to continually improve their solutions and provide
training and technical support. Early in their company history, they also
developed solutions for close range (terrestrial) photogrammetry. So INPHO
was the obvious foundation for the GeoSpatial Division. We plan to con-
tinue to develop the aerial photogrammetry and LIDAR software products
(Fig. 2), and will also leverage their capabilities into other applications for
geospatial imaging in our markets.
INPHO and DAT/EM have enjoyed an excellent, long-term and complemen-
tary relationship. Trimble and DAT/EM have continued that relationship,
and actually converted what was formerly a "handshake" into a formal
relationship. So, yes, we will continue to offer the Summit Evolution DPW
(Fig. 3).
GP In January 2008, Geo-3D was acquired by Trimble. The com-
pany is known as the developer and supplier of its series of Trident-
3D (road) and Atlas-3D (rail) mobile mapping systems and of its
complementary Cyclop-3D aerial mapping system. However, in the
past, Geo-3D has also acted as a service provider supplying
geospatial data to clients via mapping contracts.
(i) Will it continue to operate in this latter role when it is in dan-
ger of competing with its own customers who have bought one or
more of its systems?
(ii) Can we expect Geo-3D to expand its product offerings in the
mobile mapping sector for road asset inventories and 3D urban
mapping since these appear to be application areas with an
obvious future growth potential?
Fig. 2 (a) Flow diagram showing the steps in processing airborne lidar data using the SCOP++ and DTMaster software packages that have been developed by INPHO in partnership with the Institute of Photogrammetry at the Technical University of Vienna. (b) A perspective image of the Olympus Mons volcano on Mars based on Mars Orbiter Laser Altimeter (MOLA) elevation data acquired by the NASA Mars Global Surveyor mission and HRSC image data from the ESA Mars Express mission, using the SCOP++ package. (Source: Institute of Photogrammetry, Technical University of Vienna)

At the time of its acquisition, Geo-3D was predominantly supplying products and solutions to its customers. The service portion of the busi-
ness was and remains a very small portion of the business, operated in
the province of Quebec, Canada. This service business has functioned as
a "test track" for the development of the products and solutions business. Trimble predominantly provides products, solutions, and services-
for-service-companies to its customers. We are committed to this role, and
will operate the GeoSpatial Division similarly.
Of the four acquired companies, Geo-3D has progressed the furthest
toward addressing a specific vertical market, that being the transportation segment, with converged roadway asset management and pavement management solutions (Fig. 4). Our focus is to see these converged
systems achieve high market penetration, and continue to automate the
detection and recognition of more types of assets.
GP Trimble acquired RolleiMetric from Rollei GmbH in September
2008. This appears to have resulted in a quite different situation
to that of the other acquisitions in that Rollei continues to oper-
ate as a separate brand in the consumer camera market and
remains quite independent from Trimble.
(i) Is this the reason for the change of title of the RolleiMetric
operation to be the Metric Imaging Department of Trimble
Holdings GmbH that now appears on its Web site?
(ii) Does this mean that the RolleiMetric name will now disappear?
Yes, that is correct for both questions. Trimble acquired the metric
imaging business (technology and product lines) of Rollei and employed
all the metric imaging staff. The metric imaging business operated under
the name RolleiMetric. The Rollei business for professional medium for-
mat and consumer cameras continues under the Rollei brand. Trimble
acquired the right to use the RolleiMetric brand name for a transition peri-
od, but immediately began re-branding the RolleiMetric products under
the Trimble brand (Fig. 5). We will refer to the RolleiMetric brand name in
some of our communications during the transition period to highlight the
strong technical history of what is now the Metric Imaging Department
within the GeoSpatial Division.
GP The current RolleiMetric line of AIC modular digital mapping cameras, which is available in single, dual, triple and quadruple configurations, would appear to be one of the strongest assets of the new GeoSpatial Division, with consider-
able potential for commercial sales. Can we expect to see fur-
ther development of this particular product line, for example,
resulting in a really large-format digital aerial frame camera?
We will proceed in the opposite direction; toward the development of
smaller cameras that are purpose built for high-precision work on engi-
neering scale projects with rapid turnaround of information to allow rapid
decision making (Fig. 6). The large-format camera market exhibits smaller
growth, inhibited by the very high cost of these cameras. And there are
already three competitors chasing this slower growth, high-cost camera
segment. As you observe, the RolleiMetric product line, and the staff that
developed it, are a very strong asset to Trimble, one that we pursued to
increase our depth in metric imaging to apply towards our strategy.
GP With regard to TopoSys, which was also acquired in
September 2008, it seems that a similar situation has arisen to
that of RolleiMetric, in that the TopoSys company's main product
is now being offered by Trimble as the Trimble Harrier Corridor
Mapping System.
(i) Does this mean that the TopoSys name will also disappear?
(ii) Will the TopoSys Falcon line of airborne laser systems, which were of great technical interest but were not a commercial success, now be dropped from the Division's product line?
Trimble's brand is recognized worldwide, especially in the markets
that the GeoSpatial Division will focus on, so we will operate under the
Trimble brand. Your observation regarding the Falcon II product is correct,
with the underlying LIDAR technology being well over 10 years old, and
the product itself being about 10 years old. The Falcon II product has been
discontinued since several of its subsystems were no longer in production
due to their use of now obsolete components. But the technology, exper-
tise, and know-how developed with the Falcon II are now part of our DNA.
Going forward, we will focus on the Harrier systems (Fig. 7), which are
seeing increasing adoption in the market.
GP To most outside observers, it does seem quite remarkable
that the Applanix company, (which Trimble acquired in 2003),
does not form part of the new GeoSpatial Division. On the one
hand, the Applanix GPS/INS products have often formed integral
parts of the airborne and terrestrial mapping systems offered by
Geo-3D, Rollei Metric and TopoSys. On the other hand, Applanix
also offers products that compete directly with those being
offered by these three companies that form part of the GeoSpatial
Division. Examples of this competition are (a) the Applanix DSS
cameras that compete directly with the RolleiMetric AIC airborne
digital cameras; (b) the Applanix LandMark vehicle-based mobile
mapping system that competes with the similar Geo-3D Trident-3D
system; and (c) the Applanix airborne systems that couple the
DSS camera and the POS AV GPS/IMU unit with a Riegl laser
scanner (e.g. as supplied to Limitless LLC) and compete with the
Trimble/TopoSys Harrier system with similar components.
Please could you explain this situation of Applanix not forming
part of the GeoSpatial Division and outline how these actual and
potential overlaps and competitions between products are being
resolved and managed within Trimble.
Fig. 3 The Summit Evolution Digital Photogrammetric Workstation (DPW) that
is used for feature data collection employing 3D stereo-viewing techniques.
Fig. 4 The Trimble Road Asset Inventory System is based on the Geo-3D Trident
mobile mapping system.
Referring back to my earlier description of Trimble's organizational
philosophy, Applanix operates as one of our divisions within Trimble with
a defined focus on integrated GNSS/INS systems for mobile mapping, as
well as solutions for the rapid response market. We work together closely,
on a daily basis, to supply the underlying technology for both divisions
for use in our respective areas of focus, as well as to ensure that we make
the right solutions available to meet each customers needs.
With regard to perceived or actual overlap in products, I would offer sev-
eral observations. First remember that the motivation for these acquisi-
tions was to increase the depth of our technical and market capability for
what is a quite challenging and long-term commitment to solve complex
problems in geospatial imaging. So in that respect (the people aspect) there is no overlap.
Second, with regard to the Trimble Aerial Camera (RolleiMetric AIC) and
the Applanix DSS, the overlap is largely perception, but a perception that
does indeed exist in the marketplace. We investigated this exhaustively
during the RolleiMetric acquisition. What we found was initially surprising
but not unexpected upon further reflection. The Aerial Camera is a cam-
era, while the DSS is a camera system (comprised of camera, GNSS/INS,
flight management system, etc.). While we found that potential customers
contacted both Applanix and RolleiMetric when first considering the pur-
chase of an aerial digital camera, these customers rapidly self-selected
into two groups (1) those that wanted a camera to perhaps integrate
into an existing LIDAR system or to replace a film camera in an aircraft
that was already equipped with a flight management system, and (2) those
that needed a turn-key imaging or imaging/LIDAR system. So we found
that, in reality, perhaps only 10 percent of the time, were Applanix and
RolleiMetric still competing when the customer was ready to make a pur-
chase decision. With both products now within Trimble, we can meet the
needs of both types of customers.
Today, we have a compelling airborne product portfolio from cameras
(Trimble Aerial Camera) to camera systems (Applanix DSS) to integrated
imaging/LIDAR systems (Trimble Harrier). In the future, we plan to expand
this portfolio with more configurations to provide customers with greater
choices and purpose-built systems to focus on their specific applications.
GP You mentioned earlier in the interview the convergence of
the land survey, mapping and GIS, and aerial mapping segments.
What is Trimble's perspective on these industries over the next
five to ten years?
Trimble's focus is to provide robust and ubiquitous information solu-
tions that meet the needs of our defined market segments. As I mentioned
before, over the next 5-10 years, we see traditional industry boundaries
blurring between land survey, mapping and GIS, and aerial mapping. The
field and the office are overlapping as data processing and engineering
expertise move closer to projects. Surveyors are adding data management
abilities to their skills portfolio. Engineering and spatial data are being
tracked with project timeline and accounting data. Survey instruments are
combining GNSS, optical and imaging capabilities. And construction machin-
ery is utilizing GNSS and lasers to enable 3D machine control that puts
design surfaces, grades and alignments in the cab, allowing automatic,
accurate real-time positioning for earth-moving operations. Put simply,
everything is converging, connecting. Trimble's Connected Site solutions
foster this convergence now and offer a vision for the future, to ultimately
improve productivity and transform the way in which work is done.
Gordon Petrie is Emeritus Professor of Topographic Science in the Dept. of
Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K. E-
mail - Gordon.Petrie@ges.gla.ac.uk
Fig. 5 The Trimble Aerial
Camera, formerly sold as the
RolleiMetric AIC
(Aerial Industrial Camera).
Fig. 6 A Nikon D3 small-format
digital frame camera that has been
calibrated for photogrammetric
applications by Trimble's Metric
Imaging Department. The camera
can be connected via a suitable
adapter to an appropriate GPS
receiver to allow it to acquire geo-
coded images.
Fig. 7 (a) The Trimble Harrier Corridor Mapping System which comprises a
full waveform airborne laser scanning system; an Applanix POS-AV position and
orientation system; and an optional imaging system.
(b) A pseudo-coloured image of an open-cast mine located near Havelsee, in
Brandenburg, Germany, produced from data acquired by a Trimble Harrier
Corridor Mapping System.
[a]
[b]
Rethinking the Geo-information Economy with Neogeography
"All of Austria on OpenStreetMap." This was breaking news in January 2009, when it became public knowledge that the Viennese publishing house Compass.at would provide geo data covering all of Austria from its portal Plan.at to OpenStreetMap (OSM). Florian Fischer met with Hermann Futter, the CEO of Compass.at, in Vienna to get the whole story.
By Florian Fischer
The publishing house Compass.at is provid-
ing data from its website Plan.at for importa-
tion to OpenStreetMap. The announcement in
January 2009 was brief and did not get the
broad media attention that was expected,
even though this step by Compass.at might
be the first sign of a change in the Austrian
geo-information market. Since then the com-
munity has been working on the integration
of the geo data.
Geo-data Provider Overnight
Compass has been publishing business infor-
mation for more than 140 years. They were
among the first 500 companies worldwide that
ran their own web servers to publish their data.
In 1997 they started a service called Plan.at
which, as Futter says, became "the new economy hangout". All business data from Compass
was mapped on Plan.at using geographic data
from one of the numerous private sector geo
data providers of the time. "Before the duopoly of Navteq and TeleAtlas emerged, there were some other geo data companies as well," explains Futter. To shorten the story a bit,
Plan.at reached the break-even point just before
the crash of the new economy, their geo data
provider went bankrupt, and Compass bought
all the data and became a geo data provider
overnight. "It was mainly basic geo data," Futter says, "and we continued to work on the data to maintain and extend it."
Becoming a private sector mapping agency
overnight, they were somewhat free from the
mental constraints of those who were deep in
the business of private and public mapping
agencies, and they had a vision. "At that time we had the idea to launch a geo data community together with other surveying companies to collaboratively stream our data to a common pool. Our aim then was to be independent from the big players, at least concerning the coverage of Austria." Compass encountered problems
similar to those of the predecessor of
Wikipedia. Their requirements for the validity of
the data were simply too high and too strong.
"The idea of OSM has been much more elabo-
rated than ours," Futter admits, "but when we
saw what OSM is, it was clear for us to con-
tribute our geo data". Compass.at will sooner
or later make a total change from using their
own geo-data to the usage of OSM data for
their product Plan.at.
Hermann Futter, CEO of the publishing house Compass.at in Vienna
Donate your Geo-data!
OSM might become a Strong
Competitor
Futter believes that "as a small entrepreneur I don't have a chance against the OSM community". Thus Compass is joining the commu-
nity rather than becoming its opponent. And
OSM could become a big competitor of
Navteq and TeleAtlas. They are in competition
especially for tourist, city and business infor-
mation systems because OSM is reliable
enough for these domains. However in other
domains the requirements are higher and
OSM cannot give a legal guarantee for the
validity of its data, e.g. the automotive sec-
tor where data is tied to vehicle safety. The
public mapping agencies are still skeptical
about OSM. They claim that crowd-sourced
data or volunteered geographic information
(VGI) does not have the same quality as their
products, but they are overlooking an impor-
tant fact. While public mapping agencies need
highly accurate geo data for their administra-
tive tasks, no one needs such accurate data
to build a branch finder or a tourist informa-
tion system. Furthermore it is not their task
to create products but only to create a basis
for economic activities. That's why we will soon be in competition with the city administration of Vienna (Magistrat Wien). The City of
Vienna offers a public viewer for geo data sim-
ilar to Plan.at. The database for this applica-
tion is derived from the public mapping agen-
cy. Thus it is highly detailed and accurate.
Hermann Futter thinks the City of Vienna
hence competes with private companies. This
is not good for the economy and is not their
task. Additionally, their data is practically inac-
cessible because of the high price.
OSM Data Import Not without Hassle
Compass donated all its geo data, worth an estimated half million Euro, to the OpenStreetMap community. The data was given with the obligation that the community members would incorporate it into the OSM database. "But that's fairly easy and has been a fast process, because there is a coordinate system and you can just take the different layers into the database," remarks Futter. However fast the integration was, the OSM community encountered some early problems with positional accuracy. It seemed that data from Compass was less accurate than was thought and partly outdated in comparison to current OSM data. Some community members even refused to input data in their area. They believe that their GPS mapping is of better quality than the data from Compass.
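The article does not describe the actual tooling used for the import. Purely as an illustration of what "taking the different layers into the database" involves, the Python sketch below turns a donated point layer into the OSM XML exchange format; the sample point and field names are hypothetical, and a real import would go through the OSM API with proper changesets.

import xml.etree.ElementTree as ET

# Hypothetical donated layer: a list of points with a name attribute.
donated_points = [
    {"lon": 16.3738, "lat": 48.2082, "name": "Stephansplatz"},
]

# Build a minimal OSM XML document; negative ids conventionally mark
# objects that do not yet exist in the OSM database.
osm = ET.Element("osm", version="0.6", generator="donation-sketch")
for i, pt in enumerate(donated_points, start=1):
    node = ET.SubElement(osm, "node", id=str(-i),
                         lat=str(pt["lat"]), lon=str(pt["lon"]))
    ET.SubElement(node, "tag", k="name", v=pt["name"])

ET.ElementTree(osm).write("donation.osm", encoding="utf-8",
                          xml_declaration=True)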
A Win-win Situation for Compass and OSM
As well, OSM has some pitfalls that have to be covered by Compass. In the view of Hermann Futter, OSM has some performance problems with its servers at the moment. Mainly due to ever-increasing traffic, the OSM Foundation cannot keep up with upgrading its server capacity. Thus Compass will become a kind of mirror of the OSM data and import OSM data at regular intervals, for three reasons. First, to relieve the OSM servers by taking over some of the traffic. Second, to extend the OSM data with commercial data that can be sold to customers. And third, to provide a front-end for OSM adapted to Austrian standards and users.
Compass will still have a mapping team to be effective with regard to corrections. Some Compass customers need to have corrections done immediately or have special requirements for detailed information. These requirements can clearly not be put to the OSM community but have to be handled by the Compass mapping team. "With OSM I can get more accurate data. Data which may not cover everything but which is flexible." He considers the possibility of using individual cartographic representations for the OSM data. This is an advantage not to be underestimated, as the cartographic representation on OSM's front-end differs widely from Austrian standards, e.g. highways in OSM are blue while they are yellow in Austria. As it is data, and not just images from a map service like Google Maps or Bing Maps, various cartographic representations for arbitrary purposes can be adopted by Compass.
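Because OSM hands over the vector data itself rather than finished map images, a portal operator can re-symbolise it to local conventions, as Futter describes. A minimal Python sketch of that idea, assuming a hypothetical austria_roads.geojson extract of OSM highway data and the third-party geopandas and matplotlib packages:

import geopandas as gpd
import matplotlib.pyplot as plt

# Hypothetical extract of OSM road data carrying the usual "highway" tag.
roads = gpd.read_file("austria_roads.geojson")

# Austrian convention: draw motorways yellow rather than OSM's default blue.
colours = roads["highway"].map(
    lambda tag: "gold" if tag in ("motorway", "trunk") else "lightgrey")

ax = roads.plot(color=colours, linewidth=0.8)
ax.set_axis_off()
plt.savefig("austria_roads_local_style.png", dpi=150)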
Hence this seems like a win-win situation. Compass takes some traffic off the OSM servers and thus might become a popular entry point for OSM in Austria. This might bring Compass an increasing number of clicks and, with them, some profits.

The frontend of Plan.at

How to create Value with Plan.at
According to Futter, Plan.at has approximately 40,000 unique users at the moment. With an ever increasing number, new ways of adding value are possible, as selling the data is kicked out of the value-added chain when using OSM data. The big difference between OSM and other commercial data providers like Google Maps is that OSM data and the cartographic representations can be used under
the Creative Commons License cc-by-sa. That
is, private persons and companies can use
OSM's data free of license fees as long as
there is a reference to the source. As a mat-
ter of course the data cannot be sold because
everyone can get it for free. Thus added-value
has to be created in another way. "Web portals must distinguish themselves," is Hermann Futter's answer. Plan.at, for example, offers
tools for measuring areas and distances which
are not offered by other portals. Compass
adds value in two ways. They produce appli-
cations and services for companies and they
try out some advertising business models. For
example, a layer for certain branded location
stations might be implemented if a company
becomes an advertising partner.
Signs for a Change of Face for
Mapping Agencies?
In addition to Compass.at, other private and
even public mapping agencies have donated
their geo data to the OSM community. The
company Automotive Navigation Data (AND),
a leading provider of location, routing, map-
ping and address management, donated its
street level data of the entire Netherlands and
the major road networks of China and India
to the OpenStreetMap community. The
Bavarian state mapping agency provided aeri-
al photos with a resolution of 2 m to the OSM
community. Unlike AND and Compass.at, the
images were not integrated into the OSM
database. The community's active mappers
could use them to digitize and derive geo
data for the OSM database. The project has
ended and is considered a full success. Now
the Bavarian state mapping agency wants to
convince mapping agencies from the other
German Bundesländer to also provide aerial
photos to the OSM community. Hence it
seems that a change of face is slowly coming
even to the public mapping agencies.
The Advent of a Change in the
Geoinformation Economy
The deployment of open and crowd-sourced
geo data in the geo-information economy
means a radical change in the value-added
chains. The sale of geo data has dropped out
of the chain. Geo data on a cost-free basis
will boost the creation of applications, the
refining of data and services based on geo-
communication. These are where the biggest
profits can be gained in the whole value-
added chain. The collection, maintenance and
publishing of basic geo data then is the task
of a network of private mapping agencies and
an active community of mappers, such as
those you can find in the OpenStreetMap pro-
ject. Actually this work originally belongs to the public mapping agencies, so they should take care not to be left out of this process. The
value-added chain will probably focus on
areas similar to those of the open-source community, where profits are made with the distribution of software packages and the creation of applications based on open-source software. The distribution of customized geo-information products and applications might be an option, or appliance offers, such as different front-ends for OSM. Last but not least, a very interesting
domain will be the mediators who will keep
the communities active and offer consultant
services on using and communicating with
geo information. This means a radical chance
in the geo information economy and in
Hermann Futters view it could even sweep
away other map applications and geo data
providers like Google Maps, Bing Maps,
Navteq and TeleAtlas.
But amid all this enthusiasm for volunteered
geographic information and crowd-sourcing,
the issue of stability is not often mentioned.
Finally, you never know what the community
will do. Certainly there will always be some-
one to do the mapping and the refining of
data. That is the way of life in the digital gen-
eration. But you never know what direction
the community will take. Actually this is one
more reason for today's mapping agencies to
get into neogeography and actively take part
in the future of the geo-information economy.
Compass.at shows the way to go.
Florian Fischer (ffischer@geoinformatics.com) is a contributing editor of GeoInformatics.
The frontend of OpenStreetMap
Great news! Next year Spot Image will offer precise 50 cm imagery products from the new Pléiades constellation, bringing the best resource availability to the market.
Ask us for the Earth
www.spotimage.com
PEMA2b - Photographer: Jean-François Damois - Image simulée, copyright CNES distribution Spot Image
Daily revisit to any point on the globe with Pléiades 1 & Pléiades 2.
Collection capacity of more than 1 million square kilometers per day.
Rapid image delivery thanks to exceptional acquisition capacity, superior agility and a 20 km broad swath.
Urgent programming and direct tasking will make the Pléiades constellation the most reactive imaging source.
Very accurate* 50 cm products for large-scale mapping, land administration, defense & security, and mobile applications.
*3 m (CE90) without GCPs
The Power of Ten
Pitney Bowes Business Insight's MapInfo Professional v10.0
This month, Pitney Bowes Business Insight launched MapInfo Professional v10.0, the latest version of the company's flagship application for business mapping and analysis. Designed in direct response to valued feedback from the worldwide MapInfo Professional customer base, this latest, landmark upgrade offers unprecedented new capabilities and equips organisations to make better, faster and more insightful business decisions.
By Tom Probert
Over half a million users across public and
commercial sector organisations use MapInfo
Professional to help them easily visualise and
harness the critical relationship between data
and geography to make better business deci-
sions.
MapInfo Professional v10.0 is the latest
release of this popular product and Pitney
Bowes Business Insight has worked in close
consultation with over 400 customers to
ensure that this upgrade reflects the evolving
needs of users. The end result delivers sub-
stantial improvements at all levels, but in par-
ticular in three key areas: usability, data
access and cartographic output. These devel-
opments combine to deliver a marked
increase in productivity for existing users and
a faster learning time for new users.
With an eye firmly on improved ergonomics,
MapInfo Professional v10.0 has been devel-
oped to deliver increased productivity and sig-
nificant cost savings for local authorities, con-
sultancies and related agencies across virtu-
ally all operational and project areas. The
redesigned interface means that users will get
more done in less time; time savings can be
measured in speed of fulfilment or simply in
mouse clicks.
Sharing output and access to data feature
heavily in the list of productivity benefits.
The new Layered PDF output provides an
instantly accessible and very flexible way of
sharing output with all the members of an
extended project team, including (and espe-
cially) those who are not users of a mapping
or GIS system.
Support for PostGIS, a freely available Open
Source database system, is a new cost-effec-
tive option for central data storage in MapInfo
Professional v10.0. This ability to access data
where it lives, without the requirement for
translation and maintenance of multiple ver-
sions of data, is an important factor in the
productivity of all large scale projects. In
addition to the newly added support for
PostGIS, Version 10 builds on MapInfo
Professionals already impressive data access
capabilities with native support for SQL Server
2008 spatial data and others.
Since this is version 10.0, Pitney Bowes
Business Insight has drawn up a Top Ten list
of the new capabilities and enhancements to
help quickly identify the added advantages
and business benefits which MapInfo
Professional v10.0 brings into play:
1. An Improved Layer Control system
MapInfo Professional v10.0's redesigned Layer
Control system makes users far more produc-
tive compared to previous releases of the
software. Common operations take fewer
clicks and the instant feedback it provides
helps new users to learn more quickly.
A major benefit of the new system is that the
Layer Control operates as a floating, docked
or slide out window, meaning the user can
interact with it and see updates to the map
immediately. This makes it more intuitive.
For example, a new user in a local authority
experimenting with the zoom layering capa-
bilities will see the impact on the map imme-
diately.
A town planner or engineer viewing three dif-
ferent map windows of the same area can
also work more easily in MapInfo Professional
v10.0. For example, the user has one map
window containing aerial photography, a sec-
ond containing the government supplied main
mapping source and a third containing the
local authority parcel boundaries. Copying a
layer from one of the map windows to anoth-
er is as easy as dragging and dropping.
Changing the same layer in all three windows
can also be done in one operation instead of
three separate efforts.
In addition, the new preview button intro-
duced in the Layer Settings dialog boxes
allows for multiple previews to be viewed
UK Post Codes by Population Density: MapInfo Professional v10.0 makes it even easier for organisations to
harness the relationship between data and geography to make better business decisions
without the need for moving into and out of
the Layer Control option.
Thus, a site location analyst preparing a
report to be presented at a public hearing can
easily experiment with different colours and
styles in order to create the best looking map
possible.
MapInfo Professional users tend to have a
large number of layers in their map window,
and now they can work with these multiple
map layers at the same time. In some cases,
there may be multiple layers stored as differ-
ent sets of data, but that should appear on
the map as the same style. For example, a
crime analyst might have a number of differ-
ent layers of crime information. The layers
might represent sets of data from different
time periods or different types of incidents.
If a user needs to turn labels off or on for all
of the layers at once or change the label font
for all of them at once, this can be done in a
single operation in MapInfo Professional
v10.0. The layers could even be in different
map windows.
Grouped layer support is a further addition to
MapInfo Professional v10.0's improved Layer
Control system. A Grouped Layer is simply a
number of individual map layers that are com-
bined and represented as a single virtual layer
in the Layer Control system. This allows for
a cleaner, easier to read list of layers and
allows a whole group to be operated on at
once when turning the layer display on and
off.
The display styles of the individual layers in
the group can also still be independently con-
trolled.
As an example, this can be useful for a retail
site location analyst who might have a num-
ber of different levels of geography or tables
of different demographic variables available
for geodemographic analysis. These can all
be organised into one or a number of grouped
layers to make the whole list of layers easier
to use and manage.
2. New Toolbar System
MapInfo Professional v10.0 offers a new tool-
bar system. The icons have been redesigned
to make them easy to distinguish and have a
more modern look. The changes are more
than just superficial; the tool bars can now
be docked anywhere on the screen and the
icons are easier to learn as they appear next
to their associated menu command.
3. New Live Scale Bars
MapInfo Professional v10.0 offers a new live,
automatically updating scale bar facility. This
improves map presentation and readability.
The new scale bar adapts its values as the
view of the map changes.
This is a comprehensive capability offering multiple styles of scale bars and user control over the style and display.
4. Create Layered PDF Output
MapInfo Professional v10.0 ships with a high quality Layered PDF driver. Layered PDFs offer flexibility in viewing the resulting map, as the map reader can control the level of detail that is offered. The capability works with both plain map windows and with MapInfo Professional's Layout window. This new functionality places control of the viewing of complex, multi-layered maps directly into the hands of the user and delivers enhanced levels of viewing customisation and flexibility. For map development, this delivers significant time and cost-savings as it eliminates the need for additional maps to be produced when all that is required is a change in layers.
5. Support for PostgreSQL and PostGIS Database Systems
For those unfamiliar with PostgreSQL, it is an open source database system. PostGIS is a spatial add-on developed for PostgreSQL. MapInfo Professional v10.0 directly supports PostGIS databases with no middleware required. Pitney Bowes Business Insight's support for PostGIS benefits from more than twenty years of experience in working with spatial database systems. MapInfo Professional users that are familiar with how the software works with other spatial databases such as Oracle Spatial or with Pitney Bowes Business Insight's own SpatialWare technology will be very familiar and comfortable with MapInfo Professional v10.0's support for PostGIS. The support has been implemented in a manner consistent with how earlier versions of MapInfo Professional already interact with other database systems.
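MapInfo Professional talks to PostGIS through its own database connectivity, but the underlying idea, spatial data living in an ordinary open-source database that any client can query without middleware, is easy to illustrate in Python. The database name, credentials and the "parcels" table below are hypothetical, and the third-party psycopg2 package is assumed:

import psycopg2

# Connect to a hypothetical PostGIS-enabled database.
conn = psycopg2.connect(dbname="gisdata", user="gis",
                        password="secret", host="localhost")
with conn, conn.cursor() as cur:
    # Ask PostGIS for each parcel's area and its geometry as WKT text.
    cur.execute("""
        SELECT id, ST_Area(geom) AS area_m2, ST_AsText(geom)
        FROM parcels
        ORDER BY area_m2 DESC
        LIMIT 5;
    """)
    for parcel_id, area_m2, wkt in cur.fetchall():
        print(parcel_id, round(area_m2, 1), wkt[:60])
conn.close()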
6. Full (native) Support for Microsoft SQL Server 2008 Spatial Data
MapInfo Professional v10.0 offers direct connectivity with the ability to read, write and edit spatial data from an SQL Server 2008 database system. MapInfo Professional version 9.5 offered read-only support for SQL Server 2008 spatial data last year. However, this year the support has been extended to uploading and editing data as well. As with PostGIS, MapInfo Professional v10.0 works directly with the database and no costly middleware is required.
7. Support for the latest Microsoft Excel and Access Data Formats
MapInfo Professional v10.0 offers direct access to the latest Excel (.XLSX) and Access (.ACCDB) data formats. This makes the sharing and gathering of geospatial data much simpler and can save organisations significant amounts of time through no longer needing to convert data from one format to another.
8. Handy Tool for making Spatial Area Calculations
MapInfo Professional has long had the ability to allow a user to calculate values across different sets of spatial data, for example, to determine what the percentage is of wetlands area across a set of boundaries such as counties. Version 10.0 offers a tool that simplifies this process by offering a focused user interface.
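The wetlands-per-county figure described above is a classic polygon overlay. It is not how the MapInfo tool itself works internally, but the same calculation can be sketched in Python with the third-party geopandas package, assuming hypothetical counties.shp and wetlands.shp layers that share one projected coordinate system:

import geopandas as gpd

counties = gpd.read_file("counties.shp")   # must contain a "county_id" column
wetlands = gpd.read_file("wetlands.shp")

# Intersect the two layers, then total the wetland area inside each county.
pieces = gpd.overlay(counties, wetlands, how="intersection")
wet_area = pieces.dissolve(by="county_id").area

counties = counties.set_index("county_id")
counties["pct_wetland"] = (wet_area / counties.area * 100).fillna(0)
print(counties["pct_wetland"].round(1))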
UK Crime Data: MapInfo Professional v10.0 enables crime analysts to manipulate data related to different types of incidents in a single operation. This makes the process of identifying important patterns and trends even easier and can help the police in locating criminals before they go on to reoffend.
9. UTF-8 support in DBF Files
MapInfo Professional v10.0 offers support for DBF files making use of UTF-8 encoding.
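Outside MapInfo Professional, such a UTF-8 encoded attribute table can be read in a couple of lines of Python, for example with the third-party dbfread package; the file and field names below are hypothetical:

from dbfread import DBF

# Each record behaves like a dictionary of field name -> value.
for record in DBF("stations.dbf", encoding="utf-8"):
    print(record["NAME"])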
10. Improvements to Installation
and Documentation
In MapInfo Professional v10.0, all of the con-
figuration and installation information has
been consolidated into a single document.
The installation of the MapCAD TOOLS (the
comprehensive CAD-like editing capabilities
introduced last year) has been integrated.
Previously, there had been separate manuals
for the license server, workgroup installation
and others. In addition, EasyLoader v10.0 is
shipping with MapInfo Professional v10.0 this
year, avoiding a separate download and
install process.
Conclusion
MapInfo Professional owes much of its ongo-
ing success to the valuable feedback it
receives from its worldwide customer base.
Pitney Bowes Business Insight's Product
Management and Development teams under-
stand it would be impossible to redesign the
product on their own as there are simply too
many different types of users, data sources,
options, possibilities and use cases.
Therefore, through the company's annual user
conference, beta testing programmes and
ongoing contact with the customers that actu-
ally use the product on a day-to-day basis,
Pitney Bowes Business Insight is constantly
looking to improve MapInfo Professional by
gathering, evaluating and implementing this
information from existing users. MapInfo
Professional v10.0 is the end result of this
intensive ongoing process and reflects the
evolving needs of users who can now achieve
significant time and cost-savings by harness-
ing the critical relationship between data and
geography to make even better, faster and
more insightful business decisions.
Tom Probert is Desktop Product Manager, EMEA at
Pitney Bowes Business Insight. For more informa-
tion, please visit www.pbbusinessinsight.com or
www.pb.com
New Layer Control System
UNIGIS
Educating GIS Professionals Worldwide www.unigis.org/uk
Study for a postgraduate qualification in GIS
by distance learning
With over 16 years of experience presenting distance learning courses
to professional standards, UNIGIS offers you access to the premier
international network of Universities in GIS education.
Our courses meet the learning needs of busy professionals, or those
seeking to enter the GIS industry. We support you with personal
tutors, on-line help and optional residential workshops. Our courses
are assessed by coursework - there are no examinations.
Find out why the UNIGIS postgraduate courses are so successful:
call +44 161 247 1581, fax. +44 161 247 6344, email unigis@mmu.ac.uk,
or visit our web site at http://www.unigis.org/uk
WE ALSO OFFER:
Flexible entry requirements
Specialist pathways in GIS, GI Science, GIS &
Management and GIS & Environment
Course modules supported by key textbooks,
software and on-line resources
Flexible study options - full distance learning
or distance learning plus residential workshops
No examinations - full continuous assessment,
plus credit for prior learning or experience
Networking with an international community
of GIS professionals
Map Reading and Map Analysis
ESRI Book on Map Use
The sixth edition of the book Map Use: Reading and Analysis is meant as a
comprehensive, philosophical, and practical treatment of map appreciation.
Menno-Jan Kraak discusses its contents and compares this new publication with
the classic first edition from 1978.
By Menno-Jan Kraak
It is common knowledge among cartographers
that when you put four of them in a car it will
never reach its destination. Using maps is quite
different from making maps. A lot has been writ-
ten about map design and production and far
less on the actual use of maps. The book Map
use: reading and analysis is one of those that
treats use and it is a classic. As a student I used
the first edition from 1978, then authored by
Philip Muehrcke. Opening the current 6th edi-
tion is a pleasant surprise: a full-color book with well-designed illustrations. The earlier editions were issued by a small private publisher; the
6th edition has been published by ESRI Press.
It is obvious this publisher loves books and
maps. It is also a guarantee for the authors that
the book gets exposure in the right environ-
ment. Another reason for the quality of the illus-
trations is related to the fact that Kimberling
and Buckley (now an ESRI employee) have long
experience at Oregon State University and have
been involved with the Atlas of Oregon. Many
map samples are devoted to Oregon. A disad-
vantage of this, despite the quality, is the bias
towards North American cartography.
The authors position their book as a bridge
between academic indoor map use and military
way finding. As they state, this book offers its readers 'a comprehensive, philosophical, and practical treatment of map appreciation'. They intend to reach that objective by what they define as a fluid definition of a map: 'a graphical representation of the environment that shows relations between geographic features', a definition that originated in Robinson and Petchenik's The Nature of Maps from 1976. In addition they make a distinction between the tangible map and a user's mental or cognitive map, and when appropriate discuss commercial products.
GPS and Land Partitioning
The book is split into two major parts, one on
map reading and one on map analysis. Each
chapter is introduced by a short text that puts
the content of the chapter into perspective. A
reference list ends each chapter. The content of
the chapters is well illustrated and explained in
detail. It makes the book useful for college stu-
dents, but its depth also makes it useful for
those seriously interested in a practical topic
such as the use of GPS and maps. Both sec-
tions are preceded by an introduction that
explains the basics of the map. Here, next to mental maps, they use the term 'cartographic maps', a term which I find a bit strange, since I would argue that any map is cartographic. In this section I found one of the cartoons that characterized the earlier editions. It shows how a child's mental map might work (see the image on page 21). The introduction is also an advertisement for maps, giving four major arguments why
maps are popular: they are convenient to use,
they simplify our surroundings, they are credi-
ble, and they have a strong visual impact. These
seem to be good arguments, but happily the
authors also discuss the other side of the coin.
For instance, the credibility argument is not
always right, and the map reader should be on
the alert for distortions, errors and omissions,
which might happen by accident or even on
purpose.
The map reading section is split into 10 chap-
ters, each discussing an aspect of the map as
a whole. Here the book has the most overlap
with common cartographic textbooks. Chapter
1 describes the earth and geographic coordi-
nates. The notions of ellipsoid, geoid and
graticule are well explained and illustrated.
Chapter 2 deals with the notion of scale. In
chapter 3 the need for and effects of map pro-
jections are treated by dealing with their prop-
erties. Several common projections are dis-
cussed in more detail. Examples of planar
projections are the orthographic projection (the
Google Earth view) and the gnomonic projec-
tion. Examples of cylindrical projections are
Mercator and the Transverse Mercator, and as
an example of the conic projection the Lambert
conformal projection is given. This one is often
applied on North American maps. The
Mollweide and Robinson projections are examples of global projections.

Title: Map Use: Reading and Analysis
Authors: Kimerling, A.J., Buckley, A.R., Muehrcke, P.C. & Muehrcke, J.O.
Publisher: ESRI Press Academic
Price: € 75,-
ISBN: 9781589481909
Chapter 4 elaborates on the many possible
coordinate systems one might find on a map.
This chapter is very much North American ori-
ented, with lots of attention to the state coor-
dinate systems. But also the UTM system is
explained and some European systems are
treated briefly. After reading this chapter one
can really use the maps as such. Land parti-
tioning is the subject of chapter 5. It deals with
the shape of land parcels in North America and
explains the history behind the different shapes,
often due to the habits of former colonial pow-
ers. In chapter 6 the visualization of the earths
terrain is discussed. All aspects of relief portray-
al are treated including relative and absolute
heights and depths, relief shading, raised relief
models, block diagrams etc. Aspects of digital
terrain models and fly-throughs are not forgot-
ten. Thematic maps are dealt with in chapter 7
(qualitative maps) and chapter 8 (quantitative
maps). Both chapters are, again, lavishly illus-
trated with many map examples and both fol-
low a similar structure, dealing with the principles of single themes, multivariate themes, change maps, and dynamic change maps. Any map type one can think of is described. Image maps are the topic of chapter 9. The basics of photogrammetry and remote sensing are explained and many samples are shown. The image maps themselves, for example imagery plus cartographic symbology, are limited to the last pages
of the chapter. Google Earth / Virtual Earth, the
environment that many readers will be familiar
with, gets only one paragraph. Map accuracy
and uncertainty are treated in chapter 10.
Generalization takes up a major part of the
chapter. Other topics are the sources of errors
and options for how to communicate these.
A Cautionary Tale
The map analysis section focuses on the prac-
tical use of maps. As the authors inform the
reader: here our goal is to analyze and describe
the spatial structure of - and relationships
among - features on the map. The analysis can
be visual and quantitative. The first approach
might result in different answers depending on
the map reader. The second approach is objec-
tive, and should always lead to the same
answer. However, errors might still occur. The
authors claim, rightfully, that the beauty of map analysis is that you can get more out of the map than was put into it. This is obvious when looking at a simple contour map, where a climber might see barriers in his/her path and a geologist might see fault lines, while the cartographer has just expressed heights. The map
analysis part has eight chapters. Chapter 11
describes how to determine distances and
chapter 12 does the same for direction. Both physical and functional distances (like travel time) are discussed, based on several measurement techniques using different instruments. Directions exist in many flavors. Here geographic direction (towards the geographic north pole) and magnetic direction (towards the magnetic north pole) are described. For the latter type of measurement, examples of both digital and traditional compasses are given. Practical examples for large and small scale maps are elaborated.
Position finding and routing are the topics of chapter 13. The 'where am I?' question is illustrated by several examples with the map and compass. Navigation for land, sea and air using more advanced technology is also part of the chapter. GPS is dealt with in chapter 14. After a brief description of the technology behind GPS, its operation and use are explained. GPS accuracy gets special attention. This is needed because people will often be surprised that GPS data is not necessarily the most accurate data. The operation of handheld systems also gets special attention, and the authors do not forget to mention their limitations as well. They stress that you will still need a compass and map, and not only for when your batteries die. In chapter 15 methods to measure shapes, areas and volumes are discussed. Many methods are illustrated with maps and examples of calculations. Surface analysis is found in chapter 16. Questions about how to derive information on slope, gradient, aspect, illumination, curves, profiles and cross sections, as well as visibility analysis, are answered. The more complex analysis of spatial patterns follows in chapter 17, while chapter 18 concentrates on spatial associations among patterns.
Chapter 17 deals with feature counting and spatial arrangements for point, line and area objects. Several methods are discussed, such as Moran's I autocorrelation and the k-nearest neighbor for points, connectivity measures for lines and diversity measures for area features, and, I have to repeat again, they are well illustrated. Chapter 18 compares different patterns, determining the degree of spatial association. Both visual and quantitative methods are described for point, line and area features. Movement and diffusion patterns are discussed separately. The chapter ends with what the authors call a cautionary tale: Snow's 1855 cholera map of London is demystified. The map was not the starting point of Snow's analysis; his knowledge about health was. The message the authors want to give the reader is that patterns shown on maps should only be given meaning if they can be supported by a solid theory or hypothesis. Spatial patterns might not be what they seem.
The book ends with an appendix on digital car-
tographic data, a list of abbreviations and
acronyms related to navigation and GPS and
some conversion tables. A 40 page glossary and
an index conclude the book.
Verdict
The book does live up to its promises. It is
practical and supported by the necessary theo-
ry. For me the analysis part was most exciting
and the book can be recommended to every-
one who would like to start using a map. I look
forward to the authors' next book, since they
promised something related to the third word
in the subtitle of the earlier edition which they
deliberately dropped with this edition: interpre-
tation.
Menno-Jan Kraak kraak@itc.nl is head of ITC's
Geo-Information Processing Department.
He is a member of the editorial board of several
international journals in the field of
Cartography and GIS.
A child's mental map
Covering Large Areas in Short Time
A High Resolution Orthomosaic in Brazil
A high resolution orthomosaic was generated for the 800 km2 area of the County of Campinas, Brazil, derived from Ikonos satellite imagery, a Digital Terrain Model (DTM) generated from WorldView-1 satellite imagery, and an implemented geodetic network of 19 points. The final orthomosaic was generated both in normal color and in false color infrared in order to broaden the possible applications to environmental issues. This result represents a very interesting solution for covering large areas in a short time and at a lower cost than traditional aerophotogrammetric methods, with essentially the same quality, allowing for more assertive planning and sustainable development of the territory.
By Nelson de Jesus Parada, Ulfh Walter Palme, Jason San Souci and Philip Cheng
1. Introduction
In Brazil, outdated territorial and cartographic data are unfortunately still very common, and planning activities in a fast-paced urban expansion environment represent a tough challenge for urban planners and managers, especially in day-to-day activities. This is also the case for the County of Campinas, where the population has in recent years surpassed 1 million inhabitants and is still growing. Also, the Metropolitan Area of Campinas,
including 19 other counties, is under great
pressure. These boundary conditions make
updated cartography and territorial monitor-
ing a constant necessity, because of aggres-
sive planning and management schedules.
In order to deliver an updated cartography to SANASA (the Water Supply and Sanitation Company of the County of Campinas) and to the GIS County of Campinas Administration Project, a Technical Cooperation Agreement was signed between SANASA and FUNCATE to generate a new orthomosaic of the County of Campinas derived from high resolution satellite imagery.
Figure 1: VRT-29 on Ikonos imagery.
The decision to use high resolution satellite imagery instead of conventional aerophotogrammetry was due to the short time of approximately one year available to generate the new cartography, and to budget limitations.
In partnership with FUNCATE and UWPE, NCDC Imaging was awarded the contract to acquire and process high-resolution satellite imagery to a final scale of 1:2,000 NMAS. NCDC is a Native American-owned small business in Colorado Springs that specializes in remote sensing and GIS applications using high resolution imagery, such as mapping and imaging services, land cover classification, natural resource management, sustainability planning and economic development. NCDC has delivered projects to clients including USGS, US Army Corps of Engineers, US Forest Service, Pacific Gas & Electric, and the Cities of Denver, Seattle, Providence, Sacramento, Albuquerque, and Dallas, to name a few.

2. Method
The generation of the orthomosaic was accomplished using:

- A specific geodetic network with 19 well distributed points in accordance with Brazilian geodesy standards. The first step in ensuring that the final orthomosaic would be compatible with the scale of 1:2,000 under Brazilian error and accuracy standards was the implementation of a network of geodetic points (ground truth), using well-known GPS procedures in accordance with the Brazilian specifications set by the Instituto Brasileiro de Geografia e Estatística (IBGE). A total set of 35 geodetic points was provided by MDATA Engenharia S/S Ltda. From this set, the 19 points inside the County of Campinas were used for the generation of the orthomosaic. The survey delivered the geodetic points in SIRGAS 2000, as this is the new official system that Brazil has adopted in recent years.
- A high resolution Digital Terrain Model (DTM) derived from WorldView-1 stereo imagery, collected in two strips during mid 2008 over the 800 km2 of the County of Campinas. The DTM was derived using the well-known and validated Rational Polynomial Coefficients (RPC) method.
- High resolution color imagery from the Ikonos satellite collected in mid 2008 over the 800 km2 of the County of Campinas, both in normal color and false color infrared.

PCI Geomatics software was used for the project to generate the DTM and orthomosaic imagery.

3. Results
Figure 1 presents the location of VRT-29, Figure 2 a fragment of the DTM over the Viracopos International Airport and the correspondent WV-1 image, and Figure 3 a 3D fragment of the DTM with contour lines.

Figure 2: Fragment of the DTM over the Viracopos International Airport (left) and correspondent WV-1 image (right).
Figure 3: 3D fragment of the DTM with contour lines.
General results obtained for the DTM, with no ground control at all and verified with 37 independent geodetic points, were a root mean square (rms) error of 1.0 m and a final vertical accuracy of 1.7 m.
A well distributed subset of 23 points resulted in a rms error of 0.6m
and final vertical accuracy of 1.0m. Contour lines were derived with a
resolution of 1.6m.
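The article does not state which convention links these rms figures to the quoted accuracy values; one common convention, offered here only as a plausible reading, is to report vertical accuracy at the 90% confidence level,

LE_{90} \approx 1.6449 \cdot RMSE_z

which turns an rms error of 1.0 m into roughly 1.7 m and an rms error of 0.6 m into roughly 1.0 m, matching the figures above.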
A WV-1 orthomosaic was generated as an intermediate product to the
final orthomosaic. The final orthomosaic, with the imagery already
pansharpened, was then generated using AutoSync, an ERDAS prod-
uct, collecting hundreds of tie points.
Figure 4 presents the general overview of the final orthomosaic in nor-
mal color and Figure 5 in false color infrared. Since the final archive
size is of the order of 15 GB the final delivery was also in tiles accord-
ing to Brazilian Cartography standards. The tiling grid is presented in
Figure 6. The final orthomosaic for the County of Campinas presents a high image quality with excellent contrast, as shown in Figure 7 (normal color) and Figure 8 (false color infrared).
In order to perform the error and accuracy check a set of 80 indepen-
dent geodetic points was used. The points are from the official County
of Campinas Geodetic Network (PMC) and were implemented by well
known specialized companies.
Figure 10 presents the distribution of the geodetic points in the County.
The points are mostly concentrated in and around the urban areas.
This fact does not interfere with the objectives and results, since the desired accuracy must be high in the urban areas, whereas in the rural parts of the county a result close to the scale of 1:5,000 is sufficient.
Final analysis of the rms errors and accuracy was performed by comparing the coordinates of the 80-point set with the generated Ikonos orthomosaic in WGS 84, using the Remote View software environment. Remote View is well known for having an excellent Graphic User Interface (GUI) and for not blurring the image; in this work, since the error and accuracy checks are very close to the resolution of the imagery (about 1 pixel), this was a key issue. Although the Brazilian regulations state that for practical purposes WGS 84 is to be considered equal to SIRGAS 2000, it was observed that at this very high resolution level, conversions to and from different software platforms can introduce a bias of the order of 0.5 to 1.0 m. Therefore, the analysis was performed making maximum use of the Remote View tools and allowed for a grouping of the 80 points into 2 categories, i.e. 0.6 m rms or lower (Figure 14), and between 0.6 m and 1 m.
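The per-point error behind this grouping is not spelled out; presumably it is the usual planimetric deviation between the orthomosaic and the reference coordinates of each check point,

\Delta_i = \sqrt{(E_i^{ortho} - E_i^{ref})^2 + (N_i^{ortho} - N_i^{ref})^2}

with the rms taken over the points in each group.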
4. Conclusions
The County of Campinas now has an updated high quality ortho-
mosaic and also a very precise DTM. The final product compati-
ble with the 1:2,000 NMAS scale has proven to be perfectly ade-
quate to the objectives and necessities of the GIS Campinas Project.
This GIS is responsible for integrating the new generated cartogra-
phy to the technical territorial & environmental cadastre of the
County and to the databases and applications of all the organiza-
tions and secretaries of the County administration.
Besides generating a new territorial reference, the project will strongly facilitate and foster the future yearly updating process and monitoring operations, possibly at an even faster pace. It is very important to clarify that the urban expansion process in the area is very fast, and most of the time the velocity of the urban expansion is higher than the pace at which the county managers can observe the territory and intervene. In other words, the urban expansion process has more velocity than control, and this is extremely undesirable since it does not allow, among other things, for assertive planning
Figure 4 General overview of the Campinas County Orthomosaic in
normal color.
Figure 5 General overview of the Campinas County Orthomosaic in
false color infrared.
Figure 6 Tiling of the Campinas County Orthomosaic.
and sustainable development. Hence it has become imperative to have a method of generating updated cartography that really brings a faster pace, so that there is more control. Fortunately, this project has proven able to deliver the desired fast, low cost and high quality product.
The result is compatible with the 1:2,000 NMAS scale and Brazilian standards, and represents a very interesting solution for covering large areas in a short time and at a lower cost than traditional aerophotogrammetric methods, with essentially the same quality.
Parada, Nelson de Jesus,
nelson.parada@sanasa.com.br
Sanasa - Sociedade de Abastecimento de Água e Saneamento S.A.
Campinas, SP, Brasil, and Funcate - Fundação de Ciências, Aplicações e Tecnologias Espaciais
São José dos Campos, SP, Brasil
Palme, Ulfh Walter, palme99@gmail.com
UWPE Engineering & Remote Sensing
Piracicaba SP - Brazil
San Souci, Jason, jsansouci@ncdcimaging.com
NCDC Imaging Colorado Springs USA
Cheng, Philip, cheng@pcigeomatics.com
PCI Geomatics Toronto - Ontario Canada
The authors would like to thank Mr. Ubirajara
Moura de Freitas from Funcate for the helpful tech-
nical
discussions about both the DTM and the final
orthomosaic.
Figure 7: Fragment of the final Ikonos orthomosaic, normal color (1:2,000). Figure 8: Fragment of the final Ikonos orthomosaic, false color infrared (1:2,000).
Figure 9: Fragment of the final WV-1 orthomosaic (1:1,000).
Figure 10: Distribution of the independent geodetic points for the error and accuracy check.
Useful Information for Everyday Geodetic Life
Datums and Map Projections Book
This book originally appeared in 2000. Its theme, a practical guide to coordinate reference systems, is as important now as when it was first published, probably more so when we consider the ever growing use of satellite navigation systems and the introduction of web mapping services such as Google Earth.
By Huibert-Jan Lekkerkerk
Contents
The second edition has grown by approximately 50% in size compared to the first edition. Critics will say that the contents of the book can't really have changed, and they are mostly right; datums and map projections, as part of geodesy, are a relatively stable subject and not much has changed over the last eight years.
The contents are therefore what one would expect and give information on ellipsoids, datums, projections and, most importantly, on transformations. The number of formulas is kept to a bare minimum, which, considering the extensive use of geodetic software, is a good thing in my opinion.
way the reader is not distracted from what
really matters nowadays: which geodetic
parameter to select from the extensive list and
why the results never seem to match the
expectations.
Changes
The change between the two editions is not so
much in the facts as it is in the way the infor-
mation is presented. The first edition was
strongly biased towards UK users. This new ver-
sion includes examples from around the world.
One of the most striking changes is the change
from black and white towards full color pub-
lishing. Even though this does not add to the
information it does make the book easier to
read.
New in the book are flowcharts at the begin-
ning of every chapter showing the possible
choices a user can be presented with. Further
information is given on which chapter to read
to find the necessary information.
An extension of this user-oriented way of writ-
ing is the inclusion of a number of use cases
that give very clear information on what to do
when presented with a similar problem. The use
cases range from creating one's own datum
transformation to creating an overlay in Google
Earth.
Also useful is the addition of a terminology list
stating the terms that have been used within
the geodetic community for years as well as the
terms used in the ISO 19111 publication. The latter is important since these terms are increasingly used in software packages and Internet registries such as the EPSG database.
Conclusion
I can only find one downside to this book and
that is the chapter on GNSS that has been
retained from the first edition. The information
in this chapter does not give much additional
insight into GNSS systems and misses the
method used by the satellites for transmitting
their position towards the user. The other
changes from the first edition to the second
edition are however a good reason to buy the
edition even if one already owns the first one.
Especially the addition of quite a number of
user oriented information may prove very use-
ful in every day geodetic life.
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com is project
manager at IDsW and freelance writer and trainer.
This article reflects his personal opinion.
Title: Datums and Map
Projections (2nd edition)
Authors: Jonathan Iliffe and Roger Lott
Publisher: Whittles Publishing
ISBN: 978-1-4200-7041-5
Nr of pages: 224
Price: GBP 40 (approx. € 50,-)
Retooling for the Digital Data Revolution
Geospatial and GIS Technologies
Two important recent developments in exploration technology have been the delivery of stronger spatial data access and
management capabilities through the Internet, and advanced workflow support within GIS, which has become a standard
spatial platform for integrating and analyzing geospatial data with geology and other exploration datasets.
Both are helping explorers to work faster and smarter; to easily share data and knowledge across teams,
departments and regional boundaries; and to manage and make sense of larger amounts of digital data for
exploration decision-making.
By Carmela Burns
Although information technology is not a centre-stage strategy in explo-
ration industries grappling with economic and market uncertainties, it
remains a key driver for improving effectiveness and results - particularly
when you consider the growing data requirements of modern day explo-
ration.
Explorers continue to raise the bar in the software experience and capa-
bilities they expect, from full 3D visualization to data processing power
under the hood, and advanced integration support for multidisciplinary
datasets. Furthermore, many exploration organizations are setting stronger
corporate standards for the software they use on their exploration pro-
jects and how they use and manage their growing digital data resources.
During lean times, data-related waste and inefficiency - whether it's resources spent looking for the right data, hours wasted converting incompatible data products, or money lost through faulty decision-making - is an easy target for improving productivity and results. The fact that there are solutions for making many, if not all, of these data issues simply go away is reason enough to rethink your software choices and technology strategy.
Two important recent developments in exploration technology have been
the delivery of stronger spatial data access and management capabili-
ties through the Internet, and advanced workflow support within GIS,
which has become a standard spatial platform for integrating and analyz-
ing geospatial data with geology and other exploration datasets.
Both are helping explorers to work faster and smarter; to easily share
data and knowledge across teams, departments and regional boundaries;
and to manage and make sense of larger amounts of digital data for
exploration decision-making.
Spatial Data Access and Management
Over the past 15 years, governments and education institutions have been
making vast amounts of data available on the Internet. There are hun-
dreds of data resources and millions of spatial data sets available includ-
ing geophysical, geochemical, satellite images, and magnetic data pub-
lished by government and private organizations through open Web-based
services. Most exploration companies also have large, internal data
resources typically stored in variety of different systems. However, todays
fractured data landscape doesnt make the process of data discovery easy,
natural or seamless in either environment.
Global marine gravity dataset recently added to Geosoft's public DAP Server.
The data, provided by Scripps Institution of Oceanography, is freely available
for download at http://dap.geosoft.com/geodap.
Dapple global viewer displaying search results from internet servers using web
services. In this case geology and magnetic data from a Geosoft DAP server and
Geocover 2000 data from a NASA tile server. Dapple is available as a free
download from http://dapple.geosoft.com.
A major challenge is that each data resource, whether internal or online,
has its own interface for finding and accessing the data. And often one needs to go through numerous steps to find the data, and then through several applications to convert it into a usable, consistent form, before one can even begin the actual exploration process.
Newer technologies, such as spatial search, server and desktop catalogu-
ing applications, are revolutionizing how explorers find and manage their
data and information.
In recent years, exploration software developer Geosoft has continued to
expand the range of its exploration information management capabilities.
The company has built its solutions around its core DAP server, Dapple
globe viewer and DappleSearch technologies.
"We wanted to provide explorers with a simple way to find data that matches their area of interest, and the ability to view this data in a visual and meaningful way," said Ian MacLeod, Geosoft Chief Technologist. "Together, Dapple and DappleSearch provide fast spatial and text search, and show results as thumbnails with full metadata. Explorers are then a click away from displaying data in Dapple."
Geosoft's DappleSearch is a spatial catalogue Web service that creates
and maintains a database of active and verified Internet servers that host
spatial data services. Just as regular Internet users rely on Google to keep
track of the near-limitless resources available online, earth explorers can
use DappleSearch to keep track of new spatial data services as they
become available.
The web crawler continually searches new URLs to update the catalogue. To date, the catalogue engine has visited more than one million data layers over the Internet, of which about 200,000 have been accepted into the catalogue index. Periodic re-testing of all indexed services drives the search engine's relevancy.
As Dapple users themselves discover new service sites, the URLs of the
service sites are also reported to DappleSearch, which adds each URL to
a list of candidates to be considered for the shared catalogue.
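As a rough, generic sketch of what such a catalogue service does (this is not Geosoft's implementation; the endpoint URL and the GetCapabilities probe below are illustrative assumptions), the core loop amounts to probing candidate service URLs, indexing only those that answer with a parseable capabilities document, and re-probing them periodically:

# Minimal sketch of a spatial-catalogue crawler: probe candidate OGC service
# URLs and keep only the ones that answer with parseable capabilities.
import urllib.request
import xml.etree.ElementTree as ET

candidates = [
    "http://example.org/wms?SERVICE=WMS&REQUEST=GetCapabilities",  # hypothetical URL
]

def probe(url, timeout=10):
    """Return the parsed capabilities document, or None if the service is down or invalid."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return ET.fromstring(resp.read())
    except Exception:
        return None

catalogue = []
for url in candidates:
    caps = probe(url)
    if caps is not None:
        # Index the layer titles; periodic re-probing keeps the index relevant.
        titles = [e.text for e in caps.iter() if e.tag.endswith("Title") and e.text]
        catalogue.append({"url": url, "layers": titles})

print(f"accepted {len(catalogue)} of {len(candidates)} candidate services")

URLs reported by Dapple users would simply be appended to the candidate list before the next crawl.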
What does it all mean for the explorer? Managing exploration data
resources, for easy access and maximum reliability, is a key advantage
during all phases of a project lifecycle. During early stage exploration, the
two or three months previously spent pulling together project data can
be reduced to days through the effective leveraging of existing resources
and technology enablers. Explorers can make go/no-go decisions more
quickly, based on richer information which means better decisions. These
benefits continue once the work is underway.
"Once all the preliminary work is done, explorers are constantly updating their information by collecting and interpreting new data from the field and integrating the results back into their original data sets," says Steve Randall, Director, Product Management with Geosoft. "Spatial cataloguing, search and viewer technologies enable easy access and integration of new data as soon as it is available. This means an explorer can always be in sync with their data, no matter what systems one is using or when or where it is being accessed. This wasn't the case previously, which often led to delays and inconsistencies in managing the project."
And since time is of the essence, performance and response rate are criti-
cal, especially as datasets continue to grow and threaten to degrade sys-
tem and network performance.
"Explorers need cause and effect to happen instantly, right in front of them," says MacLeod. "We're able to handle absolutely massive catalogues, and from the end user's perspective it's an instantaneous response. We'll continue to spend research and development effort to make sure the technology is as responsive as it can be."
"All of this has huge workflow implications," says Randall. "From finding all the relevant data sources, to evaluating them and extracting them back into your interpretation environment, there's so much flexibility that didn't exist even 2 or 3 years ago."
Better preparation is also an effective way to maintain competitive advantage. "These technologies can't guarantee you better success rates, but they can guarantee you more time to help you evaluate your decisions," he says. "You can be more effective and productive, which will in turn probably lead to greater success."
Exploration Workflows in GIS
Though GIS has always had a role to play in the search for mineral
deposits, adoption of GIS in the exploration industry has grown for sever-
al reasons. Not the least of these is the fact that there are more tools
available to help explorers effectively work within the GIS environment -
conducting advanced geospatial analysis, and creating accurate, quality
results.
The questions posed by resource exploration and the answers offered by
GIS are a natural fit. Exploration teams need to integrate and make sense of reams of geological, geochemical and geophysical information in order to find ore bodies. GIS supports this complex workflow by managing and analyzing the data and displaying it in a spatial context.
As GIS becomes more mainstream in all aspects of life, from the ever changing political wall maps used by CNN to depict U.S. election results to Google Map applications that can track a runner's training routes, geo-
ArcMap and the model builder can be used to streamline workflow and document procedures and methods of data creation and analysis. This map shows mining claim management and visualization methods and models for data conversion.
ESRI's ArcGIS can be used to visualize geology, structure, alteration and mineralization in 2D and 3D for successful exploration. This 3D map in ArcScene shows drill data and voxel data created with the Target for ArcGIS extension and then combined with other project GIS data.
scientists are becoming more comfortable weaving geospatial technology
into their workday.
The real breakthrough, however, has been adding and improving explo-
ration workflow support for geologists and specialists working with large
and multidisciplinary exploration datasets.
"I think one of the most critical aspects in the recent uptake is direct workflow support," says Geoff Wade, ESRI's Natural Resource Industries Manager. "But to support the most complex of specialist workflows, GIS really needs to be in the hands of a specialist solution builder. Our global partnership with Geosoft has been essential to fulfilling the specific needs of explorers working within ArcGIS."
In collaboration with ESRI, Geosoft has been building next generation GIS
solutions for exploration industries and the geosciences sector.
"We identified several explorer challenges, from basic format incompatibility issues to a lack of advanced tools for visualizing and interpreting earth datasets (geology, geochemistry, and geophysics) inside GIS systems," said Louis Racic, Geosoft's Director of Desktop Applications. "And we set out to create a whole solution for explorers that would bridge these gaps."
From an interoperability standpoint, Geosoft has made it a development priority to support all of the leading GIS data formats, and other common data formats, says Racic. "There is nothing more frustrating than spending time fiddling with incompatible data formats. Our goal is to eliminate the need to convert data formats entirely, and free up time for the exploration team to collaborate in an integrated environment."
Geosoft chose to extend their exploration workflows to the ESRI ArcGIS
environment to meet customer demand for simpler and more seamless
solutions that met their exploration project needs.
"ESRI technology easily scales to the growing data and spatial challenges that exploration organizations are facing," said Racic. "And we've also seen strong adoption of ArcGIS within global, government geological surveys and the academic geoscience sector."
One result has been strong market demand for Geosoft ArcGIS exploration workflow solutions. "Our Target for ArcGIS extension software was our highest growth product last year, and we were taking orders for the new Geochemistry for ArcGIS before its release to market," said Racic. "We're seeing strong interest in the Geochemistry extension from the government sector as well as the exploration industry."
Geosoft has progressed its exploration strategy in GIS with the recent release of a Geochemistry for ArcGIS extension that extends the explorer's toolkit by providing the ability to analyse geochemical data within the ArcGIS environment, according to Racic. "And it provides a powerful exploration workflow solution that's not currently available in the market."
Geochemical investigations require the ability to process and analyze all
components of geochemical sampling in context with the geology and
geophysics. With the tools available within Geochemistry for ArcGIS, explorers can effectively extract knowledge from their data by examining multivariate relationships, uncovering underlying structures, identifying outliers and anomalies, and presenting results by easily creating informative, visually impactful maps.
Using Geochemistry for ArcGIS, explorers can simplify their geochemistry
quality control process and maintain data in an ESRI file geodatabase
using a data model optimized for geochemical data. They can select and
subset data interactively from maps based on lithology and regions to
enhance data display; create advanced geochemical maps within the ESRI
ArcGIS Desktop environment; and analyse multi-element geochemistry
using a variety of tools, including interactive multiple histogram plots, Pearson's correlation reports, scatter plots, probability plots, ternary plots and box plots, to identify outliers and define populations.
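To give a feel for the kind of multi-element analysis being described (a generic sketch in Python with invented element values, not the Geochemistry for ArcGIS workflow itself), a Pearson correlation matrix and a simple outlier flag might look like this:

# Generic multi-element geochemistry sketch: Pearson correlations plus a crude
# outlier flag, computed on hypothetical element columns (Cu, Zn, Pb, As).
import pandas as pd

samples = pd.DataFrame({
    "Cu": [45, 52, 48, 300, 50, 47],
    "Zn": [110, 120, 115, 700, 118, 112],
    "Pb": [12, 14, 13, 90, 15, 13],
    "As": [3, 4, 3, 25, 4, 3],
})

# Pearson correlation matrix between elements.
print(samples.corr(method="pearson"))

# Flag samples that deviate strongly from the median in any element
# (scaled median absolute deviation as a robust spread estimate).
mad = 1.4826 * (samples - samples.median()).abs().median()
outliers = ((samples - samples.median()).abs() > 3 * mad).any(axis=1)
print("outlier samples:", list(samples.index[outliers]))

In a real investigation these statistics would of course be computed per lithology or region, which is exactly the kind of interactive subsetting the extension is described as supporting.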
Wade recommends a gradual adoption of GIS by the exploration depart-
ment, but one that is incorporated by every member of the interdisci-
plinary team.
"We find it particularly effective if several members of the project team embrace the technology at the same time, thereby supporting each other in the education process and gaining extra benefit from data sharing, improved workflow support and an improved communication capability," he says. In addition to workflow support, effective integration of GIS within your exploration organization is also an important consideration. "Technology aside, if you start with a good understanding of your people's data requirements and project workflows, then you are more likely to achieve the results you need," says Racic.
There is no doubt that explorers today, both junior and major, are operat-
ing in a rapidly changing exploration environment that is increasingly
reliant on data and applications available through the Internet. There's more digital data to make sense of, greater integration and interpretation challenges, and larger drilling projects to manage. For many, the
choice is clear: retool to deal with the new complexities of exploration
and take advantage of all the rich, digital data and applications coming
online, or risk being left behind, and missing out on future opportunities.
Carmela Burns is a writer and the editor of Earth Explorer,
www.earthexplorer.com, Geosoft's magazine and online news site covering the
earth sciences and exploration. Her articles have been published in leading min-
ing industry and geoscience magazines.
The analysis tools available in Geosoft's Geochemistry for ArcGIS extension help
uncover geochemical relationships. Shown here are a number of these interac-
tive tools linked with a map of DEM contours, surface geology and sample results.
Scatterplot and histogram geochemical analysis are incorporated within this
map of Fort Hope lake sediments geochemistry.
Public Sector meets Science and Industry
Global Spatial Data Infrastructure Conference 2009
The eleventh edition of the annual Global Spatial Data Infrastructure (GSDI) conference was held in Rotterdam,
the Netherlands, from June 15 to 19. At this conference the GSDI Association, an inclusive body of organizations,
agencies, firms, and individuals from around the world, promotes international co-operation and collaboration
in support of local, national and international spatial data infrastructure developments. That was not all: this GSDI
conference was jointly organized with the 3rd INSPIRE Conference and a Dutch national SDI conference on
Public Sector meets Science and Industry.
By Eric van Rees
The theme for this year's edition of the GSDI Conference was 'Building SDI Bridges to Address Global Challenges'. The conference
dards, practices and processes among nations
while at the same time addressing ever-evolv-
ing and exciting new approaches to the offer-
ing of geographic data and services in meeting
real world needs. An example of this conver-
gence is the INSPIRE program. In May 2009 the
INSPIRE legislation came into force, and since
then all European countries have started to
share their data. At the INSPIRE conference ses-
sions, visitors were able to get an up-to-date
picture of the implementation of the INSPIRE
Directive.
Every conference day had a specific topic: after
a day of pre-conference workshops, Tuesday
June 16 was the GSDI World Conference, on
Wednesday June 17 there was the 3rd INSPIRE
Conference, and Thursday June 18 was aimed
at the public sector meets science and indus-
try. On Friday June 19 the corporate sector gave
its visions and perspectives on collaboration
and participation in SDIs. On this last confer-
ence day, there was also a special student pro-
gram with master classes. As if all this was not
enough, visitors could also enjoy a large exhi-
bition, several press conferences, and diverse
social, technical and cultural programs to com-
plete the event.
Over the five days, visitors could choose from
a huge number of sessions and create their own
program. This was not an easy task since some-
times there were no fewer than ten parallel ses-
sions, which meant that even those who attend-
ed the conference for the whole week missed
a lot. It was wise to have made a plan of action
before visiting the conference to focus on one's
own topics of interest. To help fully digest the
conference, visitors were given a free CD con-
taining all three conference proceedings and a
book on SDI convergence, as well as a sub-
stantial program and a book of abstracts.
Volunteered Geographic Information
On Wednesday June 17 I visited two afternoon
sessions. The first one was called Volunteered
Geographic Information. In this session, con-
tributors from all over the world presented
papers on the phenomenon of user-based con-
tent on websites such as Open Street Map. One
very interesting paper was called 'Volunteered Geographic Information: the nature and motivation of produsers'. This last word is not a spelling error: produsers are users of user-generated content who also share it with others. A
lot of effort is being made to study the phe-
nomenon of users sharing content through
communities and applications via the web. Who
are these users, how many of them are there
and what is the motivation behind their efforts?
The authors of the papers presented explained that it's no secret that user groups as well as money-making organizations benefit a great
deal from user-generated content. It seems
everyone benefits, so it pays to investigate the
motives behind these actions, even more when
large government agencies are also discovering
the power of user-generated content and can
use it for change detection and geospatial data
updates. But in order to get users to share their
content with other users and keep on doing it,
a number of requirements need to be fulfilled.
Panel discussion
One of the requirements is that users want to
see their changes take effect immediately on
the screen. If this doesn't happen, there is no more need for sharing user-generated content.
Also, the authors mention that not all user-gen-
erated content is harmless: it can be produced
and shared in order to harm other users by mis-
leading them with incorrect data. By taking a
closer look at contributors, a classification can
be made of different users and their motiva-
tions in contributing data to a community: for
instance, some contributors are more compe-
tent than others. Last but not least, the contributions themselves can be assigned values like 'high-quality', 'constructive' or 'damaging'.
What government agencies can learn from all
these kinds of users, intentions and types of
contributions is that they have to take into
account the rules of each community for gener-
ating and sharing user-based content. Also, it's important to take into account that these communities decide what to share, how to share it, and whom to share it with. It's not for a government agency to decide what the community
shares. Even more difficult may be dealing with
user-based content that is not yet a definitive,
perfect piece of information and creating new
rules/rights/legislation for both mapping organi-
zations and produsers outside these organiza-
tions so that both have the right to contribute.
As difficult as this is, commercial organizations
have overcome the challenge of meeting the
needs of user communities while also making
money from their efforts. As another contribu-
tor asked in a paper, why shouldn't this be possible for government agencies, particularly when it involves SDIs?

Energy Efficient Smart Grids
The other session I visited was about energy efficient smart grids. I chose this session since I wondered what the link was between INSPIRE and smart grids, a topic largely ignored in the public debate on green energy but very high on the agenda elsewhere. In this session, visitors were shown a segment of a documentary on renewable energy that was aired on Dutch television earlier this year. The message was clear: investing in current energy networks was not the way ahead; looking for alternatives, such as solar energy and smart grids, was the way to go. A presentation on the use of smart grids followed. By using digital technology a household can make use of electricity more efficiently than before and save money on electricity bills.
This technology posed some interesting questions for the round table discussion that followed: in giving people the power to share and distribute their own energy, who should be in a position to share information on the energy consumption of individuals, and what should be the role of national government agencies in deciding who is the owner and distributor of energy reserves, especially when we see cross-border energy distribution networks? In a way, these questions address the same fundamentals as the ones on volunteered geographic information: they are both about bottom-up consumer initiatives and sharing information. The topic of smart grids seems far away from the INSPIRE program, but actually the opposite is true: one workshop visitor spoke of policy measures on creating wind turbine parks, a direct result of reports based on INSPIRE data sets on energy shortages, that would not be in place if the data were not there.

Verdict
In terms of quantity, this was by far the biggest geo-event I have visited in the Netherlands. The event was well-organized, the venue was a perfect fit for the number of visitors, and the list of keynote speakers was very impressive. Much time was spent at the exhibition, where many new international business contacts were made. As mentioned before, the number of presentations was huge: choosing one presentation could mean missing nine others. Getting an overall view of the proceedings was therefore quite impossible. I can only say that the sessions I was able to attend were very interesting and offered new and intriguing opinions on a lot of topics, not necessarily limited to SDIs or the INSPIRE program. One thing that struck me was the long-term view of INSPIRE: many sessions were not only about the creation of INSPIRE, but already looked ahead to the use and potential for better sharing of cross-border data, for new energy sharing initiatives, for example. It was clear that people want to act and use the data, seeing INSPIRE more as
a starting point than a conclusion. There were
also some comments on this, mainly from the
industry, which is more focused on acting now
rather than looking at solutions tomorrow.
Politics is slow to respond to market initiatives,
some say. And they add that INSPIRE alone is
not enough. Public accessibility and an aware-
ness of the use and necessity of data portals
are as important as the creation of the portals
themselves. Also, what the public can do with
the data should be clear. There's clearly still a
lot of work to be done in the field of Global
Spatial Data Infrastructures.
Eric van Rees evanrees@geoinformatics.com is
editor in chief of GeoInformatics. For more
information on GSDI 11, papers etc. have a look at
www.gsdi.org/gsdi11.
Column
Geospatial Portals - Keys to Success
Geospatial portals are becoming more mainstream these days with
the introduction of Data.gov in the United States and INSPIRE in
the European Union. One of the biggest issues with these geoportals is findability. Most people are used to just going to Google, typing in a text search and then getting their results.
Works really well, doesn't it? But we all know that when it comes to doing textual searches on spatial data, we get really poor results because text isn't spatial. Metadata could be the solution, but
depending on how it is written, the search engine could misinter-
pret it and give the user a poor result. That is why most new geo-
portals integrate maps so that users can at least zoom into the area
of interest and then narrow it down with tags and other methods.
This usually gets the user interacting with datasets they want, but
it is only the beginning of the journey.
One huge gap in these geoportals is their inability to offer up data
in formats people want (open formats) versus ensuring the data is
online. First off, trying to define open formats is difficult enough.
GML might be very open, but how many people can actually use it?
DWG is much more closed, but Autodesk users rely on it to get their
work done. That tradeoff between such formats can determine how
successful a geoportal is. Giving power users the ability to work
with raw data, but still giving casual users the ability to work with
the spatial data is a balancing game.
Caching is also important. Many early geoportals allowed data
providers to register their local web services with them, but didn't
do anything to ensure the data would be available. Caching reduces
the load on these data providers by ensuring fast, consistent access
to web services, but still keeping the management of those web
services with the provider. Even just providing uptime results next
to datasets or web services to show users how often data is avail-
able would help reduce unhappy users because they would be able
to see how healthy web services are and plan accordingly. Who
wants to build web applications on a web service that is down even
1% of the time?
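A toy sketch of that uptime idea (the endpoint and the number of probes are invented for illustration): record the outcome of periodic health checks and publish the resulting percentage next to the dataset or web service.

# Toy uptime check: probe a (hypothetical) service endpoint a few times and
# report the share of successful responses. A real monitor would spread the
# probes over hours or days and store the history.
import urllib.request

ENDPOINT = "http://example.org/geoportal/wfs?SERVICE=WFS&REQUEST=GetCapabilities"  # hypothetical

def is_up(url, timeout=5):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

checks = [is_up(ENDPOINT) for _ in range(5)]
print(f"uptime over {len(checks)} checks: {100 * sum(checks) / len(checks):.0f}%")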
Geoportals are doing a great job helping users locate spatial data
sources, but holding users' hands all the way to the download of the dataset ensures that they will be happy. Users running up against data formats that they can't read or understand, web services that aren't available, and not being able to find a dataset in their area of interest are all ways to ensure that users won't be coming back. Simple solutions to these problems are all that is needed to make every
user satisfied, reducing the workload on everyone.
James Fee james.fee@rsparch.com is Geospatial
Manager at RSP Architects Ltd. Have a look at his
blog www.spatiallyadjusted.com
Using PixoView Technology
Testing Measurement Accuracy in
Oblique Photography
With an increase in applications relating to the use of oblique photography, there is now a demand to determine what
degree of accuracy can be achieved with oblique photography single-picture measurements in the PixoView application.
By Jan Sukup, Patrik Meixner and Karel Sukup
Testing Project
Since 2006, PixoView technology has been under development at GEODIS BRNO for processing oblique photographs. PixoView makes available
the possibility of processing generally-oriented aerial and terrestrial pho-
tographs in a single database, which allows for simple control of a suit-
able selection of photographs for further processing. Currently, work is
mainly focused on monoscopic measuring of individual image content dis-
played on computer monitors.
The oblique photography technology can be divided into two stages. The
first stage involves exposing oblique photographs, for which GEODIS
BRNO, Ltd., uses a Z-37A Čmelák aircraft fitted with a system consisting
of five cameras secured in a fixed group called GBCam I. One camera cap-
tures vertical photographs, while the others are tilted to the left and right,
forward and backward to acquire the oblique images. The second tech-
nology stage involves the subsequent processing of digitally-acquired pho-
tographs using single-image aerial photogrammetry.
The purpose of the spring 2008 testing project was mainly to determine
positional and vertical measuring accuracy of the PixoView application.
The testing itself was performed on aerial photographs taken around Brno-Líšeň, where GEODIS BRNO has a calibration base with 80 signalized con-
trol points defined by GPS technology. For testing purposes a three-cam-
era system was used. Cameras labeled C1, C2, and C3 each took 71
photographs in 9 flight strips over the 180 hectare area of interest. Since
the flying height was ca 650 m above terrain, photographs had a nadir
resolution of 0.1 m, and the oblique photograph resolution decreased pro-
portionally to a value of ca 0.2 m, based on subject distance from the
projection centre.
Input Data
The input data for each project in PixoView includes:
- Aerial (terrestrial) images
- IO: Interior orientation parameters
- EO: Exterior orientation parameters
- DTM: Digital terrain model
The PixoView application obtains interior orientation parameters via a
calibration protocol (Fig.2), which provides information on the camera
focal length (f ), the position of the principal point (dx, dy), and the
radial lens distortion values. These variables define the position of the
projection centre with respect to the level of the exposure, allowing us
to later reconstruct the visual cluster of rays that created the aerial
photograph at the time of its exposure.
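The calibration values themselves are not reproduced here, but radial lens distortion of this kind is commonly modelled (as an illustration only; the exact parameterisation in the GEODIS BRNO protocol may differ) with a polynomial in the radial distance r from the principal point:

x_{corr} = x\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6), \qquad y_{corr} = y\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)

where x and y are image coordinates measured relative to the principal point (dx, dy) and k_1, k_2, k_3 are the calibrated distortion coefficients.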
To place this cluster in space, with regard to the geodetic coordinate
system, it is necessary to know the exterior orientation parameters
which define the position of the projection centre in the appropriate
coordinate system as given by three coordinates (X, Y, Z). In addition,
the frame axis orientation in space is represented by three angles of rotation (ω, φ, κ). The EO parameters can be determined in three ways:
- by direct georeferencing
- by analytical aerotriangulation without ground control points
- by analytical aerotriangulation using ground control points
Fig. 1: Flight plan and ground control point coverage of the test project area.
Fig. 2: Calibration protocol detail example.
The direct georeferencing of aerial photographs means obtaining the EO parameters directly by processing the data gathered by the GPS differential signal receiver and the inertial measuring unit (IMU) on
board the airplane during the actual photo exposure. The analytical
aerotriangulation method without ground control points (AT without
GCP) improves the aforementioned EO parameters with the help of a large number of tie and pass points generated by image correlation algorithms in predefined image areas (von Gruber areas). The most accu-
rate method of determining the image exterior orientation parameters
is to calculate the analytical aerotriangulation adjustment using ground
control points (AT with GCP). In addition to points acquired by image
correlation, the control points with known geodetic coordinates are
measured within the images; these points are then used to determine
the absolute orientation of the photographs and models.
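To make the role of these parameters concrete, the following is a minimal illustrative sketch in Python (not PixoView code) of how the IO parameters (f, principal point) and the EO parameters (projection centre and the rotation angles ω, φ, κ) map a ground point to image coordinates via the collinearity equations; the rotation order and sign conventions below are assumptions, since conventions differ between systems.

import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation built from the three EO angles (radians); omega-phi-kappa order
    is one common photogrammetric convention."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    r_omega = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    r_phi = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    r_kappa = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return r_kappa @ r_phi @ r_omega

def ground_to_image(point, centre, angles, f, principal_point=(0.0, 0.0)):
    """Project a ground point (X, Y, Z) through the projection centre into image coordinates."""
    r = rotation_matrix(*angles)
    u, v, w = r @ (np.asarray(point, float) - np.asarray(centre, float))
    x = principal_point[0] - f * u / w
    y = principal_point[1] - f * v / w
    return x, y

# Example: a point 500 m below and slightly offset from the projection centre.
print(ground_to_image((100.0, 50.0, 250.0), (0.0, 0.0, 750.0), (0.0, 0.0, 0.0), f=0.05))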
Testing the Accuracy of Positional Measurement
Before the actual tests, the exterior orientation parameters were com-
puted using all aforementioned methods. The main task was therefore
to determine to what extent the accuracy of EO parameters could
improve measurement in the PixoView application. In the first phase,
eight control points were selected, four on the edges and the rest inside
the area of interest so that they evenly covered the entire test project
area. These points were then measured in PixoView in all the available
oblique photographs. A total of 170 individual measurements were made for each variant.
Fig. 3: PixoView working environment.
Fig. 4, 5, 6: Graphs showing (4) the dependence of the mean positional deviation dp on the horizontal distance from the projection centre for the measured control points, (5) the dependence of the mean height deviation dz on that distance for the measured control points, and (6) the dependence of the mean positional deviation dp on that distance for all measurements performed.
The measured coordinates were then compared with the coordinates determined by terrestrial (surveying) methods. As is usual in aerial photogrammetry, the accuracy achieved was split into an altitude component, computed as the difference between the given and the measured altitude (dz), and a positional component dp, calculated for each measurement as the planar distance between the measured and the reference position, dp = √(dX² + dY²).
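As a small, hedged illustration of this bookkeeping (variable names and array layout are assumptions, not the authors' code), the two deviation components can be computed as follows:

import numpy as np

def deviations(measured, reference):
    """Per-point accuracy figures: measured and reference are (n, 3) arrays of X, Y, Z.
    Returns the planar deviation dp and the height deviation dz for every point."""
    diff = np.asarray(measured, float) - np.asarray(reference, float)
    dp = np.hypot(diff[:, 0], diff[:, 1])   # planar distance sqrt(dX^2 + dY^2)
    dz = diff[:, 2]                         # measured minus given altitude
    return dp, dz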
The measured values are greatly affected by the horizontal distance
from the projection centre, or the viewing angle at which we see the
point in the image. The dependence of the mean positional deviation on either of these unfavorable factors can be expressed statistically with a correlation coefficient, which describes the mutual relation between two variables. The coefficient ranges from -1 to 1, where -1 indicates an inverse relationship, 0 indicates no statistical dependence between the variables, and values towards 1 indicate a direct relationship. In our case, this means that the greater the distance from the projection centre, the less accurate our results are when determining point positions.
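A minimal sketch of the statistic described here, assuming the measurements are already paired with their horizontal distances from the projection centre (names are illustrative):

import numpy as np

def distance_correlation(distances, dp):
    """Pearson correlation coefficient (in [-1, 1]) between horizontal distance
    from the projection centre and the positional deviation dp."""
    return float(np.corrcoef(np.asarray(distances, float), np.asarray(dp, float))[0, 1])

# A value approaching 1 would confirm that accuracy degrades with distance.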
The following graph (Fig. 4) shows the mean deviations in the position
at the individual distance intervals (d), and also compares all three
methods of EO parameter determination.
Determining the altitude variable is somewhat easier. The correlation coefficient between the height deviation and the horizontal distance is practically zero for all three variants, so the height measurements have the same degree of accuracy as the digital terrain model used (Fig. 5).
So far, we have only taken into account signalized points which are
clearly identifiable on the images. With other non-signalized points,
such as the vertices of pavements and roads, the resulting deviations
will be affected far more by interpretational errors. This means that
especially in remote areas of the photographs, where the pixel size can
be up to 20 cm, compared to 10 cm in nadir areas, our chances of
identifying the measured points are much worse. In the case of build-
ing corners, this uncertainty is even greater.
In the second stage of testing we also measured 450 non-signalized
points, and the resulting coordinates were compared with the coordi-
nates of these points on a digital cadastral map (DKM). A comparison
of all the results achieved from testing the positional accuracy of
PixoView measurements is shown in the graph below (Fig. 6). When
evaluating these results, it is necessary to take into consideration that
we are dealing with monocular measuring; the DKM was used for the
calculation of the accuracy, and the DKM's accuracy is also not entirely
homogeneous.
Testing Height Measurements
Fig. 7: Measuring tools in the PixoView application.
Tab. 1: Table showing correlation coefficients for cameras C1 and C2.
Within the testing of the PixoView application, a tool for measuring the height of structures was also examined. For this purpose, geodetic measurements of selected buildings were first taken, and the oblique length was used to calculate the building's height, which was subsequently
compared with the measurement acquired from PixoView. As expect-
ed, the tests confirmed that there was no dependence on the accura-
cy of the exterior orientation parameters, meaning that we will get
comparable results regardless of whether we use direct georeferencing
or aerotriangulation. The main limitation is our ability to correctly interpret where we are measuring the height from and to. The devia-
tion (dh) from the geodetically measured height was on average 10
cm, although if the start and end points of the building and its given
relative height are defined incorrectly, this error can be several times
greater. For measuring, it is therefore preferable to choose images
where the building is closer to the projection centre, since the pixel
size is smaller and greater detail can be seen. However, a problem may
occur in terms of good visibility of the given points necessary for mea-
surement, because the nearer the structure is to the nadir of the image,
the more the advantage of measuring in oblique images is lost.
Conclusions
When testing positional measurements in PixoView, it is important to
take great care with the quality of the input data (image resolution,
exterior and interior orientation parameters, DTM), which has a direct
effect on the degree of accuracy of the individual points determined in
the application. We will get significantly better results by improving the
quality of the exterior orientation parameters. With direct georeferenc-
ing, average deviation in the position at signalized points was 0.45 m.
Using the analytical aerotriangulation method, we improved accuracy
and reduced the average deviation in the position at signalized points
to 0.20 m. The altitude accuracy of the individual points then depends
primarily on the quality of the DTM used, but this does not influence
the accuracy when determining the relative height of structures.
Overall accuracy characteristics for AT with GCP were divided into four
distance intervals (d), according to the aforementioned high correlation
between the accuracy achieved and the distance from the projection
centre, and are shown in the following table (Tab. 2). The achieved
accuracy approximately corresponds to 1.5 times the size of pixels for
all selected intervals of distance.
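As a back-of-envelope illustration of that rule of thumb (the linear growth of pixel size with distance assumed below is a simplification for illustration, not the calibrated camera model):

def expected_accuracy(distance_m, max_distance_m=1000.0,
                      gsd_nadir_m=0.10, gsd_far_m=0.20, factor=1.5):
    """Rough expected positional accuracy: about 1.5x the ground pixel size,
    with the pixel size interpolated between the nadir and far-edge values."""
    fraction = min(max(distance_m / max_distance_m, 0.0), 1.0)
    gsd = gsd_nadir_m + fraction * (gsd_far_m - gsd_nadir_m)
    return factor * gsd

print(expected_accuracy(800.0))   # roughly 0.27 m at 800 m from the projection centre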
One of the greatest advantages of oblique photography is that it allows
us to measure the height of buildings, bridges, and in fact any kind of
landmark contained in a photograph. Previously, stereoplotting had been the only way to determine height proportions. But working in
stereo requires special equipment and trained operators. Now, using
oblique photography and appropriate software, this information can be
acquired without extensive knowledge of photogrammetry, with any
computer using average hardware, and to a relatively high degree of
accuracy. Therefore, this technology has become attractive to a number
of sectors - from town planning to the coordination of integrated res-
cue system teams, where it can be used to ascertain preliminary infor-
mation on affected areas. Last but not least, oblique images can also
be used to create realistic photo-textured 3D city models.
Karel Sukup, Managing Director, Division of Geoinformatics,
Geodis Brno Ltd., karel.sukup@geodis.cz
Jan Sukup, Application Engineer, Division of Geoinformatics,
Geodis Brno Ltd., jan.sukup@geodis.cz
Patrik Meixner, Production Manager, Division of Geoinformatics,
Geodis Brno Ltd., patrik.meixner@geodis.cz
For more information, have a look at www.geodis.cz
Fig. 8: Graph showing resulting deviations dh, where C1 and C2 is the designation of cameras.
Tab. 2: Accuracy characteristics AT with GCP, where C1 and C2
are oblique cameras.
New Partnerships, Satellites, Products and Strategies
Spot Image International
Conference 2009
With two new constellations coming up, there was a lot of news to be shared at this year's Spot Image International Conference 2009. At the Diagora Congress Centre in Labège, more than 450 visitors gathered for two days of user presentations and to hear about the company's latest developments, such as the Planet Action initiative, new remote sensing satellites, products and services. A good number of user presentations showed that satellite imagery is now finding its way into the public domain more and more.
By Eric van Rees
The Spot Image International Conference
2009 was held on June 10 and 11 in Toulouse,
France. During two days, more than 450 reg-
istered visitors from 50 different countries
joined to share their experiences with Spot Image's satellite imagery products and services, and were updated with details of what the future has in store. As a user conference, the focus was on what's new and what's coming up. Indeed, a lot has hap-
pened since the last International Conference
which was held in 2006.
On the first conference day, visitors were
updated on new developments at Spot Image,
in terms of products, strategies and initiatives,
and it concluded with a Gala Evening in the
centre of Toulouse. During the second day,
visitors could attend numerous sessions
focused on image-application themes. The
diversity of the sessions showed that satel-
lite imagery is now being used for many dif-
ferent purposes. Not just mapping, surveil-
lance and security, maritime and coastal
monitoring but also forestry, agriculture, and
the environment.
What's New?
The Conference was opened by Spot Image's CEO Hervé Buchwalter. In his excellent
address, he touched on many topics that were
discussed in detail in the following sessions.
Buchwalter started by outlining the new
shareholding arrangement: since July 2008,
Spot Image is a subsidiary of Astrium within
the EADS Group. Astrium EADS is Spot's
largest shareholder with 82%. Together with
Infoterra, Spot Image forms the Earth
Distribution pillar of Astrium Services. This
Earth Distribution division has a staff of 800
people, located in 13 countries with a total
turnover of 160 million Euros. Spot Image has
six subsidiaries, and two foreign offices, which
in total count 260 employees. The most recent
of these new subsidiaries is Spot Image Brazil,
located in São Paulo.
In terms of revenue, the company experienced
its seventh year of continued growth in 2008,
with total revenues of approximately 110 mil-
lion Euros. Of course, the current economic
situation may have a big influence on new fig-
ures, but Buchwalter was nonetheless opti-
mistic about the future. In fact, this was con-
firmed in other presentations during the rest
of the conference, by users stating that satel-
lite image-utilization is indeed economically
beneficial. By using the same imagery for mul-
tiple purposes, clients are able to reduce
costs. Spot Image sees a growth in the use
of satellite imagery compared to other sources
such as aerial imagery, which has technical
limitations as well. Some interesting new uses
for satellite imagery were shown during the
conference like Oenoview, a new space-based
service for the wine-growing industry.
A great deal of Buchwalter's opening speech
was devoted to Spot Images new strategies
and commercial orientations. In order to main-
tain its market position, Spot Image is invest-
ing a lot in generating new capacities leading
to better coverage in less time. In practice,
this means new satellites and more of them
for capturing imagery. Perhaps the biggest
news from this conference was the announce-
ment of no less than two constellations,
which in total make four new satellites: Spot
6 and 7 and Pléiades-1 and 2. But this is only
one side of the story: a lot is being done to
invent and implement new applications and
business models and, lastly, an increase in
the geographical presence on the planet
ensuring better and quicker access to data
and services. This development can be explained by the growth of data capacity and server technology. Clients want
a lot of data at once, and they want it quick-
ly. The technology is there so there is no rea-
son why a company should not be able to
meet these demands.
Planet Action
Apart from its commercial activities, the con-
ference also featured a program on Planet
Action, a corporate citizen initiative, launched
at ISDE in San Francisco in 2007. Its mission
statement is to provide geographic informa-
tion and technology to support local projects
for action on climate change-related issues.
This mission statement has its origin in the
public debate on global warming, which real-
ly reached the agenda in 2006. The idea
behind Planet Action is that earth observation
technologies can be deployed to support
action in fighting climate change. Spot Image
owns millions of images that can be used to
monitor changes in the Earths climate. In
addition, it has the expertise and knowledge
to analyze geoinformation, together with a
large worldwide network which can work
together in the fields of GIS, education and
so on. At the moment, there are more than
120 Planet Action projects on all continents
that make use of imagery, GIS software or
image analysis expertise. It is expected that
by the end of 2009 there will be more than
200 Planet Action projects in total, with over
3 million km² of imagery donated by Spot
Image. Already Planet Action is in a number
of partnerships with companies such as ESRI,
ITT, UNESCO and Definiens and new partner-
ships are on the way. The Planet Action theme
was revisited at the end of the first confer-
ence day in a round table discussion on the
role of the industry in the fight against cli-
mate change.
The Future: Two New Constellations
The two new constellations were given much
attention in the remainder of the morning pro-
gram. The first constellation consists of
Pléiades-1 and Pléiades-2, and the second of SPOT 6 and SPOT 7. The new constellations meet the strong demand for SPOT 5-like data, such as large area collection, swath size, resolution and quality.
The launch of Pléiades-1 will take place sometime between December 2009 and May 2010, and Pléiades-2 will be launched 15
months after the first launch. Meant for VHR
(Very High Resolution) imagery, the constella-
tion will have the following collection capa-
bilities: 1 day revisit (with two satellites that
is), a target collection of 20-30 targets per
pass, strip mapping up to 70x300km single
AOI mosaic collection, stereo capabilities
(tri-stereo) up to 6000km swath per stereo
collection and direct tasking up to 20 minutes
before actual acquisition. In terms of agility,
the constellation is superior because it benefits from Control Moment Gyros (CMGs) that provide an acceleration 10 times better than its competitors'.
The VHR imagery is 50cm color made of
simultaneous and systematic Pan and
Multispectral data collection (2m color and
50cm panchromatic). Programming of
Pléiades will consist of three mission plans per day, which gives it the ability to take urgent programming into account four hours before collection. The Pléiades programming
consists of a centralized concept of opera-
tions. It starts with an order request from a
customer, which is then followed by an order
confirmation. After this a collection plan is
made and uploaded to the satellite. After
telemetry downlinking, the imagery is directly
received and processed on the ground.
Spot 6 and 7 will be launched in 2012 and
2013. Spot 6 will complement Spot 5 as of
2012. Together, Spot 6 and 7 will secure the
availability of 2m data up to 2023 and have
large and optimized collection capabilities.
Together, they will complement the Pléiades
VHR data: both constellations are flying in the
same period and have common exploitation
through the ground segment. In the long
term, SPOT 6 and 7 will replace SPOT 5, which
will remain operational until 2014.
Other market trends are new customers that
require large image capacities, such as the oil
and gas industry. Also, there is a consistent
combined need for VHR and HR (High
Resolution) data. Spot Image expects to meet
all these demands with the combination of
these two new constellations. SPOT 6 and 7
products can be Ortho products, 2m PAN-
sharpened products, but also advanced prod-
ucts such as mosaics in a map projection.
Compared with SPOT 5, the new constellation
will offer a better resolution, high location
accuracy, improved agility, reactive tasking
and improved weather forecast accuracy. This
all allows for 2m color Ortho images, large
imaging capacities, better use of resources,
coverage of areas with difficult weather, and
faster product delivery and availability.
Eric van Rees (evanrees@geoinformatics.com) is editor in chief of GeoInformatics.
For more information on the Conference,
have a look at www.spotimage.fr/web/1144-siconf-
09-programme-presentations.php.
Many thanks to Anne Marie Bernard for providing
cover imagery and presentations.
Forensics versus Research
The title of this book immediately describes its contents, since it is a contraction of the words 'Geo' and 'Forensics'. With the first part, you, as a reader of this magazine, should be more than comfortable. The second part, however, may seem distant, something known mainly from televised series such as Crime Scene Investigation.
By Huibert-Jan Lekkerkerk
This book describes the use of geophysical acquisition and analytical methods, Geographic Information Systems and geostatistics in crime scene investigation (or, more formally, forensics).
General Contents
The book contains short descriptions of all the systems involved as well as a great many examples from the crime scene world. From the non-forensic point of view the book gets boring at times, since the application of the techniques is described on a per-technique basis. What the book does do, however, is open the eyes of those involved in more regular geographical and geophysical work to the potential application of their everyday tools to this line of research.
Forensics versus Research
The book also clearly brings out the major differences between the everyday use of our familiar tools and their use in forensics.
In some sense geoforensics is like prospecting for oil or locating a
wreck. We have a good indication that it must be there, but we also
have no clue as to the exact whereabouts or where to start searching
and with exactly which method.
Another major difference is of course in the legal implications of the
research being performed. Ultimately, a member of society will be acquitted of a crime or not based on, amongst other things, the results of the forensic process.
Readability
The downside of this book is that it is not easy to read, with its constant references to academic literature and the research of others. Another aspect that makes it difficult to read is the division into specific subjects. Someone who already knows the forensics process may find this useful, but a more comprehensive introduction to geoforensics and cross-references between the chapters would have been helpful as far as I'm personally concerned.
Companion Website
As with more and more books, this book also has a companion web-
site. One advantage of the website is that corrections to the book can be found there, helping you keep your copy up to date. The date of the corrections is, however, not mentioned.
Apart from the corrections, the content of the website is disappointing;
the additional information presented would have easily fitted in anoth-
er few pages of the book. Maybe the additional material will be extend-
ed, but so far it seems more of a sales argument than an additional
benefit to the reader/user.
Conclusion
The book offers good insight into the application of everyday geo-information techniques and tools to the world of forensics. It is aimed at undergraduate and postgraduate students in both forensics and the geosciences.
For the latter category, I can say that the book will be hard to read due to the lack of a proper introduction to forensics in general. Students in forensics will probably gain more from the book, given its division into geo techniques and their application to forensics. However, for these students an overview of the techniques and their applicability would probably have been useful as well.
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com is project
manager at IDsW and freelance writer and trainer.
This article reflects his personal opinion.
Title: Geoforensics
Editors: Alastair Ruffell and
Jennifer McKinley
Publisher: Wiley-Blackwell
ISBN: 978-0-470-05735-3
Nr of pages: 331
Geoforensics
Column
Machine Learning as a Tool for Image Analysis and Information Extraction
The extraction of information from remotely sensed data for digital
mapping and GIS database development has until recently been based
on the use of high resolution airborne and/or satellite images, some-
times only in panchromatic mode. This has resulted in limited progress
in the development of automatic methods that will perform over a
range of images and scenes. However, with the growing availability of
multiple sources of data with a range of resolutions and spectral char-
acteristics, which include photogrammetric quality images, multispec-
tral and hyperspectral images, discrete and full waveform lidar, and
interferometric SAR, the prospects are for much greater success in the
automatic extraction of digital spatial information based on new image
analysis approaches that include machine learning techniques devel-
oped by computer scientists over the past two decades.
The principle of these methods is to learn to recognize complex pat-
terns and make intelligent decisions from sensor data. Machine learn-
ing is also closely related to fields such as statistics, probability theo-
ry, data mining, pattern recognition, artificial intelligence, adaptive
control, and theoretical computer science. Many machine learning
software packages have been developed and tested including: Accel;
classweb; focl; foil; golem; index; learn_pl; miles; mobal; oc1; occam;
pebls; rwm; utexas; C4.5 and C5; and Weka. Also, several scientific
software and commercial packages provide tools and functions for
machine learning. Common machine learning algorithm types include:
supervised learning; unsupervised learning; semi-supervised learning;
reinforcement learning; transduction; and learning to learn. For super-
vised learning, the most popular approaches of machine learning
include neural networks, decision trees and support vector machines.
Given a set of attributes of a certain feature in an image, the algo-
rithm can learn how to distinguish this feature from other features
with high success rates.
Decision trees, for example, are based on the construction of a tree
structure of logical, if-then, conditions related to the attribute value. If
the condition at a node is fulfilled, the branch to the left is followed,
otherwise the right branch is chosen. The process continues until the
algorithm reaches a node with pure data, data from one class, as a
final classification. The approach enables classifications to be highly
automated compared with most of other traditional approaches in
which the selected attributes must remain fixed. Also, the classifica-
tion rules created in one training area should be applicable to other
similar areas if the characteristics of the data are stable.
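As a hedged illustration of this idea (not taken from the column; the feature names, data and the scikit-learn choice are placeholders), a decision tree can be trained on per-pixel or per-object attributes and then used to classify new samples:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Toy attributes per sample: [near-infrared reflectance, red reflectance, lidar height]
X_train = rng.random((200, 3))
y_train = (X_train[:, 2] > 0.5).astype(int)   # pretend labels: 1 = elevated object, 0 = ground

tree = DecisionTreeClassifier(max_depth=4)    # learns if-then splits on the attribute values
tree.fit(X_train, y_train)
print(tree.predict([[0.6, 0.3, 0.8]]))        # classify one new feature vector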
Compared with other methods, machine learning algorithms are a fair-
ly straightforward alternative approach for image analysis and land
cover mapping from multidimensional data, because they are non-
parametric and do not require any assumptions about the distribu-
tion of the input data. Furthermore, they have the advantage that they
can operate on a mixture of categorical and continuous variables. Our
experiments have shown that some feature attributes are useful only
for one machine learning algorithm while others are useful for more
than one algorithm. This suggests that construction of hybrid classi-
fiers based on multiple machine learning algorithms operating simultaneously, taking advantage of all attributes, should achieve a more effective and robust decision-making process.
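One way such a hybrid classifier could be sketched (the model choices and parameters below are illustrative assumptions, not a reported implementation) is as a soft-voting ensemble in which several learners contribute their class probabilities:

from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

hybrid = VotingClassifier(
    estimators=[
        ("trees", RandomForestClassifier(n_estimators=100)),
        ("svm", SVC(probability=True)),
        ("neural_net", MLPClassifier(max_iter=1000)),
    ],
    voting="soft",   # average the predicted class probabilities of the three learners
)
# hybrid.fit(X_train, y_train); hybrid.predict(X_new)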
Machine learning methods are promising approaches for much more
efficient and accurate automatic extraction of features for digital map-
ping and GIS database development.
John Trinder, Visiting Emeritus Professor at the School of Surveying and Spatial Information Systems, The University of NSW, Sydney, Australia.
Under the Sea
Ocean Depths
During the last few months the deep regions of our earth seem to have been
getting a fair bit of attention. That started me wondering a) why there was so
much attention and b) if all that attention is really warranted. But to answer
these two questions, let's first have a look at the facts.
By Huibert-Jan Lekkerkerk
The Facts
About 71% of our planet is covered with water,
totaling about 1,405,000,000 cubic kilometers
of water if one cares to include the ice caps of
Greenland and Antarctica. About 95% of all that
water is salt water. The average depth of the
world's oceans is about 3,800 meters, with the
deepest point at 10,920 meters below sea level
in the Mariana trench (Challenger Deep). In com-
parison, the average height of the land masses
is in the order of 840 meters with the highest
point (Mount Everest) at 8,848 meters.
The dry surface of the earth has been explored
for as long as there have been humans. The
watery part has only been really explored for
the last few hundred to a thousand years. For
most of that period man was only interested in
the general geography, however.
In oceanographic circles the Challenger expedi-
tion (1873-1876) is usually taken as the real
start of the exploration of the oceans depths.
Similar expeditions followed such as the
German Meteor expedition in the Southern
Atlantic.
The equipment used in those days was, how-
ever, quite old-fashioned with the major instru-
mentation being the lead and line where every
depth measurement took a couple of hours.
The later expeditions started to use the acous-
tic echo-sounder which could perform a mea-
surement every few minutes without having to
stop the vessel.
Even so, data was collected along survey lines
and even on the longer expeditions only a few
survey lines (profiles) per ocean / sea / inlet
were taken, giving nothing but a general
impression of the depths. For example, on the
Dutch Snellius expedition (1929-1930) cover-
ing the entire area of Indonesia only about
30,000 soundings were taken (about one for
every two kilometers of sailing).
It was mainly after the Second World War that
charting the navigable parts of the worlds
oceans and seas was really taken up. In 1970-1980, during the International Decade of Ocean Exploration, research increased but was still per-
formed in ways similar to before. It was only
with satellites that the ocean images we are
familiar with today became known.
Accompanying this article is one of those
images for the Atlantic Ocean. The image, which
looks very impressive, is constructed of satel-
lite radar data and little of it has been actually
surveyed. Most of the depths presented have
been inferred from gravitational differences. One
of the other illustrations with this article shows
the difference between these inferred depths
and real acoustic soundings performed from a
ship.
In short, only a fraction of the ocean has been
really surveyed, mainly those parts where oil is
or was expected and those frequented by ships
where the draught of those vessels becomes
critical.
Why the Attention?
Now back to my original question: why all this
sudden attention to the depths of the ocean?
Probably the big drive was the introduction of
the sea bottom in Google Earth. Because, let's
face it, after all these years of trying to sell our
(digital) cartographic products to the general
public it was Google who succeeded in liber-
ating the use of digital geographic information
to the masses with Google Earth and Google
Maps.
So the introduction of the sea bottom all of a
sudden made the general public aware of the
existence of what we normally can't see
because it is hidden under a few kilometers of
water. When zooming in on Google, for exam-
ple on the North Sea, the limitations of the
information also become directly visible; the
information used is coarse grained and with-
out much detail. Actually the source used in
Google is the General Bathymetric Chart of the
Oceans (GEBCO), an initiative started by Prince
Albert I of Monaco at the end of the 19th cen-
tury and updated with information from a
Space Shuttle mission in 2000.
Another reason for paying new attention to the
ocean depths was a recent announcement from
the Woods Hole Oceanographic Institution
(USA), a legendary name in oceanographic
research. They announced that their underwa-
ter robot, Nereus, had reached the world's deep-
est part, Challenger Deep. Since its discovery
in the 1870s by the first Challenger expedition,
that bottom has only been reached three times.
Challenger Deep in Google Earth.
Only the first of these three times was a
manned expedition, by Jacques Piccard and Don
Walsh in 1960. The only other descent to this
depth was the Japanese research robot Kaiko
in 1995.
Does it need Attention?
Based on the facts presented above the answer
could go two ways. On the one hand, nothing
special has happened over the last few
decades. Progress has been slow and has not
significantly sped up. On the other hand, if we
want to make progress, then any (positive)
attention is welcome. Our knowledge of the
geography of the moon (or for that matter Mars)
is now almost equal to or better than our
knowledge of our own world.
The lack of knowledge is not a matter of tech-
nology; there are multibeam echo sounding sys-
tems available that can measure ocean depth.
The problem is one of time (and money). Most survey vessels can only measure, at the most, at speeds of 15 knots (about 7.5 meters/second). The most advanced systems can cover a swath of about 2-3 times the water depth and still obtain reasonable results. With an average depth of 3,900 meters that is about 75,000 square meters per second, or about 6,480 square kilometers per day. The only problem is that the total surface to be surveyed is approximately 70% x 4 x π x 6,380 x 6,380 square kilometers, roughly 358 million square kilometers, or some 150 years of continuous surveying! Even if I've grossly overestimated the parameters it would still take decades to fully survey the ocean floor using the current techniques.
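The same estimate, rerun as a few lines of Python with the assumed figures spelled out (round numbers for illustration, not survey-planning values):

import math

speed_ms = 7.5                    # roughly 15 knots
avg_depth_m = 3_900.0
swath_m = 2.5 * avg_depth_m       # multibeam swath of about 2-3 times the water depth
coverage_km2_per_day = speed_ms * swath_m * 86_400 / 1e6    # ~6,300 km2 per day
ocean_km2 = 0.7 * 4 * math.pi * 6_380.0 ** 2                # ~358 million km2
print(ocean_km2 / coverage_km2_per_day / 365)               # on the order of 150 years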
Conclusion
We know less about the ocean floor than we
know about the remainder of the earth. In
order to get better knowledge two things are
needed. One is an awareness that we know
as little as we do and publicity helps in that.
The other thing that needs to be done is to
set aside a substantial amount of money to
collect all the existing data and to collect as
much new data as possible to finish a work
that was started over a century ago by Prince
Albert I of Monaco with the first version of
GEBCO.
Huibert-Jan Lekkerkerk
hlekkerkerk@geoinformatics.com is project manag-
er at IDsW and a freelance writer and trainer.
This article reflects his personal opinion.
Atlantic Ocean from the General Bathymetric
Chart of the Oceans (GEBCO source NOAA)
Achieving a More Reliable Delivery for Growing Energy Needs
Smart Grids around the World
"Smart grid is gaining traction throughout the world as a means to conserve resources, lessen pollution and increase the security and resiliency of power grids," states Tony DiMarco, Intergraph Director of Global Utilities and Communications.
For a number of years, Intergraph has been active in the field of smart grids. But what exactly is it that smart grids do, and
what is their link with geospatial technology? We put this question, and a number of others, to Mr. DiMarco. In the
following interview, he discusses Intergraphs involvement in offering new smart grid solutions, the use of geospatially
powered applications for smart grid implementation, the current yet diverse Smart Grid initiatives that are evolving
outside the U.S., and the spread of Smart Grids into other markets and industries.
By Eric van Rees and Remco Takken
In your recent presentation "The Operations Center, Often Neglected, yet Crucial to Your Smart Grid Strategy", you talked about the chal-
lenges and new technologies that
enable Smart Grid operators to con-
trol their entire distribution network
from a consolidated graphical user
environment. Can you address these
challenges and discuss the new
technologies that Intergraph is cur-
rently involved with?
TD: Many people think about smart grid
in terms of new technologies such as
smart meters, advanced communication
networks and demand side management,
just to name a few. While these cutting-
edge technologies are indeed all critical
to a complete smart grid strategy, the
piece that many utilities embarking on
smart grid implementations are missing
is a way to logically tie all of these tech-
nologies together and make sense of all
the new network data they are providing
to operations personnel. As more tech-
nologies are added to the grid, the
amount of network data available to grid
operators grows exponentially, but the
information is not actionable if it cannot
be quickly and easily digested and under-
stood. To help utilities avoid information
overload and maximize their smart grid
technologies, Intergraph offers a Smart Grid Command-and-Control
Center that allows operators to view and control their entire distribu-
tion grid through a single interface, maximizing efficiency and enabling
safer and more reliable grid operation. The Smart Grid Command-and-
Control Center is an evolution of Intergraph's core infrastructure man-
agement (G/Technology) and outage management (InService) technolo-
gy, and can integrate power systems analysis and SCADA technologies
from providers like Siemens Energy, and others to create a single, easy
to understand picture.
Also, you state that Smart Grid
inherently has many more sources
of information for the operator to
manage. How can geospatially-
enabled visualization technologies
lessen this information overload and
at the same time increase the per-
formance of the Smart Grid?
Essentially, geospatially-powered
applications provide an important founda-
tion for any smart grid implementation. A
computer model (database) of the distri-
bution network is fundamental to SCADA
and to the integration of power systems
analysis to support operations. Without
knowing all of the details on your assets,
their location and status at any given
moment, it is impossible to obtain a clear
picture of your grid and operate at peak
performance levels. Seeing the grid more
clearly leads to faster, more informed deci-
sion making and more productive and
rapid deployment of field resources, which
helps keep the power on and improve
operational resiliency in the face of equip-
ment failures, extreme weather or other
challenges to operations.
In January 2009, President Barack
Obama asked the US Congress to
pass legislation that included build-
ing a new electricity Smart Grid.
Also, in April, the first national Coordinator for Smart Grid
Interoperability was named. How do you value the Smart Grid
initiatives in Europe and China compared to those in North
America (US and Canada)? Does Europe need to catch up com-
pared to the US? Do you see challenges and opportunities
ahead for both Europe and North America?
Tony DiMarco
Smart grid is gaining traction throughout the world as a means to conserve resources, lessen pollution and increase the security and
resiliency of power grids. A number of areas in North America have
been very aggressive about pursuing wide-scale smart grid plans,
including Ontario, Canada and Boulder, Colorado. Intergraph has
been working with Oncor Electric Delivery in Texas since 2007 and
Enersource Hydro Mississauga in Canada since 2008 on smart grid
projects. National governments around the world, from China and
Europe to the U.S. and Canada, are stepping in to ensure that
smart grids become a reality in their nations. In fact, smart grid is
a key topic on the conference program for Intergraph's China Users
Conference being held in Beijing in July. While each nation will
face its unique challenges, I believe there is currently great oppor-
tunity for any company involved in smart grid, and I see smart
grid soon becoming ubiquitous around the world as all nations
work to achieve more reliable power delivery for their growing
energy needs while working to minimize the impact of electric
power generation on the environment.
Safety and security are, now more than ever, serious issues
to take into account in relation to Smart Grid, both online
(the electricity distribution system on the internet) and offline
(the physical system itself). In what way can Intergraph help
operators improve the safety and security of Smart Grid?
While it does involve putting more of an electric system
online and therefore opening it up to the vulnerabilities of cyber
attack, the smart grid movement will inherently make grids safer
and more secure in the long run since the entire network can be
monitored in real time and potential issues can be addressed
immediately. Intergraph is uniquely positioned to address grid
security due to our long history in the transportation security
and intelligence fields. Intergraph has been engaged in cyber
security assessments and remediation programs since the early
1990s, so this is an area in which we possess a lot of experi-
ence. By working closely with our Intergraph security experts,
customers can both address physical critical infrastructure pro-
tection needs and participate in cyber security assessments and
remediation to ensure that their operations center and their grid
monitoring and control systems are as impenetrable as possible
to attack.
How smart do you think Smart Grid really is, or will become?
Do you think the network will be communicating by itself? Is
one part of the network actually communicating with another
physical part? And if so, how should this network speech be
managed, and by whom?
Intergraph Smart Grid Command-And-Control Center.
TD: The ultimate goal of smart grid is to have a self-healing network
that automatically detects and remediates potential issues before
they create problems like outages and widespread blackouts.
However, as with any other automated technology, there will always
need to be some level of human intervention to ensure that the tech-
nology is doing what it is supposed to do. Therefore, we will still
need grid operators, but their jobs will evolve and become less man-
ual as grids incorporate more automation, which will allow opera-
tors to work more efficiently.
What about the effect on end users while automating electricity
flow: will they be able to bring their own solar energy into the
network? Are there already marketable ideas with regard to
providing customers with smart meter reading, in order to
calculate whether they are energy users or energy suppliers or
both?
One of the key benefits of smart grid is that it allows for the incor-
poration of alternative energy sources like solar and wind power into
the grid. In addition to enabling consumers to better monitor and con-
trol their energy usage, another intent of smart meters and smart grid
is to allow consumers to feed unused energy from electric cars for
example back into the grid, enabling them to save on energy bills
while helping to alleviate strains on energy supplies.
Do you foresee other markets and industries where the ideas
behind the Smart Grid concept can be applied?
Technically any utility can adopt the smart grid model. Water,
wastewater, gas, pipeline and communications providers all share the
same goal as electric utilities to provide a service to customers as
efficiently, reliably and safely as possible. I think the modernization of
other utilities' distribution networks is in the near future. In fact, an
Intergraph customer, EPB of Chattanooga, which provides both electric-
ity and high-speed communications to its 163,000 customers in
Tennessee, is building a fiber optic smart grid to both provide cus-
tomers more control over managing their energy usage and provide
state-of-the-art communications services including video and faster
high-speed broadband Internet to homes and businesses.
Overall, smart grid is currently dominating most discussions surround-
ing electric utilities in the United States and abroad. In the near future,
I see it permeating throughout the world and into additional utilities
sectors, adding automation to the network and drawing attention to
better ways of visualizing, controlling and deploying resources to oper-
ate systems. Improved productivity and resiliency are hallmarks of smart
grid that can be applied to other markets and industries.
Tony DiMarco, Intergraph Director of Global Utilities and Communications.
Eric van Rees is editor in chief of GeoInformatics, Remco Takken is
editor of GeoInformatics.
For more information, have a look at
http://www.intergraph.com/learnmore/sgi/utilities-and-communications/
smart-grid-technology.aspx
Operating a Smart Grid.
Calendar 2009
21-25 September Conference on Spatial
Information Theory (COSIT '09)
Aber Wrac'h, France
E-mail: claramunt@ecole-navale.fr
Internet: www.cosit.info
22-24 September InterGeo 2009
Karlsruhe, Germany
Internet: www.intergeo.de
23-27 September International Summer
School for Cultural Heritage Documentation
Mrida, Spain
Tel: + 34 666 278 798
Fax: + 34 924 314 205
E-mail: administracion@gavle.es
Internet: www.doparex.com
October
05-08 October IXth International Scientific
and Technical Conference "From imagery to
map: digital photogrammetric technologies"
Attica, Greece
Tel: +7 495 720 5127
Fax: +7 495 720 5128
E-mail: conference@racurs.ru
Internet: www.racurs.ru
06-08 October 10th Austrian Geodetic
Congress
Schladming, Austria
E-mail: office@ogt2009.at
Internet: www.ogt2009.at
07-09 October ESRI Latin American User
Conference
Bogot, Colombia
Tel: 57 1 650 1575
E-mail: lauc09clombia@prosis.com
Internet: www.procalculoprosis.com/lauc09
11-14 October Electric & Gas User Group
(EGUG) Conference
Atlanta, GA, U.S.A.
Tel: +1 909 793 2853, ext. 4347
E-mail: prattanababpha@esri.com
Internet: www.esri.com/egug
14-16 October ESRI European User
Conference
Vilnius, Lithuania
Tel: +270 5 2150575
E-mail: conference@euc2009.com
Internet: www.esri.com/euc
18-21 October Pictometry FutureView 2009
Lake Buena Vista, Florida, U.S.A.
Internet: www.pictomery.com
19-22 October 7th FIG Regional Conference
Hanoi, Vietnam
Internet: www.fig.net/vietnam
November
03 November Springl 2009
Seattle, WA, U.S.A.
E-mail: daij@leda.nvc.cs.vt.edu
Internet: www.cs.purdue.edu/homes/daic/
springl09
04-06 November 17th ACM GIS 2009
Seattle, WA, U.S.A.
Internet: www.acmgis09.cs.umn.edu
04-05 November 4th International
Workshop on 3D Geo-Information
Ghent, Belgium
E-mail: 3dgeoinfo2009@geonet.ugent.be
Internet: www.3DGeoInfo.org
10-12 November ESRI Middle East and
North Africa User Conference
Manama, Bahrain
Tel: +973 1726255
E-mail: meauc2009@esri.com
Internet: www.esri.com/meauc
16-19 November ASPRS/MAPPS 2009 Fall
Conference
San Antonio, TX, Texas Crowne Plaza
Hotel, U.S.A.
Internet: www.asprs.org
25-27 November Remote Sensing of Land
Use and Land Cover
Bonn, Germany
E-mail: zfl@uni-bonn.de
Internet: www.sfl.uni-bonn.de
December
01-03 December 4th International
Conference "Earth from Space - The Most
Effective Solutions"
Moscow, Russia
Tel: +7 (495) 739 73 85
Fax: +7 (495) 739 73 53
E-mail: conference@scanex.ru
Internet: www.transparentworld.ru/
conference
07-08 December Web & Wireless GIS,
W2GIS 2009
Maynooth, Ireland
Tel: 353 1 402 32 64
E-mail: carswell@dit.ie
Internet: www.w2gis.org
16-20 December International Congres
Geotunis 2009
Tunis, Tunisia
Tel: + 216 71 341 814
Fax: + 216 71 341 814
E-mail: info@geotunis.org
Internet: www.geotunis.org
2010
01 January 2010 FIG Sydney
Sydney, Australia
Tel: +61 (02) 6285 3104
Fax: +61 (02) 6282 2576
E-mail: info@isaust.org.au
Internet: www.isaust.org.au
02-04 February Gi4DM 2010 Conference
Torino, Italy
E-mail: info@gi4dm-2010.org
Internet: www.gi4dm-2010.org
26-30 April 2010 ASPRS Annual Conference
San Diego, CA, Town and Country Hotel,
U.S.A.
Internet: www.asprs.org
25-29 May Fourth International Scientific
Conference BALWOIS 2010
Ohrid, Republic of Macedonia
E-mail: secretariat@balwois.com
Internet: www.balwois.com
July
27-31 July GeoWeb 2009
Vancouver, Canada
E-mail: chiebert@galdosinc.com
Internet: http://geowebconference.org
28-30 July MultiTemp 2009 - Fifth
International Workshop on the Analysis of
Multi-temporal Remote Sensing Images
Groton, Connecticut, U.S.A.
Internet: http://clear.uconn.edu/multitemp09/
index.html
August
02-06 August SPIE Optics + Photonics
2009
San Diego, CA, U.S.A.
Internet: www.spie.org
E-mail: media@spie.org
04-07 August 10th South East Asian
Survey Conference (SEASC'09)
Bali, Indonesia
E-mail: dkirana@bakosurtanal.go.id
12-14 August 17th International Conference
of Geoinformatics 2009
Fairfax, Virginia, U.S.A.
E-mail: info@geoinformatics2009.org
Internet: www.geoinformatics2009.org
28-29 August iGEOMAP2009, Urban
Infrastructure and GeoInformatics
Bangalore, India
E-mail: info@igeomap.org
Internet: www.igeomap.org
31-August - 03 September SPIE Europe
Remote Sensing 2009
Berlin, Germany
Internet: www.spie.org
E-mail: peterb@spie.org
31 August - 04 September Geodesy for
Planet Earth IAG2009
Buenos Aires, Argentina
Internet: www.iag2009.com.ar
September
07-09 September The Society of
Cartographers Annual Summer School
Southampton, United Kingdom
Tel: 0208 411 5355
e-mail: steve8@mdx.ac.uk
Internet: www.soc.org.uk/southampton09
09-12 September 6th International
Symposium on Digital Earth
Beijing, People's Republic of China
Internet: www.isde6.org
10 September RISK Management -
International, Interdisciplinary Workshop
Berlin, Germany
Internet: www.codata-germany.org
16-17 September GIS in the Rockies 2009
Loveland, CO, U.S.A.
E-mail: chair@gisintherockies.org
Internet: www.gisintherockies.org
21 September International Workshop on
Presenting Spatial Information: Granularity,
Relevance and Integration (in conjunction
with COSIT '09)
Aber Wrac'h, France
Tel: +61 3 8344 7875
Fax: +61 3 9347 2916
E-mail: winter@unimelb.edu.au
Internet: www.sfbtr8.spatial-cognition.de/
cosit09-psi
21-23 September ESRI Health GIS
Conference
Nashville, TN, U.S.A
Tel: +1 909 793 2853 ext. 3743
E-mail: ctveten@esri.com
Internet: www.esri.com/healthgis
Please feel free to e-mail your calendar notices to: calendar@geoinformatics.com
The last issue (June 2009, Volume 12) of GeoInformatics contained some errors. The image on page 51 should also read 'Courtesy Parsons Brinckerhoff'. The caption on page 29 should have read: 'Vilnius and its famous Belfry Tower, with a small piece of old town beyond. Photo credit: Aidas Zubkonis.'