DATA MANAGEMENT FOR MARINE GEOLOGY AND GEOPHYSICS:
TOOLS FOR ARCHIVING, ANALYSIS, AND VISUALIZATION
WORKSHOP REPORT
LA JOLLA, CALIFORNIA
MAY 14-16, 2001
TABLE OF CONTENTS
1. Overview
1.1 Motivation
1.2 Objectives
1.3 Agenda
1.4 Evaluation
1.5 Relevant URLs
Appendices
Appendix 1: List of Attendees
Appendix 2: Final Agenda
Appendix 3: Workshop Evaluation
Appendix 4: Relevant URLs
ON MAY 14-16, 2001 the National Science Foundation and the Office of Naval Research sponsored a workshop on Data Management for Marine Geology and Geophysics: Tools for Archiving, Analysis, and Visualization. The workshop was held at the Sea Lodge Hotel in La Jolla, CA. The workshop's objective was to bring together researchers, data collectors, data users, engineers, and computer scientists to assess the state of existing data management efforts in the marine geology and geophysics (MG&G) community, share experiences in developing data management projects, and help determine the direction of future efforts in data management.
10. Develop standardized tools and procedures to ensure quality control at all steps from acquisition through archiving.
11. Improve access to common tools for data analysis and interpretation for the benefit of the community.
12. Build data centers to address the needs of a diverse user community, which will be primarily scientists.
13. Enforce timely data distribution through funding agency actions.
14. Promote interactions among federal agencies and organizations, and international agencies to define data and metadata exchange standards and policies.

DATA DOCUMENTATION

15. Require ship operators and principal investigators (P.I.s) to submit level 1 metadata* and cruise navigation to the centralized metadata catalog at the end of each cruise as part of the cruise-reporting process.
16. Generate a standard digital cruise report form and make it available to all chief scientists for cruise reporting (level 2 metadata*).
17. Require individual P.I.s to complete and submit standard forms for level 1 and 2 metadata* for field programs carried out aboard vessels not in the University-National Oceanographic Laboratory System (UNOLS) fleet (e.g., foreign, commercial, other academic platforms).
18. Generate a standardized suite of level 1 and level 2 metadata* during operation of seafloor observatories and other national facilities (e.g., the Deep Submergence Laboratory, Ocean Bottom Seismograph (OBS) Instrument Pool), and submit to the central metadata catalog.
19. Require level 3 metadata* within each discipline-specific data center. Archiving of publications related to the data should also be included (level 4 metadata*).
20. Follow nationally accepted metadata standards (particularly for levels 1 and 3 metadata*).

A clear top priority of the workshop participants is to immediately define and establish a centralized metadata catalog. The metadata catalog should be broad, containing information on as many data types as possible. It should support geospatial, temporal, keyword, and expert-level searches of each data type. By definition, metadata are information about data that can evolve. The catalog should be a circular system that allows feedback from the user/originator. The metadata catalog should serve as the central link to the distributed network of data centers where the actual data reside.

To move forward, funding agencies must establish a small working group or advisory board to develop the structure and implementation of a metadata catalog. Additional working groups for each of the high-priority, discipline-specific data centers also need to be assembled. It is critical to obtain the active involvement of scientists in all aspects of this process through all operational phases, including data collection, processing, archiving, and distribution.

Section 2 of this report discusses these recommendations further.

*Metadata levels: Level 1. Basic description of the field program including location, program dates, data types, collecting institutions, collecting vessel, and P.I.s. Level 2. A digital cruise report and data inventory. Level 3. Data object and access information including data formats, quality, processing, etc. Level 4. Publications derived from the data.
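As an illustration only, the four metadata levels defined in the footnote above can be sketched as a simple record hierarchy. The class and field names below are hypothetical, chosen to mirror the level definitions, and do not represent any schema adopted by the workshop.

```python
from dataclasses import dataclass


@dataclass
class Level1:
    """Basic field-program description: location, dates, data types,
    collecting institutions, collecting vessel, and P.I.s."""
    location: str
    program_dates: str
    data_types: list
    institutions: list
    vessel: str
    pis: list


@dataclass
class Level2:
    """Digital cruise report and data inventory."""
    cruise_report: str
    data_inventory: list


@dataclass
class Level3:
    """Data object and access information: formats, quality, processing."""
    data_format: str
    quality_notes: str
    processing_history: list


@dataclass
class Level4:
    """Publications derived from the data."""
    publications: list


@dataclass
class CruiseMetadata:
    """Level 1 is generated at acquisition; higher levels may follow later."""
    level1: Level1
    level2: Level2 = None
    level3: Level3 = None
    level4: Level4 = None
```

A record starts life with only level 1 populated at the end of a cruise; levels 2-4 are attached as the cruise report, data documentation, and publications accumulate.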
1. OVERVIEW
1.1 MOTIVATION
MG&G SCIENTIFIC data collections are growing at a rapid rate (Figures 1 and 2). Processed and analyzed data made available to a broad community of scientists, educators, and the general public can be used for discovering and distributing new knowledge. A significant problem is how to provide data users with the means to effectively access these data and the tools to analyze and interpret them.

Advances in data storage technology have eliminated practical constraints on storing large data volumes and have permitted data collection at increasingly finer sample rates. New high-resolution systems provide digital images of the seafloor at sub-meter pixel resolution and generate data at rates on the order of Gigabytes per day (Figures 3 and 4). Seismic acquisition capabilities have greatly expanded with long-term deployment of bottom sensors. High-resolution multichannel seismic reflection (MCS) systems are now available for shallow-water problems, and digital acquisition of 480-channel data is currently routine for deep-ocean work (Figure 5).

With these new technologies it is becoming increasingly difficult for individual investigators to synthesize and organize data sets collected on single cruises, let alone manage them in a manner that allows data to be accessed efficiently by a larger user pool. National archiving of some marine geoscience data is carried out. There have been several attempts by individual P.I.s to establish geographic- or data-specific databases. However, access to many data types remains difficult and incomplete (e.g., MCS, multibeam bathymetry, sidescan sonar, camera, and video imagery). Large quantities of data are under-utilized by primary users and gaining access to these data is virtually impossible for secondary users. At the same time, our scientific interests are increasingly interdisciplinary and require easy access to the broad spectrum of data collected. Throughout the marine geoscience community, scientists want access to data, the ability to compare data of different types, and tools to manipulate and interpret these data (Figures 6, 7, 8).

With these concerns in mind, the National Science Foundation (NSF) and Office of Naval Research (ONR) sponsored a workshop on MG&G data management on May 14-16, 2001 in La Jolla, California. The coordinating committee advertised the workshop; participation was open. Approximately 80 representatives from science, engineering, computer science, government, and industry attended (Appendix 1). The workshop provided a forum for a focused interchange of ideas among data users, data providers, and technical experts in information technology.

[Figures 1 and 2 (chart residue): NGDC data growth, archive size in terabytes, January 1992 through July 2001.]

Figure 3. Sun-illuminated perspective view of the Eel River margin of Northern California. Multibeam bathymetry (EM1000, Hydrosweep and Seabeam data) are merged with the USGS 30 m DEM for the adjacent land. Three-dimensional visualization of the merged topography is carried out using Fledermaus from Interactive Visualization Systems and Analysis. (Fonseca, L., Mayer, L. and Paton, M., ArcView Objects in the Fledermaus Interactive 3-D Visualization System: An example from the STRATAFORM GIS, in Wright, D.J. (ed.), Undersea With GIS, Redlands, CA: ESRI Press, in press, 2001).

Figure 4. Three-dimensional shaded relief map of the East Pacific Rise near 9°-10°N. This is currently the best-studied section of fast-spreading mid-ocean ridge. Figure courtesy of Dawn Wright (OSU).

Figure 5. Example of a multichannel seismic reflection record from the northwest shelf of Australia. Seismic interpretation of various reflectors is superimposed with various colors. Data are stored and displayed within the GEOQUEST IESX seismic-well log integrator/data browser. The power engine of IESX is an Oracle-based database system that organizes seismic, well log and geographical data in a local environment for interpretation. Figure courtesy of Garry Karner (LDEO).

Figure 6. Example of capability provided by MapApp, a web-based, map-driven database interface developed for the RIDGE Multibeam Synthesis project (see Appendix 4). Figure shows a multibeam bathymetry map of Axial Seamount, NE Pacific, with a user-defined profile location and corresponding bathymetry profile displayed. Figure provided by Bill Haxby (LDEO).

Figure 7. The Virtual Jason Control Van is a web-based application that takes real-time snapshots of information occurring inside the control van during vehicle operations and makes this information immediately available for shipboard scientists and for collaboration and post-cruise analysis on shore. Features include monitoring real-time operations, searching for events, dates, etc. Figure courtesy of Steven Lerner (WHOI).
1.2 OBJECTIVES
The overall goal of the workshop was to develop a strategy for MG&G data management that would meet scientists' needs for improved data access and improved tools for data analysis and interpretation. Accomplishing this goal will lead to greater use of data by the MG&G community and the education and outreach community.

[Figure 8 (diagram residue): experimental ocean data, processing (Matlab), geodynamic application, model and seismic velocity parameters, adjusted constraints; caption not recovered.]
1.3 AGENDA
The first day of the workshop was devoted to short talks, each followed by a brief discussion. In addition, a longer discussion followed each group of subject-specific talks. The longer discussion was led by a pre-assigned discussion leader. Our intent was to engage the participants in the meeting right from the start through the discussion.

The workshop began with presentations from data users. The talks focused on problems that P.I.s have had in the past with gaining access to data, and possible solutions to these problems. Representatives from large, multidisciplinary MG&G programs gave overviews on anticipated database needs for new program initiatives. Individual P.I.s made presentations on database projects which they initiated, providing working examples of data access and functionality over a range of disciplines. In the late afternoon of the first day, workshop participants presented models for data access. The format of this session was somewhat different as the talks served as introductions for demonstrations that were part of the evening poster session. In addition to these invited demonstrations, the evening session included posters and demonstrations contributed by workshop participants.

The second day of the workshop began with presentations by representatives of organizations with large central databases. Talks focused on anticipated future directions in data access and database design as well as insights on successes and major obstacles encountered during their efforts to date. The final set of talks focused on current developments in information technology, including data mining issues and designing databases to serve real-time data.

The presentations on days 1 and 2 served as catalysts for discussions that were held within the theme working groups, each of which consisted of an interdisciplinary group of scientists, engineers, and computer scientists. These working groups addressed a number of questions that formed the basis for presentations in the morning of the third day of the workshop.

The full agenda is given in Appendix 2. Presentations and poster abstracts can be obtained through the workshop conveners.

1.4 EVALUATION

An evaluation form was included in the workshop packet that participants received. The forms were collected at the conclusion of the workshop. The responses to the questions have been compiled and are presented in Appendix 3.

1.5 RELEVANT URLS

During the meeting, participants were asked to provide links to web sites that are relevant to MG&G database efforts. This URL list is provided in Appendix 4.
2. WORKING GROUP SUMMARIES
Working Group 1 considered how to structure an MG&G data management system. Currently, some data are archived at the National Geophysical Data Center (NGDC), such as the suite of underway geophysical data (navigation, gravity, magnetics, topography) collected on most large UNOLS vessels. However, it is not standard practice to submit to NGDC all data collected by the MG&G community.

Several ship-operating institutions have archived data at some level. However, no standardization across institutions exists and these efforts have been carried out at the discretion of the individual institutions. The need for a sound data management system is recognized, and a few workshops have been held to address this problem for specific data types (e.g., MCS Workshop, La Jolla, CA, 1999). In addition, individual P.I.s have made specific data types available to the broader community (see Appendix 4). It is evident that there is no community-wide strategy in place to solve MG&G data management problems.

QUESTIONS CONSIDERED

Working Group 1 addressed the following questions:
- What model is appropriate for a data management system (e.g., distributed versus centralized)?
- How do we fund the data management system?
- How do we evaluate the system?
- Do we need a permanent archive?

There was clear agreement within the group that the MG&G community needs a distributed data management system with a coordination center to facilitate communication among different data centers. The working group session started with a discussion of metadata, indicating the importance of metadata to attendees and the MG&G community in general. The consensus of Working Group 1 is that the community must begin taking small, concrete steps towards establishing a metadata catalog. From there the community should move towards a discipline-oriented, distributed data management system that will improve the data use by a broad community. Development of the discipline-oriented data centers should be handled through the normal competitive proposal process. Although participants agreed that significant resources are needed for new database efforts, exact details of the level of government agency funds for the management system were not determined.

RECOMMENDATIONS

WG1_1. Create permanent, active archives for all MG&G data.

It is very important that the funding agencies maintain and strengthen their commitment to long-term data archiving. As noted at the meeting, data collected by the HMS Challenger are still being used. Permanent archives for all types of MG&G data must be established. The community must continually add to and update these permanent archives.

WG1_2. Manage data using a distributed system with a central coordinating center.

The management system should operate as close to the data generators as possible. Scientists must be actively involved in data management, placing the responsibility for and authority over the data as close as possible to where the expertise resides. Data quality control should be provided by those generating the data. Mechanisms should be developed to enable users to easily provide feedback on data quality.
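As a reading aid, the distributed-system-with-coordination-center model recommended above can be sketched in code. The class names are hypothetical; the example center assignments (IRIS for OBS data, NGDC for underway geophysics) echo examples cited in this report rather than a proposed design.

```python
class DataCenter:
    """A discipline-specific data center: holds its own data types and
    collects user feedback on data quality (which stays with the generators)."""
    def __init__(self, name, data_types):
        self.name = name
        self.data_types = set(data_types)
        self.feedback = []  # user feedback on data quality

    def add_feedback(self, note):
        self.feedback.append(note)


class CoordinationCenter:
    """Central coordinator: knows which center serves which data type,
    but holds no data itself."""
    def __init__(self):
        self.registry = {}

    def register(self, center):
        for dtype in center.data_types:
            self.registry[dtype] = center

    def locate(self, data_type):
        """Route a request to the responsible discipline-specific center."""
        return self.registry.get(data_type)


# Example registration mirroring the OBS/IRIS pattern cited in the text.
coord = CoordinationCenter()
coord.register(DataCenter("IRIS", ["obs_seismic"]))
coord.register(DataCenter("NGDC", ["gravity", "magnetics", "bathymetry"]))
```

A request for gravity data is routed to the center that registered it, and quality feedback flows back to that same center, keeping responsibility and authority close to where the expertise resides.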
A coordination center is necessary to facilitate communications among the distributed data centers, and to ensure that everyone works together. A good example of central coordination is the OBS Instrument Pool. The individual instrument centers provide quality control and write standard format data. The Incorporated Research Institutions for Seismology (IRIS) then archives the data and provides community access to them.

WG1_3. Manage different data types with user-defined centers.

Examples of different data types and their management status are given below. The list is not all-inclusive.
1. Ocean bottom seismograph/hydrophone (OBS/OBH) data. Quality control is provided by the three OBS instrument centers and archival and community access is provided by IRIS.
2. Rock petrology/geochemistry data. A web-served database is being developed to provide metadata and processed results for rock samples. This effort is ready for migration to permanent support.
3. Core/rock collections. Sample curation appears to be in good shape. NGDC maintains a central catalog of the existence of physical samples and some sample metadata.
4. Ocean Drilling Program (ODP) data. ODP developed the JANUS database based on community recommendations that came out of several workshops. It appears to be in good shape. There are plans in place to transition the database from ODP to IODP in 2003.
5. Single channel and multichannel seismic data. A workshop was held in 1999 to determine the needs of the community for database management. Recommendations were made from that workshop. An interested subgroup of the MCS community needs to define the model details and submit a proposal to NSF.
6. Multibeam sonar data (bathymetry, sidescan, backscatter, LIDAR, etc.). This community needs a user/generator workshop or working group to define the problems and solutions to their database needs. There is a critical need for one-stop access quality-control and processing centers with tools to generate higher level products. There does not seem to be a quality-control process in place although the MB-System software provides tools for reading a broad suite of multibeam data.
7. Deep submergence data collected by near-bottom instruments (submarines, remotely operated vehicles (ROVs), autonomous underwater vehicles (AUVs), etc.). In principle, these data should be managed in the same way that other shipboard data are managed. A data management plan must be defined and overseen, perhaps through the Deep Submergence Science Committee (DESSC), the existing operators and user group.
8. Gravity/magnetic data. NGDC maintains archives of these data, but there are major quality-control and user-interface problems. The community concerned about these data needs to be defined. Value-added products (derived products) should be archived and made available to the broad community.
9. Sedimentology, paleontology data. Although it was noted that problems exist, there were too few representatives from these communities at the workshop to define the issues and possible solutions. NSF should encourage mini-workshops or working groups for these data.

WG1_4. Support area- or problem-specific databases if scientifically justified, but these databases should link to rather than duplicate data holdings within discipline-specific data centers.

Working Group 1 recognizes that there might be a future need to set up databases for specific oceanic regions or for specific scientific goals. Examples of area-specific databases are those for the 9°N area of the East Pacific Rise and the Juan de Fuca region. Examples of problem-specific databases are those that will develop from the MARGINS and RIDGE programs. These databases should be supported, but they should serve as links to discipline-specific databases and should not duplicate data holdings within these databases.
WG1_5. Evaluate the data management system using oversight and advisory committees, in-depth peer reviews at renewal intervals, and ad hoc panels to assess each data center's contribution to science.

The data management system should undergo regularly scheduled peer review. A new set of advisory groups representing the broad spectrum of the MG&G research community should be established. This will ensure that the recommendations regarding data sets and models will be responsive to the community's needs. Selection of data centers should be determined through competition, and a data center should not expect to be funded permanently.

WG1_6. Fund core operating costs of the distributed data centers as 3-5 year facility cooperative agreements.

This is a corollary to recommendation WG1_5 in that funds should cover a finite number of years after which each of the data centers should be evaluated for effectiveness and responsiveness to users' needs.

Figure 9. World map showing over 15 million miles of ship tracks with underway geophysical data inventoried within NGDC's Marine Trackline Geophysics Database. Bathymetry, magnetics, gravity, and seismic reflection data along these tracks from 4600 cruises were collected from 1939 to 2000. Figure provided by John Campagnoli, NGDC.
[Figure (chart residue): data storage growth, log (PB/yr) axis.]

RIDGE Multibeam Synthesis Web Site). However, delivery to these data archives is largely at the discretion of the P.I., and access to these data and many other data types is often difficult.
hold period. This lock is released when the period expires. The standard period is two years, although some circumstances may warrant an extension to be granted by the cognizant funding agency.

Auditing access to data will provide usage statistics and facilitate interdisciplinary collaboration as well as the communication of future updates, within the restrictions of privacy requirements.

The NSF Final Report could include a field to describe how the P.I. complied with NSF data-distribution policies. Noncompliance might have a negative effect on future proposals. Data publication in citable journals and in technical briefs such as USGS open-file reports should be encouraged.

WG2_7. Promote interactions among federal agencies and organizations, and international agencies to define data and metadata exchange standards and policies.

The community would benefit from the standardization of forms, such as an end-of-cruise digital data form, as well as from metadata content and exchange standards. We encourage collaboration among the federally mandated agencies (NSF, ONR, USGS, NOAA, NAVO, etc.) to review marine database standards.

International discussions should be encouraged to define exchange standards and policies. At a minimum, exchange of cruise tracks and sample locations would be a major benefit for cruise planning.
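The proprietary-hold logic described in this section (a two-year standard period, with agency-granted extensions) could be encoded along these lines. The function names and rule are an illustrative reading of the text, not an agency policy implementation.

```python
from datetime import date

STANDARD_HOLD_YEARS = 2  # standard proprietary period cited in the text


def release_date(collected, extension_years=0):
    """Date on which the proprietary lock expires: the collection date plus
    the standard two-year hold, plus any agency-granted extension."""
    total = STANDARD_HOLD_YEARS + extension_years
    try:
        return collected.replace(year=collected.year + total)
    except ValueError:  # Feb 29 collected, non-leap target year
        return collected.replace(year=collected.year + total, day=28)


def is_released(collected, today, extension_years=0):
    """True once the hold period has expired and the data are public."""
    return today >= release_date(collected, extension_years)
```

For example, data collected during the workshop week would, under the standard rule, unlock in May 2003 unless the cognizant agency granted an extension.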
Working Group 3 focused on metadata issues. The development of appropriate metadata and metadata standards for ocean floor and other types of oceanographic data is an extremely important issue. The growth in information technology has led to an explosion in the amount of information that is available to researchers in many fields. This is the case in the marine environment where a state-of-the-art visual presence (e.g., through long-term monitoring by cameras and other instruments) may result in the acquisition of data that quickly overtakes the speed at which the data can be interpreted. The paradox is that as the amount of potentially useful and important data grows, it becomes increasingly difficult to know what data exist, the exact location where the data were collected (particularly when navigating at sea with no landmarks), and how the data can be accessed. In striving to manage this ever-increasing amount of data, and to facilitate their effective and efficient use, compiling metadata becomes an urgent issue.

Although metadata are contained within some of the digital data file formats commonly used to store MG&G data (e.g., MGD77 and SEGY formats), no uniform metadata are collected during federally funded MG&G field programs. Basic information regarding cruise location, date, project P.I.s, and data types collected can be difficult to obtain, and no central and comprehensive catalog is available. Cruise reports often contain detailed information regarding general experiment configuration, data calibration, and data quality, all of which are of great importance for subsequent data analysis. In many instances, the cruise report may be the only record of this information, but no easily accessible digital archive of these reports exists.

QUESTIONS CONSIDERED

Working Group 3 addressed the following questions:
- How do we move toward metadata standards?
- How do we standardize data collection procedures?
- What is the role of the ship operating institutions in the archiving of data and generation of metadata?
- What existing software and structures should we take advantage of?
- How should we deal with real-time data acquisition?
Responsibility for each metadata level could reside with ...tion of a central metadata catalog for levels 1 and 2 metadata was viewed as the highest priority. The group consensus is that level 1 metadata should be generated during data acquisition and should be submitted to the central metadata archive immediately following a field program. Level 2 metadata should also be archived within the central metadata catalog, whereas level 3 metadata would reside with the actual data themselves. The appropriate archive for level 4 metadata may be both the central metadata catalog

[Figure (map residue): multibeam bathymetry near 176°30'W-176°40'W, 0°40'N-1°00'N, with stations AVON2-27, AVON2-28, and AVON2-29 and a depth scale in meters. NSF OCE97-30394, Institute of Geophysics & Planetary Physics, Scripps Institution of Oceanography, UCSD, USA.]
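The group's placement of each metadata level can be summarized as a small lookup table. This is only a restatement of the consensus above in code form; the level-4 entry is inferred from the (truncated) sentence above together with recommendation WG3_6, which places level 4 within the discipline-specific data centers.

```python
# Where each metadata level should live, per the Working Group 3 consensus:
# levels 1-2 in the central catalog, level 3 with the data, level 4 possibly both.
ARCHIVE_FOR_LEVEL = {
    1: ["central metadata catalog"],
    2: ["central metadata catalog"],
    3: ["discipline-specific data center"],  # resides with the actual data
    4: ["central metadata catalog", "discipline-specific data center"],
}


def archives_for(level):
    """Return the archive(s) recommended for a given metadata level."""
    return ARCHIVE_FOR_LEVEL[level]
```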
RECOMMENDATIONS
WG3_1. Create a centralized and searchable on-line metadata catalog.

The metadata catalog should be broad, containing information on as many data types as possible. It should support geospatial, temporal, keyword, and expert-level searches of each data type. By definition, metadata are information about data that can evolve. The catalog should be a circular system that allows feedback from the user/originator. The metadata catalog should serve as the central link to the distributed network of data centers where the actual data reside.

Selection of an organization to develop and maintain this metadata catalog should be through a competitive process. The organization will oversee the development of metadata entry tools for easy entry into the metadata catalog. A high performance storage system to archive and serve the catalog to the community should also be implemented.

WG3_2. Require ship operators and P.I.s to submit level 1 metadata and cruise navigation to the centralized metadata catalog at the end of each cruise as part of the cruise reporting process.

This function should be provided by the technical support staff aboard UNOLS vessels, although the ultimate responsibility for generating and delivering these data should lie with the project P.I. Tools need to be developed to facilitate this task, simplifying the process with a smart web form. Standard forms should be used on all UNOLS vessels and for all kinds of data-collection activities (chemical, physical, biological, and geological studies). Level 1 metadata along with cruise navigation should be submitted.

UNOLS may be an appropriate organization to manage the metadata submission process (and possibly the catalog), perhaps through modification of the UNOLS electronic ship request form. Metadata need to be defined, but should include items such as the chief scientist(s), project P.I.(s), institution(s), data types collected, dates of field program, geographic coordinates of the field area, ship name, and cruise leg ID (if appropriate). Metadata standardization is very important. Metadata and data need to be handled separately for maximum efficiency.

WG3_3. Generate a standard digital cruise report form and make it available to all chief scientists for cruise reporting (level 2 metadata).

These digital forms should be uniform across all federal agencies for all future cruises and should be submitted to the centralized metadata catalog.

Old cruise reports should be digitized, perhaps from the NOAA National Oceanographic Data Center (NODC) archive, as a parallel effort. Standard reporting should include essential fields described above as well as specific details for each data type (e.g., data ranges for each data type, acquisition quality control records, number and location of sample stations). The responsible individual and physical location where each data type will reside following a cruise should be identified.

WG3_4. Require individual P.I.s to complete and submit standard forms for level 1 and 2 metadata for field programs carried out aboard non-UNOLS vessels (e.g., foreign, commercial, other academic platforms).

Not all field programs carried out by MG&G researchers involve UNOLS vessels, and procedures need to be developed that permit the cataloging of data collected during these programs as well.

WG3_5. Generate a standardized suite of level 1 and 2 metadata during operation of seafloor observatories as well as other national facilities (e.g., the Deep Submergence Laboratory, OBS Instrument Pool) and submit to the central metadata catalog.

The metadata required should parallel that acquired from UNOLS operations with additional fields as relevant. Navigation from submersibles, ROVs, and AUVs needs to be captured and archived along with support-ship navigation.
WG3_6. Require level 3 metadata within each dis-
cipline-specific data center.
Required metadata for a specific data type will
likely vary and will be decided through development
of individual data centers. These metadata include, for
example, descriptions of data formats, retrieval infor-
mation, data quality, and processing procedures. Ar-
chiving of publications related to the data should also
be included (level 4 metadata).
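A toy sketch of the geospatial, temporal, and keyword searches that WG3_1 asks the catalog to support. The record fields and cruise IDs below are hypothetical, and a real catalog would query a storage system rather than an in-memory list.

```python
def search(catalog, bbox=None, dates=None, keyword=None):
    """Filter level-1 records by bounding box (lon_min, lon_max, lat_min,
    lat_max), date range (start, end) as "YYYY-MM" strings, and a keyword
    matched against the listed data types. Returns matching cruise IDs."""
    hits = []
    for rec in catalog:
        if bbox:
            lon_min, lon_max, lat_min, lat_max = bbox
            if not (lon_min <= rec["lon"] <= lon_max
                    and lat_min <= rec["lat"] <= lat_max):
                continue
        if dates:
            start, end = dates
            if not (start <= rec["date"] <= end):  # lexicographic "YYYY-MM"
                continue
        if keyword and keyword not in rec["data_types"]:
            continue
        hits.append(rec["cruise_id"])
    return hits


# Hypothetical level-1 records for illustration.
catalog = [
    {"cruise_id": "EW0101", "lon": -104.3, "lat": 9.8, "date": "2001-02",
     "data_types": ["multibeam", "magnetics"]},
    {"cruise_id": "KN162", "lon": -45.0, "lat": 28.0, "date": "2000-07",
     "data_types": ["gravity"]},
]
```

A geospatial query such as `search(catalog, bbox=(-110, -100, 5, 15))` narrows the catalog to cruises in the 9°N East Pacific Rise area; temporal and keyword filters compose with it in the same call.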
APPENDIX 3:
WORKSHOP EVALUATION
The following workshop evaluation consists of answers to two questions and a list of additional comments made by the participants. The evaluation was collected from the participants at the end of the workshop. Following the comments by the participants, pie diagrams of the session evaluation data are presented. There were 35 forms submitted (~45% of the participants). Not all participants answered each question.

QUESTION 1: Was there adequate time for each activity?

Yes - 28   No - 5

If you were of the opinion there was inadequate time, please explain.

- The working groups required much more time to discuss their issues. Also, the size (too large) prohibited focused discussions.
- More time for working groups.
- Too many issues that are unfamiliar to the majority of participants were brought up and a final recommendation is premature. More meetings with focus groups seem required.
- The size of the meeting was too large: too many people. The time required is proportional to the square of the number of attendees (2x people need 4x time).
- The time for "tools for data access and analysis" was a little bit limited. It is understandable that the time allotted for commercial presentations was less than the other general presentations, but the time allotted did not allow for much interaction with the audience.

QUESTION 2: What single suggestion would you make to improve this workshop?

- Needed an example of a working data information system on the WWW (such as a land use system).
- Follow up with another one in a year or so.
- Would have been nice to have more input from funding agencies.
- Discussion was dominated by data providers. Clear visions of what the long-term goals "should" be often got lost. Long-term goals should have more user input, including the general nonscientific community.
- Mandates to working groups were somewhat vague and overlapping. Need to be more focused and carefully thought out.
- Present proposals prior to the workshop, maybe developed by very small groups.
- Reconvene at least once within 12-18 mo. after the proceedings and recommendations have been disseminated and reviewed by the NSF management and community.
- Ask NSF PMs to talk about NSF commitment to workshop objectives at the end of the workshop.
- Better fit of room to audience size.
- None.
- None.
- A room that would make it easier to see the presentations. However, the surroundings were pleasant and the location at the hotel was convenient.
- Provide a summary of existing workshop recommendations on database management in other fields.
- More UNOLS participation (especially since we generated "unfunded mandates").
- Invite international attendees like the French.
- Organize it so that it focuses on more specific recommendations, less general (?).
- A more focused group of experts from both the scientific and computer science communities should be gathered to improve progress, where domain experts in database management and science plan a detailed proposal to NSF.
- Small item, but a short description of agenda items would be useful.
- Handout of overheads/presentation slides.
- None.
- The sessions probably should have had a mandate to develop some themes or recommendations, and the session leaders could have been given the mandate to develop some consensus or themes as part of the session. These "results" could have been fed into the working groups to make them more productive.
- None.
- Abstracts and titles of talks available before the meeting.
- There should have been "read-ahead" material to inform participants about other database discussions and workshops that have already taken place under NSF sponsorship.
- I thought having the people from "outside" the MG&G community (esp., Cushing, Gaudet, Brovey) was a good idea. Perhaps a bit more input from the oil industry would have been good; they collect very similar data and face similar problems: serving up data, what media to store data on.
- More IT (Information Technology) specifics.
- It was a good balance of researchers and workers associated with DB systems. Job well done.
- Fewer, more select audience/participants, at the risk of compromised broadness, to achieve a higher degree of focus.
- More pre-meeting planning and distribution of material.
- Internet access at the meeting! Posters at the meeting!
- More info provided prior to the meeting.
- More focus on who is going to make this happen, and how.

ADDITIONAL COMMENTS

- Room should be laid out broad and shallow instead of long and deep.
- The follow-up workshop should emphasize more focused groups of users and providers by discipline on data type (e.g., MB, MCS, UW Video, etc.).
- Very well run workshop.
- My background is in C.S. and I enjoyed this conference very much!!
- Very informative; learned a lot of what is done and available.
- I hope that CARIS would be invited back.
- Thanks for the big effort to organize it.
- The meeting was exceptionally well choreographed, with no problems with the timelines.
- This was a useful fact-finding workshop, but the details of how the future will be mapped are not clear at the end of this session.
- Very useful workshop. Good to see consensus building throughout. More productive than many workshops.
- I think the workshop was very successful in gathering the experience and articulating the needs and concerns of the MG&G community. The key will be to craft recommendations that will lead to coherent actions.
- I thought Gaudet's talk on the data he worked with (the amount and flow) was good, as it put the amount of data our group is discussing into perspective. It gives me a sense that we should be able to organize the data that we have.
- An important and refreshing opportunity to rethink and reconsider NGDC/MGG's role and responsibilities to the community.
- The data catalog vs. database distinction is important. I would like to have seen more examples of working solutions such as the one that Peter Knoop presented.
[Pie diagrams of session evaluation ratings (Very valuable, Valuable, Average value, Limited value, Very little value) for each session: Monday Afternoon: Tools for Data Access and Analysis; Reception at IGPP; Poster Session/Demonstrations; Tuesday Morning: Organizations With Centralized Databases; Tuesday Morning: Database Components; Tuesday Afternoon: Working Groups; Tour of SDSC; Wednesday Morning: Summaries of Working Groups; Wednesday Morning: Wrap Up.]
September 2001