
Version 3.8.0 Overview
Prepared by: Alison Atkins, Senior Geological Data Analyst, Metech Pty Ltd
November 2003

Continuing in the Metech tradition, every new release of acQuire incorporates the most up-to-date technological advances, increasing the functionality of the software and ease of use for our clients.

Once again, not only have new functions been added to the software, but improvements and enhancements have also been made to the pre-existing tools. New products have been added to the existing suite of acQuire tools, including a coordinate transformation system and data tracking. Major enhancements have been made to both the existing Data Entry and Import objects. For Import objects this includes dynamic source file header recognition and QAQC validation for assay imports. For Data Entry objects a new design interface is available, plus new entry modes, graphical and grid displays and a variety of options for entering data, either Online (existing entry objects) or Offline (via PocketPC or Remote PC packaging). Extra functionality has also been added to Export, Briefcase, Form and SQL objects. New functions for expressions are also available within each of the acQuire objects. Modifications have also been made to the acQuire Data Model (ADM3500) and the acQuire meta-system.

Listed below is a brief outline of what you can expect to see with the release of V3.8.0 of
acQuire.

New Functionality
Coordinate Transformation
The coordinate management system stores metadata associated with grids and grid transformations, and stores multiple grids in the database. Each grid is defined in terms of its datum, projection and, if appropriate, its relationship to other grids. This means that the grids are totally spatially defined. Coordinates for both drillhole and point data can be transformed.
The advantages of the new acQuire coordinate management system mean that there is no proliferation of coordinate and azimuth data stored in the database. Data can be stored in one grid and transformed to a selected grid when required. All the user needs to do is store one set of coordinates in the database, with the many grid relationships defined. Transformations between geographic, projected and local grids can be achieved simply and the results are consistent. This means that any grid can be transformed to:
• One on the same level (e.g. local to local grid transformation)
• One on a higher level (e.g. local to projected grid transformation)


• From the lowest to the highest level (e.g. local to geographic grid transformation),
via a projected grid. This method is not recommended as errors can result because
you are transforming from a flat surface to a sphere.
The fundamental objective of the acQuire coordinate management system is to provide:
• A solution for storing all the history and project evolution associated with
coordinate grid systems.
• The computational methodology whereby transformations can be applied by end users without having to understand the fundamentals of geodesy.
• A dynamic transformation delivery system for 3rd-party application systems and for the acQuire export tool.
• A normalized relational structure that is very flexible for querying and reporting.
There are two main areas of coordinate management, which are defined by who performs
the transformation task. There are tasks for the:
• acQuire manager on site – define the coordinate grid/s, create coordinate sets of
virtual fields and create form definitions that include those fields.
• General users – use the grids, transformations and coordinate virtual fields that
have been established to:
o Export data. Use the fields to filter and transform coordinates for both
drillhole and point samples. A form which includes coordinate sets of fields
can also be exported to a file.
o Filter. The coordinate sets of virtual fields can be used in numerous places in acQuire: to filter a form, to create a workspace filter, in the Calculated Fields tool and more.
The definition of the grids and the creation of the coordinate virtual fields is done via new wizards available from the acQuire main menu. The new wizards can be accessed from Manage\Coordinate System. The first thing the manager must do is set up the relationship between the various grids stored in the database, using the Setup Grids… option.
Each grid that has been created in the database can be defined in the grid setup. Each
grid can be defined as either Local, Projected or Geographic. Depending upon the grid type
the Parent grid, Datum, Local datum transform, Projection and Grid azimuth parameters are
entered. If a local grid is being defined, the “Local Grid – Parent Grid Relationship” needs
to be defined. There are a number of Transformation types: 1, 2, 3, 4 or Multi-point relationships. All dropdown lists are stored in new reference tables in the acQuire Data Model (ADM), with the reference data and transformation engine supplied by Geosoft.
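As a rough illustration of how a grid hierarchy like this can resolve a transformation route, the sketch below walks a hypothetical parent-grid table to find the path between two grids via their common ancestor. The grid names, table structure and helper functions are invented for illustration; the actual transformation mathematics is supplied by the Geosoft engine.

```python
# Hypothetical grid setup: each grid names its parent (None = top-level grid).
GRIDS = {
    "WGS84": None,            # geographic
    "UTM50S": "WGS84",        # projected, defined on the WGS84 datum
    "MineLocal": "UTM50S",    # local grid tied to the projected grid
    "PitLocal": "UTM50S",     # another local grid on the same parent
}

def path_to_root(grid):
    """Return the chain of grids from `grid` up to the top-level grid."""
    chain = [grid]
    while GRIDS[chain[-1]] is not None:
        chain.append(GRIDS[chain[-1]])
    return chain

def transform_path(src, dst):
    """Resolve the sequence of grids to pass through when transforming
    coordinates from `src` to `dst` (via the lowest common ancestor)."""
    up = path_to_root(src)
    down = path_to_root(dst)
    common = next(g for g in up if g in down)
    return up[:up.index(common)] + [common] + list(reversed(down[:down.index(common)]))

# A local-to-local transformation routes through the shared projected grid:
print(transform_path("MineLocal", "PitLocal"))  # ['MineLocal', 'UTM50S', 'PitLocal']
```

This mirrors the same-level and higher-level cases in the list above: two local grids meet at their projected parent, while a local-to-geographic transformation passes through the projected grid.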


Once the grid setup has been defined, coordinate virtual fields can be created. This is accessed from the Manage\Coordinate System\Setup Coordinate Set Virtual Fields… menu option. Once virtual field coordinate sets have been defined and included in form definitions, existing coordinates can be transformed into these destination fields.


As previously mentioned, the transformation engine can be used to transform coordinates dynamically on export, or to store transformed coordinates permanently in the database in the coordinate virtual field sets.
To transform coordinates to permanent fields in the database, use the Tools\Transform Coordinates… menu option. This will display the Coordinate Transformation dialogue box, which is similar to the dialogue displayed within the Drillhole and Point Sample exporters (in the Geographic tab). A form definition is selected, any filters applied, the source grid coordinates and destination grid selected, plus the destination virtual field coordinate set. The user also has the option of transforming survey azimuths.

Data Tracking
Data tracking incorporates two actions: tracking, which monitors changes to data in your database, and locking, which assigns read-only or read-write access to HoleID/ProjectCode combinations.
Data tracking can be installed on your database at two different levels:
• Object tracking. Monitors operations or changes to the object-dependent tables – acQuire tables in some of the compound definitions for a specific HoleID/ProjectCode combination (object). For example, operations to the Sample or the GeoInterval tables. If changes are made in any table referencing an object, the object is marked as having changed. Both the time/date and who made the change are recorded. This can be useful to determine whether a drillhole or point sample campaign needs to be re-exported to a data client, or whether a calculation needs to be performed again.
Once installed, object tracking fields are available via the Collar compound
definitions and can be added to a Collar form or workspace filter definitions for use
in filtering.
Object tracking also controls the locking of an object; when locked, an object is rendered read-only. This means that a user cannot enter or modify any data associated with the object. Each object is assigned to be either EDIT or READ, which is stored in the QLOH_ObjectStatus table.
• Record tracking. Monitors any update or deletion changes to every table in the
acQuire database (all tables listed in the METATABLE table). Record tracking stores
the data before it has been edited or deleted in a duplicate/mirror acQuire table
with a QLR prefix. These QLR tables also record the date and user that made the
changes to the individual records. This means that data can be retrieved from the
record tracking tables in the database.
Record tracking uses two types of tables to record changes to the data in the
database:
o A single table called QLR_Action. This table is also created when object
tracking is installed without record tracking. The action table monitors
changes to records in the database. The following are recorded in this table;
whether an Insert (I), Update (U) or Delete (D) transaction occurred, the
database user name, network/OS logon name, client machine IP name, the
application in which the change was made, date-timestamp of the
transaction made, session ID of the transaction and whether the transaction
was successful.


o Multiple tables with the name QLR_<TableName>. As mentioned, a table is created for each table in the database.
It is the acQuire manager’s role to install and maintain data tracking – both object and
record. Each tracking component can be installed either together or individually. Form
definitions can be made to include the tracking fields, which can be used in filtering data
in the database.
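As a hedged illustration of the kind of audit row the QLR_Action table records per transaction, the sketch below builds one such record. The helper function is invented; the field names are assumptions based on the description above (action flag, database user, OS logon, machine, application, timestamp, session and success flag).

```python
import datetime

def make_action_record(action, table, db_user, os_user, machine, app, session_id, ok=True):
    """Build one audit row: an I/U/D flag plus who, where and when.
    Hypothetical stand-in for a QLR_Action row, not acQuire code."""
    assert action in ("I", "U", "D")  # Insert, Update or Delete
    return {
        "ACTION": action,
        "TABLENAME": table,
        "DBUSER": db_user,
        "OSUSER": os_user,
        "MACHINE": machine,
        "APPLICATION": app,
        "ACTIONDATE": datetime.datetime.now(),  # date-timestamp of the transaction
        "SESSIONID": session_id,
        "SUCCESS": ok,
    }

# One record per transaction, e.g. an update to the Sample table:
rec = make_action_record("U", "Sample", "geo1", "jsmith", "SITE-PC01", "acQuire", 42)
print(rec["ACTION"], rec["TABLENAME"])
```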
The general user of acQuire will only know that tracking is installed if:
• There is an attempt to edit a record that is locked.
• Objects, such as export and form objects, are created based upon a Collar form definition that has the fields from the QLOH_ObjectStatus table.
• Filters are applied using the fields from the QLOH_ObjectStatus table. For example, you could export only the data that has changed in the last 14 days.
• A workspace filter is defined in terms of the fields in the QLOH_ObjectStatus table.
Tracking can be enabled/disabled by using Manage\Data Tracking from the main acQuire menu. The tracking needs to be installed in the database, as numerous tables and stored procedures are required when running the data tracking functionality. Tracking can only be installed on Oracle 8.1.6 and above, and SQL Server 7 and above databases.

When data tracking is installed, a stored procedure is saved into the database to 'clean out' or remove records from the tracking tables that get too large. This procedure, QSP_QLR_LOGDELETE, will remove records from a table or tables according to some action date. The action date is recorded in the field ACTIONDATE_QLR.

When records are removed from the tables, a summary report of the process is stored in the QLR_LOGDELETEMSG table. This table stores the following data: the name of the table that has been shrunk, the number of records removed, the age (in days) of the records removed and the date the records were removed.
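The cleanup behaviour described above can be sketched as follows. This is an in-memory stand-in for the QSP_QLR_LOGDELETE stored procedure, not its actual implementation: rows older than a cutoff are removed and a summary row in the shape of QLR_LOGDELETEMSG is produced. The table name in the summary is hypothetical.

```python
import datetime

def log_delete(table_rows, max_age_days, today):
    """Remove rows whose ACTIONDATE_QLR is older than max_age_days.
    Returns (kept_rows, summary), where summary mirrors a QLR_LOGDELETEMSG row."""
    cutoff = today - datetime.timedelta(days=max_age_days)
    kept = [r for r in table_rows if r["ACTIONDATE_QLR"] >= cutoff]
    summary = {
        "TABLENAME": "QLR_Sample",  # hypothetical tracking table name
        "RECORDSREMOVED": len(table_rows) - len(kept),
        "AGEDAYS": max_age_days,
        "DELETEDATE": today,
    }
    return kept, summary

today = datetime.date(2003, 11, 1)
rows = [
    {"ACTIONDATE_QLR": datetime.date(2003, 1, 15)},   # old: will be removed
    {"ACTIONDATE_QLR": datetime.date(2003, 10, 20)},  # recent: kept
]
kept, summary = log_delete(rows, 90, today)
print(summary["RECORDSREMOVED"])  # 1
```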

Enhanced Functionality
Import Objects
Major enhancements have been made to the existing acQuire import object. New
functionality for importers includes dynamic header definition and QAQC assay import
validation.


Dynamic Source Field Function
This function, particularly useful for assay import objects, allows variations within a source file to be loaded into the database using the one import object, instead of using multiple templates for the various assay column permutations. The actual fields in an assay file can change from time to time, with the sequence or the names of the fields changing depending on the elements analysed and the number of repeat analyses. It is important to ensure that fields in a laboratory file are correctly matched to the destination fields in the database. The new dynamic source fields function ensures a correct match of source-destination fields. This function dynamically reads the source file field names and generates destination field names via a defined template expression.
With dynamic source field selected, the Data section tab of the Import Definition dialogue
can accommodate variations in the name and number of fields in the source data file –
variations from source file to source file. Instead of specifically defining the source fields
when creating the import object, the field names in the source file are dynamically
associated with destination fields.
After definition, the dynamic source field function operates whenever a preview or
execution is performed. In this way, the sheet and the object are updated with the latest
information accessed from the source file every time.
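The matching step can be sketched as below: read the header line of an assay file, pass the static fields through unchanged, and derive destination names for the remaining columns from a template. The template syntax and field names here are invented for illustration and are not acQuire's own expression language.

```python
def match_fields(header_line, static_fields, template="{name}_ppm"):
    """Split a comma-delimited header, keep static fields as-is, and derive
    destination names for the remaining (assay) columns from a template."""
    source_fields = [f.strip() for f in header_line.split(",")]
    mapping = {}
    for name in source_fields:
        if name in static_fields:
            mapping[name] = name                        # e.g. SampleID stays fixed
        else:
            mapping[name] = template.format(name=name)  # e.g. Au -> Au_ppm
    return mapping

# A header whose assay columns vary from file to file:
header = "SampleID,Au,Ag,Cu"
print(match_fields(header, {"SampleID"}))
# {'SampleID': 'SampleID', 'Au': 'Au_ppm', 'Ag': 'Ag_ppm', 'Cu': 'Cu_ppm'}
```

Because the mapping is rebuilt from the header on every preview or execute, a file with a different set of elements simply produces a different mapping, which is the behaviour described above.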
All the user needs to do is define which fields in the source file are static, that is, the fields that will not change position in the source files. Define which line the field names reside on, plus the position of the fields (this may change in the file). Source fields can also be dynamically generated via an expression, plus the default data type defined.
Within each import grid sheet, destination database fields can also be dynamically generated from meta information extracted from the assay source file. This dynamic field template can be accessed from either the main acQuire menu, Import\Dynamic Field Template…, or by right-clicking within the importer sheets and selecting the same option.


Once the template has been defined for both the source fields and destination fields, and
all relevant expressions written for each import sheet, the import object can be previewed.
On preview (and on execute), if the source file is not the same as the file used previously
with the importer, the Dynamic Destination Field Report is displayed. This report will
display for the user the following:
• Fields that can be matched to the destination fields and remain unchanged since the previous time the import was used (black fields).
• Destination fields from a previous import not matched to source fields (grey fields).
• Fields that cannot be matched to fields in the database (red fields).
• New fields matched within the importer (bold fields).
Invalid fields are easily distinguishable to the user and, prior to loading the data into the database, the user can check the source file and resolve any irregularities.
Meta data can also be pivoted from columnar data in the source file to individual records
that can then be loaded into the database. This option is also available within the Import
Definition\Data Section tab.
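The pivot described above can be sketched as turning wide, columnar rows into one record per column value. The row and field names are hypothetical; this is only an illustration of the reshaping, not acQuire's implementation.

```python
def pivot_rows(rows, key_field):
    """Turn wide rows like {'SampleID': 'S1', 'Au': 1.2, 'Ag': 3.0} into
    long records of (key, field, value) suitable for loading row by row."""
    records = []
    for row in rows:
        for field, value in row.items():
            if field != key_field:
                records.append({key_field: row[key_field],
                                "Element": field,
                                "Value": value})
    return records

wide = [{"SampleID": "S1", "Au": 1.2, "Ag": 3.0}]
long_rows = pivot_rows(wide, "SampleID")
print(len(long_rows))  # 2
```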
QAQC Validation
The QAQC tool is new functionality for assay import objects to test the repeatability of the laboratory results for field duplicates and laboratory checks, and to review the spread of results for standards. The importer presents this by means of charts and a summary sheet. The QAQC tool is only activated when an object is executed (prior to the actual loading of the lab data).
When an object is first executed, the source file is accessed. The analytical data in the file is reviewed and a statistical and graphical report is presented. On the basis of this report, the user can decide whether to proceed and execute the import (load data into the database) or cancel the import. If the user decides to import the data, irrespective of whether the assay batch file passes QAQC validation, functions are available to record the report information.
Two extra virtual fields need to be created within the DespatchReturns subsystem: QAQC_Import and Import_User. Functions can be called to write to each of these virtual fields:
• QAQC_Import – Expression = QAQCRESULTS(), where the values entered into the field are "Accepted with warning" or "Accepted".
• Import_User – Expression = PCUSER(), the machine login name written into the database field.
Parameters are defined for the QAQC report by the user at setup time, and are also accessed from reference tables in the database, to be used in determining whether a batch passed validation.
The tool is enabled by selecting the Setup QAQC… option from the Import option on the acQuire main menu. The QAQC dialogue box prompts the user to define which of the user-defined import sheets match the information required by the QAQC tool. Sheets required are the despatch return, primary assay results, company duplicate and company standards sheets, and also the laboratory internal check sheet.
Once sheets have been defined, validation acceptance limits can be entered into the option boxes. Number or percentage values can be defined for acceptable differences, and the maximum acceptable difference defined before an import is no longer valid for importation. Thresholds, which are defined in the database, can also be extracted and displayed within the QAQC plots.
• Standards – results for standards are displayed relative to:
o One standard deviation either side of the standard value (again defined in the database)
o The standard value
o The acceptable min and max for a given standard (again defined in the database).


• Duplicates/Splits and Laboratory Checks – for each element analysed, a linear, a log and an MPD (mean paired difference) graph are presented. The log and linear graphs show:
o The equal line, where the primary (original sample value) and the secondary (duplicate value) samples are equal.
o Ten percent either side of the equal line.
o The acceptable difference %, which is defined at setup.
Samples that fall outside the acceptable difference range have failed the test and, if the number of failures exceeds a user-defined number, a warning is given.

The QAQC report produced is as displayed below. [Figure: QAQC report]
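The acceptance logic described above can be sketched as follows: duplicate pairs are compared against an acceptable difference percentage, and standards against one standard deviation either side of the expected value. The thresholds and function names here are illustrative only, not the tool's actual parameters.

```python
def duplicate_failures(pairs, acceptable_pct):
    """Count duplicate pairs whose relative difference exceeds the limit
    (percentage difference relative to the mean of the pair)."""
    failures = 0
    for primary, duplicate in pairs:
        mean = (primary + duplicate) / 2.0
        if mean and abs(primary - duplicate) / mean * 100.0 > acceptable_pct:
            failures += 1
    return failures

def standard_ok(result, expected, std_dev):
    """A standard result passes if it lies within one SD of the expected value."""
    return abs(result - expected) <= std_dev

pairs = [(1.00, 1.02), (2.00, 2.60)]     # second pair differs by ~26%
print(duplicate_failures(pairs, 10.0))   # 1
print(standard_ok(5.1, 5.0, 0.2))        # True
```

A warning would then be raised when the failure count exceeds the user-defined maximum, as described above.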


Other Modifications
• A comma delimited reject file is generated for records that are rejected by imports
using an ODBC source file format. Previously a reject file was not created.
• Micromine source fields can be defined in the Import Definition\Data section tab.
• Lines in the source file, when displayed within the Import Definition, are now numbered in the display box.
• In design mode, if the source file is passed from the control sheet, it is possible to view the source in Import Definition.

Data Entry Objects
Significant enhancements have been made to the data collection process within acQuire Data Entry objects. The Data Entry enhancements allow the object to be used on a desktop PC directly linked to the database, exported to a mobile device (such as a PocketPC) or packaged and emailed to a remote PC such as a Tablet PC or laptop.
The enhanced Data Entry system is designed to integrate the data collection and management from a wide range of processes, including: mine grade control data capture; exploration point sampling; diamond core and RC/RAB logging; and laboratory despatch data.
Entry Modes
Data Entry and Pocket acQuire objects have now been combined into a single object and
are classed either as online (connected to the main DB server) or offline (the new Data
Entry Package or Pocket acQuire). Once an object has been created the user can alternate
between the two entry modes. The creation of the new objects is wizard based, so
recalling the wizard interface allows the user to make any modifications.
The new acQuire Remote PC package is built containing the data entry object, a workspace and an Access database. This is a self-contained unit and can be sent to a site that has a version of acQuire although not necessarily a database installed. Data is entered into the Access database using the object on the remote machine. When the data entry program is finished, a package is created and sent back to the main database. At the main database, the data from the package can be imported or compared against the contents of another package to verify the quality of the data.

[Figure: Creating a package]

[Figure: Importing remote packages]
Data Entry Creation
The acQuire Data Entry wizard prompts the user to select the entry mode desired for the specific object. From here the user defines the sheet relationships, the target (offline, online), the individual entry sheets, the entry sheet organisation, the form definition to be used, the name for the individual sheets, the data entry mode (Insert/Update), a data display filter selection and the actual fields required.
• Sheet Relationships – there are three options available for the type of data entry object defined:
o Parent/Child based on Collar. The Collar parent sheet filters all child sheets.
o Parent/Child. The parent sheet filters all child sheets.
o No Restriction. Each sheet is independent.

• Data Entry Target – whether the object is for mobile (offline) or database (online) entry usage.
• Entry Sheet Organisation – this dialogue box is used to define individual sheets created within the entry object. Each sheet can be defined as a parent sheet, child sheet or as a control sheet panel. There are also three different display types available for an entry object: grid, panel or graphical displays.
o Panel mode – the user enters data into a panel, the same as the old data entry objects.
o Grid mode – data entry into a grid, similar to a spreadsheet view.


o Graphical mode – can be used for objects defined using a Parent/Child relationship based on a Collar which has sheets in the object with interval information. A graphical sheet summarises the interval sheets in log form. Interval/depth markers can be created or edited directly in the log. The data can be displayed as a pattern or colour of the logged attribute. Graphical sheets are available for both offline and online data entry objects.

• Form definition selection and object sheet name defined.
• Entry Mode – there are now three entry modes available for individual data entry
sheets;
o Insert/Update – retrieves records from the database according to a user
defined filter. New records are inserted and existing database records
updated. This is a new entry mode to v3.8.0.
o Insert – able to insert new records and update newly inserted records. There
is no access to records in the database other than to those entered in the
current entry phase.
o Update – retrieves records from the database according to a user defined
filter. Updates can be made to these records.
• Filter options – filters can be applied to entry objects and can be applied to a
parent sheet and all associated child sheets. Record sets can also be fetched
whenever the entry object is opened. The data extracted via these filters are
displayed in a Navigator for an object when entering data into the database.


• Field selection – fields can be selected based on the form definition selected for a
specific entry sheet. Each field can be edited; a caption, control and data type
applied and whether the field should be visible, required or read-only.
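The three entry modes above can be sketched against a simple in-memory record store, showing which records each mode can see and edit. This is illustrative only, modelled on the descriptions above, not acQuire's own logic.

```python
def visible_records(db_records, entered_records, mode):
    """Return the records an entry sheet can see/edit in each mode.
    db_records: existing filtered database records; entered_records: records
    inserted during the current entry phase."""
    if mode == "Insert/Update":
        return db_records + entered_records   # existing records plus new ones
    if mode == "Insert":
        return list(entered_records)          # only this session's insertions
    if mode == "Update":
        return list(db_records)               # only existing filtered records
    raise ValueError("unknown entry mode: %s" % mode)

db = [{"HoleID": "DH001"}]   # record already in the database
new = [{"HoleID": "DH002"}]  # record entered this session
print(len(visible_records(db, new, "Insert/Update")))  # 2
print(len(visible_records(db, new, "Insert")))         # 1
```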

Sheet Formatting
Once each sheet has been defined for a Data Entry object using the supplied design wizard, the user can test the object without physically adding any data to the database. This is a new feature in v3.8.0. The user can toggle between Design, Test and Run modes. Test mode is available for online and offline entry objects.
The actual interface for designing an entry object has also changed with the latest acQuire version release. The design grid and form view have been combined, with a modeless property window to assist in defining the properties of the controls on a panel. Each field, when selected in the panel view, has its parameters displayed in the grid adjoining the view.


For each field, up to five value validation calculations can be created. Expressions can
also be set to only execute when initial field values are null.

The object displayed to the left shows the new navigator bar, which shows all HoleID/ProjectCode combinations that comply with the filter applied to the entry object. All data on subsequent sheets is auto-filtered to the HoleID/ProjectCode selected by the user from the Navigator Bar. This object also displays a panel mode entry screen.


Advanced Expressions
Data entry expressions are now more sophisticated and comprehensive than previously
available. It is now possible to define the visible, edit, required, caption or error states of
controls based on user input into related entry fields. These are all available as functions
grouped by ‘Field Properties’.
In the following example, a user can toggle different captions for an entry screen, dependent upon a parameter entered into another field. When the language option is checked, the field labels are displayed in Spanish, whilst when unchecked, the labels are displayed in English.
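The Spanish/English caption example can be sketched as a simple function of another field's value, much like a Caption field-property expression. The function and field names are hypothetical, not acQuire expression syntax.

```python
def caption_for(field, spanish_enabled):
    """Return the label to display for a control, driven by a
    checkbox-style field elsewhere on the entry screen."""
    labels = {
        # field name: (English caption, Spanish caption)
        "Depth": ("Depth", "Profundidad"),
        "Sample": ("Sample", "Muestra"),
    }
    english, spanish = labels[field]
    return spanish if spanish_enabled else english

print(caption_for("Depth", True))   # Profundidad
print(caption_for("Depth", False))  # Depth
```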


[Figure: Graphical log display]

Functionality Common to Data Entry and Control Sheets
The above-mentioned design mode modifications made for data entry objects are also available for all acQuire objects that contain control sheets. That is, the panel and grid design views are displayed within the one interface. Whenever a field is selected from an entry/control sheet in design mode, the associated parameters are displayed.
Field property expressions are also available within the control sheets.
Lookup lists can be controlled by an expression or changed by user input. This allows for
hierarchical drilldown querying of data during run time.
New controls are also available within acQuire v3.8.0:
• Slider – move a slider along a defined range of integers to make a selection, e.g. percentage fields.
• Matrix – display the validation codes for the field as a matrix. Appropriate for virtual fields that have validation codes defined for them.
• Multiline edit – an edit box for comments, using wordwrap.
• List Box – displays the selection defined in the lookup calculation row (control properties) as a list rather than a drop-down combo list.
• Date-time – a date/time stamp.
• Tab – a static control that allows the user to add a set of "pages", each accessed via a tab with a label.


[Figure: Tab control with a slider bar]
[Figure: Hierarchy drill-down query functionality]

Export Objects
There are new options available in the selection process of both drillhole and point sample export objects. These determine some of the facets of the data presentation in the output file.
• Geographic tab modified to incorporate the new grid transformation functionality for transforming coordinates on export.
• For each form definition selected within the various exporter tabs, the number of decimal places, maximum field length and split increment can be defined.
o Decimal Places – defined for numeric fields


o Max Length – maximum number of characters to export for text fields
o Split Increment – define the number of characters allowed for a field before splitting. This provides a way of displaying the entire field in data client views for clients that have limited text size requirements. Split fields have field names with split extensions, e.g. Fieldname, Fieldname_Ext1, Fieldname_Ext2 etc.
• Maximum number of Geology Pages increased from 10 to 20 pages (Geology tab).
• Interval Split tab now referred to as ‘Interval Split/Combine’.
• Assay Tab modification – when a sample interval has more than one sample number
and different assay values, the assay values are exported by the priorities assigned
to the sample numbers. The highest priority value will be exported first.
• Text Client export option – new options added;
o Replace NULL numerics with: (user can define a value for numeric nulls)
o Replace NULL text with: (user can define a value for text nulls)
o Fill missing intervals. Will insert missing geology and assay intervals with
null values or with the defined replace values.
• Geosoft GDB export option – compatibility restored between 3.6.x and 3.7.x Geosoft
Export dlls.
• Command line options:
o The –SelFile option can locate either:
- a .SEL file (xxx.sel created in File\Export on the main menu)
- an export object (xxx.qdhx for example)
o New text client options available:
- CLPARReplaceNumNulls
- CLPARReplaceTextNulls
- CLPARFillMissing
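The text-field splitting behaviour described above can be sketched as breaking a long value into chunks of the split increment and naming them with _ExtN extensions. The helper name is invented; only the naming pattern (Fieldname, Fieldname_Ext1, Fieldname_Ext2, …) follows the description.

```python
def split_field(name, value, increment):
    """Split `value` into chunks of `increment` characters and name them
    Fieldname, Fieldname_Ext1, Fieldname_Ext2, ..."""
    chunks = [value[i:i + increment] for i in range(0, len(value), increment)] or [""]
    out = {name: chunks[0]}
    for n, chunk in enumerate(chunks[1:], start=1):
        out["%s_Ext%d" % (name, n)] = chunk
    return out

# A 37-character comment split at 16 characters yields three fields:
print(split_field("Comment", "weathered granite with quartz veining", 16))
```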
Briefcase Objects
A Transfer Mode has been added. Modes now available are:
• Default
• Insert
• Update
• Merge
It is recommended that Insert is used for the main data transfer if transferring to an empty database. This mode improves transfer speed and is much faster than transferring large data sets in Merge mode.


Default is the mode defined by the system for each table. Currently each table is set to
Merge.
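A rough sketch of why Insert is faster than Merge when the target is empty: Merge must check each incoming record against existing keys and update in place, while Insert simply appends. This is an illustrative model against a dict-keyed table, not the Briefcase implementation.

```python
def transfer(target, incoming, mode):
    """target: dict keyed by primary key; incoming: list of (key, row) pairs."""
    if mode == "Insert":
        for key, row in incoming:
            target[key] = row                 # blind append, no per-record lookup
    elif mode == "Merge":
        for key, row in incoming:
            if key in target:
                target[key].update(row)       # per-record existence check + update
            else:
                target[key] = dict(row)
    else:
        raise ValueError("unsupported transfer mode: %s" % mode)
    return target

empty = {}
transfer(empty, [("S1", {"Au": 1.2}), ("S2", {"Au": 0.4})], "Insert")
print(len(empty))  # 2
```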
Form Objects
Styling can now be defined for individual columns, instead of being applied to all columns
within a specific form. Each column can be independently assigned a data text colour,
background colour, a cell pattern or no styling.

New Functions in Expressions
The following functions are now available within the expression builder:
• String
o INCALPHANUM – increments the numeric component of an alphanumeric string by a defined amount to generate a new alphanumeric value.
o FINDREPLACE(str1,str2,str3) – will find the incidence of str1 in str2 and replace it with str3.
• Logic
o ISNULL – If x is a null value or an empty string, a true value is returned. For
any other value, false is returned.
o ISDATE – If x is a date value, a true value is returned. For any other value or
if a NULL, false is returned.
o ISNUMBEREX – the function determines if x is a number. This will work for
both string and number variables. The original ISNUMBER function only
works with number variables.
o ISERROR(x) – Returns 1 if x is an error, otherwise it returns 0. x is an error if
it generates an error outside the ISERROR function.
• Math
o UNITCONVERSION(x1,x2,x3) – will convert a value from one unit type to another; the conversion factors are stored in a new reference table, UnitConversion. For example, UNITCONVERSION(10, "ppm", "ppb") converts 10 ppm to 10,000 ppb.
o INC(x1,x2) – adds two numeric fields, returns x1+x2


o DIFF(x1,x2) – returns the absolute value of x1-x2
• Misc
o SPATIALINSIDEDB – this function will check if data resides inside a defined
polygon. The polygon location is defined in the database (in a Tenement
comment field). The value 1 is returned if it is TRUE.
o SPATIALINSIDEFILE – this function will check if data resides inside a defined
polygon. The polygon location is defined in an external file called by the
function. The value 1 is returned if it is TRUE.

[Figure: Warning message during import execution]

o PCUSER() – will return the machine login name.
o QAQCRESULTS() – available for QAQC on import objects. Will return a summary statement about the QAQC review before import execution.
• Field Properties – as previously mentioned within the data entry outline, new
functions are available:
o Error, Warning, Visible, Required, Caption
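To make the semantics of several of these expression functions concrete, the following is a minimal Python sketch. These are illustrative re-implementations inferred from the descriptions above, not acQuire code; in particular, the UNIT_CONVERSION dictionary stands in for the UnitConversion reference table described later in this document.

```python
# Illustrative sketch of selected V3.8.0 expression functions.
# Not acQuire APIs; behaviour inferred from the release notes.

def findreplace(str1, str2, str3):
    """FINDREPLACE: find occurrences of str1 in str2, replace with str3."""
    return str2.replace(str1, str3)

def isnull(x):
    """ISNULL: true for a null value or an empty string, else false."""
    return x is None or x == ""

def isnumberex(x):
    """ISNUMBEREX: true if x is a number; works for both string and
    numeric variables, unlike the original ISNUMBER."""
    if isinstance(x, (int, float)):
        return True
    try:
        float(x)
        return True
    except (TypeError, ValueError):
        return False

# Assumed conversion factors, standing in for the UnitConversion table.
UNIT_CONVERSION = {("ppm", "ppb"): 1000.0, ("ppb", "ppm"): 0.001}

def unitconversion(x1, x2, x3):
    """UNITCONVERSION: convert x1 from unit x2 to unit x3."""
    return x1 * UNIT_CONVERSION[(x2, x3)]

def inc(x1, x2):
    """INC: add two numeric fields, returning x1 + x2."""
    return x1 + x2

def diff(x1, x2):
    """DIFF: absolute value of x1 - x2."""
    return abs(x1 - x2)

print(unitconversion(10, "ppm", "ppb"))  # 10 ppm -> 10000.0 ppb
```

The example from the text, UNITCONVERSION(10, "ppm", "ppb"), gives 10,000 ppb under the assumed factor table.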
acQuire Data Model Changes
The following changes have been made to the ADM.
• TenementComment and TenementCommentCode have been added to support the
spatial validation subsystem. The path to a polygon file can be stored in one of the
virtual fields of TenementComment.

• Coordinate management system. A number of new tables have been added,
including: GDCoordinateSet, GDSurveyMethod, GDGridType, GDTransformationType,
GDProjection, GDDatum, GDLocalDatumTransform, GDUnits, HoleCoord,
SampleCoord, SurveyAzimuth.
• Grid table. A number of fields have been added.
• Sample table. PointGridName and PointTenementID have been added. PointGridName
allows each SampleID, PointEast, PointNorth, PointRL combination to be assigned a
grid name. These fields can only be used if the SampleID is defined for a campaign
where HoleType = Geochem.
• AssayDetection table. Threshold has been added, which is used in the QAQC import
validation.
• GeoContinuity table added. The Continuity field has also been added to
GeologyCodePrimary. If a geology virtual field is defined with a PrimaryCode that
has a Continuity flag set, the interval data that is entered into the database for this
field is validated for:
o From/To overlaps (Continuity = RESTRICTED)
o From equal to To (Continuity = DEPTHMARKER)
• CheckSample table has new fields: PCheckID, PDuplicateNo and CheckStage. These
fields support hierarchical storage of check/duplicate data. When a repeat
sample is taken of a duplicate, the parent for this sample can be recorded in
the new PCheckID and PDuplicateNo fields, so the ancestry of check samples
can be accurately recorded.
• CheckStage table. This reference table defines which part of the analysis process a
sample was taken from.
• UnitConversion table. This table defines the default conversion factors to convert
assay results of different unit types.
• PointGeoComment table. This table stores comments for a particular SampleID,
available for PointSample and SampleGeoAssay compounds.
• MetaAssayExport table. Primary keys added to fields in the table to assist in the
correct associations between numeric and descriptor assay virtual fields.
• New Compound Definitions
o PointSampleRestricted – This new compound definition is used for displaying
point sample records where at least one virtual field in the form definition is
populated. With the existing PointSample compound, all sample records were
displayed within a form irrespective of whether the virtual fields were
populated. This new compound is effective for grouping "like" virtual fields
together for samples (eg sample types) and showing samples that use only
these virtual fields.
o New Read Only Compounds – There is now a series of compound definitions
available that dynamically query the database and return the results to the
user in a read-only format. The manager can use these compounds to create
form definitions for usage in the workspace. The BestAssay compounds have
functionality similar to the existing acQuire reporting stored procedures: they
return the best ranked method results for the elements selected. There are
also compounds available for field duplicate, lab repeat and standard value
comparisons. For each compound type, the user is able to display the results
as:
▪ Text – using the best result determined from both the numeric and
descriptor virtual fields for an assay result.
▪ Numeric – uses the MetaAssayExport table to extract defined numeric
equivalents for descriptor/text fields. The results returned in the
forms are all numeric.
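The Continuity validation described for the GeoContinuity flag above (From/To overlap checking for RESTRICTED, From equal to To for DEPTHMARKER) can be sketched as follows. This is an illustrative Python sketch of the rules, not acQuire's implementation; intervals are hypothetical (from, to) depth pairs for one hole and field.

```python
# Illustrative sketch of the Continuity interval-validation rules.
# Not acQuire code; intervals are (from, to) depth pairs.

def validate_restricted(intervals):
    """RESTRICTED: no From/To overlaps between intervals."""
    ordered = sorted(intervals)
    errors = []
    for (f1, t1), (f2, t2) in zip(ordered, ordered[1:]):
        if f2 < t1:  # next interval starts before the previous ends
            errors.append(f"overlap: ({f1}, {t1}) and ({f2}, {t2})")
    return errors

def validate_depthmarker(intervals):
    """DEPTHMARKER: From must equal To (zero-length interval)."""
    return [f"not a point: ({f}, {t})" for f, t in intervals if f != t]

print(validate_restricted([(0, 5), (4, 8)]))   # one overlap reported
print(validate_depthmarker([(2, 2), (3, 4)]))  # (3, 4) is flagged
```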

(Figure: BestAssayDrillholeNum example)
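The "best ranked method result" behaviour of the BestAssay compounds can be illustrated with a small, hypothetical example. The method ranking and result rows below are invented for illustration only; acQuire's actual ranking comes from its reporting configuration, not from this code.

```python
# Hypothetical illustration of best-ranked-method selection, as
# performed by the BestAssay compounds. Rank table and rows are
# invented for the example; a lower rank number wins.

METHOD_RANK = {"FA50": 1, "AAS": 2, "XRF": 3}  # assumed ranking

results = [
    {"sample": "S1", "element": "Au", "method": "XRF", "value": 1.1},
    {"sample": "S1", "element": "Au", "method": "FA50", "value": 1.3},
    {"sample": "S1", "element": "Cu", "method": "AAS", "value": 250},
]

def best_assay(rows):
    """Keep, per (sample, element), the row whose method ranks best."""
    best = {}
    for row in rows:
        key = (row["sample"], row["element"])
        if key not in best or (
            METHOD_RANK[row["method"]] < METHOD_RANK[best[key]["method"]]
        ):
            best[key] = row
    return best

print(best_assay(results)[("S1", "Au")]["method"])  # FA50
```

For S1/Au the FA50 result (rank 1) beats the XRF result (rank 3), so only the FA50 value is returned.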
