
Numeric Modelling with the Wolf

Pass Project
For Leapfrog Geo version 5.1
© 2020 Seequent Limited (“Seequent”). All rights reserved. Unauthorised use, reproduction, or disclosure is
prohibited. Seequent assumes no responsibility for errors or omissions in this document. LEAPFROG and SEEQUENT
are trade marks owned by Seequent. All other product and company names are trade marks or registered
trade marks of their respective holders. Use of these trade marks in this document does not imply any ownership of
these trade marks or any affiliation or endorsement by the holders of these trade marks.
Session 1: Introduction to Numerical Modelling
Contents
Introducing the Project
Modelling Approach
Iterative Refinement
Choosing a Modelling Tool

Goals
For this series of sessions, we will begin with a pre-built Wolf Pass geological model. If you are new to the
project, we will take a few moments to get acquainted with the lithologies and modelling codes before
exploring the data in more depth using Leapfrog Geo's exploratory data analysis tools. Then we'll build
interpolants, demonstrate how to model numeric data as categories, create indicator interpolants and build
block models.
By the end of these sessions you will know how to:
l Download a local copy of the Wolf Pass project from Central
l Evaluate a model back on to your drillhole table
l Explore relationships within your data using statistics and graphs
l Build an RBF interpolant
l Model numeric data as categories
l Build an Indicator RBF Interpolant
l Create block models
l Import and export block models
The data for this session can be found on the Central Training server for your region or it will be provided by
your instructor. Your instructor will lead you through the steps to enable Central integration with Leapfrog
Geo, add the server and download a local copy of the project.

Introducing the Project


The Wolf Pass project that we will be using for these sessions comprises a drillhole database with multi-
element assays, lithology codes grouped as below, and basic and refined geological models complete with
output volumes. These sessions will primarily focus on the gold assays, leaving the remaining codes available
for modelling practice later. There are 15 different lithology codes that have been grouped into five major codes:
l Recent: SAPR, COLLV and ASH
l Dacite: DA
l Early Diorite: E1, E2, E3, EBX1 and EBX2
l Intermineral Diorite: I1, I2 and IBX
l Basement: H and SBX
The leftover code (SGNCRLSS) represents significant core loss, so is left ungrouped for this model.
The Intermineral Diorite intrusion is the oldest intrusion, which was emplaced into the schist basement and
contains some gold and copper. This was followed by the Early Diorite intrusion, which contains the highest
gold and copper grades. Then, the barren Dacite dykes cut through all three existing lithologies. Weathering
and a nearby volcanic eruption formed the Recent layer, which is the youngest lithology shown in the logging.
1. Add the Wolf Pass GM to the scene.
2. Rotate, zoom, slice and otherwise make yourself familiar with the geometry and relationships between the
modelled lithology solids.
3. Clear the scene.

© 2020 Seequent Limited 1 of 78


Modelling Approach
A numeric model can be built in four steps from a variety of data. Any data that contains points with X,Y,Z
coordinates and an associated numeric value can be used for interpolation.
l The first step is to clean the drillhole data by removing inconsistencies in the data. This can be a time-
consuming process with some data sets, but it is critical as the quality of any model ultimately depends on
the quality of the data. In this example, the data has already been cleaned but we will do some exploratory
data analysis to become more familiar with the gold and copper distributions.
l The second step is to select the numeric values and apply appropriate parameters to them; this also
requires choosing the correct modelling tool. A numeric model estimates the values over a region from an
initial set of point values. The numeric values can be selected directly as points if they have been imported
into the Points folder. If you are creating your numeric model from drillhole data, Leapfrog will allow you to
select the segments used to generate points. When you are adjusting the model later, you can work directly
with point values.
l The third step is to apply a trend. A trend allows the directions and strength of mineralisation to be defined to
ensure the resulting numeric model is geologically reasonable. Adding a global or structural trend will alter
the isosurfaces. It should be adjusted to ensure these honour the expected mineralisation patterns. This is
where we will initially direct our focus in this session.
l The final step, while equally important to all the others, is to determine how the isosurfaces are bounded and
calculate the volume of mineral within each isosurface.

Iterative Refinement
Building a numeric model is a process of successive refinement. This involves:
l Defining the numeric model and basic structures. This usually corresponds to defining the topography and
boundaries.
l Refining the internal structure. This involves setting the proper trends and making manual corrections to the
point and value data until the resulting surfaces are geologically realistic.

Choosing a Modelling Tool


There is more than one way to create a boundary surrounding a cut-off grade in Leapfrog Geo; however, the
different techniques used to create these boundaries rely on different underlying concepts. Which tool to select
will depend on the ultimate purpose for building the boundaries.

RBF Interpolants
The typical way to create these boundaries is by building grade shells using the RBF Interpolant tool.
Interpolated grade shells are built by using the known drillhole or point data to interpolate values infinitely
across the boundary extents; isosurfaces (grade shells) are then created to link up identical values. Grade
shells created by interpolation cannot be snapped to contact points on drillholes. These interpolated grade
shells are ideal for exploration drillhole targeting where there is a good understanding of what is controlling
mineralisation and they work very well when the deposit has lots of drillholes.
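The idea behind an RBF interpolant can be pictured with a small sketch. The toy example below (a 1-D linear basis function in plain numpy, not Seequent's FastRBF engine) fits weights so the interpolant honours every sample exactly; a grade shell is then simply the set of locations where the interpolant equals the chosen cut-off. The depths and grades are hypothetical.

```python
import numpy as np

# Toy 1-D RBF interpolation with a linear basis phi(r) = r. This is NOT
# Seequent's FastRBF implementation; it only illustrates the concept of
# fitting weights so the interpolant honours every sample exactly.
def rbf_interpolant(x, v, phi=lambda r: r):
    """Return a callable interpolant through the samples (x, v)."""
    x = np.asarray(x, dtype=float)
    A = phi(np.abs(x[:, None] - x[None, :]))     # pairwise basis matrix
    w = np.linalg.solve(A, np.asarray(v, dtype=float))
    return lambda q: phi(np.abs(np.asarray(q, dtype=float)[:, None] - x[None, :])) @ w

# Hypothetical samples down a drillhole: (depth m, Au g/t)
f = rbf_interpolant([0.0, 10.0, 20.0, 30.0], [0.1, 1.2, 0.8, 0.2])
grades = f([5.0, 15.0, 25.0])   # estimate grade between samples
```

Because the fitted surface passes exactly through every sample, evaluating it back at a sample depth returns the original assay value; isosurfacing this field at a cut-off grade is what produces the grade shells described above.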

Numeric Values Modelled as Categories


Alternatively, Leapfrog can create “mineralised zones” which utilise contact points directly from the drillholes.
Mineralised zones are created from numeric value ranges that have been converted to “Category” (text) form.
Once the numeric values have been converted to Category format, the mineralised zones can be created in a
Geological Model. Mineralised zones created in a GM can give "snapped" boundaries, as it is possible to
snap these surfaces to contact points on drillholes. Mineralised zones created using a GM are ideal whenever
drillhole contact points need to be honoured.
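The numeric-to-category conversion itself is easy to picture. The sketch below uses purely illustrative cut-off values and code names; choose ranges appropriate to your deposit. The resulting text codes are what a geological model can then treat like lithology contacts.

```python
# Illustrative numeric-to-category conversion: grade ranges become text
# codes. The cut-offs (0.5 and 1.0 g/t) and code names are hypothetical.
def grade_to_category(au_gpt):
    if au_gpt >= 1.0:
        return "HG"   # high grade
    if au_gpt >= 0.5:
        return "MG"   # medium grade
    return "BG"       # background

assays = [0.2, 0.7, 1.6, 0.4]
codes = [grade_to_category(v) for v in assays]   # ["BG", "MG", "HG", "BG"]
```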



Indicator RBF Interpolants
Finally, the indicator interpolant numeric modelling tool was developed for situations where the geological
controls on mineralisation (or other parameters of interest) are poorly understood. The Indicator RBF Interpolant is similar
to the RBF Interpolant in that it produces interpolated isosurfaces that cannot be snapped to drillholes.
However, instead of the isosurfaces representing a particular grade, the Indicator RBF Interpolant produces
a surface representing a specified probability of exceeding a chosen cut-off.
Examples of when this tool might be used and preferred include:
l Disseminated mineralisation that crosses several lithological boundaries,
l Uncertainty in how to group lithology codes,
l Determinations of ore versus waste in veins where the vein domain includes thin intervals of country rock,
l Or, an even more general situation, just poor geological logging.
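The transform underlying an indicator interpolant can be sketched simply: each sample becomes 0 or 1 relative to a cut-off, and the interpolated field of these indicators is read as a probability of exceeding the cut-off. The cut-off and assay values below are hypothetical, and the "local probability" shown is only a crude averaging stand-in for the actual interpolation.

```python
# Illustrative indicator transform: each sample becomes 0 (below cut-off)
# or 1 (at/above cut-off). Interpolating the indicators yields a field
# read as the probability of exceeding the cut-off. Data are hypothetical.
def indicator(values, cutoff):
    return [1 if v >= cutoff else 0 for v in values]

au = [0.2, 0.9, 1.4, 0.1, 2.3]
flags = indicator(au, cutoff=1.0)        # [0, 0, 1, 0, 1]
# Crude local probability estimate: mean indicator of nearby samples.
p_exceed = sum(flags) / len(flags)       # 0.4
```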

In general, numeric models can be created in Leapfrog Geo using assay data, temperature values, geophysical
data or any other numeric data that is sparsely distributed in space to interpolate across a region.
Interpolation in Leapfrog is fast and flexible, and the shells produced to represent mineralisation are smoother
and more reproducible than traditional hand-drawn meshes.

Session 2: Data Analysis
Contents
Visualisation of Numeric Drillhole Data
Colourmaps
Cylinder Radius Scaling
Downhole Graphs
Enhance High Values
Importing and Exporting Colour Gradients
Introduction to Statistical Data Analysis
Graphs and Statistics on Drillhole Tables
Creating a Geological Model Evaluation Table / Backflagged Table
Creating a Merged Table
Histograms
Scatter Plots
Box Plots
Q-Q Plots
Table of Statistics
Reporting

Goals
In this session, we will cover:
l Displaying numeric data
l Creating a new evaluation / backflagged table
l Creating a new merged table
l Creating drillhole queries
l Using the statistics and graphs available in Leapfrog Geo
This session continues to use the Wolf Pass project.

Visualisation of Numeric Drillhole Data


Up to this point, we have looked at category data (lithologies). Many of the visualisation tools available in the
shape list and properties panel for numeric data are the same as for categorical data, but there are a couple of
additional tools.

Colourmaps
With numeric data, you have the option of Continuous or Discrete colourmaps. While a continuous colourmap
is the default, in this session, we will focus on creating a discrete colourmap.
1. Clear the scene, add the WP_assay table to the scene and select the Au column.
2. In the shape list, click on the Au colourmap and select New Colourmap:



You will see a menu of the two colourmap options. Choose to create a new Discrete colourmap:

A new window will open, showing the default discrete colourmap:

3. Change the X-Axis Limits maximum to be 2.0.


4. Use Add to add the colour ranges that you like.
You can experiment with changes in this window and you will see them update live in the scene:
l Change the value in the Max column and the Min column will update accordingly.
l Click on the ≤ sign to switch the greater than/less than/equal to status.
l Click the colour swatch to change the colour.
l Use the other Options to change the histogram display.
l Click and drag the lines between the colours to change the colour ranges.



Note that the scene will be updated as you make changes to the colourmap.

To auto-generate intervals based on statistics:


l Click Generate Intervals.
l Select the Interval Mode and Number of Intervals you would like.
l Select an appropriate Colour Gradient.

For a description of the different Interval Modes, see the Colourmaps topic in the online help.

5. Click Apply to view the results of the changes you have made and Close the interval dialog.
6. When satisfied with your colourmap, click Close.

You only have to set up this colour scheme once per column and you can then export and share it between
projects.
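As an illustration of interval generation, one quantile-style mode can be pictured as splitting the data at equal-count quantiles, so each colour band holds roughly the same number of samples. This sketch is not Leapfrog's exact algorithm (see the Colourmaps help topic for the real Interval Mode definitions), and the assay values are hypothetical.

```python
import statistics

# One plausible "quantile" interval mode: break the data at equal-count
# quantiles. Illustration only, not Leapfrog's exact algorithm.
au = [0.05, 0.1, 0.2, 0.3, 0.5, 0.8, 1.2, 1.9]
breaks = statistics.quantiles(au, n=4)   # three breaks -> four colour intervals
```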



Cylinder Radius Scaling
As we’ve seen, numeric data columns in drillhole data can be displayed as lines or cylinders. When displayed
as cylinders, the radius of the cylinders can be set from a numeric data column in the table, either the same one
being displayed, or a different one.
1. If you’re not doing so already, display the WP_assay table using the cylinders ( ).
2. In the properties panel, use the Radius values dropdown to display by Au.
3. Click the Use log value for radius button ( ) to get a more useful display.

In this project, we also have Cu values; we can display the Au values in scene, scaled by the Cu values.
4. Using the Radius values dropdown in the properties panel, select Cu_pct.

5. Look at the result in scene.

Downhole Graphs
In addition to viewing numeric data by numeric-scaled cylinder radius, numeric data can also be viewed as a
downhole graph alongside categorical data, allowing you to view two columns of downhole data
simultaneously. This visualisation capability can be helpful for familiarising yourself with your data, drawing
correlations between different datasets (e.g. rock type and grade), and assisting with modelling interpretations.
For this exercise, we will view the lithology data together with the Au assay data.
1. Clear the scene then drag the WP_lith table into the scene.
2. Double-click on the Drillholes Graphs object in the project tree.



3. In the Drillholes Graph Style Manager window, select the AU_gpt column from the WP_assay table.
4. Make any Position and Size, Colouring or Numeric Display Range changes you would like.

If you have downhole points in your project, they can also be displayed alongside the drill trace. Below is an
example of LAS point data displayed as a downhole graph:

Enhance High Values


Frequently it is hard to visualise your high grade zones in drillholes if your project is well drilled off. There are
typically a lot of low values at the perimeter obscuring a higher-grade core. To help visualise your high-grade
trends, click the Enhance High Values button ( ) in the shape list:



The images below (from a different project) show the same drillholes, but the higher values are enhanced in the
image on the right:

Importing and Exporting Colour Gradients


The Colour Gradients folder near the bottom of the project tree is used for storing imported colour gradients.
Built-in colour gradients were updated in Leapfrog Geo 4.4 and the older gradients are automatically saved into
the Colour Gradients folder when projects from earlier versions of Leapfrog Geo are opened in 4.4.
To import a gradient, right-click on the Colour Gradients folder and select Import Gradient to start the
process. Once the new colour gradient is in Leapfrog, it can be assigned to any object in the project that uses a
continuous colourmap. Click the Edit Colourmaps pencil ( ) in the shape list:

In the Edit Colourmaps window, all colour gradients in the project are available from the Gradient list:

Now that we have visually inspected our data, we will look at more quantitative data analysis tools.

For more information regarding these visualisation options, see the Colourmaps and Displaying Drillhole
Graphs topics in the online help.



A useful article on colourmaps is available at http://peterkovesi.com/projects/colourmaps/. Existing
colourmaps can be downloaded from this website in ERMapper (*.lut), Geosoft (*.tbl) and Golden Software
Surfer (*.clr) formats.

Introduction to Statistical Data Analysis


There are a variety of univariate and bivariate statistical tools available in Leapfrog Geo that can be used to
investigate your data, including a table of statistics and the following graphs: 
l Histogram
l Cumulative histogram
l Log probability plot
l Scatter plots
l Boxplots
l Q-Q plots
These graphs can also be used to detect possible errors in the data, as well as to identify or confirm different
mineralisation populations.

You can view basic statistics and graphs for imported numeric drillhole tables, merged tables, composited
tables, points and block models.

While we will not specifically discuss the use of these tools with respect to QAQC, they can also be used for
this purpose. The graphing tools can be used to compare duplicate sample grades against the original sample
grades (scatter plots, Q-Q plots), or to compare assay data from different laboratories (boxplots), etc.

Graphs and Statistics on Drillhole Tables


We will continue to familiarise ourselves with the Wolf Pass dataset using the Histograms, Scatter Plots, Box
Plots and Q-Q Plots on drillhole tables.
While it is possible to evaluate the nature and distribution of high grades with only the lithology and assay
tables, often modelling geologists will want to incorporate decisions made while creating geological model
volumes. Creating a geological model evaluation table - otherwise known as back-flagging drillhole data -
creates a new lithology table containing the lithologies from the selected model, which is also useful for
validating models created from drillhole data as you can generate statistics for the correlation between
modelled geology and drilling data.

Creating a Geological Model Evaluation Table / Backflagged Table


1. To create a back-flagged drillhole data table, right-click on the Drillholes object in the project tree and select
New Evaluation:



2. In the New Evaluation window, select the Wolf Pass GM and enter names for the column and table:

3. Click OK to create the new table, which will appear in the project tree as part of the Drillholes object:

The new table contains from, to and Wolf_Pass_GM columns defined using the intersection between the
model’s output volumes and the drillholes:

For output volume evaluations, you can view the correlation between modelled geology and actual logged
drillhole segments. To do this, right-click on the table and select Statistics.

For more information, see the Back-Flagging Drillhole Data topic in the online help.

Creating a Merged Table


The assay table in this project contains only numeric assay data, with no information about which geologic unit
the assays belong to. Any statistics we view on the assay table would include all the data from every unit. This
can offer us some valuable information about our dataset as a whole but viewing the statistics and graphs on
the WP_assay table will not allow us to perform unit-specific data interrogation. Both the WP_lith and our new
WP_GM_Evaluation tables only contain lithology intervals, with no reference to the assay values.
To view and interrogate both data types (assay and lithological) in relation to one another we will create a
merged table.



1. Right-click the Drillholes object in the project tree and select New Merged Table.

2. Select both the assay and GM evaluation tables and rename the table “Assay and GM_Evaluation”.

3. Click OK and Leapfrog Geo will process the new table.

The intervals of a merged table are dependent on the selected columns’ interval breaks. The merged table
interval will be the longest possible interval that is shared by all selected columns. Where the ends of
intervals don’t align, small intervals will be created. For example, if an assay interval is 10-12m and has a
value of 0.563, but there is a lith code change at 11m, a merged table will present this as follows: 
l 10-11m, Lith A, Au 0.563
l 11-12m, Lith B, Au 0.563
In many cases, this doubling up of the assay values is not ideal. To deal with this issue, you can use a New
Majority Composite table based on assay intervals, and merge that new table with the assay table. This
approach will not result in any split assays.
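The interval-splitting rule described above can be sketched as follows, mirroring the 10-12 m example: the union of interval breaks from each source table splits the assay where the lithology code changes, and the assay value is carried into both resulting rows.

```python
# Sketch of merged-table construction: the union of interval breaks from
# each source table splits assays where the lithology code changes.
def merge_tables(assay, lith):
    """assay, lith: lists of (from, to, value). Returns merged rows."""
    breaks = sorted({b for s, e, _ in assay + lith for b in (s, e)})
    rows = []
    for s, e in zip(breaks, breaks[1:]):
        au = next((v for a, b, v in assay if a <= s and e <= b), None)
        code = next((v for a, b, v in lith if a <= s and e <= b), None)
        if au is not None and code is not None:   # keep fully informed rows
            rows.append((s, e, code, au))
    return rows

rows = merge_tables([(10.0, 12.0, 0.563)],
                    [(10.0, 11.0, "Lith A"), (11.0, 12.0, "Lith B")])
# [(10.0, 11.0, 'Lith A', 0.563), (11.0, 12.0, 'Lith B', 0.563)]
```

The doubled 0.563 value in the output is exactly the split-assay behaviour that a majority composite table avoids.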

Now that we have our data prepared, we can examine the data analysis tools in Leapfrog.

Histograms
You can view histograms for your numeric data columns by right-clicking on the data column in the project tree.
We will begin with investigating the Au data in the Assay_and_GM_Evaluation merged table. This merged table
contains both the assay information, as well as the lithology units, allowing us to filter the graphs based on
lithologic unit.
1. Expand the Assay_and_GM_Evaluation merged table to view the available columns of data.



2. Right-click on Au_gpt and select Statistics.

Initially all the Au data in the project will be displayed on the plot.

The statistics calculated on the histogram on the drillhole tables are length-weighted by default.
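Length weighting can be illustrated with a short sketch using hypothetical intervals: each assay contributes in proportion to its interval length, so a 4 m interval counts twice as much as a 2 m one, and the weighted mean can differ from a simple average of the grades.

```python
# Length-weighted mean in miniature (hypothetical intervals and grades).
def length_weighted_mean(lengths, grades):
    return sum(l * g for l, g in zip(lengths, grades)) / sum(lengths)

lengths = [2.0, 2.0, 4.0]
grades = [1.0, 2.0, 0.5]
lw_mean = length_weighted_mean(lengths, grades)   # (2 + 4 + 2) / 8 = 1.0
simple_mean = sum(grades) / len(grades)           # 3.5 / 3, about 1.17
```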

Display Options
Options available for displaying the Histogram include the three graph types, Histogram of the log,
Percentage and Bin width. A box plot is also automatically displayed beneath the histogram. It is possible to
adjust the axes of the graphs, by setting the X limits and Y limits.
3. Tick the box for Histogram of the log.
4. Switch the Histogram display to Cumulative histogram and then Log probability to review them all.
5. Switch back to the Histogram display.

Queries
While it’s useful to start out by reviewing the statistics of the entire dataset, it’s also important to look at the
unit-specific statistics. This can be achieved on the graphs available at the drillhole level by setting up query
filters that isolate the values in each unit.
6. Right-click on the Assay_and_GM_Evaluation table in the project tree and select New Query Filter.
7. Click the ... Query builder button.



8. Set up the query as shown below:

9. Call the query Early Diorite.


10. Repeat the process for the Intermineral Diorite unit.
Now that we have query filters on the table, we can apply them to the graph to show the calculations for just the
Early Diorite or Intermineral Diorite units.
11. To view the data within Early Diorite, set the Query filter to Early Diorite.

Graph-Scene Interactions
Leapfrog offers graph-3D scene interaction for improved visualisation of your data.
12. View the Assay_and_GM_Evaluation table in the scene by the Au column.
13. To visualise a bin of data in the scene window, click on it in the histogram.

To select multiple bins, click and drag the mouse across the bins you wish to select.



14. Switch to the scene window to view the selected intervals:

15. Once you’re finished with the graph, close its tab.

The settings you select are saved for the next time you view a graph.

Scatter Plots
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics, then Scatter Plot.
2. Set the X column to AU_gpt and the Y column to CU_pct.



3. Select log scale for both axes:

Display Options
When necessary, either or both axes can be logged; query filters can be added to display selected data; point
size and shape can be changed to suit your preferences, and the graph background can be set to white. A third
variable can be displayed on the graph by selecting a different column of data to use as the Colouring.



4. Set the Colouring to Wolf_Pass_GM:

5. Use the Query filter to show just the Early Diorite unit:



Graph Calculations
The Linear Regression line, equation and the Conditional Expectation (smoothed regression) can also be
displayed:

The linear regression line is weighted, by length for numeric data in drillholes and by volume for block
models. Scatter plot data, however, is not weighted. This weighted linear regression equation and
correlation coefficient will not directly compare to un-weighted equations and values calculated in other
software, like Excel.
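The weighting described above can be sketched as a weighted least-squares fit, with interval lengths as the weights. The data and weights below are hypothetical; setting all weights equal reproduces the un-weighted fit a spreadsheet would report.

```python
# Weighted least-squares line (weights = interval lengths), illustrating
# why a length-weighted regression differs from an un-weighted one.
def weighted_linfit(x, y, w):
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    var = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    slope = cov / var
    return slope, my - slope * mx

slope, intercept = weighted_linfit([0.0, 1.0, 2.0, 3.0],
                                   [0.1, 1.1, 1.9, 3.2],
                                   w=[2.0, 2.0, 2.0, 6.0])  # longer last interval
```

The 6 m weight on the last point pulls the fit toward it, which is the behaviour the note above warns will not match un-weighted results from other software.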

Graph-Scene Interactions
Leapfrog Geo offers graph-3D scene interaction for improved visualisation of your data. The selection tools
available in the graph’s toolbar are similar to those available for the interval selection tool on drillhole data:



You can make a selection of the graph using either the Replace the current selection ( ) or Add to current
selection ( ) buttons. Click and hold the mouse to draw a line around the desired selection and then switch to
the scene window to view the selected intervals in 3D space:

Box Plots
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.
2. Select Box Plot.
3. Change the Numeric column to AU_gpt and tick the box for Log scale.
Since the merged table contains a few category columns, we can view the box plots for the different logged
geological units.
4. Set the Category to Wolf_Pass_GM and tick the boxes for the units you’d like to display.

As with the other plots, there is also the option to add a Query Filter.



The whiskers extend out to lines that mark the extents you select: Minimum/Maximum, Outer fence or Inner
fence. Outer values are 3 times the interquartile range (IQR), and inner values are 1.5 times the IQR. Hold the
mouse over an option to see a tool tip to remind you of these definitions.
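The fence definitions above can be sketched directly. Note that quartile conventions differ between packages; this example uses Python's statistics.quantiles (exclusive method) on hypothetical data, so the exact quartile values may not match Leapfrog's.

```python
import statistics

# Inner fences at 1.5 x IQR beyond the quartiles, outer fences at 3 x IQR.
data = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 4.0]
q1, _, q3 = statistics.quantiles(data, n=4)
iqr = q3 - q1
inner = (q1 - 1.5 * iqr, q3 + 1.5 * iqr)
outer = (q1 - 3.0 * iqr, q3 + 3.0 * iqr)
outliers = [v for v in data if v < inner[0] or v > inner[1]]   # [4.0]
```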

Q-Q Plots
The Q-Q plots in Leapfrog Geo can be used to compare values from different phases or styles of drilling,
different lab analyses techniques, duplicates vs original samples, etc., using query filters. In this project, we
only have one phase and style of drilling and assay results from one lab, but we will demonstrate the graph
functionality by comparing the Au values from the Early Diorite and Intermineral Diorite units.
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.
2. Select Q-Q Plot.
3. Change the X data to AU_gpt and the X filter to the Early Diorite query filter.
4. Change the Y data to AU_gpt and the Y filter to the Intermineral Diorite query filter.
When necessary, either or both axes can be viewed with a logarithmic scale.
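What a Q-Q plot pairs up can be sketched as matching quantiles of the two samples: points falling near the line y = x suggest similar distributions. The two populations below are hypothetical stand-ins for the Early Diorite and Intermineral Diorite Au values.

```python
import statistics

# Pair matching quantiles of two samples; each pair is one Q-Q plot point.
early = [0.4, 0.8, 1.1, 1.5, 2.0, 2.6, 3.1]
inter = [0.2, 0.5, 0.7, 1.0, 1.3, 1.7, 2.1]
qx = statistics.quantiles(early, n=10)
qy = statistics.quantiles(inter, n=10)
pairs = list(zip(qx, qy))   # one (x, y) point per quantile on the plot
```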

5. Once you’re finished with the graph, close its tab.

Table of Statistics
In addition to the graphs, there is also a comprehensive, flexible table of statistics available.
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.



2. Select the Table of Statistics:

3. Under Categories, click Add and use the dropdown list to select the Wolf_Pass_GM column.
4. Check the box for AU_gpt and CU_pct in the Numeric items list.
5. Tick the Hide empty categories and Hide inactive rows boxes.
6. To get a more useful view, choose the Group by numeric item radio button.

There are a number of useful statistics in this table. By default they are length-weighted, but you can also view
un-weighted statistics if necessary. Sort by the different columns by clicking on the column headers.
7. Click the Mean heading to sort based on grade.

We can quickly see the lithologies that contain higher gold grade, as well as those with less.



Reporting
Each graph type, and the table of statistics, has tools for copying and exporting the graph/table:
l Click the Export button ( ) to export the graph/table.
l Click the Copy button ( ) to copy the graph/table to the clipboard. You can then paste it into another
application.

For more information, see the Statistics topic in the online help.

Session 3: Numeric Models
Contents
Creating Composites Directly From Drillholes
Creating a First Pass Gold Numeric Model
Outputs Tab
Interpolant Tab
Value Transform Tab
Copying a Numeric Model and Clipping to a Domain
Adding a Structural Trend to a Numeric Model
Creating a Structural Trend
Adding a Structural Trend to a Numeric Model
Adding a Contour Polyline to the Numeric Model
Numeric Model Statistics

Goals
In this session, we will cover:
l A generalised modelling approach
l Building composites directly from drillholes
l Creating an initial RBF Interpolant
l Steps to take or consider in refinement of the numeric model
In order to explain how Leapfrog creates numeric models, we will introduce a relatively large amount of
basic interpolation and geostatistics theory.
This session continues to use the Wolf Pass project.

Creating Composites Directly From Drillholes


After the last session, we have a good idea of where the mineralisation occurs within our model. The drillhole
data as logged and assayed exists as interval data, but not all intervals are the same length. Were we to build a
numeric model from this data without further processing, we would end up with points with uneven data
support; giving equal weighting to points would represent vastly different volumes of rock. Because of this, we
will composite in order to normalise the volume of rock being represented by a point, thus giving us
representative models and statistics.
First we’ll review the sample interval lengths in the project.
1. Right-click on the WP_assay table, then select Statistics, then the Interval Length Statistics option.
2. Change the Bin width to 0.5 to make the graph easier to read.



3. Untick the box for Automatic X axis limits and change the upper X limit to 5.

We can immediately see that the majority of our samples are 2.0 metres long. We will use this information to
choose our composite lengths.
It is possible to create a set of composited drillholes directly from drillholes. To do this, we go directly to the
Drillholes folder. This folder gives us more options, including whether to composite over the entire drillhole or
only within a particular lithology. Once the composite has been completed, it can be used to create a numeric
model.
1. Right-click on the Composites folder (under the Drillholes folder) and select New Numeric Composite:

l Entire Drillhole applies compositing parameters to all values down the length of the drillhole, regardless of
unit breaks.
l Subset of Codes lets you set compositing parameters for each individual code, based on a Base column.
This allows compositing to break along unit breaks.
l Intervals from other Table uses interval lengths from the Base table to determine composite lengths.

If you want to use grouped codes as a base table you will need to make a Group column in your interval
table. You can then select Intervals from other Table and choose the grouped codes column.

In addition to specifying the desired composite region and length, there are also 3 options for handling residual
segments of lengths less than a specified cut-off:
l Discard
l Add to previous interval



l Distribute equally

For more details regarding compositing, see the Compositing topic in the online help.

2. Select Subset of Codes.


This option provides maximum compositing flexibility.
3. Ensure the Base column is grouped_lith in the WP_lith table.
By default, the Compositing Length is set to 10 and the Minimum Coverage is set to 50%.
Compositing length will vary depending on several factors, including the deposit style, mining method and raw
sample length. In general, high grade underground mines will require a shorter sample length compared to bulk
open pit mines. The other point to note is that, if possible, samples shouldn't be “split”. For example, if most
intervals have a 2 m sample length, choosing a composite of 1 m or 5 m will split the intervals, which will
artificially reduce variance as an interval with a single value will be represented in more than one composite. In
this case, we will change the composite length to 6 m, as the raw interval lengths are mostly 2 m, so the
intervals aren’t being split. A composite of 6 m is also reasonable for the deposit style we are working with in
this case. We could also choose 4 m for our composite length, but this will increase processing time later in the
session, so we’ll stick with 6 m for now.
4. Change the Default Length to 6.
This will apply the length to all codes, starting the first composite interval at the start of each new code and
working its way down the hole.
Next we need to determine what we want Leapfrog to do with the residual end lengths present after the
compositing to 6 m intervals.



5. In this project, we will select the add to previous interval option, using the dropdown.

6. Set If residual ends length less than to 1.5.


Residual end lengths less than 1.5 m will be added to the previous composited interval. Residual end lengths
longer than 1.5 m will generate a separate composite interval.
Once these composite intervals have been defined, based on the starting depth of the composite (defined by
the code intervals), the specified composite length and the residual end length action, Leapfrog then checks
that the specified Minimum coverage % of input assay intervals has been met.
The minimum coverage parameter is associated with the original imported assay intervals and allows you to
decide how ‘informed’ a composited interval needs to be, by setting a percentage threshold on the minimum
allowable accumulated grade length required to generate a composite interval and value.
Minimum coverage is based on the input data coverage of the specified composite interval (which is based on
the composite length and residual end length parameters), and dictates whether or not a composite interval can
be created.
If there is sufficient coverage (defined by the Minimum coverage parameter) of input data for the defined
composite interval, Leapfrog then creates the composite based on the original imported assay values.
Minimum coverage is expressed as a percentage of the composite interval length. Depending on the residual
options chosen and the length of drilling being composited, interval lengths might not always be the same as
the originally specified compositing length.
7. We will leave the Minimum coverage at 50%.
In this example, we have specified a composite interval length of 6 m and have added any residual end lengths
less than 1.5 m to the previous interval, and set the minimum coverage percentage to 50%. With the minimum
coverage parameter set to 50%, to create a composite interval, at least 3 m of the total 6 m interval must be
informed by original imported assay values, from which the composite will be calculated. That being said, due
to the residual end lengths potentially being added to some intervals, it is possible in this scenario to have
composite interval lengths a little longer than 6 m, for example, there may be a composite length of 7 m, in
which case at least 3.5 m of the total 7 m composite interval (50%) must be informed by original input data.
l If the Minimum coverage is set to 0%, a composite will be calculated based on the available input data for
that interval, no matter how short the original assay interval is.
l If the Minimum coverage is set to 100%, a composite will only be calculated if the entire composite interval
(6+ m) contains original assay data (if original assay intervals cover only 5.5 m of the defined 6 m
composite interval, a composite value will not be calculated).
l If the Minimum coverage is set to 50% (default value), a composite will be calculated if the original input
assay intervals cover half of the defined composite interval.
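The compositing rules described above can be sketched in a few lines of Python. This is a simplified illustration, not Leapfrog's actual implementation; the function name and the `(from, to, value)` interval representation are assumptions:

```python
def composite(intervals, comp_len=6.0, residual_cutoff=1.5, min_coverage=0.5):
    """Composite (from, to, value) intervals within one code run.

    Simplified sketch: residual ends shorter than residual_cutoff are
    added to the previous composite, and a composite is only emitted
    when input data covers at least min_coverage of its length.
    """
    if not intervals:
        return []
    start, end = intervals[0][0], intervals[-1][1]
    # Lay out composite boundaries from the top of the code run down.
    edges, d = [], start
    while d + comp_len <= end:
        edges.append([d, d + comp_len])
        d += comp_len
    residual = end - d
    if residual > 0:
        if residual < residual_cutoff and edges:
            edges[-1][1] = end       # short residual joins the previous interval
        else:
            edges.append([d, end])   # long residual becomes its own composite
    # Length-weighted average of input values over each composite interval,
    # emitted only if input coverage meets the minimum percentage.
    out = []
    for lo, hi in edges:
        covered = weighted = 0.0
        for f, t, v in intervals:
            overlap = max(0.0, min(t, hi) - max(f, lo))
            covered += overlap
            weighted += overlap * v
        if covered / (hi - lo) >= min_coverage:
            out.append((lo, hi, weighted / covered))
    return out
```

For example, compositing 2 m raw intervals running from 0 to 7 m with a 6 m composite length folds the 1 m residual (shorter than the 1.5 m cut-off) into the previous interval, producing a single 7 m composite.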



Regarding the Action column, there are three options:
l Composite, which is self-explanatory
l Filter Out, where all values for the filtered-out code will be removed from the composited table
l No Compositing, where all values for the No Compositing code will retain their original interval lengths
within the composited table
8. For the Recent unit, select Filter Out.
Values existing in the Recent unit will NOT be included in the new composited table.
9. Click on the Output Columns tab, and select Au and Cu:

10. Give the new table an appropriate name and click OK.

Creating a First Pass Gold Numeric Model


It is a good idea to run a quick numeric model through your data to check how the isosurfaces behave. Once
this is done, we can go ahead and create a numeric model with full knowledge of the data.
1. Right-click on the Numeric Models folder in the project tree and select New RBF Interpolant.
The New RBF Interpolant window will appear, with a few basic options for defining the model.
2. Set Numeric values to the AU_gpt column from the composited table.
3. Click Existing model boundary or volume and select the Wolf Pass GM Boundary object.
4. Leave the Surface Filter box ticked.
5. Change the resolution to 20.
The settings in the New RBF Interpolant window should now look like this:



6. Click OK.
7. Clear the scene.
8. Once the model has been processed, add it to the scene.
As you would expect for a first pass model created without changing any parameters, it is unrealistic:

We will change a few of the more important parameters and check how they change the numeric model.
9. Double-click on the AU_gpt numeric model in the project tree.
This opens the Edit RBF Interpolant window.
We will start by refining the numeric model for the entire area, then look at creating a model within the Early
Diorite, which is the major mineralised lithology.
For the first model, we will change parameters in the Outputs, Interpolant and Value Transform tabs. When
we create the second model within the Early Diorite, we will also look at the Value, Boundary and Trend tabs.

Outputs Tab
In the Outputs tab, we can choose the values used to create isosurfaces and define the resolution of the
isosurfaces and how the isosurfaces create the associated volumes. By default, there are three isosurface
values, which are at the lower quartile, median and upper quartile of the data being used. These default values
are often not of interest but are useful in checking the general shape of the numeric model. We will go ahead
and change them to more reasonable values.
1. Click on the Outputs tab.
2. Click to highlight one of the default values beneath the Iso Value heading, then click it again to edit it.
3. Change the existing values to 0.5, 0.75 and 1.0.



4. Click the Add button to add isosurfaces with values of 1.25 and 1.5:

Resolution is important when creating isosurfaces. Ideally, we would want it to be equal to the composite
length (6 m in this case). A quick test using one of Leapfrog’s laptops (16GB RAM, 2.8GHz processor) took 75
seconds to run these isosurfaces at a resolution of 6, but if your laptop is particularly slow, it may be worth
increasing the resolution to between 12 and 15. This will still give you a reasonable surface but will process
more quickly.
The resolution of isosurfaces is important because it determines the size of the triangles making up the
surface. If the resolution is 6, the approximate edge length of the triangles will be 6 units (remembering that
Leapfrog is unit-less), so the surface will be able to capture intervals as small as 6 m long. If we were to
increase the resolution to 12, the triangles would only capture intervals 12 m and longer, and so would miss
some of the smaller intervals.
A lower resolution produces a more accurate surface but can take a lot longer to run. A general guide is that
halving the resolution increases the processing time by roughly four times.
5. Change the Default resolution to something between 6 and 20.
The resolution for each surface is set by the Default resolution unless a different resolution is specified for a
particular surface. Since the resolution of each isosurface can be set independently, you can save time by
generating the higher iso value shells at a lower (finer) resolution and the lower iso value shells at a higher
(coarser) resolution.



6. Set the 1.5 and 1.25 isosurfaces to a resolution of 6 and the others to 12:

The Volumes Enclose dropdown lets you choose from Intervals, Higher Values and Lower Values.
l Intervals will create a series of “donut” shaped shells. In this example, the shells will be < 0.5, 0.5 - 0.75,
0.75 - 1.0, 1.0 - 1.25, 1.25 - 1.5, >1.5.
l Higher Values will create a series of shells that enclose all higher values within them. In this example, the
shells will be >0.5, >0.75, >1.0, >1.25, >1.5.
l Lower Values will create a series of shells that enclose all lower values within them. In this example, the
shells will be <0.5, <0.75, <1.0, <1.25, <1.5.
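The Intervals option above partitions grades into "donut" shells between consecutive iso values. A small sketch of that partitioning (illustrative only; the function and labels are not part of Leapfrog):

```python
iso_values = [0.5, 0.75, 1.0, 1.25, 1.5]

def interval_shell(value, thresholds=iso_values):
    """Return the 'Intervals' shell a grade value falls into,
    e.g. '0.5 - 0.75' or '> 1.5' (labels are illustrative)."""
    lower = None
    for t in thresholds:
        if value < t:
            # Below the first threshold, or between two thresholds.
            return f"< {t}" if lower is None else f"{lower} - {t}"
        lower = t
    return f"> {lower}"

print(interval_shell(0.9))  # 0.75 - 1.0
```

The Higher Values and Lower Values options would instead test `value > t` or `value < t` against each threshold, producing nested shells rather than disjoint ones.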
7. In this case we will start by using Intervals, so keep this selected.
8. Click OK.
When the model reloads in the scene, every volume will be opaque. To view with increasing transparency,
clear the scene and drag in the numeric model again:

The Evaluation limits apply when interpolated values are evaluated onto objects (surfaces, points, block
models, etc.). These limits do not affect the input data or the interpolation itself. By default, there is a Minimum limit set
at 0.0. This means that regardless of the parameters set in the Interpolant tab, and the resulting interpolation,
no interpolated values less than 0 will be evaluated onto objects. There is also an option to set a Maximum
value for the interpolated values. If ticked, the default is the highest value in the dataset, meaning that no
interpolated value evaluated onto an object can exceed the highest measured value. Keep in mind, this is not
top cutting as it only affects the interpolated values when evaluated onto objects, not the input data or the
interpolation itself.
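The Evaluation limits behave like a simple clamp applied only at evaluation time. A minimal sketch of that behaviour (illustrative; the function name is an assumption):

```python
def evaluate_limit(value, minimum=0.0, maximum=None):
    """Clamp an interpolated value when it is evaluated onto an object.

    Mirrors the described behaviour: the clamp affects only evaluated
    outputs, never the input data or the interpolation itself.
    """
    if value < minimum:
        return minimum
    if maximum is not None and value > maximum:
        return maximum
    return value

print(evaluate_limit(-0.2))              # 0.0 (default Minimum limit)
print(evaluate_limit(9.4, maximum=6.0))  # 6.0 (optional Maximum limit)
```

Note that this is not top cutting: the underlying interpolation is unchanged, only the values handed to surfaces, points or block models are clamped.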



This is a good first step, but we can see there are some pretty clear issues with the model still, such as the
large high-grade blowouts in the NW corner of the model.

Interpolant Tab
1. Double-click on the numeric model object to open the Edit RBF Interpolant window.
2. Click on the Interpolant tab:

There are several parameters in the interpolant tab that can be set based either on rules of thumb or by using
geostatistical input from packages such as Leapfrog EDGE, Supervisor or Isatis. For this example, we will look
at rules of thumb that work well for a number of examples. The default settings are almost certainly incorrect so
the next few paragraphs are important when creating reasonable numeric models.

As you may have figured out by now, Leapfrog is fast at creating models, but that doesn’t necessarily mean
the first pass models are correct.

Understanding how the interpolation works is one of the key topics in Leapfrog Geo.

Interpolant
There are two options for the Interpolant, Linear and Spheroidal. The Linear interpolant works well for
lithology data and for quickly visualising data trends. It is not suitable for values with a distinct finite range of
influence such as most metallic ore deposits. The Linear interpolant assumes that values a certain distance
from a point have a proportionally greater influence on that point than values further away. The Spheroidal
interpolant works well when there is a finite range beyond which the influence of one point upon another should
fall to zero. This is the case for most metallic ore deposits.
3. Change the interpolant type to Spheroidal.
Note that the interpolant function shown in the window changes shape to display the Spheroidal interpolant
rather than the Linear interpolant.

Base Range
The Spheroidal interpolant has a Base Range that represents the distance from the data at which the
interpolant function approaches the Total Sill. As we move away from a specified point, the influence of that
point decays in a roughly linear manner up to around 30% of the Base Range. Past 30% of the Base Range,
the influence of the point drops more quickly, and at the Base Range the function reaches 96% of the Total Sill.



In simpler terms, the Base Range is the parameter that roughly corresponds to continuity. Leapfrog is
essentially creating an isosurface through points of equal value; by increasing the Base Range, the isosurface
can stretch a further distance between points. The effect of the Base Range can be visualised most obviously
when it is too small.
For this example, setting a Base Range of around 20 will produce a series of isosurfaces that surround the
drilling. These are sometimes referred to as “strings of pearls”:

These are a good indication that the Base Range needs to be increased, as it’s extremely unlikely that all the
drillholes manage to perfectly follow thin pipes of high grade while missing the surrounding low grade!
As a rule of thumb, the Base Range should be set to 2.0 - 2.5 times the distance between drillholes.
In this case, the average distance between holes is around 100 m, so a Base Range of between 200 and 250
should be a good starting point.
4. Change the Base Range to 250.
Note that the shape of the interpolant function changes to include the range of 250, which is represented using
a vertical yellow line:



Total Sill
The Total Sill controls the upper limit of the interpolant function, where there ceases to be any correlation
between values. This is an arbitrary number that only becomes relevant when used in conjunction with the
nugget. For this reason, it is easiest to set the Total Sill to the power of 10 that is closest to the Variance,
which is displayed in the window. The reason for using a power of 10 is to make it simpler to calculate the
nugget, which is in the next step.
In this case the variance is around 1, but this may change depending on whether you added a different top cut,
used a different composite length or used the log transform.
5. Change the Total Sill to 1.
Note that the shape of the interpolant function changes again and is now limited in the y-direction by the Total
Sill; the function approaches the sill and will be at 96% of the sill when the function crosses the Range line.

Nugget
The Nugget allows for local anomalies in sampled data, where a sample is significantly different to what might
be predicted for that point based on the surrounding data. By increasing the value of the Nugget, more
emphasis is placed on the average values of surrounding samples and less on the actual data point. The
Nugget can also be used to reduce noise from inaccurately measured samples.
The rule of thumb for the Nugget changes depending on the deposit type, and geostatistical input is vital. For
this deposit (a porphyry gold project), a Nugget of 10 - 20% is appropriate. It is important to note that the
Nugget is a percentage of the sill, so in this case a 15% nugget would be 0.15 (15% of 1).
6. Change the Nugget to 0.15.
Note the change in the interpolant function; the base point (0.0, 0.0) moves up the y axis to equal the value of
the Nugget:

Drift
The Drift controls the way the interpolant decays away from the data. A Constant drift means the interpolant
will decay to the mean value of the data. A drift of None means the interpolant will decay to a value of zero
away from the data, so is useful when there are no low-grade holes constraining the deposit. A drift of Linear
means the interpolant decays linearly away from the data.
In this case, as the interpolant model is currently not bounded to any domain (geological, structural, weathering
etc) a sensible drift to use will be None. This means that as we move away from the data toward the edges of
the model, the value of the interpolant will revert to a value of zero.



7. Change the Drift to None:

We are only using a drift of None as we currently have no geological domain set up. Later in this session once
we have introduced a geological domain, we will use a drift of Constant.

Alpha
The Alpha determines how steeply the interpolant rises toward the Total Sill. A low Alpha value produces an
interpolant function that rises more steeply than a high Alpha value. By looking at the interpolant function while
changing the Alpha, we can see that a high Alpha will give points at intermediate distances more influence
compared to lower Alpha values. The possible Alpha values are 9, 7, 5 and 3.
An Alpha of 9 gives the best approximation to the Spherical Variogram but takes longer to process and in most
situations gives a very similar result to using an Alpha of 3.
In this case, we will keep the Alpha at 3 to reduce processing time.
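The exact form of Leapfrog's spheroidal interpolant function is not published, but the classic spherical variogram below shows the same qualitative behaviour of the Base Range, Total Sill and Nugget parameters discussed above (an illustrative stand-in, not the actual function; the Alpha shape parameter is omitted):

```python
def spherical_variogram(h, base_range=250.0, total_sill=1.0, nugget=0.15):
    """Classic spherical variogram, used here only to illustrate the
    roles of range, sill and nugget: a near-linear rise close to the
    data, flattening to the sill at the range, beyond which samples
    no longer correlate.
    """
    if h <= 0.0:
        return 0.0            # zero separation: perfect correlation
    if h >= base_range:
        return total_sill     # beyond the range: no correlation
    x = h / base_range
    return nugget + (total_sill - nugget) * (1.5 * x - 0.5 * x ** 3)
```

Increasing the nugget lifts the curve near the origin, which places more emphasis on the average of surrounding samples and less on the exact value at each data point.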

Accuracy
Leapfrog Geo estimates the Accuracy from the data by taking a fraction of the smallest difference between
measured data values. There is little point in changing the accuracy to be significantly smaller than the errors in
the measured data as the interpolation will run more slowly and will degrade the interpolation result.
The rule of thumb here is to leave the Accuracy as it is.
8. Click OK to reprocess the interpolant.



Editing the Interpolant tab settings has made significant changes to the volumes:

Value Transform Tab


1. Double-click on the interpolant again.
2. Click on the Value Transform tab.
The Transform Type can be set either to None (default) or to Log. For this example, changing the Transform
Type to Log will change the histogram so it is more normally distributed. The Pre-log shift option becomes
available once the Log transform is selected; this prevents issues when taking the logarithm of zero or
negative numbers.
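The Pre-log shift can be illustrated as follows (a sketch; the shift value of 0.001 is an assumption for illustration):

```python
import math

def log_transform(value, pre_log_shift=0.001):
    # Shifting before taking the log avoids log(0) for assays
    # reported as zero (e.g. below-detection values).
    return math.log(value + pre_log_shift)

print(log_transform(0.0) == math.log(0.001))  # True
```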
The log transform will always produce smaller volumes compared to not using a transform. Unless outside
geostatistical input is available, the Transform Type should be left as "None". A good method of deciding
whether to use the log transform is to create two identical models; one with the log transform on, and the other
with it off. The user can then visually analyse the models and decide which is more suitable.
3. Keep the Transform Type set to None.
The Lower bound (bottom cut) and Upper bound (top cut) options become available once the Do pre-
transform clipping check box has been enabled. By setting the Upper bound, all samples with a value
greater than the specified Upper bound will be reduced to the value of the Upper bound. This prevents
samples with very high grades having an undue influence on the model. As a rule of thumb, an Upper bound
can be selected where the histogram starts to break down. A simple method of checking this is by looking at
the histogram and finding the value at which the columns start to get gaps between them.



There is a gap in the data at a value of around 6:

We will use this value as our Upper bound.


4. Tick the Do pre-transform clipping checkbox and change the value of the Upper bound to 6.
Note that once the Upper bound has been applied, all values greater than 6 have been reduced to be equal to
6:

This will reduce the variance of the dataset, so some values on the Interpolant tab may need to be adjusted.
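The effect of the Upper bound on the data, and the resulting drop in variance, can be checked with a quick sketch (the assay values below are hypothetical):

```python
from statistics import pvariance

# Hypothetical AU_gpt composites; 7.8 sits above the chosen Upper bound of 6.
au_gpt = [0.3, 1.8, 7.8, 0.9, 6.0, 0.4]
upper_bound = 6.0

# Pre-transform clipping: values above the Upper bound are set to it.
clipped = [min(v, upper_bound) for v in au_gpt]

print(clipped)                                 # [0.3, 1.8, 6.0, 0.9, 6.0, 0.4]
print(pvariance(clipped) < pvariance(au_gpt))  # True: clipping reduces variance
```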
5. Click OK to process the changes.
As a first pass model, this isn’t a bad result. We will need to add trends to give the mineralisation more of a
defined shape, as well as limit the extents of the model to within geological domains (which we made as part of
the geological model earlier).



Copying a Numeric Model and Clipping to a Domain
The easiest way to clip a numeric model to a geological domain is to first create a copy of it. Once the
numeric model has been copied, we can alter some of the parameters in the new model and check the changes
against the original interpolant.
1. Right-click on the AU_gpt numeric model and select Copy.
2. Enter the name “AU_gpt clipped to Early Diorite” and click OK.
Now that we have a second numeric model, we can open it up and make appropriate changes to first clip the
data/surfaces to the Early Diorite and then update the parameters to suit the changed data. For example, the
data will be clipped to the Early Diorite lithology, which may require updating the Upper bound, Total Sill,
Nugget and Drift.

Bounding your numeric model to a geologically reasonable domain (lithological, structural, alteration, etc) is
vital to creating a reasonable volume result.

3. Collapse the original AU_gpt numeric model and expand the new numeric model in the project tree.
4. Right-click on the Boundary and select New Lateral Extent > From Surface:

5. Select the Early Diorite output volume:

6. Click OK.
7. Clear the scene.



8. Once the model has been processed, add it to the scene:

The model has changed in two different ways. First, the isosurfaces have been clipped to the Early Diorite
boundary and second, the data has been clipped to the Early Diorite boundary:

Now that the boundary has been changed, we need to edit the other interpolation parameters as discussed
above.
9. Double-click on the numeric model to open the Edit RBF Interpolant window.
In the Values tab, we can see that the Surface filter is set to Boundary. As the boundary is now set to the
Early Diorite volume, this means the Surface filter is already using the Early Diorite, so we don’t need to
change anything here.
We can also leave the Boundary tab and the Trend tab unchanged.
10. Click on the Interpolant tab to see if anything needs to be changed there.
11. If necessary, make appropriate changes to the Total Sill, Drift and Nugget.
12. Since the values used in the interpolation have changed, you may wish to revisit the Value Transform tab
to assess the clipping Upper bound.
13. The Outputs tab should be correct as it is, so we can click OK and let the model reprocess.
14. Clear the scene.



15. Add the model to the scene:

Adding a Structural Trend to a Numeric Model


Now that we have changed the parameters of the numeric model to give a reasonable first pass, the final step
is to apply a trend in a similar manner to applying a trend to intrusion surfaces. In the Fundamentals course and
the Wolf Pass Geological Modelling module, we added a Global Trend to an intrusion surface, but for this
model we would like to add more detail than that tool allows. When more detail is required, consider adding a
Structural Trend.

Creating a Structural Trend


To create a Structural Trend, we need to create meshes that represent the direction of the trend. We can use
as many meshes as required and apply different strengths and ranges to each. A common example of where
this might be useful is when dealing with multiple zones of high grade. This could be a syngenetic deposit with
an epigenetic zone formed by later weathering or it could be two separate intrusions, each with a different
plunge and strike.
There are many methods we can use to define meshes, but an easy option is by using polylines. Look at the
existing numeric model and pick out a few trends. Two possibilities are shown below. Note that it can be easier
to pick trends by hiding the initial isosurfaces and viewing the points that are being used to make the
isosurfaces. Once the points are in the scene, use the value filter in the properties panel to slowly hide the
lower grade points. As you do this, the trend often becomes clearer.

1. Create two or more meshes defining your trends.


In the example above, there is one named “Vertical Trend” and one named “Dipping Trend”. Both were made
using polylines.



2. Once the meshes have been created, right-click on the Structural Trends folder (in the Structural
Modelling folder) and select New Structural Trend.
3. Click Add and select your two meshes.
The three options at the top of the window are Non-decaying (default), Blending and Strongest Along
Inputs:
l Non-decaying assumes the strength of the trend won't decay away from the meshes
l Blending takes any one point in the model, then uses a combination of the multiple meshes to define the
direction and strength of the trend at that point
l Strongest Along Inputs takes any one point in the model, then uses the closest input at that point to
define the direction and strength of the trend at that point
For this example, the most reasonable type of trend to use is Strongest Along Inputs.
4. Select Strongest Along Inputs, and the Range option becomes available.
l The Range defines the perpendicular distance away from each mesh that the mesh has influence on
l The Strength defines the strength of the trend (a strength of "5" would be the equivalent of a 5:5:1 ratio in a
Global Trend)
l As we move further away from the mesh, the Strength decays until the Range is met (unless Non-
decaying is selected, in which case the strength does not decay away from the mesh)
5. Keep the Strength and Range at 5.0 and 100 for both meshes:

6. Click OK.
7. Click and drag the Structural Trend into the scene to view it.
The ellipsoids visually represent the direction of the trend. The larger the ellipsoid, the stronger the trend. You
can see that close to the meshes, the trend is stronger and as we move further away from the meshes, the
strength decays.
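The relationship between Strength and Range can be sketched as below. A linear decay from full strength at the mesh to 1.0 (isotropic) at the Range is assumed here purely for illustration; Leapfrog's actual decay function is not published:

```python
def trend_strength(distance, strength=5.0, trend_range=100.0, non_decaying=False):
    """Sketch of a structural trend's strength at a perpendicular
    distance from its mesh (assumed linear decay to isotropy)."""
    if non_decaying:
        return strength        # Non-decaying: full strength everywhere
    if distance >= trend_range:
        return 1.0             # beyond the Range: no anisotropy (1:1:1)
    frac = distance / trend_range
    return strength + (1.0 - strength) * frac

print(trend_strength(0.0))    # 5.0 at the mesh
print(trend_strength(100.0))  # 1.0 at the Range
```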

Adding a Structural Trend to a Numeric Model


Now that we have created the Structural Trend, the final step is to add it to the numeric model.
1. Copy the AU_gpt clipped to Early Diorite model and rename the new model “AU_gpt clipped to Early Diorite
Structural Trend”.
2. Double-click on the new numeric model and click on the Trend tab.
Now that there is a structural trend in the project, the Structural Trend check box has become active.



3. Select the check box, and make sure the correct structural trend is selected:

4. Click OK, and the model will re-run to include the Structural Trend.
When Leapfrog is processing models with Structural Trends, the processing time is slightly longer because
the calculations are more complex.

Adding a Contour Polyline to the Numeric Model


Sometimes manual interpretation is necessary to create the surfaces that you, as the geologist, believe exist.
Sometimes drillholes end in high grade and potential blowouts need to be mitigated. We can use contour
polylines to add this interpretation/limitation. To add a contour polyline, you need to decide what value the
polyline is, then draw the polyline either on a surface or on the slicer. This feature is most useful where there is
limited data; remember that if there is contradicting data (i.e. the drillholes contradict the contour polyline), the
surface produced will not look correct. By doing this, you are essentially adding assay values to your
interpolant, so use with care.
1. Draw a slice through your existing AU_gpt clipped to Early Diorite numeric model, cutting along hole
WP052.
In this example, we will use a contour polyline to mitigate a small blowout caused by a drillhole ending
prematurely:

2. Right-click on the Values under the interpolant and select New Contour Polyline.



3. Enter the value that you would like the contour polyline to represent.
In this case, an appropriate value is a low background value, 0.01.

4. Draw the contour polyline on the slicer, off the end of the drillhole, then click Save in the toolbar.

5. The model will be updated, and the surfaces will now represent the contour polyline:

Numeric Model Statistics


Numeric models have built-in statistics that give basic information regarding grades and volumes within each
shell.



1. Right-click on the AU_gpt clipped to Early Diorite model and select Statistics.

There are four columns: Interval, Interval Volume, Approx Mean Value and Units. The first three columns
are used to calculate the last column, then each row is added to give a total number of Units.
l The Interval column lists the value(s) of the volumes that are being calculated.
l The Interval Volume column lists the total volume contained within each Interval.
l The Approx Mean Value lists the mean value of the volume. In the example above, the intervals <0.5 and
>1.5 have Approx Mean Values of 0.5 and 1.5 respectively (as there is no further information available
higher or lower than these grades). The interval from 0.5 - 0.75 has an Approx Mean Value of 0.625, which
is halfway between the two grade shells. To gain accuracy, one method is to increase the number of grade
shells (by decreasing the spacing between shells).
l The Units column is calculated by multiplying the Interval Volume by the Approx Mean Value, which
gives a total number of units. Each row is added to give the total number of units.
There is a short description at the bottom of the window giving instructions for turning the total units into grams.
In this case, we can look at the average density of the Early Diorite lithology, multiply it by the total units, then
divide by 31.1 to give total ounces of Gold.
l Total Units is 120,986,158
l Density of Early Diorite is 3.06 (which can be found in the Histogram tab of the merged table)
l (120,986,158 x 3.06) / 31.1 = 11.9 million ounces of Gold within the Early Diorite.
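The ounce calculation above can be reproduced directly (illustrative; Units are grade x volume, and the tonnage step assumes density in t/m³):

```python
total_units = 120_986_158  # sum over rows of Interval Volume x Approx Mean Value
density = 3.06             # average Early Diorite density (from the merged table)
grams_per_ounce = 31.1     # troy ounce conversion used in the text

# volume (m3) x density (t/m3) = tonnes; x grade (g/t) = grams; / 31.1 = ounces
total_ounces = total_units * density / grams_per_ounce
print(round(total_ounces / 1e6, 1))  # 11.9 (million ounces)
```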
Note that we haven’t included a cut-off, so we are calculating the grade within the entire Early Diorite domain.
There are several things we could do to get a more constrained result, including limiting the model to within a
certain distance to the drilling (New Lateral Extent > From Distance Function), using an Indicator RBF
Interpolant as a boundary, and creating further refined geological domains (using refined models).



Session 4: Modelling Numeric Data as Categories
Contents
Defining Mineralised Zones 46
Building the Mineralised Zone Surfaces in a Geological Model 47
Building the Mineralised Zone Surfaces in a Refined Model 50

Goals
In this session we will:
l Define and convert numeric data to categories
l Build mineralised surfaces in a geological model using the new category data
l Build mineralised surfaces in a refined model
This session continues to use the Wolf Pass project.

Defining Mineralised Zones


The first step required in modelling mineralised zones is converting numeric data into category data.
1. Right-click on the assay table in the project tree and select New Column > Category From Numeric:

Note: a category can be made from any numeric data; this function is not exclusive to grade.
2. Select AU_gpt as the Base Column and name the new column:

3. Click OK.



4. Select cut-off grades for the different categories. Categories can be added or removed.

5. Click OK.
A new “Category” column has been created in the assay table, from which a Geological Model can be
created.
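The cut-off logic behind Category From Numeric can be mimicked in a few lines. The cut-off grades and category names below are assumptions for illustration, not values from the project:

```python
def au_category(au_gpt, cutoffs=((0.5, "Low"), (1.0, "Medium"))):
    """Map a numeric AU_gpt value to a category using ascending cut-offs.
    Values at or above the last cut-off fall into the top category."""
    for threshold, name in cutoffs:
        if au_gpt < threshold:
            return name
    return "High"

print([au_category(v) for v in [0.2, 0.7, 1.6]])  # ['Low', 'Medium', 'High']
```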

Before moving on we need to decide whether or not the mineralised zone(s) are to be constrained using a
previously modelled lithologic boundary. If rectangular model extents are sufficient, we can proceed with
building the mineralised zone surfaces in a geological model; if we want to use a previously modelled lithologic
solid, for example the Early Diorite, we can build our new surfaces within a refined model. We will do both.

Building the Mineralised Zone Surfaces in a Geological Model


1. Right-click on the Geological Models folder and select New Geological Model.
2. Select the newly created Au_gpt_Category column as the base lithology column using the drop-down.
3. Set the Model Extents to WP_assay.
4. Set the Surface resolution to 20.



5. Give the model a new name and click OK.

We will build the mineralised zone surfaces using the New Intrusion tool so that the surfaces can enclose
volumes and more easily avoid lower grade intervals.
6. Right-click on the Surface Chronology and select New Intrusion > From Base Lithology:

7. Start with the highest-grade interval as the “youngest” lithology.


The same rules apply when creating mineralised zones that apply when building lithologies; always ignore
“younger” intrusions:

8. To control the filtering of short segments, click on the Compositing tab in the New Intrusion window.



By default, when intrusions are built, segments shorter than ¼ of the surface resolution are filtered out
(ignored). Decide whether these default settings are appropriate and edit accordingly.

9. Click OK to generate the surface.


Surfaces built using this method can be manually manipulated with polylines and structural data, etc. as with
our other surfaces.
10. To ensure that the surfaces are snapping to drillholes, double-click on the Mineralised zones GM.
11. Under Surface Generation, Snap to data, select Drilling only.

12. Click OK.


13. Add the output volumes to the scene to evaluate.

During this exercise, it is important to keep resolution in mind. It is not possible to capture 0.5 m of high
grade using a surface with a resolution of 50. While as many contact points between zones as possible will be
honoured, it is likely some intervals will be missed due to surface geometry.



Building the Mineralised Zone Surfaces in a Refined Model
If we want to build our mineralised surfaces within a specific volume, we can use a new Refined Model.
However, because we have already refined our Wolf Pass GM, we have to make a copy first.
1. Expand the Refined Wolf Pass GM object in the project tree.
2. Right-click on the Wolf Pass GM and select Copy:

3. Give the model a new name or keep the default one and click OK.
The new copy will appear in the project tree:

Now we can start building our mineralised surfaces as a refinement to this model.
4. Right-click on the Geological Models folder and select New Refined Model.
The New Refined Model window will open.
5. Select the Wolf Pass GM copy and then the Early Diorite volume from the top two drop-down menus.
6. Select the base lithology column to be Au_gpt Category.
7. Set the Surface resolution to 20 and give your new refined model a meaningful name:

8. Click OK.



A new Refined WP Mineralised Zones model will appear in the project tree; the new refined model will contain
the original geological model that it was based on, as well as a new model structure built under the lithology of
interest:

9. In the Early Diorite section of the refined model, build the mineralised zone surfaces in the Surface
Chronology using the New Intrusion tool.
As with the earlier model, treat the higher-grade categories as “younger” lithologies than the lower grade
categories. Remember to check the Compositing tab to ensure that the short segments are being treated as
desired.
10. Activate the surfaces, add the Output Volumes to the scene and critique your results.
The results rendered using these two different mineralised surface building methods may overlap in places, but
they will generally not yield the same result. It is up to you, as the geologist, to determine which model provides
the most realistic distribution of grade.



Session 5: Indicator Interpolant Models
Contents
Modelling Approach 52
Make a Merged Table 52
View Distribution of Mineralisation with Box plots 53
Importing a Mesh to make a Structural Trend 54
Build an Indicator RBF Interpolant 55
Setting up the Initial Indicator Interpolant 55
Editing the Numeric Model 56
Copying and Comparing Models 62
Statistics 63
Exercise: Create an Indicator RBF Interpolant for Gold 63

Goals
In this session, we will:
l Review how to make a merged table
l Review box plots within the exploratory data analysis tools
l Set up a basic indicator interpolant model
l Refine the model parameters
l Add a structural trend to reflect the geometry of the higher-grade zone
l Examine the statistical outputs to validate the use of the model.
This session continues using the Wolf Pass project and the additional data is available either as an
attachment from Central or it will be provided by your instructor.

Modelling Approach
As with the other numerical models, there are several steps to take, often iteratively, to build a sensible, fit-for-purpose model. While the Wolf Pass gold mineralisation is reasonably well controlled by the grouped lithologies
(as we have been using in the earlier sessions), if we did not have the groups, it would be more difficult to
determine what might be controlling the metal distribution. We will use the ungrouped lithologies and the
copper assays for this session. Let's look at the copper distribution before starting to build our indicator
interpolant.

Make a Merged Table


A good way to find out what lithologies are important to the mineralisation is by creating a merged table
between the geological logging and the assay data (or other property of interest).
1. Right-click on the Drillholes object in the project tree and click New Merged Table:



2. Select the copper assays and the ROCK columns:

3. Rename the new table and click OK.

View Distribution of Mineralisation with Box plots


To view how copper is distributed across the ROCK codes, we will look at the box plot of the CU_pct values
displayed by the ROCK category.
1. Right-click on the copper and ROCK merged table and select Statistics.
2. Select Box Plot.
3. Change the Numeric column to CU_pct.
4. Set the Category to ROCK.
5. Select all Categories:



The resulting plot is difficult to interpret. Which units have most of the grade?
6. Close the Box Plot.
7. Add the copper and ROCK table and the topography to the scene.
8. Display the drillholes by the CU_pct values and rotate the scene.
It’s possible that the copper distribution is being controlled by a structure, something aligned approximately
parallel to the ridge. We could try using one of the structural trends that we created earlier. However, for this
exercise we will import an additional mesh that represents the ridge shear structure.

Importing a Mesh to make a Structural Trend


Structural trends can be made from any mesh or solid within the project. We will import a mesh that represents
the Wolf Pass central shear zone.
1. Right-click on the Meshes folder in the project tree and select Import Mesh.
2. Navigate to the folder for this session and select the Central_Shear (On Topo_Wolfpass).
3. Click Open and then OK, keeping the default options enabled to import.
4. Clear the scene.
5. Add the new mesh to the scene.
6. Right-click on the Structural Trends folder and select New Structural Trend:

7. In the Structural Trend window, click the Add button and select the mesh we just imported.
8. Select Non-decaying.
9. Rename the Structural Trend “Central Shear” so that it can be distinguished from the one you made earlier.

10. Click OK.


We are now ready to start making our new indicator interpolant.

For more information about where, when and how to use structural trends, see the Structural Trends topic in
the online help, or review this blog article.



Build an Indicator RBF Interpolant
The indicator interpolant is a useful tool for understanding the distribution of a variable (e.g. grade), when we
don't have a clear understanding of the geological parameters driving it.
An indicator RBF interpolant allows you to specify a cut-off grade; it then assigns either a "0" (for grades below
the cut-off) or a "1" (for grades above the cut-off) to each sample point within the interpolant model boundary.
Based on the user-defined Iso value, which must be between 0 and 1, an isosurface is created
around the "1" values, generating Inside and Outside volumes.
Small, uneconomic volumes generated by this process can be automatically removed using a volume filter, and
a Statistics tab provides detailed statistics on how the model overlaps with other data in the project.
Once the volume has been created, it can be used as a boundary or domain within which further processing can
be carried out.
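The 0/1 coding described above can be sketched in a few lines. This is an illustrative reconstruction only — the sample values and cut-off are invented, not project data:

```python
# Hypothetical sketch of the indicator transform: code each sample as
# 1 if it is at or above the cut-off, else 0. Values are illustrative.
cu_pct = [0.05, 0.12, 0.31, 0.85, 1.40, 0.22, 0.47]
cutoff = 0.3  # the 0.3 %Cu cut-off used later in this session

# The RBF then interpolates these 0/1 values continuously in space,
# and the iso value picks a shell through that continuous field.
indicators = [1 if v >= cutoff else 0 for v in cu_pct]
print(indicators)  # [0, 0, 1, 1, 1, 0, 1]
```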

In addition to the use cases mentioned in the introduction, the indicator RBF interpolant can be a very useful
tool to temper the effects of extreme high-grade samples combined with sparse drilling. Because all of the
samples are transformed into 0s or 1s, the resulting solid tends to be more conservative,
with fewer 'blow-outs'.

In practice, creating indicator interpolants is very similar to creating regular interpolants.

Setting up the Initial Indicator Interpolant


In this case, we will define our cut-off at a value of 0.3% Cu.
1. Right-click on the Numeric Models folder and select New Indicator RBF Interpolant.
2. Change the Numeric Values option to CU_pct from the copper and ROCK table.
3. Change the Cut-off to 0.3.
The cut-off value here is the grade or physical property threshold - in the units specified in your drillhole tables -
for determining how the 0s and 1s will be assigned.
Generally, when trying to delimit a resource the marginal cut-off will be used, so in this case 0.3% copper is an
appropriate value.
4. Untick the checkbox for the Surface Filter.
The Surface Filter is enabled by default, which filters the data used to only the values within the Interpolant
Boundary or another boundary within the project. By disabling the Surface Filter, all available data can be
used to generate the interpolant. In this case, it doesn’t matter whether the Surface Filter is selected or not
because the Interpolant Boundary that we will be using corresponds to the project boundary, thereby using all
the data. However, if you wanted to simulate a domain with a “soft boundary”, the Surface Filter could be
utilised to customise the data being used.
5. In the Compositing section of the New Indicator RBF Interpolant window, select Composite in >
Entire drillhole.
As most samples are 2 m in length, we can choose a composite length that is a multiple of 2 in order to
minimise the number of split intervals.
6. Set the Compositing length to 4.

In practice, it can also be useful to consider the proposed selective mining unit (SMU) or block size when
calculating composites.
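Length-weighted compositing of the 2 m samples into 4 m runs can be sketched as follows. This is a minimal illustration assuming contiguous intervals, not Leapfrog's actual compositing implementation:

```python
# Minimal sketch of length-weighted compositing into 4 m runs.
# Interval data are illustrative; gaps and split ends are not handled.
intervals = [  # (from_m, to_m, cu_pct)
    (0.0, 2.0, 0.10), (2.0, 4.0, 0.30), (4.0, 6.0, 0.50), (6.0, 8.0, 0.70),
]
composite_len = 4.0

composites = []
start = intervals[0][0]
while start < intervals[-1][1]:
    end = start + composite_len
    total = weighted = 0.0
    for f, t, g in intervals:
        overlap = max(0.0, min(t, end) - max(f, start))
        total += overlap
        weighted += overlap * g  # weight each grade by overlap length
    composites.append((start, end, weighted / total))
    start = end

for s, e, g in composites:
    print(f"{s:.0f}-{e:.0f} m: {g:.2f} %Cu")
```

Because the sample length (2 m) divides the composite length (4 m) evenly, no intervals are split — which is exactly why a multiple of the dominant sample length is recommended above.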

7. For end lengths less than 1 m, select Distribute equally.


8. Under the Interpolant Boundary section, select the Existing model boundary or volume option.
9. Click to select the existing geological model boundary.



10. Drop the Surface resolution to 8.
11. Change the Iso value to 0.3 (the value must be between 0 and 1).
We will discuss the meaning of the Iso value in more detail shortly.

12. Click OK, and the initial interpolant will be processed.


13. Clear the scene.
14. Add the initial model to the scene:

Editing the Numeric Model


Similar to the RBF interpolant, we will need to make appropriate adjustments to the default Indicator
RBF Interpolant to make it more meaningful. This time, we already know most of the rules of thumb, so we
can work through them quickly. In addition to those parameter modifications, we will also add the structural
trend we just created.
Once we have adjusted the appropriate parameters of our model, we will copy the model to create a couple
more for comparison purposes.
15. Double-click on the indicator interpolant to open the Edit Indicator RBF Interpolant window.
The Values tab can be left unchanged, as can the Compositing tab as we previously set these parameters.
The Boundary and Cut-off tabs can be left as they are as we have already made the appropriate changes.

Reviewing the Indicator Values


16. Click on the Cut-off tab.



Some useful information can be found in this tab regarding sample distribution around the cut-off value:

The cut-off value, in this case 0.3% Cu, is displayed on the histogram with the purple line. The samples below
cut-off, those coded as 0s, amount to 20% of the total sample count and have a mean grade of 0.18% Cu. This
leaves 80% of our samples coded as 1s, and they have a mean grade of 1.14% Cu.
We can visualise this division of samples in the scene, displayed by whether the sample grade is above the
cut-off value of 0.3% Cu (represented as a '1') or below the cut-off value (represented as a '0').
17. Make the model volumes invisible and drag the CU_pct Indicator 0.3 values into the scene, from the project
tree.
18. Display the sample point values by CU_pct Indicator 0.3 values.

We can also visualise how this sample division relates to modelled Inside and Outside volumes of our indicator
interpolant.
19. Display the sample point values by Sample grouping.



The sample grouping categorisation being shown in the scene is a combination of two factors: whether a given
sample point value is above or below the cut-off value (0.3% Cu), together with the modelled volume, Inside or
Outside, that the given sample falls in.
The red and blue points represent samples whose cut-off status is consistent with their modelled status,
whereas the green and orange points have a cut-off status at odds with the modelled volumes (i.e.
the sample value is below cut-off but sits in the Inside volume, or vice versa).

It is important to note that this sample grouping categorisation is a result of the modelled volumes (Inside or
Outside) being back-flagged on to the samples. Given that the modelled representation of the iso value shell
is dependent on the surface resolution, the distribution of these sample groupings will shift slightly if/when
the model's resolution is changed.
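The four-way grouping described above — indicator status crossed with the back-flagged volume — can be sketched like this. Names, colours and sample data are illustrative assumptions, not the product's internal labels:

```python
# Sketch of the sample grouping: cut-off status (above/below) crossed
# with the modelled volume (Inside/Outside) the sample falls in.
cutoff = 0.3
samples = [  # (cu_pct, back-flagged volume) - illustrative values
    (0.85, "Inside"), (0.12, "Inside"),
    (0.05, "Outside"), (0.47, "Outside"),
]

def grouping(grade, volume):
    above = grade >= cutoff
    if above and volume == "Inside":
        return "above cut-off, Inside"    # consistent
    if not above and volume == "Outside":
        return "below cut-off, Outside"   # consistent
    if not above and volume == "Inside":
        return "below cut-off, Inside"    # at odds: dilution
    return "above cut-off, Outside"       # at odds: missed grade

for g, v in samples:
    print(g, "->", grouping(g, v))
```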

Now that we are familiar with the samples and the sample distribution, let's take a look at editing the interpolant
model.

Setting a Preliminary Global Trend


20. Click on the Trend tab.
In the Trend tab, we want to add our Central Shear structural trend; however, we will use a global trend as a
tool in the interim to help calculate appropriate interpolant parameters.



21. Select Global Trend and set the ellipsoid ratios to 5, 5, 1:

Note: we are using a 5, 5, 1 ellipsoid ratio as that is the set ratio of the anisotropy applied by the Structural
Trend.

Editing the Interpolant Parameters


22. Click over to the Interpolant tab.
23. There are several changes we need to make in the Interpolant tab, following the same rules of thumb as
earlier:
l Change the Interpolant to be Spheroidal
l Change the Total Sill to equal the variance (~0.16 in this example)
l Change the Base Range to be 2.0 - 2.5x the drillhole spacing (~150 m x 2 = 300 in this example). We will
come back to this.
l Change the Drift to None
l Keep the Alpha at 3
l Change the Nugget to 15% of the total Sill (0.024 in this example)



The Edit Indicator RBF Interpolant window should look like this:

Notice that adjacent to the Base Range, there are Max, Int and Min values. These values reflect the 5, 5, 1
global trend ellipsoid values that we set earlier. We would like the Base Range in our direction(s) of greatest
continuity to be 2.0 - 2.5x the drillhole spacing, so we need to change the value from 300 to something that
brings the maximum and intermediate values into the ~300 range.
24. Enter a value of 175 into the Base Range.
Notice that the Max and Int values are now ~300:

Now that we have set the Base Range to a value that reflects the 5, 5, 1 ratio that the Structural Trend will
impart, we can switch the structural input from the global trend to the Central Shear structural trend.
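One way to reconcile a base range of 175 with Max/Int values of ~300 is to assume the software scales the base range by each ellipsoid ratio normalised by the geometric mean of the three ratios. This is an assumed reconstruction that happens to reproduce the numbers in the text, not Seequent's documented formula:

```python
# ASSUMED reconstruction of how Base Range 175 with a 5, 5, 1 trend
# gives Max/Int of ~300: scale each ratio by the geometric mean of
# all three. Purely illustrative; check the Leapfrog help for the
# actual convention.
ratios = (5.0, 5.0, 1.0)
base_range = 175.0

geo_mean = (ratios[0] * ratios[1] * ratios[2]) ** (1.0 / 3.0)  # ~2.92
max_r, int_r, min_r = (base_range * r / geo_mean for r in ratios)
print(round(max_r), round(int_r), round(min_r))  # 299 299 60
```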

Applying the Structural Trend


25. Click on the Trend tab.



26. Click on the radio button to select Structural Trend and then select Central Shear from the dropdown list:

27. Finally, click on the Volumes tab and enable the Volume filtering option.
28. Change the Discard volume parts smaller than option to 100,000 units cubed.
This will remove all volumes smaller than 100,000 units cubed.
29. Click OK.
30. Drag the indicator interpolant into the scene to view it.

Now that we have a more meaningful model with appropriate parameters, we can more effectively discuss
what our indicator interpolant represents and how to interpret its results.



Copying and Comparing Models
31. Right-click the CU_pct Indicator 0.3 indicator interpolant model and copy it twice.
32. Name one copy CU_pct Indicator 0.5 and the other CU_pct Indicator 0.7.
33. Double-click the CU_pct Indicator 0.5 model to edit it; click on the Volumes tab.
34. Change the Iso value to 0.5.
35. Repeat for the CU_pct Indicator 0.7 model; change the Iso value to 0.7.
36. View the 3 isosurfaces (0.3, 0.5 and 0.7) in the scene.

What does the Iso value represent?


Iso values for indicator RBF interpolants are probability values that are restricted to between 0 and 1.
An iso value shell of 0.3 represents the volume for which:
l There is a ≥ 0.3 (=30%) probability that any point (of composite support) inside the resulting modelled
volume will be ≥ the specified cut-off value.
l There is a > 0.7 (=70%) probability that any point (of composite support) outside of the resulting
modelled volume will be < the specified cut-off value.
An iso value shell of 0.5 represents the volume for which:
l There is a ≥ 0.5 (=50%) probability that any point (of composite support) inside the resulting modelled
volume will be ≥ the specified cut-off value.
l There is a > 0.5 (=50%) probability that any point (of composite support) outside of the resulting
modelled volume will be < the specified cut-off value.
An iso value shell of 0.7 represents the volume for which:
l There is a ≥ 0.7 (=70%) probability that any point (of composite support) inside the resulting modelled
volume will be ≥ the specified cut-off value.



l There is a > 0.3 (=30%) probability that any point (of composite support) outside of the resulting
modelled volume will be < the specified cut-off value.

A low iso value (0.1-0.3) will create a more “inflated” shell, whereas a higher iso value (0.7-0.9) will create a
tighter, more restricted shell.
A low iso value can be used to prioritise the tonnage included in the volume, whereas a high iso value
will prioritise metal (or other physical properties).
The choices made here should generally reflect the proposed mining method - bulk mining methods will be
more suited to tonnage prioritisation, while selective mining methods are more suited to prioritising grade.
Remember: The interpolant function is continuous and can be evaluated at any point. The iso-surface shell
represents this underlying continuous function, but is dependent on the surfacing parameters chosen, e.g.
changing the surface resolution alters the resulting model volumes (Inside and Outside), which, by
extension, affects the sample grouping categorisation of the points (at composite support).

Statistics
It is easy to review the statistics of your indicator interpolants.
37. Right-click on the original CU_pct Indicator 0.3 indicator interpolant model and click Statistics.
Statistics relating to the Indicator are listed in the window and can be copied to the clipboard for importing into
Microsoft Excel or a similar package:

From here, we have a full summary of the number of samples above and below cut-off, those above cut-off that
are outside the modelled isosurface, and those below cut-off that are inside the modelled isosurface.
Using these values, we can calculate discard and dilution factors. For example, there are 324 samples below
cut-off that are included in the Inside volume, so our dilution factor = 324/(3208+324)=0.09, or 9%.
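The dilution calculation above, plus the analogous discard factor, can be sketched as follows. The 324 and 3208 counts come from the text; the above-cut-off-outside count is an assumed placeholder for illustration:

```python
# Dilution and discard factors from the Statistics tab counts.
below_inside = 324    # below cut-off but inside the modelled shell
above_inside = 3208   # above cut-off and inside the shell
above_outside = 150   # ASSUMED count, for illustration only

# Dilution: share of in-shell samples that are below cut-off.
dilution = below_inside / (above_inside + below_inside)
# Discard: share of above-cut-off samples left outside the shell.
discard = above_outside / (above_inside + above_outside)

print(f"dilution = {dilution:.2f}")  # dilution = 0.09
print(f"discard  = {discard:.2f}")
```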

Exercise: Create an Indicator RBF Interpolant for Gold


Use the rules of thumb discussed above to create an Indicator RBF Interpolant model for gold.



Session 6: Block Models
Contents
Creating a Block Model 65
Creating a Sub-Blocked Model 66
Evaluating Onto the Block Model 66
Viewing Block Model Evaluations 67
Viewing in the Scene 69
Summary Statistics 70
Calculating Tonnage 71
Reporting 72
Exporting a Block Model 72
Importing a Block Model 75
Introduction to Leapfrog Edge 77

Goals
In this session we will create, visualise, import, evaluate and export block models in Leapfrog Geo.
This session continues using the Wolf Pass project and the additional data is available either as an
attachment from Central or it will be provided by your instructor.

Creating a Block Model


Leapfrog can create large block models onto which numeric models can be evaluated. The block model is made
up of regular-sized parent blocks that can also be sub-blocked where required in order to better represent solid
volumes. Both regular and sub-blocked models can be rotated around a vertical axis to better align with the
strike of the deposit. Sub-blocked models can also be tilted so that the blocks can be aligned with the dip. This
option is not available for regular models. A sub-blocked model behaves the same way as a regular block model
until the defined sub-blocking triggers are activated. In this session we will create a block model for the Wolf
Pass dataset and evaluate the geological model and the Au numeric models onto the blocks.
1. Right-click on the Block Models folder in the project tree and select New Block Model:

The New Block Model window will appear, which shows options for defining the grid.
2. Set the block size to 20, 20, 10 (x, y, z).
It’s possible to adjust the reference centroid and rotate the grid about the vertical axis by changing the azimuth or
rotating the handles in the scene.
3. Under the Enclose Object dropdown, select the WP_lith table.



4. Click OK, then drag the block model into the scene.

The block model is currently empty. The next section goes through the steps required to evaluate the geological
and numerical models against the blocks.

Creating a Sub-Blocked Model


Creating a sub-blocked model is very similar to creating a regular block model. We will create this sub-blocked
model to demonstrate its functionality, but we will proceed through the remainder of the session with the regular
block model we just made.
1. Clear the scene.
2. Right-click on the Block Models folder in the project tree and select New Sub-blocked Model.
The New Sub-blocked Model window will appear.
3. Define the Parent block size as 20x20x10.
With a sub-blocked model, you can also define the sub-block count, if required. Note that the number entered
here is the sub-block count, not the sub-block size: for example, a sub-block count of 4 in X and Y with a
parent block size of 20 gives a sub-block size of 5. There is also the option to have a fixed or variable height
for Z.
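The count-to-size relationship is simply a division, as a quick check:

```python
# Sub-block size from the parent block size and sub-block count,
# matching the example above: parent 20, count 4 -> 5 m sub-blocks.
parent_size_xy = 20.0
sub_block_count = 4

sub_block_size = parent_size_xy / sub_block_count
print(sub_block_size)  # 5.0
```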
In addition to being able to rotate around azimuth, sub-blocked models can also be rotated around dip. If you’re
sub-blocking based on a unit of interest, such as a vein, the dip and azimuth angles can be set from a plane
aligned with the unit of interest. In this example, we will use the Dacite dykes to act as vein proxies.
4. Drag the Dacite volume into the scene.
5. Draw a new plane that roughly aligns with the general trend of the Dacite.
6. In the New Sub-blocked Model window, click Set Angle From and select Moving Plane.
7. Once the angle is set, click the Enclose Objects dropdown, and select WP_lith.
Now that the blocks and extents are set, we can assign the Sub-blocking Triggers, which can be any surface
or model in the project.
8. Click on the Sub-blocking Triggers tab.
9. Select the Wolf Pass model and click OK.

Evaluating Onto the Block Model


We can evaluate existing models onto the block model using a centroid evaluation, which assigns a value
based on the domain that the centroid falls in, or on the value at the centroid if an interpolant function is being
used.
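Centroid evaluation of a category model can be sketched as a point-in-volume test. The axis-aligned box domains here are purely illustrative stand-ins for Leapfrog's modelled volumes:

```python
# Sketch of centroid evaluation: each block takes the category of the
# domain its centroid falls in. Domains are illustrative boxes only;
# in Leapfrog they are the modelled (triangulated) volumes.
domains = {
    "Early Diorite": ((0, 0, 0), (100, 100, 50)),    # (min, max) corners
    "Dacite":        ((100, 0, 0), (200, 100, 50)),
}

def evaluate_centroid(centroid):
    for name, (lo, hi) in domains.items():
        if all(l <= c < h for c, l, h in zip(centroid, lo, hi)):
            return name
    return "Outside"  # centroid falls in no modelled volume

print(evaluate_centroid((50, 50, 25)))   # Early Diorite
print(evaluate_centroid((150, 50, 25)))  # Dacite
print(evaluate_centroid((250, 50, 25)))  # Outside
```

For a numeric model, the same idea applies, but the block takes the interpolant's value evaluated at the centroid rather than a category.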
1. Double-click on the regular block model in the project tree and select the Evaluations tab.



2. Move all the models into the box on the right.

3. Click OK.
If you are evaluating onto a sub-blocked model, you will also have the choice of evaluating onto the sub-block
centroid or the parent block centroid. By default, categorical models evaluate onto the sub-block centroid and
numerical models evaluate onto the parent block centroid.

Viewing Block Model Evaluations


Once evaluations are complete, there are a few parameters that you can visualise on the block model:
l The evaluated grade value from the numeric model
l The block status: normal, blank, without-grade, outside, error
1. Drag the block model into the scene once it finishes processing.



2. You can change which evaluation you are visualising by using the display list drop down box in the shape
list:

For numeric models you can use a value filter, and for category models you can turn different categories on and
off by selecting Edit Colours.
3. Click on the display list once again and select the model’s Status field:

4. Click on the Show legend button ( ) for the block model.


5. Click Edit Colours to review the status of the blocks.



6. Hide the blocks classified as Outside:

All the blocks for this model are classified as either Outside or Normal.
7. Click OK to close the Legend for Status window.

If you evaluate a model that has a limited lateral extent onto the block model, blocks that fall outside this
extent will have no assigned value.

Viewing in the Scene


Once a block model is created, certain options for viewing it in the scene are available in the shape list and
properties panel:

In addition to the standard Edit Colours, transparency slider and show legend in scene options, there is also
the ability to Show edges ( ) and Show inactive blocks( ).
In addition to the standard Slice mode and Query filter options in the properties panel, there is also the option
to display block models filtered by X, Y and/or Z indices. There are three Index Filter options, None, Subset
and Sliced, and you can choose to apply them to the X, Y and/or Z axes. To filter on a specific axis, make sure
its check box is ticked.
For Subset, the grid is filtered to show the union of the selected X, Y and Z ranges.
For Sliced, the grid is filtered to show the intersection of the selected X, Y and Z ranges.



To adjust the range, drag the white handles left and right. Double-clicking on the slider alternates between
displaying a single value and displaying the full range. Move the slider to move the range.
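The union-versus-intersection distinction between Subset and Sliced can be sketched on a small grid of block indices. Index ranges here are illustrative:

```python
# Sketch of the Subset vs Sliced index filters: Subset keeps blocks in
# the UNION of the selected index ranges, Sliced keeps blocks in their
# INTERSECTION. A 4x4 grid of (x, y) indices is used for illustration.
blocks = [(i, j) for i in range(4) for j in range(4)]
x_range, y_range = range(1, 3), range(2, 4)  # selected index ranges

subset = [b for b in blocks if b[0] in x_range or b[1] in y_range]
sliced = [b for b in blocks if b[0] in x_range and b[1] in y_range]

print(len(subset), len(sliced))  # 12 4
```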

Summary Statistics
Summary statistics will provide an overall report of the volume of blocks evaluated, as well as the average
grade and other associated statistics within those blocks.
1. To open the Statistics tab, right-click on the block model, select Statistics.
2. Select the Table of Statistics option.
In the Statistics tab, you can view a basic summary and compare different domains with the interpolated
numeric value.
3. Under Categories, click Add and use the dropdown list to select the Refined Wolf Pass GM: Wolf Pass GM
column.
4. Check the box for AU_gpt in the Numeric items list.
5. Tick the Hide empty categories and Hide inactive rows boxes.



6. To change the display organisation, choose the Group by numeric item radio button.

Calculating Tonnage
In addition to the statistics automatically displayed, we also have the option to calculate tonnage and display it
in the table. To calculate tonnage, we need to provide a density. The density can either be a constant value, or
based on a density interpolant. Given that we don’t have density information for each unit or the ability to
separately specify the density for each unit, it would make the most sense to provide a density and calculate
the tonnage for only our main unit of interest, the Early Diorite, as opposed to all of the units together.
7. In the Table of Statistics, change the Numeric items display to show only the Au_gpt clipped to Early
Diorite one.

8. Click the Edit Columns option in the Table of Statistics toolbar.



9. Click the visibility icon ( ) to display Tonnage:

10. Close the Edit Columns window.


11. Set the Density to Constant and change the value to 2.9.
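The tonnage figure the table produces is block volume × block count × density. The Early Diorite block count below is an assumed placeholder for illustration; the 20 × 20 × 10 block size and 2.9 density come from this session:

```python
# Sketch of the tonnage calculation behind the Table of Statistics.
block_volume_m3 = 20 * 20 * 10       # 4000 m^3 per parent block
early_diorite_blocks = 1250          # ASSUMED count, illustration only
density_t_per_m3 = 2.9               # constant density set in step 11

tonnage = block_volume_m3 * early_diorite_blocks * density_t_per_m3
print(f"{tonnage:,.0f} t")  # 14,500,000 t
```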

Reporting
You can copy the results by clicking on the Copy ( ) dropdown at the top of the window and choosing either
Select All or Copy Selected Rows. This copies the results onto the clipboard, and you can then paste the
data into a spreadsheet for subsequent calculations. The displayed summary statistics can also be exported to
CSV by clicking the Export ( ) button.
In addition to the table of statistics that we just reviewed, scatter plots, box plots and Q-Q plots are available on
the block models.

Exporting a Block Model


Block models can be exported in the following formats, each with a number of different options:
l CSV Block Model Files (*.csv)
l Datamine Block Model Files (*.dm)
l Isatis Block Model Files (*.asc; not currently available for sub-blocked models)
l Surpac Block Model Files (*.mdl; not currently available for sub-blocked models)
In this session we will export our block model as a CSV to review the steps in the export wizard.
1. Right-click on your block model and select Export.
2. For this session, leave the Save as type as CSV Block Model Files (*.csv) and click Save.



This will initiate the export wizard options.
3. In the first window, the different CSV export types are described:

4. Select the CSV Block Model option and click Next.


5. Select the evaluations that you wish to export:

6. Click Next.
There is also an option to omit certain rows depending on the block status.
7. Leave the Omit rows defaults.

8. Click Next.
When exporting block models as CSVs, there is the option to set the Numeric Precision; in this case, we will
leave the defaults.



9. Click Next.
When a block status is not Normal, the status code can be represented in the exported file using a custom text
sequence; leave the defaults.

10. Click Next.


When exporting block models as CSVs, there is the option to set the character set format used in the exported
file. The selection you make depends on the target for your exported file. If any changes will be made, they will
be displayed in the window.
11. Click between the different options to see what will change.

12. Select your desired option and click Next.


Lastly, the export wizard will display a summary of the options selected. You can easily go back and make any
necessary changes. Once you have exported a block model, the settings made will persist so that the same
settings can easily be reused when the block model is next exported.
13. Click Export.



The export wizard varies slightly between the different export file types. For more information regarding the
exporting options for each file type, see the Block Model Export topic in the online help.

Importing a Block Model


Leapfrog can import regular block models. Sub-blocked models are not yet supported by the importer. There are
two stages to importing a block model: first, we define the columns with the location, category and
numeric information; second, we define the block model grid itself. The following steps go through the process
required to import a block model.
Note: For this session, we will import a block model from a different project, just as an example. We can import
it into the same project we’re using now but it will have a different co-ordinate system, so it cannot be viewed at
the same time as any objects in the Wolf Pass project.
1. Right-click on the Block Models folder and select Import Block Model.
2. Navigate to the folder for this session and select the Leda Block Model.csv file.

3. Select the appropriate x, y and z columns to define the location of the block, if not already selected.



4. Multi-select the category columns associated with the block model, then right-click the dropdown to code all
selected rows.

5. Multi-select the numeric columns associated with the block model, then right-click the dropdown to code all
selected rows.

Note: you can use the Treat as Blank column to dictate the value used for inactive cells. Any blocks with
this value/text will be imported with a Blank Status.

6. Once the desired columns have been selected, click Next.


Now we need to define the block model. This can be done in a few ways, depending on the information which
was exported with your block model. Block models exported from Leapfrog contain all the possible options for
defining the grid.



If the grid has been correctly defined, the message highlighted at the bottom of the window will read “All the
centroids in the file match the grid you've defined”.

7. Click Finish.
The model will be imported into Leapfrog.
8. Drag the block model into the scene to view it.

Leapfrog can import regular block models from many different software packages. Leapfrog's excellent
graphical capabilities make it easy to visualise large block models.
Tips for importing block models:
l The azimuth Leapfrog uses to rotate the model may be different from what your other software reports. If
the model isn't matching up as expected, try adding or subtracting multiples of 90 to or from the given azimuth
when entering it in Leapfrog.
l If centroids aren't matching, and it's not clear why, import the block model into the Points folder to
determine where it sits in 3D space. This is very helpful for defining the grid during block model import.
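A quick way to test whether centroids match a grid definition is to check that each one sits at origin + (index + 0.5) × block size on every axis. This sketch assumes an unrotated grid; the origin and block size are illustrative:

```python
# Sketch of a centroid-vs-grid check for an unrotated block model.
# Origin and block size are ASSUMED values for illustration.
origin = (1000.0, 2000.0, 100.0)  # grid minimum corner
size = (20.0, 20.0, 10.0)         # block size in x, y, z

def on_grid(centroid, tol=1e-6):
    # Centroids lie at origin + (index + 0.5) * size on each axis.
    for c, o, s in zip(centroid, origin, size):
        steps = (c - o) / s - 0.5
        if abs(steps - round(steps)) > tol:
            return False
    return True

print(on_grid((1010.0, 2010.0, 105.0)))  # True  (first block centroid)
print(on_grid((1015.0, 2010.0, 105.0)))  # False (off-grid in x)
```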

Introduction to Leapfrog Edge


During this session, we have seen the functionality available for block models in Leapfrog Geo. Take your
block model analysis, validation and reporting even further with Leapfrog Edge. Leapfrog Edge contains
purpose-built tools for creating calculated columns based on block model values, simple workflows for
resource categorisation, advanced query filtering options, swath plots, grade-tonnage curves, statistical tables
and more. Ask your course instructor or contact us to learn more about Leapfrog Edge.


