Numeric Modelling With The Wolf Pass Project
For Leapfrog Geo Version 5.1
© 2020 Seequent Limited (“Seequent”). All rights reserved. Unauthorised use, reproduction, or disclosure is
prohibited. Seequent assumes no responsibility for errors or omissions in this document. LEAPFROG and
SEEQUENT are trade marks owned by Seequent. All other product and company names are trade marks or registered
trade marks of their respective holders. Use of these trade marks in this document does not imply any ownership of
these trade marks or any affiliation or endorsement by the holders of these trade marks.
Session 1: Introduction to Numerical Modelling
Contents
Introducing the Project
Modelling Approach
Iterative Refinement
Choosing a Modelling Tool
Goals
For this series of sessions, we will begin with a pre-built Wolf Pass geological model. If you are new to the
project, we will take a few moments to get acquainted with the lithologies and modelling codes before
exploring the data in more depth using Leapfrog Geo's exploratory data analysis tools. Then we'll build
interpolants, demonstrate how to model numeric data as categories, create indicator interpolants and build
block models.
By the end of these sessions you will know how to:
• Download a local copy of the Wolf Pass project from Central
• Evaluate a model back onto your drillhole table
• Explore relationships within your data using statistics and graphs
• Build an RBF interpolant
• Model numeric data as categories
• Build an Indicator RBF Interpolant
• Create block models
• Import and export block models
The data for this session can be found on the Central Training server for your region or it will be provided by
your instructor. Your instructor will lead you through the steps to enable Central integration with Leapfrog
Geo, add the server and download a local copy of the project.
Iterative Refinement
Building a numeric model is a process of successive refinement. This involves:
• Defining the numeric model and basic structures. This usually corresponds to defining the topography and boundaries.
• Refining the internal structure. This involves setting the proper trends and making manual corrections to the point and value data until the resulting surfaces are geologically realistic.
RBF Interpolants
The typical way to create these boundaries is by building grade shells using the RBF Interpolant tool.
Interpolated grade shells are built by using the known drillhole or point data to interpolate values infinitely
across the boundary extents; isosurfaces (grade shells) are then created to link up identical values. Grade
shells created by interpolation cannot be snapped to contact points on drillholes. These interpolated grade
shells are ideal for exploration drillhole targeting where there is a good understanding of what is controlling mineralisation, and they work especially well when the deposit is densely drilled.
In general, numeric models can be created in Leapfrog Geo using assay data, temperature values, geophysical
data or any other numeric data that is sparsely distributed in space to interpolate across a region.
Interpolation in Leapfrog is fast and flexible, and the shells produced to represent mineralisation are smoother
and more reproducible than traditional hand-drawn meshes.
Goals
In this session, we will cover:
• Displaying numeric data
• Creating a new evaluation / backflagged table
• Creating a new merged table
• Creating drillhole queries
• Using the statistics and graphs available in Leapfrog Geo
This session continues to use the Wolf Pass project.
Colourmaps
With numeric data, you have the option of Continuous or Discrete colourmaps. While a continuous colourmap
is the default, in this session, we will focus on creating a discrete colourmap.
1. Clear the scene, add the WP_assay table to the scene and select the Au column.
2. In the shape list, click on the Au colourmap and select New Colourmap.
For a description of the different Interval Modes, see the Colourmaps topic in the online help.
5. Click Apply to view the results of the changes you have made and Close the interval dialog.
6. When satisfied with your colourmap, click Close.
You only have to set up this colour scheme once per column and you can then export and share it between
projects.
In this project, we also have Cu values; we can display the Au values in the scene, scaled by the Cu values.
4. Using the Radius values dropdown in the properties panel, select Cu_pct.
Downhole Graphs
In addition to viewing numeric data by numeric-scaled cylinder radius, numeric data can also be viewed as a
downhole graph alongside categorical data, allowing you to view two columns of downhole data
simultaneously. This visualisation capability can be helpful for familiarising yourself with your data, drawing
correlations between different datasets (e.g. rock type and grade), and assisting with modelling interpretations.
For this exercise, we will view the lithology data together with the Au assay data.
1. Clear the scene then drag the WP_lith table into the scene.
2. Double-click on the Drillholes Graphs object in the project tree.
If you have downhole points in your project, they can also be displayed alongside the drill trace; LAS point data, for example, can be displayed as a downhole graph.
In the Edit Colourmaps window, all colour gradients in the project are available from the Gradient list.
Now that we have visually inspected our data, we will look at more quantitative data analysis tools.
For more information regarding these visualisation options, see the Colourmaps and Displaying Drillhole
Graphs topics in the online help.
You can view basic statistics and graphs for imported numeric drillhole tables, merged tables, composited
tables, points and block models.
While we will not specifically discuss the use of these tools with respect to QAQC, they can also be used for
this purpose. The graphing tools can be used to compare duplicate sample grades against the original sample
grades (scatter plots, Q-Q plots), or to compare assay data from different laboratories (boxplots), etc.
3. Click OK to create the new table, which will appear in the project tree as part of the Drillholes object.
The new table contains from, to and Wolf_Pass_GM columns defined using the intersection between the model’s output volumes and the drillholes.
For output volume evaluations, you can view the correlation between modelled geology and actual logged
drillhole segments. To do this, right-click on the table and select Statistics.
For more information, see the Back-Flagging Drillhole Data topic in the online help.
2. Select both the assay and GM evaluation tables and rename the table “Assay_and_GM_Evaluation”.
The intervals of a merged table are dependent on the selected columns’ interval breaks. The merged table
interval will be the longest possible interval that is shared by all selected columns. Where the ends of intervals don’t align, small intervals will be created. For example, if an assay interval is 10-12 m and has a
value of 0.563, but there is a lith code change at 11m, a merged table will present this as follows:
• 10-11 m, Lith A, Au 0.563
• 11-12 m, Lith B, Au 0.563
In many cases, this doubling up of the assay values is not ideal. To deal with this issue, you can use a New
Majority Composite table based on assay intervals, and merge that new table with the assay table. This
approach will not result in any split assays.
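To make the splitting rule concrete, here is a minimal Python sketch of the behaviour described above; it illustrates the rule only and is not Leapfrog's internal algorithm:

```python
# Illustration only: how a merged table splits an assay interval at a
# lithology break, repeating the assay value into each sub-interval.
# This mimics the rule described above, not Leapfrog's internal code.
def merge_interval(assay, lith_intervals):
    """assay: (from, to, value); lith_intervals: list of (from, to, code)."""
    rows = []
    for lith_from, lith_to, code in lith_intervals:
        # Overlap between the assay interval and this lithology interval
        start, end = max(assay[0], lith_from), min(assay[1], lith_to)
        if start < end:
            rows.append((start, end, code, assay[2]))  # assay value repeats
    return rows

# The example from the text: a 10-12 m assay of 0.563 with a lith change at 11 m
print(merge_interval((10.0, 12.0, 0.563),
                     [(8.0, 11.0, "Lith A"), (11.0, 14.0, "Lith B")]))
# -> [(10.0, 11.0, 'Lith A', 0.563), (11.0, 12.0, 'Lith B', 0.563)]
```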
Now that we have our data prepared, we can examine the data analysis tools in Leapfrog.
Histograms
You can view histograms for your numeric data columns by right-clicking on the data column in the project tree.
We will begin by investigating the Au data in the Assay_and_GM_Evaluation merged table. This merged table
contains both the assay information, as well as the lithology units, allowing us to filter the graphs based on
lithologic unit.
1. Expand the Assay_and_GM_Evaluation merged table to view the available columns of data.
Initially all the Au data in the project will be displayed on the plot.
The statistics calculated on histograms for drillhole tables are length-weighted by default.
Display Options
Options available for displaying the Histogram include the three graph types, Histogram of the log,
Percentage and Bin width. A box plot is also automatically displayed beneath the histogram. It is possible to
adjust the axes of the graphs by setting the X limits and Y limits.
3. Tick the box for Histogram of the log.
4. Switch the Histogram display to Cumulative histogram and then Log probability to review them all.
5. Switch back to the Histogram display.
Queries
While it’s useful to start out by reviewing the statistics of the entire dataset, it’s also important to look at the
unit-specific statistics. This can be achieved on the graphs available at the drillhole level by setting up query
filters that isolate the values in each unit.
6. Right-click on the Assay_and_GM_Evaluation table in the project tree and select New Query Filter.
7. Click the ... Query builder button.
Graph-Scene Interactions
Leapfrog offers graph-3D scene interaction for improved visualisation of your data.
12. View the Assay_and_GM_Evaluation table in the scene by the Au column.
13. To visualise a bin of data in the scene window, click on it in the histogram.
14. To select multiple bins, click and drag the mouse across the bins you wish to select.
15. Once you’re finished with the graph, close its tab.
The settings you select are saved for the next time you view a graph.
Scatter Plots
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics, then Scatter Plot.
2. Set the X column to AU_gpt and the Y column to CU_pct.
Display Options
When necessary, either or both axes can be logged; query filters can be added to display selected data; point
size and shape can be changed to suit your preferences, and the graph background can be set to white. A third
variable can be displayed on the graph by selecting a different column of data to use as the Colouring.
5. Use the Query filter to show just the Early Diorite unit.
The linear regression line is weighted: by length for numeric data in drillholes and by volume for block models. The plotted scatter points themselves are not weighted. This weighted linear regression equation and correlation coefficient will therefore not directly compare to un-weighted equations and values calculated in other software, such as Excel.
Graph-Scene Interactions
Leapfrog Geo offers graph-3D scene interaction for improved visualisation of your data. The selection tools available in the graph’s toolbar are similar to those available for the interval selection tool on drillhole data.
Box Plots
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.
2. Select Box Plot.
3. Change the Numeric column to AU_gpt and tick the box for Log scale.
Since the merged table contains a few category columns, we can view the box plots for the different logged
geological units.
4. Set the Category to Wolf_Pass_GM and tick the boxes for the units you’d like to display.
As with the other plots, there is also the option to add a Query Filter.
Q-Q Plots
The Q-Q plots in Leapfrog Geo can be used to compare values from different phases or styles of drilling,
different lab analyses techniques, duplicates vs original samples, etc., using query filters. In this project, we
only have one phase and style of drilling and assay results from one lab, but we will demonstrate the graph
functionality by comparing the Au values from the Early Diorite and Intermineral Diorite units.
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.
2. Select Q-Q Plot.
3. Change the X data to AU_gpt and the X filter to the Early Diorite query filter.
4. Change the Y data to AU_gpt and the Y filter to the Intermineral Diorite query filter.
When necessary, either or both axes can be viewed with a logarithmic scale.
Table of Statistics
In addition to the graphs, there is also a comprehensive, flexible table of statistics available.
1. Right-click on the Assay_and_GM_Evaluation merged table and select Statistics.
2. Select the Table of Statistics option.
3. Under Categories, click Add and use the dropdown list to select the Wolf_Pass_GM column.
4. Check the boxes for AU_gpt and CU_pct in the Numeric items list.
5. Tick the Hide empty categories and Hide inactive rows boxes.
6. To get a more useful view, choose the Group by numeric item radio button.
There are a number of useful statistics in this table. By default they are length-weighted, but you also have the option of un-weighted statistics if necessary. You can sort by the different columns by clicking on the column headers.
7. Click the Mean heading to sort based on grade.
We can quickly see which lithologies contain higher gold grades and which contain lower grades.
For more information, see the Statistics topic in the online help.
Goals
In this session, we will cover:
• A generalised modelling approach
• Building composites directly from drillholes
• Creating an initial RBF Interpolant
• Steps to take or consider in refinement of the numeric model
In order to explain how Leapfrog creates numeric models, we will introduce a relatively large amount of
basic interpolation and geostatistics theory.
This session continues to use the Wolf Pass project.
We can immediately see that the majority of our samples are 2.0 metres long. We will use this information to
choose our composite lengths.
It is possible to create a set of composited drillholes directly from drillholes. To do this, we go directly to the
Drillholes folder. This folder gives us more options, including whether to composite over the entire drillhole or
only within a particular lithology. Once the composite has been completed, it can be used to create a numeric
model.
1. Right-click on the Composites folder (under the Drillholes folder) and select New Numeric Composite:
• Entire Drillhole applies compositing parameters to all values down the length of the drillhole, regardless of unit breaks.
• Subset of Codes lets you set compositing parameters for each individual code, based on a Base column. This allows compositing to break along unit breaks.
• Intervals from other Table uses interval lengths from the Base table to determine composite lengths.
If you want to use grouped codes as a base table you will need to make a Group column in your interval
table. You can then select Intervals from other Table and choose the grouped codes column.
In addition to specifying the desired composite region and length, there are also three options for handling residual segments of lengths less than a specified cut-off:
• Discard
• Add to previous interval
• Distribute equally throughout drillhole
We will change a few of the more important parameters and check how they change the numeric model.
9. Double-click on the AU_gpt numeric model in the project tree.
This opens the Edit RBF Interpolant window.
We will start by refining the numeric model for the entire area, then look at creating a model within the Early
Diorite, which is the major mineralised lithology.
For the first model, we will change parameters in the Outputs, Interpolant and Value Transform tabs. When
we create the second model within the Early Diorite, we will also look at the Value, Boundary and Trend tabs.
Outputs Tab
In the Outputs tab, we can choose the values used to create isosurfaces and define the resolution of the
isosurfaces and how the isosurfaces create the associated volumes. By default, there are three isosurface
values, which are at the lower quartile, median and upper quartile of the data being used. These default values
are often not of interest but are useful in checking the general shape of the numeric model. We will go ahead
and change them to more reasonable values.
1. Click on the Outputs tab.
2. Click to highlight one of the default values beneath the Iso Value heading, then click it again to edit it.
3. Change the existing values to 0.5, 0.75 and 1.0.
4. Add two further Iso Values of 1.25 and 1.5.
Resolution is important when creating isosurfaces. Ideally, we would want it to be equal to the composite
length (6 m in this case). A quick test using one of Leapfrog’s laptops (16GB RAM, 2.8GHz processor) took 75
seconds to run these isosurfaces at a resolution of 6, but if your laptop is particularly slow, it may be worth
increasing the resolution to between 12 and 15. This will still give you a reasonable surface but will process more
quickly.
The resolution of isosurfaces is important because it determines the size of the triangles making up the
surface. If the resolution is 6, the approximate edge length of the triangles will be 6 units in length (remembering
that Leapfrog is unit-less). If the edge length of the triangles is 6 units, they will be able to include intervals that
are as small as 6 m long. If we were to increase the resolution to 12, the triangles would only be able to include
intervals as small as 12 m long and so will miss some of the smaller intervals.
Obviously, a lower resolution produces a more accurate surface, but can take a lot longer to run. A general
guide is that if you halve the resolution, the processing time will increase by four times.
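Taking the 75-second benchmark above as an anchor, the rule of thumb implies processing time scales with the inverse square of the resolution. A rough illustration, approximate only:

```python
# Rough illustration only: the rule of thumb implies processing time scales
# with the inverse square of resolution. Anchored to the 75 s benchmark at
# resolution 6 quoted above; real timings depend on hardware and data.
def estimated_seconds(resolution, t_ref=75.0, res_ref=6.0):
    return t_ref * (res_ref / resolution) ** 2

for res in (3, 6, 12, 15):
    print(f"resolution {res:>2}: ~{estimated_seconds(res):.0f} s")
# resolution  3: ~300 s  (halving the resolution quadruples the time)
# resolution  6: ~75 s
# resolution 12: ~19 s
# resolution 15: ~12 s
```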
5. Change the Default resolution to something between 6 and 20.
The resolution for each surface is set by the Default resolution unless a different resolution for a particular
surface is specified. Since the resolution of each isosurface can be set independently, you can save time by making the higher iso value shells at a lower (finer) resolution and the lower iso value shells at a higher (coarser) resolution.
The Volumes Enclose dropdown lets you choose from Intervals, Higher Values and Lower Values.
• Intervals will create a series of “donut” shaped shells. In this example, the shells will be <0.5, 0.5 - 0.75, 0.75 - 1.0, 1.0 - 1.25, 1.25 - 1.5, >1.5.
• Higher Values will create a series of shells that enclose all higher values within them. In this example, the shells will be >0.5, >0.75, >1.0, >1.25, >1.5.
• Lower Values will create a series of shells that enclose all lower values within them. In this example, the shells will be <0.5, <0.75, <1.0, <1.25, <1.5.
7. In this case we will start by using Intervals, so keep this selected.
8. Click OK.
When the model reloads in the scene, every volume will be opaque. To view the shells with increasing transparency, clear the scene and drag in the numeric model again.
The Evaluation limits apply when interpolated values are evaluated onto objects (surfaces, points, block models, etc.). These limits do not affect the input data or the interpolation itself. By default, there is a Minimum limit set
at 0.0. This means that regardless of the parameters set in the Interpolant tab, and the resulting interpolation,
no interpolated values less than 0 will be evaluated onto objects. There is also an option to set a Maximum
value for the interpolated values. If ticked, the default is the highest value in the dataset, meaning that no
interpolated value evaluated onto an object can exceed the highest measured value. Keep in mind, this is not
top cutting as it only affects the interpolated values when evaluated onto objects, not the input data or the
interpolation itself.
Interpolant Tab
1. Double-click on the numeric model object to open the Edit RBF Interpolant window.
2. Click on the Interpolant tab.
There are several parameters in the Interpolant tab that can be set based either on rules of thumb or by using geostatistical input from packages such as Leapfrog EDGE, Supervisor or Isatis. For this example, we will look at rules of thumb that work well for a number of examples. The default settings are almost certainly not appropriate, so the next few paragraphs are important when creating reasonable numeric models.
As you may have figured out by now, Leapfrog is fast at creating models, but that doesn’t necessarily mean
the first pass models are correct.
Understanding how the interpolation works is one of the key topics in Leapfrog Geo.
Interpolant
There are two options for the Interpolant, Linear and Spheroidal. The Linear interpolant works well for
lithology data and for quickly visualising data trends. It is not suitable for values with a distinct finite range of
influence, such as most metallic ore deposits. The Linear interpolant assumes that values closer to a point have a proportionally greater influence on that point than values further away. The Spheroidal
interpolant works well when there is a finite range beyond which the influence of one point upon another should
fall to zero. This is the case for most metallic ore deposits.
3. Change the interpolant type to Spheroidal.
Note that the interpolant function shown in the window changes shape to display the Spheroidal interpolant
rather than the Linear interpolant.
Base Range
The Spheroidal interpolant has a Base Range that represents the distance from the data at which the interpolant function approaches the Total Sill. As we move away from a specified point, the influence of that point decays in a roughly linear manner up to around 30% of the Base Range. Past 30% of the Base Range, the influence of the point drops away more quickly, and at the Base Range the function reaches 96% of the Total Sill.
Isolated high-grade “bullseye” shells centred on individual drillholes are a good indication that the Base Range needs to be increased, as it’s extremely unlikely that all the drillholes manage to perfectly follow thin pipes of high grade while missing the surrounding low grade!
As a rule of thumb, the Base Range should be set to 2.0 - 2.5 times the distance between drillholes.
In this case, the average distance between holes is around 100 m, so a Base Range of between 200 and 250
should be a good starting point.
4. Change the Base Range to 250.
Note that the shape of the interpolant function changes to include the range of 250, which is represented using
a vertical yellow line.
Nugget
The Nugget allows for local anomalies in sampled data, where a sample is significantly different to what might
be predicted for that point based on the surrounding data. By increasing the value of the Nugget, more
emphasis is placed on the average values of surrounding samples and less on the actual data point. The
Nugget can also be used to reduce noise from inaccurately measured samples.
The rule of thumb for the Nugget changes depending on the deposit type, and geostatistical input is vital. For
this deposit (a porphyry gold project), a Nugget of 10 - 20% is appropriate. It is important to note that the
Nugget is a percentage of the sill, so in this case a 15% nugget would be 0.15 (15% of 1).
6. Change the Nugget to 0.15.
Note the change in the interpolant function; the base point (0.0, 0.0) moves up the y axis to equal the value of
the Nugget.
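Leapfrog does not publish the exact form of its spheroidal basis functions, but their behaviour is qualitatively similar to the classic spherical variogram, which an Alpha of 9 approximates. The sketch below is that analogue only, using this session's parameters (Base Range 250, Nugget 0.15, Total Sill 1.0):

```python
# Qualitative analogue only: Leapfrog's spheroidal basis functions are not
# published, but an Alpha of 9 closely approximates the classic spherical
# variogram sketched here. gamma(h) rises from the nugget toward the total
# sill; a sample's influence on the estimate falls as gamma rises.
def spherical_variogram(h, base_range=250.0, total_sill=1.0, nugget=0.15):
    if h <= 0:
        return 0.0             # no dissimilarity at the sample itself
    if h >= base_range:
        return total_sill      # influence has fully decayed at the range
    r = h / base_range
    return nugget + (total_sill - nugget) * (1.5 * r - 0.5 * r ** 3)

# This session's parameters: Base Range 250, Nugget 0.15, Total Sill 1.0
for h in (0, 25, 75, 125, 250, 400):
    print(f"h = {h:>3} m -> gamma = {spherical_variogram(h):.3f}")
```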
Drift
The Drift controls the way the interpolant decays away from the data. A Constant drift means the interpolant
will decay to the mean value of the data. A drift of None means the interpolant will decay to a value of zero
away from the data, so is useful when there are no low-grade holes constraining the deposit. A drift of Linear
means the interpolant decays linearly away from the data.
In this case, as the interpolant model is currently not bounded to any domain (geological, structural, weathering
etc) a sensible drift to use will be None. This means that as we move away from the data toward the edges of
the model, the value of the interpolant will revert to a value of zero.
We are only using a drift of None as we currently have no geological domain set up. Later in this session once
we have introduced a geological domain, we will use a drift of Constant.
Alpha
The Alpha determines how steeply the interpolant rises toward the Total Sill. A low Alpha value produces an
interpolant function that rises more steeply than a high Alpha value. By looking at the interpolant function while
changing the Alpha, we can see that a high Alpha will give points at intermediate distances more influence
compared to lower Alpha values. The possible Alpha values are 9, 7, 5 and 3.
An Alpha of 9 gives the best approximation to the Spherical Variogram but takes longer to process and in most
situations gives a very similar result to using an Alpha of 3.
In this case, we will keep the Alpha at 3 to reduce processing time.
Accuracy
Leapfrog Geo estimates the Accuracy from the data by taking a fraction of the smallest difference between
measured data values. There is little point in changing the accuracy to be significantly smaller than the errors in the measured data, as the interpolation will run more slowly without improving the result.
The rule of thumb here is to leave the Accuracy as it is.
8. Click OK to reprocess the interpolant.
Value Transform Tab
In the Value Transform tab, extreme high values can be clipped (by lowering the clipping Upper bound) so that they have less influence on the interpolation. Clipping high values will reduce the variance of the dataset, so some values on the Interpolant tab may need to be adjusted.
5. Click OK to process the changes.
As a first pass model, this isn’t a bad result. We will need to add trends to give the mineralisation more of a
defined shape, as well as limit the extents of the model to within geological domains (which we made as part of
the geological model earlier).
Bounding your numeric model to a geologically reasonable domain (lithological, structural, alteration, etc) is
vital to creating a reasonable volume result.
3. Collapse the original AU_gpt numeric model and expand the new numeric model in the project tree.
4. Right-click on the Boundary and select New Lateral Extent > From Surface.
6. Click OK.
7. Clear the scene.
The model has changed in two different ways. First, the isosurfaces have been clipped to the Early Diorite
boundary, and second, the data has been clipped to the Early Diorite boundary.
Now that the boundary has been changed, we need to edit the other interpolation parameters as discussed
above.
9. Double-click on the numeric model to open the Edit RBF Interpolant window.
In the Values tab, we can see that the Surface filter is set to Boundary. As the boundary is now set to the
Early Diorite volume, this means the Surface filter is already using the Early Diorite, so we don’t need to
change anything here.
We can also leave the Boundary tab and the Trend tab unchanged.
10. Click on the Interpolant tab to see if anything needs to be changed there.
11. If necessary, make appropriate changes to the Total Sill, Drift and Nugget.
12. Since the values used in the interpolation have changed, you may wish to revisit the Value Transform tab
to assess the clipping Upper bound.
13. The Outputs tab should be correct as it is, so we can click OK and let the model reprocess.
14. Clear the scene.
6. Click OK.
7. Click and drag the Structural Trend into the scene to view it.
The ellipsoids visually represent the direction of the trend. The larger the ellipsoid, the stronger the trend. You
can see that close to the meshes, the trend is stronger and as we move further away from the meshes, the
strength decays.
4. Click OK, and the model will re-run to include the Structural Trend.
When Leapfrog is processing models with Structural Trends, the processing time is slightly longer because
the calculations are more complex.
2. Right-click on the Values under the interpolant and select New Contour Polyline.
4. Draw the contour polyline on the slicer, off the end of the drillhole, then click Save in the toolbar.
5. The model will be updated, and the surfaces will now represent the contour polyline.
There are four columns: Interval, Interval Volume, Approx Mean Value and Units. The first three columns
are used to calculate the last column, then each row is added to give a total number of Units.
• The Interval column lists the value(s) of the volumes that are being calculated.
• The Interval Volume column lists the total volume contained within each Interval.
• The Approx Mean Value column lists the mean value of the volume. In the example above, the intervals <0.5 and >1.5 have Approx Mean Values of 0.5 and 1.5 respectively (as there is no further information available higher or lower than these grades). The interval from 0.5 - 0.75 has an Approx Mean Value of 0.625, which is halfway between the two grade shells. To gain accuracy, one method is to increase the number of grade shells (by decreasing the spacing between shells).
• The Units column is calculated by multiplying the Interval Volume by the Approx Mean Value, which gives a total number of units. Each row is added to give the total number of units.
There is a short description at the bottom of the window giving instructions for turning the total units into grams.
In this case, we can look at the average density of the Early Diorite lithology, multiply it by the total units, then
divide by 31.1 to give total ounces of Gold.
• Total Units is 120,986,158
• Density of Early Diorite is 3.06 (which can be found in the Histogram tab of the merged table)
• (120,986,158 x 3.06) / 31.1 = 11.9 million ounces of gold within the Early Diorite
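As a quick check of that arithmetic, with all numbers taken from the example above and roughly 31.1 g per troy ounce:

```python
# Worked version of the conversion above: Units (m^3 x g/t) x density (t/m^3)
# gives grams; dividing by ~31.1 g per troy ounce gives ounces.
total_units = 120_986_158   # total of the Units column
density = 3.06              # average density of the Early Diorite, t/m^3
ounces = total_units * density / 31.1
print(f"{ounces / 1e6:.1f} million ounces of gold")  # -> 11.9 million ounces
```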
Note that we haven’t included a cut-off, so we are calculating the contained metal within the entire Early Diorite domain.
There are several things we could do to get a more constrained result, including limiting the model to within a
certain distance to the drilling (New Lateral Extent > From Distance Function), using an Indicator RBF
Interpolant as a boundary, and creating further refined geological domains (using refined models).
Goals
In this session we will:
• Define and convert numeric data to categories
• Build mineralised surfaces in a geological model using the new category data
• Build mineralised surfaces in a refined model
This session continues to use the Wolf Pass project.
Note: a category can be made from any numeric data; this function is not exclusive to grade.
2. Select AU_gpt as the Base Column and name the new column “Au_gpt Category”.
3. Click OK.
5. Click OK.
A new “Category” column has been created in the assay table, from which a Geological Model can be
created.
Before moving on we need to decide whether or not the mineralised zone(s) are to be constrained using a
previously modelled lithologic boundary. If rectangular model extents are sufficient, we can proceed with
building the mineralised zone surfaces in a geological model; if we want to use a previously modelled lithologic
solid, for example the Early Diorite, we can build our new surfaces within a refined model. We will do both.
We will build the mineralised zone surfaces using the New Intrusion tool so that the surfaces can enclose
volumes and more easily avoid lower grade intervals.
6. Right-click on the Surface Chronology and select New Intrusion > From Base Lithology.
8. To control the filtering of short segments, click on the Compositing tab in the New Intrusion window.
During this exercise, it is important to keep resolution in mind. It is not possible to capture 0.5 m of high
grade using a surface with a resolution of 50. While as many contact points between zones as possible will be honoured, it is likely that some intervals will be missed due to surface geometry.
3. Give the model a new name or keep the default one and click OK.
The new copy will appear in the project tree.
Now we can start building our mineralised surfaces as a refinement to this model.
4. Right-click on the Geological Models folder and select New Refined Model.
The New Refined Model window will open.
5. Select the Wolf Pass GM copy and then the Early Diorite volume from the top two drop-down menus.
6. Select the base lithology column to be Au_gpt Category.
7. Set the Surface resolution to 20 and give your new refined model a meaningful name.
8. Click OK.
9. In the Early Diorite section of the refined model, build the mineralised zone surfaces in the Surface
Chronology using the New Intrusion tool.
As with the earlier model, treat the higher-grade categories as “younger” lithologies than the lower grade
categories. Remember to check the Compositing tab to ensure that the short segments are being treated as
desired.
10. Activate the surfaces, add the Output Volumes to the scene and critique your results.
The results rendered using these two different mineralised surface building methods may overlap in places, but
they will generally not yield the same result. It is up to you, as the geologist, to determine which model provides
the most realistic distribution of grade.
Goals
In this session, we will:
l Review how to make a merged table
l Review box plots within the exploratory data analysis tools
l Set up a basic indicator interpolant model
l Refine the model parameters
l Add a structural trend to reflect the geometry of the higher-grade zone
l Examine the statistical outputs to validate the use of the model.
This session continues using the Wolf Pass project and the additional data is available either as an
attachment from Central or it will be provided by your instructor.
Modelling Approach
As with the other numerical models, there are several steps to take, often iteratively, to build a sensible, fit-for-purpose model. While the Wolf Pass gold mineralisation is reasonably well controlled by the grouped lithologies
(as we have been using in the earlier sessions), if we did not have the groups, it would be more difficult to
determine what might be controlling the metals distribution. We will use the ungrouped lithologies and the
copper assays for this session. Let's look at the copper distribution before starting to build our indicator
interpolant.
7. In the Structural Trend window, click the Add button and select the mesh we just imported.
8. Select Non-decaying.
9. Rename the Structural Trend “Central Shear” so that it can be distinguished from the one you made earlier.
For more information about where, when and how to use structural trends, see the Structural Trends topic in
the online help.
In addition to the use cases mentioned in the introduction, the indicator RBF interpolant can be a very useful
tool to temper the effects of extreme high-grade samples combined with sparse drilling. Because all of the
samples are transformed into 0s or 1s, the resulting solid that is produced tends to be more conservative,
with fewer 'blow-outs'.
In practice, it can also be useful to consider the proposed selective mining unit (SMU) or block size when
calculating composites.
The cut-off value, in this case 0.3% Cu, is displayed on the histogram with the purple line. The samples below
cut-off, those coded as 0s, amount to 20% of the total sample count and have a mean grade of 0.18% Cu. This
leaves 80% of our samples coded as 1s, and they have a mean grade of 1.14% Cu.
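The indicator transform itself is simple to express. The sketch below uses made-up grades, not the Wolf Pass data, purely to illustrate the 0/1 coding and the kind of summary quoted above:

```python
# Indicator coding: each composite becomes 0 (below cut-off) or 1 (at/above
# cut-off). The grades below are made up purely to illustrate the transform.
cutoff = 0.3  # % Cu

grades = [0.05, 0.12, 0.25, 0.31, 0.64, 0.90, 1.20, 1.75, 2.40]  # hypothetical
indicators = [1 if g >= cutoff else 0 for g in grades]

below = [g for g, i in zip(grades, indicators) if i == 0]
above = [g for g, i in zip(grades, indicators) if i == 1]
print(f"coded 0: {len(below) / len(grades):.0%}, mean {sum(below) / len(below):.2f}% Cu")
print(f"coded 1: {len(above) / len(grades):.0%}, mean {sum(above) / len(above):.2f}% Cu")
```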
We can visualise this division of samples in the scene, displayed by whether the sample grade is above the
cut-off value of 0.3% Cu (represented as a '1') or below the cut-off value (represented as a '0').
17. Make the model volumes invisible and drag the CU_pct Indicator 0.3 values from the project tree into the scene.
18. Display the sample point values by CU_pct Indicator 0.3 values.
We can also visualise how this sample division relates to modelled Inside and Outside volumes of our indicator
interpolant.
19. Display the sample point values by Sample grouping.
It is important to note that this sample grouping categorisation is a result of the modelled volumes (Inside or
Outside) being back-flagged on to the samples. Given that the modelled representation of the iso value shell
is dependent on the surface resolution, the distribution of these sample groupings will shift slightly if/when
the model's resolution is changed.
Now that we are familiar with the samples and the sample distribution, let's take a look at editing the interpolant
model.
Note: we are using a 5, 5, 1 ellipsoid ratio as that is the set ratio of the anisotropy applied by the Structural
Trend.
Notice that adjacent to the Base Range, there are Max, Int and Min values. These values reflect the 5, 5, 1
global trend ellipsoid values that we set earlier. We would like the Base Range in our direction(s) of greatest
continuity to be 2.0 - 2.5x the drillhole spacing, so we need to change the value from 300 to something that
brings the maximum and intermediate values into the ~300 range.
24. Enter a value of 175 into the Base Range.
Notice that the Max and Int values are now ~300.
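The relationship between the 175 entered and the ~300 displayed is consistent with the Base Range being scaled by each ellipsoid ratio divided by the geometric mean of the three ratios. That normalisation is an assumption made here for illustration, not something stated in the Leapfrog documentation:

```python
# Assumption (not from the Leapfrog documentation): the displayed Max/Int/Min
# ranges equal the Base Range scaled by each ellipsoid ratio divided by the
# geometric mean of all three ratios. This reproduces the ~300 noted above.
base_range = 175.0
ratios = (5.0, 5.0, 1.0)                        # Max, Int, Min
gmean = (ratios[0] * ratios[1] * ratios[2]) ** (1 / 3)
for name, ratio in zip(("Max", "Int", "Min"), ratios):
    print(f"{name}: {base_range * ratio / gmean:.0f}")
# Max: 299, Int: 299, Min: 60 -- consistent with Max and Int being ~300
```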
Now that we have set the Base Range to a value that reflects the 5, 5, 1 ratio that the Structural Trend will
impart, we can switch the structural input from the global trend to the Central Shear structural trend.
27. Finally, click on the Volumes tab and enable the Volume filtering option.
28. Change the Discard volume parts smaller than option to 100,000 units cubed.
This will remove all volumes smaller than 100,000 units cubed.
29. Click OK.
30. Drag the indicator interpolant into the scene to view it.
Now that we have a more meaningful model with appropriate parameters, we can discuss more effectively what our indicator interpolant represents and how to interpret its results.
A low iso value (0.1-0.3) will create a more “inflated” shell, whereas a higher iso value (0.7-0.9) will create a
tighter, more restricted shell.
A low iso value can be used to prioritise the tonnage included in the volume, whereas a high iso value will prioritise metal (or other physical properties).
The choices made here should generally reflect the proposed mining method - bulk mining methods will be
more suited to tonnage prioritisation, while selective mining methods are more suited to prioritising grade.
Remember: The interpolant function is continuous and can be evaluated at any point. The iso-surface shell represents this underlying continuous function but is dependent on the surfacing parameters chosen; e.g. changing the surface resolution alters the resulting model volumes (Inside and Outside), which, by extension, affects the sample grouping categorisation of the points (at composite support).
Statistics
It is easy to review the statistics of your indicator interpolants.
37. Right-click on the original CU_pct Indicator 0.3 indicator interpolant model and click Statistics.
Statistics relating to the Indicator are listed in the window and can be copied to the clipboard for importing into
Microsoft Excel or a similar package.
From here, we have a full summary of the number of samples above and below cut-off, those above cut-off that
are outside the modelled isosurface, and those below cut-off that are inside the modelled isosurface.
Using these values, we can calculate discard and dilution factors. For example, there are 324 samples below
cut-off that are included in the Inside volume, so our dilution factor = 324/(3208+324)=0.09, or 9%.
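The dilution arithmetic, using the counts quoted above:

```python
# Dilution factor from the example above: samples below cut-off that fall
# inside the modelled volume, as a fraction of all samples inside it.
inside_above_cutoff = 3208  # samples at/above cut-off inside the volume
inside_below_cutoff = 324   # samples below cut-off inside the volume
dilution = inside_below_cutoff / (inside_above_cutoff + inside_below_cutoff)
print(f"dilution factor = {dilution:.2f} ({dilution:.0%})")  # -> 0.09 (9%)
```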
Goals
In this session we will create, visualise, import, evaluate and export block models in Leapfrog Geo.
This session continues using the Wolf Pass project and the additional data is available either as an
attachment from Central or it will be provided by your instructor.
The New Block Model window will appear, which shows options for defining the grid.
2. Set the block size to 20, 20, 10 (x, y, z).
It’s possible to adjust the reference centroid and rotate the grid about the z axis by changing the azimuth or rotating the handles in the scene.
3. Under the Enclose Object dropdown, select the WP_lith table.
The block model is currently empty. The next steps go through what is required to evaluate the geological and numerical models against the blocks.
3. Click OK.
If you are evaluating onto a sub-blocked model, you will also have the choice of evaluating onto the sub-block
centroid or the parent block centroid. By default, categorical models evaluate onto the sub-block centroid and
numerical models evaluate onto the parent block centroid.
For numeric models you can use a value filter, and for category models you can turn different categories on and
off by selecting Edit Colours.
3. Click on the display list once again and select the model’s Status field.
All the blocks for this model are classified as either Outside or Normal.
7. Click OK to close the Legend for Status window.
If you evaluate a block model that has a lateral extent, blocks that fall outside this will have no assigned
value.
In addition to the standard Edit Colours, transparency slider and show legend in scene options, there is also
the ability to Show edges and Show inactive blocks.
In addition to the standard Slice mode and Query filter options in the properties panel, there is also the option
to display block models filtered by X, Y and/or Z indices. There are three Index Filter options, None, Subset
and Sliced, and you can choose to apply them to the X, Y and/or Z axes. To filter on a specific axis, make sure
its check box is ticked.
For Subset, the grid is filtered to show the union of the selected X, Y and Z ranges.
For Sliced, the grid is filtered to show the intersection of the selected X, Y and Z ranges.
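The difference between the two modes can be read as set logic over the block indices. A minimal sketch, assuming blocks are addressed by hypothetical (i, j, k) integer indices:

```python
# Set-logic reading of the two filter modes described above (illustration
# only; blocks are addressed by hypothetical (i, j, k) integer indices).
x_range, y_range, z_range = range(2, 5), range(0, 3), range(1, 2)

def subset(i, j, k):   # union: keep a block if ANY selected range matches
    return i in x_range or j in y_range or k in z_range

def sliced(i, j, k):   # intersection: keep a block only if ALL ranges match
    return i in x_range and j in y_range and k in z_range

print(subset(2, 9, 9))   # True  (the X index matches, the others do not)
print(sliced(2, 9, 9))   # False (all three ranges must match)
print(sliced(2, 1, 1))   # True
```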
Summary Statistics
Summary statistics will provide an overall report of the volume of blocks evaluated, as well as the average
grade and other associated statistics within those blocks.
1. To open the Statistics tab, right-click on the block model and select Statistics.
2. Select the Table of Statistics option.
In the Statistics tab, you can view a basic summary and compare different domains with the interpolated
numeric value.
3. Under Categories, click Add and use the dropdown list to select the Refined Wolf Pass GM: Wolf Pass GM
column.
4. Check the box for AU_gpt in the Numeric items list.
5. Tick the Hide empty categories and Hide inactive rows boxes.
Calculating Tonnage
In addition to the statistics automatically displayed, we also have the option to calculate tonnage and display it
in the table. To calculate tonnage, we need to provide a density. The density can either be a constant value, or
based on a density interpolant. Given that we don’t have density information for each unit or the ability to
separately specify the density for each unit, it would make the most sense to provide a density and calculate
the tonnage for only our main unit of interest, the Early Diorite, as opposed to all of the units together.
7. In the Table of Statistics, change the Numeric items display to show only the Au_gpt clipped to Early Diorite item.
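A minimal sketch of the tonnage arithmetic being performed here; the block size and density come from earlier in this session, while the block count is hypothetical:

```python
# Tonnage arithmetic: block volume x block count x density. The block size
# is the 20 x 20 x 10 m set earlier; the block count here is hypothetical.
block_volume = 20 * 20 * 10          # m^3 per parent block
early_diorite_blocks = 12_500        # hypothetical count of blocks in the unit
density = 3.06                       # t/m^3, as used earlier for Early Diorite
tonnes = block_volume * early_diorite_blocks * density
print(f"{tonnes:,.0f} t")            # -> 153,000,000 t for this made-up count
```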
Reporting
You can copy the results by clicking on the Copy dropdown at the top of the window and choosing either Select All or Copy Selected Rows. This copies the results onto the clipboard, and you can then paste the data into a spreadsheet for subsequent calculations. The displayed summary statistics can also be exported to CSV by clicking the Export button.
In addition to the table of statistics that we just reviewed, scatter plots, box plots and Q-Q plots are available on
the block models.
6. Click Next.
There is also an option to omit certain rows depending on the block status.
7. Leave the Omit rows defaults.
8. Click Next.
When exporting block models as CSVs, there is the option to set the Numeric Precision; in this case, we will
leave the defaults.
3. Select the appropriate x, y and z columns to define the location of the block, if not already selected.
5. Multi-select the numeric columns associated with the block model, then right-click the dropdown to code all selected rows at once.
Note: you can use the Treat as Blank column to dictate the value used for inactive cells. Any blocks with
this value/text will be imported with a Blank Status.
7. Click Finish.
The model will be imported into Leapfrog.
8. Drag the block model into the scene to view it.
Leapfrog can import regular block models from many different software packages. Leapfrog's excellent
graphical capabilities make it easy to visualise large block models.
Tips for importing block models:
• The azimuth Leapfrog uses to rotate the model may be different from what your other software reports. If the model isn't matching up as expected, try adding or subtracting multiples of 90 to the given azimuth when entering it in Leapfrog.
• If centroids aren't matching, and it's not clear why, import the block model into the Points folder to determine where it sits in 3D space. This is very helpful for defining the grid during block model import.