Hysplit User Guide PDF
Roland Draxler
Barbara Stunder
Glenn Rolph
Ariel Stein
Albion Taylor
Abstract
The HYSPLIT_4 (Hybrid Single-Particle Lagrangian Integrated Trajectory) Model
installation, configuration,
and operating procedures are reviewed.
Examples are given for setting up the model for trajectory and
concentration
simulations, graphical displays, and creating
publication quality illustrations. The model requires specially
preformatted
meteorological data. Programs that can be used to
create the model's meteorological input data are described. The
User's Guide has been restructured so that the section titles match
the GUI help menu tabs. Although this guide is
designed to support
the PC and UNIX versions of the program, the executable of the
on-line web version is identical.
The only differences are the
options available through the interface.
Features
The HYSPLIT_4 (HYbrid Single-Particle Lagrangian Integrated Trajectory) model is a complete system for computing
simple trajectories as well as complex dispersion and deposition simulations using either puff or particle approaches.[2]
It consists of a
modular library structure with main programs for each primary application: trajectories and air concentrations.
Gridded meteorological data, on a latitude-longitude grid or
one of three conformal (Polar, Lambert, Mercator) map
projections,
are required at regular time intervals. The input data are interpolated to an internal sub-grid centered to
reduce memory requirements and increase computational speed. Calculations may be performed sequentially or
concurrently on multiple meteorological
grids, usually specified from fine to coarse resolution.
Air concentration calculations require the definition of the pollutant's emissions and physical characteristics (if
deposition
is required). When multiple pollutant species are defined, an emission would consist of one particle or puff
associated with each
pollutant type. Alternately, the mass associated with a single puff
may contain several species. The
latter approach is used for calculation
of chemical transformations when all the species follow the same transport pathways.
The modeling system includes a Graphical User Interface (GUI) to set up a trajectory, air
concentration, or deposition
simulation. The post-processing part
of the model package incorporates graphical programs to generate
multi-color or
black and white publication quality Postscript
printer graphics.
A complete description of all the
equations and model calculation methods for trajectories and air
concentrations has
been published[3] and is also available on-line. Both the on-line version and the version included with the PC installation contain
all the most recent corrections and updates.
Pre-Installation Preparation
There are two installation
programs that can be downloaded. The trial version (HYSPLIT_win{32|64}U.exe,
~70 Mb) is available to anyone, and a fully functional version
(HYSPLIT_win{32|64}R.exe, ~70 Mb) requires a user registration
through the web site. Both versions are identical, except that the trial
version will not work with forecast meteorological data
files. In
addition, several supplemental programs (Ghostview and Tcl/Tk) are
packaged with the trial version for
convenience. It is assumed
that these would have already been installed when the registered
version is installed on top
of the trial version.
The self-installing executable contains only HYSPLIT related programs. No additional software is required to run a
simulation if the
command line interface is sufficient. To enable the model's GUI,
the computer should have Tcl/Tk
script language installed. The
most recent version can be obtained over the Internet, or an older version is
packaged with some of the other programs on the
HYSPLIT
web site. The installation
of Tcl/Tk will result in the
association of the .tcl suffix
with the Wish executable and all Hysplit GUI scripts will
then show the Tk icon. The
HYSPLIT GUI has been tested with Tcl/Tk
version 8.5.4.
The primary HYSPLIT graphical
display programs convert the trajectory and concentration model
output files to
Postscript format. The Postscript files can also
be viewed directly through the GUI if Ghostscript and Ghostview
have
been installed. More information on the Postscript
file viewer can be found elsewhere in this guide. The HYSPLIT code and GUI have been tested
with
Ghostscript 8.63 and Ghostview 4.9. Installation to different
default drives, directories, or other versions might
require
editing the main GUI script's directory pointers (edit file: /guicode/hysplit4.tcl). Rename your old installation prior to installing the new code if you wish to keep the
original
version.
The installation will contain several sub-directories, some of
which are required for model execution, and some of
which provide
additional documentation and other information. For instance ...
bdyfiles - This directory contains an
ASCII version of gridded land use, roughness length, and terrain
data. The
current file resolution is 360 x 180 at 1 degree. The
upper left corner starts at 180W 90N. The files are read by
both
HYSPLIT executables, hyts_std (trajectory model) and
hycs_std (concentration model), from this directory.
If not
found, the model uses default constant values for land use
and roughness length. The data structure of
these files is defined
in the file ASCDATA.CFG, which should be located in either the model's startup or
/bdyfiles directory. This file defines the grid system and gives an optional directory
location for the landuse and
roughness length files. These
files may be replaced by higher resolution customized user-created
files. However,
regardless of their resolution, the model will
only apply the data from these files at the same resolution as the input meteorological data grid.

data2arl - This directory contains various programs to convert meteorological
data in various formats to the format (ARL packed) that HYSPLIT can
read. Sample programs include GRIB
decoders for ECMWF model
fields, NCAR/NCEP (National Centers for Environmental Prediction)
re-analysis
data, and NOAA Aviation, ETA, and Regional Spectral
Model files. All the required packing and unpacking
subroutines
can be found in the /source subdirectories. Sample
compilation scripts for Compaq Visual Fortran
6.6 are in some of
the decoder directories. More information on how to run these
programs can be found in the
Meteorology section.
examples - The directory contains several example
scripts and batch files that can be used to create automated
simulations.
html - Contains all the HELP files in
HTML format. These files can be displayed with any browser or
interactively through the GUI. The files that are opened in the
GUI depend upon the context from which HELP is
invoked.
document - This directory contains PDF
(Adobe Portable Document Format) versions of the User's Guide (all
the
HTML help files put together in one document) and other
documentation such as ARL-224, the principal ARL
Technical
Memorandum describing the model and equations. This User's Guide
(this document) provides
detailed instructions in setting up the
model, modifications to the Control file to perform various
simulations and
output interpretation. The @readme.txt file
contains additional information about compilation, typical CPU
times, and other related details.

guicode - This directory contains the Tcl/Tk GUI source code;
the hysplit4.tcl script starts the HYSPLIT GUI. The Desktop shortcut as well
as the Start Menu options should point to this
script. If the
installation program did not properly setup the Desktop, you can
manually create a shortcut to the
script and edit its properties
such that the "Start In" directory is /hysplit4. You should
also select the HYSPLIT
icon from the /icons directory.
working - This is the Hysplit4 root
directory, which contains sample CONTROL files that can be used
for initial
guidance to set up more complex simulations. These
should be loaded into the GUI from the appropriate
"Retrieve" menu
tab. Examples include:
sample_conc - concentration simulation example from users guide
sample_traj - trajectory simulation example from users guide
The "plants.txt" file contains a sample
listing of starting locations that can be opened in the GUI to
select
from a list of previously determined starting locations. This file can easily be customized. The "tilelist.txt"
file
contains the approximate coordinates of NCEP's NAM (North America
Mesoscale model) tile
domains.
Problems
If Tcl/Tk does not exist on your
system or there are other problems with the GUI interface, it is
very easy to run the
sample cases directly in the /working
directory by running the batch file "run_{model}.bat". If the sample
simulation
works well, then it is only necessary to manually edit the
CONTROL file to try out different simulation variations.

References

[2] Draxler, R.R., 1999, HYSPLIT_4 User's Guide, NOAA Technical Memorandum ERL ARL-230, June, 35 pp.

[3] Draxler, R.R., and G.D. Hess, 1997, Description of the HYSPLIT_4 modeling system, NOAA Technical Memorandum ERL ARL-224, December, 24 pp.

[4] Stein, A.F., R.R. Draxler, G.D. Rolph, B.J.B. Stunder, M.D. Cohen, and F. Ngan, 2015, NOAA's HYSPLIT atmospheric transport and dispersion modeling system, Bulletin of the American Meteorological Society, 96, 2059-2077. doi: http://dx.doi.org/10.1175/BAMS-D-14-00110.1

[5] PSPLOT libraries can be found at www.nova.edu/ocean/psplot.html and were created by Kevin Kohler ([email protected]).
GUI Overview
Although the model can be configured and run manually from the command line, it is usually much quicker to use the
GUI. A summary of the main features in each of the primary GUI menus is reviewed below. More detailed help
information can be found
with the help button associated with each specific menu.
Meteorology / ARL Data FTP
Forecasts - Links to only the most recent forecast data for a
variety of different meteorological models available
on the ARL FTP server. The data have already been processed to a HYSPLIT-compatible format. Appended data are
defined as a combination of analysis and short forecasts (e.g. 0h and +3h appended with each new 6h cycle) that
are operationally maintained as a rolling archive valid for 48 hours prior to the most recent forecast cycle.
ARL Archive - Provides access to the long-term data archives
at ARL for the GFS/GDAS (formerly FNL) and the
NAM/NDAS (formerly EDAS) archives. Unless otherwise noted, the archives go back to about 1995.
NOAA Reanalysis - The NCAR/NCEP reanalysis data archive has been reformatted for HYSPLIT and placed on
the ARL ftp server. These data are available from 1948 through the end of the last year. The files are updated
once-a-year.
Set Server - A generic menu that can be customized to open
different backup or user defined FTP servers to
download pre-formatted HYSPLIT compatible
meteorological forecasts.
Meteorology / Convert to ARL
Many of the scripts linked in this menu will convert various data file formats
into the data format used by
HYSPLIT. GRIB version 2 is not yet supported. However,
conversion software from GRIB-2 to GRIB-1 can be
obtained from several different
sources (NOAA-NCEP, ECMWF, WMO). All files must be on the local
computer. FTP is
no longer supported through these menus.
WRF-ARW - Processes fields from NCAR's Advanced Research
WRF model into the ARL format. The data
conversion routines require the NetCDF libraries, which are currently supported only on UNIX or LINUX platforms.
NARR Archive - Archived GRIB-1 fields from the North American
Regional Reanalysis model (32 km
resolution, 3-hr intervals). The GRIB files must be
independently downloaded from the NCDC NOMADS web
site. The downloaded files need to be
renamed to end with the day-hour {DDHH}. The script is designed only to
work with archive
data to create a multi-day file as long as one month.
Global Lat-Lon - Archived GRIB-1 fields from the global model of ECMWF or NOAA may be processed locally
into ARL format. The procedure is designed only to work with archive data from their current operational model,
where the last four digits of
the file name represent the day.
ERA-40 - Archive GRIB-1 fields from the ECMWF ERA-40 reanalysis
may be processed locally into ARL
format. These files must already be present. The procedure
is designed only to work with the ERA-40 data files.
Three different file types must be
downloaded for this conversion: 3-d fields, 2-d surface fields, and one invariant
field.
MM5V3 format - Output files from MM5V3 can be converted to ARL
format. The script automatically processes
all domains (MMOUT_DOMAIN{X}) found in the selected directory.
MCIP IOAPI - Meteorological output files from MM5 converted to
the CMAQ IOAPI format. Available only
under UNIX systems with NetCDF installed.
User Entered - This menu tab is intended to convert user
entered meteorological data to the ARL packed
HYSPLIT compatible format. The conversion program will create a data file with one or more time periods
containing a uniform field
in space and height but varying in time. The grid is centered over the location
specified
in the main menu and covers a 50 by 50 km domain.
Meteorology / Display Data
Check File - A program to list the information in the meteorology
file index record as well as the ASCII header
portion of each data record.
Contour Map - A simple postscript contouring program that can be used to view the data fields in any ARL
packed meteorological data set. The output is always written to a file called contour.ps. Multiple time periods
may be processed.
Text Profile - This program is used to extract the meteorological data profile interpolated to a user selected
latitude-longitude position. The raw data values at the surface and each level are output on the left while
temperature converted to potential temperature and winds rotated from grid orientation to true are shown on the
right. Output is shown on the screen and is also written to a file called profile.txt.
Grid Domain - Program to generate a map showing the domain of the meteorological data grid with marks at each
selected data grid point. Output is written to the file showgrid.ps.
Utilities
Convert Postscript - Converts any Postscript file to another format using ImageMagick. Enter the name of the
Postscript file and the desired name of the output file, with the file suffix indicating the desired conversion.
GIS to Shapefile - When the GIS output option is selected in selected display programs an ASCII text file (in
generate format) is created that contains the latitude-longitude values of each contour. This GIS text file can be
converted to an ESRI ArcView shapefile. This menu tab is available from three different points. From the
meteorology tab the conversion is "line", from the trajectory tab the conversion is "points", and from the
concentration tab the conversion is "polygon".
Trajectory
Quick Start - This menu tab can be used to run any simulation in one step. The last configuration will be used for the
simulation. The menu brings up a global map with the current source location shown as a blue dot which can be
moved to
a new location. The Run Example menu tab resets the control and namelist files
and reruns the example
case. This can be used as a starting point for configurations
that have been corrupted.
Setup Run - This menu is used to write the CONTROL
file that defines all the model simulation parameters. Note
that once a simulation
has been defined it can be "Saved As" and later "
Retrieved." Note that the setup must be
"Saved" before proceeding
to the "Run Model" step.
Run Model - The selection opens the window for
standard output and starts the execution of "hyts_std.exe".
Incremental progress is reported on WinNT and later systems; however, on LINUX the display remains frozen
until the calculation has completed (a beep will sound).
Display - Display options are available to label the time increment along the trajectory and the amount of white
space surrounding the trajectory on the map. The HIRES option limits the domain to just fit the trajectory. Output
is written to the file trajplot.ps. The plot title can be defined in a file called labels.cfg which should be placed in
the \working directory. The special frequency display can be used to
plot the spatial frequency distribution of
multiple trajectory simulations.
Special Runs - The Ensemble option invokes a special executable that will run multiple trajectories from the same
location, with each trajectory using meteorological data slightly offset from the origin point. The Matrix option
simultaneously runs a regular grid of trajectory starting locations. Also available are multiple trajectory options in
both time and space as well as a menu to run a series of automated daily trajectories for as long as one month per
execution.

Concentration

Setup Run - Similar in function to the trajectory setup menu, with additional menu tabs for "Pollutant,
Deposition, and Grids", which is used to customize the concentration computation. Unlike trajectories, which are
terminated at the top of the model domain, pollutant particles reflect at the model top.
Run Model - Similar in function to trajectory run menu, executes "hycs_std.exe".
Display - Output is written to concplot.ps. Options on the menu permit concentration contours to be dynamically
scaled to the maximum for each map (default) or fixed for all maps based upon the highest value on any one map.
Other display programs are available for the color fill of uncontoured gridded data or the horizontal or vertical
display of the particle distributions and the computation of plume
arrival times.
Note that the units label as well
as the plot title can be defined in a file called
LABELS.CFG which should be placed in the \working directory.
Special display programs are available for a source-receptor matrix, ensemble, and
statistical simulation outputs.
Utilities
Convert to ASCII - Writes an ASCII file for each time period of concentration output with one record for every
grid point that has a non-zero concentration value. Each record is identified by the ending date and time of the
sample and the latitude and longitude of each grid point. The output file name is constructed as
[input file]_[julian day]_[output hour]. Some other output options are available.
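As an illustration of the naming pattern described above, the file name can be assembled from the three pieces. Note the zero-padding widths here are an assumption for illustration; the actual HYSPLIT utility may format the fields differently:

```python
def ascii_output_name(input_file, julian_day, output_hour):
    # Assumed padding: 3-digit julian day, 2-digit hour
    # (illustrative only; actual field widths may differ).
    return f"{input_file}_{julian_day:03d}_{output_hour:02d}"

# A hypothetical concentration file "cdump", day 32, hour 6:
print(ascii_output_name("cdump", 32, 6))  # → cdump_032_06
```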
Grid to Station - This program is used to extract the time-
series of concentration values for a specific location.
That location may be specified by a single latitude-longitude through the GUI or multiple stations may be defined
by entering the name of a file that contains an integer station number, latitude, and longitude on each record. The
output is written to con2stn.txt which can be read by the time series plotting program to produce the file
timeplot.ps. Note that the time series plot is always in integer units and therefore may require the specification of
a units multiplication factor.
Convert to DATEM - Conceptually similar to Grid-to-Station, this menu is used to convert a binary file to the
DATEM format and then compute model performance statistics. A data file with measurement data in DATEM
format must already exist, and the conversion program will match each model calculated concentration with a
measurement, compute performance statistics, and display a scatter diagram of measured
and calculated values.
Merge Binary Files - Multiple binary concentration files may
be added together for a sum, or the maximum value
is retained at each grid, or one of
the input files may be used as a zero mask, resulting in zero concentrations in
the
other file when the mask file shows a non-zero value.
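The three merge behaviors described above (sum, maximum, zero mask) can be sketched element-wise. This is an illustration of the logic only, not the actual HYSPLIT binary-file utility:

```python
def merge(grid_a, grid_b, mode):
    """Illustrative element-wise merge of two flattened
    concentration grids, mirroring the three options above."""
    out = []
    for a, b in zip(grid_a, grid_b):
        if mode == "sum":
            out.append(a + b)            # add the files together
        elif mode == "max":
            out.append(max(a, b))        # retain the maximum value
        elif mode == "mask":
            # grid_b acts as the zero mask: any non-zero mask value
            # zeroes the corresponding concentration in grid_a
            out.append(0.0 if b != 0 else a)
    return out

print(merge([1.0, 2.0, 3.0], [0.0, 5.0, 0.0], "mask"))  # → [1.0, 0.0, 3.0]
```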
Special Runs
Daily - Runs a script to rerun the last dispersion model
configuration each time with a different starting day and
time for one month. Output files are labeled with the starting time.
Matrix - Runs the matrix pre-processor to create
a CONTROL file with a grid of starting locations. The source-receptor conversion
option needs to be selected in the advanced menu to create the coefficient matrix
required for
source attribution applications.
Geolocation - Runs a special pre-processor that reads a file
of measured sampling data and automatically configures
a simulation whose output indicates the possible source regions associated with each
measured sample.
Ensemble / Meteorology - An application similar to the
trajectory ensemble, in that multiple simulations are run
each with a different offset in
the meteorological data to create an ensemble showing the sensitivity of the
simulation
to gradients in the meteorological data fields.
Ensemble / Turbulence - Instead of starting each
simulation with the same random number seed, multiple
simulations are run, each with a
different seed, which then shows the concentration sensitivity to the turbulence.
Ensemble / Physics - The physics ensemble varies different turbulence and dispersion namelist parameters from
the base simulation to develop an ensemble showing the sensitivity of the calculation to various model physics
assumptions.
Global - The global special run invokes an executable that
contains a global Eulerian module to permit particle
mass to be transferred to the global model at a specified age. Two concentration output grids are
created: one representing local/regional-scale plumes, the other the contribution to global background
concentrations.
Dust Storm - Invokes a pre-processor that creates
a CONTROL file with each location within the selected
computational domain that
contains a desert land-use designation. This option should be used in conjunction with
the advanced menu option to turn on the dust emission module.
Daughter Products - Invokes a pre-processor that creates a
DAUGHTER.TXT file containing the daughter
nuclides produced by a parent nuclide along with the half-life and branching fractions. In addition, a CONTROL
file is modified to include the daughter product information. This option should be used in conjunction with the
advanced menu option to turn on the daughter product module and set MAXDIM to the number of daughters.
Advanced Topics
Contains several menus for custom model configurations that can be used to change the nature of the simulation
and the way the model interprets various input parameters. These menus would be used to configure matrix,
ensemble, global, pollutant transformations, and model initialization options.
Trajectory Configuration - Creates the SETUP.CFG
namelist file for trajectory calculations.
Concentration Configuration - Creates the SETUP.CFG
namelist file for concentration-dispersion calculations.
Global Configuration - Creates the GEMPARM.CFG
namelist file to configure the global Eulerian model
subroutines.
Dynamic Sampling - Creates a virtual sampler that can fly
through the computational domain, either passively
with the wind or with a pre-defined
velocity vector.
Emissions File - Opens a simple editor to create a special
emission file that can define point sources that vary in
space, time, and intensity.
Satellite Data / TOMS and Particle Viewer - Permits the download
of TOMS aerosol index data and plots those
data with particle positions or model
generated concentration fields.
Center Reset Button
This will cause any changes to the GUI variables to be set back to the sample trajectory
and concentration simulation
case which uses the oct1618.BIN sample meteorological file.
This button loads the sample_conc and sample_traj
control files from the working directory.
Table of Contents
The structure of these files is given in the ASCDATA.CFG file, which defines
the grid system and gives the directory
location of the land-use, terrain, and roughness length files. The ASCDATA.CFG file should be located in the model's
startup or root directory. The last line in the file should be modified to reflect the path to the bdyfiles directory.
File Format
These files may be replaced by higher resolution user created files. Note that the first data point on the first record is
assumed to be centered at 179.5W and 89.5N, so that from the northwest corner the data go eastward and then
southward. User-supplied files should define roughness length (cm - F4)
and the following land-use categories (I4). The
record length of the file should be the number of longitude values times four bytes plus one additional byte for CR.
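The record-length rule above can be checked with a short calculation; for the supplied 1-degree files there are 360 longitude values per record:

```python
def record_length_bytes(n_lon):
    # 4 bytes per longitude value plus one byte for the CR terminator,
    # as described in the file-format notes above.
    return n_lon * 4 + 1

# The 360 x 180 1-degree files have 360 longitude values per record:
print(record_length_bytes(360))  # → 1441
```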
The 11 Land-Use Categories
1. urban
2. agricultural
3. dry range
4. deciduous
5. coniferous
6. mixed forest
7. water
8. desert
9. wetlands
10. mixed range
11. rocky
Several options take on default values in the CONTROL file when they are set to zero. Those options are
discussed in more detail below. Each input line is
numbered (only
in this text) according to the order it appears in the file. A
number in parenthesis after the line number
indicates that there is
an input loop and multiple entry lines may be required depending
upon the value of the previous
entry.
1- Enter starting time (year, month, day, hour)
Default: 0 0 0 0
Enter the two digit values for the UTC time that the calculation
is to start. Use 0's to start at the beginning (or end)
of the file
according to the direction of the calculation. All zero values in
this field will force the calculation to
use the time of the first
(or last) record of the meteorological data file. In the special
case where year and month
are zero, day and hour are treated as
relative to the start or end of the file. For example, the first
record of the
meteorological data file usually starts at 0000 UTC.
An entry of "00 00 01 12" would start the calculation 36
hours from
the start of the data file.
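When year and month are both zero, the start offset is simply day times 24 plus hour, as in the "00 00 01 12" example above. A minimal sketch of that arithmetic:

```python
def relative_start_hours(day, hour):
    """Offset in hours from the first (or last) record of the
    meteorological file when the CONTROL file's year and month
    are both zero."""
    return day * 24 + hour

# The entry "00 00 01 12" -> day 1, hour 12:
print(relative_start_hours(1, 12))  # → 36
```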
2- Number of starting locations
Default: 1
Single or multiple pollutant sources may be simultaneously
tracked. The emission rate that is specified in the
pollutant menu is assigned to each source. If
multiple sources are defined at the same location, the emissions
are
distributed vertically in a layer between the current emission
height and the previous source emission height. The
effective
source will be a vertical line source between the two heights. When
multiple sources are in different
locations, the pollutant is
emitted as a point source from each location at the height
specified. Point and vertical
line sources can be mixed in the same
simulation. The GUI menu can accommodate multiple simultaneous
starting locations, the number depending upon the screen resolution. Specification of additional locations requires
manual editing of the CONTROL file. Area source emissions can be specified from an
input file: emission.txt.
When this file is present in the root
directory, the emission parameters in the CONTROL file are superseded by
the emission rates specified in the file. More information on this file structure can be found in the advanced help
section.
3(1)- Enter starting location (lat, lon, meters, Opt-4, Opt-5)
Default: 40.0 -90.0 50.0
Position in degrees and decimal (West and South are negative).
Height is entered as meters above ground level
unless the mean-sea-level flag has been set.
The optional 4th (emission rate - units per hour) and
5th (emission area - square meters) columns on this
input line
can be used to supersede the value of the emission rate
(line 12-2) when multiple sources are defined, otherwise
all
sources have the same rate as specified on line 12-2. The
5th column defines the virtual size of the source:
point
sources default to "0".
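As an illustration (with hypothetical values, not taken from the sample CONTROL files), a starting-location line using the optional 4th and 5th columns might look like:

```
40.0 -90.0 50.0 100.0 1.0E+06
```

Here 100.0 would supersede the emission rate from line 12-2, and 1.0E+06 square meters would define the virtual size of the source.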
4- Enter total run time (hours)
Default: 48
Sets the duration of the calculation in hours. Backward
calculations are configured by setting the run time to a
negative
value. See the discussion in the advanced
help section on backward "dispersion" calculations.
5- Vertical motion option (0:data 1:isob 2:isen 3:dens
4:sigma 5:diverg 6:msl2agl 7:average 8:damped)
Default: 0
Indicates the vertical motion calculation method. The default
"data" selection will use the meteorological model's
vertical
velocity fields; other options include isobaric, isentropic,
constant density, constant internal sigma
coordinate, computed from
the velocity divergence, a special transformation to correct
the vertical velocities
when mapped from quasi-horizontal surfaces (such as relative to MSL) to HYSPLIT's internal terrain following
sigma
coordinate, and a special option (7) to spatially average the
vertical velocity. The averaging distance is
automatically
computed from the ratio of the temporal frequency of the data
to the horizontal grid resolution.
6- Top of model domain (internal coordinates m-agl)
Default: 10000.0
Sets the vertical limit of the internal meteorological grid. If
calculations are not required above a certain level,
fewer
meteorological data are processed thus speeding up the computation.
Trajectories will terminate when they
reach this level. A secondary
use of this parameter is to set the model's internal scaling height
- the height at
which the internal sigma surfaces go flat relative
to terrain. The default internal scaling height is set to 25 km but
QCYCLE = 0.0000E+00,
FRME = 0.10000000,
FRMR = 0.0000E+00,
KRND = 6,
DELT = 0.0000E+00,
ISOT = 0,
TKER = 0.50000000,
NDUMP = 0,
NCYCL = 0,
TRATIO = 0.75000000,
MGMIN = 10,
KMSL = 0,
NSTR = 0,
CPACK = 1,
ICHEM = 0,
DXF = 1.0000000,
DYF = 1.0000000,
DZF = 9.9998E-03,
NINIT = 1,
PINPF = PARINIT,
POUTF = PARDUMP
-- End Namelist configuration --
NOTICE main: pollutant initialization flags
Gas pollutant - T
Advanced / Help
This section provides some guidance in configuring the model to perform
certain specialized calculations. These include
deciding between particle or puff releases, dealing with continuous emissions, creating a file with
time variations of
the emission rate, unit-source
simulations with time-varying emissions,
area source emissions,
multiple pollutants,
pollutant transformations, atmospheric chemistry,
deposition and decay,
compilation limits, scripting for automation
applications, backward dispersion for source attribution, configuring the time or spatial variation of the emission rate
using an input file, and how to compute
deposited pollutant transport on ocean water surfaces.
The Advanced menu is
composed of four sections.
1. The Configuration Setup menu permits the
creation and modification of the SETUP.CFG namelist file for either
trajectory or concentration simulations, defining
additional parameters that can be used to modify the nature of the simulation.
The namelist file is not required
because all the namelist variables take on
default values when the SETUP.CFG file is not found. The same
namelist
file can be used for either trajectory or air concentration simulations but
only certain variables are
applicable to each simulation. Supplemental
plot labeling options are available, a menu to configure
a dynamic
sampler that can move in space and time for
use with the concentration simulation, and a menu to configure the
default
directory structure and location of executable programs
used by the GUI. More complex point source
emissions scenarios can be defined
by creating the EMITIMES emissions file.
2. The Satellite Data menu opens two special menus that deal
with downloading and displaying TOMS satellite
data. The FTP NASA TOMS
satellite data menu opens up an anonymous FTP window to access the current
satellite
archives for the TOMS aerosol index (AI). These data can
then be displayed through the TOMS Viewer.
In
addition to AI, the viewing program will also display the pollutant particle
positions. The particle position file,
called PARDUMP by
default, is a snapshot of all the pollutant particles at a given time. Its creation
is controlled
through the configuration menu.
3. The View MESSAGES menu is used to display the diagnostic message file. All trajectory or concentration
simulations write diagnostic information to this file. If the model does not complete properly, some additional
diagnostic information may be obtained from this file.
4. The Scan for Updates menu can be used to make small updates to a
limited number of executables and scripts for
the same HYSPLIT release version.
Updates to executables and scripts may be posted to the ARL web server.
This
menu compares the dates of those updates with the installed versions. If newer
programs are available, you
will be prompted to replace each with the update.
Older programs are copied to the updates directory and the
update can be reversed
if desired. The update process only changes programs in the ./exec directory and
scripts in
the ./guicode directory. Once a significant number of updates have
accumulated, the entire release version will be
updated and a new installation
must be manually downloaded from the ARL server. Only updates to the same
version
number are permitted.
Table of Contents
Meteorology / Help
Meteorological data fields to run the model are usually already
available or can be created from routine analysis
archives or
forecast model outputs. More complete descriptions of the different
data sets are available on-line. The
meteorological data available through the menu system are
summarized in the following sections.
The meteorology menu is divided into four sections: data FTP, data
conversion, data display, and utility programs.
The FTP section provides access to various data files that may be FTP'd
from the ARL server compatible for
immediate input
to HYSPLIT. These files include current and appended
forecasts as well as various regional and
global analysis archives and longer term archives such as the
NCAR/NCEP reanalysis data. All data files have
already
been converted for HYSPLIT applications.
The convert menu gives data processing options for files that are stored
on the local computer. Processing of
forecast or analysis files is available
for archive, forecast, and regional or global data files in various formats.
The appended forecast files consist of a special time extraction of the previous seven (-48 hr) forecast cycle files to
create a series of pseudo-analysis (0-hour
initialization) and short-time (+3 h or +1,+2,+3,+4,+5) forecasts that have
been
appended to each other to create a continuous data time series for the previous
48 hours in a single file. These
special time extracts are only available for the
following files.
gfsa :: 1-deg 3P -48h (160 Mb)
nama :: 12-km 3P -48h (829 Mb)
ruca :: 20-km 1P -48h (60 Mb)
40-km resolution (AWIPS 212 grid) starting in 2004. The GDAS (every
6-h) was saved on a hemispheric projection at
191-km resolution through the end of 2006. The new 1-degree and half-degree GDAS archive
is available every three
hours. See notes below for a more detailed
discussion of each file.
EDAS :: 80 km 3P (<=2003 SM 82 Mb) Semi-monthly files at three-hour intervals on pressure surfaces
EDAS :: 40 km 3P (>=2004 SM 600 Mb) Semi-monthly files at three-hour intervals on pressure surfaces
GDAS :: 1-deg 3P (>=2005 WK 600 Mb) Weekly files (W1=1-7; W2=8-14; W3=15-21; W4=22-28; W5=29-end) every three hours on pressure surfaces
GDAS :: 0.5-deg 3S (>=2010 DA 468 Mb) Daily files every three hours on the native GFS hybrid sigma coordinate system. Beginning July 28, 2010.
NAM12 :: 12-km 3P (>=2007 DA 395 Mb) Composite archive of 0 to +6 hour forecasts appended into daily files for the CONUS at three-hour intervals on pressure surfaces
NAMs :: 12-km 1S (>=2008 DA 994 Mb) Composite archive of 0 to +6 hour forecasts appended into daily files for the CONUS at one-hour intervals on sigma surfaces
From the GUI it is necessary to enter the year, month, and data
file period. Semi-monthly files are designated by 001
(1st through the 15th) and 002 (16th through the end of the month),
weekly files by the week number, and daily files by the two-digit day.
The file names are created automatically.
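The period designators described above can be sketched in a few lines; the helper name and return form are illustrative, not part of HYSPLIT:

```python
from datetime import date

def period_designators(d: date):
    """Illustrative helper (not part of HYSPLIT): derive the semi-monthly,
    weekly, and daily period designators for a date, following the
    conventions described above."""
    semi_monthly = "001" if d.day <= 15 else "002"   # 001 = 1st-15th, 002 = 16th-end
    week = min((d.day - 1) // 7 + 1, 5)              # W1=1-7 ... W5=29-end
    daily = f"{d.day:02d}"                           # two-digit day for daily files
    return semi_monthly, f"W{week}", daily

# 28 July 2010 falls in the second half-month, week W4, day 28
periods = period_designators(date(2010, 7, 28))
```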
For earlier global archive data see Meteorology - ARL Data FTP - Reanalysis.
Additional information about the data archives on the ARL server
can be found at the ARL web page.
calculations using HYSPLIT through the GUI. Data files are named
according to the following syntax: RP{YEAR}
{MONTH}.gbl, where R indicates "Reanalysis", P indicates that the data are on pressure
surfaces, YEAR is a four digit
year, and MONTH is a two digit month.
The file suffix identifies the projection as the 2.5 degree global
latitude-longitude projection (gbl). The projection details are encoded in the file's index record and are processed by HYSPLIT.
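As a quick check of the naming convention, the file name for a given year and month could be built as follows (a sketch; the function name is hypothetical):

```python
def reanalysis_filename(year: int, month: int) -> str:
    """Build a reanalysis file name per the convention above:
    R = Reanalysis, P = pressure surfaces, four-digit year,
    two-digit month, .gbl = 2.5-degree global lat-lon projection."""
    return f"RP{year:04d}{month:02d}.gbl"

name = reanalysis_filename(1997, 3)  # "RP199703.gbl"
```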
However, the NARR GRIB files are bit mapped, meaning some of the corners
do not have data defined due to differences in the internal grid projection
of the model versus the output grid. HYSPLIT does not support projections
with a bit map, and when running the data converter a message will be
generated indicating that the grid has been slightly reduced to ensure that
all data points within the domain have valid data. The reduced domain
will be processed into one output file. The ECMWF download menu permits
copying all the data on the global grid or
extracting regional sub-grids. The ARL conversion program can handle either option.
The conversion menu is designed to select three different GRIB
file types: upper air data, surface data, and time
invariant data.
The upper air and surface data files should contain data for the
same time periods. To be able to run
HYSPLIT, the upper air data
file must contain at a minimum the following variables:
geopotential, temperature, u-velocity, v-velocity, w-velocity, and
relative humidity. The surface variable file should contain the 2-m
temperature, the 10-m u- and v-velocity components, and the total
precipitation field, if wet removal calculations are required.
information is removed and all the input files should be in the local working directory.
The data entry widget contains five time fields and four data
fields. The output file time interval is computed
automatically
from the input data. No interpolation is performed and the input
data nearest in time to the output time
are used for the
conversion. The maximum output interval is one hour. Note that the
date-time field defaults to the
current system clock time. The
meteorological variables are wind direction, speed (m/s), mixed
layer depth (m), and
Pasquill stability category (1=A to 7=G).
After filling in the data for the first line, use the Repeat
button to fill the remaining table entries and then edit the time
and data fields as required.
The U & V wind components shown on the left side are relative to the
grid, while those on the right side are with respect to north-south
and east-west. They would be the same for a latitude-longitude grid.
Normally the fields will be blank when the menu is first invoked. Prior to creating the shapefiles it is necessary to create
the "Generate" text file which contains the latitude-longitude points of each contour as created by the contouring
programs available through the display menus. Checking the GIS box in the menu (-a1 option on the command line)
creates these files. All generate format files start with GIS and end with .txt. The remainder of the file name depends
upon the application from which it was created. However in all applications, the two-digit string prior to the .txt
identifies the frame number. There is one GIS output file per frame. Use the upper browse button to set the generate

Some auxiliary portions of Shapelib, notably some of the components in the contrib directory, come under slightly different
license restrictions. Check the source files that you are actually using for conditions.
Default License Terms
Copyright (c) 1999, Frank Warmerdam
This software is available under the following "MIT Style" license, or at the option of the licensee under the LGPL (see
LICENSE.LGPL). This option is discussed in more detail in shapelib.html. Permission is hereby granted, free of
charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute,
sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject
to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR
A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH
THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Surface fields:

Description                    Units    Identification
Pressure at surface            hPa      PRSS
Pressure at mean sea level     hPa      MSLP
Temperature at surface         K        TMPS
Total precipitation (6 h)      m        TPP6
U-Momentum flux                N/m2     UMOF
V-Momentum flux                N/m2     VMOF
Sensible heat flux             W/m2     SHTF
Latent heat flux               W/m2     LTHF
Downward short wave flux       W/m2     DSWF
Temperature at 2 m             K        T02M
Relative humidity at 2 m       %        RH2M
U-wind component at 10 m       m/s      U10M
V-wind component at 10 m       m/s      V10M
Total cloud cover              %        TCLD

Upper-level fields:

Description                    Units    Identification
U-wind component               m/s      UWND
V-wind component               m/s      VWND
Geopotential height            gpm      HGTS
Temperature                    K        TEMP
Vertical velocity              hPa/s    WWND
Data may be obtained from any source; however, there are certain
minimum data requirements to run the model: surface pressure or
terrain height, u-v wind components, temperature, and moisture. In
addition, precipitation is required for wet removal calculations.
Not required, but certainly helpful in improving vertical mixing
calculations, would be some measure of low-level stability. This
may take the form of a near-surface wind and temperature or the
fluxes of heat and momentum.
It is also important to have sufficient vertical resolution in
the meteorological data. Some of the NOAA higher
resolution data
files have five or more levels in the boundary layer (<850 hPa)
in addition to wind and temperatures
near the surface, usually at 2
and 10 m agl. These surface values are especially important when
the data are only
available on absolute pressure surfaces, rather
than terrain following sigma surfaces, to avoid interpolation to
the
surface between data levels when the local terrain is well
above sea-level.
Starting with HYSPLIT (Revision 596) the optional field DIFF is recognized as the difference between the original data
and the
packed data: DIFF=ORIGINAL-PACKED. The effect of this is to
increase the precision of variables that have
this additional field.
When the DIFF field is read by HYSPLIT, the values are added to the data field resulting in values
closer to the original data.
Currently only DIFW and DIFR (vertical velocity and precipitation)
are recognized as valid
DIFF fields.
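The DIFF correction is simple arithmetic; a sketch with made-up numbers (the values are hypothetical, the relation DIFF=ORIGINAL-PACKED is from the text):

```python
# Hypothetical values: a field before and after lossy packing
original = [0.137, 2.450, -0.031]
packed   = [0.125, 2.500,  0.000]

# The DIFF field stores ORIGINAL - PACKED for each grid point
diff = [o - p for o, p in zip(original, packed)]

# On input, HYSPLIT adds DIFF back to the packed field, recovering
# values closer to the original data (here, ignoring the packing of
# DIFF itself, exactly the original values)
restored = [p + d for p, d in zip(packed, diff)]
```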
Creation of a Meteorological Input Data File
One may prepare meteorological data from any number of
different sources to be in a suitable format for the model
using a
series of routines described in this section. In general it is
assumed one has access to a meteorological data source: either the
data fields are already on a grid, such as those output from a
meteorological model, or perhaps from observations that have been
interpolated to a grid.
provided within the GUI to
convert from either NOAA or ECMWF
GRIB format data files to the HYSPLIT
format.
The meteorological data are processed in time-sequence, calling
the subroutines provided, to create a HYSPLIT
compatible output
file. These subroutines will pack the data and write the index
record. The index record, which
precedes the data records for each
time period, can only be written after the data records are
processed. The packing
routines must first be initialized by
setting the appropriate grid parameters and defining all the
meteorological variables
that will be written to the file. Multiple
output grids may be defined and written simultaneously by invoking
the
PAKSET routine with a different unit number for each
grid. The grid parameters are all defined in a configuration file,
Grid Number........ 99
Z-Coordinate....... 2
Orientation........ 0.0
Reserved........... 0.0
Number of Levels... 16
Format: F6.,I3,(1X,A4)
records:
Records 13 & 14. In this example, the position
indicated is the center of the grid located over the North Pole.
Any
position is acceptable. It need not even be on the grid.
Lat-Lon Grids only: contains the latitude and longitude of the
1,1
position grid point.
Record 15 is not currently used.
Records 16 & 17 identify the number of grid points in
each direction.
Record 18 is the number of levels in the vertical,
including the surface level.
Record 19, through the number of levels, identifies the
height of each level in appropriate units according to the definition
of the vertical coordinate:
1-sigma (fraction)
2-pressure (mb)
3-terrain (fraction)
If it is not necessary to compare the checksum, set ksum = -1; this
will save a little computer time. Due to the sub-grid option the
checksum cannot be computed in the regular unpacking loop, but
requires a second pass through the data. The checksum pass is
enabled when ksum=0.
It will then return a non-zero value. If you don't reset it to
zero, no further checksums will be
computed.
If you want to create your own packed data by converting a
real*4 array to the character*1 packed data array use the
following:
CALL PAKOUT(rvar,cvar,nx,ny,nxy,prec,nexp,var1,ksum)
Although the structure of the packed data may seem complex,
unpacking is a very simple process, the basic elements of
which are
summarized in the Fortran code shown below. The value of each
element is the sum of the initial value and
the difference stored
in that element location.
SUBROUTINE UNPACK(CPACK,RVAR,NX,NY,NXY,NEXP,VAR1)
CHARACTER*1 CPACK(NXY)
REAL*4 RVAR(NX,NY)
! scaling factor determined by the packing exponent
SCALE=2.0**(7-NEXP)
! previous value, initialized to the first value of the array
VOLD=VAR1
INDX=0
DO J=1,NY
DO I=1,NX
INDX=INDX+1
RVAR(I,J)=(ICHAR(CPACK(INDX))-127.)/SCALE+VOLD
VOLD=RVAR(I,J)
END DO
! each new row starts from the first value of the previous row
VOLD=RVAR(1,J)
END DO
RETURN
END
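For readers more comfortable outside Fortran, the same unpacking logic can be sketched in Python (a direct transcription, not HYSPLIT code):

```python
def unpack(cpack: bytes, nx: int, ny: int, nexp: int, var1: float):
    """Transcription of the UNPACK logic above: each byte holds a scaled
    difference from the previously unpacked value; each new row begins
    from the first value of the previous row."""
    scale = 2.0 ** (7 - nexp)
    rvar = [[0.0] * nx for _ in range(ny)]
    vold = var1
    indx = 0
    for j in range(ny):
        for i in range(nx):
            rvar[j][i] = (cpack[indx] - 127) / scale + vold
            vold = rvar[j][i]
            indx += 1
        vold = rvar[j][0]  # VOLD=RVAR(1,J) in the Fortran
    return rvar
```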
Trajectory / Help
The trajectory menu tab is composed of five main sections:
setting up the simulation, executing the model, displaying
the
trajectory, converting the graphic to other formats, and
running special simulations. The model is configured and
messages may also appear in the above window. Once the model has
completed, press Exit to close the window.
the value and latitude-longitude points along each trajectory. This file can be converted to an ESRI Shapefile or
converted for other GIS applications through the utility menu "GIS to Shapefile". The view checkbox would be
disabled to do just the GIS conversion without opening the Postscript viewer. Selecting Google Earth will create
a compressed file (*.kmz) for use in Google Earth, a free software package to display geo-referenced
information in 3-dimensions. You must have the free Info-Zip file compression software installed to compress
the Google Earth file and associated graphics.
-e[End hour to plot: #, (all hours)]
-f[Frames: (0)-all frames in one file 1-one frame per file]
= {blank} - draws four distance rings about source point, automatically scaled
= {#} - draws the indicated number of rings about the source point
= {#1}:{#2} - draws the number {#1} of rings about the source point each {#2} kilometers apart. In the special
case where #1 is zero, the rings are not drawn, but the size of the plot is scaled as if one ring of radius {#2} is to
be drawn.
-h [Hold map at center lat-lon: (source point), lat:lon]
= latitude:longitude - forces the map center to be at the selected location instead of the source point. In
conjunction with the -g0:{km} option it is possible to create a constant map projection for each execution.
-i[Input files: name1+name2+name3+... or +listfile or (tdump)]
= {name1+name2+...} - to overlay multiple tdump files on the same plot based upon file name1 plot limits.
= +filename - filename is a file of filenames to be plotted following the same convention as the previous
argument (limit <=5000 files).
-j[Map background file: (arlmap) or shapefiles.(txt|suffix)]
= N - 1=red,2=blue,3=green,4=cyan,5=magenta,6=yellow,7=olive
-l[Label interval: -12, -6, 0, (6), 12, ... hrs]
= 0 - none
= 1 - auto
= trajplot.ps is the default Postscript output file name, otherwise it may be {user defined}.
-p[Process file name suffix: (ps) or process ID}
-s[Symbol at trajectory origin: 0-no (1)-yes]
-v[Vertical: 0-pressure (1)-agl 2-isentropic 3-meteorology 4-none]
'MAPID&','PROBABILITY&'
'UNITS&','BQ&'
'VOLUM&','/M3&'
Additional supplemental text may be added at the bottom of the
graphic by creating a file called MAPTEXT.CFG, also to be
located in the startup directory. This is a generic file used by
all plotting programs, but each program will use different lines in
its display. The file can be created and edited through the Advanced menu tab.
Standard ASCII Map Background File
By default, all mapping programs use the same text map background file,
arlmap, which normally would be located in
the ../graphics
directory. However, all graphics programs first search the local start
directory (../working if running the
GUI), then the ../graphics
directory. Customized map background files can be read instead of the default
file for
specialized applications. Some higher resolution map background
files are available from the HYSPLIT download web
page. These different map files may be accessed
implicitly by putting them in the ../working directory (the startup directory), or
Matrix
The matrix calculation is a way to set up the CONTROL
file for multiple starting locations that may exceed the GUI
limit
of 6 under the Trajectory Setup tab.
Hundreds or thousands of starting points may be specified. The
Run Matrix
tab just executes the model using a
CONTROL file that is dynamically created from the
default_traj file using a special
program called
latlon that is called from within the GUI. The program reads
a CONTROL file that is required to have
three starting
locations and then re-writes the same file with multiple locations.
The multiple starting locations are
computed by the latlon
program based upon the number of starting points that fall in the
domain between starting point
1 and starting point 2. Each new
location is offset by an amount equal to the difference between
starting locations 1 and
3. An example would be more illustrative.
If the original control file has three starting locations: #1 at
40N, 90W, #2 at
50N, 80W, and #3 at 41N, 89W; then the matrix
processing results in a CONTROL file with 121 starting locations,
all
one degree apart, all between 40N 90W and 50N 80W.
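The expansion performed by latlon can be sketched as follows (an illustration of the arithmetic, not the actual latlon source):

```python
def matrix_locations(loc1, loc2, loc3):
    """Fill the domain between starting locations 1 and 2 with points
    spaced by the offset between locations 1 and 3 ((lat, lon) tuples)."""
    (lat1, lon1), (lat2, lon2), (lat3, lon3) = loc1, loc2, loc3
    dlat, dlon = lat3 - lat1, lon3 - lon1
    locations = []
    lat = lat1
    while lat <= lat2 + 1e-9:          # small tolerance for float stepping
        lon = lon1
        while lon <= lon2 + 1e-9:
            locations.append((round(lat, 4), round(lon, 4)))
            lon += dlon
        lat += dlat
    return locations

# The example above: 40N 90W to 50N 80W with a one-degree offset
pts = matrix_locations((40.0, -90.0), (50.0, -80.0), (41.0, -89.0))
# yields an 11 x 11 grid of 121 starting locations
```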
Daily
Another way to generate multiple trajectories
in time is to use the automated daily trajectory menu option to create an
endpoints file for each starting time. This differs from the previous approach where all trajectory endpoints were
written to one output file. Using this menu the model is executed in a script with a unique output
file name for each
simulation. This method is required for trajectory
clustering.
Clustering
Multiple trajectories may be aggregated into groups to minimize the
difference between trajectories in that group using
The model should be configured and run for the first simulation time
through the standard run menu to ensure that the
simulation is configured
correctly. If part of the simulation requires meteorological data from the
previous month or the
previous month or the
next month, these should be included in the base
simulation test.
At first, after pressing the Execute Script button, it will appear
that nothing is happening. However, as each day's simulation is
completed, the Start Day entry box increases in value until its value
matches the End Day, at which time
the script is completed. Note that
concentration simulations are much slower than trajectories and it may
take a while
before the start day field is updated.
The dispersion daily run menu contains an additional entry shown below
that determines how the file name is calculated
for each run. The default
approach is the same as with the trajectory calculation, where the day and
hour is appended to
the base name. Another option is to name the files sequentially using a three digit number. Using sequential numbering,
the
output files can be processed by the probability display programs used in conjunction with the ensemble calculation.
clusters, that are each different from the other sub-sets. The program will
usually produce at least one possible
outcome set of clusters. If more than one outcome is given, the user must then subjectively choose one for the final
result.
Description of clustering process:
Initially, total spatial variance is zero. Each trajectory is defined to be a cluster, in other words, there are N trajectories
and N clusters. For the first iteration, which two clusters (trajectories) are paired? For every combination of trajectory
pairs, the cluster spatial variance (SPVAR) is calculated. SPVAR is the sum of the squared distances between the
endpoints of the cluster's component trajectories and the mean of the trajectories
in that cluster. Then the total spatial
variance (TSV), the sum of all the cluster spatial variances, is calculated. The pair of clusters combined are the ones
with the lowest increase in total spatial variance. After the first iteration, the
number of clusters is N-1. Clusters paired
always stay together.
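The variance bookkeeping can be expressed compactly; the sketch below uses plain lat-lon offsets as the distance measure for brevity (the actual program works with distances between trajectory endpoints):

```python
def spvar(cluster):
    """Cluster spatial variance: sum of squared distances between the
    endpoints of the cluster's trajectories and the cluster-mean
    trajectory. Each trajectory is a list of (lat, lon) endpoints."""
    n_traj, n_pts = len(cluster), len(cluster[0])
    # mean trajectory: average position at each endpoint time
    mean = [(sum(t[k][0] for t in cluster) / n_traj,
             sum(t[k][1] for t in cluster) / n_traj)
            for k in range(n_pts)]
    return sum((t[k][0] - mean[k][0]) ** 2 + (t[k][1] - mean[k][1]) ** 2
               for t in cluster for k in range(n_pts))

def tsv(clusters):
    """Total spatial variance: sum of SPVAR over all clusters; at each
    iteration the pair merged is the one giving the smallest increase
    in this quantity."""
    return sum(spvar(c) for c in clusters)
```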
Step 1. Inputs.
Run ID - A label to identify each run. The label ends at the first blank space. The other numeric inputs may be
part of the label. For instance if trajectories during 2004 from Ohio were clustered,
the Run_ID could be
Ohio_2004. If you used 48-h trajectories, hourly endpoints, and every other trajectory, a label of
Ohio_2004_48_1_2 could be used. If you later decided to only use the first 36-h of the trajectory
Ohio_2004_36_1_2
might be used.
Hours to cluster - Trajectory durations up to the given hour are used. Must be a positive number. Time is from
trajectory origin whether backward or forward. Trajectories terminating before the given hour will NOT be
included in the clustering.
Premature terminations commonly result from missing meteorology data or the
trajectory reaching the meteorological grid horizontal or top edge. 36 hours is typical.
Daily. All the trajectory endpoints files need to be in one folder and each must have the name tdump within its
filename. In this
step, a file, INFILE, listing all the tdump files will be created in the working folder.
Note on endpoints files - There can be only one trajectory per file. At least 16 trajectories are needed after
trajectories are skipped, if specified.
Run Cluster - The cluster analysis program is run here given the INFILE file, the trajectory endpoints files, and the
above inputs. On a typical PC, a cluster run with 365 trajectories, 36-h
duration, and using every hourly endpoint,
will take a couple minutes. Going
beyond several years of trajectories will result in a run that will take a long
time and/or use much memory. A warning message is given for larger runs,
but it can be hard to tell whether a
"large" job has failed due to lack of memory or whether it is feasible at all.
For example with 1100 trajectories it may appear not to be running. Try some
intermediate runs - 600
trajectories, 900 trajectories - and note the run time. Add to the number of trajectories as reasonable. Let it run
overnight. If it
takes "too long", increase the "time interval" to say 2 to use every other trajectory
endpoint and/or
set the "trajectory skip" to say 3 to use every third trajectory.
Another option to bypass possible GUI errors is to
run cluster.exe from the command line. To do this, open the "Command Prompt" (for Windows, Start, All
Programs, Accessories), cd to your cluster working directory (e.g.
cd \hysplit4\cluster\working), and run
\hysplit4\exec\cluster.exe. If the file
with the input values CCLUSTER exists in the cluster working directory,
cluster.exe will start running, otherwise you will be prompted for the input values. When done, "exit" the
Command Prompt,
and return to the GUI for the subsequent processing.
clusters, NF. The arbitrary cluster number and percent of trajectories in each
cluster are given.
Display Clusters produces one map for each cluster, showing the
trajectories in each cluster.
Trajectories not used are those input to the cluster
program, i.e. in the endpoints directory, and at the given skip
interval, that
terminate before the trajectory duration equal to the Step 1, Hours to cluster.
This occurs when the
trajectory reaches the meteorology grid edge or when there
is missing meteorology. Trajectories not used
immediately displays a plot showing the trajectories not used and opens the Trajectory Cluster Display window,
from which the cluster-mean trajectories for the trajectories not used (cluster #0) and all the other clusters may be
displayed. Note the plot showing the trajectories not used must have been previously created in Step 3, Display
Clusters, though it is not displayed there.
Archive - All files are moved, not copied, to the given directory. Files created in Step 3 contain the final number of
clusters (NF) in their filename; hence output using various values of NF may be readily archived.
measured data. The measured data file must be in the DATEM format. The configuration process
is very similar to that
described for dispersion simulations.
Step 1: defines the measured data input used in this series of calculations. The trajectory model is run backward, from
each of the sampling locations, with a trajectory
started (at the beginning, middle, end) for each sampling period with a
non-zero measurement.
Measured values less than or equal to the value in the text entry box are considered to be zero.
In this case the measurement data were created using a forward dispersion simulation from 40N 80W, which is near the
maximum point of the trajectory frequency plot; the position of the maximum is also indicated in the plot label
along the left edge.
CONTROL file when they are set to zero. Those options are
discussed in more detail below. Each input line is
numbered (only
in this text) according to the order it appears in the file. A
number in parenthesis after the line number
indicates that there is
an input loop and multiple entry lines may be required depending
upon the value of the previous
entry.
1- Enter starting time (year, month, day, hour, {minutes optional})
Default: 00 00 00 00 {00}
Enter the two digit values for the UTC time that the calculation
is to start. Use 0's to start at the beginning (or end)
of the file
according to the direction of the calculation. All zero values in
this field will force the calculation to
use the time of the first
(or last) record of the meteorological data file. In the special
case where year and month
are zero, day and hour are treated as
relative to the start or end of the file. For example, the first
record of the
meteorological data file usually starts at 0000 UTC.
An entry of "00 00 01 12" would start the calculation 36
hours from
the start of the data file. The minutes field is optional but should
only be used when the start time is
explicitly set to a value.
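The relative-start arithmetic in the example is simply day*24 + hour from the file start; a sketch (assuming the file begins at 0000 UTC, as the text notes is usual):

```python
def relative_start_offset(day: int, hour: int) -> int:
    """Offset in hours implied by a CONTROL start time with zero year
    and month, counted from the start (or end) of the data file."""
    return day * 24 + hour

# The "00 00 01 12" example: one day plus twelve hours = 36 hours
hours = relative_start_offset(1, 12)
```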
2- Enter number of starting locations
Default: 1
Simultaneous trajectories can be calculated at multiple levels
or starting locations. Specification of additional
locations for certain types of simulations can also be accomplished through the Special Simulations menu tab, or
through manual
editing of the CONTROL file and running the model outside of
the GUI. When multiple starting
locations are specified, all
trajectories start at the same time. A multiple trajectory in time
option is available
through the Advanced
menu through a namelist file parameter setting.
3(1)- Enter starting location (lat, lon, meters)
Default: 40.0 -90.0 50.0
Trajectory starting position in decimal degrees (West and
South are negative). Height is entered as meters
above ground-level.
An option to treat starting heights as relative to mean-sea-level is
available through the
Advanced menu through a
namelist file parameter setting.
4- Enter total run time (hours)
Default: 48
Specifies the duration of the calculation in hours. Backward
calculations are entered as negative hours. A
backward trajectory
starts from the trajectory termination point and proceeds upwind.
Meteorological data are
processed in reverse-time order.
5- Vertical motion option (0:data 1:isob 2:isen 3:dens 4:sigma 5:diverg 6:msl2agl 7:average)
Default: 0
Indicates the vertical motion calculation method. The default "data"
selection will use the meteorological model's
vertical velocity fields;
other options include {isob}aric, {isen}tropic, constant {dens}ity, constant internal
{sigma} coordinate, computed from the velocity
{diverg}ence, vertical coordinate remapping from MSL to
AGL, and a special option (7) to spatially average the vertical velocity. The averaging distance is automatically
computed from the ratio of the temporal frequency of the data to the horizontal grid resolution.
6- Top of model domain (internal coordinates m-agl)
Default: 10000.0
Sets the vertical limit of the internal meteorological grid. If
calculations are not required above a certain level,
fewer
meteorological data are processed thus speeding up the computation.
Trajectories will terminate when they
reach this level. A secondary
use of this parameter is to set the model's internal scaling height
- the height at
which the internal sigma surfaces go flat relative
to terrain. The default internal scaling height is set to 25 km but
Default: file_name
The trajectory end-points output file is named in this entry line.
5I6 - Data file starting Year, Month, Day, Hour, Forecast Hour
Record #3
I6 - trajectory number
Concentration / Help
The concentration menu tab is composed of six main sections:
setting up the simulation, executing the model, displaying
the
concentrations, various utility programs for converting the output
to other formats, configuring special simulations,
and simulations
in a multi-processor environment. The model can be entirely
configured and executed through the
menu. However for experienced
users, each component may be run independently from the command
line.
In the Concentration Setup
menu, the entire purpose of the GUI is to
configure the model's input CONTROL file. This is
a text
file that configures the simulation parameters. Once the input
parameters are set to their desired value, the model
is executed
from the Run Standard Model menu tab. When
complete, the output window is closed and Display
Options
menu is used to draw and display the concentration contours from the model's binary concentration output file. The
with the HYSPLIT graphic and three button options: Menu, Help, Exit. Click on Menu.
Step 2 - The four main menus of Hysplit4 will appear: Meteorology, Trajectory,
Concentration, and the optional
concentration display, utility programs, and special simulations. In general, they should
be executed in sequential order.
Click on Setup Run.
Step 4 - The Setup Run menu brings up similar starting
information requirements as with the trajectory menu.
There are three additional sub-menus: Pollutant - that can be used to set the emission rate, duration, and start time of the
emission; Grids - to set the location, resolution,
levels, and averaging times of the concentration output grid; and Deposition - to
set the characteristics of each
pollutant. Click on Retrieve, enter the name of the sample pre-configured control file:
sample_conc, then click on OK. After the data entry widget is closed, click on Save and the setup menu will close.
Step 5 - From the main
concentration menu tab select Run
Model, which copies the setup configuration to the model's
input CONTROL file and starts the model simulation. Messages will appear on standard output showing the progress of
the calculation or after
the calculation has completed. Be patient as concentration calculations
may take considerably
longer than trajectory calculations. Click on Exit to close the message window. At that point the binary
concentration
output file is ready to be converted to a graphical display.
Step 6 - Click on Display / Concentration and then select Contours to run a special program that converts the binary
concentration output file to a Postscript graphic, displayed through
Ghostscript/Ghostview. The 12-hour average air concentration for a one-hour release is shown in the illustration below.
pollutant and deposition menus must both reference the same number
of species. Multiple species simulations are
calculated
independently; hence there is no computational difference between running two
separate simulations or combining both
species in one simulation.
The multiple species option is primarily used for chemical
transformation simulations. A
simple example is available from the
configuration menu checkbox of "10% per
hour", which transforms species #1 to
species #2 at a rate of 10%
per hour. The transformation occurs on the same particle and is
discussed in more detail in
the Advanced
Applications section. More complex transformations require a
linkage with compatible external modules.
None are available at
this time for public distribution.
definition changes such that the latitude grid spacing equals the sector angle in degrees and the longitude grid
spacing equals the sector distance spacing in kilometers.
18(3)- Grid span (deg) Latitude, Longitude
Default: [180.0] [360.0]
Sets the total span of the grid in each direction. For instance,
a span of 10 degrees would cover 5 degrees on each
side of the
center grid location. A plume that goes off the grid would have
a cutoff appearance, which can
sometimes be mitigated by moving the
grid center further downwind.
In the special case of a polar (arc,distance) concentration grid,
defined when the namelist variable cpack=3, the
definition changes such that the latitude span always equals 360.0 degrees and the longitude span equals the total
downwind distance in kilometers. Note that the number of grid points equals 360/arc-angle or the total-distance
divided by the sector-distance.
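The grid-point arithmetic above can be expressed as a short sketch; the function name and sample values below are illustrative only, not part of the model.

```python
# Illustrative sketch: grid dimensions implied by a polar (cpack=3) grid,
# where the span entries hold 360.0 degrees and the total downwind
# distance (km), and the spacing entries hold the sector angle (deg)
# and sector distance (km).
def polar_grid_points(arc_angle_deg, sector_km, total_km):
    n_sectors = int(360.0 / arc_angle_deg)   # 360 / arc-angle
    n_rings = int(total_km / sector_km)      # total-distance / sector-distance
    return n_sectors, n_rings

# 10-degree sectors out to 100 km in 5 km increments
print(polar_grid_points(10.0, 5.0, 100.0))  # (36, 20)
```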
19(4)- Enter grid # 1 directory
Default: ( \main\sub\output\ )
Directory to which the binary concentration output file for this
grid is written. As with other directory entries, a
terminating slash (\)
is required.
20(5)- Enter grid # 1 file name
Default: file_name
Name of the concentration output file for each grid. See Section
6 for a description of the format of the
concentration output
file.
21(6)- Number of vertical concentration levels
Default: 1
The number of vertical levels in the concentration grid
including the ground surface level if deposition output is
required.
22(7)- Height of each level (m)
Default: 50
Output grid levels may be defined in any order for the puff
model as long as the deposition level (0) comes first (a
height of
zero indicates deposition output). Air concentrations must have a
non-zero height defined. A height for
the puff model indicates the
concentration at that level. A height for the particle model
indicates the average
concentration between that level and the
previous level (or the ground for the first level). Therefore
heights for
the particle model need to be defined in ascending
order. Note that the default is to treat the levels as
aboveground-level (AGL) unless the MSL (above Mean-Sea-Level) flag
has been set (see advanced configuration).
23(8)- Sampling start time: year month day hour minute
Default: [simulation start]
Each concentration grid may have a different starting, stopping,
and output averaging time. Zero entry will result
in setting the
default values. "Backward" calculations require that the stop time
come before the start time.
24(9)- Sampling stop time: year month day hour minute
Default: 12 31 24 60
After this time no more concentration records are written. Early termination on a high resolution grid (after the
plume has moved
away from the source) is an effective way of speeding up the
computation for high resolution
output near the source because, once turned off, that particular grid resolution is no longer used for
time-step
computations.
25(10)- Sampling interval: type hour minute
Default: 0 24 0
Each grid may have its own sampling or averaging interval. The
interval can be of three different types: averaging
(type=0),
snapshot (type=1), or maximum (type=2). Averaging will produce output
averaged over the specified
interval. For instance, you may want to
define a concentration grid that produces 24-hour average air concentrations.
The deposition will not be available in any output unless height "0" is
defined as one of the concentration grid levels.
may
be assigned as the default for each entry to define the pollutant
as a particle. If a dry deposition velocity is
specified as the
first entry in the next line (28), then that value is used as the
particle settling velocity rather than
the value computed from the
particle diameter and density.
If gravitational settling is on and the Shape is set to a
negative value then the Ganser (1993) calculation is used to
replace Stokes equation for estimating particle fallspeeds.
The absolute value of the Shape factor is used for the
calculation.
The Stokes equation overestimates particle fallspeeds for particles larger than about 20 micron
diameter. As this diameter often lies within size distributions of volcanic ash particles, it is desirable to use the
Ganser formulation so that particle fallspeeds can be computed accurately.
Ganser, G.H., 1993: A rational
approach to drag prediction of spherical and nonspherical particles. Powder Technology, 77, 143-152.
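The magnitude of the Stokes estimate can be sketched as follows; this is a minimal illustration, not HYSPLIT's internal routine, and the air density and viscosity values are assumptions for the example.

```python
# Illustrative Stokes terminal fall speed for a small sphere; assumed
# air density (1.2 kg/m3) and dynamic viscosity (1.8e-5 Pa s).
def stokes_settling(d_m, rho_p, rho_a=1.2, mu=1.8e-5, g=9.81):
    return (rho_p - rho_a) * g * d_m ** 2 / (18.0 * mu)

# hypothetical 20-micron ash particle, density 2500 kg/m3:
# Stokes gives roughly 0.03 m/s, and because v ~ d**2 the
# overestimate grows rapidly for larger diameters.
v20 = stokes_settling(20.0e-6, 2500.0)
```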
The particle definitions can be used in conjunction with a special
namelist parameter NBPTYP that determines if
the
model will just release the above defined particles or create a
continuous particle distribution using the
particle definitions as
fixed points within the distribution. This option is only valid if
the model computes the
gravitational settling velocity rather than
pre-defining a velocity for each particle size.
28(2)- Deposition velocity (m/s), Pollutant molecular weight
(Gram/Mole), Surface Reactivity Ratio, Diffusivity Ratio,
29(3)- Wet Removal: Actual Henry's constant, In-cloud (GT 1 =L/L; LT 1 =1/s),
Below-cloud (1/s)
Default: 0.0 0.0 0.0
Suggested: 0.0 8.0E-05 8.0E-05
Henry's constant defines the wet removal process for soluble
gases. It is defined only as a first-order process by a
non-zero
value in the field. Wet removal of particles is defined by non-zero
values for the in-cloud and below-cloud parameters. In-cloud removal
can be defined as a ratio of the pollutant in rain (g/liter) measured
at the
ground to that in air (g/liter of air in the cloud layer) when the value in the field is greater than one. For within-cloud
values less than one, the removal is defined as a time constant.
Below-cloud removal is always defined
through a removal time
constant. The default cloud bottom and top RH values can be changed
through the
SETUP.CFG namelist file. Wet
removal only occurs in grid cells with both a non-zero precipitation
value and a
defined cloud layer.
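The first-order removal described above can be sketched in Python; the function is illustrative only, applying the suggested below-cloud rate over an assumed one-hour time step.

```python
import math

# Illustrative first-order removal: fraction of mass scavenged over a
# time step for a removal rate constant in 1/s (e.g. the suggested
# below-cloud value of 8.0E-05).
def removed_fraction(rate_per_s, dt_s):
    return 1.0 - math.exp(-rate_per_s * dt_s)

# suggested below-cloud rate over an assumed one-hour step
f = removed_fraction(8.0e-5, 3600.0)   # about 25% of the mass
```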
30(4)- Radioactive decay half-life (days)
Default: 0.0
A non-zero value in this field initiates the decay process of
both airborne and deposited pollutants. The particle
mass decays
as well as the deposition that has been accumulated on the internal
sampling grid. The deposition
array (but not air concentration) is decayed until the values are written to the output file. Therefore,
the decay is
applied only at the end of each output interval. Once
the values are written to the output file, the values are
fixed.
The default is to decay deposited material. This can be turned off
so that decay only occurs to the particle
mass while airborne by
setting the decay namelist variable
to zero.
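The half-life decay can be sketched as a simple factor; this is an illustrative helper, not the model's internal code, and the 8-day half-life is a nominal value chosen for the example.

```python
# Illustrative decay factor: fraction of mass remaining after
# elapsed_days for a pollutant with the given half-life (days).
def decay_factor(half_life_days, elapsed_days):
    return 0.5 ** (elapsed_days / half_life_days)

# nominal 8-day half-life (on the order of I-131) after one day
remaining = decay_factor(8.0, 1.0)   # about 0.917 of the mass remains
```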
31(5)- Pollutant Resuspension (1/m)
Default: 0.0
Suggested: 1.0E-06
A non-zero value for the re-suspension factor causes deposited
pollutants to be re-emitted based upon soil
conditions, wind
velocity, and particle type. Pollutant re-suspension requires the
definition of a deposition grid,
as the pollutant is re-emitted
from previously deposited material. Under most circumstances, the
deposition
should be accumulated on the grid for the entire
duration of the simulation. Note that the air concentration and
the message shown below will appear. If the intent was to run using
this file, then continue, otherwise one can delete
the file and run,
or terminate the simulation. The situation may arise that the
namelist file was created for a previous
simulation and is not needed
this time.
Normally only one input file is shown, unless multiple files have
been defined in the Concentration Setup Run
menu.
The default output file name is shown and unless the box is
checked all frames (time periods and/or levels) will be
output to
that one file. The program uses the map background file, arlmap, which by default is located in the \graphics
directory. Other
customized map background files could be defined. Some of these higher
resolution map background
files are available from the HYSPLIT download
web page. This plotting program also supports the use of ESRI
formatted
shapefiles.
The GIS output option will create an output file of the contour
polygons (latitude-longitude vectors) in two different
formats: the
ESRI generate format for import into ArcMap or ArcExplorer, or XML formatted files packaged by InfoZip for import into Google Earth.
For multiple pollutant files, only one pollutant may be selected for display by individual levels, or averaged between
selected levels. These levels must have been predefined in the Concentration Setup menu. Multipliers can
be defined
for deposition or air concentrations. Normally these
values would default to 1.0, unless it is desired to change the
output
units (for instance, g/m3 to ug/m3
would require a multiplier of 10^6).
Contours and color fill can be specified as black and white or
color. The none option eliminates the black line defining
contours
and only leaves the color fill. This option is incompatible with
GIS output options, which require computation
of the contour vector.
Contours can be determined DYNamically by the program, changing with each map, or FIXed to
be the same for all maps.
A user can set the contour scaling (difference between contours) to
be computed on an
EXPonential scale or a LINear scale.
Concplot Command Line Options
The Postscript conversion program (concplot), found in the
/exec directory with all other executables, reads the binary
concentration output file, calculates the optimum map for display,
and creates the output file concplot.ps. Multiple
pollutant species
or levels can be accommodated. Most routine variations can be invoked
from the GUI. More
complicated conversions should be run from the
command line using the following optional parameters:
concplot -[options (default)]
-a[Arcview GIS: (0)-no dump, 1-ESRI (log10), 2-ESRI (decimal), 3-Google Earth]
Selecting the ESRI Generate Format output creates an ASCII text file for each
output time period that consists of
the value and latitude-longitude points along each contour. This file can be converted to an ESRI Shapefile or
converted for other GIS applications through the utility menu "GIS to Shapefile". The view checkbox would be
disabled to do just the GIS conversion without opening the Postscript viewer. Selecting Google Earth will create
a compressed file (*.kmz) for use in Google Earth, a free software package to display geo-referenced
information in 3-dimensions. You must have the free Info-Zip file compression software installed to compress
the Google Earth file and associated graphics.
+a[KML altitude mode: 0-clampedToGround, 1-relativeToGround]:
Selecting clampedToGround (0) positions the contoured concentrations
flat on the Google Earth terrain, whereas
selecting relativeToGround will generate 3D contours by extending the edges of the contours to the ground
from
the valid height of the concentration data.
-b[Bottom display level: (0) m]
= 0 - Represents the height (meters) below which no data will be
processed for display. The level information is
interpreted
according to the display (-d) definition.
-c[Contours: (0)]
= 0 - Dynamic contour values (10x intervals) are optimized for each map.
= 1 - Fixed contour values (10x intervals) are the same for all maps.
= 2 - Dynamic contour values (constant interval) are optimized for each map.
= 3 - Fixed contour values (constant interval) are the same for all maps.
=50 - Force contour interval x10 and set dynamically for each frame.
=51 - Force contour interval x10 and set as fixed for all frames.
vertically averaged for all levels between the bottom and top
heights.
= 4 - A custom output format in which all the air
concentrations have been converted to mass loading.
-f[Frames: (0)-all frames one file, 1-one frame per file]
= 3 - Black and white color fill but without black contour lines.
-l[Label options: ascii code, (73)-open star]
The default plot symbol over the source location is an open star.
This may be changed to any value defined in the
psplot
ZapfDingbats library. For instance, a blank (no source symbol)
would be defined as -l32.
-L[LatLonLabels: 0=none, (1)=auto, 2=set:{value tenths}]
= 0 - No latitude or longitude lines are drawn on the map
= # - Sets the number of time periods to be processed starting with the first.
= [-#] - Sets the increment between time periods to be processed. For instance,
-n-6 would only process every 6th
time period.
-o[Output file name: (concplot.ps)]
The name of the Postscript output file defaults to concplot.ps unless it is set to a {user defined} value.
-p[Process file name suffix: (ps) or process ID]
The suffix defines the character string that replaces the default
"ps" in the output file name. A different suffix does
not change
the nature of the file. It remains Postscript. The suffix is used
in multi-user environments to maintain
multiple independent output
streams.
-q[Quick DATEM plot file: ( )-none, filename]
By defining the name of a text file in this field where the data
values are defined in the DATEM format, the
values
given in the file will be plotted on each graphic if the starting time of the
sample value falls within the
averaging period of the graphic.
-r[Removal: 0-none, (1)-each time, 2-sum, 3-total]
= 0 - No deposition plots are produced even if the model produced deposition output.
= 2 - The deposition is summed such that each new time period represents the total accumulation.
the end.
-s[Species: 0-Sum (1)-Single Pollutant {N}-Species Number]
= {Species Number} - Only one pollutant species may be displayed
per plot sequence if multiple species were
created during a
simulation. However, an entry of "0" will cause all species
concentrations to be summed for
display.
-t[Top display level: (99999) m]
= 99999 - Represents the height (m) above which no data will be
processed for display. The level is interpreted
according to the
display definition.
-u[Units label for mass: (mass), also see "labels.cfg" file]
Defines the character string for the units label. Can also be
modified using the labels.cfg file.
-v[Values (:labels:colors are optional) for up to 10 fixed contours: val1+val2+...val10]
If the contour values are user set (-c4), then it is also possible
to define up to ten individual contours explicitly
through this
option. For instance, -v4+3+2+1 would define the contours 4, 3, 2,
and 1. Optionally, a label and/or
color (RGB) can be defined for each contour (e.g. -v4:LBL4:000000255+3:LBL3:000255255+2:LBL2:051051255+1:LBL1:255051255). To specify a color but not a
label, two colons must be present (e.g. -v4::000000255).
-w[Grid point scan for contour smoothing (0)-none 1,2,3, ... grid points]
Defines if the gridded concentration data are to be smoothed prior to contouring.
For instance, a value of 1 means
that each grid point value is replaced by the
average value of 9 grid points (center point plus 8 surrounding).
-x[Concentration multiplier: (1.0)]
= X - where {X} is the multiplier applied to the air concentration input data before graphics processing.
-y[Deposition multiplier: (1.0)]
= X - where {X} is the multiplier applied to the deposition input data before processing.
-z[Zoom factor: 0-least zoom, (50), 100-most zoom]
= 50 - Standard resolution.
= 100 - High resolution map (less white space around the concentration pattern)
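The -w grid-point smoothing described above can be sketched in Python; the boundary handling here (shrinking the window at the grid edge) is an assumption, not necessarily what concplot does.

```python
# Illustrative -w smoothing: replace each value by the average over a
# (2w+1)x(2w+1) window; w=1 averages the centre plus its 8 neighbours.
def smooth(grid, w=1):
    ny, nx = len(grid), len(grid[0])
    out = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            cells = [grid[jj][ii]
                     for jj in range(max(0, j - w), min(ny, j + w + 1))
                     for ii in range(max(0, i - w), min(nx, i + w + 1))]
            out[j][i] = sum(cells) / len(cells)   # window mean
    return out

field = [[0.0, 0.0, 0.0],
         [0.0, 9.0, 0.0],
         [0.0, 0.0, 0.0]]
print(smooth(field)[1][1])   # centre becomes the 9-point mean, 1.0
```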
Additional supplemental text may be added at the bottom of the
graphic by creating a file called MAPTEXT.CFG, which
should be located in the working directory. This is a generic file used by
all plotting programs, but each program will
use different lines in
its display. The file can be created and edited through the Advanced / Panel Labels menu tab.
Units and title information can be edited by creating a LABELS.CFG file which can
also be edited manually or through
the Advanced / Border Labels menu tab.
ESRI Shapefile Map Background Files
Another mapping option would be to specify a special pointer file (originally
called shapefiles.txt, but now a suffix
other than "txt" is permitted) to replace the map background file arlmap in
the -j command line option (see above). Note
-jshapefiles... rather than
-j./shapefiles... is required. This file would contain the name
of one or more shapefiles that can
-x[longitude offset (0), e.g., o--90: U.S. center; o-90: China center]
-u[mass units(ie, kg, mg, etc); note these are labels only]
When called from the Global Grid menu tab, an additional feature is
available through the Add Plume checkbox. In this
| 4=replace]
selection of the concentration input file name, which must have been
previously defined in the Concentration Setup
menu.
The "Plane" option results in the conventional graphic shown below of the
horizontal plane view. There is one black dot
for each particle. The size of the dot varies according to the pollutant mass assigned to the particle. All
examples are 12
plume. Again all particles are shown regardless of their distance from the
regression line. The bottom panel view is
from left to right in increasing
longitude, regardless of the orientation of the regression line.
The global display is a special program that will always map the particles
on a global equidistant projection. The other
programs try to automatically
scale the plots according to the particle distribution, sometimes resulting
in distorted
plots when the particle coverage becomes global. There are additional command line features in this program (parsplot)
for particle size and color mapping according to the particle mass that are not
available through the GUI.
-f[frames: (0)-all frames one file, 1-one frame per file] : multiple
frames can only be due to multiple levels or
species in the input file
-i[input file name (cdump)] : the input file needs to contain multiple time periods
-j[graphics map background file (arlmap)] : the shape file option is not available
-o[output name (toa)] : the output file represents all time periods
Step 1: defines the DATEM formatted measured data file and unit conversion factor required to convert the HYSPLIT
concentration predictions
to the same units as given in the measured data file.
Step 2: defines name of the statistical output file that will contain
the statistical results from each source location
compared with the measured
data values. Normally the statmain program outputs one results file per
execution. In this
configuration, the results from each source location are appended to a single file, one record per source.
Step 3: defines the root name of the final output graphics file which will
show a plot of the statistical results by source
location. The .ps suffix is
created by default.
Step 4: runs sequentially through all steps of the process, reading each source
location from the CONTROL file, running
matrix to extract that location from the HYSPLIT simulation output file, using c2datem to convert that binary extract to
the
DATEM format, and then running statmain to append the results to the statistical
output file. This step needs only to
be run once.
Step 5: defines which statistic will be shown: Corr - correlation coefficient, NMSE - normalized mean square error, FB
- fractional bias, FMS - figure of merit in space, KSP - Kolmogorov-Smirnov parameter, and finally the rank, which is
a combination of the previous parameters. Once all source locations have been processed, the statistical file is read and
converted to binary gridded format using stat2grid. This output file can then be plotted with any of the display
programs such as gridplot.
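Two of the statistics above can be sketched in Python; these are common textbook forms, and the exact definitions and sign conventions used by statmain may differ, so treat this as illustrative only.

```python
# Illustrative versions of two statmain statistics.
def nmse(measured, calculated):
    n = len(measured)
    mm = sum(measured) / n
    mc = sum(calculated) / n
    # normalized mean square error: mean((m-c)^2) / (mean(m)*mean(c))
    return sum((m - c) ** 2 for m, c in zip(measured, calculated)) / (n * mm * mc)

def fb(measured, calculated):
    mm = sum(measured) / len(measured)
    mc = sum(calculated) / len(calculated)
    # fractional bias: 2*(mean(c) - mean(m)) / (mean(c) + mean(m))
    return 2.0 * (mc - mm) / (mc + mm)

perfect = nmse([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # 0.0 for a perfect match
```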
Default file names: The following describes the assigned default input and output
file names for each processing
section, so that they can be identified in the working directory.
File names in italics are the ones that can be changed
through the menu.
matrix
cdump - binary output file from a single multi-source simulation using the ICHEM=1 option
SRM_cdump - binary concentration output file extracted for one source location
c2datem
SRM_cdump - binary concentration input file from the previous step
measured.txt - measured data file in DATEM format
hysplit_datem.txt - model prediction output for a single source converted to DATEM format
statmain
measured.txt - measured data input file in DATEM format
hysplit_datem.txt - model prediction input file for a single source from previous step
sumstats.txt - statistical summary output appended one record per source location
stat2grid
sumstats.txt - statistical summary input file from previous step
statmap.bin - binary statistical output file gridded in the HYSPLIT
concentration format
numeric three-digit sequence. Menu settings for the output file name should
reflect the base name without the numeric
suffix. An illustration of the Ensemble Display menu is shown below.
not the
actual time span. For instance, if the model output is set to produce a one-hour average and the model is run for
24 hours, then each ensemble member
will have 24 time periods of output. By default the probability output will
represent the ensemble member variation for each hour, that is 24 frames of
output will be produced. However if the
aggregation period is set to 24, then
all output times will be combined into one output frame and the ensemble result
will represent the hourly variations as well as the member variations.
The menu permits a choice of six different ensemble output display
options:
1. The number of ensemble members at each grid point with concentrations
greater than zero shows the spatial
distribution of the Number of Ensemble
members.
2. The mean concentration of all ensemble members.
3. The variance of all ensemble members (the mean square difference
between individual members and the mean).
4. The coefficient of variation of all ensemble members (sqrt(variance)*100/mean).
5. The probability of concentration produces contours that give the
probability of exceeding a fixed concentration
value at one of three levels:
1% of the maximum concentration, 10% of the maximum, and the maximum
concentration.
The maximum is determined to be the first power of 10 that is
less than the
actual maximum value, or the value of the maximum set in the data
entry box (exponent of 10). The concentration
level for the probability display
is shown on the graphic with the pollutant identification field set to something
uses the existing files in the working directory. The output files may also be plotted
individually from the command
line using concplot. The conprob output
files contain the following information:
Cnumb - Number of members
Cmean - Concentration mean of all members
Cvarn - Concentration variance of all members
Ccoev - Concentration coefficient of variation of all members
Cmax05 - Probability at 5% of the concentration maximum
Cmax10 - Probability at 10% of the concentration maximum
Cmax00 - Probability at the maximum concentration level
Prob10 - Concentrations at the 10th percentile
Prob25 - Concentrations at the 25th percentile
Prob50 - Concentrations at the 50th percentile
Prob75 - Concentrations at the 75th percentile
Prob90 - Concentrations at the 90th percentile
Prob95 - Concentrations at the 95th percentile
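The first few quantities in this list can be sketched for a single grid point; whether conprob uses the population or sample form of the variance is not stated here, so the population form below is an assumption.

```python
import math

# Illustrative per-grid-point ensemble statistics (cf. the Cmean,
# Cvarn, and Ccoev output files above).
def ensemble_stats(members):
    n = len(members)
    mean = sum(members) / n
    var = sum((x - mean) ** 2 for x in members) / n   # population variance
    coev = math.sqrt(var) * 100.0 / mean if mean else 0.0
    return mean, var, coev

# hypothetical concentrations from four ensemble members
mean, var, coev = ensemble_stats([1.0, 2.0, 3.0, 4.0])
```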
The command line options for the probability program conprob [-options] are as follows:
-d[diagnostics = true]
probability files have been created, the concentration distribution at a single location can also be viewed using
this
menu by entering a latitude-longitude location. The nearest grid point
will be selected and the concentration values are
retrieved from the prob{xx} files, which represent the various points on the box plot. In the
illustration shown below,
the limits of the box represent the quartiles (25th and 75th percentiles), the whisker extensions show the 10th and 90th
percentiles, and the circles show the 5th and 95th percentiles. The median
value is shown by the line through the box,
while the mean is shown by the
plus symbol. Up to 12 time periods can be shown in one plot. The graphic is
always
named boxplots.ps.
Step 1: defines the DATEM formatted measured data file and unit conversion factor required to convert the HYSPLIT
concentration predictions
to the same units as given in the measured data file.
Step 2: defines name of the statistical output file that will contain
the statistical results for each ensemble member
compared with the measured
data values. Normally the statmain program outputs one results file per
execution. In this
configuration, a subset of the complete results for each
member are appended to a single file. The complete
statistics are available in the file
named stat_000.txt.
Step 3: runs sequentially through all steps, processing each ensemble member
using c2datem to convert that binary
concentration data to the DATEM format, and then running statmain to append the results to the statistical output file.
Default file names: The following describes the assigned default input and output
file names for each processing
section, so that they can be identified in the working directory.
File names in italics are the ones that can be changed
through the menu.
c2datem
{base name}.{xxx} - binary concentration input file, where xxx = 000 to 999
measured.txt - measured data file in DATEM format
hysplit_datem.txt - model prediction output for a member converted to DATEM format
statmain
measured.txt - measured data input file in DATEM format
hysplit_datem.txt - model prediction input file for a single source from previous step
sumstats.txt - statistical summary output appended one record per source location
If the Create Filenames button is selected, a special file called INFILE is created that contains all the file names in the
working directory
that match the wildcard sequence as defined in the Input Name box. These
files are then merged
sequentially according to the process rules. Note that the
concentration multiplier is applied during the writing of the
output file;
during sequential file processing this conversion is disabled.
The above example shows how two plume simulations have been merged. Each was run
for a different day and starting
location; however, the graphic and its corresponding
binary output file use only the information provided in the base
file, and the
labeling information from the input file has been lost.
In the above example, the input and base files are identical to the previous
illustration, but in this example the mask
checkbox option was selected
and the result clearly illustrates the region of the base plume that was intersected by
the
input plume.
The result from the intersection option is shown in the above figure. In this result
only the region where the two plumes
intersect and both exceed the zero mask value
is shown. The region is identical to the missing plume area in the mask
option.
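The intersection logic described above can be sketched for two gridded plumes; this is illustrative only (the GUI utility operates on the binary concentration files themselves, and the values here are hypothetical).

```python
# Illustrative intersection mask: keep a base-grid cell only where
# both plumes exceed the zero mask value.
def intersect(base, other):
    return [[b if (b > 0.0 and o > 0.0) else 0.0
             for b, o in zip(brow, orow)]
            for brow, orow in zip(base, other)]

base  = [[1.0, 2.0], [0.0, 3.0]]
other = [[0.0, 5.0], [4.0, 6.0]]
print(intersect(base, other))   # [[0.0, 2.0], [0.0, 3.0]]
```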
Calls the utility program concrop to extract a sub-grid from a HYSPLIT binary
concentration data file. The new
subgrid is written to the file conxtrct.bin.
The utility program can be used to remove the whitespace by resizing the grid
about
the non-zero concentration domain or by specifying a sub-grid. Using the -x option
will force the extract of the
user's domain regardless of white space.
USAGE: concrop -[options (default)]
-d[process id (txt)]
-g[Output plume grid limits rather than cropped file: (0)=no; 1=center-radius -1=from{-b}]
Table of Contents
corresponds to hours after the start of the simulation. The output concentration
grid will appear like any other, but with 24
pollutants, one for each release
period as defined by the QCYCLE parameter. Make sure that a sufficient
number of particles
have been released to provide consistent results for all time
periods.
Once the simulation has completed, the unit source simulation can be converted
to concentrations by applying an emission rate
factor for each release time. This
is accomplished by clicking on the Concentration / Utilities / Binary File / Apply
Source
menu tab. This menu is only intended to work with a single multi-pollutant
concentration file created in the manner described
previously.
The input required is simply the name of the Transfer Coefficient Matrix (TCM)
input file (the previously created multi-pollutant, multi-time-period input file),
the base name of the single pollutant output file which will contain the
concentration
sum of all the release periods after they are multiplied by the
source term, the 4-character pollutant identification field, and the
name of the
time-varying source term file if it exists; otherwise a new file can be created.
An emission rate needs to be defined
for each release time, otherwise it is assumed to be zero.
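The multiplication that this step performs can be sketched as a simple matrix-vector product; the dilution factors and emission rates below are hypothetical values chosen only to show the arithmetic.

```python
# Illustrative TCM sum: each receptor concentration is the sum over
# release periods of the unit-source dilution factor times the
# emission rate for that period.
def apply_source(tcm_rows, rates):
    return [sum(d * q for d, q in zip(row, rates)) for row in tcm_rows]

# two receptors, three release periods (hypothetical dilution factors)
tcm = [[1.0e-12, 2.0e-12, 0.0],
       [0.0,     1.0e-12, 3.0e-12]]
rates = [1.0e12, 1.0e12, 1.0e12]   # units per hour for each period
totals = apply_source(tcm, rates)  # roughly [3.0, 4.0]
```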
The emissions file format is simply the starting time of the emission given
by year, month, day, and hour, and the emission rate
in terms of units per hour.
The first record of the emissions file should contain the column labels,
followed by the data
records, one for each release time period. For example, a constant emission rate corresponding to the 12-hour example test
simulation
(16 October 1995) would appear as follows:
YYYY MM DD HH TEST
1996 10 16 00 1.00E+12
1996 10 16 01 1.00E+12
1996 10 16 02 1.00E+12
1996 10 16 03 1.00E+12
1996 10 16 04 1.00E+12
1996 10 16 05 1.00E+12
1996 10 16 06 1.00E+12
1996 10 16 07 1.00E+12
1996 10 16 08 1.00E+12
1996 10 16 09 1.00E+12
1996 10 16 10 1.00E+12
1996 10 16 11 1.00E+12
Command Line Options - tcmsum
The underlying binary file conversion program can be run from the command line. As with all
other applications, the command
line argument syntax is that there should be no space
between the switch and options. The command line arguments can appear
in any order:
tcmsum -[option {value}]
The procedure described in this section is very similar to the transfer coefficient solution approach with the exception that
here the dispersion coefficients for the individual release times are contained in a single file while in the other approach an
input file is required for each release time simulation. The single file created here can be processed through the utility
program
conlight to extract one time period to its own file with each pass
through the program. Currently no GUI script application
exists to perform this task
automatically.
Setting this flag turns off the writing of the first output record,
which is the column label field: DAY HOUR LAT
LON SPECIES-LEVEL.
This option corresponds to the Minimum Text checkbox of
the GUI menu.
-o[Output file name base (Input File Name)]
The default base name for the output file is the input file name. A new output file is created for each sampling
period, where the name
of the file is composed of the {base name}_{Julian day}_{hour}
of the sample ending
time. If the Include Minutes box is checked,
then the minutes field of the sample ending time is added to the end of the file name. This field is not available through the GUI. The name
of each output file is always written to the file
named CON2ASC.OUT
regardless of any other options selected.
-s[Single output file flag]
This option corresponds to the Single File checkbox of the GUI
menu. The multi-time period ASCII output file is
named {input file}.txt.
-t[Time expanded (minutes) file name flag]
complete descriptions of each of the programs used to convert the data and generate the statistical results. The on-line repository has links to several different tracer experiments and their associated meteorological data. All experimental data have been converted to a common format. The meteorological data are compatible for direct use by HYSPLIT.
To facilitate application of the analysis software, all the experimental data from the second CAPTEX release are provided in the ./hysplit/datem directory. Included are the control, namelist, measured data, and meteorological data files. The meteorological data are taken from the North American Regional Reanalysis.
The computational sequence is as follows:
options are available. For verification statistics, all values can be compared, only plume values (pairs where both measured and calculated values are greater than zero), or all values excluding pairs where both measured and calculated values are zero. In terms of averaging, the measured and calculated values can be used directly, temporally averaged resulting in one value for each location, or spatially averaged resulting in one value for each time period. The contingency level is the concentration value that must be exceeded to compute the false alarm rate, probability of detection, and threat score. The contingency level is also used as the plotting threshold for the measured-calculated scatter diagram created by the Scatter Plot menu button.
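The three contingency statistics can be illustrated with a short sketch using their standard definitions (hits, misses, and false alarms relative to the contingency level); the exact formulas used by the statistics program should be confirmed against its own documentation:

```python
def contingency_stats(measured, calculated, level):
    # Count hits (both exceed the level), misses (only measured exceeds),
    # and false alarms (only calculated exceeds).
    hits = misses = false_alarms = 0
    for m, c in zip(measured, calculated):
        if m > level and c > level:
            hits += 1
        elif m > level:
            misses += 1
        elif c > level:
            false_alarms += 1
    pod = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    ts = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else 0.0
    return pod, far, ts
```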
Note that simulations with multiple species and levels are not supported. The concentration data files follow no specific
naming convention and any input or output file name may be selected in the menu. Arbitrary input file names can only
be set
through the Setup / Grids menu. The statistical output files are always written
to stat{A|T|S}.txt where the
character A|T|S represents All data, Temporal
averaging, or Spatial averaging. An additional output file, data{A|T|S}.txt
is
created for the scatter diagram plot, which is always named scatter.ps.
S345.htm[2/9/2016 2:47:53 PM]
Converts a HYSPLIT binary concentration and deposition file to dose in rem/hr. The HYSPLIT calculation should be
done using a unit emission so that the concentration
units are m^-3 and the deposition units are m^-2. This postprocessing step reads the
file activity.txt which contains the activity (Bq) at time=0 for all the isotope
products. Sample
activity.txt files can be created by the program for a high-energy
nuclear detonation or a thermal fission reaction from a
power plant reactor.
The file also contains the half-life and cloud- and ground-shine dose conversion
factors for each
radionuclide specified in the file.
For nuclear detonations, the activity.txt file contains columns for either a high-energy or thermal fission reaction
assuming a yield of 1 kT. The emission factor can be specified as a multiple of 1 kT. Activation products resulting from
the fission are not considered. Two fuel sources (U235 and Pu239) are available for each reaction type. The fission
yield data were obtained from T.R. England
and B.F. Rider, Los Alamos National Laboratory, LA-UR-94-3106;
ENDF-349 (1993). The
external dose rate for cloud- and ground-shine is computed from the factors given by
Eckerman
K.F. and Leggett R.W. (1996) DCFPAK: Dose coefficient data file package for
Sandia National Laboratory, Oak Ridge
National Laboratory Report ORNL/TM-13347.
During the processing step, the cumulative product of the activity and dose factor
is computed for each decay weighted
concentration averaging period, independently for noble gases and particles. Command line options exist to turn off the
dose
calculation but still multiply the dispersion factors by the activity, resulting
in air concentration output. If the
dispersion factors input file contains only
one pollutant type and level, then an option can be set to output each of the
species
defined in the activity.txt file. Concentration units may also be converted from
Bq to pCi.
For nuclear power plant accidents, the same fission product table can be used by
entering the duration of the reactor
operation in terms of mega-watt-hours based upon the relation that 3000 MW-hours releases an energy equivalent to
2.58 kT. An
alternative approach is to generate an activity.txt file that corresponds with the
radionuclide release profile
of the reactor accident. The command line option
-afdnpp will create a sample activity.txt file that corresponds to the
maximum
emissions over a 3-hour period during the Fukushima Daiichi Nuclear Power Plant
accident for the 10 most
critical radionuclides for short-term dose calculations.
The breathing rate is used as a multiplier for the calculation of the inhalation dose factor. This is only used for the
cloudshine dose calculations.
-c[Output type: (0)-dose, 1-air conc/dep]
If the flag is set to one, the dose conversion factors are set to one and the output
will be concentration, the product
of the input air concentration (from a unit emission)
times the activity for each radionuclide defined in
activity.txt. The values are summed for all species in each pollutant class (NGAS or RNUC).
-d[Type of dose: (0)=rate (R/hr) 1=summed over the averaging period (R)]
When computing dose, the option is to output a rate or the total dose accumulated over
each concentration
averaging period. Note that dose from deposition is always summed
over the duration of the simulation.
-e[include the inhalation dose in the calculation (0)=No 1=Yes]
This is used in conjunction with the -b option. The default is not to include the
inhalation dose as part of the
cloudshine calculation. If this is turned on, ensure that
the inhalation dose factors are correctly defined in the last
column of activity.txt.
-f[fuel (1)=U235 2=Pu239]
This flag selects which column of the activity.txt file will be used to determine the
emission amount. The first two
columns are for U-235 fission and the last two columns are
for Pu-239 fission. There are columns for high-energy or thermal fission (see -p flag).
Also see the -y and -w flags to scale the emissions to the event.
-g[decay type: 0=normal {c=1}, (1)=time averaged decay]
Prior to computing the dose, the activity is time-decayed to either the end of the
sampling time (0=normal) or as
the time averaged activity between the start and end time
of each sampling period (1=time-averaged, the default).
Normal decay can only be applied
to air concentration calculations. Dose output always requires time-averaged
decay.
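The two decay treatments can be sketched analytically. A minimal illustration (function names are hypothetical), where lambda = ln(2)/half-life and the time-averaged form is the analytic mean of the exponential over the sampling interval:

```python
import math

def decay_endpoint(a0, half_life, t_end):
    # Normal decay (-g0): activity decayed to the end of the sampling time.
    lam = math.log(2.0) / half_life
    return a0 * math.exp(-lam * t_end)

def decay_time_averaged(a0, half_life, t1, t2):
    # Time-averaged decay (-g1, default): mean activity between the
    # start and end of each sampling period.
    lam = math.log(2.0) / half_life
    return a0 * (math.exp(-lam * t1) - math.exp(-lam * t2)) / (lam * (t2 - t1))
```

For a nuclide with a 1-hour half-life sampled over its first hour, the endpoint value is half the initial activity while the time-averaged value is about 0.72 of it.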
-h[help with extended comments]
Provides a short narrative of the functionality of this program with a list of all the
command line options.
-i[input file name (cdump)]
The name of the HYSPLIT air concentration/deposition binary output file. The simulation
can contain both air
concentrations and deposition fields. Proper conversion of the model output
fields to dose requires a unit release
rate in the HYSPLIT calculation.
-n[noble gas 4-char ID (NGAS)]
The program searches for this pollutant 4-character identification to use as the dispersion
factor for noble gases.
All other ID values (RNUC, etc) are assumed to be for particles. All
species will be summed to either the NGAS
or the other dispersion factor unless species matching
is defined (-s1).
The default dose output is in REM because the conversion factors are defined in the activity.txt
file in REM/Bq.
Setting this value to one converts the dose output from REM to Sieverts
(100 REM = 1 Sievert).
-s[(0)=sum species, 1=match to input, 2=output species]
The default value (0) sums each species in the activity.txt file to its corresponding
pollutant class (NGAS noble gas, or RNUC - radionuclide particle). The match to input option (1)
just does a one-to-one match of the
pollutant ID in the activity.txt file with the pollutant IDs
used in the simulation. Pollutant IDs can be a 4-digit integer or a character string for option (1). In
option (2), the output file can be expanded to include a
concentration for each species defined in
the activity.txt file using the dispersion factors from its pollutant class
from the simulation.
In this configuration, HYSPLIT can only be run with one level and species.
-t[0=no decay (input already decayed), or decay by species half-life (1)]
The default (1) is for all species to be decayed to the valid time of the concentration output
from the start time of
the simulation which is assumed to be the time fission ceased. The time of
release to the atmosphere can occur
much later. The no decay option is only required when decay
has already been applied in the dispersion
calculation. In general, decay should only be set in
the dispersion calculation for short-duration releases which
coincide with the termination of
fission, otherwise decay may be computed incorrectly. Computing decay within
this post-processing
step is the most accurate way to approach dose simulations.
-u[units conversion Bq->pCi, missing: assume input Bq]
When air concentration output is selected, then the output units can be converted from Bq to pCi.
-w[Fission activity in thermal mega-watt-hours, replaces the -y option value]
If the activity.txt file represents the number of fissions per kT, and the -w field is non-zero, then the activity is
computed with respect to the number of MWh generating the fission products. The assumption is that a 3000
MW reactor creates about 2.58 kT per hour of fission products.
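The stated equivalence can be expressed directly; a trivial sketch (the helper name is hypothetical) scaling reactor operation to the kT-equivalent inventory:

```python
def mwh_to_kt(mwh):
    # 3000 thermal MW-hours of operation is taken as equivalent to
    # 2.58 kT of fission products, scaled linearly.
    return mwh / 3000.0 * 2.58
```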
-x[extended decay time in hours beyond calculation (0)]
Additional decay time can be added using this field to estimate long-term doses. This calculation
applies to either
-t option but only for deposition doses. This option permits longer-term doses to be
computed by adding
additional decay to the existing output files.
-y[yield (1.0)=kT]
The default assumes that the activity levels defined in activity.txt represent the emissions
from a 1 kT device.
Other values are computed as a multiplier to the values defined in the table.
The multiplier can be used in the
context of a detonation in terms of kT or a multiple of the
3-hour emission maximum from the Fukushima
Daiichi Nuclear Power Plant accident.
-z[fixed decay time in hours (0)]
The fixed decay is computed to the end of the sample time in the output file regardless of the
actual simulation or
release start time. All output periods, regardless of sampling time, will be decayed for this number of hours.
An example of the activity.txt file for the Fukushima Daiichi Nuclear Power Plant accident, listing the top ten radionuclides important for short-term dose calculations, is shown below. The activity represents the maximum number of Bq released over a 3-hour period.
Mass Nucl T1/2 U235H U235T Pu239H Pu239T Cloudshine Groundshine Inhalation
Hr= 0.00 sec Bq Bq Bq Bq rem/h Bq/m3 rem/h Bq/m2 rem/Bq
95 Nb 3.02000E+06 1.62000E+13 1.62000E+13 1.62000E+13 1.62000E+13 1.26000E-08 2.62000E-10 2.38000E-06
110 Ag 2.18000E+07 6.48000E+12 6.48000E+12 6.48000E+12 6.48000E+12 4.57000E-08 9.29000E-10 2.38000E-06
132 Te 2.82000E+05 1.62000E+16 1.62000E+16 1.62000E+16 1.62000E+16 3.36000E-09 7.63000E-11 7.38000E-07
As an example, a HYSPLIT simulation could be configured with two species, NGAS for the noble gases and
RNUC for
the radio-nuclide particles. After the emissions stop, the total emissions of both NGAS and
RNUC should equal one.
Then in the post-processing step, con2rem is called and the concentration
and deposition fields are multiplied by the
total emission and dose conversion factor for each species and
added together to get a ground-shine and cloud-shine
dose file, which can then be plotted using any of the standard HYSPLIT display programs. Dose results, by species, are
decayed to the valid time of the
output graphic. An additional time factor can be added for long-term dose estimates.
One aspect of applying the decay in the post-processing step is that the decay times are referenced to the start of the
simulation rather than the start of the release. This means that the
radionuclide release rate is valid at the start of the
simulation time. This might be the best configuration for a nuclear power plant accident, where the fission is assumed
to have stopped
at the start of the release (and simulation). However in other situations, where emissions might
be
occurring in concert with some other process, then starting the decay at the time when the
radionuclides are released
might be a more appropriate configuration. If the decay is computed
during the model calculation for each species
released then use the con2rem options -s1 -t0 to apply just the emission factors and not decay in the post-processing
step.
Table of Contents
Similar to the other menus in the utility section, the input concentration
file must be defined in the Concentration Setup
menu.
The menu options correspond to the command line
options of the con2stn extraction program. There are two
options that
can be used to define an extraction location. A station location can be defined directly as an entry in the
menu, or a list of stations can be defined using an input file. If the input file does not exist, it can be
created by using
the New File button. In this example illustration,
a file has been defined with three locations that are within the plume
of the
example simulation. The file consists of three records, one for each station:
The extraction program is called con2stn and for these three stations,
produces the output file shown below, called by
default con2stn.txt. The
base name of the output file (con2stn) can be changed in the menu. In contrast to the simulation
shown in all the previous examples, in this
case the output averaging time was decreased from 12 hours to one hour,
to
generate a smoother looking graphic.
The output file shows the Julian day, month, day, and hour of the sample
start; day and hour of sample ending time, and
the concentrations for each
station (location selected by latitude-longitude). The format of each output
record is as
follows:
F8.3, 6I4 - Starting: Julian day, year, month, day, hour; Ending: day, hour
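A record in that layout can be split by fixed-width fields. A sketch follows; the width and count of the trailing concentration columns are an assumption, since they depend on the number of stations:

```python
def parse_con2stn_record(line):
    # F8.3 Julian day, then six I4 integers: start year, month, day,
    # hour, and ending day, hour; station concentrations follow.
    jday = float(line[0:8])
    ints = [int(line[8 + 4 * i: 12 + 4 * i]) for i in range(6)]
    concs = [float(v) for v in line[32:].split()]
    return jday, ints, concs
```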
There are only two plot options supported through the GUI: linear or logarithmic ordinate scaling. If integer ordinate
units are desired then it may
be necessary to specify a units conversion factor, in this case 10^15,
to create data in the text
file that can be plotted. With the log scaling option,
the conversion factor can be set to 1.0, and the ordinate scale will
cover the
appropriate order-of-magnitude range to plot all the data.
Command Line Options - con2stn
The program can be run from the command line or through interactive prompts
from the keyboard. The command line
argument syntax is that there should be no
space between the switch and options. The command line arguments can
appear
in any order: con2stn -[option {value}]
-a[rotation angle:latitude:longitude]
The default output format is to create multiple columns, one for each station,
for a selected pollutant and
level. Setting the value to one formats the output as one
record for each station, level, and pollutant. A
value of two sets the output to the
DATEM format.
-i[input text file name] contains data in the format output from con2stn
-i[input concentration file name] contains data in the same format as output from con2stn
-y[The default is linear ordinate scaling. The flag sets y-axis log scaling.]
par2conc
The dilution factors are defined as the transfer coefficient matrix. The sum of the column product S_i D_ij shows the total concentrations contributed by source i to all the receptors. The sum of the row product S_i D_ij for receptor j would show the total concentration contributed by all the sources to that receptor. In this situation it is assumed that S is known for all sources. The dilution factors of the coefficient matrix are normally computed explicitly from each source to all receptors, the traditional forward downwind dispersion calculation.
In the case where measurements are available at receptor R and source S is the unknown quantity, the linear relationship between sources and receptors can be expressed by the relationship:
D_ij S_i = R_j,
which has a solution if we can compute the inverse of the coefficient matrix:
S_i = (D_ij)^-1 R_j.
For example, in the case of an unrealistic 2x2 square matrix (the number of sources equals the number of receptors), the inverse of D is given by:
(D)^-1 = 1/(D_11 D_22 - D_12 D_21) | +D_22  -D_12 |
                                   | -D_21  +D_11 |
The solution for S_1 (first row) can be written:
S_1 = D_22 R_1/(D_11 D_22 - D_12 D_21) - D_12 R_2/(D_11 D_22 - D_12 D_21)
As a further simplification, assume that there is no transport between S_1 and R_2 (D_12 = 0), and then the result is the trivial solution that the emission value is just the measured concentration divided by the dilution factor:
S_1 = R_1/D_11
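The 2x2 case can be checked numerically. A sketch in pure Python following the inverse given above (the values used in the test are illustrative):

```python
def solve_2x2(d11, d12, d21, d22, r1, r2):
    # S = D^-1 R using the standard 2x2 matrix inverse.
    det = d11 * d22 - d12 * d21
    s1 = ( d22 * r1 - d12 * r2) / det
    s2 = (-d21 * r1 + d11 * r2) / det
    return s1, s2
```

With d12 = 0 the result collapses to the trivial solution s1 = r1/d11.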
The matrix solution has three possibilities. The most common one is that there are too many sources and too few receptors, which results in multiple solutions requiring singular value decomposition methods to obtain a solution. The opposite problem is that there are too many receptors and too few unknowns, hence an overdetermined system requiring regression methods to reduce the number of equations. Unfortunately, possibilities for a matrix solution
may
be limited at times due to various singularities, such as columns with no contribution to
any receptor or measured
values that have no contribution from any source. The solution to these problems is not always entirely numerical as the
underlying modeling or even the measurements can contain errors. Note that large dilution factors (very small predicted
concentrations) at locations with large measured values will lead to large emissions to enable
the model prediction to
match those measurements. The opposite problem also exists in that
negative emission values may be required to
balance high predictions with small measurements.
The solution to the coefficient matrix is driven by errors, either in
the measurements, the
dispersion model, or the meteorological data.
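For the overdetermined case (more receptors than unknowns), regression reduces the system. A minimal sketch for a single unknown source, minimizing the sum of squared residuals analytically; this is illustrative only, as the distributed tools use more general solvers:

```python
def fit_single_source(dilution, measured):
    # Least-squares emission for one source observed at many receptors:
    # minimize sum_j (d_j * s - r_j)^2  =>  s = sum(d_j r_j) / sum(d_j^2).
    num = sum(d * r for d, r in zip(dilution, measured))
    den = sum(d * d for d in dilution)
    return num / den
```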
Step 1: defines the binary input files which are the output files from the dispersion
simulations configured to produce
output at an interval that can be matched to the measured
sampling data. Ideally the model simulation emission rate
should be set to a unit value. Each
simulation should represent a different discrete emission period. For example, a four
day
event might be divided into 16 distinct 6-hour duration emission periods. Therefore the matrix
would consist of 16
sources (columns) and as many rows as there are sampling data. The
entry field in step 1 represents the wild card string
*entry* that will be matched to the files
in the working directory. The file names will be written into a file called
INFILE.
This file should be edited to remove any unwanted entries.
Step 2: defines the measured data input file which is an ASCII text file in the DATEM format. The first record
provides some general information, the second record identifies all the variables that then follow on all subsequent
records. There
is one data record for each sample collected. All times are given in UTC. This file defines the
receptor
data vector for the matrix solution. It may be necessary to edit this file to remove sampling data that are not required or
edit the simulation that produces the coefficient matrix
to ensure that each receptor is within the modeling domain.
Step 3: defines the file conversion details from sampling units to the
emission units. For instance, the default value of
10^12 converts the emission rate units pg/hr to g/hr if the sampling data are measured in pg/m^3 (pico = 10^-12). The exponent is +12 rather than -12 because it is applied in the denominator. The height and species fields are the index numbers to extract from the concentration file if multiple species and levels were defined in the simulation. The half-life (in days) is only required when dealing with radioactive pollutants and the measured data need to be decay corrected.
However, if the "Matrix" conversion module is checked in the Advanced Concentration Configuration menu tab,
then
the multiple source simulation maintains the identity of each
source in the concentration output file. The Display Matrix
menu tab permits extraction of information for
individual sources or receptors from this special concentration
output
file. The results of the same simulation are shown in the
illustration below. In this case the receptor check-box was set
and
the receptor location was identified as 45.0,-75.0 with normalization.
Therefore the graphic illustrates the fractional
contribution of each
region to the first 12 hour average concentration at the designated
receptor.
Ensemble-Meteorology
The ensemble form of the model, an independent executable, is similar to the trajectory version of the ensemble.
Ensemble-Turbulence
Another ensemble variation is the turbulence option, which
also creates 27 ensemble members, but due to variations in
turbulence
rather than variations due to gradients in the gridded meteorological
input data. The variance ensemble
downwind receptor location needs to be selected; then, using the box plot display option, the concentration variability (max-min range) can be estimated to determine when the decrease in range is no longer cost effective (in terms of computational time) with increasing particle number.
Ensemble-Physics
The physics ensemble is created by running a script that varies in turn the value of one namelist parameter from its default value. These
are the parameters defined in the file SETUP.CFG and normally set
in the Advanced
Concentration Configuration menu tab.
If the namelist parameters have not been defined through the menu, then the
default values are assigned. In this first iteration, the GUI menu permits
no deviation from the values assigned by the
script. The entry box is for
information purposes only to show the progress of the computation.
cdump.001 : initd = 0
cdump.002 : initd = 3
cdump.003 : initd = 4
cdump.004 : kpuff = 1
cdump.005 : kmixd = 1
cdump.006 : kmixd = 2
cdump.008 : kmix0 = 50
cdump.009 : kzmix = 1
cdump.012 : kdef = 1
cdump.013 : kbls = 2
cdump.014 : kblt = 1
cdump.015 : kblt = 3
Conversely, independence among ensemble members does not necessarily imply that the reduced group of runs will be
more accurate than the full ensemble because, by chance, the latter could also be overemphasizing redundant members that happen to be more accurate. Consequently, we can apply reduction techniques with the intent to produce more accurate results than those obtained with the full ensemble while requiring fewer computing resources. Solazzo
and Galmarini (2014) demonstrated that an ensemble can be reduced by optimizing the skills of the mean taken among
all the possible subsets of ensemble members. Following this methodology we calculate the average of all the possible
model combinations composed of an increasing number of sub-ensemble members up to the total number of members of the full ensemble and estimate their MSE. Furthermore, if M is the total number of ensemble members and n is the number of sub-ensemble members, then the number of possible combinations is given by M!/(n!(M-n)!). In other
words, if our ensemble has, say, 24 members we will combine
them in 276 pairs, 2024 trios, 10626 quartets, etc. and
determine which combination provides the minimum MSE.
The reduction technique is applied by running a
postprocessing program that reads the different ensemble member outputs along with the measured data (all in DATEM
format) and calculates the minimum
square errors for each possible model output combination.
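The combination counts quoted above follow from M!/(n!(M-n)!), and the reduction itself is a search over subsets. A compact sketch (the member and measurement values in the test are illustrative):

```python
from itertools import combinations
from math import comb

# Combination counts for a 24-member ensemble: pairs, trios, quartets.
counts = [comb(24, n) for n in (2, 3, 4)]

def best_subset(members, measured, n):
    # Average each n-member subset and return the subset (as member
    # indices) whose mean prediction has the minimum MSE vs. the data.
    def mse(subset):
        total = 0.0
        for k, obs in enumerate(measured):
            mean = sum(members[i][k] for i in subset) / n
            total += (mean - obs) ** 2
        return total / len(measured)
    return min(combinations(range(len(members)), n), key=mse)
```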
E. Solazzo, S. Galmarini, 2014. The Fukushima-137Cs deposition case study: properties of the multi-model ensemble,
possible to take advantage of the more precise Lagrangian approach near the
source and the more computationally
efficient Eulerian computation at the
hemispheric and global scales. More information on how to configure this
simulation
can be found in the global model help menu.
Dust Storms
A model for the emission of PM10 dust has been constructed (Draxler, R.R., Gillette, D.A., Kirkpatrick, J.S., Heller, J.,
2001,
Estimating PM10 Air Concentrations from Dust Storms in Iraq,
Kuwait, and Saudi Arabia, Atmospheric
Environment, Vol. 35,
4315-4330) using the concept of a threshold friction velocity which
is dependent on surface
roughness. Surface roughness was correlated
with geomorphology or soil properties and a dust emission rate is
The concentrations represent a 3-hour average from 21 to 24 hours after the start of the simulations. It is not possible to
say
exactly from when or where particles are emitted except to note
that the 105 potential source locations are shown.
The emission
algorithm emits in units of grams, but in configuring the concentration display, the units were changed to ug by defining a conversion factor of 10^6. Maximum concentrations are on the order of 100, but the layer
was defined with a depth of 3 km to
facilitate comparison with satellite observations. The simulation
could be rerun
with a smaller layer to obtain concentrations more
representative of near-ground-level exposures.
Daughter Products
A nuclide daughter product module has been incorporated into HYSPLIT.
Given a chain of decay from a parent nuclide,
the model can calculate the
additional radiological activity due to in-growth of daughter products using the Bateman
equations.
Information about the daughter products available in the model
can be found in ../auxiliary/ICRP07.NDX.
More information on how to configure this simulation
can be found in the daughter product help menu.
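For the simplest two-member chain, the Bateman solution has a closed form. A sketch, assuming a unit branching fraction and consistent time units for the half-lives and t (the function name is hypothetical):

```python
import math

def daughter_activity(parent_a0, t_half_parent, t_half_daughter, t):
    # Activity of the daughter grown in from a pure parent source:
    # A2(t) = A1(0) * l2/(l2 - l1) * (exp(-l1 t) - exp(-l2 t)).
    l1 = math.log(2.0) / t_half_parent
    l2 = math.log(2.0) / t_half_daughter
    return parent_a0 * l2 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
```

With a long-lived parent and a short-lived daughter, the daughter activity approaches the parent activity (secular equilibrium).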
Step 1: defines the measured data input used in this series of calculations. The dispersion model is run in its backward
mode, from each of the sampling locations, with a
particle mass proportional to the measured concentration and with
the particles released
over a period corresponding to the sample collection period. Sampling data files are in the ASCII
text DATEM format. The first record provides some general information, the second record identifies all the variables
that
then follow on all subsequent records. There is one data record for each sample collected.
All times are given in
UTC. The DATEM sampling data records have the following format:
INTEGER - Four digit year for sample start
INTEGER - Two digit month for sample start
INTEGER - Two digit day for sample start
INTEGER - Four digit hour-minutes for sample start
INTEGER - Four digit hour-minutes for duration of the sample
REAL - Latitude of the sampling point (Positive is north)
REAL - Longitude of the sampling point (Positive is east)
REAL - Air concentration in mass units per cubic meter
INTEGER - Site identification
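Since the records are free-format, each one can be split on whitespace. A sketch mapping the fields to the layout listed above (the sample record values are made up for illustration):

```python
def parse_datem_record(line):
    # Map the whitespace-separated fields to the DATEM record layout.
    f = line.split()
    return {
        "year": int(f[0]), "month": int(f[1]), "day": int(f[2]),
        "start_hhmm": int(f[3]), "duration_hhmm": int(f[4]),
        "lat": float(f[5]), "lon": float(f[6]),
        "value": float(f[7]), "site": int(f[8]),
    }

rec = parse_datem_record("1983 09 25 1700 0600 40.50 -80.21 24.0 301")
```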
A sampling data file which is used in the following calculations can be found in examples\matrix\measured.txt.
These
from different time periods valid for that 6-hour sampling period. A time aggregation value of 5 would average the
results from all 5 time periods into
one graphic. The zero threshold value can be used to eliminate the very low level contours that result from the zero-emission simulations. For instance, selecting a value of 1.0E-15 (1/1000 of a pg/m^3)
would set to zero any grid points less
than that value.
The comparable graphic for the inverse calculation is shown below for the mean emission values for 15 simulations that
had non-zero measurements. Before creating control files for the inverse simulation, previous control files should be
deleted to avoid mixing together the two types of simulations. For this example, the measured data file has 15 non-zero
measurements and 30 zero measurements and the
units conversion factor should be set to 1.0E-15 to go from pg to kg.
The resulting
interpretation of the graphic is that the central contour (value = 1) indicates
an average emission of 1 kg
in that region. The outermost contour (0.01) would require 100 kg to be released to match the measured data. Greater
dilution (smaller D) requires correspondingly greater emissions to match the measured data.
Additional finer resolution regional meteorological grids can be defined for the
Lagrangian portion of the transport
calculation. Details of the global model have
been previously published: Draxler, R.R., 2007, Demonstration of a global
modeling
methodology to determine the relative importance of local and long-distance sources,
Atmospheric
Environment, 41:776-789, doi:10.1016/j.atmosenv.2006.08.052.
General Instructions
Although the HYSPLIT-GEM version has its own executable, without the required
namelist file GEMPARM.CFG the
calculation will proceed as a standard Lagrangian HYSPLIT simulation. In this situation, the model will write the
default
file gemparm.bak as a reminder that a GEM configuration file is required to
invoke the global subroutines. This
file can then be edited and renamed if not using the GUI interface. GEM and HYSPLIT each create their own
concentration output
files and if the total concentration is required, the results from the two calculations must be added
together. The program concadd is included in the
distribution and can be accessed from the global display menu
by
selecting the plume add checkbox. Furthermore, the GEM concentration grid is
always a snapshot concentration. There
is no averaging option. In general, the model
simulation is configured like any other HYSPLIT simulation, through the
CONTROL
and SETUP.CFG files. Additional GEM specific options are set in the GEMPARM.CFG file. Not all GEM
options are available through the GUI and can
only be modified by editing the file directly.
SETUP.CFG Namelist Configuration
There are several parameters in the HYSPLIT namelist that are important for the
GEM subroutines. In particular, pay
attention to the following:
1. Set INITD to one of the conversion variables, any value >100, for instance 109
2. Set CONAGE (default=48 hours) for particle conversion into the global model. Note that when CONAGE<0 the
particles switch to the GEM grid when the particle meteorology grid switches to the global meteorological grid or when the particle age = |CONAGE|
3. When using an EMITIMES file, QCYCLE is not used for emission cycling. Also if
the EMITIMES file has fewer
sources than defined in the CONTROL file, the
emissions specified in the CONTROL file need to be set correctly
and match the
first value in EMITIMES.
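A minimal SETUP.CFG fragment reflecting items 1 and 2 might look as follows (illustrative only; other namelist variables retain their defaults):

```
&SETUP
 initd = 109,    ! a conversion value >100 so particles transfer to GEM
 conage = 48,    ! particle age in hours for conversion to the global grid
/
```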
GEMPARM.CFG GUI Namelist Options
Only a subset of the namelist options are available to be set through the GUI. Note
that concentration (mass/m3) output
will always be written to gemconc.bin as individual
or averaged layers, but the integrated output (mass/m2) will only be
written to gemzint.bin.
The bottom and top values are given as index numbers where 1 is the bottom layer. Layer
thickness corresponds with the input meteorological data and cannot be modified. The GUI interface is shown
below:
Step 1: Define the nuclide parent name (e.g. Cs-137) and run a preprocessor
(nuctree) that creates a summary text file
(DAUGHTER.TXT) that contains
the daughter nuclides produced by a parent nuclide along with the half-life and
branching fractions. This file is required to be in the working directory when running HYSPLIT for this application.
In
addition, the preprocessor will read the CONTROL file and create a new
(CONTROL.DAUGHTER) file containing the
daughter information. The names of the species will be given as numbers that correspond to the daughter product names written in the DAUGHTER.TXT file.
Finaly, the GUI will copy the CONTROL.DAUGHTER back to
CONTROL.
Step 2: configure setup (SETUP.CFG) indicating that the daughter products module will be used (ICHEM=11) and set
Maxdim=#daughters obtained from DAUGHTER.TXT
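For example, if DAUGHTER.TXT listed four daughter nuclides (the count of four here is purely illustrative), the relevant SETUP.CFG entries would be:

```fortran
&SETUP
 ichem  = 11,   ! enable the daughter products module
 maxdim = 4,    ! number of daughter nuclides from DAUGHTER.TXT
/
```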
Concentration / Multi-processor
Special simulations may require a different executable file, modifications to the CONTROL file that are not supported by the GUI, or interactions with other items under the Advanced Menu tab. More information is provided below for each special simulation that can be run under a multi-processor environment that supports MPI. These special simulations are available only for UNIX operating systems. All the MPI simulations execute the special script run_mpi.sh.
rather writes the same information using twice as many bytes. The packed files are generally smaller because only concentration values at the non-zero grid points are written to the output file by the model. However, this requires the grid point location to be written with the concentration, hence the additional bytes. If most of the grid is expected to have non-zero concentrations, then the old style format will save space. The format of the unformatted binary (big-endian) concentration file written by the dispersion model (hycs_std) and read by all concentration display programs is as follows:
Record #1
CHAR*4 Meteorological MODEL Identification
INT*4 Meteorological file starting time (YEAR, MONTH, DAY, HOUR, FORECAST-HOUR)
INT*4 NUMBER of starting locations
INT*4 Concentration packing flag (0=no 1=yes)
Record #2 Loop to record: Number of starting locations
INT*4 Release starting time (YEAR, MONTH, DAY, HOUR)
REAL*4 Starting location and height (LATITUDE, LONGITUDE, METERS)
INT*4 Release starting time (MINUTES)
Record #3
INT*4 Number of (LATITUDE-POINTS, LONGITUDE-POINTS)
REAL*4 Grid spacing (DELTA-LATITUDE,DELTA-LONGITUDE)
REAL*4 Grid lower left corner (LATITUDE, LONGITUDE)
Record #4
INT*4 NUMBER of vertical levels in concentration grid
INT*4 HEIGHT of each level (meters above ground)
Record #5
INT*4 NUMBER of different pollutants in grid
CHAR*4 Identification STRING for each pollutant
Record #6 Loop to record: Number of output times
INT*4 Sample start (YEAR MONTH DAY HOUR MINUTE FORECAST)
Record #7 Loop to record: Number of output times
INT*4 Sample stop (YEAR MONTH DAY HOUR MINUTE FORECAST)
Record #8 Loop to record: Number levels, Number of pollutant types
CHAR*4 Pollutant type identification STRING
INT*4 Output LEVEL (meters) of this record
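The record structure above can be decoded with a short script. Fortran unformatted sequential files frame each record with 4-byte length markers, so a reader must strip those before unpacking the fields. The sketch below parses only Record #1; it assumes the big-endian layout documented above and is not part of the HYSPLIT distribution:

```python
import struct

def read_record(f):
    """Read one big-endian Fortran unformatted sequential record.

    Each record is framed by 4-byte length markers; return the payload
    bytes, or None at end of file.
    """
    head = f.read(4)
    if len(head) < 4:
        return None
    (nbytes,) = struct.unpack(">i", head)
    payload = f.read(nbytes)
    (tail,) = struct.unpack(">i", f.read(4))
    if tail != nbytes:
        raise IOError("inconsistent record length markers")
    return payload

def parse_record1(payload):
    """Unpack Record #1: CHAR*4 model ID, INT*4 start time (year, month,
    day, hour, forecast-hour), INT*4 number of starting locations, and
    INT*4 concentration packing flag."""
    model = payload[0:4].decode("ascii")
    values = struct.unpack(">7i", payload[4:32])
    return model, values[0:5], values[5], values[6]
```

Records #2 through #8 can be decoded the same way using the field lists given above.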
Run the contour program and the display shown below will be created. In the working directory there will be an additional file called GIS_METEO_01.txt, which is the generate format text file of the terrain contours. Now open the GIS to Shapefile utility menu and browse for the generate format file name. Select the "Polygons" radio button to ensure that the terrain polygons are treated as closed connected lines. After processing the input file, four output files will be created with the base name contour. The subsequent plotting programs only need the contour.shp file, but the other files may be needed for import into other GIS applications.
If you do not have a plume simulation already completed, open the concentration setup menu and run the default test case, but move the starting location to 40N 125W and run the model for 24 rather than 12 hours. To simplify the output, set the concentration grid averaging time to 24 h and reduce the grid resolution from 0.05 to 0.10. Then run the model. Now open the concentration / display / contours menu and enter shapefiles.txt in the field for the map background. This will cause the display program to use the files defined in this file rather than the default map background. Multiple shapefiles can be displayed at the same time. Executing the display will result in the following image showing the intersection of the plume with the elevated terrain as it moves to the east.
When using the GUI, the namelist file will be deleted and all variables returned to their default values by selecting the "Reset" button. The following summarizes the namelist variables and their corresponding sections in the GUI menu.
When using the GUI, the namelist file will be deleted and all variables returned to their default values by selecting the "Reset" button. The following summarizes the namelist variables and their corresponding sections in the GUI menu. Not all variables can be set through the menu system, but they may be modified by directly editing the namelist file.
(Lagrangian mode) or by using a pre-defined velocity vector (Forced mode). In the current version the dynamic sampler transport is always vertically isobaric, regardless of the vertical motion method selected for the pollutant transport. The dynamic sampler samples model-produced values that are generated internally on a snapshot concentration grid.
The first menu tab defaults to the configuration for one sampler. If multiple samplers are required, then enter the number in this menu. After pressing the Configure Samplers button, another menu comes up to select the sampler number to configure. Pressing the sampler number button brings up the menu shown below. Each defined sampler must be configured according to the following instructions.
The example below shows the setup for one sampler. Multiple samplers would repeat the last seven-line sequence.
1 : number of dynamic samplers
40.0 -90.0 500.0 : release location and height (agl)
0.0 0.0 : force vector - direction, speed (m/s)
95 10 16 00 00 : release start - year month day hour minute
95 10 16 00 00 : sampling start - year month day hour minute
00 : sample averaging (min)
60 : disk file output interval (min)
'LAGOUT.TXT' : sampler output file
Number of dynamic samplers: Due to file unit number restrictions, the current version only supports 9 simultaneous
sampler definitions. The following seven lines need to be repeated for every defined sampler.
Release location and height: The location is the latitude-longitude point at which the trajectory of the sampler path is
started. Although the subsequent vertical motion is isobaric, the initial starting height must be defined in meters above
ground-level. Note that the isobaric sampler trajectory will not be identical to the HYSPLIT trajectory model isobaric
trajectory because of differences in how the vertical motion is computed between the two models.
Force vector: If the direction (downwind) and speed (m/s) are set to zero, then the model computes the sampler trajectory according to the meteorological input data wind velocity. If any of these values are not zero, then the sampler trajectory is computed using these values for its entire length. To simulate a more complex aircraft sampling path, each leg of the flight pattern requires its own sampler definition. For instance, an aircraft flying east will have a direction vector of 090.
Release start: The release time of the sampler gives the date and time that the sampler starts from the previously defined release location. Note that in the current version, zeros in this field (relative start time) are not supported.
Sampling start: The sampler may proceed on its trajectory for quite some time before sampling is started. The sampler
start time must be greater than or equal to the release start time.
Sample averaging (minutes): At every computational time step the model determines the sampler position in the
concentration grid and accumulates the concentration value in memory. When the sampler accumulation time reaches
the sample averaging time, all sums are reset to zero. A sample averaging time of zero means that the sample is only
accumulated for one model integration time step.
Disk file output interval (minutes): At the disk output interval, the sampler concentration values are written to the output
file defined on the next input line. The value written is the accumulated concentration divided by the accumulated
minutes. The disk output interval is independent from the sample averaging time.
Sampler output file: The directory and file name should be enclosed in single quotes. If the file is defined without a
directory it is written to the model startup directory.
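A sampler configuration file following the layout shown above (a count line followed by seven lines per sampler) can also be generated programmatically. The helper below is hypothetical, not part of HYSPLIT; it simply reproduces that layout:

```python
def write_sampler_file(path, samplers):
    """Write a dynamic sampler configuration file: one count line, then
    seven lines per sampler (location, force vector, release start,
    sampling start, averaging, output interval, output file)."""
    with open(path, "w") as f:
        f.write(f"{len(samplers)}\n")
        for s in samplers:
            f.write(f"{s['lat']:.1f} {s['lon']:.1f} {s['height']:.1f}\n")
            f.write(f"{s['direction']:.1f} {s['speed']:.1f}\n")
            f.write("{:02d} {:02d} {:02d} {:02d} {:02d}\n".format(*s["release"]))
            f.write("{:02d} {:02d} {:02d} {:02d} {:02d}\n".format(*s["sampling"]))
            f.write(f"{s['avg_min']:02d}\n")
            f.write(f"{s['out_min']}\n")
            f.write(f"'{s['outfile']}'\n")
```

Calling it with the values from the example above reproduces that file exactly.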
Concentration Grid Configuration
Note that dynamic sampling will only work if the sampler trajectory passes through a concentration grid covering the region of interest. That means that a concentration grid of sufficient resolution, in both the horizontal and the vertical, is required for the sampler to capture the pollutant. However, too much resolution (too fine a grid) may mean that there are insufficient pollutant particles to provide a uniform distribution, and therefore the sampling could give unrepresentative results. The concentration grid is required to be defined as a snapshot grid. In the current version, only one pollutant species per simulation is supported.
The first menu tab defaults to the configuration for one source. If multiple sources are required, then enter the number in this menu. After pressing the Configure Locations button, another menu comes up to select the location number to configure. Pressing the location number button brings up the menu shown below. The GUI menu only supports the creation of a file for one pollutant and one emission cycle. If multiple pollutants are defined, or multiple cycles are required, then the file must be edited manually by duplicating the emission record at each location for all pollutants in the order they are defined in the CONTROL file. Each defined release location must be configured according to the following instructions. The point source emissions file must be located in the startup directory, and the name should always
always be six characters in length. The first three characters represent the time zone label, such as UTC, the next character is the sign field (+ to the east or - to the west), and the last two digits are the hour difference of the time zone from UTC.
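For illustration, the six-character convention can be validated with a few lines of code (a sketch; the label names used below are examples only):

```python
def parse_tz_label(label):
    """Split a six-character time zone label into its name and signed
    hour offset, e.g. 'UTC+00' or 'EST-05', per the convention above."""
    if len(label) != 6 or label[3] not in "+-":
        raise ValueError("label must be three letters, a sign, and two digits")
    name, sign, hours = label[:3], label[3], int(label[4:6])
    return name, hours if sign == "+" else -hours
```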
Altitude units (ALTTD) may be set to feet for concentration output plots.
The table below shows the dependence of Foot and Mouth Disease virus survival on the species identification string. FM_7 is the most realistic option.

Identification  Particle Aging  Temperature  Humidity
FM_0            False           False        False
FM_1            False           False        True
FM_2            False           True         False
FM_3            False           True         True
FM_4            True            False        False
FM_5            True            False        True
FM_6            True            True         False
FM_7            True            True         True
The species identification string can be set in the pollutant emission rate setup menu or automatically using the FMDV radio button in the deposition setup menu described in the following section.
- Particle aging
Various publications use a virus decay constant to simulate particle aging. For the Foot and Mouth virus, estimates of an effective half-life range from 30 minutes [1] to 2 hours [2]. This constant also depends on the strain of the virus. To be conservative, the virus decay constant has been defined to be 2 hours by default, but this can easily be redefined by the user in the advanced settings menu (see below). Particle aging can be turned off altogether if desired by using the species identification label outlined above.
- Temperature
The survival of the virus depends on temperature. Taking the default temperature threshold of 24 °C, all virus particles at 24 °C and under are unaffected. The concentration mass of virus particles decreases linearly between 24 °C and 30 °C so that none remain by the time they reach 30 °C. This threshold of 24 °C can be user defined in the advanced settings (preserving the 6 °C linear fall-off), or the temperature dependence can be turned off altogether by selecting the appropriate species identification string outlined above.
- Humidity
The survival of the virus also depends on humidity: the virus survives better at higher humidity. Virus concentrations at relative humidity higher than 60% are left unaffected. Concentrations decrease exponentially as the humidity falls from 60% to 1%. This threshold of 60% can be user defined in the advanced settings, or the humidity dependence can be turned off altogether by selecting the appropriate species identification string outlined above.
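The three dependencies can be combined into a single survival fraction. The sketch below follows the rules stated above (2-hour half-life, 24-30 °C linear fall-off, humidity decrease below 60% RH); the exact exponential form used for the humidity term is an assumption for illustration, not the model's actual source code:

```python
import math

def fmdv_survival_factor(age_hours, temp_c, rh_pct,
                         half_life_h=2.0, t_lo=24.0, t_hi=30.0, rh_thresh=60.0):
    """Combined FMDV survival fraction (illustrative sketch only)."""
    # Particle aging: exponential decay with the configured half-life.
    f_age = 0.5 ** (age_hours / half_life_h)
    # Temperature: unaffected at or below 24 C, linear fall-off to zero at 30 C.
    if temp_c <= t_lo:
        f_temp = 1.0
    elif temp_c >= t_hi:
        f_temp = 0.0
    else:
        f_temp = (t_hi - temp_c) / (t_hi - t_lo)
    # Humidity: unaffected at or above 60% RH; assumed exponential
    # decrease as RH falls from 60% toward 1%.
    if rh_pct >= rh_thresh:
        f_rh = 1.0
    else:
        f_rh = math.exp(-(rh_thresh - max(rh_pct, 1.0)) / rh_thresh)
    return f_age * f_temp * f_rh
```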
[Figure: FMDV survival dependence upon temperature]
References
[1] M.G. Garner, G.D. Hess and X. Yang, An integrated modelling approach to assess the risk of wind-borne spread of foot-and-mouth disease virus from infected premises, Environmental Modelling & Assessment (2006) 11: 195-207.
[2] J.H. Sorensen, C.O. Jensen, T. Mikkelsen, D.K.J. Mackay and A.I. Donaldson, Modelling the Atmospheric Dispersion of Foot-and-Mouth Virus for Emergency Preparedness, Phys. Chem. Earth (B), Vol. 26, No. 2, pp. 93-97, 2001.
Guide page with a more detailed discussion of the options. The namelist options that affect only trajectories are shown in italics.
CAPEMIN=-1 enhanced vertical mixing when CAPE exceeds this value (J/kg)
CMASS=0 compute grid concentrations (0) or grid mass (1)
CONAGE=48 particle to or from puff conversions at conage (hours)
CPACK=1 binary concentration packing 0:none 1:nonzero 2:points 3:polar
DELT=0.0 integration time step (0=autoset GT-0=constant LT-0=minimum)
DXF=1.0 horizontal X-grid adjustment factor for ensemble
DYF=1.0 horizontal Y-grid adjustment factor for ensemble
DZF=0.01 vertical (0.01 ~ 250m) factor for ensemble
EFILE=' ' temporal emission file name
FRHMAX=3.0 maximum value for the horizontal rounding parameter
FRHS=1.00 standard horizontal puff rounding fraction for merge
FRME=0.10 mass rounding fraction for enhanced merging
FRMR=0.0 mass removal fraction during enhanced merging
FRTS=0.10 temporal puff rounding fraction
FRVS=0.01 vertical puff rounding fraction
HSCALE=10800.0 horizontal Lagrangian time scale (sec)
ICHEM=0 chemistry conversion modules 0:none 1:matrix 2:convert 3:dust ...
INITD=0 initial distribution, particle, puff, or combination
KAGL=1 trajectory output heights are written as AGL (1) or as MSL (0)
KBLS=1 boundary layer stability derived from 1:fluxes 2:wind_temperature
KBLT=2 boundary layer turbulence parameterizations 1:Beljaars 2:Kantha_Clayson 3:TKE
KDEF=0 horizontal turbulence 0=vertical 1=deformation
KHINP=0 when non-zero sets the age (h) for particles read from PINPF
KHMAX=9999 maximum duration (h) for a particle or trajectory
KMIXD=0 mixed layer obtained from 0:input 1:temperature 2:TKE
KMIX0=250 minimum mixing depth
KMSL=0 starting heights default to AGL=0 or MSL=1
KPUFF=0 horizontal puff dispersion linear (0) or empirical (1) growth
KRAND=2 method to calculate random number 1=precalculated 2=calculated in pardsp 3=none
KRND=6 enhanced merge interval (hours)
KSPL=1 standard splitting interval (hours)
KWET=0 precipitation from an external file
KZMIX=0 vertical mixing adjustments: 0=none 1=PBL-average 2=scale_TVMIX
MAXDIM=1 maximum number of pollutants to carry on one particle
MAXPAR=10000 maximum number of particles carried in simulation
MESSG='MESSAGE' diagnostic message file base name
MGMIN=10 minimum meteorological subgrid size
MHRS=9999 trajectory restart duration limit (hours)
NBPTYP=1 number of redistributed particle size bins per pollutant type
NINIT=1 particle initialization (0-none; 1-once; 2-add; 3-replace)
NCYCL=0 pardump output cycle time
NDUMP=0 dump particles to/from file 0-none or nhrs-output interval
NSTR=0 trajectory restart time interval in hours
NUMPAR=2500 number of puffs or particles to be released per cycle
NVER=0 trajectory vertical split number
P10F=1.0 dust threshold velocity sensitivity factor
PINBC=' ' particle input file name for time-varying boundary conditions
PINPF=' ' particle input file name for initialization or boundary conditions
POUTF=' ' particle output file name
QCYCLE=0.0 optional cycling of emissions (hours)
RHB=80 is the RH defining the base of a cloud
RHT=60 is the RH defining the top of a cloud
SPLITF=1.0 automatic size adjustment (<0=disable) factor for horizontal splitting
TKERD=0.18 unstable turbulent kinetic energy ratio = w'2/(u'2+v'2)
TKERN=0.18 stable turbulent kinetic energy ratio
TM_PRES=0 trajectory diagnostic output pressure variable marker flag
TM_TPOT=0 trajectory diagnostic output potential temperature
TM_TAMB=0 trajectory diagnostic output ambient temperature
TM_RAIN=0 trajectory diagnostic output rainfall rate
TM_MIXD=0 trajectory diagnostic output mixed layer depth
TM_TERR=0 trajectory diagnostic output terrain height
TM_DSWF=0 trajectory diagnostic output downward short-wave flux
TM_RELH=0 trajectory diagnostic output relative humidity
TOUT=60 trajectory output interval in minutes
TRATIO=0.75 advection stability ratio
TVMIX=1.0 vertical mixing scale factor
VDIST='VMSDIST' diagnostic file vertical mass distribution
VSCALE=200.0 vertical Lagrangian time scale (sec)
VSCALEU=200.0 vertical Lagrangian time scale (sec) for unstable PBL
VSCALES=200.0 vertical Lagrangian time scale (sec) for stable PBL
advection time step. Reducing this value will reduce the time step and increase computational times. Smaller time steps result in less integration error. Integration errors can be estimated by computing a backward trajectory from the forward trajectory end position and computing the ratio of the distance between that endpoint and the original starting point to the total forward and backward trajectory distance.
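The forward-backward check just described can be scripted. The sketch below uses a great-circle distance (a simplification that ignores the vertical coordinate); the function and argument names are illustrative, not part of HYSPLIT:

```python
import math

def trajectory_error(start, back_end, total_path_km):
    """Relative integration error: great-circle distance between the
    original start point and the back-trajectory endpoint, divided by
    the total forward-plus-backward path length (km)."""
    r = 6371.0  # mean Earth radius, km
    lat1, lon1 = map(math.radians, start)
    lat2, lon2 = map(math.radians, back_end)
    d = 2 * r * math.asin(math.sqrt(
        math.sin((lat2 - lat1) / 2) ** 2 +
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2))
    return d / total_path_km
```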
Set Value - DELT (0.0) - is used to set the integration time step to a fixed value in minutes from 1 to 60. It should be evenly divisible into 60. The default value of zero causes the program to compute the time step each hour according to the maximum wind speed, the meteorological and concentration grid spacing, and the TRATIO parameter. The fixed time step option should only be used when strong winds in regions not relevant to the dispersion simulation (perhaps the upper troposphere) are causing the model to run with small time steps. Improper specification of the time step could cause aliasing errors in advection and underestimation of air concentrations. An alternate approach is to set a negative value, which will result in the absolute value being used as the minimum time step.
Vertical Shift
Starting and ending member numbers can be selected where:
the range 1-9 applies for no vertical shift,
the range 10-18 applies for a positive vertical shift,
the range 19-27 applies to a negative vertical shift,
and DZF (0.01) is the vertical grid offset factor (0.01 ~ 250 m).
Horizontal Shift
DXF (1.0) is the west-to-east grid factor for offsetting the meteorological grid in the ensemble calculation.
DYF (1.0) is the south-to-north grid factor for offsetting the meteorological grid in the ensemble calculation.
during the initial stages of the transport, a stage when the plume is still narrow. Using mode #104 would start the simulation with particles (and concentration grid cells) and then switch to puff mode (and concentration sampling nodes) when the particles are distributed over multiple concentration grid cells. Valid options are:
103 - 3D particle (#0) converts to Gh-Pv (#3)
104 - 3D particle (#0) converts to THh-Pv (#4)
130 - Gh-Pv (#3) converts to 3D particle (#0)
140 - THh-Pv (#4) converts to 3D particle (#0)
109 - 3D particle converts to grid (global model)
A new option (109), introduced with the January 2009 version (4.9), converts 3D particles to the Global Eulerian Model grid. The particles are transferred to the global grid after the specified number of hours. This approach should only be used for very long-range (hemispheric) transport, due to the artificial diffusion introduced when converting pollutant
Advanced / Configuration Setup / Emission Cycling and Point Source File (S624)
all turbulent scales have been sampled and puffs grow in proportion to increasing time. The empirical fit equation is similar, but the rate of puff growth decreases with time. The slower puff growth rate in the linear approach is represented by the separation of puffs after splitting due to variations in the flow. The empirical approach should only be used in those situations where puff splitting is constrained because of memory or computing time limitations.
0 = Linear with time puff growth (DEFAULT)
1 = Empirical fit to the puff growth
Turbulence Anisotropy Factors
TKERD and TKERN are the ratios of the vertical to the horizontal turbulence for daytime and nighttime, respectively. TKER{D|N} is defined as W'2/(U'2+V'2). A zero value forces the model to compute a TKE ratio consistent with its turbulence parameterization. A non-zero value forces the vertical and horizontal values derived from the TKE to match the specified ratio. This option is only valid with KBLT=3. The Urban button increases the internal TKE by 40% and slightly raises the nighttime ratio to account for enhanced turbulence in an urban setting. The land-use file supplied with the model is not of sufficient resolution to define urban areas, and hence the urban setting applies to all points in the computational domain.
0.18 = TKERD day (unstable) turbulent kinetic energy ratio
0.18 = TKERN night (stable) turbulent kinetic energy ratio
particles that are of age = KHINP. The final output requires that the concentration grids from the two simulations be added together. Also note that when a particle file is used for initialization of a simulation, the meteorological grid identification of a particle is retained from the previous simulation (except when KHINP<>0). This means that if different meteorological input files are defined in the two simulations, a particle in the second simulation may not be defined on the intended meteorological grid at the start of the calculation.
Particle File Output Options
NDUMP can be set to dump out all the particle/puff points at selected time intervals to a file called PARDUMP. This file can be read from the root directory at the start of a new simulation to continue the previous calculation. Valid NDUMP settings are [0] for no I/O or [hours] to set the number of hours from the start of the simulation at which all the particle information is written. The default name for the particle dump output file (POUTF) is PARDUMP. Although not yet available through the GUI, there is a command line program, par2conc, that can be used to convert the particle position file to a binary concentration file.
are defined they are released as individual particles. This feature is usually required with chemical conversion modules, which are not part of the standard distribution. However, the species 1->2 conversion option will automatically set MAXDIM=2. Both the standard point source and emissions file input routines are enabled to release multiple species on the same particle. In addition, for the daughter product calculation, MAXDIM needs to be set equal to the number of daughter nuclides. Setting MAXDIM greater than one forces all the species onto the same particle, with the limitation that the number of defined pollutants must equal MAXDIM. No other combinations are permitted. The advantage of this approach is that fewer particles are required for the calculation than in the situation where every pollutant is on its own particle. The limitation is that each species on the particle should have comparable dynamic characteristics, such that they are all gases or have similar settling velocities; otherwise the transport and dispersion simulation would not be very realistic. In the case of multiple species defined on a single particle, the gravitational settling routine will use the characteristics of the last defined pollutant to compute settling for that particle. Deposition, decay, and other removal processes are still handled independently for each pollutant on that particle.
into multiple particles. To keep the particle-puff number from quickly exceeding the computational array limits, puffs occupying the same location may be merged. There are several different parameter settings that control the splitting and merging. These are discussed in more detail in this section.
means that only puffs whose total mass represents no more than 0.10 of the mass of all puffs will be subjected to enhanced merging.
KSPL (1) is the interval in hours at which the puff splitting routines are called.
FRHS (1.0) is the maximum horizontal separation between puffs, in sigma units, at which they are permitted to merge.
FRVS (0.01) is the maximum vertical separation between puffs, in sigma units, at which they are permitted to merge. This parameter only applies to 3D puff simulations.
FRTS (0.10) is the maximum fractional difference in age at which puffs are permitted to merge.
KRND (6) is the interval in hours at which enhanced puff merging takes place. Enhanced merging is less restrictive (FRHS, FRVS, and FRTS are 50% larger) and will degrade the accuracy of the simulation: puffs can be further apart and still be merged into the same position. Enhanced merging only occurs when the puff number exceeds 25% of MAXPAR.
FRME (0.10) is the fraction of the total mass defining the puff mass threshold below which all puffs together account for no more than FRME of the total mass. These "low mass" puffs will be subject to enhanced merging.
FRMR (0.0) is the fraction of the mass that is permitted to be removed at KRND intervals. The normal situation is to permit no mass loss. However, for certain simulations, such as when a pollutant has a high ambient background relative to a typical plume signal, a small removal rate could significantly reduce the number of puffs on the grid with no loss in the accuracy of the simulation. This removal procedure can also be specified for 3D particle simulations.
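The merge criteria above can be summarized in code. This is a schematic of the eligibility test only; representing puffs as dictionaries of position and age is an invented convention for illustration, and the actual HYSPLIT internals differ:

```python
import math

def puffs_can_merge(p1, p2, frhs=1.0, frvs=0.01, frts=0.10, enhanced=False):
    """Two puffs may merge when their horizontal and vertical separations
    (in sigma units) and fractional age difference are all within the
    rounding parameters; enhanced merging relaxes each threshold by 50%."""
    scale = 1.5 if enhanced else 1.0
    dh = math.hypot(p1["x"] - p2["x"], p1["y"] - p2["y"])
    dv = abs(p1["z"] - p2["z"])
    da = abs(p1["age"] - p2["age"]) / max(p1["age"], p2["age"], 1)
    return dh <= frhs * scale and dv <= frvs * scale and da <= frts * scale
```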
concentration grid size, whichever is larger. A termination message to standard output has been added prior to
HYSPLIT completion if puff splitting restrictions are in place at the end of the simulation. Such a message would
suggest that it might be necessary to rerun the simulation with added array space or different merge parameters.
FRHMAX (3.0) is the maximum permissible value for FRHS.
SPLITF (1.0) at the default value of 1.0 is automatically recomputed to be the ratio of the number of concentration grid cells to the maximum number of particles permitted. For values less than one this feature is disabled and splitting occurs as before. For values greater than one, that value is used to determine when a puff splits. The distance adjustment is based upon how many particles are required to cover the concentration grid (assuming 20% in the PBL). If there is an insufficient number of particles, then the puffs are allowed to grow
Theoretical Considerations
For example, if a global simulation is required using a 1-degree resolution concentration grid, that results in about 65,000 grid points at the surface. Clearly, a very long duration simulation that is expected to spread over much of the domain will require a number of puffs comparable to the number of grid cells to provide smooth concentration patterns. If we assume the lowest layer represents only about 10 percent of the volume over which the puffs have been transported and mixed, it is not unrealistic to expect such a simulation to require 10 times as many puffs. An alternate approach would be to dump the puffs into a global grid model rather than splitting them as they grow to the size of the meteorological grid. Concentrations at any point would then be a sum from the two modeling approaches. This is a variation of a plume-in-grid model.
Parameter Considerations
Very long duration simulations, or simulations using very fine resolution meteorological data, which have an insufficient initial allocation of the puff array space (MAXPAR in the namelist), can result in split shutdown messages or perhaps even emission shutdown messages. If any of these occur, the simulation results should be viewed with caution. The results may be noisy and inaccurate if the emissions (new puffs released) have also been restricted. A simulation with puff split restrictions may be improved by first increasing the array space to a value that still results in acceptable simulation times. If that is not effective, or the CPU times become too long, the second choice could be to increase the frequency of enhanced merging (perhaps decreasing KRND from 6 to 3, 2, or even 1), perhaps in combination with decreasing the split frequency (increasing KSPL from 1 to 3, 6, or 12). Although decreasing the split frequency will not be effective once splitting has shut down, it may extend the time at which splitting first shuts down. Enhanced merging has little effect if most of the puffs have the same mass, perhaps because they were released at the same time. It is most effective for continuous emission simulations, where there is a large range in puff mass due to the different number of splits each puff has been subjected to. An effective removal method is setting FRMR to a non-zero value. This has the effect of purging the simulation of low-mass puffs and would be most appropriate for continuous emission simulations, where the puffs at longer distances have less importance. Used incorrectly, setting this parameter to a non-zero value can seriously bias the model results. For long-duration continuous emission simulations, it may also be just as effective to stagger the emissions, because it is not necessary to emit puffs every time step for realistic (and accurate) results. This could be accomplished by emitting more mass over a shorter duration and then cycling the emissions. For instance, instead of a continuous emission, one could emit 10 times the normal hourly amount over 0.1 hours (6 min) and then repeat the emission cycle (QCYCLE parameter) each hour. The emission cycle could even be staggered over longer times.
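The staggered-emission arithmetic in the last paragraph can be checked directly. The helper below is illustrative only; it confirms that a short, intense emission repeated with QCYCLE releases the same total mass as a continuous one:

```python
def total_emitted(rate_per_hour, duration_h, qcycle_h, sim_hours):
    """Total mass released when an emission of rate_per_hour lasting
    duration_h is repeated every qcycle_h hours over sim_hours."""
    cycles = int(sim_hours // qcycle_h)
    return rate_per_hour * duration_h * cycles
```

For a 24-hour run, a continuous emission of 1 unit/hour and a staggered emission of 10 units/hour over 0.1 hours cycled hourly both release 24 units.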
name of an external precipitation file that is in the ARL standard format. The text file also defines the 4-character name of the precipitation variable as well as the conversion factor from file units to model units (meters/min). The file name is automatically constructed from the prefix and suffix names using the current calculation date, following the convention: {prefix}{YYYYMMDD}{suffix}. If the raindata.txt file is not found in the startup directory, the model will create a template for manual editing and then stop.
MAXDIM (1) - is the maximum number of pollutants that can be attached to one particle. Otherwise, if multiple pollutants are defined, they are released as individual particles. This feature is only required with chemical conversion modules, which are not part of the standard distribution. However, the species 1->2 conversion option will automatically set MAXDIM=2, but the standard emissions routine will still put each species on a different particle in its appropriate array element. The only emissions routine that is enabled to release multiple species on the same particle is the emissions file option. Setting MAXDIM greater than one forces all the species onto the same particle, with the limitation that the number of defined pollutants must equal MAXDIM. No other combinations are permitted. The advantage of this approach is that fewer particles are required for the calculation than in the situation where every pollutant is on its own particle. The limitation is that each species on the particle should have comparable dynamic characteristics, such that they are all gases or have similar settling velocities; otherwise the transport and dispersion simulation would not be very realistic.
MESSG (MESSAGE) - the default base name of the diagnostic MESSAGE file. In normal simulations, the name defaults to the base name. In an ensemble multi-processor calculation, the message file name would be automatically set to correspond with the process number by appending the process number to the base name: {base}.{000}, where the process number can be any integer between 001 and 999.
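The per-process naming rule can be expressed as a small helper (a sketch of the convention stated above, not code taken from HYSPLIT):

```python
# Build per-process MESSAGE file names for an ensemble run by appending a
# zero-padded process number to the base name, as in {base}.{000}.
def message_file_name(base, process=None):
    if process is None:
        return base                      # normal single-process simulation
    if not 1 <= process <= 999:
        raise ValueError("process number must be between 001 and 999")
    return f"{base}.{process:03d}"

print(message_file_name("MESSAGE"))       # MESSAGE
print(message_file_name("MESSAGE", 27))   # MESSAGE.027
```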
NBPTYP (1) - defines the number of bins assigned to each particle size as defined in the pollutant section of the CONTROL file. The default value of one uses the input parameters. A value larger than one will create that number of particle size bins centered about each value in the CONTROL file. The program creates the mass distribution for a range of particle sizes given just a few points within the distribution. We assume that dV/d(log R) is linear between the defined points for an increasing cumulative mass distribution with respect to particle diameter. The input points in the CONTROL file should be sorted by increasing particle size within the array. For instance, if the CONTROL file defines 3 particle sizes (5, 20, and 50 microns) and NBPTYP=10, then the 5 micron input particle generates sizes from 2.3 to 8.1 microns while the 50 micron input particle generates sizes from 30.2 to 68.9 microns.
OUTDT (0) - defines the output frequency in minutes of the endpoint positions in the PARTICLE.DAT file when the STILT emulation mode is configured. The default value of 0 results in output each time step, while a positive value gives the output frequency in minutes. A negative value disables output. The output frequency should be an even multiple of the time step and be evenly divisible into 60. In STILT emulation mode, the time step is forced to one minute.
PINBC - defines the input particle file name (default=PARINBC) that
can be used for boundary conditions during
the simulation.
RHB (80) - defines the initial relative humidity required to define the base of a cloud.
RHT (60) - the cloud continues upward until the relative humidity drops below this value. The values represent a compromise between the cases when the cloud (and precipitation) covers an entire grid cell and those situations where the cloud may only cover a fraction of the meteorological grid cell. For example, using the cloud fractions as defined in the EDAS (40 km resolution) data, 70% of the time 100% cloud cover occurs when the RH exceeds 80%.
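The RHB/RHT logic amounts to an upward scan of the humidity profile; the sketch below illustrates it with an invented profile (the function and the sample RH values are assumptions for illustration, not HYSPLIT code):

```python
# Find cloud base and top from an RH profile: the base is the first level
# where RH >= RHB (80%), and the cloud extends upward until RH < RHT (60%).
def cloud_layer(heights, rh, rhb=80.0, rht=60.0):
    base = top = None
    for z, h in zip(heights, rh):
        if base is None:
            if h >= rhb:
                base = z          # cloud base found
        elif h < rht:
            break                 # cloud top passed
        else:
            top = z               # cloud continues upward
    if base is not None and top is None:
        top = base                # single-level cloud
    return base, top

heights = [100, 500, 1000, 1500, 2000, 2500]   # meters (invented profile)
rh      = [ 50,  85,   90,   70,   55,   40]   # percent
print(cloud_layer(heights, rh))  # (500, 1500)
```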
TVMIX - vertical mixing scale factor applied to the vertical mixing coefficients when KZMIX = 2 for boundary layer values and KZMIX = 3 for free-troposphere values.
VDIST (VMSDIST) - an output file that contains the hourly vertical mass distribution, similar to the values shown in the MESSAGE file every six hours. The output can be disabled by setting the default file name to blank or null length. Programs vmsmerge and vmsread are available in the exec directory to view and process these files.
VSCALE (200.0) - the vertical Lagrangian time scale in seconds that controls the transition of the dispersion rate from linear to square root growth as a function of time.
VSCALEU (200.0) - the vertical Lagrangian time scale in seconds that controls the transition of the dispersion rate from linear to square root growth as a function of time for unstable PBL conditions.
VSCALES (200.0) - the vertical Lagrangian time scale in seconds that controls the transition of the dispersion rate from linear to square root growth as a function of time for stable PBL conditions.
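The linear-to-square-root transition these time scales control follows the classical Taylor (1921) homogeneous-turbulence result, sigma^2 = 2*sigma_w^2*T_L^2*(t/T_L - 1 + exp(-t/T_L)). The sketch below evaluates that textbook formula (not HYSPLIT's internal scheme; sigma_w and T_L values are illustrative) to show sigma growing like sigma_w*t for t << T_L and like sigma_w*sqrt(2*T_L*t) for t >> T_L:

```python
import math

def sigma(t, sigma_w=0.5, t_l=200.0):
    """Taylor (1921) dispersion with velocity variance sigma_w^2 and
    Lagrangian time scale t_l (both values illustrative)."""
    x = t / t_l
    return math.sqrt(2.0 * sigma_w**2 * t_l**2 * (x - 1.0 + math.exp(-x)))

# Short times: sigma ~ sigma_w * t (linear growth).
print(sigma(1.0), 0.5 * 1.0)
# Long times: sigma ~ sigma_w * sqrt(2 * T_L * t) (square-root growth).
print(sigma(1.0e6), 0.5 * math.sqrt(2 * 200.0 * 1.0e6))
```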
ftp://jwocky.gsfc.nasa.gov/pub/{eptoms|nimbus7}/data/aerosol/{year}/
file = L3_aersl_{ept|n7t}_{year}{month}{day}.txt
More current information is available from the README files on the NASA server.
TOMS Earthprobe datasets are located in the following directory paths:
data/ozYYYY/gaYYMMDD.ept (YYYY=1996 >>) (EarthProbe still reporting data)
Daily gridded (g), ASCII (a) files from year YY, month MM and day DD. The data in these ASCII files are in the format used for the Nimbus-7 and Meteor-3 CD-ROM. Details of this format are explained in the EarthProbe Data Products User's Guide, EARTHPROBE_USERGUIDE.PDF, available in the parent directory. A stub program, RDGRID.FOR, may be used to read these datasets.
The values are in 3-digit groups (111222333444555); see the README files for information about interpreting/decoding the uv data file format. For example:
123 = 2.3x10^1 J/m2    -23 = 2.3    --3 = 0.3    999 = missing or bad data
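A decoder for these groups can be inferred from the examples above (this reading -- first character an exponent digit or '-', remaining two characters a mantissa in tenths with '-' as a leading zero -- is an inference from the examples, not taken from a format specification):

```python
# Decode a 3-character group: '123' -> 2.3x10^1, '-23' -> 2.3,
# '--3' -> 0.3, '999' -> missing.  Rule inferred from the README examples.
def decode_group(s):
    if s == "999":
        return None                             # missing or bad data
    exponent = 0 if s[0] == "-" else int(s[0])  # leading '-' means 10^0
    mantissa = int(s[1:3].replace("-", "0")) / 10.0
    return mantissa * 10**exponent

for g in ("123", "-23", "--3", "999"):
    print(g, decode_group(g))
```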
data/monthly_averages/gmYYMM.ept
This directory contains gridded monthly (gm) averages of ozone data from year YY and month MM. At least 20 days of data must be available for monthly averages to be considered valid.
      DO 570 IL=1,180
        DO 560 jj=1,288
          OZONE(JJ)=ozav(JJ,IL)
  560   CONTINUE
        write(21,'(1X,25I3)') (OZONE(jj),jj=1,275)
        write(21,'(1X,13I3,1A17)') (OZONE(jj),jj=276,288), latlab(IL)
  570 CONTINUE
data/overpass/OVPxxx.ept
A subset (seconds UT, latitude, longitude, total ozone, surface reflectivity, solar zenith angle, aerosol index, SOI index, etc.) of EarthProbe data specific to a particular latitude-longitude location. xxx designates a numerical site location as listed in file "1Sitelist" in this directory.
Fortran statements that can be used to read the 1st header
record and the data records in the OVPxxx files are the
following:
      DO 300 IZ=1,36
        write(22,'(I3,I4,12f6.1)') lat(IZ),lat(IZ+1),(ozm(IM,IZ),IM=1,12)
  300 CONTINUE
      write(22,'(I3,I4,12f6.1)') -65,0,(ozm(IM,37),IM=1,12)
      write(22,'(I3,I4,12f6.1)') 0,65,(ozm(IM,38),IM=1,12)
      write(22,'(I3,I4,12f6.1)') -65,65,(ozm(IM,39),IM=1,12)
The Fortran statement that wrote the data records of the zmday_YY files is:
write(21,'(A12,F9.3,1X,36F6.1,3X,3F6.1)') date,yd,zmd
Code 916
Greenbelt, MD 20771
accomplished by overlaying the AI values with the model particle positions from the particle dump file and/or the concentration field. The TOMS display provides a color-coded representation of the aerosol analysis from the Total Ozone Mapping Spectrometer. A range of warmer colors represents various positive values corresponding to absorbing aerosols.
The particle position dump file should have been created during the simulation by configuring the output of particle files (item 9) in the advanced configuration menu. Note that output is only required once per day because each TOMS data file represents all the overpasses for one day. For any given longitude, the satellite overpass occurs near solar noon. Therefore the output time should be determined according to the predominant particle longitude. For instance, for the eastern United States, solar noon occurs around 1600 UTC.
The viewer has several input options that can all be selected through the GUI. The map center can be set to the dateline or prime meridian; the AI display has a minimum value below which nothing will be shown; and there is an AI color increment that represents the AI difference between the colors. Colors change from cold (blue) to warm (red) with increasing AI.
The times in the output files are not matched; they are just displayed in time sequence, with the TOMS file opened according to the date-time indicator written in the particle file. Therefore multiple particle or concentration files could be written for each day, but only one TOMS file will be opened for that day.
There are two steps involved in viewing these files. First the map domain needs to be initialized by clicking and holding the left mouse button at the lower left corner and releasing the button at the upper right corner. The map will then be redrawn. Then each left mouse button click draws the data for the next time period. A right button click will exit the display loop.
configuration supplied with the test meteorological data is confined to a simple trajectory and inert transport and dispersion calculation. Some of these more complex scenarios are configured through the Advanced menu Configuration Setup tab, which modifies the "SETUP.CFG" namelist file.
Particle or Puff Releases
The concentration model default simulation assumes 3D particle dispersion (horizontal and vertical). Other options are set with the INITD parameter of the SETUP.CFG namelist file defined in the advanced menu section. Normally changes to the dispersion distribution are straightforward. However, there are some considerations with regard to the initial number of particles released. The default release is set to be 2500 particles over the duration of the emission cycle (see NUMPAR). A 3-dimensional (3D) particle simulation requires many more particles to simulate the proper
hour and those particles will only contribute to concentrations on the grid with the same release-time tag. The pollutant identification field is used and its value corresponds to hours after the start of the simulation. The output concentration grid will appear to be like any other but with 24 pollutants, one for each release period as defined by the QCYCLE parameter. Make sure that a sufficient number of particles have been released to provide consistent results for all time periods.
The output can be processed like any other concentration file by selecting the specific pollutant (= release time) or by using one of two post-processing applications that are configured to decode the output file. The apply source concentration utility menu multiplies each concentration field with its associated emission rate defined in an external file. Another option is to use the convert to station menu to extract the time series of concentrations contributed by each release time to each sampling period for a specific pre-selected location. The TCM format checkbox should be selected to enable the proper conversion option.
Area Source Emissions
Normally emissions are assumed to be point or vertical line sources. Virtual point sources (initial source area >0) can be
http://www.ready.noaa.gov/HYSPLIT_pcchem.php
# Auto_traj.tcl -- run the test trajectory case for several starting
# locations; the settings below are illustrative values (the original
# fragment began at "set Output_numb 1")
set Start_time  "00 00 00 00"
set Run_hours   "12"
set Vert_coord  "0"
set Top_model   "10000.0"
set Meteo_path  "../metdata/"
set Meteo_file  "oct1618.BIN"
set Traj_path   "../exec"
set Output_path "./"
set Output_base "tdump"
set Output_numb 1
foreach {Start_lat Start_lon Start_hgt} \
        {40.0 -90.0 10.0  45.0 -90.0 10.0  50.0 -90.0 10.0} {
set Start_loc "$Start_lat $Start_lon $Start_hgt"
set Output_file "$Output_base$Output_numb"
set f [open Control w]
puts $f "$Start_time"
puts $f "1"
puts $f "$Start_loc"
puts $f "$Run_hours"
puts $f "$Vert_coord"
puts $f "$Top_model"
puts $f "1"
puts $f "$Meteo_path"
puts $f "$Meteo_file"
puts $f "$Output_path"
puts $f "$Output_file"
close $f
exec "$Traj_path/hyts_std.exe"
incr Output_numb}
In this particular example the test trajectory case is run for three different starting locations, each simulation writing a new endpoints file with a unique file name. The CONTROL file is recreated for each simulation. It would be trivial to rewrite the script to set the latitude-longitude and loop through a different number of starting days and hours. With Tcl/Tk installed, this script can be run under Windows or Unix. For instance, to compute new forecast trajectories each day, the process can be automated by including a data FTP at the beginning of the script to get the most recent meteorological forecast file, setting the starting time as "00 00 00 00" so that the trajectories will start at the beginning of the file, and finally calling the script once a day through the Unix crontab or the Windows scheduler commands.
One problem with automated operations is that it is possible to generate simultaneous multiple jobs which may interfere with each other. The executables hycs_std and hyts_std have a command line option of adding the process ID (PID), e.g. hyts_std [PID]. In this situation all standard named input and output files (those not defined in the CONTROL file) have the PID added as a suffix to the file name: e.g. Control.[PID], Setup.[PID], Message.[PID].
An example of another type of operational configuration is the extended simulation of a pollutant emission using archive data to bring the simulation to the current time and then using forecast meteorological data to provide a daily projection. Each day the archive simulation must be updated with the new archive data and a new forecast product generated. This process can also be automated through a script, but for illustration purposes one can use the advanced features of the GUI to configure such a case. Assume a one-hour duration accidental pollutant release that occurred 48
than the analysis. In practice, this procedure can be run at the same frequency that the new forecast data are available, typically 4 times per day. Data at the initial forecast hour are identical to the analysis data.
Source Attribution with Dispersion
A common application of atmospheric trajectory and dispersion models is to try to determine the source of a pollution
Although HYSPLIT does not contain the mass conservation adjustments and coupled vertical-mixing horizontal-transport features of STILT, setting this flag causes the HYSPLIT concentration output field units to be similar to the STILT configuration. Two changes are introduced: the mass summation is divided by air density, resulting in a mixing ratio output field (ICHEM=6), and the lowest concentration summation layer (concentration layer top depth) is permitted to vary with the mixed layer depth (ICHEM=9), so that only particles below 50% of the mixed layer depth at each time step are summed to the lowest layer. The ICHEM=8 switch turns on both density and varying layer depth. A text file of particle position information (PARTICLE.DAT) at each time step will also be created unless the namelist parameter OUTDT defining the output interval (min) is changed.
Concentration layer varies with mixed layer depth
This option changes the height of the lowest concentration layer from a fixed value to a variable one, which changes according to the mixed layer depth at each time step. The depth, as a fraction of the mixed layer, is computed from the height value of the lowest level, where the height is interpreted in hundredths, such that a fraction representing 0.5 of the mixed layer depth would be entered as 50. Note that in this mode it may be desired to change the minimum mixing depth KMIX0 from its default value of 250 m, which would result in a minimum sampling layer of 125 m. Concentration levels above the first level are unaffected by this option.
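The arithmetic behind that example can be worked through directly (the helper function is an illustration of the stated convention, not HYSPLIT code):

```python
# Height-as-fraction convention: a lowest-level height entered as 50 is
# read as 0.50 of the mixed layer depth, so with the default minimum
# mixing depth KMIX0 = 250 m the sampling layer is never shallower than
# 0.50 * 250 = 125 m.
def sampling_layer_depth(level_height, mixed_layer_depth, kmix0=250.0):
    fraction = level_height / 100.0
    return fraction * max(mixed_layer_depth, kmix0)

print(sampling_layer_depth(50, 1000.0))  # 500.0 m for a 1000 m mixed layer
print(sampling_layer_depth(50, 100.0))   # 125.0 m floor imposed by KMIX0
```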
95 10 16 06
1
40.0 -80.0 10.0
30
0
10000.0
1
./
oct1618.BIN
1
TEST
100.0
0.1
00 00 00 06 00
1
38.0 -76.0
0.10 0.10
15.0 25.0
./
cdump
1
100
00 00 00 00 00
00 00 00 00 00
00 06 00
1
0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0
0.0
0.0
&SETUP
tratio = 0.75,
initd = 0,
kpuff = 0,
khmax = 9999,
numpar = 5000,
qcycle = 0.0,
efile = '',
tkerd = 0.18,
tkern = 0.18,
ninit = 1,
ndump = 0,
ncycl = 0,
pinpf = 'PARINIT',
poutf = 'PARDUMP',
mgmin = 10,
kmsl = 0,
maxpar = 5000,
cpack = 1,
cmass = 0,
dxf = 1.0,
dyf = 1.0,
dzf = 0.01,
ichem = 0,
/