ClimsoftV4 User Manual Feb 2022
Revised by
Samuel Machua & Marcellin Habimana
February 2022
Contents
1. Introduction ................................................................................................................................... 5
2. Installation ..................................................................................................................................... 6
2.1 System Requirements...........................................................................................................6
2.2 Installation Procedures .........................................................................................................6
2.3 Creating Climsoft Database in MariaDB .............................................................................9
2.4 Climsoft First Login...........................................................................................................11
2.5 Updated Climsoft Login and Database Connection...........................................................13
2.5.1 Login Dialog ......................................................................................................................13
2.5.2 Database Connection Dialog..............................................................................................13
3. Climsoft Main Menu and Welcome Dialog ................................................................................ 14
3.1 Menu Bar ...........................................................................................................................14
3.2 Main Menu Icons ...............................................................................................................15
4. User Management and Configuration Settings............................................................................ 16
4.1 User Management ..............................................................................................................16
4.2 Configuring General Settings ............................................................................................17
5. Metadata Management ................................................................................................................ 19
5.1 Importing Station Metadata ...............................................................................................20
5.1.1 Station Metadata Import procedure ...................................................................................22
5.2 Other Metadata...................................................................................................................22
5.2.1 Elements Metadata .............................................................................................................22
5.2.2 Station Element ..................................................................................................................25
5.2.3 Instruments.........................................................................................................................25
5.2.4 Station Location History ....................................................................................................26
5.2.5 Station Qualifier .................................................................................................................26
5.2.6 Schedule Class ...................................................................................................................27
5.2.7 Physical Feature .................................................................................................................28
5.2.8 Paper Archive Definition ...................................................................................................28
6. Key Entry..................................................................................................................................... 29
6.1 Setting up Key Entry Forms List .......................................................................................29
6.2 Opening a Key Entry Form ................................................................................................29
6.3 Key Entry Forms Layout ...................................................................................................30
6.4 Key Entry Sequencer .........................................................................................................31
6.5 Configuring synoptic hours for key entry ..........................................................................33
6.6 Key Entry Operation ..........................................................................................................35
6.6.1 Some special features for specific key entry forms ...........................................................37
6.7 Emptying Key Entry Tables...............................................................................................42
6.8 Operational Monitoring .....................................................................................................43
6.8.1 Users Records ....................................................................................................................43
6.8.2 Settings...............................................................................................................................44
6.8.3 Performance Monitoring ....................................................................................................45
6.8.4 Double Key Entry Verification ..........................................................................................45
7. Post Key Entry Quality Control Checks ...................................................................................... 47
7.1 Updating the database from a QC report ...........................................................................49
7.2 Transferring Data from “observationinitial” to “Observationfinal” table .........................49
7.3 Update Observations ..........................................................................................................50
8. Data Transfer Operations ............................................................................................................ 53
8.1 Data from Climsoft V3 – Data migration ..........................................................................53
8.1.1 Importing from Climsoft Version 3 MySQL Database .....................................................53
8.1.2 Importing from Climsoft V3 Backup .................................................................................54
8.2 Importing CLICOM data ...................................................................................................54
8.3 Importing Automatic Weather Station (AWS) data...........................................................56
8.3.1 Selecting the import file and viewing the data...................................................................56
8.3.2 Configuring column headers and uploading data into Climsoft database .........................57
8.4 Importing text files data with different structures..............................................................58
8.4.1 Hourly and Daily Data Import ...........................................................................................58
8.4.2 Multiple Element Columns ................................................................................................61
8.5 Importing Summarized Data ..............................................................................................64
8.6 Importing Upper Air data...................................................................................................66
8.7 Upload failures ...................................................................................................................66
8.8 Importing GTS data downloaded from NOAA – NCEI ....................................................66
8.9 Selecting Station and Element on the Import Dialog.........................................................68
9. Data Backup and Restore ............................................................................................................ 69
9.1 Data Backup .......................................................................................................................69
9.1.1 Data backup by station .......................................................................................................69
9.1.2 Complete database backup .................................................................................................70
9.2 Data restore ........................................................................................................................72
9.2.1 Data restore by Station .......................................................................................................72
9.2.2 Complete Database Restore ...............................................................................................72
10. Climate Products ...................................................................................................................... 73
10.1 Standard Products ..............................................................................................................73
10.2 Data Products for Other Applications ................................................................................78
10.2.1 Data for CPT ......................................................................................................................78
10.2.2 Data for GEOCLIM ...........................................................................................................78
10.2.3 Data for Instat ....................................................................................................................79
10.2.4 Data for Rclimdex ..............................................................................................................79
10.3 Windrose Plot.....................................................................................................................80
10.4 Plotting Charts ...................................................................................................................81
10.5 Inventory Products .............................................................................................................83
10.6 Products from Upper Air Observations .............................................................................86
10.7 CLIMAT Messages............................................................................................................87
11. AWS Real Time Data Operations ............................................................................................ 90
11.1 Server Settings ...................................................................................................................90
11.2 Sites Settings ......................................................................................................................92
11.3 Data Structure Settings ......................................................................................................93
11.4 Updating existing structures ..............................................................................................94
11.5 Creating a new structure ....................................................................................................95
11.6 Coding Settings ..................................................................................................................97
11.7 Data Acquisition and Processing Dialog ...........................................................................98
12. Paper Image Archiving .......................................................................................................... 100
12.1 Archiving Images .............................................................................................................101
12.2 Archiving Images with Standard Filename Structure ......................................................101
12.3 Archiving Images without Standard Filename Structure .................................................102
12.4 Retrieving and Viewing Archived Images .......................................................................103
12.5 Viewing a List of Archived Image Files ..........................................................................104
13. TDCF Synoptic data Encoding Operations............................................................................ 105
13.1 Form for Synoptic data key entry and encoding ..............................................................105
13.2 TDCF Settings .................................................................................................................107
13.3 Climsoft TDCF Encoding in BUFR ................................................................................108
14. ACKNOWLEDGEMENTS ................................................................................................... 109
1. Introduction
What is Climsoft?
The Climsoft system is a suite of software specifically designed for storing climatic data
in a secure and flexible manner, and for obtaining useful information from these data. Climsoft
version 4 was developed using Microsoft Visual Basic .NET. The default database storage
engine is the open-source MariaDB, a fork of MySQL.
The Climsoft system comprises:
• The database, which holds climatic data for multiple stations in a logical and flexible
structure;
• A Key-entry component, to allow users to add new data to the database in a secure and
controlled manner;
• Facilities for importing climatic data in various formats into the database, e.g. data from
Automatic Weather Stations (AWS), CLICOM ASCII, NOAA GTS and text files;
• Facilities for exporting climatic data in formats ready for use by many climate applications
(e.g. RCLIMDEX, CPT, GEOCLIM, INSTAT, R-INSTAT, ENACTS, etc.);
• Management of images of paper records (paper archives) in support of Data Rescue
(DARE);
• Quality Control checking of data that has been loaded into the database;
• A number of Products, application programs which use subsets of the data stored in the
database to produce useful reports, summaries and diagrams; and
• System management facilities for managing user access, operations monitoring,
administering the database and for tailoring the functions of the system to local needs.
2. Installation
2.1 System Requirements
Hardware:
The Climsoft system requires the following typical hardware specifications:
Computer with processor speed of 1 GHz, 2 GB RAM and enough disk space for installation of
application software and storage of database. A minimum of 20 GB disk space is recommended.
More disk space will be required for storage of AWS data and images of paper records for Data
Rescue. However, it would be ideal to store images of paper records on removable storage
media like external hard drives or Network Attached Storage (NAS).
Internet connection is also desirable.
Software:
• Microsoft Windows 7 or later, 64-bit Operating System (OS) is recommended for optimum
performance of the database. Climsoft Version 4 is compatible with the following Windows
Operating Systems: XP, Vista, Windows 7, Windows 8.x and Windows 10;
• Microsoft Office with Excel spreadsheet is required since Excel spreadsheet application is
the default application for displaying data products from Climsoft;
• Antivirus software;
• A minimum of .NET Framework 4.5 is required. This version of the .NET Framework is packaged
with the Climsoft setup package;
• The Climsoft installation setup, which can be downloaded from the met-eLearning site. Registration
may be required;
• MariaDB/MySQL database system and Climsoft database scripts which are also supplied
through Climsoft installer and will be copied into the subfolder called "dbase" in the
installation path;
• WRPlot software for Windrose plotting which will be copied into subfolder called "Bin" in the
installation path.
3. When the core Climsoft setup has been completed, a dialog with options for installing
additional applications required by Climsoft will appear, as shown in Figure 1(a) below:
Figure 1(a): Installation of additional applications
You may uncheck the selected applications if they are already installed or if you wish to
install them after the Climsoft installation. If you choose to install them later, their installers
can be found in the "bin" folder of the installation path.
If you keep the default selections, the applications listed in the above dialog will be installed
automatically in the order in which they are listed. However, if the version of the
Microsoft .NET Framework on your computer is too old and needs to be upgraded, the system
will prompt you to do the upgrade; below is an example of a typical error message you may get:
If you get a warning showing that a newer version of one of these applications is already installed,
click the Close button to continue with the installation of the next application. Figure 1(c) displays
the dialog with a warning that a newer version of the Microsoft Visual C++ 2010 Redistributable has
been detected on your machine; in this case, click the Close button to continue with the
installation of the next application.
Figure 1(c): Warning about newer version of Microsoft visual C++ 2010
When you get to the installation of MariaDB and reach the point of specifying the port
number for running the MariaDB service, it is strongly recommended that you choose
port 3308. This is the port number defined in the Climsoft configuration file for connecting to
the MariaDB database that comes with the Climsoft setup. If you choose a different number, you
will need to reconfigure the database connection so that the port number matches the one you
configured during the MariaDB setup.
It is also important to note that the password that you define during the installation and
configuration of MariaDB will be required when you login to Climsoft for the first time.
NOTE: If prompted to restart the computer before all selected installations are complete,
please choose "restart later" and then restart the computer when all installations are
complete.
If the installation of WRPlot appears to take too long to complete, check for a second installation
dialog that may be minimized on the bottom of the task bar requiring your input. If that dialog is
there, open it and respond as appropriate.
Although WRPlot is freeware, you need a license key to activate this application. An initial
license key, which is valid for one year, is contained in the file "WRPLot_license.txt" in the sub
folder "Bin" in the Climsoft installation folder. To renew the license you will need to visit the
"Lakes Environmental" website at http://www.weblakes.com/. This URL is also provided on
the "About" box of the WRPlot application.
If you need to install, or re-install MariaDB or WRPLot later, the installation files for these
applications are contained in the "Bin" sub folder of the Climsoft installation folder.
Figure 3: Entering the password on the MariaDB Client Prompt
Note that the user name is assumed to be “root”. (There is no provision for entering the
username if you have launched the MariaDB Client Prompt as described above.)
If you have entered the correct password i.e. the same password that you entered during the
installation and configuration of MariaDB, the prompt then appears as shown in Figure 4.
(ii) Assuming the script file for creating the Climsoft version 4 (Climsoftv4) database is in the
path "C:\Program Files (x86)\ClimsoftV4\Dbase":
If you have typed the command correctly, the script will create the Climsoft Version 4
databases. Two (2) databases are created: the operational database
(mariadb_climsoft_db_v4) and the test database, which contains data for testing purposes
(mariadb_climsoft_test_db_v4).
(iii) To check or verify that the Climsoft Version 4 databases have been created in the MariaDB
database engine, type the "Show databases;" command in the MariaDB Client Prompt and press
the Enter key. The two Climsoft databases will then be listed together with the other database
schemas that come with MariaDB (see the example after this list);
(iv) If you are sure you have typed the correct password but have been denied access, click on
the blue label "Manage database connections" on the above dialog to check the database
connection details and adjust them accordingly. A dialog similar to Figure 7 then appears.
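For reference, the commands typed at the MariaDB Client Prompt in steps (ii) and (iii) above typically look like the following. This is only a sketch: the script filename shown here is an example, so check the actual .sql filename in the "Dbase" subfolder of your installation before running it.

    SOURCE C:/Program Files (x86)/ClimsoftV4/Dbase/climsoft_db_v4.sql
    SHOW DATABASES;

The first command runs the supplied script to create the two Climsoft databases; the second lists all databases so you can confirm that they were created.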
Figure 7: Database connection details
If the MariaDB database to which you want to connect is on your local computer then the server
must be either “localhost” or “127.0.0.1”. If the port number you chose during the installation of
MariaDB is different from “3308” as shown on the screenshot above, you will need to change
the port number to the one you used during the installation of MariaDB and click Ok. More details
on how to manage database connections are given in paragraph 2.5.2 below.
After successful login, the Climsoft front-end will appear as shown (Figure 8). From the front-end,
you can access all the components of the Climsoft Climate Data Management System
(CDMS), either through the top menu or by clicking the icons on the Welcome screen.
However, a user will be able to access only the features corresponding to the privileges
associated with the user group to which they belong.
2.5 Updated Climsoft Login and Database Connection
Note that the Climsoft V4 installation comes with two databases: one is the operational database
while the other is the test database. The test database is populated with observational data and
metadata obtained from different NMHSs. In order to use the test database, it must be listed in
the Database Connections as shown in Figure 7a and be selected at the Login dialog through
the Database list box.
3. Climsoft Main Menu and Welcome Dialog
After a successful login, the main interface for Climsoft Version 4 is displayed (see Figure 8). It is
referred to as the Welcome Dialog and consists of the Main Menu and icons through which all
the functionalities are accessed.
4. User Management and Configuration Settings
4.1 User Management
The default user of the Climsoft CDMS is “root”. This is the super user of the system with access
to all the features of the Climsoft CDMS. To allow other users to make use of the system, the
“root” user must add new users and assign them to different user groups. The different user
groups that have been pre-defined for Climsoft are shown in Table 1:
User Group Summarized access rights
ClimsoftAdmin All Climsoft features including creating and deleting other users
ClimsoftDeveloper Modification of all Climsoft features
ClimsoftMetadata Updating of metadata e.g. station information, instruments, etc.
ClimsoftOperator Key-entry of observation data
ClimsoftOperatorSupervisor Key-entry plus uploading key-entry data to the observationinitial table
ClimsoftProducts Extracting products from Climsoft
ClimsoftQC Key-entry, upload to observationinitial and Quality Control
ClimsoftRainfall Key-entry of only rainfall data
ClimsoftTranslator Translation of text on controls and messages
Table 1: CLIMSOFT user roles
The “root” user or any other user with administrative privileges is able to access the user
management dialog by clicking the “User Management” icon on the Welcome screen or via the
“Administration” menu. The dialog for User Management is shown in Figure 9.
The user management dialog is self-explanatory. The “Add New” button is enabled after filling
all the text boxes and picking the user role from the dropdown list.
Apart from creating new users, the Climsoft administrator can change the password and also
the User Role of any user. To change the password for the selected user, enter and confirm the
new password in the Password and Confirm Password boxes respectively, then click the
Update button. The user role is changed by selecting the new one from the User Role list box
and then clicking the Update and Refresh Privileges buttons, in that order.
Any user can change his/her own password via the menu item “Change own password” on the
Welcome form.
The root user is managed through the database engine tools but not through the User
Management dialog.
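For example, the root password can be changed from the MariaDB Client Prompt with a statement such as the one below. This is a sketch for MariaDB 10.2 or later; the host ('localhost') and the new password are placeholders to adapt to your own setup.

    ALTER USER 'root'@'localhost' IDENTIFIED BY 'new_password';

Remember that this password is the one Climsoft will ask for at login when connecting as root.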
4.2 Configuring General Settings
The Administrator must navigate through all the records in the dialog and change the contents
of the text box to the right of the label "Setting Value" to suit the required local configuration.
After changing any record, click the Update button before moving to the next.
Alternatively, by clicking the “View” button, the settings can be viewed and edited in datasheet
(table) view as shown in Figure 10(b).
Figure 10 (b): General settings in datasheet view mode
You may increase the column widths in the datasheet to see more details by dragging the
column boundaries to the right. With administrative privileges the records may be updated but
not deleted or added. The buttons for these operations are enabled after clicking the Edit Mode
button. Values in the column "KeyName" should not be changed because they are used by
Climsoft to reference the selected record.
Through the Export button the displayed records can be exported into a text (csv) file and saved
as backup data. The Import button does the reverse.
5. Metadata Management
Climsoft uses metadata to describe the data archived in its database. It is therefore important to
manage the metadata well and ensure all metadata is in place before the data it describes is
loaded into the Climsoft database. The dialog shown in Figure 11 below facilitates management
of all metadata in the Climsoft database. Each tab opens a dialog for the management of one
metadata type.
• Latitude and Longitude MUST be entered in decimal degrees. Where geographical
coordinates are not available in decimal values but in degrees, minutes and seconds,
they may be entered in the boxes provided for conversion. They will be computed and
filled into the appropriate boxes once the N/S or E/W is selected. For example,
36° 49' 30" East corresponds to 36 + 49/60 + 30/3600 = 36.825 decimal degrees.
• Command Buttons:
o AddNew – Adds a new metadata record after details have been typed in. The
Station Id is mandatory and must be unique in all records;
o Save – Adds and saves a new station record, whose details have been typed, in
the database;
o Update – Updates the changes made on the selected record;
o Delete – Removes the selected and displayed record from the database;
o View – Displays the metadata records in tabular form. The Export button will export
the metadata file in csv format;
o Import – Opens a dialog through which station metadata in a text (csv) file can be
imported into Climsoft. By clicking the Import command button, the dialog in
Figure 12 below appears;
The metadata import file can be created using Excel and then saved as Comma Separated Values
(CSV) with column headers. In Excel the structure is as shown in Figure 13;
Figure 13: structure of station metadata file in Excel
After saving in a text (csv) format the data appears as shown in Figure 14.
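Since the exact columns depend on the station metadata fields in your database, the following is only an illustrative sketch of what such a CSV file might look like; the header names and the sample station below are assumptions, so use the headers shown in Figure 13 (or an export of your own station metadata) as the authoritative reference:

    stationId,stationName,latitude,longitude,elevation,country,authority
    12345,Sample Station One,-1.9500,30.0500,1490,CountryName,National Met Service

Each row after the header describes one station, with latitude and longitude already converted to decimal degrees as Climsoft requires.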
5.1.1 Station Metadata Import procedure
• From the Metadata Import dialog (Figure 12), locate and open the text file
containing the station metadata;
• The column headers in the text file are listed in the Import Field Name column of the
dialog;
• For each field in the list, click twice (not double-click) on the Select Field cell to select the
corresponding field name from the database;
• Click the Import command. The metadata will then be imported, and any station that fails
to be imported will be listed in the Errors message box together with the cause of the failure.
A summary of the import will be listed below that box.
• Scroll through the error messages, correct the text file and repeat the exercise. Duplicate
error messages may be ignored since they indicate that the record already exists in the
database.
• Click Close to exit the dialog.
Figure 15: Elements dialog
The dialog in Figure 15 is used as follows:
(i) Adding a new element record:
• Click AddNew to obtain a blank form;
• Enter metadata values for the element;
o Data for ID is mandatory and must be unique in all records,
o Scale is the factor used in key entry for the decimal point; therefore the
Upper Limit and Lower Limit values should take the scale into account,
o Type is the frequency of observation e.g. minute, hourly, daily, AWS.
• Click Save to add the new record to the database.
(ii) Updating (editing) an existing element record:
• Use either the Search Element box or the ID to locate the required element record,
• Edit the desired values,
• Click Update to save the changes.
(iii) Deleting elements:
• Use either the Search Element box or the ID to locate the required element record,
• Click Delete.
Alternatively, move to Table View (Figure 16) and use the command buttons as described
below;
Figure 16: Elements in table view
5.2.2 Station Element
The details of the elements observed at a particular station are entered here.
This dialog enables the editing and updating of element information as recorded at a particular
station. Before adding a station element record in this dialog, the stationId, elementId,
instrumentId and scheduleclass that describe this element should first be entered in their
respective tables through the provided dialogs (under the Tabs), in that order. The value for
Instrument Type is a code that can be obtained from the WMO Code and Flag tables manual.
Not all instruments have codes; in that case the box is left blank. Some of the instrument
types described in the Code and Flag tables are for the measurement of evaporation and wind.
5.2.3 Instruments
This dialog enables the system administrator to specify the characteristics of a particular
instrument used to record an element at a given station.
Figure 18: Instrument details
5.2.7 Physical Feature
The detailed physical features of the station's environment are shown in Figure 22.
6. Key Entry
Climsoft comes with a list of key-entry forms so that a user can choose the form that suits the set of
data to be entered. To start, the Climsoft administrator selects only those forms required by the
key entry operators. The following paragraphs describe how the key entry administration and
operations are undertaken.
Figure 25: Key Entry forms opening
• Double-click the required form, or select the form and press the Enter key, to open it;
• Clicking OK also opens the selected form;
• Select Cancel to close the dialog.
Figure 26: Key Entry form for hourly data
For hourly data, the required sequencer configuration is done on the appropriate dialog, which
is accessed via the top menu on the Climsoft front-end by clicking Tools >> Sequencer
configuration >> Hourly. Figure 27 shows a screenshot of the dialog for configuring the
sequencer for hourly data. The dialog is intuitive, with some guidelines in the top part of the form.
Only the element_code values are typed in; the seq values are filled in by Climsoft.
Figure 27: Dialog for configuring the sequencer for hourly data
The dialog for configuring the sequencer for daily data is accessed via Tools >> Sequencer
configuration >> Daily. A screenshot of the dialog is shown in Figure 28.
Similarly, the dialog for configuring the sequencer for monthly data is accessed via Tools >>
Sequencer configuration >> Monthly. A screenshot of the dialog is shown in Figure 29.
Figure 30: Hourly form enabled just for synoptic hours
When the button "Enable synoptic hours only" has been clicked, its label immediately changes
to "Enable all hours", so it allows toggling between "all hours" and "synoptic hours". The dialog
for configuring the specific synoptic hours to be enabled is accessed via Tools >>
FormHourly time selection. A screenshot of the dialog is shown in Figure 31. From this dialog
any other set of hours can be selected for data entry. For instance, NMHSs that operate for
12 hours only, say from 06 to 18, may wish to enable those hours only for data entry. Always
remember to click the button "Enable synoptic hours only" if not all hours (i.e. 0–23) are used.
Figure 31: Configuring synoptic hours to be enabled on hourly form
The use of the key entry forms is quite intuitive, so there is no need for detailed guidelines except
to highlight the common aspects that apply to all key-entry forms, and some features related to
specific forms.
• After selecting a value from a dropdown list e.g. a station name, or entering a value in a
text box, the Enter key must be pressed for the value to be registered on the form and
for the cursor to move to the next cell.
• Some key entry forms have a provision for entering the "Total" for elements that require
the sum of values for a particular column of observation data. The Climsoft administrator
must configure the metadata in the element table to enforce the requirement that a total
be entered during key entry for the elements that need one.
• Some buttons are enabled or disabled depending on the situation. For example, the
“Save” button is enabled only after clicking the “Add New” button to enter a new record.
Then the same “Add New” button is automatically disabled until the “Save” button has
been clicked to save data for the newly added record.
• When the cursor leaves a particular text box or dropdown list after pressing the “Enter”
key, quality control (QC) checks are carried out on the value that has just been entered.
Figure 32 shows a QC message after entering a value that has failed the QC check for
upper limits when entering daily data.
Key entry form for hourly wind speed and direction
Key entry form for Synoptic observations
Figure 35: Key entry form for many elements observed at the same time
Key entry form for Upper Air observations
The form for entering upper air observations is made up of many elements at the standard
pressure levels namely Surface, 1000, 925, 850, 700, 500, 400, 300, 250, 200,150,100, 70, 50, 30,
20,10 shown in Figure 35a. The key entry is sequenced in the order of Levels, Day, Month
and Year. At the metadata configuration the observation elements for the upper air data
should be configured as detailed in Table 2 below. However, the QC limits and units should
be set according to the observation practice at the given centre.
Figure 35a: Key entry form for upper air observations at standard levels
ID   Abbreviation   Element Name   Description   Type
309 LATHDISP Lat displ at pressure lvl Latitude Displacement at Pressure Level Upper Air
310 LONGHDISP Lon Displ at press lvl Longitude Displacement at Pressure Level Upper Air
311 ACCTMDISP Time Displ at press lvl Time Displacement at Pressure Level Upper Air
312 ACCVERSIG Vert Signf at press lvl Vertical Significance at Pressure Level Upper Air
313 RELHUM RH at Press Lvl Relative Humidity at Pressure Level Upper Air
Table 2: Upper air observation element details – to be configured in Elements Metadata
6.8 Operational Monitoring
During the key entry operation, details of the operator are captured, namely the login name and
the entry date/time. These details can be used to monitor data entered by any operator. The
monitoring operation is started through "Administration" -> "Operations Monitoring" to obtain
a dialog similar to Figure 36. The tabs on the dialog describe the different processes
undertaken in the monitoring operation. They are:
(i) User Records: See Figure 37. The Administrator can monitor records for a single
operator or for all operators combined, depending on the option selected. Any operator
can monitor their own records through Accessories -> User Records; in this case only the
records for the current operator will be displayed. The Start Date and End Date should
be appropriately selected, then click View Records.
(ii) Settings: The other two operations, namely Performance Monitoring and Double Key Entry
Verification, require some prior settings before they are undertaken. This is done through
the tab "Settings" as described in section 6.8.2.
6.8.2 Settings
To undertake any of the operations in this dialog, select the option and click View to obtain a grid
table from where the values are set as follows:
(i) Performance target:
Enter the target value for each operator against which the performance for the selected period
will be measured, then click Update.
(ii) Data Entry mode:
When the data entry mode is set to double key entry (verification), key entry will not accept any
data that has not been previously entered. To enter new data the key entry mode has to be set
back to 0 (single data entry).
Figure 42: Double key entry verification
7. Post Key Entry Quality Control Checks
Climsoft provides a facility for further Quality Control (QC) checks on data that has been
uploaded from key-entry tables or imported from external sources into the “observationinitial”
table. The dialog for these QC checks, shown in Figure 43, is accessed by clicking the icon
“Quality Control Checks” on the Welcome screen.
NOTE:
The QC reports are saved in the folder configured for the QC reports under “General Settings”.
Hence make sure that the folder for QC reports has been configured correctly.
The name of the QC report file generated is based on the type of QC selected and the specified
period for the QC.
For the inter-element QC checks, the codes of the elements that have failed the QC checks will
also be included in the file name e.g. “qc_interelement_101_3_200101_200312.csv” indicates
a report for inter-element checks between dry bulb temperature and minimum temperature from
January 2001 to December 2003.
A suffix “Updated” is added manually by the QC officer or administrator to show that the values
in the QC report have been modified. Figure 45 shows the contents of a QC report for limits
checks displayed in an Excel spreadsheet.
The report in that screenshot (Figure 45) shows two values of maximum temperature, 6.6 and
6.4, that are below the lower limit of 12.0 for maximum temperature. The other useful information
displayed is the station ID, element ID, date/time, the username of the person who entered the
data and the key-entry form used for entering the data.
Figure 46 shows a QC report for inter-element comparison. The report shows maximum
temperatures (element code 2) that are below dry bulb temperatures (element code 101) for the same day.
Figure 48: Upload data to observationfinal table
Data for the selected Stations and Elements for the period specified in the time range (Begin
Year, End Year, Begin Month and End Month) will be transferred into the “observationfinal” table.
The dialog is self-explanatory on how to select data to upload.
Click the “Upload” button to start the data transfer. The progress of data transfer will be shown
in the progress box.
Data that had been previously uploaded can be overwritten by checking the box "Update
existing records"; otherwise, duplicate data will be ignored if uploaded.
Only data that have gone through QC are uploaded to the "observationfinal" table. Observation
records that have not yet been QC'd have a QC status flag of zero ("0"), records that have been
QC'd have a QC status flag of one ("1"), and records that have been changed through a QC update
have a QC status flag of two ("2"). Therefore, data with QC status flag "0" will not be uploaded to
"observationfinal".
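As a quick check of how much data is still awaiting QC, a query such as the following can be run at the MariaDB Client Prompt. This is a sketch that assumes the QC flag column in "observationinitial" is named qcStatus, as in the default Climsoft version 4 schema; verify the actual column name in your own database:

    USE mariadb_climsoft_db_v4;
    SELECT qcStatus, COUNT(*) AS records
    FROM observationinitial
    GROUP BY qcStatus;

Records counted under qcStatus 0 have not yet been quality controlled and will therefore not be transferred to "observationfinal".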
Figure 49a: Update Observations Dialog – Select records
Click the Delete button to remove all the selected and displayed records from the selected table,
i.e. observationinitial or observationfinal. A delete confirmation message will be displayed to
ensure that the Delete button was not clicked by mistake. If the "Yes" button on the message box
is clicked, the record deletion will be completed without any further warning. Clicking "No" will
cancel the delete action.
(iii) Export records
All the selected and displayed records will be exported into a text (csv) file that will be saved
with a location and filename selected by the Climsoft user.
Select the "Close" button to close the entire dialog and the "Help" button to get more help on the
"Update Observations" procedures.
8. Data Transfer Operations
Data transfer involves the operations described below:
(i) Upload: Transfers quality-controlled data from the table "observationinitial" to
"observationfinal". More details are given in Section 7.2;
(ii) External Data: Imports text data files in formats such as CLICOM, Automatic Weather
Station (AWS), NOAA GTS and any other delimited text files. Backup files from earlier
Climsoft versions can also be imported through a process called "Data migration".
More details are given in Sections 8.1 through 8.5;
(iii) Backup: Makes a backup of all data in the table "observationfinal" to text files. More details in
Section 9.1;
(iv) Restore: Restores Climsoft version 4 backup files back to the database. More details in
Section 9.2.
Figure 52: Importing CLICOM daily data
To import CLICOM daily data click “Open File” then select the file containing the data and click
“Open” on the file open dialog. Enter the hour for the daily data observation in the text box
“Default Observation hour”. 06Z has been set as the default. Then from the import dialog click
“View Data” and the grid table gets populated with the data. Confirm that it’s the right data then
click “Load Data”. Wait for the process shown in Figure 53 to complete with a message “Data
import process completed”. CLICOM Synop and CLICOM Hourly data are similarly imported.
8.3 Importing Automatic Weather Station (AWS) data
Before importing AWS data, it is important to note the structure of the AWS data file. Many
Automatic Weather Stations (AWS) are configured to produce data output files in ASCII format.
However, AWS from different suppliers tend to have different observation data outputs
depending on the sensors available and configuration of the output file. Figure 54 shows a
screenshot of sample data from an AWS site installed in Rwanda. The first row contains the field
(column) headings representing different meteorological elements. In most cases, not all fields
are required for data ingestion into Climsoft database.
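As an illustration only (the actual headings and their order depend on the AWS supplier, the sensors installed and the logger configuration), an AWS output file of the kind described above might begin like this; all field names and values below are assumed for the example:

    Timestamp,StationID,AirTemp,RelHumidity,Rainfall,WindSpeed,WindDir
    2020-08-29 06:00,000001,17.2,88,0.0,2.1,140
    2020-08-29 06:15,000001,17.5,87,0.0,1.8,150

During the import described in Section 8.3.2, each of these columns is mapped to a Climsoft field name (or to "NA" if it is not required).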
(ix) If all the data belongs to one station and the file does not have a column with station
IDs, then type the ID for the station in the "Station ID" text box;
(x) After completing the above steps, click "View Data" to populate the grid view with the
selected data. Note that only the first 25 rows of data from the selected file will be listed
in the grid table. This is meant to enable confirmation that the right data has been
selected. However, the entire file will be imported if everything is correctly selected.
Note that the column headers in the grid table will be serialized with numbers from 1 up to
the last column of the data. The following section outlines how to rename the column
headers with the element codes for the observations they represent.
8.3.2 Configuring column headers and uploading data into Climsoft database
(i) Under "Columns Header Settings", select the column number and the field name that
describes the data in that column. Do that for each column until all of them are done. Make sure
the correct field names are selected in order for the data to be appropriately mapped into
the database. If any column contains data that will not be imported, then "NA" should be
selected for the field name;
(ii) If step (i) above is correctly completed, the configured header specifications can
be saved in a text file to be used later for a similar file without going through that process
again. To save the specifications, click the "Save Header Specs" button and choose a
convenient location and filename. Note that the file will automatically be given the file
extension .sch, e.g. aws1.sch;
(iii) If the header specs had been saved earlier, click "Load Header Specs" and select the
header specs file. If correctly selected, the columns in the displayed data table will be
automatically labelled with the header specs.
If satisfied that the columns are correctly named, click "Load Data" to start the uploading
process. Wait until the busy mouse pointer changes back to the default. A message will be displayed
at the bottom of the dialog in red showing the data transfer progress.
It should be noted that the station IDs and element codes used must be described in the metadata. If
not, the data described by them will not be imported. Station IDs and element codes not found in
the metadata will be listed at the bottom of the dialog.
Figure 57: Sample file with observations in one column labeled G
Figure 58: Sample file with observations in many columns starting at column E
Follow the steps below to transfer hourly or daily data with a single-column or multiple-column
structure into Climsoft version 4:
(i) Select the "Data transfer" icon from the Welcome main menu dialog,
(ii) Select "External Data", then select "Text Files" and then the "Hourly or Daily" option
(Figure 59),
Figure 59: Import External Data Dialog
(iii) After selecting "Hourly or Daily", the dialog in Figure 60 opens. Click on "Open File"
to browse and select the file containing the data to transfer, and specify the "delimiter" of the file
to import, e.g. comma (default), TAB or Others (type in the delimiter character).
Check/uncheck the "remove scaling" box appropriately to ensure the data will be imported
into the "observationinitial" table without the decimal points. Click the "View Data" button to view
the data in a grid view (tabular form),
(iv) Set the column headers by matching each data column with its corresponding field name.
When this process is completed you will have the dialog shown in Figure 61,
Figure 61: Columns Header Settings
(v) Click the "Load Data" button to load the data into the "observationinitial" table. Then
observe the record upload count and finally the process completion message at the
bottom of the dialog. In case the upload fails, a message will pop up accordingly.
Note: To avoid setting the column headers each time a similar file is to be imported, save the
column header specifications by selecting the "Save Header Specs" button. Note the file
extension (e.g. header1_specs.sch).
To retrieve and use the saved column headers, select the "Load Header Specs" button and
navigate to select the appropriate file.
Figure 62: Sample data with multiple columns
Figure 63: Multiple column data Import
Figure 64: Multiple column data Import with column headers specification
Figure 64a: Dekadal – Multiple Element columns
Figure 64b: Dekadal – Single Element column
Figure 64c: Monthly – Multiple Element columns
Figure 64d: Monthly – Single Element column
The import procedures will differ with the layout of the summarized data as follows:
(iv) Data in multiple Element columns – Figures 64a and 64c
From the Welcome dialog follow:
Data Transfer -> External Data -> Text File -> Multiple Element Columns. Then check the option
Dekadal or Monthly for the layout of Figure 64a or 64c respectively.
8.6 Importing Upper Air data
The importing procedure for upper air data that has been digitized through other applications and
archived in delimited text files is similar to that for files with surface observation data. The
appropriate menus will be used depending on the layout of the data in the files; the suitable menus
are those for daily data or for multiple element columns. The only additional step is to indicate that
the file contains upper air data by checking the box "Upper Air Data" and entering the date and time
the balloon was launched in the format given. See Figure 64e below.
Figure 64e: Multiple element columns data Import for upper air data file
8.8 Importing GTS data downloaded from NOAA – NCEI
The GTS data from NOAA-NCEI is publicly available in text (csv) files. Climsoft is able to import
these files through the external data import dialog. Figure 65 below shows a screenshot of sample
data downloaded from the NOAA-NCEI site:
Figure 66: Importing data from a text file downloaded from NOAA-NCEI site
Notes on NOAA-NCEI data:
b) Since the GTS data would have gone through QC at NOAA-NCEI, the data is transferred
to the "observationfinal" table.
c) The required data can be downloaded from the site:
https://www.ncei.noaa.gov/access/search/dataset-search
9. Data Backup and Restore
Data backup and restore are essential operations in data management.
The backup dialog shown in Figure 68 is accessed by clicking the “Data Transfer” icon on the
Welcome screen, followed by clicking the menu item “Backup”.
Browse for the backup folder, then click the "Backup" button for the backup process to start.
The progress of the backup process is displayed.
Figure 70: HeidiSQL interface showing database information
To carry out a complete backup of a required database, right-click on the database name in the
left panel of the dialog in Figure 70.
A pop-up menu appears; click on the menu item "Export database as SQL". This displays a new
dialog in which to specify details for the backup, as shown in Figure 71.
To create a new database structure with all the tables and data, check all the boxes on the right
panel and select "INSERT" on the "Data" dropdown list.
Click the "Export" button; the backup process then starts and saves the backup file with the
specified name (and the extension .sql) in the specified location.
To restore the database (script file), click on "File" then "Load SQL file…". Select the backup
script file, click "Open" on the Windows Explorer dialog and follow the subsequent dialogs.
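If you prefer the command line to HeidiSQL, an equivalent complete backup and restore can be done with the mysqldump and mysql client programs that ship with MariaDB. The commands below are a sketch using the default Climsoft database name and port 3308 described earlier; adjust the user, port, database name and file path to your own installation:

    mysqldump -u root -p --port=3308 mariadb_climsoft_db_v4 > climsoft_backup.sql
    mysql -u root -p --port=3308 mariadb_climsoft_db_v4 < climsoft_backup.sql

The first command writes the whole database (structure and data) to the .sql file; the second restores it into an existing database of the same name.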
10. Climate Products
10.1 Standard Products
Climsoft version 4 provides a number of standard products which produce commonly required
reports or diagrams using data contained in the “observationfinal” table. Products are obtained
as follows.
From the Welcome Dialog, click on the icon Climate Products or select Products on the main
menu. The resulting dialog is shown in Figure 73.
The categories of the products are listed after clicking on the list box. When a product category
is selected, all products under that category are listed.
For instance, if the category Data is selected, the product list under that category will be as shown
in Figure 74;
Figure 74: Data Product list
Once the product type is selected, a dialog appears from where details of the desired product
are specified. For instance if Monthly product is selected the dialog obtained is as shown in
Figure 75.
The station(s) and element(s) are selected from the respective list boxes, one at a time.
Deletion of undesired selections can be carried out for one item or the entire list using the
commands provided at the bottom of each box.
The relevant Summary Type and Period are then selected. The command "Start Extraction"
is used to produce the selected product.
The Period specifies the time range for the selected product. The default period is from 01/01/2000
to the current date.
When the "Transpose Values" box is checked, the products will be displayed in a horizontal layout
where date components (days, dekads or months) become the column headers for the data
values.
When the "Advanced Products Select" box is checked, a set of stations under a specified group
is selected and listed on the Stations panel. All those stations will then be included in the
extracted product. The station groups are those that belong to the same Type (qualifier),
Authority, Admin region, Drainage basin or a specified geographical location.
Figures 76(a), 76(b), 76(c) and 76(d) below give examples of data product outputs;
Monthly TMax and TMin
Extreme Values
Extreme Maximum Values
10.2 Data Products for Other Applications
10.2.3 Data for Instat
10.3 Windrose Plot
A windrose is plotted after selecting "Graphics" from the list of climate product categories shown
in Figure 73 above.
The next step is to choose "Windrose" from the list, followed by the station. It is recommended
that only one station is selected for each windrose plot. The system selects the wind speed and
direction elements automatically. Specify the period range for the data to be used for the
windrose plot.
Click "Start Extraction" to generate the windrose plot as shown in Figure 81.
Before the windrose picture is generated, a message pops up with the question, "Is the wind data
from AWS?" This is because wind data from an AWS has a different element code from that of a
manned station.
Climsoft uses a third-party application called WRPlot to produce the windrose picture. This
application is free software but requires a licence that must be renewed annually.
It is important to check that WRPlot was installed as part of the Climsoft installation and that the
licence for the application was registered. If WRPlot has not yet been installed, it can be installed
using the setup file and the first licence file available in the "Bin" subfolder of the Climsoft
installation folder.
To renew the licence, visit the Lakes Environmental website at
https://www.weblakes.com/products/wrplot/registration.html. Once you register your details and
apply for the activation code, it will be sent to you by email shortly.
10.4 Plotting Charts
Simple and basic charts can be plotted through the charts menu. There are two types of charts
that can be produced, namely Time Series and Histograms.
The charts are produced through the same procedures used for other products. It is
recommended that each chart be plotted for data from one station only.
After extracting the data required for the chart, a plotting dialog will be displayed from where the
chart options can be provided.
For example, to plot a histogram chart for monthly temperatures (Tmax and Tmin) from Wilson
Airport, Kenya for the year 1988, do the following:
(i) Check the option Monthly,
(ii) Click the button Plot to obtain the chart below,
(iii) Type in the labels appropriately.
Similarly a Histogram chart for daily rainfall from Wilson Airport, Kenya for the period 2005-
2014 can be plotted as below.
10.5 Inventory Products
The inventory product retrieves details of the climatic elements that have been recorded at a
particular station. When you click on the menu item Products -> Inventory or the icon
Climate Products -> Inventory on the Welcome dialog, the following dialog is displayed:
There are 5 types of outputs that can be obtained from the Inventory product. They are;
(i) Details of Data Records - Gives a tabulated output where a missing observation is
represented by M and available observation by X. The procedure searches the database
for the appropriate information and produces an Excel Spreadsheet as shown in Figure 87.
(ii) Inventory of Missing Data – Gives the total number of missing observation records for a
selected parameter in a given period of time. The output is displayed in graphical form as
shown in Figure 87a.
(iii) Time Series Chart for Observing Stations – Produces a time series chart of the yearly total
of stations whose data have been archived in the database, as shown in Figure 87b.
(iv) Yearly Elements Observed – Produces a time series chart of the total observations per year
for the selected elements from a given station. See Figure 87c.
(v) Monthly Elements Observed – Produces a time series chart of the total observations per
month for the selected elements from a given station. See Figure 87d.
To produce any of these inventories select the required station(s) and element(s) from their
respective list boxes as shown in Figure 87e.
Figure 86: Input selection dialog for Inventory Products
Figure 87a: Total missing data records
Figure 87c: Total annual observed station elements
10.6 Products from Upper Air Observations
From the Welcome Dialog, click on the icon Climate Products or select Products on the
main menu to obtain the products category dialog as shown in Figure 73. From the listed
items, click on Upper Air to obtain the list of the available Upper Air data summaries as
shown in Figure 88.
Figure 88: List of Upper Air data summaries
If any of the listed products, say Daily Levels, is selected (clicked), the following dialog is obtained.
The dialog shown in Figure 88a is similar to those for surface observations but with an
additional list of upper air standard pressure levels. Any number of levels may be selected as
desired.
Note that no product will be produced unless at least one level is selected.
Figure 88b: Dialog for encoding CLIMAT message
11. AWS Real Time Data Operations
This operation starts from the Welcome menu dialog. Click the icon AWS Real Time
Processing to obtain the main dialog for Real Time Automatic Weather Station (AWS) data
operations (Figure 89).
Figure 90: AWS server settings
These settings are for the servers that play the roles of the AWS base station and the message
switching system that connects to the Global Telecommunication System (GTS). The
commands Base Station and Message Switching are used to switch between the two sets of server
settings. The details required for the server settings should be obtained from the servers'
administrators and then entered. Multiple servers can be configured where more than one base
station or message switching system is used.
Click Add New to effect the changes.
The details should be entered as follows;
FTP Server Address – The FTP address e.g. 40.73.196.133 or the server network name;
Input Folder:
This is the subfolder in the FTP home directory of the server where the data are located, in the case
of the AWS base station, or to which they will be copied, in the case of the message switching system.
If the subfolder is not in the root, the relative path should be indicated in accordance with the
operating system rules, for example AWS\input in Windows and AWS/input in Unix/Linux;
FTP Data Transfer Mode
This is the login method allowed for data transfer by the target server. MS Windows-based
servers allow the use of FTP. Some Linux-based servers prefer more secure data transfer
login methods such as SFTP. The method indicated by the target server should be entered here;
User Name – Login name used to log on to the server,
Password – Password for the user name provided,
Confirm Password – The entered password to be confirmed. The password is not saved until
it is correctly confirmed.
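Using these connection details, Climsoft retrieves the AWS files from the base station. The sketch below is only an illustration of that kind of FTP retrieval in Python; the server address and input folder are the examples used in this section, the credentials are placeholders, and servers that only allow SFTP would need an SFTP client (e.g. paramiko) instead.

# Illustrative sketch only (not Climsoft source): fetch AWS data files from the
# base station over FTP using the details configured in the server settings.
from ftplib import FTP

FTP_SERVER = "40.73.196.133"    # FTP Server Address (example from this manual)
INPUT_FOLDER = "AWS/input"      # Input Folder (relative path on the server)
USER, PASSWORD = "aws_user", "secret"   # placeholder credentials

with FTP(FTP_SERVER) as ftp:
    ftp.login(USER, PASSWORD)
    ftp.cwd(INPUT_FOLDER)       # move into the configured input subfolder
    for name in ftp.nlst():     # list the AWS data files found there
        with open(name, "wb") as out:
            ftp.retrbinary(f"RETR {name}", out.write)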
Site details:
• Site ID or Name as provided by the Met Service or obtained from WMO Vol A if already
listed there,
• Input Data file is the name of the text file in the AWS base station containing data for the
station listed in the current record. It should be ensured that the name matches the station’s
AWS data file,
• File name prefix – if the data are in multiple files with a common prefix, the box before the
“File name prefix” should be checked and the prefix entered in the text box. Otherwise, it
should not be checked.
In some special cases where filenames have date components, a date-structured file name
prefix may be used to select files with observations for a day or a month (a sketch showing how
such a prefix resolves to a concrete date appears after this list of site details). For instance,
for a filename such as 000001202008290000.txt the prefix will be entered as follows:
yyyymmdd – To select all files for the current day’s observations
yyyymm – To select all files for the current month’s observations
For a filename such as 000001-2020-08-29-00-00.txt the prefix will be entered as follows:
yyyy-mm-dd – To select all files for the current day’s observations
yyyy-mm – To select all files for the current month’s observations
• Data Structure is the name of the structure for the data output file in the AWS base station
server. This is created by the user and corresponds to the data being transferred. Details
on how to undertake this task are listed in Section 11.3.2,
• Missing Data Flag is the value that represents missing data from the specified AWS
server. This value will be obtained from files in the base station server;
• AWS Server IP is the IP address or host name of the server;
• GTS Msg Header is the Global Telecommunication System message header (e.g. HKNC
for Nairobi), in case the AWS data are meant for sharing with other countries through the
regional or global centres,
• Operational – if a site has been created in the Climsoft database but has no operational
AWS installed, the box Operational should be unchecked,
• Encode for GTS – if a site has been created in the Climsoft database but does not send
AWS data to the GTS, the box Encode for GTS should be unchecked.
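To make the date-structured prefixes concrete, the sketch below shows one possible way of resolving a prefix such as yyyymmdd into the current date and selecting matching files; the matching rule (a simple substring test) and all names are assumptions for illustration, not the Climsoft implementation.

# Minimal sketch, assuming simple substring matching against the current date.
from datetime import date

def expand_prefix(prefix: str, day: date) -> str:
    """Replace the date tokens in the configured prefix with actual values."""
    return (prefix.replace("yyyy", f"{day.year:04d}")
                  .replace("mm", f"{day.month:02d}")
                  .replace("dd", f"{day.day:02d}"))

files = ["000001202008290000.txt", "000001202008280000.txt",
         "000001-2020-08-29-00-00.txt"]

token = expand_prefix("yyyymmdd", date(2020, 8, 29))   # -> "20200829"
print([f for f in files if token in f])                # first file only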
Commands:
• To create a new station in the database, click Add New, enter all the details, then click
Refresh,
• Update is used to save changes made on the details of an existing station,
• Delete is used to remove the current station from the database,
• View/Update enables viewing the sites in a table view. Editing can be done from there.
Figure 92: AWS data structure settings
To carry out the task of data structure settings, basic skills in BUFR and in Climsoft elements are
necessary. Configuration of the data structure is done as follows:
Select the record by clicking the leftmost cell of the record; it will become highlighted.
Then press the Delete key on the keyboard. The record will be deleted without any warning.
(ix) Structure Name, Delimiter Type, Total Header Rows and Text Qualifier Character
may also be updated where necessary. In this case, click Update to save the changes.
should similarly start with value 2 in the field Cols. This procedure is followed until each column
of the data from the output text file is described in this table. Information required in each record
is as follows:
• Cols - Position (1,2…) of the column (from the left side) describing the data in the file,
• Element_abbreviation – Abbreviation for the name describing the data in the column.
Users are free to use any abbreviation that conveniently describes the data, except for
columns with date and time values. The following abbreviations SHOULD be strictly used
when abbreviating the date and time in the circumstances described below:
Date/time – Date and time are in one column, e.g. 21/03/2000 12:30:00
Date – Date is in a column of its own, e.g. 21/03/2000
Time – Time is in a different column from Date, e.g. 12:30:00
yyyy – Year is in a column of its own, e.g. 2000
mm – Month is in a column of its own, e.g. 03
dd – Day is in a column of its own, e.g. 21
hh – Hour is in a column of its own, e.g. 12
nn – Minute is in a column of its own, e.g. 30
ss – Second is in a column of its own, e.g. 00
yyyymmddhhnnss – Date and time have this structure, e.g. 20000321123000
ddmmyyyyhhnnss – Date and time have this structure, e.g. 21032000123000
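The sketch below shows how a column description of this kind can drive the reading of one record from the AWS output text file; the sample line, the delimiter and the mapping are assumptions chosen to mirror the abbreviations above, not an extract of the Climsoft parser.

# Minimal sketch: parse one comma-delimited AWS record using a
# Cols -> Element_abbreviation mapping like the one configured above.
from datetime import datetime

structure = {1: "Date/time", 2: "temperature", 3: "rainfall", 4: "wind_speed"}
line = "21/03/2000 12:30:00,25.4,0.0,3.2"   # hypothetical data row

fields = line.split(",")                    # Delimiter Type: comma
record = {}
for col, abbrev in structure.items():
    value = fields[col - 1]                 # Cols is 1-based, from the left
    if abbrev == "Date/time":               # reserved date/time abbreviation
        record["obs_time"] = datetime.strptime(value, "%d/%m/%Y %H:%M:%S")
    else:
        record[abbrev] = float(value)
print(record)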
Figure 93b: AWS Output Text File
The dialog in Figure 94 shows the settings required for BUFR message encoding. Most of the
default values are automatically set at installation, but some are set by the user to meet the
requirements of a particular NMHS. To undertake these settings, knowledge of Table Driven
Code Forms (TDCF) is necessary. The WMO Manual on Codes is the main reference used for
setting up the values here.
11.7 Data Acquisition and Processing Dialog
The AWS data processing is controlled from the dialog shown in Figure 95, where information
on the processing status is also displayed. Before the process is started it is important that the user
understands the role of each control shown in the dialog. Details are as follows:
• Settings:
o Restart – Starts/Restarts the processing schedule,
o Stop – Stops the processing schedule,
o Retrieval Intervals – The frequency in minutes at which data will be retrieved from the
AWS server and processed. Where encoding is required for synoptic messages, the
interval should be entered as 180 minutes,
o Hour Offset – Time delay in minutes to start encoding after the time set for data
downloading from the AWS data logger,
o Timeout Period – The longest period, in seconds, for which attempts to connect to the AWS
server will be made before the connection times out. If no timeout period is desired, a
value of 999 should be entered; in that case, however, there is a risk of holding the process
indefinitely if a connection is never achieved. If connection speed is the cause of the
delay, a reasonable timeout period should be entered, e.g. 60 seconds.
o Delete Input File After Processing – When checked, the files containing AWS data
will be deleted from the base AWS server after processing,
o GMT Diff +/- – The time difference between GMT and the time of the AWS data. A
positive difference may be entered without the + sign, but the - sign is mandatory for a negative one,
o Save Changes – Updates the changes made.
• Processing information
o Input FTP Server – The AWS base station from which data are retrieved for processing,
o Input Files – List of the retrieved AWS text files,
o Output FTP Server – The Message Switching System to which encoded messages have
been sent,
o Output Folder – Folder containing output files,
o Message Files – List of the encoded and transmitted files.
• Processing Status
o Current Date and Time – Computer clock time. It should always be correctly set,
o Retrieve last – Hours of data to be retrieved for encoding at each encoding interval,
o Last Processed – Date and time of the previous encoding process,
o Next Processing – Date and Time of the next encoding process,
o Status – The current activity.
• Error Messages
Errors encountered during processing are listed here together with their time of
occurrence. They are also logged into a file named aws_error located in the subfolder
data in the path “C:\ProgramData\Climsoft4”.
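As an aid to understanding how these controls interact, the sketch below shows one possible reading of the Retrieval Interval, Hour Offset and Timeout Period; the formula for the next run and the connection probe are assumptions for illustration, not the Climsoft scheduling code.

# Illustrative sketch only: a possible relationship between the schedule controls.
import socket
from datetime import datetime, timedelta

RETRIEVAL_INTERVAL_MIN = 180   # e.g. 3-hourly when encoding synoptic messages
HOUR_OFFSET_MIN = 10           # assumed delay after the AWS logger download time
TIMEOUT_SEC = 60               # abandon slow connection attempts after 60 s

def next_processing(last_processed: datetime) -> datetime:
    """Assumed rule: next run = previous run + retrieval interval + offset."""
    return last_processed + timedelta(minutes=RETRIEVAL_INTERVAL_MIN + HOUR_OFFSET_MIN)

def can_reach_server(host: str, port: int = 21) -> bool:
    """Probe the AWS base station, honouring the configured timeout."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT_SEC):
            return True
    except OSError:
        return False

print(next_processing(datetime(2022, 2, 1, 6, 0)))   # -> 2022-02-01 09:10:00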
12. Paper Image Archiving
Images of scanned paper records can be archived in Climsoft and later retrieved when required.
The operation starts by first creating the metadata for the paper image archives, then the folder
where the image files will be archived. The archived image files may occupy a large amount of
storage space, hence it is necessary to choose a folder where sufficient space is available. The
folder may reside on the local server, on a network-connected server, or on Network Attached
Storage (NAS). Where the archiving is done on a network folder, it is recommended that it be
mapped as a drive on the local server for convenient access to the image files.
The procedures for creating paper archive metadata are as follows:
From the Welcome dialog, click on the icon Metadata Information, then on the tab Paper
Archive in the Metadata Information dialog, to obtain the dialog shown in Figure 96.
• Click the Add New button to clear the form and enter details of a new metadata record,
• The metadata to be entered are the Form ID, which is the identification allocated by the
NMHS for that particular paper form. Details of the type of data contained in that form are
entered in the Description box,
• Click the Save button to save the metadata record,
• Other commands used are as follows:
o Update – Saves changes made on a metadata record,
o Delete – Removes a metadata record from the database,
o View – Displays all the metadata records in a tabular form.
It is required that the location (full path) for archiving the image files be configured in the general
settings of Climsoft. Follow the steps below to achieve that:
• From the Main Menu, click Tools -> General Settings and browse the records until the
“Setting Description” box displays the value “Folder for Paper Archive image files”.
See Figure 97,
• Type the full path of the folder to be used for the image files, e.g. C:\PaperArchive\Images,
then click the Update button. If the path specified does not exist, it will be created once
the process of image archiving starts, as illustrated in the sketch below. When no folder has
been set up, archiving will still be possible, but in a default folder within the Climsoft
installation path. A message will be displayed to that effect until the folder has been changed
from the default.
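As a minimal illustration of the folder creation described above, the following Python call creates the configured archive folder only if it does not already exist; the path is the example given in this manual, not a fixed Climsoft default.

# Minimal sketch: ensure the configured paper-archive folder exists.
import os
os.makedirs(r"C:\PaperArchive\Images", exist_ok=True)   # created on first use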
o If one particular observation is contained in different pages, hence in different image
files, then HH will instead be used as a page number in 2 digits, starting with 01.
The procedure for archiving images with a standard filename structure is as follows:
(i) From the Welcome dialog, click on the icon Paper Archive to obtain the dialog shown
in Figure 98,
(ii) Click on the explorer button and select the folder containing images to be archived. All
the image files will then be listed as shown in the dialog,
(iii) Uncheck the boxes for the image files that are not required.
(iv) Click the command Archive to complete the process.
Figure 99: Dialog for archiving images with unstructured file names
(i) Use the explorer button to select and open the image file; the image is then displayed as
shown above.
(ii) Enter the details for the content of that image in their respective boxes in the dialog,
(iii) The image may be rotated or zoomed so that the details are clearly seen.
(iv) Click Archive to structure the filename and archive the image,
(v) Repeat the process for all the images to be archived.
Figure 100: Retrieved archived image
Figure 101: View the list of archived images
Figure 102: Synoptic Data for many elements for one observation time - TDCF Form
Rules for entering data in the Synoptic forms with the aim of encoding into TDCF messages can
be obtained from the WMO Manual on Codes, Code tables, TDCF Templates and Common tables.
Data are then entered as follows:
(i) Pressure Data – In hPa and multiplied by 10 to remove the decimal point,
They are:
• Station Level Pressure (SLP);
• Sea Level Pressure (MSL) – to be entered as missing in stations above 500 m altitude; and
• Pressure Change (3Hr and 24Hr) – to be entered as missing when not observed
(ii) Three (3) Hr pressure change characteristic - Code figure to be entered as described in the Manual on
Codes.
• Code figure 2 shall be used for positive tendency;
• Code figure 7 for negative tendency; and
• Code figure 4 for no change
(iii) Standard Pressure Level – In hPa as a whole number and NOT multiplied by 10, e.g. 850,
(iv) Geopotential Height – In whole number of geopotential metres (gpm),
(v) Temperatures – In degrees Celsius and multiplied by 10 to remove the decimal point,
They are:
• Dry bulb - observed at synoptic hours,
• Wet bulb - observed at synoptic hours,
• Dew point - observed at synoptic hours,
• Minimum - recorded at the hour for daily data observations e.g. 06Z,
• Maximum – recorded daily but the hour varies with NMHS e.g. 18Z,
• Grass minimum – recorded at the same hour as the minimum temperature,
(vi) Relative Humidity – In whole number of percentages,
(vii) Visibility – In metres rounded to tens,
(viii) Total Cloud Cover – In oktas. Climsoft will encode the value into a percentage of total cloud cover,
(ix) Vertical Significance – Code figure for the appropriate vertical significance,
(x) Low level Cloud Amount (Nh) – Code figure for the amount of lowest cloud level,
(xi) Cloud Base Height (h) – Height in tens of metres for the base of the lowest cloud level,
(xii) Low Level Cloud Type (CL) – Code figure for Low Level clouds type,
(xiii) Medium Level Cloud Type (CM) – Code figure for Medium Level clouds type,
(xiv) High Level Cloud Type (CH) - Code figure for High Level clouds type,
(xv) Individual Cloud Layers – Vertical significance, amount, type and height of individual cloud layers that
qualify to be considered should be entered in their appropriate boxes in the order they occur. When fewer
than 4 layers are reported, the remaining boxes should be left untouched or the flag value M used,
(xvi) Present Weather, Past Weather (1) and (2) – Code figure for the appropriate weather description,
(xvii) Evaporation – In kg/m² multiplied by 10 to remove the decimal point,
(xviii) 24hr Sunshine – Number of hours multiplied by 10 to remove the decimal point,
(xix) 1Hr Sunshine – Number of minutes in whole numbers,
(xx) 3hr Rainfall – Amount accumulated for the last 3 hrs in kg/m² multiplied by 10 to remove the decimal point,
(xxi) 24Hr Rainfall – Amount accumulated for the last 24 hrs in kg/m² multiplied by 10 to remove the decimal point,
(xxii) Wind Direction – Degrees from True North in whole numbers,
(xxiii) Wind Speed – In m/s multiplied by 10 to remove the decimal point,
(xxiv) Radiation – In MJ/m² multiplied by 100 to remove the decimal point.
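Because most of these entries are simply the observed value scaled into a whole number, a small helper makes the rules easier to see. The sketch below only illustrates the scaling described above, with M used for missing values; it is not part of Climsoft.

# Minimal sketch: scale an observed value into the whole-number form entry,
# or return the flag value M when the element was not observed.
def scaled(value, factor):
    return "M" if value is None else round(value * factor)

print(scaled(1013.2, 10))   # station level pressure 1013.2 hPa -> 10132
print(scaled(23.7, 10))     # dry bulb temperature 23.7 deg C   -> 237
print(scaled(12.5, 10))     # 24Hr rainfall 12.5 kg/m2          -> 125
print(scaled(18.43, 100))   # radiation 18.43 MJ/m2             -> 1843
print(scaled(None, 10))     # unobserved element                -> M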
The settings in the dialog shown in Figure 103 need to be in place before the encoding and
sending of messages is done. To successfully configure Climsoft for this operation, basic
knowledge of WMO Table Driven Code Forms (TDCF) is required. Visit the WMO site for
more details. The most useful reference documents are:
(i) WMO manual on codes,
(ii) Code tables,
(iii) TDCF Templates and Common tables.
On the dialog, select the right encoding template from the list box Template and enter the
message header according to the data type (T1T2A1A2ii) and originating centre (CCCC), as
illustrated in the sketch below. Enter the details of the Message Switching System (AMSS) to
which the messages will be routed. Click Update to save the changes in each case.
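For orientation, the sketch below composes a WMO abbreviated heading of the form T1T2A1A2ii CCCC YYGGgg from the data type designators, the originating centre and the observation time; the designator shown is a placeholder, HKNC is the Nairobi example used earlier, and the authoritative rules are in the Manual on Codes.

# Illustrative sketch (placeholder values): build a T1T2A1A2ii CCCC YYGGgg heading.
from datetime import datetime

def abbreviated_heading(ttaaii: str, cccc: str, obs_time: datetime) -> str:
    """YYGGgg = day of month, hour and minute of the observation (UTC)."""
    return f"{ttaaii} {cccc} {obs_time:%d%H%M}"

print(abbreviated_heading("ISMN01", "HKNC", datetime(2022, 2, 1, 6, 0)))
# -> ISMN01 HKNC 010600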
14. ACKNOWLEDGEMENTS
Contributions towards the completion of this document were received from different individuals
in the following areas:
(iv) Reviewers
• Fortunata Lubega – [email protected]
• Roger Stern – [email protected]
For any question or further clarifications, contact the Climsoft Helpdesk at: [email protected]