OneStream Design and Reference Guide
7.1.1 Release
Copyright © 2022 OneStream Software LLC. All rights reserved.
Any warranty with respect to the software or its functionality will be expressly given in the
Subscription License Agreement or Software License and Services Agreement between
OneStream and the warrantee. This document does not itself constitute a representation or
warranty with respect to the software or any related matter.
OneStream Software, OneStream, Extensible Dimensionality and the OneStream logo are
trademarks of OneStream Software LLC in the United States and other countries. Microsoft,
Microsoft Azure, Microsoft Office, Windows, Windows Server, Excel, .NET Framework, Internet
Explorer, Internet Information Server, Windows Communication Foundation and SQL Server are
registered trademarks or trademarks of Microsoft Corporation in the United States and/or other
countries. DevExpress is a registered trademark of Developer Express, Inc. Cisco is a registered
trademark of Cisco Systems, Inc. Intel is a trademark of Intel Corporation. AMD64 is a trademark
of Advanced Micro Devices, Inc. Other names may be trademarks of their respective owners.
Table of Contents
Introduction 1
About the Financial Model 2
Extensible Dimensionality 2
Cube Design Choices within an Application 5
Application Model Design Considerations 5
Standard Cube Design 7
Standard Large Sparse Cube Design 7
Hybrid Cube Design 8
BI Blend 10
Key Elements of BI Blend 10
Relational Blending 10
Relational Blending Types 11
Model Blending 11
Relational Blending API 13
Cache Level Types 14
Relational Model Blending API Functions 15
Formulas 25
OneStream API Details and Database Documentation 25
Formula Structure 26
Applying Formulas 28
Formula Types 29
Formula Execution 31
Writing Formulas 34
Writing Stored Calculations 42
Avoiding Data Explosion 55
Key Functions 58
Account 59
Consolidation 60
Data 62
Entity 66
Flow 66
FX 67
Journals 69
Member 69
Scenario 71
Status 71
Time 72
Functions 72
Examples of Key Functions in Use 73
Finance Function Types 84
Navigation 114
About the OneStream Windows Application 114
Launching the OneStream Windows App 115
Defining Server Connections 115
Logging In 116
OnePlace Layout 117
Point of View 124
Navigating Mobile 127
Mobile Toolbar 127
Workflow 131
Workflow 131
Creating Workflow Suffix Groups 132
Using Workflow Profiles 134
Using Calculation Definitions 143
Using Workflow Channels 145
Data Units 150
Level 1: Cube Data Unit 150
Level 2: Workflow Data Unit 151
Level 3: Workflow Channel Data Unit 152
Data Loading 153
File 579
Excel Export Item 580
Report 580
Loop Variables 582
If Statement 582
Else/Else If Statement 583
Change Parameters 584
Book Preview Tool Bar 585
Cube Views 587
Cube View Toolbar 587
Cube View Paging Conditions 590
Display Gridlines in Excel 594
Cube View Group Properties 594
Advanced Tab 595
POV 596
General Settings 597
Header Text 599
Header Size 599
Header Overrides 600
Report 600
Navigation Links 602
Image 726
Label 726
Colors 727
Embedded Components 727
Data Adapters 728
Cube View MD 730
Dimension Leveling 735
Method Query 740
SQL 745
BI Blend Adapter 745
Parameters 747
Delimited List 749
Result Format String Type 750
Command Type 750
Database Location (If SQL Command Type) 751
Files 754
Strings 754
Dashboard Profiles 754
Previewing Dashboards 755
Overview 758
Report Alias (Displaying Alternate Descriptions for Members) 758
Requirements 908
About Groups and Inherited Security 909
About Exclusion Groups 909
Tips and Best Practices 909
Creating Groups 910
Creating Exclusion Groups 910
Managing Groups 911
Managing Group Membership 912
Loading and Extracting Groups 913
System Security Roles 915
Application Server Configurations 915
System Roles 916
System Configurations 916
General System Configurations 917
Environment System Configurations 919
Recycling System Configurations 921
Database System Configurations 922
Audits 925
Manage System Security 927
Access System Security Roles 928
General 1114
Spreading 1115
Spreading Types 1115
Administration 1116
Display Context Pane 1116
Preferences 1116
Excel Calculation 1118
Right-Click Options 1118
Retain Formatting in Cube Views with Selection Styles and Conditional Formatting 1124
Using Selection Styles 1124
Creating a Selection Style 1124
Using an Existing Style 1126
Adding a Selection Styles Shortcut 1127
Reviewing Styles and Ranges 1128
Using Right-click Menu Options 1129
Modifying and Duplicate Styles 1129
Merging Styles 1130
Conditional Formatting 1131
Create Conditional Formatting in an Existing or New Cube View 1132
Styles 1135
Creating a Custom Style 1135
Creating Excel Forms and Reports 1143
Named Regions 1143
Retrieve Functions 1144
Visual Basic for Applications (VBA) Procedures 1148
Introduction
The purpose of this guide is to provide best practices, tips, in-depth explanations of key
capabilities, and configurations for implementers, administrators, and end users. Use this guide to
enhance your understanding of applications, from specific functions to comprehensive reporting.
These tools, along with other data modeling techniques, allow you to build robust financial data
models.
Extensible Dimensionality
Business units can inherit Dimensions from a standard set that corporate maintains, but
Extensible Dimensionality® also allows them to extend those Dimensions to suit their own
process and reporting needs. This gives business units operational relevance while corporate
retains control of the overall process.
The diagram below shows an example of how a certain account can be extended differently
across a service business unit vs. a manufacturing business unit as well as across the Actual and
Budget Scenarios. Notice how each business unit can look at Gross Sales differently in the Actual
Scenario. Also, the Services business unit can look at Gross Sales at an even greater level of
detail in their Budget Scenario.
1. Extensible Dimensionality® can inherit Dimensions and extend them. In the example
above, there are four different Account Dimensions that inherit from each other like this:
• Corporate Accounts
  ○ Club Manufacturing Accounts
  ○ Services Accounts
    ▪ Services Budget Accounts
2. Different Dimensions can be assigned to different Cubes and that Dimensional assignment
can be different for each Scenario Type. In the above example, there are three Cubes:
Corporate, Clubs, and Services. When looking at the Corporate Cube, data from the three
Cubes is all there for analysis. In Corporate, the Corporate Accounts Dimension is assigned
to all Scenario Types. In Clubs, the Clubs Manufacturing Accounts Dimension is assigned
to all Scenario Types. In the Services Cube, the Services Accounts Dimension is assigned
to every Scenario Type except for Budget, which is where the Services Budget Accounts
Dimension is assigned.
3. The Clubs and Services Cubes have their own respective Entity Dimensions referenced in
the Corporate Cube. The Entity Dimensions tie the data together.
NOTE: Other Dimensions such as Flow and the User Defined Dimensions can also be
extended and have flexible Cube assignment if desired.
Extensible Cube
There can also be separate Cubes for different uses, such as Human Resource data or Cost
Drivers. These Cubes can still reference other Cube data through the CB# qualifier in Member
formulas.
Extensible Workflow
There can be different Workflow Profile hierarchies per Scenario Type, which is defined at the
Cube level. For example, an Actual Scenario might be loaded from 12 GL systems across 500
Entities, while Budget, Forecast, and Variance can each define a Workflow for the 500 Entities
with regional review and sign-off levels.
1. Single or “Monolithic” Cube is the simplest Application design. These typically have one
Workflow Profile structure, though that can vary by Scenario Type.
2. “Linked Cubes” are created via relationships that combine multiple Entity dimensions into one
superset Entity dimension. The “Top” or “Parent” Cube is configured with Cube References
to others. Typically, Extensible Dimensionality is deployed with other dimensions, such as
Accounts, allowing the Business Unit Cubes to satisfy their management reporting
requirements. There is typically one Workflow Profile structure for all Linked Cubes, though
that can vary by Scenario Type.
3. “Exclusive Cubes” are separate Business Unit Cubes that move their data from a
Business Unit to a “Parent” Cube typically via Business Rules or the use of Data
Management instead of through configured Cube References. Each separate Cube
requires its own Workflow Profile structure, though that can vary by Scenario Type.
4. “Specialty Cubes” refers to special data collections outside of the typical Trial Balance or
Planning data loads, typically with no parent/child relationships between Cubes. Examples
include headcount or budget drivers. These figures might be
referenced by other Cubes via Business Rules or Data Management. Each Specialty Cube
would have separate Workflow Profile structures.
When designing an Application, consider these questions to identify the model Cube design best
suited to your needs:
• What is the overall size of the Entity, Account, Flow, and User Defined dimensions that
estimate the Data Unit size potential? A OneStream Data Unit is composed of Cube, Entity,
Parent, Consolidation, Scenario, and Time.
• What is the size and relationship between dimensions? For example, are the User Defined
Dimensions large (thousands of members) and are the members closely tied to a single or very
few Entities (“sparsely populated”)?
• Is the data related to the Cube a “specialty” collection, such as Human Resources, Product
Profit, or Capital Expenditures?
• Do you fully understand how the data is managed? Determine if the Legal Entity is
appropriate for the Entity dimension so you can properly define the ultimate Data Unit
Controlling Dimension.
• Have you collected information about how the data is assembled and used? Determine if the
Cube data is integrated for different types of data such as Actual and Budget.
Performance Profile
• Consolidation times for 12 periods generally range from minutes to ~2 hours on the higher
end, depending on complexity and volume.
Reporting Guidelines
• Standard Cube Views with a minimal number of rows and columns should perform well,
rendering within a few seconds to a minute (depending on requirements).
• Cube View performance is strongly dependent on concurrent use, Cube View design, and the
metadata involved.
• Data Unit size is approximately 750k or more data records. Larger Data Units demand more
CPU processing power and time to calculate.
Performance Profile
• Optimize consolidation times by addressing sparse dimensions. With good extensibility you
can significantly reduce the size of the Data Unit being consolidated, for faster consolidation.
• Applications that must be consolidated and contain large sparse dimension members may
have long consolidation times and lower overall performance. Review these application
designs to consider alternate design solutions that include extensibility or other Cube and
Model solutions.
• Data Unit size, the number of Data Units, Business Rule best practices, and other factors
affect consolidation times.
Reporting Guidelines
• Putting a “top” member such as UD2#Top for a large dimension on a Cube View means
OneStream loops through each child member and aggregates on the fly. This can
consume large amounts of processing power and extend report processing.
• It is best to use Sparse Data Suppression settings on Cube Views that are based on a
single Data Unit.
• You may need custom Sparse Cube View Business Rules for “top of the house” queries.
• Quick Views may be impacted as they do not include Sparse Row Suppression settings.
• Aggregation needs at parent Entity levels are predominantly simple. Parents do not
require special calculations or logic.
• Rely on Dynamic Business Rules attached to a dynamic member such as in UD8. This
means parent Entity data is not stored. You can set the Is Consolidated property on these
Entities to False.
• Determine if parent Entity data must be exported to an external system so you can understand
the impact of Dynamic Business Rules.
• These designs usually have a Data Unit size of up to one million data records at the top parent Entity level.
Performance Profile
• Performance will vary by application size and structure.
• “Entity Aggregation on the Fly” consolidation simulation rules are required because you cannot
consolidate parent entities. This involves the api.Functions.GetEntityAggregationDataCell
function in a dynamic member (see the sketch after this list).
• You must support the export of calculated data, which could involve unique Scenarios or
Cubes to replicate dynamic calculations as stored values. These processes can be
automated using scheduled Data Management jobs.
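As a point of reference, the dynamic member formula can be as simple as the following one-line
sketch (the UD8 member name is illustrative; the same call appears in the
GetEntityAggregationDataCell walkthrough later in this guide):
'Formula Type = Dynamic Calc on a UD8 member such as UD8#EntityAggregation
Return api.Functions.GetEntityAggregationDataCell("UD8#None")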
Reporting Guidelines
• Apply Sparse Suppression settings on Cube Views.
• You may need custom Sparse Cube View Business Rules for “top of the house” member
queries.
• Quick Views may be impacted because they do not include Sparse Row Suppression
settings.
BI Blend
BI Blend is a “read-only” aggregate storage model that supports large-volume reporting for data
that is not appropriate to store in a traditional Cube. BI Blend data is large in volume and often
transactional. For example, to analyze data by invoice, a standard Cube requires metadata to
store the data records, yet most of the invoice metadata would soon no longer be needed given
the transactional nature of the data. Therefore, storing transaction data in a Cube design is not a
best practice.
A key challenge of reporting on transactional data is presenting data in a uniform format for
standardized reporting while flexibly supporting ever-changing records and reporting
requirements. The overall large size of the data sets requires a model suitable for responsive
reporting and analysis.
The BI Blend solution approaches these challenges in a unique and innovative way. BI Blend
rationalizes source data for uniform and standardized reporting, like the Standard Cube models,
but stores the data in a new relational column store table for responsive reporting.
Use BI Blend to analyze large volumes of highly changing data, such as ERP system transaction
data, which typically would not reside in a Cube. Processing is free from the intensive audit
controls within a traditional Consolidation Cube, such as managing calculation status.
• Non-Cube: executed to a relational table optimized for reporting on large data sets by
storing results in a column store index
For more information, see the BI Blend Design and Reference Guide.
Relational Blending
The dynamic blending of data stored in a relational database with data stored/derived by the in-
memory analytic engine is supported.
Relational Blending is a data management approach. It enables analytic modelers to better
manage the trade-off between building unsustainable, poorly performing, overly complex analytic
models that contain too much detail and the need to report on that level of detail (data that may be
transactional in nature or constantly changing). Relational Blending makes it possible to
seamlessly integrate detail data points into the analytic reporting process.
Model Blending
The following sections provide sample use-cases and explanations of how Model Blending is
used to seamlessly integrate relational data points into an analytic model using Dynamic Member
Calculations that leverage the relational blending API functions of the finance engine.
A common challenge in analytic modeling is how to build a sustainable model when the definition
of metadata and data becomes blurred. The two use cases below provide examples of this
modeling challenge.
Use Case 1
In many cases, analytic modelers are faced with the challenge of building a model containing
Dimension Members that are unknown at design time or forced to build a Dimension containing
Members that will be constantly changing. Without relational blending, analytic modelers are
forced to build models full of unknown members (TBD1, TBD2, etc.) in the hope that users of
the system do not need values beyond these placeholder members. (This data is transactional
and not a good candidate for an analytic model; Workforce Planning is a good example of this
problem.)
This is a one-to-many issue so Drill-Back and Application Blending work well if the summary Cube
is the primary focus of analysis and transactions are only used for supporting details. Model
blending can provide a benefit as well, but keep in mind that model blending must relate an
analytic cell POV to a relational row or summarized row (one-to-one). For Model Blending to be
useful in these circumstances, the relational data must be returned in an aggregate format (avg,
min, max, sum, count) in order to reduce the one-to-many relationship to a one-to-one
relationship.
Use Case 2
Analytic models that depend on Dimensions with Members that are constantly changing.
Consider a business problem where the analytic model is based on a fixed number of members
(facility with rooms and beds). This is easy from a modeling perspective; however, the user
requirement is to build a model that is aware of the current occupant of the bed. The logical
metadata definition is room/bed, but the business problem requires the “occupant” to be defined
as a Member for the model to be meaningful. If occupant is used as a Member in the model, it is
almost guaranteed that the analytic model will eventually become unsustainable due to the
changing nature of the room/bed/occupant Dimension. The administrator of this model now has
the burden of constantly changing and rebuilding the model to reflect the current occupant data.
This is a one-to-one issue, so Model Blending fits well and provides a tremendous amount of
value. Detailed and changing information can be continuously loaded and updated as attribute
information in the OneStream Staging tables or Custom Relational Tables, and the Model Blending
API can be used to dynamically incorporate this information into the analytic model through dynamic
member formulas.
Relational blending can help keep the size of an analytic model at a manageable level by allowing
leaf-level members to be kept in the relational table and only keeping summarized or static Members
in the analytic model definition (Dimension Members). Relational blending is not a cure-all, but it
is an important tool for building maintainable, well-performing analytic models when a model has
some dependencies on detailed information that cannot be clearly defined as metadata or data.
In other words, the information is useful in the model, but it is so detailed or changes so much that
it is difficult to incorporate into a rational metadata structure.
This is an efficient choice because the first time a cell is requested for the Entity, Scenario, Time
combination, a query will run to load all the stage data for the combination. Then all subsequent
cell requests would read values from cache. On the other hand, if the primary Cube View data
request is for multiple Entities, then BlendCacheLevelTypes.WfProfileScenarioTime cache level
would be a more efficient choice. This is more efficient because a single query would run to load
all the data for the Scenario and Time into cache and then all subsequent cell requests would read
values from cache. As a cautionary note, be aware that using coarse cache levels (reading more
data at once into cache) only benefits performance when the target Cube View can read many
values for the cache. If the target Cube View is only focused on one Entity/Account and the
BlendCacheLevelTypes.WfProfileScenarioTime cache level is chosen, all rows for the entire
Scenario and Time would need to be read into memory when only values for one Entity/Account
combination was needed. In this case,
BlendCacheLevelTypes.WfProfileScenarioTimeEntityAccount would be a more efficient cache
level.
In summary, choose a cache level that will minimize the number of actual database queries
needed to run in order to get the desired cells for the target Cube View. This is not an exact
science, and it may be difficult to choose a cache level that works efficiently for all target Cube
Views. If there is a diverse set of Cube Views using relational blend data, consider creating
specific Members that implement different cache levels that match the Cube View data pattern.
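For example, two UD8 Dynamic Calc members could wrap the same stage lookup with different
cache levels to serve different Cube View patterns. This is a minimal sketch based on the
GetStageBlendNumberUsingCurrentPOV signature documented below; the cache names and stage
field names are illustrative, and the Sum operation is assumed to exist on
BlendNumericOperationTypes:
'Formula of a member tuned for single-Entity, single-Account Cube Views
Return api.Functions.GetStageBlendNumberUsingCurrentPOV( _
    BlendCacheLevelTypes.WfProfileScenarioTimeEntityAccount, "BlendFine", "", _
    "U1T,ConvertedAmount", "U1T = '" & api.Pov.UD1.Name & "'", "ConvertedAmount", _
    BlendNumericOperationTypes.Sum)

'Formula of a member tuned for Cube Views that read many Entities for the same Scenario and Time
Return api.Functions.GetStageBlendNumberUsingCurrentPOV( _
    BlendCacheLevelTypes.WfProfileScenarioTime, "BlendCoarse", "", _
    "U1T,ConvertedAmount", "U1T = '" & api.Pov.UD1.Name & "'", "ConvertedAmount", _
    BlendNumericOperationTypes.Sum)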
BlendCacheLevelTypes.WfProfileScenarioTimeEntity
Query will be run and cached using the supplied Workflow Profile, POV Scenario, POV Time and
POV Entity as criteria and cache key.
BlendCacheLevelTypes.WfProfileScenarioTimeAccount
Query will be run and cached using the supplied Workflow Profile, POV Scenario, POV Time and
POV Account as criteria and cache key.
BlendCacheLevelTypes.WfProfileScenarioTimeEntityAccount
Query will be run and cached using the supplied Workflow Profile, POV Scenario, POV Time,
POV Entity and POV Account as criteria and cache key.
BlendCacheLevelTypes.Custom
Intended to be used with custom table query (Cache level is explicitly controlled by the supplied
SQL query). Query will be run and cached using the supplied cache name.
Description
Read a stage text attribute value from a cached ADO.NET data table using the current POV values
and optionally perform concatenation on the results.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cacheLevel
Cache granularity level used to control how much information is cached in each chunk (Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary).
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up. (Pass an empty String
to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow profiles with
the specified suffix.)
fieldList
List of STAGE fields that will be used as criteria and/or returned.
criteria
Criteria statement used to select rows in the cached data table.
fieldToReturn
Name of the stage field to return.
textOperation
Text operation to perform on the resulting data table (Note: FirstValue returns the first matching
row if there is more than one stage value for the specified cell).
Return Type
String
Example
'UD8 DynamicCalc - Lookup Attribute 1 From Stage
If Not api.Entity.HasChildren Then
    Dim criteria As New Text.StringBuilder
    criteria.Append("U1T = '" & api.Pov.UD1.Name & "' ")
    criteria.Append("And U2T = '" & api.Pov.UD2.Name & "' ")
    Return api.Functions.GetStageBlendTextUsingCurrentPov( _
        BlendCacheLevelTypes.WfProfileScenarioTimeEntity, "DU", "*.Sales Detail", _
        "U1T,U2T,A1,ConvertedAmount", criteria.ToString, "A1", _
        BlendTextOperationTypes.FirstValue)
Else
    Return String.Empty
End If
GetStageBlendText
Function Prototype
Public Function GetStageBlendText (ByVal cubeName As String, ByVal entityName As String,
ByVal scenarioName As String, ByVal timeName As String, ByVal accountName As String, ByVal
cacheLevel As BlendCacheLevelTypes, ByVal cacheName As String, ByVal wfProfileName As
String, ByVal fieldList As String, ByVal criteria As String, ByVal fieldToReturn As String, ByVal
textOperation As BlendTextOperationTypes) As String
Description
Read a stage text attribute value from a cached ADO.NET data table using the specified POV values
and optionally perform concatenation on the results.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cubeName
Name of the Cube to use for the POV.
entityName
Name of the Entity to use for the POV.
scenarioName
Name of the Scenario to use for the POV.
timeName
Name of the Time to use for the POV.
accountName
Name of the Account to use for the POV.
cacheLevel
Cache granularity level used to control how much information is cached in each chunk (Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary).
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up (Pass an empty String
to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow profiles
with the specified suffix).
fieldList
List of Stage fields that will be used as criteria and/or returned.
criteria
Criteria statement used to select rows in the cached data table.
fieldToReturn
Name of the Stage field to return.
textOperation
Text operation to perform on the resulting data table (Note: FirstValue returns the first matching
row if there is more than one stage value for the specified cell).
Return Type
String
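Example
The following is a minimal sketch mirroring the GetStageBlendTextUsingCurrentPov example
earlier in this section, but with an explicitly specified POV; the Cube, Entity, Scenario, Time, and
Account names are hypothetical:
'Look up the A1 stage attribute for an explicitly specified POV
Dim criteria As String = "U1T = '" & api.Pov.UD1.Name & "'"
Return api.Functions.GetStageBlendText("Houston", "Houston Heights", "Actual", "2023M3", "Sales", _
    BlendCacheLevelTypes.WfProfileScenarioTimeEntity, "DU", "*.Sales Detail", _
    "U1T,A1,ConvertedAmount", criteria, "A1", BlendTextOperationTypes.FirstValue)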
GetStageBlendNumberUsingCurrentPOV
Function Prototype
Public Function GetStageBlendNumberUsingCurrentPOV(ByVal cacheLevel As
BlendCacheLevelTypes, ByVal cacheName As String, ByVal wfProfileName As String, ByVal
fieldList As String, ByVal criteria As String, ByVal fieldToReturn As String, ByVal mathOperation
As BlendNumericOperationTypes) As Decimal
Description
Read a stage numeric attribute value from a cached ADO.NET data table using the current POV
values and optionally perform aggregation math on the results.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cacheLevel
Cache granularity level used to control how much information is cached in each chunk (Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary).
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up (Pass an empty String
if you want to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow
profiles with the specified suffix).
fieldList
List of stage fields that will be used as criteria and/or returned.
criteria
Criteria statement used to select rows in the cached data table.
fieldToReturn
Name of the stage field to perform math on and return.
mathOperation
Math operation to perform on the resulting data table (Note: FirstValue returns the first matching
row if there is more than one stage value for the specified cell).
Return Type
Decimal
Example
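The following is a minimal sketch modeled on the GetStageBlendTextUsingCurrentPov example
earlier in this section; it assumes BlendNumericOperationTypes includes a Sum member and reuses
that example's illustrative stage field names and Workflow Profile suffix:
'UD8 DynamicCalc - Sum ConvertedAmount from Stage
If Not api.Entity.HasChildren Then
    Dim criteria As New Text.StringBuilder
    criteria.Append("U1T = '" & api.Pov.UD1.Name & "' ")
    criteria.Append("And U2T = '" & api.Pov.UD2.Name & "' ")
    'Sum is an assumed member of BlendNumericOperationTypes
    Return api.Functions.GetStageBlendNumberUsingCurrentPOV( _
        BlendCacheLevelTypes.WfProfileScenarioTimeEntity, "DU", "*.Sales Detail", _
        "U1T,U2T,ConvertedAmount", criteria.ToString, "ConvertedAmount", _
        BlendNumericOperationTypes.Sum)
Else
    Return Nothing
End If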
GetStageBlendNumber
Function Prototype
Public Function GetStageBlendNumber(ByVal cubeName As String, ByVal entityName As String,
ByVal scenarioName As String, ByVal timeName As String, ByVal accountName As String, ByVal
cacheLevel As BlendCacheLevelTypes, ByVal cacheName As String, ByVal wfProfileName As
String, ByVal fieldList As String, ByVal criteria As String, ByVal fieldToReturn As String, ByVal
mathOperation As BlendNumericOperationTypes) As Decimal
Description
Read a stage numeric attribute value from a cached ADO.NET data table using the specified POV
values and optionally perform aggregation math on the results.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cubeName
Name of the Cube to use for the POV.
entityName
Name of the Entity to use for the POV.
scenarioName
Name of the Scenario to use for the POV.
timeName
Name of the Time to use for the POV.
accountName
Name of the Account to use for the POV.
cacheLevel
Cache granularity level used to control how much information is cached in each chunk. Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary.
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up. Pass an empty String
to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow profiles with
the specified suffix.
fieldList
List of Stage fields that will be used as criteria and/or returned.
criteria
Criteria statement used to select rows in the cached data table.
fieldToReturn
Name of the stage field to perform math on and return.
mathOperation
Math operation to perform on the resulting data table
NOTE: FirstValue returns the first matching row if there is more than one stage value for
the specified cell
Return Type
Decimal
GetStageBlendDataTableUsingCurrentPOV
Function Prototype
Public Function GetStageBlendDataTableUsingCurrentPOV(ByVal cacheLevel As
BlendCacheLevelTypes, ByVal cacheName As String, ByVal wfProfileName As String, ByVal
fieldList As String) As DataTable
Description
Read stage data into a cached ADO.NET data table using the current POV values so that it can
be queried and analyzed in memory on the application server.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cacheLevel
Cache granularity level used to control how much information is cached in each chunk (Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary).
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up (Pass an empty String
to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow profiles with
the specified suffix).
fieldList
List of Stage fields that will be used as criteria and/or returned.
Return Type
DataTable
Example
'Lookup Attribute 1 From Stage Cached Data Table
If Not api.Entity.HasChildren Then
    Dim result As String = String.Empty
    'Assumed completion of the truncated example: query the cached table for the first matching A1 value
    Dim dt As DataTable = api.Functions.GetStageBlendDataTableUsingCurrentPOV( _
        BlendCacheLevelTypes.WfProfileScenarioTimeEntity, "DU", "*.Sales Detail", "U1T,U2T,A1,ConvertedAmount")
    Dim rows() As DataRow = dt.Select("U1T = '" & api.Pov.UD1.Name & "'")
    If rows.Length > 0 Then result = rows(0)("A1").ToString()
    Return result
Else
    Return String.Empty
End If
GetStageBlendDataTable
Function Prototype
Public Function GetStageBlendDataTable(ByVal cubeName As String, ByVal entityName As
String, ByVal scenarioName As String, ByVal timeName As String, ByVal accountName As
String, ByVal cacheLevel As BlendCacheLevelTypes, ByVal cacheName As String, ByVal
wfProfileName As String, ByVal fieldList As String) As DataTable
Description
Read stage data into a cached ADO.NET data table using the specified POV values so that it can be
queried and analyzed in memory on the application server.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cubeName
Name of the Cube to use for the POV.
entityName
Name of the Entity to use for the POV.
scenarioName
Name of the Scenario to use for the POV.
timeName
Name of the Time to use for the POV.
accountName
Name of the Account to use for the POV.
cacheLevel
Cache granularity level used to control how much information is cached in each chunk. (The less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary.)
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
wfProfileName
Name of the Import Workflow profile containing the values to be looked up (Pass an empty String
to look up the workflow based on the POV Entity, use *.YourWFSuffix to get workflow profiles with
the specified suffix).
fieldList
List of Stage fields that will be used as criteria and/or returned.
Return Type
DataTable
GetCustomBlendDataTableUsingCurrentPOV
Function Prototype
Public Function GetCustomBlendDataTableUsingCurrentPOV(ByVal cacheLevel As
BlendCacheLevelTypes, ByVal cacheName As String, ByVal sourceDBLocation As String, ByVal
sourceSQL As String) As DataTable
Description
Read data from a custom table into a cached ADO.NET data table using the current POV values so
that it can be queried and analyzed in memory on the application server.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cacheLevel
Cache granularity level used to control how much information is cached in each chunk (Less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary).
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
sourceDBLocation
Database location name to query (Application, Framework, or a Named External Connection).
sourceSQL
SQL statement that defines the DataTable to cache for in memory querying.
Return Type
DataTable
Example:
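The following is a minimal sketch using the Custom cache level described earlier; the table,
column names, and SQL statement are hypothetical:
'Cache a custom table and return the description matching the current POV UD1 member
Dim sql As String = "Select ProductCode, ProductDescription From ProductMaster"
Dim dt As DataTable = api.Functions.GetCustomBlendDataTableUsingCurrentPOV( _
    BlendCacheLevelTypes.Custom, "ProductMaster", "Application", sql)
Dim rows() As DataRow = dt.Select("ProductCode = '" & api.Pov.UD1.Name & "'")
If rows.Length > 0 Then Return rows(0)("ProductDescription").ToString()
Return String.Empty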
GetCustomBlendDataTable
Function Prototype
Public Function GetCustomBlendDataTable(ByVal cacheLevel As BlendCacheLevelTypes,
ByVal cacheName As String, ByVal sourceDBLocation As String, ByVal sourceSQL As String) As DataTable
Description
Read data from a custom table into a cached ADO.NET data table using the specified POV values
so that it can be queried and analyzed in memory on the application server.
NOTE: Cache only lives for the duration of the WCF call.
Parameters
cacheLevel
Cache granularity level used to control how much information is cached in each chunk. (The less
granular cache level helps repeated calls but hurts requests for a single cell because more data is
cached than necessary.)
cacheName
Short name used to identify the values placed in the cache (Full CacheID will be CacheName +
CacheLevel Values).
sourceDBLocation
Database location name to query (Application, Framework, or a Named External Connection).
sourceSQL
SQL statement that defines the DataTable to cache for in memory querying.
Return Type
DataTable
Formulas
A formula is a set of calculation instructions to compute values. Formulas are written using
Microsoft Visual Basic .NET procedures that use OneStream API function libraries and member
script expressions. These combined capabilities provide a powerful programming environment
delivering reliable compiled formula definitions.
To use the OneStream API Details and Database Documentation, create a folder on the PC where
the documentation will be used and copy the related zip file into it. Right-click the zip file and
extract its contents there. Double-click the file that ends in .chm to launch the API Guide.
Contents are organized by the related Platform Engine (see Platform Engines). These are broken
down into Classes (e.g. DataApi), Overload Lists, Methods (e.g. GetDataCell), Syntax and
Parameters. The Index and Search tabs can be used to search by function name, enumerations,
properties, etc.
Formula Structure
Microsoft Visual Basic.NET With OneStream API and Member
Scripts
All formulas and business rules run as compiled VB.NET code. In a VB.NET function or
subroutine, calls are made to specific API functions which enable the rule writer to interact with the
Analytic Engine. Specific API functions are used to process member script expressions, and
create calculated values in the analytic data model.
Formula Composition:
VB.NET Language Keyword(s): Return
OneStream API Function: api.Data.Calculate("Member Script")
Member Script: A#28000 = A#28999:T#POVPrior1
You can use formula variables in member scripts to significantly improve performance when the
same formula is used for multiple members. When using formula variables, the formula text
remains the same, so there is no need for continued parsing and evaluation.
Using variables can also improve performance when a member ID is used instead of a member
name, since the ID can be used as the value in a formula variable. To use a formula variable in a
member script, use a dollar sign ($) instead of a pound sign (#) before the member name, and use
the variable name after the dollar sign.
Example 1
api.Data.FormulaVariables.SetTextVariable("variableAccount", "8150")
api.Data.Calculate("A#8250 = A$variableAccount * 10")
Example 2
Dim acctMember As Member = api.Members.GetMember(DimType.Account.Id, "8150")
api.Data.FormulaVariables.SetMemberVariable("variableAccount", acctMember)
api.Data.Calculate("A#8250 = A$variableAccount * 100")
Example 3
Dim acctMember As Member = api.Members.GetMember(DimType.Account.Id, "8150")
Dim acctId As Integer = acctMember.MemberId
api.Data.FormulaVariables.SetIntegerVariable("myAccount", acctId)
api.Data.Calculate("A#8250 = A$myAccount * 1000")
Applying Formulas
An administrator can create a formula on a Dimension Member or in a Business Rule.
Member Formulas
The preferred and more common approach is to write formulas on Dimension Members. Member
formulas are written using the Formula property on individual Scenario, Account, Flow, or User
Defined Members in the Dimension Library. The primary reasons for writing formulas on
Dimension Members are that they provide intuitive formula organization and promote reusability of
Dimensions and Members, with their associated calculations, across multiple Cubes. They also
enable parallel processing for performance optimization using advanced multi-threading that
executes multiple formulas at the same time. Finally, writing formulas on Dimension Members
supports drill-down from a calculated amount to the amount used as the source for the calculation,
as well as the ability to vary formulas by Scenario and Time. This is useful because changes
can be made without affecting the calculation for older data or data in other Scenarios.
TIP: Due to the sophisticated built-in translation and consolidation algorithms, most
applications only require Member Formulas for the Data Unit Calculation Sequence
(DUCS) (i.e., Chart Logic). Custom Business Rules for Translation, Share, and
Intercompany Eliminations are not typically needed.
NOTE: Due to the way OneStream stores its data, Decimal should always be used
instead of Double or Integer when declaring variables that return a number within a
Business Rule or Member Formula.
See Business Rules in "Application Tools" on page 779 for more details.
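For example, a value read from the cube would typically be declared as follows (a minimal sketch;
the account name is illustrative):
'Use Decimal, not Double or Integer, for amounts read from or written to the cube
Dim cashAmount As Decimal = api.Data.GetDataCell("A#Cash").CellAmount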
Formula Types
Two types of formulas can be applied to dimensional members or members in a business rule file:
• Dynamic Cell Calculation (Dynamic Calc): An in-memory calculation that runs on demand
when a cell containing a dynamic member formula is requested. A Dynamic Calc formula
computes a value for a single cell and runs whenever the cell needs to be displayed, without
storing the result.
• Stored Calculation: A persisted calculation that runs as part of the Data Unit Calculation
Sequence (DUCS). With these formulas you can calculate many cells simultaneously, such
as an entire data buffer.
Performance Considerations
Dynamic Cell Calculation (Dynamic Calc)
Dynamic Cell Calculations enhance the consolidation process because the amount is calculated
when requested for display and is not written to the database. For reporting however,
performance may be impacted because data is calculated on demand. Dynamic calculations are
usually used for ratio or percentage calculations.
Stored Calculation
Consolidation performance is directly impacted by the volume and complexity of stored
calculations. Carefully consider each stored calculation since one poorly written rule can cause
large amounts of data to be written to the cube, negatively impacting consolidation performance. If
you use many member formulas or if data volumes are not considered in member formulas, over
100,000 stored numbers may be generated from just 1,000 initially loaded numbers. The quantity
of stored numbers is a critical factor to consider to optimize consolidation performance.
Stored Calculations
Stored calculations can reference other stored calculations, and parent members aggregate
naturally.
TIP: You do not need to write a stored calculation to add or subtract individual members.
Instead, create an alternative member hierarchy and use the Aggregation Weight
property set to -1.0 to negate members. The aggregated value is dynamic and supports drill-
down.
Formula Execution
This section defines the types of calculations and calculation sequences used during different
analytic processing routines.
Cube Processing
Calculation
This executes the standard calculation sequence for a single Data Unit. A Data Unit refers to a
group of data cells for a specific Cube, Entity, Parent, Consolidation, Scenario, and Time Member.
See Data Units in "Workflow" on page 131 for more details on Data Units. Except for Dynamic
Cell Calculations, all Member Formulas are written to execute as part of a Data Unit’s Calculation
Sequence.
Translation
This executes a currency translation that occurs when the data for an Entity’s local currency
needs to be translated to a foreign currency. The translation step executes after the system has
run the Data Unit Calculation Sequence for an Entity’s local currency. After this is completed, the
default translation algorithms use Foreign Exchange (FX) rates to generate and store a
corresponding translated Data Unit. Finally, the Data Unit Calculation Sequence is run on the
translated Data Unit to produce the final translated amounts.
Consolidation
The Analytic Engine provides pre-built financial intelligence through a statutory Consolidation
Dimension that defines a sequence of Data Unit calculations and aggregations which include
currency translations, Parent-level adjustments, complex ownership computations, and
Intercompany Eliminations. For more details on Consolidation, see Consolidation.
As a basic guideline, customers should think about organizing formulas by account/collection type
across OneStream’s available formula passes using the following examples.
Formula Pass 1 - 8: Trial Balance
Formula Pass 8 or 9: Balance Account and CTA account
Formula Pass 9 - 16: Non-Trial Balance
All formulas in a pass are processed at the same time, so they cannot have dependencies on one
another.
2. Run the Scenario’s Member Formula, which is typically used for seeding a Scenario’s data
from another Scenario or from a prior year.
3. Run reverse translations by calculating Flow Members from other Alternate Currency Input
Flow Members. This is part of the built-in ability for an Entity to accept input using multiple
currencies.
4. Execute Business Rules (1 and 2). Up to 8 Business Rule files can be attached to each
Cube.
5. Run Formula Passes (1 – 4) for the Cube’s Account Dimension Members, then Flow
Members, and then User Defined Members. The Formula Pass is specified using each
Member’s Formula Type property in the Dimension Library.
Every time a data cell is written to the database, information about how it was stored (e.g.,
manually entered, calculated, consolidated) is included with the stored cell. If a number was
calculated and stored as the result of a formula, it will always get cleared regardless of the
metadata settings. The AllowInput property specifies whether a cell can be written to, and if a
formula stored a number, the cell carries the information that it was calculated. If AllowInput is set
to True and a new number is typed over the same cell, it is stored as manually entered instead of
as a calculated cell.
Writing Formulas
Writing Dynamic Cell Calculations
When writing a formula for a Dynamic Calc Account, Flow, or User Defined Member in either a
Member Formula or a Business Rule (via FinanceFunctionType.GetDataCell), the goal is to return
an amount for a single data cell. In that case, the system knows the full 18 Dimensions of the data
cell it needs to display. Therefore, use any of the api methods that refer to a specific Account,
Flow, Intercompany, User Defined Dimension, etc.
For example, a user selects a Cube View to view some numbers. For each cell the Cube View
needs to display, the system knows the full 18 Dimensions, determines that it needs to run a
custom formula, and initializes the api so the formula knows about all 18 Dimensions. The
Member Formula then returns an amount to display on the Cube View.
If a Cube View is used to display the result of the above calculation for the Dynamic Calc account
for any numeric intersection (any Entity, Scenario, Time, UD, etc. for View set to YTD), the Cube
View will always display 123.4. If the account’s value is displayed for any text intersection (any
Entity, Scenario, Time, UD, etc. for View set to Annotation), the Cube View will show an empty cell
because the formula returned a numeric value, not a text value. To make the formula work for both
numeric and text View Members, use an If statement to check which type of View Member is
currently being requested. Then return a text value surrounded by double quotes when the View
Member is an annotation type. After saving, run the Cube View using multiple View Members
(YTD, Periodic, Annotation, Assumptions, etc.). The corresponding cell will display either of the
two constants specified.
Dim objViewMember As ViewMember = ViewMember.GetItem(api.Pov.View.MemberId)
If objViewMember.IsAnnotationType Then
    Return "My first OneStream Member Formula"
Else
    Return 123.4
End If
A DataCell object is a wrapper for a DataCellPk object that defines the cell’s 18 Dimensional
intersection, a decimal accessed using the CellAmount property to store the number, and a
CellStatus containing other information about the cell such as NoData and Invalid Status.
A DataCellEx object is a wrapper for a DataCell object and a text property called
DataCellAnnotation which is used for setting a string for an Annotation type View Member. It also
contains additional properties for CurrencyId and AccountTypeId that are filled in automatically and
can be ignored when creating a DataCellEx object in a Dynamic Calc Member Formula.
The following example accomplishes the exact same result as the example above, except this
uses DataCell and DataCellEx objects to illustrate what to do if a return value containing cell
status is needed.
Dim objViewMember As ViewMember = ViewMember.GetItem(api.Pov.View.MemberId)
If objViewMember.IsAnnotationType Then
    Dim objDataCellEx As DataCellEx = New DataCellEx()
    objDataCellEx.DataCell.CellStatus = New DataCellStatus(True)
    objDataCellEx.DataCellAnnotation = "My first OneStream Member Formula"
    Return objDataCellEx
Else
    Dim objDataCell As DataCell = New DataCell()
    objDataCell.CellStatus = New DataCellStatus(True)
    objDataCell.CellAmount = 123.4
    Return objDataCell
End If
Notice the api.Data.GetDataCell function accepts one string Parameter in double quotes which
represents a Member Script equation. Each operand of the equation (e.g., A#Cash) takes the
unspecified Dimensions using the data cell currently being calculated. Therefore, every operand
points to a specific data cell identified by using an 18 Dimension intersection.
“E#CT:C#USD:S#Actual:T#2013M1:V#YTD:A#Cash:F#None:O#Top:I#None:U1#None, …”
When writing Dynamic Calc formulas, specify any or all of the 18 Dimensions if necessary.
(e.g., “A#Cash:U1#AllProducts + A#AcctsRec:U1#None”)
This first example uses the division operator (/) to calculate a data cell from one account divided
by a data cell from another account. If the denominator (A#AcctsRec) is zero or NoData, it will
automatically return a very large number (e.g., 9999999999999999999.0) as the result. This is
because dividing by zero in mathematics results in infinity; the large number is used to approximate
infinity, which allows subsequent functions or math operators that refer to the result to continue to
be processed.
Return api.Data.GetDataCell("A#Cash / A#AcctsRec")
Although an extremely large number is the best mathematical approximation for infinity, it is
typically not what administrators want to display in their financial system when source numbers
are not available. A Divide function that produces a NoData cell if either the numerator or the
denominator is NoData is available.
Return api.Data.GetDataCell("Divide(A#Cash, A#AcctsRec)")
The built-in Divide function is typically used when performing division in Dynamic Calc formulas.
However, for completeness and to provide some insight about how to create more complex
formulas, the following is an example of how to implement a formula that performs safe division.
Notice that the Member Formula is performing division using two DataCell objects to create a
resulting DataCell (Return numeratorDataCell / denominatorDataCell). This is a powerful
capability that allows any type of math to be performed using any number of DataCell objects.
Dim numeratorDataCell As DataCell = api.Data.GetDataCell("A#Cash")
Dim denominatorDataCell As DataCell = api.Data.GetDataCell("A#AcctsRec")
If ((Not numeratorDataCell Is Nothing) And (Not denominatorDataCell Is Nothing)) Then
If ((Not numeratorDataCell.CellStatus.IsNoData) And _
(Not denominatorDataCell.CellStatus.IsNoData) And _
TIP: When using VB.NET, use the underscore character at the end of a line in order to
continue a statement on the next line. For example, refer to the underscore after And in
the code above. This was done here because the full statement did not fit on one line in
this document. However, in the Member Formula editor, the If statement would be kept
on one long line.
The example below demonstrates a possible customized version of the DSO calculation.
Dim numDaysSum As Integer = 0
Dim currTimeId As Integer = api.Pov.Time.MemberPk.MemberId
Dim acctsRec As Decimal = api.Data.GetDataCell("A#AcctsRec - A#SaleTax").CellAmount
If (acctsRec > 0.0) Then
    Dim salesSum As Decimal = 0.0
    Dim numPeriodsInYear = api.Time.GetNumPeriodsInYearForSameFrequency(currTimeId)
    For numPeriodsToSubtract As Integer = 0 To numPeriodsInYear
        Dim timeId As Integer
        If numPeriodsToSubtract = 0 Then
            timeId = currTimeId
        Else
            timeId = api.Time.AddTimePeriods(-1 * numPeriodsToSubtract, True)
        End If
        Dim timeName As String = api.Time.GetNameFromId(timeId)
        Dim numDaysInTimePeriod As Integer = api.Time.GetNumDaysInTimePeriod(timeId)
This function employs standard currency translation, using the Entity in the cell’s POV’s local
currency as the source and the parent Entity’s currency as the target, and standard Percent
Consolidation on the Relationship Properties to calculate Share. It does not take intercompany
elimination into account unless that value was already consolidated and stored.
api.Functions.GetEntityAggregationDataCell(memberScript As String,
fxRuleTypeAssetLiabOverride As FxRuleType)
The last four optional settings let you specify alternate named FX Rate Types (e.g.
“AverageRate”) and FX Rule Types (i.e. FxRuleType.Direct or FxRuleType.Periodic) to perform
what-if simulations.
Create a UD8 Member named EntityAggregation and set its Formula Type to Dynamic Calc. Use
this as the UD8 member's Formula:
Return api.Functions.GetEntityAggregationDataCell("UD8#None")
Define columns that display a few key Accounts. Include a base-level Account that supports input,
displayed both for UD8#EntityAggregation and for input, e.g., A#Sales and
A#Sales:UD8#None:Name(“Sales Input”).
Change Cube View settings for General Settings / Common with Can Modify Data and Can
Calculate set to True.
Associate this Cube View with a Form Template which is assigned to a Workflow Profile.
The user enters a number on the Form in the Account that accepts input and clicks Save. The
user will see the dynamically aggregated Entity results without having to run a Consolidation.
This assumes that the UD8 Member with the formula is named EntityDefaultUD1Name. If
XFGetCell is being used in Excel, use None or any valid Member for all the other Dimensions.
'Display the Member name of the entity's DefaultUD1 Member using the Annotation View Members.
If api.View.IsAnnotationType(api.Pov.View.MemberId) Then
    Dim text As String = String.Empty
    Dim udId As Integer = api.Entity.GetDefaultUDMemberId(api.Pov.Entity.MemberId, DimType.UD1.Id)
    If udId <> DimConstants.Unknown Then
        Dim udMember As Member = api.Members.GetMember(DimType.UD1.Id, udId)
        If Not udMember Is Nothing Then
            text = udMember.NameAndDescription
        End If
    End If
    Return text
End If
'If this is a numeric View Member (e.g., Periodic, YTD), display the number from the U8#None Member.
Return api.Data.GetDataCell("U8#None")
The DynamicCalcTextInput Formula Type works the same as a DynamicCalc formula, but it
allows users to input annotations on Cube View cells without having to use the Data Attachment
Dialog. When this formula is used, the user can make annotations on Dynamic Calc Members
following the same method as a non-calculated Member.
Return is never used in a Member Formula for a stored Formula Pass. Instead of returning a
value, the formula calculates and stores many numbers. When running a Calculation, Translation,
or Consolidation, the Member Formula is called for an entire Data Unit. The system does not
dictate which Account, Flow, or User Defined Members the numbers are saved to; this is the
responsibility of the formula writer. Initially, this may be confusing because Member Formulas are
often written in an account’s Formula property, and administrators assume that the formula can
only write to that specific account. However, putting a Member Formula in an account’s Formula
property is only for organizational purposes. When the system calls that formula, it is calculating a
Data Unit and initializes the api with only the Data Unit Dimensions.
Stored Formula passes use Data Buffer math, not the Data Cell math that occurs for the single cell
Dynamic Calc formulas. Stored Formulas are multi-Dimensional. For example, the formula is
executed for an entire Data Unit (e.g., Location1 Entity, USD Consolidation Member, Actual
Scenario, January 2013 time period). That Data Unit is a portion of a Cube where the UD1
Dimension could contain 1,000 products to keep track of sales by product. Therefore, the data for
the Sales2 account could contain a separate number for every Product (i.e., UD1), or if the
Location1 Entity only sells some of the products, there might be 200 numbers for Sales2 and the
other 800 products for Sales2 are NoData. That set of 200 numbers is called a Data Buffer. Data
Buffers can get much larger and more complicated when multiple Dimensions are used for
detailed analyses. However, since the same concepts still apply, it is easier to think about a
smaller set of Dimensions as in this example.
The formula “A#Sales1 = A#Sales2” is equivalent to saying, “Take the 200 numbers stored in the
Sales2 Data Buffer and copy them to a new Data Buffer, but change the account to Sales1, and
then store the new Sales1 Data Buffer in the database.” That one-line formula calculated and
stored an additional 200 numbers that did not exist before the formula was executed.
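Expressed as a stored Member Formula call, that one-line copy is simply:
api.Data.Calculate("A#Sales1 = A#Sales2")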
The formula below reads the Data Buffer for the Sales2 account (200 numbers) and then adds
50.0 to each of those 200 numbers to create a new Data Buffer that also contains 200 numbers.
The account for each of the 200 numbers in the new Data Buffer is changed to Sales1 and it is
then stored in the database.
api.Data.Calculate("A#Sales1 = A#Sales2 + 50.0”)
The newly modified formula below uses three accounts. That formula reads the Data Buffer for the
Sales2 account (200 numbers) and then reads the Data Buffer for the Sales3 account. For
example, the Sales3 account contains 100 numbers broken out by product in UD1, and 25 of
those 100 numbers use the same UD1 Members as some of the numbers from Sales2. The other
75 Sales3 numbers are for other products not used by the Sales2 account. OneStream
automatically combines the numbers from Sales2 and Sales3 and adds the Sales2 and Sales3
numbers that share a common intersection and also adds the additional non-common
intersections. The result is a new Data Buffer containing 275 numbers stored in the database for
the Sales1 account.
api.Data.Calculate("A#Sales1 = A#Sales2 + A#Sales3”)
To copy data from another Cube or Scenario that uses different Dimensionality, an example of the
Scenario formula would be as follows:
'Convert dimensionality
Dim destinationInfo As ExpressionDestinationInfo = api.Data.GetExpressionDestinationInfo("")
Dim sourceDataBuffer As DataBuffer = api.Data.GetDataBuffer( _
    DataApiScriptMethodType.Calculate, "Cb#AnotherCube:S#AnotherScenario", destinationInfo)
Dim convertedDataBuffer As DataBuffer = api.Data.ConvertDataBufferExtendedMembers( _
    "AnotherCube", "AnotherScenario", sourceDataBuffer)
api.Data.SetDataBuffer(convertedDataBuffer, destinationInfo)
To drill down on this formula, use the following example in the Scenario’s Formula for Calculation
Drill Down setting:
If api.Pov.Cube.Name.XFEqualsIgnoreCase("TheDestCube") Then
    Dim result As New DrillDownFormulaResult()
    result.Explanation = "Pseudo-formula: Cb#TheDestCube:S#TheDestScenario = Cb#AnotherCube:S#AnotherScenario"
    result.SourceDataCells.Add("Cb#AnotherCube:S#AnotherScenario")
    Return result
End If
Return Nothing
Out-Of-Balance
The following formula would be implemented as a Member Formula on the Balance account. It
stores the difference of two other accounts.
If ((Not api.Entity.HasChildren()) And (api.Cons.IsLocalCurrencyForEntity())) Then
    api.Data.Calculate("A#Balance = A#2899 - A#5999")
End If
Therefore, the following is incorrect when trying to read a specific Data Cell inside a Stored
Formula:
Dim objDataCell As DataCell = api.Data.GetDataCell("A#Cash")
Instead, all non-Data Unit Dimensions need to be specified. If one or more of those Dimensions
needs to be based on the other data stored in the Data Unit, then use Eval (for details on Eval see
Advanced Stored Formulas using Eval below). Otherwise, the formula will look like this:
Dim objDataCell As DataCell = api.Data.GetDataCell("V#YTD:A#Cash:F#None:O#Import:I#None:U1#None:U2#None:U3#None:U4#None:U5#None:U6#None:U7#None:U8#None")
The syntax above is accurate, but creating that long string for many Dimensions is tedious and error-prone, especially when using functions to determine the Member names and then concatenating multiple strings. Instead, it is recommended to use the MemberscriptBuilder class when creating Member Scripts.
Dim MemberscriptBldr = New MemberscriptBuilder("V#YTD:A#Cash")
MemberscriptBldr.SetFlow("None").SetOrigin("Import").SetIC("None").SetAllUDsToNone()
Dim Memberscript As String = MemberscriptBldr.GetMemberscript()
Dim objDataCell As DataCell = api.Data.GetDataCell(Memberscript)
Advanced Stored Formulas using Eval
The Sales2 numbers need to be copied to Sales1 for "green" Products. In this fictitious example, there is a special tax situation for green products, and the sales numbers for those products need to be isolated into the special Sales1 account. The application uses the UD1 Member's Text1 property to keep track of which products are green.
However, an approach that asks for the Data Unit's current UD1 Member would not work, because Stored Formulas are executed for an entire Data Unit. A Data Unit represents all data for a Cube, Scenario, Entity, Parent, Cons, and Time Member. Since there is no single product (i.e., UD1 Member) for the Data Unit currently being calculated, such a request does not make sense. A Data Unit cannot be asked what the UD1 MemberId is because a Data Unit has data for multiple UD1 Members (200 different products in the Sales2 example).
The solution is using Eval to evaluate the individual Data Cells in a Data Buffer. Put the Eval
keyword around any portion of the api.Data.Calculate function including math statements. After
OneStream reads or calculates the DataBuffer defined within the Eval statement, it executes the
Eval function to give the opportunity to filter the list of Data Cells in the Data Buffer, or to
completely change the list of Data Cells in the Data Buffer. After the Eval function is completed,
OneStream uses the modified Data Buffer to perform the remaining part of the api.Data.Calculate
function.
The example formula needs to be modified by adding the Eval keyword around the A#Sales2 Data Buffer, and a helper function (typically named OnEvalDataBuffer) needs to be implemented that can inspect, filter, and/or change the Data Cells in the Data Buffer. The helper function loops over each of the Sales2 Data Cells (200 in this example). If the Data Cell's UD1 Text1 setting says green, it adds that Data Cell to a new list of result cells. Otherwise, it ignores the Data Cell, causing it to be skipped. The result is a new, modified Data Buffer containing only the Data Cells for green products (i.e., fewer than 200 Data Cells).
api.Data.Calculate("A#Sales1 = Eval(A#Sales2)", AddressOf OnEvalDataBuffer)
NOTE: If using Eval for NoData and ZeroCells, refer to the Remove Functions in
Formulas section for alternative performance enhancing solutions.
For rare situations that Eval cannot handle, use the GetDataBuffer and SetDataBuffer functions directly. GetDataBuffer and SetDataBuffer are more fundamental than Eval; they are part of the internal implementation of the Eval functionality. They allow the user to read some numbers using a Member Script, process or modify each cell in the result, and then save the changes.
In the following example, the UD2 and UD3 Dimensions are being used to analyze data based on
each UD1 Member’s default settings for UD2 and UD3. All data is initially loaded to the
U2#Input:U3#Input Members, but that loaded data needs to be copied to the
U2#DefaultUD1:U3#DefaultUD1 Members. GetDataBuffer needs to be used in order to read the
loaded data because the destination UD2 and UD3 Members can be different for every Data Cell
based on its UD1 Member’s settings. Loop over each Data Cell and use the UD1 Member to get
its default UD2 and default UD3 settings. Then, change the UD2 and UD3 Member IDs for the
Data Cell in the Data Buffer. Finally, after Looping, call SetDataBuffer to save the new numbers.
When using api.Data.Calculate functions with or without Eval, it is important to know to which
Member a formula is being attached. For example, if the formula starts with api.Data.Calculate
(“A#Sales1 = …”), put the formula in the Sales1 account Member’s Formula setting. However, the
formula in this example is not writing to a specific Member. Every Data Cell being saved is
possibly written to a different UD2 and UD3 Member.
Technically speaking, the formula can be put in any Member's Formula property, even a seemingly unrelated Member. If the Formula Pass is set correctly, the formula executes before any other dependent formulas. Assigning stored formulas to Members is for organizational purposes only; the Member does not restrict what the formula can do. However, attaching a formula to an unrelated Member makes the application difficult to maintain and understand. Therefore, a formula like this is typically attached to the Scenario Member because it needs to be processed before most other formula passes. If there are numerous Scenarios, this formula should be put in a Business Rule file, and the Business Rule file should be added to the Cube(s). This can be done under
Application Tab|Cube|Cubes.
See the completed formula using GetDataBuffer and SetDataBuffer below:
'Copy all "U2#Input:U3#Input" numbers for this dataUnit to the corresponding UD1
default Members for UD2 and UD3.
Dim destinationInfo As ExpressionDestinationInfo =
api.Data.GetExpressionDestinationInfo("")
Dim sourceDataBuffer As DataBuffer = api.Data.GetDataBuffer
(DataApiScriptMethodType.Calculate, _
"U2#Input:U3#Input", destinationInfo)
If Not sourceDataBuffer Is Nothing Then
Dim resultDataBuffer As DataBuffer = New DataBuffer()
For Each cell As DataBufferCell In sourceDataBuffer.DataBufferCells.Values
If (Not cell.CellStatus.IsNoData) Then
Dim ud1Id As Integer = cell.DataBufferCellPk.UD1Id
cell.DataBufferCellPk.UD2Id = api.UD1.GetDefaultUDMemberId(ud1Id,
DimType.UD2.Id)
cell.DataBufferCellPk.UD3Id = api.UD1.GetDefaultUDMemberId(ud1Id,
DimType.UD3.Id)
resultDataBuffer.SetCell(api.DbConnApp.SI, cell)
End If
Next
api.Data.SetDataBuffer(resultDataBuffer, destinationInfo)
End If
'This function compares the numbers in 2 DataBuffers and returns a new DataBuffer
'that has a value for each pair of numbers that don't match.
'Note: the lookups below assume DataBufferCells is a dictionary keyed by DataBufferCellPk
'and that the cell amount is exposed as CellAmount.
eventArgs.DataBufferResult.DataBufferCells.Clear()
If (Not eventArgs.DataBuffer1 Is Nothing) And (Not eventArgs.DataBuffer2 Is Nothing) Then
    'For each cell in DataBuffer1, try to find a number for the same intersection in DataBuffer2.
    For Each cell1 As DataBufferCell In eventArgs.DataBuffer1.DataBufferCells.Values
        If (Not cell1.CellStatus.IsNoData) Then
            Dim cell2 As DataBufferCell = Nothing
            eventArgs.DataBuffer2.DataBufferCells.TryGetValue(cell1.DataBufferCellPk, cell2)
            If (cell2 Is Nothing) OrElse (cell2.CellAmount <> cell1.CellAmount) Then
                eventArgs.DataBufferResult.SetCell(api.DbConnApp.SI, cell1)
            End If
        End If
    Next
    'Now, for each cell in DataBuffer2, try to find a number for the same intersection in DataBuffer1.
    For Each cell2 As DataBufferCell In eventArgs.DataBuffer2.DataBufferCells.Values
        If (Not cell2.CellStatus.IsNoData) Then
            Dim cell1 As DataBufferCell = Nothing
            eventArgs.DataBuffer1.DataBufferCells.TryGetValue(cell2.DataBufferCellPk, cell1)
            If cell1 Is Nothing Then
                eventArgs.DataBufferResult.SetCell(api.DbConnApp.SI, cell2)
            End If
        End If
    Next
End If
First, create a new Finance Business Rule (in this example, a Finance Business Rule called
SharedFinanceFunctions was created) and then set the Contains Global Functions for Formulas
property to True. If the Business Rule is only being used to hold Shared Functions, delete most of
the content in the Main function. However, a Main function is still needed even if it is empty.
Next, create a Public Function or Sub in the Business Rule (see the example below). So that edits to the Business Rule properly impact Calculation Status, assign the shared Business Rule to the Cube under
Application Tab|Cube|Cubes. This is recommended.
Use the Business Rule in a Member Formula by creating an instance of the Business Rule and
assigning it to a variable. Then, any of the Business Rule’s Public Functions or Sub can be called.
Namespace OneStream.BusinessRule.Finance.SharedFinanceFunctions
    Public Class MainClass
        'A Main function is required even when the Business Rule only holds shared functions.
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As FinanceRulesApi, ByVal args As FinanceRulesArgs) As Object
            Return Nothing
        End Function
        'Example shared function that can be called from Member Formulas.
        Public Function Test(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As FinanceRulesApi, ByVal args As FinanceRulesArgs) As String
            Try
                Return "This is the result of my Test function!"
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
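A minimal sketch of calling the shared rule from a Member Formula follows. It assumes the Test function signature shown above and that si, globals, and args are in scope in the Member Formula; wrapping the call in a local-currency check is illustrative.
If api.Cons.IsLocalCurrencyForEntity() Then
    'Create an instance of the shared Business Rule and call its public Test function.
    Dim sharedRules As New OneStream.BusinessRule.Finance.SharedFinanceFunctions.MainClass()
    Dim message As String = sharedRules.Test(si, globals, api, args)
End If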
Parent-Child Relationships
The check If api.Cons.IsForeignCurrencyForEntity Then returns True if the currently calculated Consolidation Member also depends on the Parent Entity (i.e., OwnerPreAdj, Share, Elimination, OwnerPostAdj, Top). If there are two different Parent Entities for the same Entity, then there are two different sets of numbers stored for those Consolidation Members.
Drill down can occur on data cells copied from one Scenario to another via formula or Data
Management Sequence. Before displaying the drill results, every cell’s Formula for Calculation
DrillDown Scenario Property is executed. The result determines whether the Scenario Member
will appear as drillable or not. Therefore, use If Statements in the formula to narrow in on the cell’s
Storage Type and/or the POV Members associated with the data copy, so cells do not appear
drillable when they are not. The example below copies data from the Actual Scenario Type to
Budget:
Dim result As New DrillDownFormulaResult()
If args.DrillDownArgs.RequestedDataCell.CellStatus.StorageType = DataCellStorageType.Calculation Then
    'Use this to drill down to data that was copied using a Scenario Formula.
    result.Explanation = "Formula Definition: Actual = Budget"
    result.SourceDataCells.Add("Cb#Houston:E#Houston:S#Actual")
ElseIf args.DrillDownArgs.RequestedDataCell.CellStatus.StorageType = DataCellStorageType.Input Then
    'Use this to drill down to data that was copied using Data Management.
    'result.Explanation = "Data Management Definition: Actual = Budget"
    'result.SourceDataCells.Add("Cb#Houston:E#Houston:S#Actual")
End If
Return result
For example, an application has one Entity, four Accounts, 100 Products (UD1) and 100
Customers (UD2). The user has entered three numbers in the system.
(A#Sales:U1#Prod1:U2#Cust1=22.0, A#Sales:U1#Prod5:U2#Cust1=33.0,
A#Sales:U1#Prod7:U2#Cust100=55.0).
The following cases explain what happens when writing the following Formulas.
Case 1: No Explosion
Use the following statement to set the value of a data intersection equal to another or a constant
value.
api.Data.Calculate("A#Cash1 = A#RestrictedCash")
Cash1 Account will equal what was in the Restricted Cash Account. The only UD1 and UD2
Members populated would be None Members because it is not typical to delineate Cash by
Product, or Customer.
Case 2: Some Explosion
Two of the three numbers (the source Sales numbers for Customer 1) are multiplied by 1.05 and copied from the Sales Account to the Profit Account.
api.Data.Calculate("A#Profit = A#Sales:U2#Cust1 * 1.05")
The user specified a Customer for the source data but did not specify which Customer to use when writing to the destination. This means results are written to every base-level Member of the Customer Dimension, resulting in some data explosion. The Profit account gets a number for every Customer (UD2) using the same Product (UD1) Members as the source numbers. Because there were two source Products and 100 Customers, the Profit account will have 200 numbers.
Not specifying a Member for a Dimension is the same as specifying All for that Dimension. If All is specified on the left side only, data explosion occurs. Some level of data explosion occurs if:
l The left side of an equation specifies All for one or more Dimensions.
Case 4: Large Scale Explosion
api.Data.Calculate("A#Profit = 2.0")
Profit will have 10,000 numbers stored (all Products by all Customers), causing large scale explosion. However, this example is overly simple. If all Dimensions are used, trillions of numbers may be stored. A constant is the same as not specifying a Member for any Dimension on the right side of the equation.
The examples below are the same as Cases 3 and 4 from above, but this time the formulas are written to avoid data explosion.
Case 4:
api.Data.Calculate("A#Profit = 2.0") will fail since the constant of 2.0 implies All Members for each Dimension. In order for this Formula to execute, it needs to be written as api.Data.Calculate("A#Profit:F#All:O#All:I#All:U1#All:U2#All:U3#All:U4#All:U5#All:U6#All:U7#All:U8#All = 2.0"), although a more targeted destination should be chosen instead.
Data explosion can occur when a formula is inadvertently written to read or calculate a Data Buffer and then copy all of its Data Cells to every base-level Member of a Dimension, often causing hundreds of thousands of new numbers to be saved. Calculations should only be applied to intersections where data exists. Fortunately, OneStream protects the user from writing formulas that could result in data explosion, but the concepts are important to understand because it is possible to circumvent those protections.
Now, the formula is changed, so it only copies the sales data for specific customers with UD2
being the Customer Dimension.
api.Data.Calculate("A#Sales1:U2#None = A#Sales2:U2#CustomerX +
A#Sales3:U2#CustomerY”)
The above formula will not result in data explosion because there is the same level of detail (i.e.,
the same Dimensions) specified in the destination as in every source operand. The example now
reads the Sales2 data for CustomerX, adds it to the Sales3 data for CustomerY, and saves the
results in the Sales1 account and the UD2 None Member.
Suppose the UD2 Dimension is specified for the source operands (the right-hand side of the equals sign), but UD2 is not specified for the destination (the left-hand side of the equals sign). When processing this formula, OneStream reads the two Data Buffers and adds them together correctly as before. However, it then needs to assign the combined Data Buffer to the Sales1 Account, and it cannot use U2#CustomerX or U2#CustomerY because the Data Cells were created by adding those together and the system cannot arbitrarily choose one over the other. The system could also have defaulted to the U2#None Member, but history suggests that this type of rule is more often written in error and that the customer did not intend the results to be stored in the U2#None Member. From a maintenance perspective, OneStream feels it is better to explicitly specify U2#None if that is the intended destination Member.
In that situation, OneStream provides an error message notifying the user that data explosion would occur when trying to execute the formula. Without that protection, the formula would copy the source Data Buffer to every base-level UD2 Member because #All is the default for each unspecified Dimension.
To circumvent the error message and force data explosion (please do not do this), explicitly specify U2#All in the destination as shown below. This should be avoided, and #All should never be used in Member Scripts for stored formulas. However, the capability is provided for extremely rare circumstances where that functionality was relied upon in an older product. In such a case, the consultant must carefully analyze the quantity of data and metadata settings to ensure the data explosion results in a manageable number of Data Cells.
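For illustration only, the forced version of the earlier formula would look like the following sketch:
api.Data.Calculate("A#Sales1:U2#All = A#Sales2:U2#CustomerX + A#Sales3:U2#CustomerY")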
CAUTION: This causes Data Explosion! Do not ever use #All explicitly in stored
Member Formulas.
Key Functions
The list below contains the most commonly used functions; however, this is not the complete list
of all available functions. Download the OneStream API Overview Guide and OneStream API
Details and Database Documentation from MarketPlace for detailed Business Rule engine
background, an API guide and information on each database.
Account
Name: Account Type
Function: Public Function GetAccountType(MemberId As Integer) As AccountType
myAccountType = api.Account.GetAccountType(MemberId)
Description: Retrieves the Account type for the Member.

Name: Get Cell Account Type
Function: GetCellLevelAccountType
Description: Retrieves the Account type of the data cell based on its Account and Flow settings.

Name: Get Formula Type
Function: Public Function GetFormulaType(MemberId As Integer) As FormulaType
myFormulaType = api.Account.GetFormulaType(MemberId)
Description: Returns the Formula Type if the Account is calculated.

Name: Get Plug Account
Function: Public Function GetPlugAccount(MemberId As Integer) As Member
myMember = api.Account.GetPlugAccount(MemberId)
Description: Retrieves the plug Account.
Consolidation
Name: Calculate
Function: Public Sub Calculate(Formula As String, Optional onBeforeSetDataBuffer As BeforeSetDataBufferDelegate, Optional userState As Object, Optional arg0 As String, Optional arg1 As String, Optional arg2 As String, Optional arg3 As String, Optional arg4 As String, Optional arg5 As String, Optional arg6 As String, Optional arg7 As String)
api.Data.Calculate(Formula, onBeforeSetDataBuffer, userState, arg0, arg1, arg2, arg3, arg4, arg5, arg6, arg7)
Description: Executes a calculation for a specifically qualified Point of View.

Name: Execute Default Elimination
Function: Public Sub ExecuteDefaultElimination()
api.ExecuteDefaultElimination()
Description: Puts data into the Elimination Member of the Consolidation Dimension.

Name: Second Pass Eliminations
Function: Boolean argument for use in FinanceFunctionType.Calculate rules
Dim bValue As Boolean = args.CalculateArgs.IsSecondPassEliminationCalc
Description: Used to calculate Data Units where Entity members are sibling members. Used to ensure source Entities are fully calculated at Eliminations.

Name: Execute Default Share
Function: Public Sub ExecuteDefaultShare()
api.ExecuteDefaultShare()
Description: Puts data into the Share Member of the Consolidation Dimension.
Data
Name: Allocation
Function: Use Journals for allocations
Description: Allocates data across Dimensions (Entities, User Defined Dimensions, Accounts, etc.) with configurable weighting, all through Journals that can be previewed, are generated, are posted and can be unposted.

Name: Convert Data Buffer
Function: api.Data.ConvertDataBuffer
Description: Modifies Dimension Members for the cells in a Data Buffer using mapping.

Name: Convert Data Buffer Extended Members
Function: api.Data.ConvertDataBufferExtendedMembers
Description: Automatically aggregates the data for extended Members in order to create data cells for Parent Members that are Base-Level Members in the destination Dimensions. This is used when copying data from a source Data Buffer created in another Cube or Scenario where one or more Dimensions have been extended.

Name: Get Data Buffer for Custom Share Calculation
Function: myDataBuffer = api.Data.GetDataBufferForCustomShareCalculation(cubeId, entityId, ParentId, scenarioId, timeId, viewId)
Description: Use this function to assist in custom Share calculations.

Name: Get Data Buffer for Custom Elim Calculation
Function: Public Function GetDataBufferForForCustomElimCalculation(Optional includeICNone As Boolean, Optional includeICPartners As Boolean, Optional combineImportFormsAndAdjConsolidatedIntoElim As Boolean, Optional cubeId As Integer, Optional entityId As Integer, Optional ParentId As Integer, Optional scenarioId As Integer, Optional timeId As Integer, Optional viewId As Integer) As DataBuffer
myDataBuffer = api.Data.GetDataBufferForForCustomElimCalculation(includeICNone, includeICPartners, combineImportFormsAndAdjConsolidatedIntoElim, cubeId, entityId, ParentId, scenarioId, timeId, viewId)
Description: Use this function to assist in Custom Calculations.
Entity
Name: Default Currency
Function: Public Function GetLocalCurrency(Optional EntityId As Integer) As Currency
myCurrency = api.Entity.GetLocalCurrency(EntityId)
Description: Retrieves the assigned Currency for the Entity or Parent.

Name: Is Descendant
Function: Public Function IsDescendent(dimPk As DimPk, ancestorMemberId As Integer, descendentMemberId As Integer, dimDisplayOptions As DimDisplayOptions) As Boolean
myBoolean = api.Members.IsDescendent(dimPk, ancestorMemberId, descendentMemberId, dimDisplayOptions)
Description: Returns if the Member is a Descendant of another Member. (Boolean)

Name: Is Intercompany
Function: Public Function IsIC(Optional EntityId As Integer) As Boolean
myBoolean = api.Entity.IsIC(EntityId)
Description: Returns if the Entity or Account Member is an intercompany Member.
Flow
Name: Switch Sign
Function: Public Function SwitchSign(MemberId As Integer) As Boolean
myBoolean = api.Flow.SwitchSign(MemberId)
Description: Flow Dimension only. Responds as to whether credits are switched to debits for the specified Member for Revenue / Expense Accounts. (Boolean)

Name: Switch Type
Function: Public Function SwitchType(MemberId As Integer) As Boolean
myBoolean = api.Flow.SwitchType(MemberId)
Description: Flow Dimension only. Responds as to whether Account types are switched for the current or specified Member. This can drive translating this Member by a different FX Rate Type. (Boolean)
FX
Name: Currency Type
Function: Public Function GetDefaultCurrencyId(Optional CubeId As Integer) As Integer
myInteger = api.Cubes.GetDefaultCurrencyId(CubeId)
or
Public Function GetCurrency(currencyName As String) As Currency
myCurrency = api.Cons.GetCurrency(currencyName)
Description: Retrieves the currency type for the Cube or the Consolidation Dimension Member.

Name: Current Exchange Rate
Function: Public Function GetStoredFxRate(fxRateType As FxRateType, Optional timeId As Integer, Optional sourceCurrencyId As Integer, Optional destCurrencyId As Integer) As FxRate
myFxRate = api.FxRates.GetStoredFxRate(fxRateType, timeId, sourceCurrencyId, destCurrencyId)
Description: Retrieves the current exchange rate for the specified Entity.

Name: Exchange Rate Calculated
Function: Public Function GetCalculatedFxRate(fxRateType As FxRateType, timeId As Integer) As Decimal
myDecimal = api.FxRates.GetCalculatedFxRate(fxRateType, timeId)
Description: Calculates the exchange rate from the default currency to another.

Name: Get FX Rate Type for Asset / Liability Accounts
Function: Public Function GetFxRateTypeForAssetLiability(Optional CubeId As Integer, Optional ScenarioId As Integer) As FxRateType
myFxRateType = api.FxRates.GetFxRateTypeForAssetLiability(CubeId, ScenarioId)
Description: Retrieves the default Rate Type for Asset and Liability Accounts in this Cube or Scenario (overrides Cube value).
Journals
Name: Journal Postings Allowed
Function: Public Function AllowAdjustments(Optional EntityId As Integer, Optional varyByScenarioTypeId As Integer, Optional varyByTimeId As Integer) As Boolean
myBoolean = api.Entity.AllowAdjustments(EntityId, varyByScenarioTypeId, varyByTimeId)
Description: Results determine if Journal postings are allowed for the Member.

Name: Journal Postings from Children Allowed
Function: Public Function AllowAdjustmentsFromChildren(Optional EntityId As Integer, Optional varyByScenarioTypeId As Integer, Optional varyByTimeId As Integer) As Boolean
myBoolean = api.Entity.AllowAdjustmentsFromChildren(EntityId, varyByScenarioTypeId, varyByTimeId)
Description: Results determine if Journal postings from children are allowed for this Member.
Member
Name: Base Members
Function: HasChildren = False, e.g. A#Root.Children(HasChildren=False)
Description: Determines if the Member is a base Member.

Name: Get Member ID
Function: api.Members.GetMember(dimTypeId, MemberName).MemberPk.MemberID
or
api.POV.Dimension.Memberpk.Memberid
Description: Retrieves the ID for the selected Member.

Name: Get Member Name
Function: api.Members.GetMember(dimTypeId, MemberName).Name
Description: Retrieves the name for the selected Member.

Name: Member Lists
Function: See the section of documentation on Creating Member Lists.
Description: Retrieves the Members from a named list stored in a Business Rule.

Name: Member Name
Function: api.POV.AccountDim, api.POV.EntityDim, etc.
Description: Retrieves the Member name.

Name: Member Name or ID
Function: myMember = api.Members.GetMember(dimTypeId, MemberId)
Get Member ID from Member name:
Public Function GetMember(dimTypeId As Integer, MemberName As String) As Member
myMember = api.Members.GetMember(dimTypeId, MemberName)
Description: Retrieves the Member for the specified ID number or name.

Name: Top Member
Function: api.Account.GetTopMemberForDimType(AccountMemberId, dimTypeForTopMember)
Description: Retrieves the top Member of the selected Dimension.

Name: Is Base
Function: Public Function IsBase(dimPk As DimPk, ancestorMemberId As Integer, baseMemberId As Integer, Optional dimDisplayOptions As DimDisplayOptions) As Boolean
myBoolean = api.Members.IsBase(dimPk, ancestorMemberId, baseMemberId, dimDisplayOptions)
Description: Determines whether the POV Member is the Base of a defined Member.

Name: Get First Common Parent
Function: Public Function GetFirstCommonParent(dimPk As DimPk, topMostMemberId As Integer, MemberIdA As Integer, MemberIdB As Integer, Optional dimDisplayOptions As DimDisplayOptions) As Member
myMember = api.Members.GetFirstCommonParent(dimPk, topMostMemberId, MemberIdA, MemberIdB, dimDisplayOptions)
Description: Returns the first common Parent between multiple Members.
Scenario
Name: Scenario Consolidation View
Function: Public Function GetConsolidationView(Optional ScenarioId As Integer) As ViewMember
myViewMember = api.Scenario.GetConsolidationView(ScenarioId)
Description: Determines if the Scenario's Consolidation View is set to YTD or Periodic.

Name: Scenario Default View
Function: Public Function GetDefaultView(Optional ScenarioId As Integer) As ViewMember
myViewMember = api.Scenario.GetDefaultView(ScenarioId)
Description: Retrieves the Scenario's Default View.

Name: Scenario Input Frequency
Function: Public Function GetInputFrequency(Optional ScenarioId As Integer) As Frequency
myFrequency = api.Scenario.GetInputFrequency(ScenarioId)
Description: Retrieves the Scenario's Input Frequency.
Status
Name: Calc Status
Function: Public Function GetCalcStatus(Optional CubeId As Integer, Optional EntityId As Integer, Optional ParentId As Integer, Optional consId As Integer, Optional ScenarioId As Integer, Optional timeId As Integer) As CalcStatus
myCalcStatus = api.CalcStatus.GetCalcStatus(CubeId, EntityId, ParentId, consId, ScenarioId, timeId)
Description: Retrieves the calculation status for the data intersection.
Time
Name: MemberId for the Month to which a Week belongs
Function: Dim timeIdForMonth As Integer = BRApi.Finance.Time.ConvertIdToClosestIdUsingAnotherFrequency(si, timeIdForWeek, Frequency.Monthly)
Description: Weekly applications only. This determines to which month a specific week belongs. Used within a Finance Business Rule.
Functions
Name: Days Sales Outstanding
Function: Dim cell As DataCell = api.Functions.GetDSODataCell(acctsReceivableMember, salesMember)
Description: Calculates Days Sales Outstanding (see below).

Name: Dynamic Simulation of Consolidation
Function: api.Functions.GetEntityAggregationDataCell(memberScript as string, Optional useStoredAmountsWhereCalcStatusIsOK as Boolean, Optional fxRateTypeRevenueExpOverride as String, Optional fxRuleTypeRevenueExpOverride as String, Optional fxRateTypeAssetLiabOverride as String, Optional fxRuleTypeAssetLiabOverride as String)
Description: Pseudo/approximate consolidation of a data cell (see Dynamic Simulation of Consolidation).

GetStageBlendTextUsingCurrentPOV
GetCustomBlendDataTableUsingCurrentPOV
GetCustomBlendDataTable
For example, the following Stat Account is used to calculate Total Cost of Sales for three months
and is not set to be a Durable storage method (optional argument):
api.data.calculate("A#TOT_COS_LAST3:V#YTD = A#TOT_COS:V#Periodic + A#TOT_
COS:T#POVPrior1:V#Periodic + A#TOT_COS:T#POVPrior2:V#Periodic", False)
An alternative to this overloaded function is to provide Member Filters (all optional) that can be
used to filter the results before saving them to the target to affect fewer intersections, such as only
to be applied to certain Flow members:
api.Data.Calculate(formula, accountFilter, flowFilter, originFilter, icFilter,
ud1Filter, ud2Filter, ud3Filter, ud4Filter, ud5Filter, ud6Filter, ud7Filter,
ud8Filter, onEvalDataBuffer, userState, isDurableCalculatedData)
DataBuffer
When setting a value equal to another value, the item on the left side of the expression is the
value being set, and the item on the right side is the value being queried or calculated to set the
left side. Example: F#BeginBalance = F#EndingBalance:T#POVPrior1 would set the beginning balance in the Flow Dimension to the prior period's ending balance. In a Business Rule, the DestinationInfo is the left side of the equation, while a GetDataBuffer is the right side of the equation.
Dim destinationInfo As ExpressionDestinationInfo = api.Data.GetExpressionDestinationInfo("A#EBITDA:UD1#Tires")
Dim sales As DataBuffer = api.Data.GetDataBuffer("A#Sales:UD1#Tires", destinationInfo)
Dim operatingExpenses As DataBuffer = api.Data.GetDataBuffer("A#OperatingExpenses:UD1#Tires", destinationInfo)
Dim ebitda As DataBuffer = (sales - operatingExpenses)
api.Data.SetDataBuffer(ebitda, destinationInfo)
GetDataBufferUsingFormula
Use an entire math expression to calculate a final data buffer.
Api.Data.GetDataBufferUsingFormula can perform the same data buffer math as api.Data.Calculate, but the result is assigned to a variable, whereas api.Data.Calculate saves the calculated data.
Example
Loop over the contents of myDataBuffer to conditionally change each data cell.
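A minimal sketch of that pattern is shown below; the Member names are illustrative, and it assumes the cell amount is exposed as CellAmount.
Dim myDataBuffer As DataBuffer = api.Data.GetDataBufferUsingFormula("A#Sales2 + A#Sales3")
If Not myDataBuffer Is Nothing Then
    For Each cell As DataBufferCell In myDataBuffer.DataBufferCells.Values
        'Conditionally change each data cell; here, negative amounts are floored at zero.
        If cell.CellAmount < 0 Then cell.CellAmount = 0
    Next
End If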
Formula Variables
Formula Variables provide another way to achieve the same level of flexibility and integration as using Eval. After creating a data buffer variable, name it as a Formula Variable and reference it inside api.Data.Calculate or other calls to api.Data.GetDataBufferUsingFormula. This provides flexibility and can improve performance because the Data Buffer is calculated once and the variable is re-used multiple times.
Example
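A hedged sketch of the pattern, using the SetDataBufferVariable call shown in the ConvertUnbalanced example later in this section (Account names are illustrative):
Dim salesBuffer As DataBuffer = api.Data.GetDataBufferUsingFormula("RemoveZeros(A#Sales2)")
api.Data.FormulaVariables.SetDataBufferVariable("salesBuffer", salesBuffer, True)
'The buffer is calculated once and re-used in multiple calculations via $salesBuffer.
api.Data.Calculate("A#Sales1 = $salesBuffer * 1.05")
api.Data.Calculate("A#SalesPlan = $salesBuffer * 1.10")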
FilterMembers
Use this inside of an api.Data.Calculate or api.Data.GetDataBufferUsingFormula script.
Example
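A hedged illustration of the syntax described below; the Member Filters are illustrative.
api.Data.Calculate("A#Sales1 = FilterMembers((A#Sales2 + A#Sales3), U1#GreenProducts.Base, U2#Cust1)")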
Change a data buffer and only include numbers for the specified Dimensions. The first parameter
is the starting data buffer. This can be a variable name or an entire math equation in parentheses.
There can be as many parameters as needed to specify Member Filters and different Member
Filters can be used for multiple Dimension types. The resulting filtered data buffer will only
contain numbers that match the Members in the filters.
RemoveMembers
This uses the same syntax as FilterMembers, but it takes the data cells away for the specified
Members instead of keeping them.
Example
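A hedged sketch using the same syntax, this time removing the specified Members (names are illustrative):
api.Data.Calculate("A#Sales1 = RemoveMembers(A#Sales2, U2#Cust1)")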
LogDataBuffer
Example
myDataBuffer.LogDataBuffer(api, "MyDataBufferOutput", 1000)
The third parameter (1000) indicates the maximum number of cells to include in the log, which displays what is in the data buffer.
GetCSVString
Example
api.LogMessage(XFErrorLevel.Information, "MyDataBuffer As a CSV String For Excel", myDataBuffer.GetCSVString(api, False, 1000))
The False parameter specifies whether to include Member IDs in the output. Member names are always included.
RemoveZeros
This function is important to use for performance purposes. Use the RemoveZeros function in calculations where there is a substantial amount of NoData or 0.00 cells in Data Units. This can be determined by looking at the Data Unit Statistics when right-clicking on a cell in a Cube View.
Remove NoData
The RemoveNoData function evaluates a source data buffer and removes data cells that have a
cell amount of NoData.
This function is important to use for performance purposes. Use the RemoveNoData function in
calculations where there is a substantial number of cells with a Cell Status of NoData in Data
Units. This can be determined by looking at the Data Unit Statistics when right-clicking on a cell in
a Cube View.
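Hedged examples of wrapping a source Data Buffer with these functions inside api.Data.Calculate (Account names are illustrative):
api.Data.Calculate("A#Sales1 = RemoveZeros(A#Sales2)")
api.Data.Calculate("A#Sales1 = RemoveNoData(A#Sales2)")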
Performance Note: In calculation performance testing for a single Data Unit, the use of RemoveZeros and RemoveNoData within api.Data.Calculate and/or GetDataBufferUsingFormula showed a significant performance advantage. This testing compared the calculation time of a single Data Unit before and after a one-formula change. Times may vary from Data Unit to Data Unit and application to application. However, the use of RemoveZeros and RemoveNoData is highly recommended in formulas where Cube and Dimensionality designs lend themselves to sparse data models. The benefit is not limited to sparse data models, as the same conditions can be identified in dense data models as well.
Error Traps
Try
    If api.POV.Cons.Name = api.POV.GetEntityCurrency() Then
        api.Data.Calculate("A#Cash1 = A#[Restricted Cash] + 70000")
    End If
Catch ex As Exception
    api.LogError(ex)
End Try
Math Functions
Use the math functions built into VB.NET.
Examples:
Rounding: Math.Round()
Several other supported math functions and examples are listed here:
http://msdn.microsoft.com/en-us/library/thc0a116%28v=VS.90%29.aspx
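For instance, a couple of standard .NET math calls that are commonly used in formulas:
Dim rounded As Decimal = Math.Round(123.456D, 2)   'Returns 123.46
Dim magnitude As Decimal = Math.Abs(-70000D)       'Returns 70000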
The following example first declares the two MemberListHeader names of Sample Member List
and PartnerList. It then addresses each MemberList. Sample Member List shows a way to get the
children of an Entity. E#Texas.[Sample Member List] is just another way of saying
E#Texas.Children. In the PartnerList example, E#Root.PartnerList will generate a list that
includes Paris and Nice.
Case Is = FinanceFunctionType.MemberListHeaders
    'Additional logic that defines the names of the two custom Member Lists.
    Dim myMemberListHeaders = New List(Of MemberListHeader)
    myMemberListHeaders.Add(New MemberListHeader("Sample Member List"))
    myMemberListHeaders.Add(New MemberListHeader("PartnerList"))
    Return myMemberListHeaders
Case Is = FinanceFunctionType.MemberList
    If args.MemberListArgs.MemberListName = "Sample Member List" Then
        Dim myMemberListHeader = New MemberListHeader(args.MemberListArgs.MemberListName)
        Dim myMembers = New List(Of Member)()
        Dim myMemberList = New MemberList(myMemberListHeader, myMembers)
        myMembers.AddRange(api.Members.GetChildren(args.MemberListArgs.DimPk, args.MemberListArgs.TopMember.MemberPk.MemberId, args.MemberListArgs.DimDisplayOptions))
        Return myMemberList
    ElseIf args.MemberListArgs.MemberListName = "PartnerList" Then
        Dim myMemberListHeader = New MemberListHeader(args.MemberListArgs.MemberListName)
        Dim myMembers = New List(Of Member)()
        Dim myMemberList = New MemberList(myMemberListHeader, myMembers)
        myMembers.AddRange(api.Members.GetBaseMembers(args.MemberListArgs.DimPk, api.Members.GetMember(args.MemberListArgs.DimPk.DimTypeId, "Paris").MemberPk.MemberId, args.MemberListArgs.DimDisplayOptions))
        myMembers.AddRange(api.Members.GetBaseMembers(args.MemberListArgs.DimPk, api.Members.GetMember(args.MemberListArgs.DimPk.DimTypeId, "Nice").MemberPk.MemberId, args.MemberListArgs.DimDisplayOptions))
        Return myMemberList
    End If
POV Object
In Business Rules, only Data Unit Dimensions are valid in the POV object, not Account or User
Defined Members, so the code below only works in Business Rules:
Dim AcctID As Integer = api.POV.Account.MemberPk.MemberID
SetDataCell
Use this to set a specific Data Cell equal to a value. All non-Data Unit Dimension Members must be specified in the Member Script.
api.Data.SetDataCell(Memberscript, amount, isNoData)
api.Data.SetDataCell("A#[Restricted Cash]:O#Forms:F#None:IC#None:U1#None:U2#None:U3#None:U4#None:U5#None:U6#None:U7#None:U8#None", 50, False)
Translate
The api.data.translate function is the same as api.data.calculate, but aggregates AdjInput data
into the AdjConsolidated Member.
'GetParentCurrency only returns a value when running a translate.
If api.Parameters.FunctionType = FinanceFunctionType.Translate Then
    If api.POV.Cons.Name = api.POV.GetParentCurrency.Name Then
        api.Data.Translate("A#[Restricted Cash] = A#[Restricted Cash]:C#[USD] * 10")
    End If
End If
Unbalanced Math
The Unbalanced math functions are required when performing math with two Data Buffers where
the second Data Buffer needs to specify additional dimensionality. The term Unbalanced is used
because the script for the second Data Buffer can represent a different set of Dimensions from the
other Data Buffer in the api.Data.Calculate text. These functions prevent data explosion.
In the examples below, the first two parameters represent the first and second Data Buffers on
which to perform the function. The third parameter represents the Members to use from the
second Data Buffer when performing math with every intersection in the first Data Buffer. The
math favors the intersections in the first Data Buffer without creating additional intersections.
It is key that the dimensionality of the Target (left side of the equation) matches the dimensionality
of the first data buffer on the right side of the equation (argument 1).
AddUnbalanced
api.Data.Calculate("A#TargetAccount = AddUnbalanced(A#OperatingSales, A#DriverAccount:U2#Global, U2#Global)")
SubtractUnbalanced
api.Data.Calculate("A#TargetAccount = SubtractUnbalanced(A#OperatingSales, A#DriverAccount:U2#Global, U2#Global)")
DivideUnbalanced
api.Data.Calculate("A#TargetAccount = DivideUnbalanced(A#OperatingSales, A#DriverAccount:U2#Global, U2#Global)")
MultiplyUnbalanced
api.Data.Calculate("A#TargetAccount = MultiplyUnbalanced(A#OperatingSales, A#DriverAccount:U2#Global, U2#Global)")
Consider this example. A#OperatingSales has 100 stored records in January for a single Entity.
Because A#OperatingSales has a total of 100 stored values, A#TargetAccount will end up with
100 stored numbers and the amounts would be the values from A#OperatingSales
plus/minus/multiplied/divided by whatever was found at A#DriverAccount:U2#Global for each of
those 100 intersections.
This means that if there was no data in A#OperatingSales:U2#Widgets, then even though the
UD2 Dimension is unspecified in the target and in the first Data Buffer expression, no record
would be created, hence avoiding data explosion. The most common use case would be applying
a driver for some of the Dimensions.
ConvertUnbalanced
This function is related to the Unbalanced Math functions (see Unbalanced Math earlier in this section) and is used to convert a data buffer so that it is balanced with an api.Data.Calculate script where unbalanced math does not apply. This is necessary when using api.Data.GetDataBufferUsingFormula to calculate a data buffer whose script was not balanced to match the formula in another script where the data buffer variable needs to be used.
In the example below, myDataBuffer was created to hold data for all stored accounts, but the subsequent api.Data.Calculate script expects each operand to use a specific account. The ConvertUnbalanced function filters the data buffer to only include the specified account name, and it also converts the data buffer to make it balanced and consistent with the destination. The same data buffer can be re-used multiple times.
Example
Dim myDataBuffer As DataBuffer = api.Data.GetDataBufferUsingFormula("A#All")
api.Data.FormulaVariables.SetDataBufferVariable("myDataBuffer", myDataBuffer, True)
api.Data.Calculate("A#6050 = ConvertUnbalanced($myDataBuffer, A#6000) + ConvertUnbalanced($myDataBuffer, A#3000)")
Calculate
Additional logic during calculation of Entity, Consolidation, Scenario and Time. This sets the value
of one or more values (left side of Formula) equal to another (right side). It then executes a
calculation for a specifically qualified Point of View. This is the most common function used.
There are situations where the Entity being processed must access another Entity’s data. In
situations involving pulling Consolidation dimension Elimination results from other Entities as
siblings, the multi-thread processing of the calculations requires an additional function to ensure
the calculations are complete. Below is an example of this sibling relationship. The Entities Base1,
Base2, etc. are siblings that would be calculated simultaneously during a consolidation:
For this purpose, the Calculate Finance Function Type supports the argument
IsSecondPassEliminationCalc.
During the Calculate process, this function allows Business Rules to execute after the Sibling
Entities have calculated results to the Consolidation Elimination member. Below is a reference to
the Consolidation dimension:
Once all the sibling Entity members are calculated to Elimination, the Business Rules within the
IsSecondPassEliminationCalc will be executed.
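A minimal sketch of guarding that logic inside a Calculate handler, based on the IsSecondPassEliminationCalc argument shown in the Key Functions table:
Select Case api.FunctionType
    Case Is = FinanceFunctionType.Calculate
        Dim isSecondPass As Boolean = args.CalculateArgs.IsSecondPassEliminationCalc
        If isSecondPass Then
            'Sibling Entities have finished calculating their Elimination results,
            'so it is now safe to read those values here.
        End If
End Select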
Translate
Additional logic that uses custom translation.
FXRate
Custom logic used to determine Foreign Exchange rates for any intersection.
Consolidate Share
Additional logic used during the custom calculation of the Share Member.
Consolidate Elimination
Additional logic used during the custom calculation of the Elimination Member.
Custom Calculate
A CustomCalculate Finance Function Type can be used in order to execute a single year custom
calculation via a Dashboard Parameter Component Server Task Action. This is considered a
partial calculation and does not store the calculated data or run the calculation during a
consolidation. Running a custom calculation from a Dashboard will impact calculation status for
the affected data unit even if the data does not change. See Parameter Components in
"Presenting Data With Books, Cube Views and Other Items" on page 576 for more details on how
to assign this type of Finance Rule to a Dashboard. See Data Management for details on creating
this type of Data Management Step. See Parameter Components for more information on passing
arguments to a Custom Calculate function.
Example:
Select Case api.FunctionType
    Case Is = FinanceFunctionType.CustomCalculate
        If args.CustomCalculateArgs.FunctionName.XFEqualsIgnoreCase("FunctionName") Then
            Me.CalcTest(si, globals, api, args)
            api.Data.Calculate("A#TFS2903 = A#10000 + A#69000")
        End If
End Select
The following Business Rule example will make all cells for the Account 6000 read-only. This
should be added to a Business Rule attached to a Cube.
Case Is = FinanceFunctionType.ConditionalInput
If api.Pov.Account.Name.XFEqualsIgnoreCase("6000") Then
Return ConditionalInputResultType.NoInput
End If
Return ConditionalInputResultType.Default
Confirmation Rule
Special logic that runs with Confirmation Rules.
Data Cell
Named GetDataCell calculations that can be reused such as a Better/Worse calculation in Cube
Views.
Select Case api.FunctionType is the expression used when a certain process needs to be isolated so that special logic can run. See the examples below:
Select Case api.FunctionType
    Case Is = FinanceFunctionType.Calculate
        'Additional logic to run with every calculation.
        api.Data.Calculate("A#DACash1 = A#Cash4 + 1")
        api.Data.Calculate("A#2150:F#Movement = V#Periodic:A#5750:F#None")
        If api.MetaData.Cons.Name = "USD" Then
            api.Data.Calculate("A#Cash1 = A#[Restricted Cash]")
        End If
    Case Is = FinanceFunctionType.Translate
        'Additional logic to run with every translation. If api.ExecuteDefaultTranslation
        'is not called, the default translation logic is skipped.
    Case Is = FinanceFunctionType.MemberListHeaders
        'Additional logic that defines the names of the custom Member Lists (see the
        'Member List example earlier in this section).
        Return myMemberListHeaders
    Case Is = FinanceFunctionType.MemberList
        'Additional logic that defines the Members within a custom Member List that is
        'included in Business Rules.
        Dim myMemberListHeader = New MemberListHeader(args.MemberListArgs.MemberListName)
        Dim myMembers = New List(Of Member)()
        Dim myMemberList = New MemberList(myMemberListHeader, myMembers)
        myMembers.AddRange(api.Members.GetChildren(args.MemberListArgs.DimPk, args.MemberListArgs.TopMember.MemberPk.MemberId, args.MemberListArgs.DimDisplayOptions))
        Return myMemberList
End Select
Ultimate Ownership
This function calculates and stores Ultimate Ownership results for every ancestor/child Entity
relationship. The current Entity being calculated is the ancestor Entity, and when storing the
results, the Members in the Intercompany Dimension are used to represent each descendant
Entity. This function assumes the source Direct Ownership numbers are weights and are typically
entered by a user (e.g., numbers between 0.0 and 1.0).
C#OwnerPreAdj:O#AdjInput is using an account that does not consolidate and accepts manual
entry for adjustments in those Parent/Child relationships. It cascades the user-entered amounts
up the Entity tree and multiplies them when a Parent owns part of a Child and that Child owns part
of a Grandchild.
This function is intended to run only for the Entity's Local Consolidation Member, and it uses IC
Members to store the results for each descendant Entity.
Here are the steps to set up Ownership entry and calculations in an application. This functionality
is intended to be used in a separate Control Cube accessed via the main Financial Cube:
UltimateOwnership
This stores the calculated results in Parent Entities using IC Members for every descendant
Entity.
AccountType = BalanceRecurring, FormulaType = Formula Pass 1, Allow Input = False,
IsConsolidated = False, Is IC Account = Conditional
Formula:
If api.Cons.IsLocalCurrencyForEntity() Then
api.Data.CalculateUltimateOwnership("C#OwnerPreAdj:A#DirectOwnership:O#AdjInput",
"A#UltimateOwnership:O#Forms", 1.0)
End If
Column: A#DirectOwnership
Row: E#Root.TreeDescendants

Column: IC#ICEntities.Base
Row: E#Root.TreeDescendants
Consolidation
Consolidation is the process of taking base-level Entities and aggregating them up a hierarchy to their Parent. The hierarchy is analyzed to add efficiencies where applicable, meaning some siblings may be able to run in parallel when they are processed. Calculations and translations are also run during this process.
Consolidations can be launched from the Process step in Workflow, Cube Views or Forms. By
triggering a consolidation, the existing POV is run for that time period and is consolidated for the
respective period along with any of the prior periods for that year. For example, if a consolidation
is triggered for June, it will consolidate January through June.
SHARE (Execute DUCS, FinanceFunctionType = Calculate, only if not using default calc-on-
the-fly)
Default Share is calculated on-the-fly
Example of a Consolidation
Launching a Consolidation
In Workflow Profiles, set up Calculation Definitions to process the appropriate calculation,
translation, or consolidation type. Right-click on the appropriate cell in a Cube View or Form to
view Process options:
Calculate
Runs calculations at the Entity level within the Local Member of the Consolidation Dimension without translating or consolidating.
l Calculate
l Force Calculate
Translate
Runs the Calculate step above at the Entity level and then translates data within the Translated Member of the Consolidation Dimension for each applicable Relationship.
l Translate
l Force Translate
Consolidate
Runs the Calculate and Translate steps and then completes the calculations required all the way up the Consolidation Dimension.
l Consolidate
l Force Consolidate
Force menu items such as Force Consolidate run as if every cell included is marked as requiring calculation, translation or consolidation.
Consolidate and Force Consolidate both check a Parent Member's calculation status and all children of the Parent before consolidating any data. The difference is that Consolidate checks Calculation Status and, if the status is OK, accepts it and continues the consolidation process. Force Consolidate runs as if every Member needs to be consolidated regardless of its actual Calculation Status and does not query Calculation Status at all. While both are optimized, there are some cases where one performs better than the other. See the examples below to learn more about when to use Force Consolidate vs. Consolidate.
In this case Consolidate will perform better than Force Consolidate because every month prior to
December has an OK Calculation Status. This means the data for that month has not changed
since the last consolidation and only the month of December needs to be consolidated. If a Force
Consolidate was used, every calculation would be performed again whether it is necessary or not,
therefore taking longer in the consolidation process.
In this case, Force Consolidate will perform better than Consolidate because every month needs
to be consolidated. If a Consolidate was used, the system would needlessly check each
Calculation Status before calculating each period. A Force Consolidate will calculate all periods
regardless. See Calculation Status for details on the status codes.
NOTE: Forced calculations that run on base entities from open periods will not impact
the status of Parent Entities for closed periods if the data is in an OK state. If a metadata
change occurred and entities are in an OK, MC state, all periods and entities will
recalculate, regardless of Workflow Open/Close state.
The Logging items (such as Force Translation with Logging) trigger additional detailed logging, which can be viewed in the Task Activity area. Drill into a log to see the length of time and details about every calculation. A progress window displays during Consolidations.
Calculation Status
If the existing data set for a POV changes, the calculation status is updated accordingly. A
timestamp table is used, and servers are synchronized to ensure the calculation status is always
accurate.
To explore calculation status, build a Cube View where the View Member in the Point of View is
set to CS, the columns are the time periods, and the rows are set to the desired Entity structure.
For example:
l OK: The data for this intersection has not changed since the last calculation.
l OK, MC: The intersection was calculated but metadata changed due to modifications to
artifacts such as Business Rules associated with this Cube, formulas and FX rates. This is a
clue that if the calculation is run again, the results may not be the same.
l CA: The data for this intersection needs to be calculated because an import was run or data was entered.
l TR: The data needs to be translated.
Currency Translation
Data can be converted from one currency to another. This process utilizes the defined FX
Rates in the FX Rates portion of the product. See Foreign Exchange Rates in "Cubes" on
page 400 for additional details.
The base Entity can be converted to the Parent Entity’s currency if required. The currency
translation is run as needed based on the configuration of the Parent / child currencies as stored
in the Entity Dimension. This feature can be run independently of a consolidation if required.
Right-click in the appropriate cell and the translation option will display.
The Consolidation Dimension is shown above. For example, if an Entity’s currency is Euro, and
data is written to the EUR Member, that value is also displayed in the Local Member. The Local
Member is a pointer to the appropriate local currency Member where the data is stored. If that
European Entity in the screen shot above is consolidated to a company in the UK that has GBP as
their Currency setting under Entities, when the European Entity is consolidated into the UK
Parent, the European Entity’s Translated Member will reference the translated value which is
stored under the GBP Member under Currencies.
Calculation Status determines when data was entered. If the Local currency’s Calculation Status
is CA because the data has not been calculated yet, then the foreign currencies would also need
to be translated and calculated resulting in a TR calc status. If a calculation is done on the Local
Currency in order to make its calc status OK, the foreign currencies are still going to be TR
because that data has not been translated or calculated yet. Also, if a foreign currency has an OK
calculation status, and a foreign currency journal is entered, that currency then becomes TR.
The consolidation process starts at Local, which is the same data as one of the currencies under
the Currencies Member (based on the Entity’s default currency setting). If the Entity is base-level,
the data can be loaded into the Local Consolidation Member using the Import Origin Member, and
data can be typed into the Local Consolidation Member using the Forms Origin Member. If it is a
Parent Entity, the Local Consolidation Member is read only (except for Journals) as it represents
data that has been rolled up from child Entities (using the Import, Forms, and the AdjConsolidated
Origin Members).
Regardless of whether the Entity is base-level or a Parent, Journals can be entered using the
Local, Translated, OwnerPreAdj, OwnerPostAdj, and Any Currency Consolidation Member. For
all the Consolidation Members, Journals are always posted to the AdjInput Origin Member. As
data is rolled up from child Entities into a Parent Entity during the consolidation process, the
AdjConsolidated Origin Member will contain the combined values from the child Entities’ Journals.
Adjustments
Adjustments are created either through Journal entries, or in special instances, through Forms
that do their input as a Journal would. Both do their input into the AdjInput Member of the Origin
Dimension. Adjustments can be made to the following Members of the Consolidation Dimension:
Local, Translated, OwnerPreAdj and OwnerPostAdj.
When a consolidation is run, the AdjInput entries in child Entities are consolidated into the
AdjConsolidated Members in the Parent.
Users can drill down into the Adjustments Member in a Parent to show adjustments made in both
the Parent and child Entities.
Eliminations
When eliminations are calculated, the Elimination Member of both the Consolidation and Origin
Dimensions is updated. The primary purpose of the Elimination Member within the Origin
Dimension is to allow visibility from the top down without those figures getting lost during the
consolidation process.
Users can drill down into the O#Elimination Member in a Parent to show the elimination entries
made in both the Parent and child Entities.
Intercompany Elimination
Intercompany Elimination is the process of cancelling out account balances for intercompany
partners for intercompany accounts with any unresolved balance being placed in a Plug Account.
The Entity structure above belongs to GolfStream, a fictitious golf manufacturer. If the Detroit
Entity sold golf club shafts to Monterey who assembles the final club product, this would be an
intercompany transaction. The following prerequisites must exist for the transaction to eliminate.
l Monterey and Detroit must have their Is IC Entity property set to True.
l The Accounts Intercompany Receivables and Intercompany Payables must be set with the
Is IC Account property set to True and the Plug Account pointed to a third account.
l The intercompany entries must properly note the intercompany partner in the IC Dimension
Member. For example, Detroit would book an entry to Intercompany Receivables and the IC
Member for that entry would be Monterey.
Intercompany Eliminations occur once the values roll up to a common Parent. As the
consolidation begins, Detroit consolidates its values to Michigan and Monterey consolidates its
values to California. An elimination does not occur because they have not yet consolidated their
values to a common Parent. The elimination occurs when Michigan and California are
consolidated into the US common Parent. The two intercompany values will be eliminated at this
level with any discrepancies being posted to the Plug Account.
In another example in Workflow, Houston Heights and South Houston trade with Carlsbad, Dallas,
and Montreal. In the screen shot below, South Houston is shown in green because it is balanced
within an acceptable tolerance of $1. The $0.59 discrepancy is booked to the related Plug
Account.
Houston Heights is shown in red because there is a $229.51 discrepancy. Details on the
discrepancy are shown in the lower part of the screen. Houston Heights’ discrepancy with
Montreal is shown at the bottom in the application reporting currency (USD), Houston Heights’
currency (USD), and the partner currency for Montreal (CAD). Through the right click menu, leave
a status and description that each partner can see. As these values roll up the Entity structure, the
Parent Entities can also see this detail.
See Workflow Profiles in "Workflow" on page 517 for more details on how to set up Intercompany
Matching via Workflow.
In the example, transactions that occurred between the HQ2 Entities eliminate at HQ2.
Transactions between members of HQ1 and HQ2, such as a transaction between Paris and
Hartford, eliminate at the first common parent, Total Company.
Reporting on the results at the Total Company level, Direct returns results that occurred between
the HQ1 and HQ2 groups. Indirect at Total Company would allow reporting on eliminations
outside its direct children, HQ2 eliminations.
Custom Consolidation
If the standard consolidation or translation logic does not meet your project requirements, deploy
custom Business Rules. The first step in this process is defining the Cube’s properties settings.
See "Cubes" on page 400 in Calculation section for Consolidation Algorithm Type and Translation
Algorithm Type settings. The default setting is Standard but you can change this to Custom for
additional flexibility.
l "Application Tools" on page 779 in the Business Rule section on applying Business Rule
logic by Finance Function Type to discover what you can customize.
Reference the Finance Business Rule under any of the Business Rule 1-8 properties for each
Cube on which to use the logic. Also see Data Unit Calculation Sequences (DUCS).
Equity Pickup
OneStream supports Equity Pickup calculations using three different properties located under the
Entity’s settings all of which can vary by Scenario Type. These settings were designed to
implement Equity Pickup using normal Business Rules and formulas.
See Entity Dimension in "Cubes" on page 400 for definitions of each setting.
l Clubs (USD)
l Holding (EUR)
l Houston (USD)
l Carlsbad (USD)
l Frankfurt (EUR)
In the example above, the formulas for Holding need to read calculated data from the other sibling
Entities. Therefore, the Sibling Consolidation Pass property for Holding would be set to Pass 2
causing calculation to occur on all the other sibling Entities before the Holding Entity is calculated.
This allows the formulas for Holding to correctly read calculated data from Houston, Carlsbad, and
Frankfurt. For Entities not involved in Equity Pickup, the (Use Default), or Pass 1 settings for
Sibling Consolidation Pass causes all sibling Entities to be calculated at the same time.
Holding is using a different local currency than its Parent Clubs, but only wants to read data using
the EUR currency. In this situation, the Auto Translation Currencies setting for Houston and
Carlsbad needs to be set to EUR in order to have them automatically translate to EUR when
Clubs is consolidated. Normally, all the sibling Entities translate to the Parent Entity’s local
currency, which in this case is USD, however this setting tells the engine to translate Houston and
Carlsbad to EUR as well during the consolidation. Once the consolidation is complete, Holding’s
formulas, which are calculated in Pass 2, can read data from E#Houston:C#EUR,
E#Carlsbad:C#EUR, and E#Frankfurt:C#EUR.
Sibling Repeat Calculation Pass is designed for circular ownership and may not be used as often
as the other Equity Pickup settings. If this is used, it causes the Entity’s calculation to be repeated
after all the Sibling Calculation Passes have been completed. For example, if there was another
Entity in the structure above named Holding2, it would be set to use a Sibling Calculation Pass of
Pass 3. This would cause its normal calculation to occur after Holding and allow Holding2 to read
calculated data from Holding. Holding could also use a Sibling Repeat Calculation Pass causing it
to be recalculated. In that repeat calculation, Holding could then read calculated data from
Holding2 resulting in circular ownership. When writing formulas, use
api.Args.CalculateArgs.HasRepeatCalc and api.Args.CalculateArgs.IsRepeatCalc to determine if
the engine is currently running the repeat calculation.
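For example, a Finance Business Rule or member formula could branch on these flags so that the cross-Entity Equity Pickup read only happens once the required sibling data exists. The following is only a sketch: the account names (A#EquityPickup, A#NetIncome) are hypothetical, and the member script simply mirrors the E#...:C#EUR reads described above.
'Sketch: defer the Equity Pickup read until the repeat calculation pass (if one is configured).
If api.Args.CalculateArgs.HasRepeatCalc AndAlso Not api.Args.CalculateArgs.IsRepeatCalc Then
    'First pass of an Entity that will be recalculated: skip the cross-Entity read for now.
Else
    'Repeat pass, or no repeat pass configured: calculated sibling data can be read safely.
    api.Data.Calculate("A#EquityPickup = A#NetIncome:E#Holding2:C#EUR")
End If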
Entity Aggregation
Entity Aggregation provides the speed and flexibility required for Budgeting, Planning, and Forecasting. Entity aggregation is simpler and faster than consolidation because it rolls financial data up to a parent level for reporting without the heavy financial and accounting rules that drive a consolidation.
Consider this: "Consolidations are usually crafted to satisfy internal management and external
regulatory agency reporting requirements. The most common, effective way to understand the
core requirements of a consolidation system is to begin with the end in mind and look at the
reports produced by the legacy (or current) system. These usually involve an Income Statement
(Profit & Loss), a Balance Sheet, and a Cash Flow Statement” – OneStream Architect Factory
For base level entities, the Aggregated member displays the data that is stored in the “Local”
member.
For parent entities, the Aggregated member stores the results of the Entity Aggregation process
that occurs on its children.
Aggregation Algorithm (a conceptual sketch follows this list):
l Execute chart logic (business rules and member formulas) on the Local Consolidation
member for all base entities.
l Execute these steps recursively for each Parent and its direct children, from lower-level
entities to the parent entities:
o For each child:
o Translate stored data in memory.
o Calculate the share amount in memory.
o Add the data cells from each child in memory.
o Store the results in the Aggregated member for the parent entity.
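The following is a conceptual sketch of the recursion described above, written against a toy in-memory model rather than the OneStream engine or API. It ignores eliminations and uses a single FX rate and ownership percentage for brevity; the entity and account structures are hypothetical.
Imports System.Collections.Generic
Module AggregationSketch
    Public Class SketchEntity
        Public Name As String
        Public Children As New List(Of SketchEntity)
        Public Local As New Dictionary(Of String, Double)       'account -> stored Local value
        Public Aggregated As New Dictionary(Of String, Double)   'account -> aggregation result
    End Class
    'Recursively aggregate each parent from its direct children, lower levels first.
    Public Sub Aggregate(parent As SketchEntity, fxRate As Double, percentOwned As Double)
        For Each child In parent.Children
            If child.Children.Count > 0 Then Aggregate(child, fxRate, percentOwned)
            'Base entities expose Local data; parents expose the results aggregated above.
            Dim source As Dictionary(Of String, Double) = If(child.Children.Count = 0, child.Local, child.Aggregated)
            For Each cell In source
                'Translate and apply the share amount in memory, then add to the parent's Aggregated member.
                Dim share As Double = cell.Value * fxRate * percentOwned
                If parent.Aggregated.ContainsKey(cell.Key) Then
                    parent.Aggregated(cell.Key) += share
                Else
                    parent.Aggregated(cell.Key) = share
                End If
            Next
        Next
    End Sub
End Module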
Aggregated Dimension
l Rows
o Entity: E#[North America].TreeDescendantsInclusive
l Columns
o Consolidation: C#Aggregated, C#Aggregated:V#CalcStatus, C#Local,
C#Local:V#CalcStatus
o Time: T#2018M1, T#2018M12
Aggregate Jan 2018 by right-clicking on the number and picking Consolidate to aggregate the numbers. Context-aware algorithms are used to recognize the Aggregated member.
Notes:
l No Eliminations
Launching an Aggregation
This is the same process as launching a Consolidation: context-aware algorithms are used once the Aggregated member of the Consolidation Dimension (C#Aggregated) is selected. For example, right-click on the appropriate cell in a Cube View or Form and click Consolidate to launch the process.
Consolidate
l Consolidate
l Force Consolidate
Aggregate Jan 2018 by right-clicking on the number and picking Consolidate to aggregate the numbers. Context-aware algorithms are used to recognize the Aggregated member.
l Different business units require smaller subsets of the data unit because they only need to
report on specific accounts, cost centers, products and so on.
l Business processes require a higher level view of data for budget or planning.
In these cases, data is filtered to focus on business and consumption needs for more efficient data
analysis.
Hybrid data uses data from a source scenario member and displays the results in a target
scenario member. You may need multiple target scenarios based on how you use hybrid source
data in your application.
For information about associated properties, see "Hybrid Scenarios" on page 423. Also see:
Data Bindings
Data bindings determine if data is shared or copied from the source scenario member to the target
scenario member. Sharing data is best if you need to analyze smaller data sets from a large data
unit.
The data results are dynamic and reference source data. This data is not stored to the target
scenario and is read-only. Shared scenarios also share the source scenario’s calculation status
and indicate source data changes. Standard calculations run from a shared scenario also run on
the source scenario member.
Use Copy Data to compare 'What If' Scenarios, or for budget versioning or forecasting. By default, a data copy occurs if a standard calculation associated with the following is run on the target scenario:
l Calculation definitions
The Calculate Data Management Step offers more control over the hybrid source data process.
Set ExecuteCopyAfterCalc Scenario to False and enable the copy execution on the Data
Management Step. If enabled, calculations run on the scenario, but the data copy only occurs if
the calculation runs from the Data Management Step.
Calculations follow the standard sequence and store the data as "calculated". If multiple
calculations run, the previously calculated or copied data is cleared. To preserve existing data, set
the data as "durable" on the target scenario.
Once the target scenario copy finishes, you can modify and adjust data, but the target scenario is
still bound to the source. The data copy occurs whenever the scenario is calculated based on the
scenario setup. Hybrid copies can be chained, so results in a target scenario can be used as the
source in another.
Data Filters
Hybrid source data generates unique data views using different filters, so you can best focus your
data modeling and analysis.
Pre-aggregated members provide a summarized source data view by defining the source parent
members and target base members. Members are filtered before aggregation and cached in
RAM. The smaller data set is then brought in, and the information is shared or copied to the target accounts.
This shares the base accounts for Account 60999. Any zeros in the source data are removed.
2. Specify a Data Binding Type: Copy Input Data from Source Scenario.
This copies the base accounts of Account 69000 and excludes Origin dimension members. Data
is copied from the top of Origin in the source scenario to the Import member of Origin in the target
scenario. The calculation occurs only if run from the Data Management Step, and the copied data
is stored as durable.
Navigation
You can access OneStream through a web browser or via the OneStream Windows App, using an application shortcut deployed from a OneStream website or a version installed directly on a computer. These options are available for both administrators and end users.
The layout is intuitive and easy to use. OnePlace highlights all the major touch points and provides navigation tips.
TIP: Hover over any of the icons and a tool tip will temporarily display.
l The Windows app automatically updates whenever the application server version is
updated.
l It offers robust spreadsheet functionality, so you may not need to install the Excel Add-in.
1. Click the launch icon in the upper right corner of the window from any 4.2 or greater web-
based instance.
You can also launch the Windows App via Microsoft Edge with the URL associated with the latest
version of the server. First, enable ClickOnce support in Microsoft Edge (Chromium).
See https://docs.microsoft.com/en-us/deployedge/microsoft-edge-policies#clickonceenabled.
This icon can now be used to launch the Windows App from your desktop.
1. On the Logon page, click in Server Address to add the URL of the server to connect
to.
The Manage Connections window displays.
Logging In
1. In Server Address on the OneStream Logon screen, specify the URL or a client
connection, then click Connect. See "Defining Server Connections" on the previous page.
a. Click External Provider Sign In. Your IdP Login displays on a new browser
tab.
3. On the OneStream Logon screen, select an application to use and click Open
Application.
OnePlace Layout
1. Navigation Pane - This section covers the three tabs that are available: System, Application
and OnePlace.
Each bar can be displayed by pinning it to the screen, or by Auto Hide. Additional details can be
found further down in this section.
2. Home - Click the large OneStream icon to navigate to the user’s set home screen. See
Page Setting Options below for more information on setting home screens.
4. Application Tray
Back
This will navigate back to the last open screen and continue to navigate back to each previously
opened screen.
File Explorer
This opens the File Explorer dialog allowing users to access public folders, documents and the
File Share.
Forward
This works with the Back icon and navigates forward to the screen a user was previously using.
Environment Name
This is a customized environment name which can differ across environments (e.g., Development, Test, and Production). Specify an environment name and color in the application server configuration file. See Installation and Configuration.
Logon/Logoff Icon
Upon selecting the Logoff icon, the user will be prompted to End Session or Change Application.
End Session logs the user off and removes the saved password from the logon screen. Change
Application keeps the user logged in and allows him/her to select a new application from the drop-
down screen.
Task Activity
This displays all tasks performed within the application. See Task Activity in "Logging" on
page 942 for more details on this feature.
Refresh Application
Refreshes the Application and checks the first open tab. If it is an Application tab, the view will
change to that tab. If it is not an Application tab, the view will stay on the selected tab but will
change the main active tab to Application.
Clipboard
Drag and drop items such as data cells, text, and rule scripts to the clipboard in order to reuse them in other areas. Users can store up to ten items on the clipboard.
Help
This opens OneStream documentation for Platform and MarketPlace.
c. Create Shortcut: This will create a shortcut for the current page and store it in the
user’s Favorites Folder. When this shortcut is selected from the user’s folder, it will
navigate to that page. This can be used for specific Cube Views, Dashboards, or
Application/System pages.
d. Set Current Page as Home Page: This setting controls the default settings for both
the page display as well as the pinning of the Navigation and Point-of-View panels
when you sign on. Click the OneStream icon in the application tray in order to
navigate to the home page from any other screen. This functionality works in both
the browser and the OneStream Windows App versions. Changes made here within
one environment carry over to the other.
This also controls the pinning of the Navigation Bar and the POV Bar.
This generates a UserAppSettings.XML file that has the following pin options:
<SLHomePagePinNavPane>TrueValue</SLHomePagePinNavPane>
<SLHomePagePinPovPane>FalseValue</SLHomePagePinPovPane>
These control whether the user's Navigation and Point-of-View bars are pinned by default when logging into the application.
5. Workflow Bar
This section displays exactly where the user is in the Workflow process. Based on the
Workflow Profile, this can be configured as a Certifier or as a Data Loader. The example
above is configured as a Data Loader on the Validate task of the Workflow. The color green
indicates a completed task, blue indicates incomplete tasks. The white OneStream icon
indicates what task is currently in view.
6. Page Refresh
This section covers the local refresh and the ability to close a page.
Refresh Page
This refreshes the active page.
Close Page
This closes the active page.
7. Toolbar
Similar to the Workflow bar, this displays the items for a Data Loader or for a Reviewer /
Certifier. The example below is configured as a Data Loader during the Import task.
8. Context Pane
This bar is where the Point of View is set. This is an important concept because it
determines to which Dimensions the users will have access for this application view.
Additional details can be found further down in this section.
9. Grid
This displays the active window’s contents for the functionality being executed.
l Pages: Navigate through the pages by choosing a page, or by clicking the next, first, or last page.
l Tabs: Once a tab is open, simply click it to save time navigating. Clicking the New tab allows two or more like tabs to be opened. For example, two Cube View tabs can be opened at once.
10. Pages
NOTE: Right-click on any of these opened tabs for more page setting options.
The following icons are only located in the OneStream Windows App.
Zoom Options
This controls the zoom settings when working in the OneStream Windows App.
Point of View
The Point of View is located on the right side of the application. This tab can be docked by clicking
the pin button, otherwise, it will disappear when clicking anywhere in the main page.
There are three primary sections defined under the Point of View.
Global POV
This is the Point of View for the whole application. This is set by the administrator and will not be
active for the end user to update. This includes Scenario and Time.
Workflow POV
This has the same configuration as the Workflow area of the OnePlace Tab. It will display the
active Point of View; however it will not be active for the end user to update. This includes
Workflow, Scenario, and Time. The Time displayed is based on the Time Dimension Profile
associated with the Cube assigned to the Workflow Profile.
Cube POV
This is active and available to be updated by the end user. Each Dimension will need to be set
based on the information or activity a user needs to perform. Hover over any of the Dimensions
and a tool tip will display the Dimension type. To update a Dimension, select one and a Select User POV box will appear. This box lets the user pick the Cube and Dimension, apply a Member Filter, and search.
TIP: Save a commonly used POV by right clicking on the Cube POV and selecting Save
Cube POV to Favorites. This saves the POV under Application|Documents|Users|(User
Name)|Favorites allowing it to be used on any Cube View, grid, or Dashboard.
The Point-of-View panel supports the new User Defined Descriptions. Hovering over a selectable
point-of-view member will display the defined description. Dimensions which are fixed, not
selectable, will display the defined description and append “Not Used by Current Page”. See
Application Properties and then User Defined Dimensions (Descriptions) for more information.
Navigating Mobile
Mobile Toolbar
Click this to display or hide the OneStream Mobile POV context pane. The Mobile POV is
derived from the OneStream POV. The Mobile POV is interactive and can be changed to view
Cube Views or Workflow Status.
Presentation
NOTE: OneStream application security applies to all Dashboards, Cube Views and
Documents in Mobile. If users do not have access to particular reports in the
application, they will not have access to them in the Mobile web interface.
Dashboards
Select this to run Mobile Dashboards. Dashboards display several different views of OneStream
data. These Mobile specific Dashboards are designed in the Application, but can be viewed from
Mobile devices such as cell phones and tablets. See Application Dashboards in "Presenting Data
With Books, Cube Views and Other Items" on page 576.
Cube Views
Select this to view any Cube View currently saved in OnePlace in the application. If users can
change certain POV Members in the application, they can control those same Members in their
Mobile POV in order to have multiple views of their Cube View data.
Documents
Select this to launch and view the files saved in the application File Explorer. These documents
could include items such as Extensible Documents or Excel analysis workbooks.
NOTE: Users cannot launch any type of process from the Mobile web interface.
Processes include things such as running Business Rules, Calculations, Data
Management Steps or Sequences, etc. Mobile Dashboards and Cube Views are read-
only.
Status
Workflow
Select this to get the current Workflow Status of the Workflow set in the Application. The
Workflow Status provides details on each Workflow Input Type and shows the percentage OK, In
Process, Not Started, or Error. To view a different Workflow, click and select a new Workflow in
the Workflow POV. Navigate back to the Workflow Status page to view the new Workflow’s
details. Workflow security applies to OneStream Mobile.
Servers
Select this for a read-only view of the OneStream Application Servers’ Status. This provides
details about each application server such as the Environment, Web Server Name and
Connection Status.
Activity
Select this to see all the task activities in the application. End Users can view their own tasks and
administrators can view every task performed by any user.
Settings
Workflow
The Workflow Engine is the coordinator of all activity in the system. It protects you from having to
deal with the complexities of an advanced multi-Dimension Analytic Model. Profiles and Channels
are the basic building blocks of a solid workflow management structure. Data Units represent
units of work for loading, clearing, calculating, storing, and locking data within the multi-
Dimensional engine.
Workflow
Workflow is the overall system manager coordinating all end user activities while guaranteeing the
quality and transparency of source data used to feed the Analytic Models contained in OneStream
applications.
l Manage and enforce the quality process along with the data certification process.
The primary reason Workflow exists is the care and feeding of Analytic Cubes. Therefore, before a Workflow hierarchy can be created, at least one Cube marked as Top Level for Workflow must exist in the application.
A Cube is an Analytic Model that consists of eighteen Dimensions. System designers can quickly and reliably create multiple Cubes within an application, and the Cubes share a Dimension Library that enables reuse of Dimensions and Dimension calculation logic. This capability enables system designers to create optimal Analytic Models tailored to the specific business process without having to move data in and out of the application.
The Workflow Engine’s job is to provide a common and seamless user experience no matter how
an application designer chooses to implement the underlying Analytic Model(s) (Cubes). This
capability enables application designers to create organized applications because they are free
from worrying about how to train users on the specific Analytic Models to access during the
business process.
When a Cube is defined, it can be marked as a Top-Level Cube for Workflow. This setting tells
the Workflow Engine the Cube is eligible for Workflow management. Each Cube can only
participate in one Workflow Management Structure, which means if a Cube is referenced by
another higher-level Cube, it cannot be set as a Top-Level Cube for Workflow.
Once a Cube is tagged as a Top-Level Cube for Workflow, the Workflow Engine will recognize the
Cube and allow a Workflow Management Structure to be created for the Cube based on the Suffix
Values for varying Workflow by Scenario Type.
Building a Workflow Management Structure for a Cube starts with defining how many variations
there will be in the processes used to feed the Cube (Suffix for varying Workflow by Scenario
Type). Workflow hierarchy variation is aligned with the ten Scenario Types that enable Cube
extensibility. By default, no suffix values are created which means the Cube can only have one
hierarchy in its Workflow Management Structure.
The table below details the three business processes for the Cube and the proposed suffix values
that will identify the business process in the Workflow Management Structure.
Next, the suffix values need to be assigned to the Scenario Type that will be used to capture data
for each process. The assignment process will vary from application to application and will be
dependent on the requirements of each business process.
The Workflow Engine can now manage three different collection hierarchies for the
FinancialReporting Cube. Once Workflow processing has begun and data is being actively
collected for the Cube, the Workflow Management Structure (suffix definitions) cannot be
changed.
Workflow can be thought of as an outline (Workflow Profile Hierarchy) of the business process being modeled and analyzed. This section describes the components of a Workflow Management Structure and the relationship between Workflow Profiles, Entity Members, and Origin Members.
Open State
The Workflow is available for usage and locking is controlled at the individual Workflow Profile
level. In addition, all Workflow hierarchy structure information is read from the current Workflow hierarchy as it appears on the Workflow Profiles management screen. This also means the Workflow hierarchy is accessed from memory (cache) rather than being read from the database, which provides very fast read performance.
Closed State
The act of closing a Workflow hierarchy triggers the Workflow Engine to place a high-level lock on
the Workflow. This means individual Workflow Profile lock status values do not matter, and the Workflow level will display an indicator showing a closed Workflow. In addition, the Workflow Engine
will take a snapshot of the current Workflow hierarchy structure being managed from the Workflow
Profiles management screen. It will store it in a historical audit table for the Scenario and time
being closed. This also means the Workflow hierarchy is not accessed from memory (Cache) as
would be the case with a Workflow in an open state. A closed Workflow must be read from the
database rather than memory because it is considered a point in time snapshot stored in a
historical table. This is a performance penalty noticed when reading the entire closed Workflow
hierarchy for a Scenario and time. Workflow hierarchies should only be closed if major changes
are being made to the Workflow hierarchy and the structure of a Cube and historical hierarchy
relationships need to be preserved.
Named Dependents
Review Workflow Profiles have a unique ability to establish a dependency on the status of Input
Parent Profiles that are not their direct descendant in the Workflow hierarchy structure. This
concept is referred to as a Named Dependent relationship and was developed to accommodate
situations where a single Input Parent Profile loads data for many legal Entities that have very
different responsibility structures from a sign-off perspective. This situation is very common when
an organization utilizes a Shared Services infrastructure strategy.
All Input Parent Profiles must have at least one Input Child of each type (Import, Form and
Adjustment). This requirement exists because of the relationship between Input Child Profiles
and the Origin Dimension Members. Input Child Profiles can be thought of as a specialized
extension of the Input Parent with added intelligence and control features particular to data
updating.
Default Input
Default Input Parent Profiles are special because they cannot be created directly. They are
automatically created whenever a Cube Root Profile is created.
Unassigned Entities
The primary purpose of the Default Input Profile type is to serve as the initial relationship between
the Entities belonging to a Cube Root Profile and the Workflow hierarchy. Entities cannot be
explicitly assigned to a Default Input Profile. Any Entity Member under the Cube with which the Default Profile is associated, and that is not explicitly assigned to a Parent Input Profile or Base Input Profile, is implicitly assigned to the Default Profile.
Parent Input
Parent Input Profiles are used to allow adjustments to Parent Entities in the Cube. Adjustments to
a Parent Entity are only allowed via Forms or AdjInput Members of the Origin Dimension.
Consequently, Import Child Profiles are not allowed to be used with a Parent Input Profile. The
Workflow Engine will automatically create an Import Child for each Parent Input Profile, but the
Import Child will be forced to be inactive (Profile Active = False).
Assigned Entities
The primary purpose of the Parent Input Profile type is to establish a relationship between Parent
Entities that require the ability to accept adjustments and the Workflow hierarchy. Parent Entities
do not need to be explicitly assigned to a Parent Input Workflow Profile unless the Parent Entity
requires the ability to be adjusted. Most Parent Entities exist as unassigned Entities and therefore
are controlled by the Default Input Profile.
Base Input
Base Input Profiles are used to control all methods of data entry for Base Entities in the Cube.
This is the most common Workflow Profile type and can be thought of as the workhorse of data
update management. Base Input Parents define the Entities that can be updated, the Cube being
targeted, and all the Import Child types that will participate in the input scheme.
Assigned Entities
The primary purpose of the Base Input Profile type is to establish a relationship between Base Entities that need to receive data from an external source and the Workflow hierarchy.
For example, if a Forms Input Child has the Profile Active set to False, and there are no other
active Input Child siblings of the type Form, data entry forms and Excel (SetCells Function or
Cube Views) cannot be used to set data cell values for the Entities assigned to the Input Parent
Profile of the Forms Input Child. The same technique can be used to enable/disable Import and
Adjustment input types.
Import Child
An Import Child defines and controls how data is imported into the Cube (See Data Loading for
more details). Each Import Child is bound to the Data Source and Transformation Rule Profile
which will define its Workflow behavior during the Import Workflow Step.
Forms Child
A Form Child defines and controls how data is manually entered in the Cube. Each Form Child is
bound to an Input Forms Profile that will define its Workflow behavior during the Input Forms Step.
It is possible to use a data entry form to update the AdjInput Member of the Origin Dimension, but
this requires the account being updated to have its Adjustment Type set to Data Entry rather than
the default value of Journal.
Adjustment Child
A Journal Child defines and controls how data is entered via journal into the Cube. Each Journal Child is bound to a Journal Template Profile that will define its Workflow behavior during the Input Journals Step.
E#Root.WFProfileEntities
When used in a Member Filter, this expression returns all Entities associated with the selected
Workflow Unit.
E#Root.WFCalculationEntities
When used in a Member Filter, this expression returns all Entities defined as part of the
Calculation Definitions for the selected Workflow Unit.
E#Root.WFConfirmationEntities
When used in a Member Filter, this expression returns all Entities defined as part of the
Calculation Definitions when the Confirmed Switch is set to True for the selected Workflow Unit.
If there are multiple Import Workflow Profiles, the engine automatically handles clearing and merging data. For example, if there are two Import Workflow Profiles and an import has already been performed on one of them, then when the second import is performed and the user clicks Load, all the target Entities are cleared. The two import data sets are merged, and a replace-style load is made to the financial model.
Select the Workflow Profile to configure, expand the Workflow Profile, and then select the Origin
to be configured. In this example, Houston.Import is YTD and Houston.Sales Detail is MTD.
Next, choose the Scenario type where this behavior is needed. Select (Default) if this behavior is
required for all Scenario Types.
As a result, the Flow Type Accounts will be processed as Periodic rather than YTD upon data
submission or when no data is loaded for Flow Accounts. The Balance Accounts are forced to be
processed as YTD upon data submission.
Calculation Definitions are an incredibly valuable tool for the application designer because they take the guesswork out of what needs to be calculated and when. During the Workflow hierarchy
design process, the Calculation Definitions can be used to execute combinations of calculations,
translations, and consolidations at Workflow completion points.
When defining Calculation Definitions for a Workflow Profile, the Workflow Entity Relationship can
be leveraged. This means predefined variables can be used to execute calculations for Entities
assigned directly to a Workflow Profile (Input Parent types) or related to the Workflow Profile
through its dependency chain.
Dependent Entities
This defines a calculation for all Entities that are assigned to the dependent Workflow Profiles of the Review Profile. This list includes all Entities assigned to any Named Dependent Profiles as well.
Assigned Entities
This defines a calculation for all Entities directly assigned to the Workflow Profile.
Loaded Entities
This defines a calculation for all Entities that are imported by the Import Child Workflow Profiles and are dependents of the Input Parent.
Each Calculation Definition record has a Confirmed switch associated with it. This switch
determines whether the Entities defined by a Calculation Definition should be subjected to the
Confirmation Workflow Step. It also gives the application designer control over which Entities are
subject to the Confirmation Rule validation process.
Filter Value
Assign a Data Management Sequence to Calculation Definitions by setting the name of the
Sequence under the Filter Value and setting the Calc Type to No Calculate. Next, set up a
DataQualityEventHandler Extensibility Business Rule to read the Sequence name assigned to
the filter and in turn, execute the Data Management sequence during the Process Cube task in
the Workflow.
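A minimal sketch of that event handler logic follows. It assumes the Sequence name has already been read from the Calculation Definition's Filter Value (shown here as a hard-coded hypothetical value) and that BRApi.Utilities.StartDataMgmtSequence is available with the signature shown; verify both assumptions against the API documentation for your release.
'Sketch inside a DataQualityEventHandler Extensibility Business Rule (Process Cube task).
Dim dmSequenceName As String = "SEQ_ProcessCube_Houston" 'hypothetical name taken from the Filter Value
If Not String.IsNullOrEmpty(dmSequenceName) Then
    'Launch the Data Management Sequence named on the Calculation Definition (Calc Type = No Calculate).
    BRApi.Utilities.StartDataMgmtSequence(si, dmSequenceName, String.Empty)
End If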
In addition, Workflow Channels are a mechanism used to increase the granularity of the standard
Data Unit. They provide application designers with the ability to clear, load, and lock data at the
intersection of accounts and the Members of a User Defined Dimension. See Data Units for more
details.
There are three predefined Workflow Channel Members used as defaults when an application is being built: Standard, NoDataLock, and AllChannelInput. New metadata Members and new
Workflow Profile Input Profiles are configured with default Workflow Channel Members and have
no effect on the granularity of application Data Units, or the Workflow processes associated with
clearing, loading, and locking Data Units.
2. Tag each Account Member with the proper group to which it belongs. The Workflow
Channel settings can vary by Scenario Type for both Metadata Members and Workflow
Profile Members.
3. Tag each Workflow Profile Input Profile with the Workflow Channel it should control. This
step hard wires the Workflow Profile to control data clearing, loading, and locking behaviors
of the Metadata Members associated with the assigned Workflow Channel.
The diagram below details the steps to set up a metadata and Workflow structure that isolates
process management for groups of User Defined Members and binds specific Workflow Profiles
to control the care and feeding of these groups (data clearing, data loading, and data locking).
Based on the examples above, if both Accounts and a User Defined Dimension are making use of
Workflow Channel tagging, a situation can occur where the Workflow Channel assigned to a
Workflow Profile is incompatible with either the Workflow Channel assigned to the Account
Dimension or the one assigned to the User Defined Dimension.
1. Assign the AllChannelInput Member to the Workflow Profile’s Workflow Channel. This will
allow the Workflow Profile to function in a more generic manner by not limiting its usage to metadata Members tagged with a specific Workflow Channel. The only negative
consequence of this approach is the Workflow locking for Workflow Profiles using this
setting reverts to the Origin Member level which is less granular than the Workflow Channel
level.
2. Sacrifice the Workflow Channel assignment of either the Account or the User Defined
Dimension by assigning the NoDataLock Member. Assigning this Member will basically
take it out of the Workflow Channel process and allow it to function with any Workflow
Profile no matter what Workflow Channel is assigned to the Workflow Profile.
It checks to see if there are any other sibling Import Workflow Profiles. If there are, it will then
check for overlapping Data Units within the proposed Stage load.
Workflow Channels Not in Use (or Same Channel Applied to Multiple Import Children)
In this case, it is possible that the two import siblings are attempting to load to the same
intersections. Consequently, the Workflow Engine will evaluate the sibling import data content in
order to determine if they overlap and are trying to write to the same data unit.
No Overlap
Load the Workflow Profile being processed because it is not overwriting sibling data.
Yes Overlap
Clear all data for all assigned Entities for the Import Origin Member. Next, reload the first Import
Child Workflow using Replace. Then, reload the second, third, etc., Import Children in order, so
the ultimate value in overlapped data units is the cumulative value from all Import Siblings.
Data Units
A data unit is used to load, clear, calculate, store, and lock data in the multi-dimensional engine. OneStream provides the cube data unit, the workflow data unit and, when workflow channels are in use, the workflow channel data unit. The cube data unit consists of the following dimensions and supports the following actions.
Dimensions:
l Cube
l Entity
l Parent
l Consolidation
l Scenario
l Time
Actions:
l Clear data
l Load data
l Copy data
l Calculate
l Translate
l Consolidate
The workflow data unit is the default level used by the workflow engine to control, load, clear, and
lock data. Workflow level data loads from the staging data mart to the cube, and is cleared and
locked at a granularity level that includes the account dimension by default.
For example, if two import workflow profiles are not siblings of the same input parent, but load to
the same entity, scenario and time dimensions, the data loads and clears at the account level.
However, if these two workflow profiles load the same accounts, the last workflow profile to load is
used. If these workflow profiles load to different accounts, then data loads for both workflow
profiles.
Dimensions:
l Cube
l Entity
l Parent
l Time
l Consolidation
l Scenario
l Account
Actions:
l Clear data
l Load data
l Lock data
The user-defined dimension that extends the data unit is specified at the application level on the Application Properties screen. You can only use one user-defined dimension per application, so carefully consider which user-defined dimension to select in relation to the application's overall dimension design.
For example, cost center and version user-defined dimensions are commonly used in a workflow
channel data unit. These user-defined dimensions are frequently included in a workflow channel
data unit because they represent data slices that align with data collection and locking
requirements of the Budget and Forecast business processes.
Dimensions:
l Cube
l Entity
l Parent
l Time
l Consolidation
l Scenario
l Account
l User Defined (x)
Actions:
l Clear data
l Load data
l Lock data
Data Loading
This section describes how the workflow engine loads data for each relationship between the workflow and data units. The OneStream workflow engine controls data loading from the staging data mart to the analytic model. The workflow engine includes intelligence about what data to load, and how to load it, for each workflow unit; the analytic cube being loaded gets this from the binding relationship between its input parent workflow profiles and base entities.
In addition, the workflow engine uses the origin dimension's import member exclusively when
loading data. This predefined relationship provides a built-in level of data protection between
imported data, manual data entry, and journal adjustments. The workflow engine manages how
data is placed into the origin dimension's Import, Forms, and AdjInput members. The workflow
engine also forces imported data to use the local member of the consolidation dimension.
The workflow engine always starts with a workflow data unit to control clearing, loading, and
locking data for its entities. A workflow channel data unit is used if workflow channels are active in
the workflow unit's analytic model relationship.
l Explicitly locked
l Evaluate previously loaded data units to list data units to clear during the load.
l Clear Workflow data units loaded by the workflow unit. A workflow data unit
considers accounts and cube data unit standard members, so data clears at an
account level by default.
l Clear all workflow channel data units loaded by the workflow unit. User-defined members and workflow data unit members are standard members of a workflow channel data unit, so data clears at a user-defined member, entity, scenario, time, and account level by default.
l Data loads using parallel processing by entity. Multiple entities process at the same
time.
This workflow profile configuration has only one import child profile under the parent (Frankfurt).
The workflow engine follows basic clear and replace data loading steps described in "Data Load
Execution Steps (Clear and Replace)" on page 153.
This workflow profile configuration has more than one import child profile under the parent (San
Diego). The workflow engine must perform extra steps to determine how to load data in the child
profiles.
In this case the two import child profiles may try to load the same cube or workflow data unit
because they have the same input parent workflow profile and are trying to load the same entities.
When the import GL data or import sales detail workflow profiles execute the data load step, the following process determines how to correctly load data from both profiles to the cube.
1. Check for overlapped data units between import child siblings (import GL data or import
sales detail).
This workflow profile configuration has a central input parent profile that may load data assigned
to another workflow profile. The Central HR Load workflow profile must have the Can Load
Unrelated Entities set to True, so the workflow engine will let it try to write data for unassigned
entities.
In this situation, when either Central HR Load or Houston executes a data load, the basic clear
and replace data loading steps described in the previous section are used. However, Central HR
Load does not control any entities so it checks and abides by the workflow and locking status of
the workflow profiles that own the entities. For example, the workflow engine disallows updates if
Houston is certified and/or locked and Central HR Load tries to load an entity owned by Houston.
Certain application designs may require a workflow parent to have multiple sibling import
channels. These designs typically use parallel processing techniques to load multiple non-
overlapping sibling import children. The Load Overlapped Siblings setting on the parent boosts parallel processing performance in these workflow designs by eliminating overlapping checks between sibling channels. This is only appropriate when the sibling channels' data sources do not contain overlapping data unit data records. This switch lets applications optimize data partitioning with parallel processing using the fewest workflow profiles.
l True: Default behavior, sibling channels check for overlapping data units.
l False: Do not check sibling channels for overlapping data units. If an overlapping condition
occurs, the last processed channel overwrites the prior.
Data Locking
OneStream uses a locking strategy different from that of other analytic systems. Because of the integrated Workflow Controller, all data control tasks, including Entity data locking control, are delegated to the Workflow Engine.
The Workflow Engine creates a bidirectional link between the Workflow Engine, the Staging
Engine, and the Analytic Engine. This two-way link creates a much stronger control structure
compared to systems with separate Workflow control modules that only interact with an Analytic
Model in a unidirectional control structure.
This is an important control feature because if a user of the system attempts to update a data cell
directly after all Workflow processing is completed, the Analytic Engine must check with the
Workflow Engine in order to determine if the cell can be updated. In a unidirectional control
structure, the data cell could be unlocked and updated regardless of the Workflow control state, creating a break in the process audit chain. In OneStream, this situation cannot occur because every input data cell is associated with a Workflow Unit. Any attempt to update a data cell directly (Data Entry Form or Excel, etc.) triggers the Workflow Engine to validate the data cell’s Workflow state by resolving its Workflow status through the Entity assignment relationship mechanism.
Locking data means the data is Locked for Input. When data is locked (Explicitly or Implicitly), the Workflow Engine will not allow any form of data input to affect the Entities assigned to the Workflow Profile of the locked Workflow Unit.
Lock Types
Explicit Locks
An Explicit Entity Data Lock is created when a Workflow Unit is locked, thereby locking its assigned Entity(s) for the Scenario and Time associated with the Workflow Unit.
Implicit Locks
An Implicit Entity Data Lock is created when a Workflow Unit’s Parent Workflow has been
certified. Implicit locks are created in order to ensure once a higher-level Workflow Unit is
certified, the underlying Entity data cannot be changed. Implicit locks can be cleared by un-
certifying the Parent Workflow Unit.
Locking Granularity
Data locks can be placed at different levels of granularity within the Analytic Model.
The diagram below demonstrates how the Forms Origin Member has been divided into multiple
Workflow Channels enabling each Form Input Child Workflow Profile and the UD1 data cells
bound to the same Workflow Channel to be locked independently. Workflow Channels can be
used with Import Input Children as well as Adjustment Input Children.
See Data Management Automation through PowerShell in "Implementing Security" on page 322
for more information on executing OneStream Data Management Sequences from PowerShell
scripts.
Field Layout
File ID-ProfileName-ScenarioName-TimeName-LoadMethod.txt
aTrialBalance-Houston;Import-Actual-2011M1-R.txt
File ID
Any text value used for file identification and controlling sort order.
Profile Name
A valid Import Child Workflow Profile name. Use a ; to delimit Parent and Child Profile names.
Scenario Name
This is a valid Scenario name passed to Data Sources using the Dimension data type Current
DataKey Scenario. C can be passed as a substitution variable to reference the Scenario name
passed in the function call: HarvestAndProcessFiles. G can be passed as a substitution variable
to reference the Global Scenario name set for the application.
Time Name
This is a valid Time name passed to Data Sources using the Dimension data type Current
DataKey Time. C can be passed as a substitution variable to reference the Time name passed in
the function call: HarvestAndProcessFiles. G can be passed as a substitution variable to
reference the Global Time name set for the application.
Load Method
R = Replace, A = Append
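As an illustration only (the File ID and profile names below are hypothetical), a harvest file name can be assembled from these five parts. This sketch uses the C and G substitution variables described above and an Append load:
'Sketch: building a harvest file name per the layout above.
Dim fileId As String = "bSalesDetail"                 'any text; controls sort order
Dim profileName As String = "Houston;Sales Detail"    'Parent;Child Import Workflow Profile
Dim scenarioName As String = "C"                      'C = Scenario passed to HarvestAndProcessFiles
Dim timeName As String = "G"                          'G = Global Time set for the application
Dim loadMethod As String = "A"                        'R = Replace, A = Append
Dim fileName As String = String.Format("{0}-{1}-{2}-{3}-{4}.txt", fileId, profileName, scenarioName, timeName, loadMethod)
'Result: bSalesDetail-Houston;Sales Detail-C-G-A.txt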
Collecting Data
OneStream can connect to and import data from any external system using direct database
connections to the external system. Data can be collected from delimited data sources, forms,
and Excel files using XFSetCell or Cube views. Connector Business Rules define the connection,
data result sets, and drill-back option capabilities of an external data connection.
In this section you will learn about the various methods for collecting data.
Data Sources
Delimited Data Source
A Delimited Data Source with a separate column for Debit and Credit for the Amount Dimension is
accomplished by using a Parser Complex Expression, or a Business Rule. First, assign the Debit
column as the Amount. Next, create a Parser Complex Expression if this is a one-time
occurrence, or create a Business Rule if it can be applied to numerous Data Sources, and assign
it to the Amount Column. In the Complex Expression or Business Rule, check the value returned
for the Debit. If it is empty, or 0.00, refer to the credit value with the method outlined below.
Assessing the credit value is possible in OneStream because the transformation engine provides
an array list of all fields in the line. In the example below, Debit is column six, and Credit is column
7.
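A minimal sketch of such a Complex Expression for the Amount column is shown below. It assumes the parsed line values are exposed through api.Parser.DelimitedParsedValues with zero-based indexes (so Debit is index 5 and Credit is index 6), and it returns the credit as a negative amount, which is a common but not universal sign convention; verify both assumptions for your Data Source.
'Sketch: return the Debit when present, otherwise return the Credit (sign-flipped).
Dim debitText As String = api.Parser.DelimitedParsedValues(5).Trim()   'column six (zero-based index assumed)
Dim creditText As String = api.Parser.DelimitedParsedValues(6).Trim()  'column seven
Dim debit As Double = 0
If Double.TryParse(debitText, debit) AndAlso debit <> 0 Then
    Return debit.ToString()
Else
    Dim credit As Double = 0
    Double.TryParse(creditText, credit)
    Return (-credit).ToString()
End If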
Fields from the external data query results are mapped to Dimensions creating a processing
behavior similar to the behavior of a Delimited File. Using this mapping process enables a
Connector Data Source to use all the same built processing capabilities available with file-based
Data Sources. This capability enables the design of an external data Connector to be entirely
focused on connecting to and reading data from an external source instead of focusing on
integrating complex business logic. The specific business logic can be added to the Data Source
Dimensions in the form of a Complex Expression or Business Rule. This design methodology will
help with writing the Connector Business Rule in a way that requires very little maintenance by
business users.
GetData
This is called by the Import Workflow task when the Load and Transform button is clicked. It requests execution of the data query (or queries) that retrieves the row values for the chosen Workflow Unit.
Fields
The field names returned by this query must match the field names returned by the GetFieldList
request.
Where Clause
Typically the active Workflow Unit Time and/or Scenario values are converted to equivalent
criteria values for the Time and/or Scenario of the external system.
Data Volume
Consider loading summarized data rather than full transaction system data replication because
drill back is provided for more detailed values.
GetDrillBackTypes
Drill Back types can deliver results based on the different visualization types. This is called when a
user double-clicks or right-clicks and selects Drill Back from a row in the source data load or drill
down screens. A set of supported drill-back options to present to the end user as a list of
DrillBackTypeInfo objects [List(Of DrillBackTypeInfo)] is requested. Drill Back types provide the
Connector designer with the power to provide the end user with a menu list of drill back options.
DataGrid
This presents a grid of data rows to the end user.
TextMessage
This presents a text message to the end user.
WebUrl
This presents a website or custom HTML web content to the end user.
WebUrlPopOutDefaultBrowser
Opens a website or custom HTML web content in an external browser. From the Stage Import
data grid, right-click on a data record, and select Drill Back. A dialog presents a menu of pre-
configured Drill-back options. When you choose WebUrlPopOutDefaultBrowser, a standard
browser session is launched, and you go to a web page based on variables.
FileViewer
This presents file contents to the end user from one of three locations.
FileShareFile
A file located in a folder in the OneStream File Share.
AppDBFile
A file stored in an application database.
SysDBFile
A file stored in a framework (System) database.
GetDrillBack
This is called when a user selects a specific Drill Back type presented by the GetDrillBackTypes
request. When this action is executed, the Business Rule arguments will contain a reference to
the DrillBackTypeInfo object the user selected which allows the Connector designer to determine
how to get proper information to display for the DrillBackTypeInfo.
NOTE: The requirement for Oracle Database integrations is that all Oracle Source
System TNS Profile details need to be in place on each of the OneStream application
servers.
2. Name the document and change the file extension from txt to udl.
This creates a Data Link File to assist in the formation of the source system connection
string.
3. Determine the DB Provider that the GL Source System is using (e.g. SQL, Oracle, etc.).
4. Determine the server name where the data resides for the GL Source System.
5. Determine the user name and password used to connect to the server for the GL Source
System.
6. Determine the database name on the server where the GL Source System data resides.
7. Save the completed UDL file and then rename the extension back to txt from udl.
SQL Server
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial
Catalog=DBName;Data Source=SQLSERVERNAME
DB2
Provider=IBMDA400.DataSource.1;Password=<xxxxx>;Persist Security Info=True;User
ID=OSuser;Data Source=HUTCH400;Use SQL Packages=True
MS Access
Provider=Microsoft.ACE.OLEDB.12.0;Data
Source=\\UNCFileShare\DB1.accdb;Mode=Read|Share Deny None;Persist Security Info=False
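For Oracle sources (see the TNS note above), a typical OLE DB connection string follows the same pattern; the provider name and attributes below are a commonly used form and should be verified for your environment:
Oracle
Provider=OraOLEDB.Oracle;Data Source=TNSNAME;User ID=OSuser;Password=<xxxxx>;Persist Security Info=True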
To extract data from any source system, the data query method and facility need to be
determined. Data can be queried through a SQL Query, a SQL View, or Stored Procedure.
OneStream executes this request against the source system using the defined source system
connection string and processes the returned results within OneStream.
For example, if directly pulling in Trial Balance Data is required, then the detailed query that
currently makes up the existing Trial Balance Report would be necessary for OneStream to pull
the same data.
SQL Query
A SQL Query can be broken down into numerous elements, each beginning with a keyword.
Although it is not necessary, a common convention is to write these keywords in all capital letters.
The standard sections of a SQL Query are made up of the following four elements:
SELECT
FROM
WHERE
ORDER BY
The example below is a SQL Query used to pull Trial Balance Data from several different tables in
an Oracle Database:
SELECT
GL_SETS_OF_BOOKS.NAME
,GL_BALANCES.ACTUAL_FLAG
,GL_BALANCES.PERIOD_NAME
,GL_BALANCES.PERIOD_NUM
,GL_BALANCES.PERIOD_YEAR
,GL_CODE_COMBINATIONS.CODE_COMBINATION_ID
,GL_CODE_COMBINATIONS.SEGMENT1
,GL_CODE_COMBINATIONS.SEGMENT2
,GL_CODE_COMBINATIONS.SEGMENT3
,GL_CODE_COMBINATIONS.SEGMENT4
,GL_CODE_COMBINATIONS.SEGMENT5
,GL_CODE_COMBINATIONS.SEGMENT6
,GL_CODE_COMBINATIONS.SEGMENT7
,GL_CODE_COMBINATIONS.SEGMENT8
,GL_CODE_COMBINATIONS.SEGMENT9
,GL_CODE_COMBINATIONS.SEGMENT10
,SUM( NVL(GL_BALANCES.BEGIN_BALANCE_DR,0) - NVL(GL_BALANCES.BEGIN_BALANCE_CR,0))"OPEN
BAL"
,NVL(GL_BALANCES.PERIOD_NET_DR,0) "DEBIT"
,NVL(GL_BALANCES.PERIOD_NET_CR,0) "CREDIT"
,SUM( NVL(GL_BALANCES.PERIOD_NET_DR,0) - NVL(GL_BALANCES.PERIOD_NET_CR,0))"NET
MOVEMENT"
,SUM(( NVL(GL_BALANCES.PERIOD_NET_DR,0) + NVL(GL_BALANCES.BEGIN_BALANCE_DR,0))) - SUM
(NVL(GL_BALANCES.PERIOD_NET_CR,0)+NVL(GL_BALANCES.BEGIN_BALANCE_CR,0))"CLOSE BAL"
,GL_BALANCES.CURRENCY_CODE
,GL_BALANCES.TRANSLATED_FLAG
,GL_BALANCES.TEMPLATE_ID
,FND_FLEX_VALUES_VL.FLEX_VALUE
,FND_FLEX_VALUES_VL.DESCRIPTION
,FND_FLEX_VALUES_VL.FLEX_VALUE_SET_ID
FROM
GL_BALANCES,
GL_CODE_COMBINATIONS,
GL_SETS_OF_BOOKS,
FND_FLEX_VALUES_VL
GROUP BY GL_SETS_OF_BOOKS.NAME
,GL_BALANCES.ACTUAL_FLAG
,GL_BALANCES.PERIOD_NAME
,GL_BALANCES.PERIOD_NUM
,GL_BALANCES.PERIOD_YEAR
,GL_CODE_COMBINATIONS.CODE_COMBINATION_ID
,GL_CODE_COMBINATIONS.SEGMENT1
,GL_CODE_COMBINATIONS.SEGMENT2
,GL_CODE_COMBINATIONS.SEGMENT3
,GL_CODE_COMBINATIONS.SEGMENT4
,GL_CODE_COMBINATIONS.SEGMENT5
,GL_CODE_COMBINATIONS.SEGMENT6
,GL_CODE_COMBINATIONS.SEGMENT7
,GL_CODE_COMBINATIONS.SEGMENT8
,GL_CODE_COMBINATIONS.SEGMENT9
,GL_CODE_COMBINATIONS.SEGMENT10
,NVL(GL_BALANCES.PERIOD_NET_DR,0)
,NVL(GL_BALANCES.PERIOD_NET_CR,0)
,GL_BALANCES.CURRENCY_CODE
,GL_BALANCES.TRANSLATED_FLAG
,GL_BALANCES.TEMPLATE_ID
,FND_FLEX_VALUES_VL.FLEX_VALUE
,FND_FLEX_VALUES_VL.DESCRIPTION
,FND_FLEX_VALUES_VL.FLEX_VALUE_SET_ID
HAVING SUM(( NVL(GL_BALANCES.PERIOD_NET_DR,0) + NVL(GL_BALANCES.BEGIN_BALANCE_DR,0)))
- SUM(NVL(GL_BALANCES.PERIOD_NET_CR,0)+NVL(GL_BALANCES.BEGIN_BALANCE_CR,0)) <> 0
SQL View
In many cases, creating a SQL View of data to provide information to OneStream is the preferred option and typically reduces the complexity of the query.
In the example below, the customer can combine several data tables required in the source
system, and present the data in one View for OneStream to query:
SELECT
SEGMENT1 As Entity,
SEGMENT2 As Establishment,
SEGMENT3 As France_Account,
SEGMENT4 As US_Account,
SEGMENT5 As Cost_Center,
SEGMENT6 As Family,
SEGMENT7 As Product_Line,
SEGMENT8 As Interco,
SEGMENT9 As Future,
PERIOD_YEAR As Year,
PERIOD_MONTH As Month,
CURRENCY_CODE As Currency_Code,
CLOSE_NET_BALANCE As Net_Balance,
SET_OF_BOOKS_ID As Set_Of_Books_ID
FROM APPS.XXSWM_ONESTREAM_GL_BALANCES
Stored Procedure
The example below is a SQL Stored Procedure used to pull Trial Balance Data from several
different tables in a SQL Database.
In this example, the Entity, Year, and Period are passed to the Stored Procedure:
spGLCalcTrialBalance 'ASCC', '2013', 6
As a best practice, create a new Dashboard Maintenance Unit named EXS followed by the Connector Name.
The prefix EXS stands for External System and will provide administrators with an immediate
understanding of the Maintenance Unit’s contents. The three steps below explain how to create
this.
Step 1
Create a new Data Adapter for each type of query needed to prototype (GetFieldList, SelectData, Drill Back, etc.).
Example of a Data Adapter being used to get all fields in the source table of the external database
connection:
Step 2
Step 3
Evaluate the results of the query. The Data Adapter test only returns a small subset of rows from the query, but it reports the actual number of rows that will be returned during a full query execution.
GetFieldList
This returns the list of field names, either by running a Select query against the external database or by returning a manually created list of strings, one for each field.
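A minimal sketch of the manual-list approach is shown below, reusing field names that appear in the Connector example later in this section and assuming the same ConnectorActionTypes switch used there; the list must match the fields returned by GetData.
Case Is = ConnectorActionTypes.GetFieldList
    'Return the field names as a manually created list of strings.
    Dim fieldList As New List(Of String)
    fieldList.Add("TransID")
    fieldList.Add("PlantCode")
    fieldList.Add("CustId")
    fieldList.Add("CustName")
    fieldList.Add("InvNo")
    Return fieldList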
GetData
The select statement should return the same fields as GetFieldList. Add criteria for Scenario and Time, mapping the OneStream Workflow Unit Scenario and Time values to the corresponding values in the source system as Where Clause criteria.
GetDrillBackTypes
This shows the set of drill back options provided to the user.
GetDrillBack
This executes the selected drill back type for the current source data row.
Drill Back
Using a SQL connector allows a user to drill back to a source system and show detailed records
from a document, PDF, or website. The Connector Data Source, configured by the author, provides
a menu of data viewing options such as Year to Date, Month to Date, Invoice Documents or
Material Type Detail. Utilizing this feature can reduce the amount of data imported into the
Financial Model by allowing analysis to occur at the source system.
Viewing Data
Once data is loaded into the Stage, a user can right-click on a data row and select Drill Back. This
will bring up the pre-configured options from which the user can choose.
If more detail is needed, another level of Drill Back can be performed. This is configured in the
Connector Business Rule and can drill back and around source systems. These nested drill paths
can provide as much detail as an application requires.
Namespace OneStream.BusinessRule.Connector.RevenueMgmtHouston
Public Class MainClass
Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals,
Case Is = ConnectorActionTypes.GetData
'Process Data
Dim sourceDataSQL As String = GetSourceDataSQL(si, globals,
api)
api.Parser.ProcessSQLQuery(si, DbProviderType.OLEDB,
connectionString, true, sourceDataSQL, false, api.ProcessInfo)
Return Nothing
Case is = ConnectorActionTypes.GetDrillBackTypes
'Return the list of Drill Types (Options) to present to the
end user
Return Me.GetDrillBackTypeList(si, globals, api, args)
Case Is = ConnectorActionTypes.GetDrillBack
'Process the specific Drill-Back type
Return Me.GetDrillBack(si, globals, api, args,
args.DrillBackType.DisplayType, connectionString)
End Select
Catch ex As Exception
Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
End Try
End Function
Catch ex As Exception
Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
End Try
End Function
sql.Append("SELECT Top(1)")
sql.Append("TransID, PlantCode, CustId, CustName, InvNo,
Return sql.ToString
Catch ex As Exception
Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
End Try
End Function
selectClause.Append("SELECT ")
selectClause.Append("TransID, PlantCode, CustId, CustName,
whereClause.Append("WHERE ")
'Get the YEAR from the current XF Workflow Unit TimeKey
whereClause.Append("(")
whereClause.Append("InvYear = " & TimeDimHelper.GetYearFromId
(api.WorkflowUnitPk.TimeKey).ToString)
whereClause.Append(")")
orderByClause.Append("ORDER BY ")
orderByClause.Append("PlantCode, CustId, WorkDay, ProdModel,
DestinationCode")
Return statement.ToString
Catch ex As Exception
Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
End Try
End Function
        drillTypes.Add(New DrillBackTypeInfo(ConnectorDrillBackDisplayTypes.FileShareFile, ...))
        Return drillTypes
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
            Case Is = ConnectorDrillBackDisplayTypes.DataGrid
                'Return Drill Back Detail
                Dim drillBackSQL As String = GetDrillBackSQL(si, globals, api, args)
                Dim drillBackInfo As New DrillBackResultInfo
                drillBackInfo.DisplayType = ConnectorDrillBackDisplayTypes.DataGrid
                drillBackInfo.DataTable = api.Parser.GetXFDataTableForSQLQuery(si, ...)
            Case Else
                Return Nothing
        End Select
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
        api.Parser.GetFieldValuesForSourceDataRow(si, args.RowID)
        If (Not sourceValues Is Nothing) And (sourceValues.Count > 0) Then
            Return "Applications/GolfStream_v24/DataManagement/RevenueMgmtInvoices/"

        api.Parser.GetFieldValuesForSourceDataRow(si, args.RowID)
        If (Not sourceValues Is Nothing) And (sourceValues.Count > 0) Then
            whereClause.Append("WHERE ")
            'Get the YEAR from the source record
            whereClause.Append("(")
            whereClause.Append("InvYear = " & TimeDimHelper.GetYearFromId(sourceValues.Item(StageTableFields.StageSourceData.DimWorkflowTimeKey).ToString))
            whereClause.Append(")")
            whereClause.Append("(")
            whereClause.Append("ProdModel = '" & sourceValues.Item(StageConstants.MasterDimensionNames.UD2).ToString & "'")
            whereClause.Append(")")
            orderByClause.Append("ORDER BY ")
            orderByClause.Append("BomCode")
    End Try
End Function
End Class
End Namespace
1. On the Cube’s Integration Tab, ensure that the TextValue field is enabled for the desired
Scenario Type. This is needed to import the actual text.
3. Hardcode the View Dimension to import to the Annotation Member, or one of the other View
Dimension Comment Members such as VarianceExplanation.
5. There may not be an Amount to bring in, but select a column that has a decimal value in each row alongside the comment and link the Data Source to that column. These numbers will come into the Stage but will not end up in the Cube because they are mapped to an Annotation-type View Member.
Forms
Forms Channel Workflow
To minimize form maintenance, Cube View and Excel XFSetCell updates are not tied to specific
Forms. Association is at the Input Type level, not the individual Form level. The Forms Input Type
determines if you can update data from Excel. If the Forms channel is completed, but the process
is not certified, you can import data from Excel using XFSetCell or a Cube View. If a cell is
updated, the Analytic Engine traces the cell as follows:
First, it determines if the Form Input Type is enabled for the Scenario Type. If not, the Form Input Type is disabled and cells cannot be updated from a Cube View in the web, a Cube View in Excel, or the XFSetCell function from Excel.
If the Form Input Type is enabled, the Analytic Engine then checks the full Workflow Status for the active
Form Input Type. If the Workflow is locked or the Parent Workflow is certified, cells are not
updated. If the Workflow indicates updates can occur, the Process Cube task of the Workflow and
all ancestor Parent Workflows are impacted.
Form Allocations
Advanced Distribution
In the example below, an advanced distribution is used on a Product Revenue Form. This
allocation will take the previous year’s actual data, increase it by 20% and populate the current
year’s revenue budget revisions for all regions and customers.
The allocation data is being written to a form which will then populate the Revisions column. The
sum of the Baseline and Revisions will then create the new Full Budget for each Region and
Customer.
Right-click the first data cell in the Revisions column and select Allocation. This helps create the
Source and Destination POV.
By default, the Allocation dialog will open to the last Allocation processed. Select the Allocation
Type desired (e.g., Advanced).
1. Source POV
The Source POV defaults to the last cell selected for Allocation. Every Dimension is
represented in the POV. In this example, it defaults to the data cell under Revisions
because that is where the allocation option was selected.
Cb#Houston:E#[HoustonHeights]:C#USD:S#BudgetV1:T#2011M1:V#YTD:A#2000_100:F#None:O#Forms:I#None:U1#None:U2#Mach5:U3#NA:U4#TotalCustomers:U5#None:U6#None:U7#None:U8#None
Users can also select a data cell from the grid and drag and drop the cell’s POV into this
field. The Source POV is the default Source Amount for the allocation.
3. Destination POV
This is where the allocation is applied. In this example, the Destination POV is blank
because it is using the same Members from the Source POV. Users can also drag and drop
a data cell’s POV.
7. Offset
The offset properties are optional and not used in this example.
The allocation results dialog provides information on all the allocation destinations, weight
information, and displays all the data rows that will be updated upon selecting Save Allocation
Data. Check the Show All Dimensions box in order to see every Dimension intersection for each
data row. Once the allocation data is saved, the form data will update and store the data to the
Cube.
Results:
NOTE: The Var % column updated itself to 30% from 10% because of the additional
20% added to the allocation. The Full Budget column also updated itself with the new
total from the Baseline and Revisions columns.
The example below uses Cube Views, however, if the forms are driven from Spreadsheets, a
Spreadsheet Dashboard Component can also be used.
1. Design the Cube Views necessary for data entry. Once the Cube Views are complete,
create a Dashboard Maintenance Unit.
2. Within the Dashboard Maintenance Unit, create a Delimited List Dashboard Parameter specifying all Cube View names in both the Display Items and Value Items properties.
3. Create a Cube View Dashboard Component and enter the Parameter name in the Cube View property, enclosed in pipes and exclamation marks.
4. Create a Supplied Parameter Dashboard Component in order to pass the Parameter value
from the Dashboard to the Form Template. Specify the Parameter Name in the Bound
Parameter property.
5. Create a Dashboard with a Uniform Layout Type and assign the Cube View and Supplied
Parameter Components to it.
6. Create a Form Template and set the Form Type to Dashboard and assign the desired
Dashboard.
7. Define which Cube View this specific Form should use in the Name Value Pairs property by
hardcoding a specific Cube View name from the Delimited List Parameter.
When the Form Template is used in the Workflow, the specified Cube View will display for
data entry.
The following Dimension Tokens are used within an Import Excel Template. Please note these
tokens can be in any order on the Excel template.
AMT#: Amount. Using the AMT.ZS# header will automatically apply zero suppression to this import.
F#: Flow
IC#: Intercompany
E#: Entity
C#: Consolidation
S#: Scenario
T#: Time Period
V#: View
O#: Origin
UD1#-UD8#: User Defined. Each row must have a value even if a User Defined Member is not used in the application. Create a Static Value of None for any UD Members where this applies (Ex. UD5#:[None]). UX# or UDX# can be used for all User Defined Dimensions.
LB#: Label. This is used for an Account description related to a line of data. It is imported just for reference purposes and not stored in the Cube.
SI#: Source ID. This is a key for data imported into Stage. This typically includes a reference to the Entity being loaded but depends on the implementation. It is a best practice to have only one Source ID per Named Range and these can be the same or different for every Named Range imported for one Excel workbook.
TV#: Text Value. This is used to store large amounts of textual data.
A1# through A20#: Attribute Dimensions. These 20 Dimensions can each store 100 characters of text.
AV1# through AV12#: Attribute Value Dimensions. These 12 Dimensions can store numeric data.
Header Abbreviations
Static Value
Use :[] in order to fix a specific Member to the entire column creating a Static Value for the
specified Source Dimension. For example, F#:[None] imports the None Flow Member for every
Flow row within the Named Range. This syntax applies to all Dimension Tokens.
Data Sources allow text values to be loaded as a View Member from the same row as the numeric
value. Specify #Annotation, #VarianceExplanation, #AuditComment, #Footnote, or #Assumption
as the Static Text Value of the TextValue Source Dimension and a new row will be created for the
comment row. For example, use TV#:[#Annotation] to add an additional Annotation row.
Business Rule
Pass a Business Rule for any specified Source Dimension in order to set a specific value.
AMT#:[]:[BusinessRuleNameThatSetsAValue]
Matrix Member
This repeats for each Member. For example, if there were twelve time periods in the named range
the syntax would be as follows:
T#:[]:[]:[2012M3]
In order to use Current/Global Scenario and Time, use .C# and .G# which creates a Static Value
for the Time and Scenario within the Named Range. T.C# and S.C# returns the current Workflow
Time and Scenario. T.G# and S.G# returns the Global Time and Scenario.
Example
Dim objXFResult As XFResult = BRApi.Finance.Data.SetDataCellsUsingCsvFile(si,
filePath, delimiter, originFilter, targetOriginMember, loadZeros)
When using this BRApi make sure to specify the Origin Filter which determines the type of data
desired from the file (Import, Forms or Adjustments), and the Target Origin Member which
determines where the data will be stored upon loading the file.
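For example, a hedged call might look like the following (the file path, delimiter, and Origin settings are illustrative values only):
'Read Forms data from a delimited file and store it to the Forms Origin Member, skipping zeros
Dim objXFResult As XFResult = BRApi.Finance.Data.SetDataCellsUsingCsvFile(si, "C:\Temp\BudgetCells.csv", ",", "Forms", "Forms", False)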
OneStream reads this template using a specific Named Range which is explained later in this
section. Ensure the following information is included in the Named Range.
Property Tokens
The first four rows of the Named Range in the Excel template must include the following token
definitions:
Workflow Name
Enter the Form Workflow name. For example, if the name of the Workflow Profile is Houston, and
the Form input type is named Forms, enter Houston.Forms.
Workflow Scenario
Enter the current Workflow Scenario such as Actual, Budget, etc. In order to dynamically use the
current Workflow Scenario, use the |WFScenario| Substitution Variable.
Workflow Time
Enter the current Workflow Time Period. In order to dynamically use the current Workflow Time,
use the |WFTime| Substitution Variable.
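As an illustration only (the exact cell layout depends on the template), these three property rows might be entered as label and value pairs such as:
Workflow Name        Houston.Forms
Workflow Scenario    |WFScenario|
Workflow Time        |WFTime|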
Dimension Tokens
Next, create the Dimension Tokens necessary to load the form data to the correct Dimensions in
OneStream. The Dimension tokens need to be the column header for each data row. The
standard tokens used determine the Cube, Entity, Parent, Account, Flow, Intercompany, the User
Defined Members, and an Amount. Refer to Loading Stage Data for the syntax. The form specific
tokens are as follows:
HD#
Has Data
Enter Yes or No to specify whether the row has data.
AN#
Annotation
AS#
Assumption
AD#
Audit Comment
FN#
Footnote
VE#
Variance Explanation
Header Abbreviations
Static Value
Use :[] in order to fix a specific Member to the entire column creating a Static Value for the
specified Source Dimension. For example, F#:[None] imports the None Flow Member for every
Flow row within the Named Range. This syntax applies to all Dimension Tokens.
Example:
1. The Workflow Scenario Token, located in cell B5, is using a Substitution Variable to
dynamically reference the user’s current Scenario.
2. The Scenario Dimension Token needs to reference that Substitution Variable to ensure the
correct Scenario is used and the template functions properly.
Once the Dimension Tokens are configured, enter the data in the corresponding column. The
Dimensions can be in any order.
The final step is to create a Named Range beginning with XFF making sure to include the
definition of each property, the Dimension tokens, and the data rows. The Named Range must
begin with XFF for OneStream to read and load the form data correctly. Multiple XFF Named
Ranges can be used across multiple tabs.
This template uses the same property tokens as a regular Excel Form template shown above.
In the Matrix Form template, Amount and Time must be specified in the same column. A third
Dimension can be specified (e.g., Scenario) if desired. The example below is indicating the
Amount Column using AMT# and then specifying to which Time Members the Amount detail
belongs.
To set up a CSV template for a Form, the Header and Detail values must be specified.
process. While the Form Input type is selected, click the icon in the Form toolbar. This
allows the user to select the desired Excel or CSV template and load it into OneStream. Once the
file is loaded, the data will appear in the Form grid and it auto-saves upon importing to the Cube.
Load Example
'BRApi.Forms.Data.ImportAndProcessForms(si, filePath, save, complete, throwOnError)
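For example, a hedged version of this call (the file path and flags are illustrative) could be:
'Import the XFF form template, save the data, leave the Workflow step incomplete, and throw on error
BRApi.Forms.Data.ImportAndProcessForms(si, "C:\Temp\HoustonForms.xlsx", True, False, True)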
OneStream reads this template using a specific Named Range which is explained later in this
section. Ensure the following information is included in the Named Range.
Property Tokens
The first eleven rows of the Named Range in the Excel template must include the following token
definitions:
Template Name
Enter a template name if applicable. If this is a free form journal, leave this blank.
Name
Enter the name of the journal.
Description
If desired, enter a description for the journal.
Journal Type*
Standard or Auto-reversing. Allocation is not supported for Excel or CSV import.
Balance Type*
Balanced, Balanced by Entity, or Unbalanced
Is Single Entity*
True or False
Entity Filter*
Use a Member Filter to specify the Entities used with this journal.
Consolidation Member
Enter the specific currency or Local Member of Consolidation.
Workflow Name
Enter the Journal Workflow name. For example, if the name of the Workflow Profile is Houston,
and the Adj input type is named Journals, enter Houston.Journals.
Workflow Scenario
Enter the Workflow Scenario or make the template dynamic by entering the |WFScenario|
Substitution Variable.
Workflow Time
Enter the Workflow Time or make the template dynamic by entering the |WFTime| Substitution
Variable. Workflow Time supports two fields. The cell immediately to the right is optional and represents the CubeTimeName. This field can be used when the Scenario’s Workflow Tracking Frequency is Yearly and the Input Frequency is Monthly. For example, the Workflow Time would be |WFTime| or 2019 and the CubeTimeName would be the period to post, 2019M7.
*See Journal Templates "Data Collection" on page 547 for details on these Journal properties.
Dimension Tokens
Next, create the Dimension Tokens necessary to load the journal to the correct Dimensions in
OneStream. The Dimension tokens need to be the column header for each data row. The
standard tokens used determine the Cube, Entity, Parent, Account, Flow, Intercompany, the User
Defined Members, and a Label if needed. Refer to Loading Stage Data for the syntax. The
journal specific tokens are as follows:
AMTDR#
This indicates the debited amount.
AMTCR#
This indicates the credited amount.
Header Abbreviations
Static Value
Use :[] in order to fix a specific Member to the entire column creating a Static Value for the
specified Source Dimension. For example, F#:[None] imports the None Flow Member for every
Flow row within the Named Range. This syntax applies to all Dimension Tokens.
Once the Dimension Tokens are set up, enter the data in the corresponding column.
Template Example:
The final step is to create an XFJ Named Range making sure to include the definition of each
property, the Dimension tokens, and the data rows. The Named Range must begin with XFJ for
OneStream to read and load it correctly. Multiple XFJ Named Ranges can be used within the
template over multiple tabs.
NOTE: Loading of Journal Templates or previously exported Journal data only requires
a Parent (P#) column value to be populated if the target Consolidation dimension
member being updated is OwnerPreAdj or OwnerPostAdj. Otherwise, this entry can be
left blank.
process. While the Journal Input type is selected, click the icon in the Journal toolbar. This
allows the user to select the desired Excel or CSV template and load it into OneStream. Once the
file is loaded, the journal line items will appear in the journal and the user can save it to the Cube.
Extract
Use BRApi.Journals.Data.ExportJournalsToCsv and define the session, filepath, Workflow
Profile, Scenario, Time Filter, and Journal Status.
BRApi.Journals.Data.ExportJournalsToCsv(si, filePath, "Houston", "Actual", "T#|WFYear|.Base",
"Posted")
Load Example
Use BRApi.Journals.Data.ImportAndProcessJournals and define the session, filepath, and
journal tasks to complete upon loading the journal details.
'BRApi.Journals.Data.ImportAndProcessJournals(si, filePath, save, submit, approve, post,
unpostAndOverwrite, throwOnError)
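For example, a hedged call (the file path and task flags shown are illustrative) might be:
'Import the journal template, save and submit it, but do not approve, post, or unpost and overwrite
BRApi.Journals.Data.ImportAndProcessJournals(si, "C:\Temp\HoustonJournal.xlsx", True, True, False, False, False, True)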
OneStream reads this template using a specific Named Range which is explained later in this
section. Ensure the following information is included in the Named Range.
Dimension Tokens
The first 19 rows of the Named Range in the Excel template must include the following token
definitions:
The Cube and each Dimension Member must be specified. All User Defined Members must be
specified. If a specific User Defined Member is not used in the application, enter None.
Next, create the Dimension Tokens necessary to load each Cell Detail line. The specific tokens
are as follows:
AMT#
Amount
LIT#
Line Item Type
AW#
Aggregation Weight
CL#
Classification Type
LB#
Label allowing users to add a description or additional detail.
Once the Dimension Tokens are configured, enter the data in the corresponding column. The
Dimensions can be in any order.
The final step is to create a Named Range beginning with XFC making sure to include the
definition of each Dimension token and the data rows. The Named Range must begin with XFC for
OneStream to read and load the Cell Detail correctly. Multiple XFC Named Ranges can be used
across multiple tabs.
process. While the Form Input type is selected, click the icon in the Form toolbar. This allows
the user to select the desired Excel or CSV template and load it into OneStream. Once the file is loaded, the Cell Detail is stored to the Cube.
Extract
Use BRApi.Finance.Data.ExportCellDetailToCsv and define the session, filepath, the Entity
Dimension, the Entity Member Filter, the Scenario and the Time Member Filter.
BRApi.Finance.Data.ExportCellDetailToCsv(si, filePath, entityDimensionName,
entityMemberFilter, scenarioName, timeMemberFilter)
Load Example
Use BRApi.Finance.Data.ImportCellDetail and define the session and filepath.
BRApi.Finance.Data.ImportCellDetail(si, filePath, throwOnError)
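For example, a hedged pair of calls (the file path, dimension name, and member filters are illustrative) might be:
'Export Cell Detail for the Houston base Entities for Actual across 2019, then reload the same file
BRApi.Finance.Data.ExportCellDetailToCsv(si, "C:\Temp\CellDetail.csv", "Entities", "E#Houston.Base", "Actual", "T#2019.Base")
BRApi.Finance.Data.ImportCellDetail(si, "C:\Temp\CellDetail.csv", True)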
Example:
Dim objDataAttachmentList As DataAttachmentList = BRApi.Finance.Data.
GetDataAttachments(si, memberScript, includeFileBytes)
OneStream reads this template using a specific Named Range which is explained later in this
section. Ensure the following information is included in the Named Range.
In the first three rows of the Named Range in Column A, specify the following:
Database Location
Application or System specifies which database contains the custom tables.
Table Name
Custom tables only; enter the Table name
Load Method
The load method determines the action and any additional criteria for the action.
Replace
If there are no criteria, Replace clears everything first. By default, instead of merging, it clears the
entire table. This will perform better for high volume because it does not try to match rows from
the file to the table. An error will occur if it finds a match.
Next, define the Field Types and Field Names beginning in Column A Row 4 and spanning as
many columns as necessary.
Field Type
This relates to the column name in the table.
xfGuid
Unique identifier [SQL = uniqueidentifier]
xfText
Text defined column in the table [SQL = nvarchar, nchar, ntext]
xfInt
Short integer (4 byte integer) [SQL = int]
xfBit
0,1 (True, False) [SQL = bit]
xfDec
Decimal [SQL = Decimal (28,9)]
xfDbl
Floating point number (8 byte floating) [SQL = Float]
xfDateTime
Date [SQL = datetime]
Field Name
This is specific to the SQL table to be loaded.
StaticValue
Whatever is specified as the Static Value will override every row for that column, regardless of whether it is blank or not.
StaticValueExample
This example will override all rows and enter 50,000 as the Static Value.
xfDec#:[Salary]:50,000
DefaultValue
This only applies to blank rows.
NOTE: If something is specified in the Static Value, it will ignore whatever is in the
DefaultValue.
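Assuming the DefaultValue occupies the position after the Static Value in the field token (an assumption inferred from the Example tokens below), a DefaultValue of 40,000 for the Salary column might be written as:
xfDec#:[Salary]::40,000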
Finally, create a Named Range beginning with XFT making sure to include the entire template.
Once the template is complete, it is ready to be loaded into the custom table. If this is being used
in conjunction with a MarketPlace Solution, refer to the Solution for further instructions on how to
load the template to the table. If this is being loaded via an Extensibility Business Rule, refer to
the following example.
Example
Dim fieldTokens As New List(Of String)
'Field tokens map each file column to a table column by type and name (commented lines show alternative token forms)
fieldTokens.Add("xfGuid#:[EmployeeID]::NewGuid")
'fieldTokens.Add("xfGuid#:[EmployeeID]")
fieldTokens.Add("xfText#:[EmployeeName]")
'fieldTokens.Add("xfText#:[EmployeeName]::|Username|")
fieldTokens.Add("xfInt#:[GradeLevel]")
fieldTokens.Add("xfBit#:[Active]")
fieldTokens.Add("xfDec#:[Salary]")
fieldTokens.Add("xfDbl#:[VacationDays]")
fieldTokens.Add("xfDateTime#:[HireDate]")
'filePath, dbLocation, tableName, and loadMethod are defined earlier in the rule (not shown)
BRApi.Utilities.LoadCustomTableUsingDelimitedFile(si, SourceDataOriginTypes.FromFileShare, filePath, Nothing, ",", dbLocation, tableName, loadMethod, fieldTokens, True)
Using Parameters
Use parameters to filter data while running the following:
l Cube view
l Extensible document
You can pass chosen parameter values from the dialog to underlying cube views included in the
data adapter which drives the dashboard components. This is useful when using the same report
for numerous users and views.
l Entities
l Workflow Profiles
l Scenarios
l Views
l Other items
To do this, refer to the parameter in the Point of View, rows, or columns to restrict the query to just the data desired. You can run the same report, but the data differs depending on what you choose.
Parameters can prompt for a text entry, drop lists, or hierarchical dialogs. Surround the parameter name in pipes and exclamation points for the parameter to run correctly. For example, the Parameter |!ParamView!| prompts you to choose either YTD or periodic data upon running a Cube View. If a delimited list parameter is used in a Cube View or component title, surround it in two exclamation points to refer to the Display Items and not the value (for example, |!!ParameterName!!|).
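As an illustration, the |!ParamView!| parameter could drive the View Member of a Cube View column rather than a hardcoded View (an assumed member filter):
V#|!ParamView!|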
Parameter Types
Parameters are created within Dashboard Maintenance Units. Each type of parameter has two
sections, General and Data Source parameter properties. The General property requirements
are standard across all types of Parameters, while Data Source properties vary by type of
Parameter.
Literal Value
The value is hard coded.
Input Value
This allows the user to enter or change the value.
Delimited List
This provides a distinct list of values populated in the Parameter Type.
Bound List
This is a list of Members created by using a predefined Method Query or entering a specific
expression to get the Members wanted in the list.
Member List
This produces a flat list of Members.
Member Dialog
Similar to Member list, this allows the user to select a Member, but through a pop up Member
selection dialog which also has search capabilities. This is more appropriate for a Dimension such
as Accounts or Entities where the user can choose a base or Parent Member by traversing a
hierarchy.
List Parameters
List parameters let you pick a value from a drop list instead of typing a number into a cell.
Parameters can be assigned as a list source to a cube view row or column. Parameters are
supported in Excel, web browsers, and Cube View-driven reports.
You can also specify a parameter name on a Cube View row or column and edit cells using a drop list containing the parameter’s list of items. A number is stored in the data cell as specified in
the parameter's definition. If using a List Parameter on a numeric cell, ensure each value in the
parameter’s name-value pairs is a number.
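As a sketch of the name-value pairing for a numeric cell (the status names and numbers here are illustrative), the delimited list parameter might be defined as:
Display Items: Not Started, In Progress, Complete
Value Items: 0, 1, 2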
1. Create the delimited list parameter needed for the Cube View.
3. Select the dashboard maintenance unit where you want to store the parameter and create the parameter there.
The following image shows the parameter saved under the GolfStream parameters maintenance unit. When creating the parameter, give it a name that indicates its use. Configure the parameter to create the list desired.
In this example, the parameter creates a drop-down list of options to select so the value
does not have to be entered manually.
The Display Items appear on the Cube View report. The Value Items are what appear in the
application. In this case, the cube view cells are numeric, so the value Items must also be
numeric.
4. Navigate to Application > Presentation > Cube Views and select the Cube View where
the parameter is to be used. The Cube View being used for this parameter is called
ReportStatus. Configure the column and rows to show the desired data.
This Cube View’s column members show Time based on workflow and a non-financial
Account named Report Status.
The Cube View’s Row members show all the Base Entities for clubs.
5. Under Cube View Row in the List Parameter field, enter ParamReportStatus so you can
choose a status for each entity in the row. Run this Cube View.
Nested Parameters
A nested parameter can refer to another parameter, which lets you slice data more specifically.
This example shows how to create a nested parameter and use it in a Cube View. The Cube View
displays profit by product. The parameters filter the data by product segment and then by a
specific product within the selected segment.
2. Select the dashboard maintenance unit in which you want to store these parameters.
The Member Filter parameter returns the top members of the UD2 Dimension and their
children in a drop-down list and prompts to select a product segment.
The nested parameter breaks this category down even further. The setup for this is as
follows:
The Member Filter references the first parameter. The nested parameter returns the
selected product segment's base members in a drop-down list and prompts to select a
product.
4. After creating both parameters, navigate to Application > Presentation > Cube Views.
5. Select the Cube View where you want to use these parameters.
6. Select the column or row where this parameter is needed. The example Cube View displays
products in columns, so the specific column is selected. The following example shows the
parameters being used in the Profit by Product Cube View.
7. Select the desired dimension (UD2 in the example), and set the Member Filter field as shown (see the sketch after these steps). This allows the ParamBaseProducts parameter to reference the ParamProductSegments parameter first, then return a product list based on the selected segment.
8. Click Run to see how this Cube View’s nested parameters work. ParamProductSegments runs first, prompting you to select a segment.
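A hedged sketch of the Member Filter for the nested ParamBaseProducts parameter (the exact filter syntax is an assumption based on standard member filter notation):
UD2#|!ParamProductSegments!|.Base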
l Columns
l Headers
l Cells
Each parameter includes a specific font, color, number formatting, and other format elements. They simplify Cube View formatting and create consistency in all Cube View reports.
OneStream includes a standard set of Cube View styles, but more can be added. Existing styles
can be altered to fit your organization's specific needs. The Default Value field of each parameter
displays style information.
NOTE: When using these parameters, the Cube View row and column format settings
override the Cube View property format settings. For example, if DefaultCell is entered
in the Cube View properties Cell Format field, and Row_DetailCell is entered in the
Rows Cell Format field, Row_DetailCell is used for that specified row. If there is more
than one row, and one of the Row Cell Format fields is blank, DefaultCell is used for that
row.
1. Open a spreadsheet.
2. Select Cube Views then Add. For example, add Headcount and select the workflow
profile entity.
3. Add a workflow profile validation list to help select the Cube View entity. Add another sheet,
select Cube Views, then add a Cube View to create a list of workflow profile entities.
There are several workflow profiles in the standard Cube Views that are part of GolfStream
and standard reports, and a few under the Excel lists. You can also create your own (such
as scenarios and time periods) in another Cube View to get a list on the page.
4. Select the list of workflow profile entities that appear on the sheet, then copy the resulting
named range (WorkflowProfileEntitiesList_RowHeader in this case). The named range
adjusts to the number of rows.
5. On Sheet1, select a cell above the embedded Cube View for a validation list. Click Data >
Data Validation.
6. Select List in the Allow field and set Source to the named range, in this case the range that
contains the list of workflow profile entities. Click OK.
7. On the original Cube View and POV, copy the parameter name without the pipes and
exclamation points.
8. On the spreadsheet, select the cell that contains the entity list. Paste the parameter name,
then press Enter.
9. Refresh the spreadsheet, change the value in the Select Entity field, and refresh again.
This feeds the parameter into the Cube View, but suppresses the Parameter dialog box.
In this section, you will learn how to work with extensible documents.
The value of this feature is that special integration tools are not necessary because once the
document is launched, it updates itself with the correct Parameter value, Image, or Retrieve
Function values. Extensible Documents allow the you to display any information you want from
OneStream and because it is integrated with these different products, the data stays current and
dynamic.
NOTE: Extensible Documents only work with Microsoft Office version 2007 and later.
Once the document is saved in the format mentioned above, upload the document into
OneStream’s File Explorer.
3. Click Select File to launch the document and see the updated values.
See one of the Creating an Extensible Document sections below for examples on how to
incorporate Parameters into an Extensible Document.
Substitution Variables can also be used in Word, Excel, PowerPoint, or a Text File. These
variables call out details such as an Application Name |AppName|, User Name |UserName|, or
refer to a specific POV which creates versatility when reusing the same document. See
Substitution Variables under Member Filters in "Cubes" on page 400 for more details on
Substitution Variables.
See "Creating a Document in Microsoft PowerPoint" on page 250 for an example on how to
incorporate Substitution Variables into an Extensible Document.
For more details on an image’s configuration, refer to the Extensible Document Settings in the
Object Lookup dialog in OneStream, or see Extensible Document Settings under Object
Lookup in "Presenting Data With Books, Cube Views and Other Items" on page 576.
See Creating an Extensible Document in Microsoft Word below for an example on how to insert a
report into an Extensible Document.
Excel charts can be created based on the retrieved values and will display the correct data once
the spreadsheet is launched from OneStream. In order to have the Excel chart refresh and display
the updated data, use the XFGetCellVolatile Retrieve Function. Excel requires a volatile function
for proper refreshing when using charts that reference calculated cells.
See Creating an Extensible Document in Microsoft Excel for an example on how to use Retrieve
Functions with Extensible Documents.
There is also another type of retrieve function called XFCell which retrieves data from a single cell
in OneStream. This is intended for text documents such as Word or PowerPoint. For example,
XFCell(A#20500:E#Clubs) will return a value for this Account and Entity intersection.
Parameters and Substitution Variables may also be used in an XFCell formula. For example,
XFCell(A#20500:E#|!MyEntityParameter!|:T#|Global|) would return a value for the specified
Account, the Entity selected at run-time, and the Global Time Period of the user’s Application.
Additional settings can be included in an XFCell function used within an Extensible Document that
will help format the resulting data. An example of such a fully qualified function would be: XFCell
(A#20500:E#Clubs, Culture=User, NumberFormat=N3, DisplayNoDataAsZero=True, Scale=3,
FlipSign=True, ShowPercentSign=False)
NOTE: Any Dimensions that are not specified in the formula will come from the user’s
POV.
For more examples and details on XFCell’s syntax, refer to the Extensible Document Settings in
the Object Lookup dialog in OneStream, or see Extensible Document Settings under Object
Lookup in "Presenting Data With Books, Cube Views and Other Items" on page 576.
See Creating an Extensible Document in Microsoft PowerPoint for an example on how to use
XFCell with Extensible Documents.
NOTE: In order to embed any type of file (Word Document, Rich Text File or text file), it
must be saved in the OneStream File Explorer. This allows the Extensible Document to
access the file’s content at run-time.
NOTE: When inserting content into a Word document, the embedded content takes on
the page settings of the main document. To change the page settings for the inserted
content, add a section break, using Microsoft Word's Breaks menu item, before the
inserted content. After the section break, specify the desired page settings for the
embedded content.
The OneStream Logo is an image serving as a placeholder for a Cube View Report. Any image
can be used as a placeholder. Once the image is inserted into the document, right-click on the
image and select Format Picture. Click Layout & Properties and expand ALT TEXT.
Navigate to the Object Lookup Dialog located in the OneStream application. This dialog
provides all the syntax needed to insert any type of report into an Extensible Document. This icon
can be found on the following screens under the Application Tab: Form Templates, Books, Cube
Views, Dashboards, Data Management.
Once in the dialog, expand the desired item type (Cube View Report in this case) and click Copy
to Clipboard in order to copy the first string. Go back into the Word Document and paste
(CTRL+V) it into the Title Field. The following string will populate the field:
{XF}{Application}{CubeViewReport}{CubeViewName}
Update the {CubeViewName} with the name of the Cube View required for this document. In this
example, the following syntax was entered for the Cube View: {XF}{Application}
{CubeViewReport}{Product Sales}
NOTE: For more examples on these different item types, see Extensible Document
Settings under Object Lookup in "Presenting Data With Books, Cube Views and Other
Items" on page 576.
Navigate back to the Object Lookup and copy the second string under the desired item type
(Cube View Report in this case). This is the standard formatting for the item type. Go back into the
Word Document and paste the string into the Description field. The following string will populate
the field: PageNumber=1, Zoom=101, MaintainAspectRatio=True, FillMode=Width, Anchor=TopCenter, IncludeBorders=False, IncludeReportMargins=False, IncludeReportHeader=False, IncludePageHeader=True, IncludePageFooter=False, IncludeReportFooter=False
NOTE: Any of these properties can be changed. See Extensible Document Settings
under Object Lookup in "Presenting Data With Books, Cube Views and Other Items" on
page 576 for more formatting details and options.
The OneStream Logo is an image serving as a placeholder for a Cube View Report. Any image
can be used as a placeholder. Once the image is inserted into the document, right-click on the
image and select Edit Alt Text…
Navigate to the Object Lookup Dialog located in the OneStream application. This dialog
provides all the syntax needed to insert any type of report into an Extensible Document. This icon
can be found on the following screens under the Application Tab: Form Templates, Books, Cube
Views, Dashboards, Data Management
Once in the dialog, expand the desired item type (Cube View Report in this case) and click Copy to Clipboard in order to copy the first string. Go back into the Word Document and paste
(CTRL+V) it into the Description Field. The following string will populate the field:
{XF}{Application}{CubeViewReport}{CubeViewName}
Update the {CubeViewName} with the name of the Cube View required for this document. In
this example, the following syntax was entered for the Cube View: {XF}{Application}
{CubeViewReport}{Product Sales}
NOTE: For more examples on these different item types, see Extensible Document
Settings under Object Lookup in "Presenting Data With Books, Cube Views and Other
Items" on page 576.
Navigate back to the Object Lookup and copy the second string under the desired item type
(Cube View Report in this case). This is the standard formatting for the item type. Go back into the
Word Document and paste the string into the Description field. The following string will populate
the field: PageNumber=1, Zoom=101, MaintainAspectRatio=True, FillMode=Width, Anchor=TopCenter, IncludeBorders=False, IncludeReportMargins=False, IncludeReportHeader=False, IncludePageHeader=True, IncludePageFooter=False, IncludeReportFooter=False
NOTE: Any of these properties can be changed. See Extensible Document Settings
under Object Lookup in "Presenting Data With Books, Cube Views and Other Items" on
page 576 for more formatting details and options.
Using Parameters
In this Word document example, all the information was typed normally, but Parameters were
inserted in places where the user wanted to obtain information from OneStream. Three
Parameters named |!CurrentQtr!|, |!ReportingYear!| and |!ReportingYearPrior!| were created in
order to display the appropriate years and quarters for this document. When the Extensible
Document is processed at run-time, it will prompt the user to select the reporting quarter and
reporting year.
After the Parameters have been selected, the document will now display the desired data.
(Highlighted in yellow below)
This opens the Content Control Properties dialog. The Title and Tag fields are used to define the
Extensible Document content.
The Title field is where the content item is specified. The Tag field is where additional formatting
settings are specified and only applies when embedding a Cube View or Dashboard Report.
Navigate to the Object Lookup and select Extensible Document Settings. Expand Insert
Content Using Word Rich Text Control Content and then expand Microsoft Word Document.
Select the string and click Copy to Clipboard. Navigate back to the Extensible Word Document
and paste the string into the Title field. The following string populates the field:
{XF}{Application}{File}{Documents/Public/ExtensibleDocs/WordFileName.docx}
Edit the string’s file path and specify the exact location of the Word document being embedded
into the document. In this example, the final string displays the following:
{XF}{Application}{File}{Documents/Public/XF Docs/Documents/Form10_Q.docx}.
Ensure all changes are saved, navigate back into OneStream, and open the File Explorer. Upload
the Extensible Document into the File Explorer, select the file and launch it. The Word Document
is now embedded in the Extensible Word Document and can be formatted as desired.
l Use Parameters
NOTE: The same format for XFCell is used in Microsoft Word and Text Files.
1. A Parameter named |!GetCellEntity!| was created in order to allow the user to select a
specific Entity on which to base the data for this slide. When the Extensible Document is
processed at run-time, it will prompt the user to select the desired Entity.
2. XFCell retrieves data from a single cell in OneStream. In the example above, the user
wants to retrieve data from Account 62000, for the Entity selected at run-time, for the
Application’s Global Time Period.
NOTE: For more details on XFCell’s syntax, refer to the Extensible Document Settings in the Object Lookup dialog in OneStream.
3. A Substitution Variable is being used to call out the Global Time of the OneStream
Application. This will update with whatever Global Time Period is currently set in the
Application.
4. Additional format settings can be included to control options such as number formatting or
scaling.
Once the document is processed, it will prompt the user to select the desired Entity.
It will then display the updated data for the Houston Entity.
Houston has now replaced the Parameter, and 29,624 is the outcome of the XFCell function.
If a different Entity was selected, the data will refresh and update accordingly.
l Use a Parameter.
1. A Parameter named |!GetCellEntity!| was created in order to allow the user to select a
specific Entity on which to base the data for this spreadsheet. When the Extensible
Document is processed at run-time, it will prompt the user to select the desired Entity.
2. The XFGetCellVolatile Function was used to retrieve specific data from OneStream and
update the Excel chart once the data is refreshed. Excel requires a volatile function for
proper refreshing when using charts that reference calculated cells. This XFGetCellVolatile
formula derives from the |!GetCellEntity!| Parameter and will display the updated data once
an Entity is selected at run-time.
NOTE: See Retrieve Functions in "Navigating the Excel Add-In" on page 1097 for
more details.
3. The IF Excel Function was also used which derives from the XFGetCellVolatile function.
This data will also be updated once the spreadsheet is launched from OneStream and the
data is refreshed. An example of the IF formula is as follows: =IF((F6=0),"", (D6-F6)/F6)
4. An Excel Chart was inserted into this spreadsheet and is driven by the data. Once the data
is refreshed, the chart will automatically display the correct values.
Once the document is processed, it will prompt the user to select an Entity and then run the Excel
spreadsheet. Log into the Excel Add-In and click Refresh Data in order to see the updated values.
Result:
The user running this Book must have access to the Extensible Document, otherwise the File will
display blank pages. See "Presenting Data Using Books" on page 576 in Presentation for more
details on this feature.
NOTE: If a Book contains an Extensible Excel Document that is using the XFGetCell
function, the user will not need to login to the Add-In to see the updated values.
l Modifications to colors of individual chart items within a series do not carry through to a PDF.
l ACCRINT
l ACCRINTM
l AGGREGATE
l AMORDEGRC
l AMORLINC
l BAHTTEXT
l BETA.DIST
l BETA.INV
l BINOM.DIST
l BINOM.DIST.RANGE
l BINOM.INV
l BINOMDIST
l CHISQ.DIST
l CHISQ.DIST.RT
l CHISQ.INV
l CHISQ.INV.RT
l CHISQ.TEST
l CONFIDENCE.T
l COUPDAYBS
l COUPDAYS
l COUPDAYSNC
l COUPNCD
l COUPNUM
l COUPPCD
l COVARIANCE.P
l COVARIANCE.S
l CRITBINOM
l CUBEKPIMEMBER
l CUBEMEMBER
l CUBEMEMBERPROPERTY
l CUBERANKEDMEMBER
l CUBESET
l CUBESETCOUNT
l CUBEVALUE
l DDB
l DEVSQ
l DISC
l DURATION
l EXPON.DIST
l EXPONDIST
l F.DIST
l F.DIST.RT
l F.INV
l F.INV.RT
l F.TEST
l FACTDOUBLE
l FILTERXML
l FISHER
l FISHERINV
l FORECAST
l FVSCHEDULE
l GAMMA
l GAMMA.DIST
l GAMMA.INV
l GAUSS
l GCD
l GETPIVOTDATA
l GROWTH
l HARMEAN
l HYPGEOM.DIST
l HYPGEOMDIST
l INTRATE
l ISPMT
l KURT
l LCM
l LINEST
l LOGEST
l MDURATION
l MULTINOMIAL
l NEGBINOM.DIST
l NEGBINOMDIST
l ODDFPRICE
l ODDFYIELD
l ODDLPRICE
l ODDLYIELD
l PERCENTILE.EXC
l PERCENTILE.INC
l PERCENTRANK.EXC
l PERCENTRANK.INC
l PERMUT
l PERMUTATIONA
l PHI
l POISSON
l POISSON.DIST
l PRICE
l PRICEDISC
l PRICEMAT
l PROB
l QUARTILE.EXC
l QUARTILE.INC
l RANK.AVG
l RANK.EQ
l RECEIVED
l ROMAN
l RSQ
l RTD
l SERIESSUM
l SKEW.P
l SLN
l STANDARDIZE
l STDEVPA
l STEYX
l SYD
l T.DIST
l T.DIST.2T
l T.DIST.RT
l T.INV
l T.INV.2T
l T.TEST
l TBILLEQ
l TBILLPRICE
l TBILLYIELD
l TREND
l TRIMMEAN
l UNICHAR
l VARA
l VARPA
l VDB
l WEBSERVICE
l WEIBULL
l WEIBULL.DIST
l XIRR
l YIELD
l YIELDDISC
l YIELDMAT
l Z.TEST
l ZTEST
Cube Views
A Cube View is used to query Cube data and present it to the user in a variety of ways. Cube Views
can be made read-only, used to edit data, or they can be used as the Data Source for several
different display mechanisms. Learning how to create Cube Views is a key skill towards becoming
a knowledgeable user.
Workflow Process
After calculating data in the Process Workflow task, a group of Cube Views can be exposed for the user to review.
Workflow Forms
A Cube View is designed as the basis for the Form Template in a form-based data entry Workflow.
A Cube View can be referenced within Excel through the OneStream Excel Add-in. The Cube
View is formatted similarly to how it appears in the web interface by default. Any formatting
applied in the Cube View carries over to Excel. If formatting was not applied, Excel’s Styles can be
applied through the Add-in. See "Getting Started with the Excel Add-In " on page 1059 for more
details on how to do this.
Workflow Forms
A Cube View designed for data entry can be assigned to an Excel sheet and launched from
Workflow. Data can be updated in Excel, as it would be in the Cube View in the web interface, and
then submitted back.
Lists
Cube Views can be used to pull lists of Members to use in Excel Form templates.
Named Regions
When a Cube View is rendered in Excel, its rows and columns are combined to create unique
Named Regions which can in turn be applied with Excel Styles for formatting purposes. See
Style Types in "Navigating the Excel Add-In" on page 1097 for more details on how to do this.
When a Cube View is viewed in the web interface, it is being done in a Data Explorer Grid which
can be added as a Component of a Dashboard.
Cube Views can also be used as a Data Source for graph Dashboard Components. This is
covered in more detail in the Application Dashboards section.
Similar to graphs, Cube Views can be used as the Data Source for a report Component in a
Dashboard.
When a Cube View is rendered in a report, its rows and columns are combined to create unique
field names in the resulting database table. It can then be applied to separate rows and columns
in a report and formatted separately.
A Cube View must contain one or more columns. The default name is Col1 but can be changed to
something useful.
To show specific combinations of columns that are not attainable by simply adding new columns to the Cube View, use the Rows and Columns Slider. In the Header Overrides section under the General Settings Slider, turn on the three Dimension types wanted and set Use Default Column Headers to False. Then, in Column Member Expansion 1, select one of the three Dimension types. Finally, in the filter, enter the columns separated by commas. Multiple Dimensions can be specified for each column in the same Member Script.
For example, to show a column expressing the actual existing headcount from all input channels, a column for the budget existing headcount to which data can be entered, and finally a column for the actual terminated headcount to which data can be entered, enter the following into the filter:
S:Actual:A#HeadcountExist:O#Top, S:Budget:A#HeadcountExist:O#Forms,
S:Actual:A#HeadcountTerm:O#Forms
A Cube View must also contain one or more rows. Creating separate rows and columns with
unique names allows the Cube View results to be broken out and formatted separately. This will
happen either directly in the Cube View, in Excel, or in the resulting report. For example, the Entity
Europe may be listed separately from the country Entities that roll up to it. This will make Europe
appear as bold on a report and have the child Entities appear in plain text.
Navigation Links
Navigation Links provide the option to set up one Dashboard to launch another in order to drill into more detail, or related detail, on a certain row of data from a report; at other times they can create a chart of the reviewed data.
The following information provides a detailed example including the steps one must take to create
Navigation Links. The implementation of this feature will differ, but for this example an Income
Statement Summary Dashboard will launch another Dashboard with more detail:
Below is a sample Cube View called IS Summary which contains Income Statement summary
accounts. First, select the row to be highlighted for navigation. Set Enable Report Navigation Link
to True and type the name of the Dashboard to open in the Dashboard to Open in Dialog field. The
specified Dashboard (not a Cube View) will open when this highlighted row is selected from this
Cube View’s related Dashboard.
This next step is optional. Add a Header Format using a Cube View Style which will make the row
appear blue or another color different from the other rows, so the user knows this row is clickable.
In this case, |!Highlight_Row!| is related to a Parameter that was made in a sample application.
When this Cube View is run, the format of TextColor=Blue will be activated for this row.
The Cube View is passing the clicked account to the other Dashboard and there the account and
its children can be viewed. Now, determine how the account will be passed. Go to the General
Settings Slider and go to the Report Navigation Links section. A value of ClickedAccount was
entered for the Account Bound Parameter Name which is going to be passed from one Dashboard
to another. More than one Dimension can be passed if it is defined in Rows, but in this case this
does not need to happen.
Copy this Cube View in order to have the same columns and POV settings for the drilled
Cube View containing the contents. This one is called IS Content because it contains the content
of the drilled data. Whatever it is called, be consistent and come to an agreement with the project
team.
In the new Cube View (called IS Content in this case), remove the rows that were there and add
the rows that will be seen when this Dashboard appears. In this case, refer to the Parameter for
ClickedAccount and add the ChildrenInclusive extension.
Under Dashboards, a Maintenance Unit was created and called GolfStream Navigation Link
Example, which stores all the objects needed to get this to work properly.
Starting from the bottom of the example above is the Parameter needed to highlight the row.
The Data Adapters are required in order to point to the Cube Views. First, set up one for IS
Content. Set the Command Type to Cube View and set the Cube View to IS Content.
Next, set up the Data Adapter for IS Summary. Set the Command Type to Cube View and set the
Cube View to IS Summary. Set the Include Row Navigation Link property to True to drill from Cube
Views in a Dashboard.
Now, set up two Content Components which will be placed on a Dashboard. These were named
with the prefix of der_ because they are Data Explorer Reports and not Charts or another form of
Component. Again, come up with a naming schema with the project team, so the result is
organized. In this case, create a Component and choose Data Explorer Report as the type. Do this for both IS Summary and IS Content and use the Dashboard toolbar to attach the appropriate Data Adapter to each.
Select Parameter Components and create a new Component with a type of Supplied
Parameter. In the Bound Parameter field, enter the name of the Parameter being passed, which is
ClickedAccount in this example. Do this for each Parameter being passed from Dashboard to
Dashboard.
Create a Dashboard Group (called IS in this example) and two Dashboards. Create one for the
launching Dashboard (e.g. IS Summary) and one for the launched Dashboard (e.g. IS Content). In
this example, a Layout of Uniform is being used, but the use of this feature may vary.
Under Dashboard Components attach the Data Explorer Report Component. Do this for both IS
Summary and IS Content. For just the IS Content (the launched Dashboard), also attach the
Parameter Component being passed (ClickedAccount in this example). This allows that account
to be passed from the initial Dashboard to the other.
The following information provides a detailed example including the steps one must take to create
a Linked Cube View. The implementation of this feature will differ, but for this example an Income
Statement Summary Cube View will have an Account Detail Cube View assigned to all its data
cells and a Sales by Product Cube View linked to specific Cube View Rows. Once a Linked Cube
View is selected, it will display in another Cube View dialog. There is also the possibility to add a
second level of Linked Cube Views in order to provide even more detail.
NOTE: Column settings override Cube View settings and Row settings override Column
settings.
Once the base Cube View, in this example it is the IncomeStatementSummary, has been
determined, create a Cube View to display more detail.
The AccountDetail Cube View will display Account details based on the selected data cell. The
IncomeStatementSummary’s data displays Parent Accounts for a specific time, so the
AccountDetail’s data will display the selected Account’s Children for that same time period.
In the AccountDetail Cube View configure the columns and rows to query the correct data.
|!DrillAccount!| is a Bound Parameter and will display the correct account data based on the
selected account in the IncomeStatementSummary Cube View.
|!DrillTime!| is a Bound Parameter and will display the correct Time data based on the selected
time in the IncomeStatementSummary Cube View.
Next, ensure the AccountDetail’s POV matches the IncomeStatementSummary’s POV with the
exception of Account and Time. Enter the |!DrillAccount!| and |!DrillTime!| Bound Parameters into
these Member fields to ensure that the AccountDetail Cube View displays data based on the
selected Account and Time in the IncomeStatementSummary.
Incorporate any formatting needed to the AccountDetail Cube View. In this example, the following
was added to the Page Caption property:
To assign the AccountDetail Cube View to the entire IncomeStatementSummary Cube View,
select the IncomeStatementSummary Cube View and navigate to Navigation Links located under
the General Settings Slider. Enter the name(s) of the Linked Cube View in the Linked Cube Views
property by clicking the ellipsis and using the Object Lookup to enter the desired Cube View
name.
Next, enter the Bound Parameters which will pass from one Cube View to the next. In this
example, a Bound Parameter was entered for Account and Time in order to base both Members
on the selected IncomeStatementSummary data cell.
Once the IncomeStatementSummary Cube View is run, the AccountDetail Cube View will be
available when a user right-clicks on any data cell.
The ProductDetail Cube View will display product details based on the
IncomeStatementSummary’s selected data cell. The IncomeStatementSummary’s data displays
Parent Accounts for a specific time, so the ProductDetail’s data will display the selected Account’s
product sales.
In the ProductDetail Cube View, configure the rows/columns, POV, and formatting to ensure that the
user sees the correct data. See the AccountDetail instructions above for more details.
In order to assign the ProductDetail Cube View to the IncomeStatementSummary rows, select the
IncomeStatementSummary Cube View and navigate to the Linked Cube Views property under
the Data Tab in the Rows/Columns Slider. Select the desired row or column and enter the name of
the Cube View.
NOTE: The AccountDetail Cube View must be assigned in this field as well because of
the overrides. Previously, AccountDetail was assigned to the entire Cube View, but
because a Cube View is being assigned to a specific row, this will override the Cube
View settings and only display the Cube View specified in this property. For both Cube
Views to be available on this row, they both must be specified here.
If any Bound Parameters need to be entered (in this example, the same Bound Parameters are
being used for both Cube Views), enter them under Navigation Links in the General Settings
Slider.
In the example above, the ProductDetail Cube View was assigned to the Net Sales row.
Now when a user right-clicks on a Net Sales cell in the Data Explorer grid, Spreadsheet tool, or
Excel Add-In, this Cube View will be available to view.
In the ProductDetail_2 Cube View, configure the rows and columns as needed.
A new |!DrillProduct!| Bound Parameter is used in order to display specific product details. For
example, if a user right-clicks on a Clubs cell in the ProductDetail Cube View and selects the
ProductDetail_2 Cube View, it will then display sales details for Clubs.
The same |!DrillTime!| Bound Parameter is used for this Cube View as it is in the other Linked
Cube Views to ensure the correct data is being displayed for the correct time.
Next, set the ProductDetail_2 POV to match the ProductDetail POV except for the User Defined 2
Member (in this example Products are viewed via the UD2 Member). Enter the |!DrillProduct!|
Bound Parameter into this Member Field.
Incorporate any formatting needed to the ProductDetail_2 Cube View. In this example, the
following was added to the Page Caption property:
Open the ProductDetail Cube View and navigate to the Linked Cube View property under
Navigation Links in the General Settings Slider. Click the ellipsis and select the ProductDetail_2
Cube View. This will link the Cube View to the entire ProductDetail Cube View. (If this needs to be
done for particular rows or columns, see Linking a Cube View to Specific Rows or Columns in the
previous section.)
Once the ProductDetail Cube View is run from IncomeStatementSummary, the ProductDetail_2
Cube View will be available when a user right-clicks on any data cell.
Result:
Linked Dashboards
Linked Dashboards is a new capability, similar to the Linked Cube Views feature, that offers more
flexibility and ease of use when performing data analysis. It provides the option to launch a
Dashboard from a Cube View when the latter is viewed in either the Data Explorer grid, the
Windows Application Spreadsheet tool, or the Excel Add-In.
When you right-click a Cube View data cell, the context menu displays a list of Linked Dashboards
along with Linked Cube Views. Selecting a link launches the linked Dashboard in a dialog
providing more detail and visibility.
Before building a Linked Dashboard, determine the following:
l Overall purpose of the linked Dashboards in relation to the source Cube View.
l The data you want to display via the source Cube View as well as the detail data you want
to display in the linked Dashboards. This may require additional detail Cube Views.
l The POV information you want to pass from the source Cube View to the Dashboard. This
determines the Bound Parameters needed on the Source Cube View.
l How the POV information will be used in the Dashboard. This determines where the Bound
Parameters need to be defined.
Data Entry Example: Using a linked Dashboard and Cube View for Budget data entry. The
objective is to launch an Operating Expense Dashboard Form from a Budget Review Cube View
(the source Cube View). Linked forms must be specific to the selected Entity, Time, and Account
from the source Cube View.
2. Define a Bound Parameter for each POV Member you need to pass to the linked reports
(this will vary and may not always be applicable).
Source Cube View Bound Parameters example: The linked form in this example must be
specific to the selected Entity, Time and Account from the source Cube View. Each of those
dimensions requires a Bound Parameter.
l Cube View Component -- Display an Operating Expense data entry Cube View
The intent of Bound Parameters is to pass the Parameter value in the background rather than
through a prompt, thereby creating a seamless navigation experience. Bound Parameters impact
the Dashboard when there is a Member dependency between one of its Components and the
source Cube View.
2. Creating a Supplied Parameter Dashboard Component for each Cube View Bound
Parameter and assigning the Components to the Dashboard (Requirement 2).
Requirement 1
The data source for the Dashboard Cube View Component is an Operating Expense Cube View.
The results of this Cube View are dependent on the Source Cube View Account, Entity and Time
Members.
To pass these values, three Bound Parameters (drilltime, drillaccount, drillentity) from the Source
Cube View must be assigned to the OpEx (Operating Expense) Cube View. This is the Cube View
assigned to the Cube View Dashboard Component and it must use the same Entity Member.
Rows must display the Base Accounts of the source Cube View selected Parent Account:
Columns must display time periods based on the selected Year in the source Cube View:
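As an illustrative sketch only (the .Base filters and bracketing are assumptions, not the exact definitions shown in the screenshots), Member Filters along these lines could satisfy both requirements, assuming drilltime resolves to a year Member:
Rows: A#[|!drillaccount!|].Base
Columns: T#[|!drilltime!|].Base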
NOTE: The selected Cost Center in the Dashboard combo box is also used by the OpEx
Cube View. This does not require a cost center (UD1) value/Bound Parameter from the
source and is specific to the Dashboard functionality.
Requirement 2
The Source Cube View uses three Bound Parameters: drillaccount, drillentity, and drilltime. Each
requires its own Supplied Parameter Component, for a total of three.
1. Type the name of the respective Cube View Bound Parameter in the Bound Parameter
Property
2. Construct and assign Dashboard Components as normal making sure to include these
Supplied Parameters.
Navigate back to the source Cube View and assign the Dashboard accordingly. This has the
same behavior as Linked Cube Views.
If the Dashboard is assigned to the entire Cube View, it is available when you right-click on any Cube View data cell.
If it is assigned to a specific column or row, the Dashboard is only available when you right-click on data cells in that column or row.
2. Right-click on a data cell (this will vary based on where it’s assigned).
Use Cases
This functionality gives you the ability to drill down to:
l Interactive Dashboards for detailed analysis (for example, the BI Viewer, Advanced Charts,
and so forth)
l Workspaces
l Resources
Conclusion
Linked Dashboards offer more flexibility and ease-of-use when performing data analysis. This
feature provides an all-encompassing, guided-reporting experience as the data presented in the
linked reports is directly related to the main Cube View via Bound Parameters. It also eliminates
the need to incorporate too much detail in a single Cube View or Dashboard because you can drill
down into the details as needed.
See Member Filters in "Cubes" on page 400 for more details on using Member Filters.
Designs for analytic reports typically have multiple dimensions nested on rows. The combination
of members generated from the nested expansions can easily result in billions of potential
expanded rows, many of which may not have data. For example, a Cube View with 1,000
Accounts, 1,000 nested UD1 Members, and 1,000 nested UD2 Members will result in 1 billion
expanded rows.
In these designs, when standard row and/or column suppression is applied (suppression for
invalid rows, no data rows, or zero rows), the Cube View might not open because
each of those billion rows will be inspected individually to determine which of the relatively few
rows have any data. The internal code that evaluates one billion rows one at a time may take
significant time to execute.
With Sparse Row Suppression enabled, Data Unit evaluations are performed first to determine
the full population of data before generating potentially billions of rows. This allows most of those
rows to be eliminated before they are generated. Once filtered by the Sparse Suppression
process, the rows returned to the Cube View will be evaluated by the Cube View row/column
suppression settings. By performing the Sparse Suppression, which eliminates the empty
records, a Cube View can be displayed much faster and more efficiently because the Cube View
suppression processing has a smaller set of records to evaluate.
The fundamental processing of Sparse Row Suppression is to evaluate the data records on the
rows, by the intersections defined by the Cube View columns. Sparse Row Suppression will filter
records with no data prior to returning them to the Cube View.
Cube View columns must be evaluated to identify any that contain a member generated from a
Dynamic Calc formula or a GetDataCell script as an expansion or calculated column. If a Cube
View enabled for Allow Sparse Row Suppression contains such a column definition, an error will
be displayed.
If a Cube View contains a column which will result in a member having a Dynamic Calc formula or
a GetDataCell script, the specific column’s Allow Sparse Row Suppression setting must be
changed from True to True (but determine sparse rows using other columns). This setting flags
the Sparse processing to ignore the column when evaluating data intersections for the rows. The
suppression for the rows will then be determined by the data records evaluated across all other
columns which are set to True for Allow Sparse Row Suppression.
The Sparse Column setting True (but determine sparse rows using other columns) can also be
applied to one or more columns by the designer if the designer can determine that other columns, or a
single column, can be used to determine sparsity. This could enhance performance because the
Sparse Row Suppression process would have fewer column intersections to evaluate.
The design layout of rows and columns for use with Sparse Row Suppression has no restrictions.
The rows or columns can contain Data Unit members, such as Entity and Scenario, or “right side”
(non-Data Unit) dimensions such as Account and User Defined dimensions. However, overall
best practices for Cube View designs still apply in that the overall performance of Cube Views is
greater when multiple Data Unit dimension members are not defined in the rows or columns.
1. The Cube View’s General Setting / Common must have Allow Sparse Row Suppression
set to True.
2. The rows should be designed for the required Member Filter definition of
dimensions/members/expansions.
3. Each row can vary by the assignment of Suppression settings found under Data. Any row
assigned a Suppression setting will be enabled for Sparse Suppression when the Cube
View Common/Report setting of Allow Sparse Row Suppression is set to True.
4. Each report column must have Allow Sparse Row Suppression set to True. The
Suppression settings for Invalid Columns, No Data Columns or Zero Columns can be
independently assigned to support reporting requirements.
5. Should any column be defined as a Dynamic Calc member or as a GetDataCell script, the
Allow Sparse Row Suppression setting must be changed from True to True (but determine
sparse rows using other columns). A column may also be set to True (but determine sparse
rows using other columns) if it is determined other columns sufficiently define the data
intersections to limit the amount of processing required to determine sparsity.
With Workflow
Set the Point of View, Rows, and Columns in the Cube View, so that it is driven by the Workflow
POV and the Entities assigned to the Workflow Profile in use. By doing this, a user can make
Forms, Dashboards, Cube Views, and Reports driven dynamically by the Workflow Profile.
Workflow Entities
Using an expression such as E#Root.WFProfileEntities from within the Rows or Columns Tab
shows the Entity or Entities assigned to that particular Workflow Profile at run time.
WFProfileEntities or similar expressions cannot be assigned to the POV because there can be
more than one and the POV only requires a single Member.
Workflow Scenario
Under the Cube View Point of View Slider, select the WF Member for the Scenario Dimension, or
use |WFScenario| or a similar Substitution Variable in Rows and Columns.
Workflow Time
Under the Cube View Point of View Slider, select the WF Member for the Time Dimension, or use
|WFTime| or a similar Substitution Variable in Rows and Columns.
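For example, a row or column could reference these Substitution Variables directly in a Member Filter; this is a simple illustration assuming the standard Workflow variables:
S#|WFScenario|
T#|WFTime|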
Within the Cube View, refer to Parameters within the Point of View and Rows and Columns Slider
in order to restrict the query to just the data expected. Surround the Parameter name in pipes and
exclamation points (e.g. |!ParameterName!|).
See below for an example of using the Entity and Region Parameters within the Cube View’s
Point of View:
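A hypothetical sketch of that Point of View follows; the Parameter names ParamEntity and ParamRegion, and the use of UD1 for Region, are assumptions rather than the values shown in the original example:
Entity: E#|!ParamEntity!|
UD1 (Region): UD1#|!ParamRegion!|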
TIP: Use Dashboard Parameters as a single repository for Parameters that are used in
Dashboards or Forms. If a Parameter is referred to within a Cube View (e.g.
|!ParameterName!|) and there is not a Parameter by that name associated with the
Form, it will search through the Application’s Dashboard Parameters for one with that
name and use it.
Example
A#60999:Name("Net Sales") or A#60999:Name(Net Sales)
Example of simple member math including the :Name() function, typically applied to a column in a
Cube View:
GetDataCell("S#Actual - S#Budget"):Name(Variance)
The Variance, VariancePercent and Divide functions are very useful for common calculated
columns and rows in Cube Views.
Variance = A - B
VariancePercent = (A - B) / Abs(B)
Divide = A / B
The new Divide function is different from the regular division operator (i.e., the forward slash
character) in that Divide returns NoData when a divide-by-zero would have occurred.
BetterWorse and BetterWorsePercent functions also provide a Variance, but take Account Type
into consideration.
GetDataCell(Variance(S#Actual,S#Budget)):Name("Variance")
GetDataCell("VariancePercent(S#Actual, S#Budget)"):Name(Variance %)
BetterWorse Example
BetterWorsePercent Example
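Hedged examples of these two functions, assuming the BWDiff and BWPercent names used with column math later in this section also apply to Member Filters:
GetDataCell("BWDiff(S#Actual, S#Budget)"):Name("BetterWorse")
GetDataCell("BWPercent(S#Actual, S#Budget)"):Name("BetterWorse %")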
These functions require square brackets around the date function in order to work with
Parameters. Placing [square brackets] around a Member Filter (after the # sign) helps the financial
engine interpret what is being asked, especially for Time-based member filters. This is also the
case when there is a space or a period in the Member name. Example: GetDataCell
("VariancePercent(T#[|POVTime|],T#[YearPrior1(|POVTime|)M12])"):Name("PrYear%")
There are several variations on this method, depending on whether the expression refers to
Columns, Rows or a combination.
Example of simple member math including the :Name() function, typically applied to a column in a
Cube View:
GetDataCell(CVC(Col1) - CVC(Col2)):Name(Variance)
For example, if Columns in a Cube View are defined as the Scenario dimension and Col1’s
Member Filter is defined as S#Actual and Col2’s Member Filter is defined as S#Budget, the
difference between these for each row in a Cube View would result in this calculated Column with
a header of “Variance.”
The syntax is similar, but instead of CVC, a calculated Row should use CVR for getting the value
of a Row within a formula. Example of syntax:
GetDataCell(CVR(SomeRowName) + CVR(SomeOtherRowName)):Name(HeaderName)
NOTE: If a row or column name is numeric (e.g. 500), then single quotes are required when
specifying that name. Square brackets are allowed, but not required. For example:
GetDataCell(CVR('123') - CVR(['4,567'])):Name(Difference)
If the Member Filter in a Cube View Column or Row returns more than one result, by default,
the first number to appear will be used unless a Column/Row Index is specified.
GetDataCell(CVC(Col1, 1) - CVC(Col2, 3)):Name(Variance)
It is possible to use the keyword First instead of the index. Note that since the index is optional
when using CVC and CVR functions, it is not necessary to use First or 1 for the index unless using
the CVRC function (Row & Column math, see below). Examples: CVC(Col1, 1) CVR(Row1, First)
An example of needing a Column & Row Index is this: Col1 has a Member Filter of S#Actual,
S#Budget, meaning it will return two columns. In this case, a variance between Actual and Budget
Scenarios can be shown like this: GetDataCell(CVC(Col1, 1) - CVC(Col1, 2)):Name(Variance)
Functions like Divide can be used to avoid divide-by-zero situations. Note that double quotes are
not necessary in this case, as they are when using these functions with Member Filters.
GetDataCell(Variance(CVC(Col1,2), CVC(Col1,1))):Name(Variance)
GetDataCell(VariancePercent(CVC(Col1,2), CVC(Col1,1))):Name(Variance %)
GetDataCell(Divide(CVC(Col3), CVC(Col2))):Name(Ratio)
GetDataCell(BWDiff(CVC(Col1), CVC(Col2))):Name(BetterWorse Difference)
GetDataCell(BWPercent(CVC(Col1), CVC(Col2))):Name(BetterWorse %)
The function CVRC takes four arguments to compare the intersection of a Row and Column. Note
that the Index is not optional: CVRC(RowName, RowIndex, ColumnName, ColumnIndex)
Example: CVRC(Row1, 1, Col1, 1):Name(Column Row Intersection)
It is possible to use the keyword Current instead of the Row or Column Name or Index. For
example: CVRC(Current, 2, Col, 1):Name(Current) CVRC(Current, First, Col1, 2):Name(Current)
The following CVRC statement is a common example that would be defined in a Cube View
column. Assume that the rows of the Cube View are revenue Accounts (including a row called
NetSalesRow) and that Col1 is V#YTD. This formula could be used to determine the percent of
the Net Sales based on each row and index (i.e. Current):
GetDataCell(CVRC(Current, Current, Col1, 1) * 100 / CVRC(NetSalesRow, 1, Col1, 1)):Name(% of Net Sales)
2. Expand Cube View Groups, expand a specific Cube View Group, and select a Cube View.
3. Click Rows and Columns. In the example below there are three columns, where Column 3
has the GetDataCell() function applied for the total of Column 1 and Column 2.
4. Click Open Data Explorer to generate the Cube View and see the data. In the example
below you see that it is adding the value 200.00 from Column 1 and the value 200.00 from
Column 2 for the total of 400.00 in Column 3.
5. Click Edit.
6. Select Column 1 and at the bottom of the window click the Formatting tab.
7. In Header Format, click the ellipsis. The Header Format window opens.
8. Click Format.
Dashboards
Dashboards are a powerful place where the user can view formatted Cube Views, pixel perfect
financial reports and expressive charts. Any given Dashboard is contained in a Dashboard Group
and is made up of Components, Data Adapters and potentially Parameters. All of the objects are
managed and shared across Dashboard Maintenance Units. Each is a building block for the others.
The main use of a Dashboard Maintenance Unit is to enable the sharing of key Dashboard
artifacts like Parameters, Data Adapters and Components across multiple Dashboard Groups.
These objects do not have security access settings, so they assume the settings of the
Maintenance Unit. Once a Maintenance Unit is created, Dashboard Groups, Components, Data
Adapters, Parameters, and Files can be created within the unit.
Dashboard Groups are created to organize Dashboards and work as placeholders where the
Dashboards reside. The Dashboard Groups are then available to assign to Dashboard Profiles for
viewing throughout the Workflow process.
Dashboards are composed of Components which are broken into Parameter Components,
Content Components, and Embedded Components. Components can have one to many Data
Adapters and can be used across multiple Dashboards and Dashboard Groups within the same
Dashboard Maintenance Unit.
When creating a Component, give it a Name and Description. Feed Parameters and other
Substitution Variables into the Description, so they will appear in the resulting Component.
Data Adapters are the minimum building block for Components. These specify where the data is
coming from for the Dashboard Component. Each Data Adapter also names the resulting table
created when the Dashboard runs.
Parameters can be used to filter data in the resulting Dashboard. They are not required, but
extremely useful. See "Using Parameters" on page 220 and "Presenting Data With Books, Cube
Views and Other Items" on page 576.
The File section allows an administrator to create company specific Dashboards by uploading
documents and images.
NOTE: The optimal display is 1920 x 1080 resolution. For the Windows desktop
application, we recommend setting the scale to 100% and using the zoom functionality
to zoom in. This is especially helpful in dashboards such as People Planning and
Reporting Compliance.
1. Select the Dashboard to which changes need to be made and set it as the default
Dashboard.
Design Mode:
3. Toolbar
Click this to go back to maintenance mode. Note: Highlight a specific Dashboard item
(Data Adapter, Component, etc.) and select this icon to go directly to that item in the
Dashboard maintenance screen.
4. Dashboard Maintenance Unit Items: This provides the entire Dashboard maintenance
hierarchy for the default Dashboard. For more complex Dashboards, use the Search to find
the desired item. This Dashboard example contains one chart Component, one Cube View
Component and the custom substitution variables used (parameters and their variables).
Hover the mouse over each item to display a tooltip that identifies the value currently being
used. Each Component has its own Data Adapter.
6. Find Tree Item: When a user clicks this icon, the maintenance items involved in this part
of the Dashboard are highlighted in the tree. For example, in order to change the title of the
Dashboard chart above, click in order to see what part of the maintenance hierarchy
controls the chart.
7. Click to return to edit mode and it navigates directly to the Chart Component selected in
design mode.
8. Make the necessary changes and preview it once more in design mode to see the outcome.
NOTE: Authoring Dashboards which are designed to contain more than six levels of
nested Dashboards is a rare use case and not recommended. When viewing in the
Silverlight Web Browser, results may be inconsistent. If there is a desire to nest
Dashboards to this level or beyond, it is recommended to use the OneStream Windows
App interface.
3. Click Components.
7. When you set the properties at design time, you set a minimum and maximum date range
that is available at run time. For example, if Min Date is set to 20200101 and Max Date to
20201231, users can only select dates within that range.
If you click the ellipsis, you can choose an existing parameter, which will automatically fill in
the |!...!| syntax.
You can enter and use a Dashboard XFBR string.
NOTE: Min and Max Date values should be entered in the following format:
yyyyMMdd. This value is converted at run time based on your application culture
setting.
Use the double arrows >> to show the max date and << to show the min date.
NOTE: If no min date is specified, the min date is set to 1/1/1900. If no max date is
specified, the max date is 12/31/9999.
The chart below displays product revenue by region. This data is driven by a Cube View
where the Cube View Rows display the region and the Cube View Columns display the products.
Only the Y axis can be customized with this chart type, which is driven by the Cube View data. The
Legend is driven by the Members on the Cube View Rows and the column Members display
multiple axes organized in a circle.
Report Designer
The Report Designer is built into the Dashboard Report component, which allows you to edit
reports on the report itself.
NOTE: The OneStreamClientApi.dll is now included with the Windows app install. When
you install the Windows app, it includes the Client API installer.
2. Setting the display Zoom slider will assign the display percentage to the Maximize view.
1. Click the XF Tools tab and then click Export Report Layout. The Save As dialog box
opens.
3. The report layout changes to the formatting of the Cube View you selected.
4. Click Save to save the new report layout or click Cancel All Changes Since Last Save to
revert to the previous layout.
3. The report layout changes to the formatting of the Dashboard Report Component you
selected.
4. Click Save to save the new report layout or click Cancel All Changes Since Last Save to
revert to the previous layout.
Functionality
For most right-click functionality in Report Designer, use the Properties dialog box.
NOTE: When editing sub reports, you can access the Generate Own Pages option from
the Properties menu.
2. Click any of the column headings to sort the data in the column.
1. From the Edit Bindings dialog box, click Show only invalid bindings.
A list of invalid bindings opens. The invalid bindings have a red background.
2. Select one of the invalid bindings from the Data Binding column and click the down arrow.
3. From the drop-down list, select the correct data path that exists in your data set.
4. After you click off of that row, the color of the row changes to yellow to identify that it is
modified.
NOTE: Even if you modify an invalid binding to make it valid, it will still appear in
the Invalid list.
6. Click Save in the Report ribbon to save your changes to the report and database. Or, click
Undo to revert the changes without saving.
Implementing Security
In this section you will learn about the four-pronged approach to managing security, which
consists of Workflow Security, Entity Security, Account Security, and Security Roles. Security can
be implemented on accounts or dimensions allowing you to control who can review specific
Dimension Members. Security is determined through Users and Groups, with users given specific
roles to determine what data can be accessed or edited.
Application Security
A four-prong approach to manage security which consists of Workflow Security, Entity Security,
Account Security, and Security Roles is used. Once Entities are identified and assigned to
Workflow Profiles, Data Loaders and Certifiers can be determined for each Entity. Data Loaders
load data into the system, therefore they need Read/Write access to Entities. Data Certifiers
review and sign off on the loaded data, so they need Read access to Entities. Security can also
be done on the Account or any other Dimension to control who can review specific Dimension
Members.
Security is determined through Users and Groups. Users are given specific roles to determine
what data is accessed or edited. For example, if a user is given the ModifyData role in an
application, he/she will have write-access to any data in it. Users are also put into Security
Groups. Groups can support native Groups, Exclusion Groups, or Groups of Groups. For
example, a user can be put into an Entity’s Read Write Data Group in order to have read/write
access to the Entity’s data.
Every object has Access and Maintenance security rights, with the exception of Task Scheduler.
Access allows the security group to view the object, while Maintenance allows the groups to edit
the definition of the object. This system applies to most objects, for example, Cube Views,
Dashboards, Transformation Rules, and Workflow Profiles.
A Maintenance Group is the middle level of power for an object at the group level. For example, a
Maintenance Group assigned to a specific Entity Transformation Rule Group allows the assigned
users to create, edit, and delete rules within that Transformation Rule Group.
An Access Group is the lowest level of power for an object at the group level. This means the
object can be used, but its definition cannot be edited.
Confirmation Rules
Confirmation Rule Groups are assigned to Confirmation Rule Profiles which are then assigned to
Workflow Profiles. The run time access to these Confirmation Rules depends on to which
Workflow Profile they have been assigned. If a user has Workflow Execution Access, he/she will
be able to execute them.
The best way to control Confirmation Rules is to set Access to Everyone and Maintenance to
Administrators for both Confirmation Rule Groups and Profiles.
Certification Questions
Certification Question Groups are assigned to Certification Question Profiles which are then
assigned to Workflow Profiles. The run time access to these Certification Questions depends on
to which Workflow Profile they have been assigned. If users have Workflow Execution Access,
they will be able to execute them.
The best way to control Certification Questions is to set Access to Everyone and Maintenance to
Administrators for both Certification Question Groups and Profiles.
Data Sources
Data Sources are assigned to Workflow Profiles. The run time access to these Data Sources
depends on to which Workflow Profile they have been assigned. If a user has Workflow Execution
Access, he/she will be able to execute them.
The best way to control Data Sources is to use the ManageDataSources Application role, and
no security settings at the object level.
Transformation Rules
Transformation Rule Groups are assigned to Transformation Rule Profiles which are then
assigned to Workflow Profiles. In this case, an appropriate user group needs to be assigned to
Access and Maintenance because users will be able to right-click on an Import Workflow Profile
and view/edit their Transformation Rules. The user groups should include the users assigned to
execute the Workflow Profiles to which the Transformation Rule Profile has been assigned.
The best way to control Transformation Rules is to set Access to Everyone and Maintenance to
Administrators for most core, shared, or corporate Transformation Rule Groups. For some
specific Transformation Rule Groups, such as an Account Transformation Rule Group that
applies to a specific location, assign the appropriate user groups to Access and Maintenance.
Block access to the Maintenance screen for anyone except administrators because this could
potentially allow users more access than they need.
The best way to control Form and Journal Templates is to set Access to Everyone and
Maintenance to Administrators for both Form/Journal Groups and Profiles.
Cube Views
The best way to control Cube View Groups is to set Access to Everyone and Maintenance to
Administrators and anyone else building a Cube View. To keep the assignment of Cube View
Groups to multiple Cube View Profiles flexible, the Cube View Groups need to remain smaller in
size. For Cube View Profiles, set Access to anyone who will need to see these Cube Views in
OnePlace, Excel, or assign them to Workflow Profiles, Forms, or Dashboards. Set Maintenance
to anyone who needs to change the assignment of the Cube View Groups to Cube View Profiles.
OneStream recommends setting the Can Modify Data, Can Calculate, Can Translate, and Can
Consolidate properties to False. This can be pre-set for all new Cube Views by creating an
example or Cube View Template which can be copied to create new ones. Some examples of
when this is not needed are when the Cube Views will be read only by administrators, used as a
data entry form, or only visible in a formatted report or chart.
The best way to control Dashboard Groups is to set Access to Everyone and Maintenance to
Administrators and anyone else building a Dashboard. In order to keep the assignment of
Dashboard Groups to multiple Dashboard Profiles flexible, the Dashboard Groups need to remain
smaller in size. When assigning Maintenance for Dashboard Profiles, give access to anyone who
needs to see the Dashboard in OnePlace, assign it to a Workflow Profile, or change the
assignments of Dashboard Groups to Dashboard Profiles.
Use multiple Dashboard Maintenance Units in order to keep them a reasonable size, making it
easier to manage multiple objects and access. Dashboard Parameters can also be used across
all Dashboards and do not need to be copied across all Maintenance Units. Security has no
bearing on the use of Parameters.
Workflow Security
Security groups exist for all Workflow Profiles for Workflow Execution (the ability to process a
Workflow for a specific Workflow Profile), Certification Signoff, and the separate ability to Process,
Approve, and Post Journals. In certain cases, the user simply needs the Access and
Workflow Execution Group Access to run Workflow. For example, the user does not need Access
or Maintenance Group access to Data Sources or Transformation Rules in order to run through
the Import Workflow.
In some cases, having access to certain objects is necessary along with Workflow Execution
Group Membership. The Manage App Role has to do with creating, reading, updating, and
deleting Journal and Form Templates (metadata) themselves, not just instances of these objects
at run time. It is expected that 90% of Workflow users will not have any of the Application Roles,
but their access will be controlled by the Access Group for those Journal/Form Template Groups
and Profiles. Workflow users also need Workflow Execution Group access in order to perform
import, forms, and journal actions. The user does not have to be in the Manage Application role to
create a Journal or enter data in the Form. Workflow security governs access to the forms. If the
user is in the ManageJournalTemplates Application Role group, he/she can create any Journal
needed for the Workflow Profiles to which they have proper execution access.
Users need to have at least Access Group privileges to the Cube Root Workflow Profile node,
along with the ManageWorkflowProfiles role, to edit Workflow Profiles. Otherwise, they will not be
able to see any Workflow Profiles under the Cube Root Workflow Profile.
The order to follow when assigning access to Workflow Profiles and data is to first assign Read
and Read/Write Groups to the Entities involved. Next, create an Access Group, Data Group, and
Approver Group for each Workflow Profile and include the appropriate Entity groups.
Import
First, determine whether the users can load data to the Workflow for the assigned Entities and
then determine whether they load both GL (BS and PL) and Supplemental data, or one or the
other. Next, decide if the users for the assigned Entities can certify the loaded data as part of the
Workflow.
Forms
First, determine whether the users can manually input data into a form and certify it as part of the
Workflow for the assigned Entities.
Adjustments
First, determine whether the users can manually input data into a journal and certify it as part of
the Workflow for the assigned Entities.
Entity Security
Entity Security controls the overall read/write access to Entity data and controls whether Cube
Security should be used. When creating Entity security groups for the Read Data Group and the
Read/Write Data Group, the groups should be named in a logical convention such as XXXX_View
or XXXX_Mod. The Entity Read/Write Data Group should be designed first because it is needed
for data loading in Workflows. The Workflow Execution Security Group should be assigned to all
the Entities’ Read/Write Security Group for the Workflow to gain loading access to the Entities.
When setting up View Security Groups for Entities, first consider how users need to view their
data whether it is by segment or region. Determine whether it makes more sense to have one
Entity View Group per Entity, or to create one Entity View Group per segment and apply one Entity
View Security Group to many Entities’ Read Data Group. All the Entities’ View Groups below the
Parent must be assigned to the Parent Level Entity View Group in order to gain access to data at
the Parent Level Entity and View Entities below it. Try to minimize the number of View Entity
Security Groups where possible.
Relationship Security
You can change the security model to control who can view or modify the relationship
members in the Consolidation dimension.
For the Use Parent for Relationship Consolidation dimension Members functionality:
l If set to False, the user’s entity rights control their rights to all members of the Consolidation
dimension. This is the default security model.
l If set to True, the user’s rights to the relationship members of the Consolidation dimension
are determined by their rights to the current entity’s immediate parent.
Users have either read or read/write access to view or modify their entity so they can see their
entire entity from all of the relationship members.
You can allow the relationship security portion of the Consolidation dimension to be controlled by
the access to the parent entity in the immediate hierarchy.
2. Select a Cube.
3. In the Cube Properties tab, in Use Parent Security for Relationship Consolidation
Dimension Members, click True.
4. Click Save to save the cube properties. You have now changed the security model for the
read or write relationship for the Consolidation Dimension members to be based on the
security rights to the immediate parent entity.
6. In Cube Views, go to the Designer tab and click Open Data Explorer.
Now you can see that all of the security has changed.
l In the first screenshot, all the Consolidation Dimension members are available across
Houston.
l When you switch on the new security, in the previous screenshot, the user has rights to
“USClubs,” then looking at the Houston member there, all the Consolidation Dimension
members are available.
l Looking at the last group, the user has no rights to Texas, based on the NoAccess to Local
and Translated. In Houston you can see the relationship members changed to “NoAccess.”
l The reason USClubs turns to NoAccess is that the user apparently has no rights
to the parent NAClubs, which is not displayed on the report.
Data Security
Data Security controls the overall read/write access to Analytic Models. There are several steps to
see if you have access to data. First, it will check to see if you have the ModifyData role in the
application, otherwise you will never get write access to any data. Next, it will ensure you have
either the Access Group or Maintenance Group for a Cube, or you will not have access to any of
the Cube’s data. It will then check the Scenario and Entity’s Read Data Group and Read Write
Data Group to ensure access, and finally it will check the Data Cell Access Security at the Cube
level. If you have access to the data and use Excel or any other method to import data, it ensures
the Entity being written to belongs to a Workflow Profile with an active Forms Channel that is not
locked or certified. If any of these steps result in no access, the process stops.
There are several ways to guarantee data is secure. Throughout the Dimensions, different
security groups are available, and an administrator can decide what users belong to each group.
The Scenario Dimension has both Read Data Group and Read and Write Data Group. The Entity
Dimension uses both security groups from the Scenario and adds Display Member Group and
Use Cube Data Access. The Display Member Group only refers to the member display access
level, not the data access level. Use Cube Data Access will be used if “slice” filters are being
applied at the cube level to apply additional layers of security for specified Security Groups. The
Account, Flow, and all User Defined Dimensions also use Display Member Group. At the Cube
level, more security is put in place through Groups, Member Filters, and complex layered security.
A Cube “slice” filters a data entry form to the right Member Set such as choosing to which Cost
Center a user can enter data. It is important to note, for the “slice” filters to take effect, the user
must first be granted access to the cube, scenario, and entity. “Slice” security cannot increase
access to data that was not administered first through Users and Groups. An administrator can
also lock down more than one Dimension by using Cube Data Access and can control user
visibility by only giving access to certain accounts. Finally, Data Cell Conditional Input and Data
Management Access Security can be used. These are not security settings but can still control
how a Dimension can be used for input and how a Cube is modified.
Security Configurations
Restrict Users to an Application
Setting up access to an application can be done within the Application Security Roles. When an
application is first created, the OpenApplication defaults to Everyone. If specific people have
access to log into OneStream, but only need access to specific applications, then security groups
can be assigned to the OpenApplication role. Once the security group is created, users can be
assigned to it. This can be done by performing the following steps:
3. Assign all users who should have access to the application to the new security group.
4. Refresh the application in order for this new security group to appear in all drop down
menus.
6. Click the drop-down arrow and select the new security group.
7. Click Save.
4. Click , or double click on the cell to make changes to the Member Filter. Add the
Dimension intersection to restrict data loading. In this case it will restrict users from loading
to the Trial Balance Account through the Forms Origin.
5. In the In Filter field, choose a Behavior and choose the Read Only Access level.
For example, there may be many read-only intersections in Actual to which users should not load
data. These rules are set up in the Data Cell Conditional Input section within Application|
Cube|Cube|Data Access. However, for historical purposes, there may be data in these
intersections that was there prior to the filters being applied. The Scenario to which the data is
being copied needs to allow access because the historical data in these intersections may need to
be copied for analysis.
Create a new Data Cell Conditional Input Rule for the entire Scenario, set the Behavior to
Increase Access And Stop, and set the Access Level to Read Only.
Position this new rule at the top of the Data Cell Conditional Input for the Scenario.
The Behavior option Increase Access And Stop is used because, if the current cell matches
the filter, access is increased and all subsequent data access rules below are ignored.
In this case, the Preserve Scenario has access to everything, and the subsequent Data
Cell Conditional access rules are ignored or not applied for Preserve.
System Security
System Security applies to the framework. There are some key assumptions around how the
different roles or security groups work. First, it is important to understand the hierarchy of certain
System Security Roles and Security Groups:
Maintenance Group
The Maintenance Group means users can not only see an object, but can also create new objects in
Groups, edit them, and delete them. Users do not need to be in the Access Group for an object if they
are in the Maintenance Group. The Maintenance Group can also control the contents of Profiles.
Access Group
The Access Group means users can see the object and read its contents.
Avoid deploying changes to a production environment during times of high load and high
application activity. In particular, changes to these types of application artifacts should not be
performed against a production environment experiencing heavy activity:
l Confirmation Rules
Applying changes like this while the production system is under a high level of activity may have a
negative impact on servers and have the potential to cause running processes to produce an
error.
For standard environments, it is recommended to schedule production changes during slow periods
or non-work hours. Large environments should consider using the Pause functionality within the
Environment tab to allow activity to wind down. These large environment managers should also
consider the Marketplace solution, Process Blocker, which allows a pause of critical processes to
perform maintenance on the system, without having to shut down the entire application. Process
Blocker allows current tasks to be completed, while any new requests are queued, allowing the
changes to be applied safely and effectively. Once these changes are in place, it is recommended
to significantly limit the ability for users to make such changes during high-volume periods.
It is key that servers get a chance to recycle for good system memory health. IIS also has an Idle
Time-Out setting for our OneStreamAppAppPool. This setting should be set to 0 since
OneStream has other settings to recycle IIS. For active, global environments with Data
Management Sequences regularly being executed, a recycle of IIS is recommended every 24
hours for these OneStream App Servers. Please discuss this situation with OneStream Support to
find what is recommended, as each situation may vary.
3. Click Security.
4. Set the Logon Inactivity Threshold (days) to the number of days of inactivity before the user
can no longer access the system.
5. Click OK.
6. Click Save.
Review Settings
The Administrator is the only role to which the inactivity threshold does not apply.
3. If the Remaining Allowed Inactivity is zero that means the user no longer has access.
4. When they try to log on, they will get a message telling them they have been disabled.
8. Select True.
9. Click Save.
The Remaining Allowed Inactivity field updates allowing the user to log back in within the
time frame specified.
Business Rules
1. Go to Application > Business Rules > Extensibility Rules.
System Table
1. Go to System > Database > System Database> Tables.
Business Rules
A Business Rule is a VB.Net Class, meaning each Business Rule is an independent object
encapsulating VB.Net code. A Business Rule can be a one-line call to write a log message, or it
can be a full code library containing other custom VB.Net Classes, Methods and Properties.
l Platform Engines
See the API Overview Guide and API Details and Database Documentation for detailed Business
Rule engine background, an API guide, and information on each related database.
Platform Engines
The platform is composed of multiple processing engines. These engines have distinct
responsibilities with respect to system processing and consequently they expose different API
interfaces to the Business Rules they call. This section provides a brief overview of each engine
in the platform and describes the engine’s core responsibilities.
Workflow Engine
The Workflow Engine is thought of as the controlling engine or the puppeteer. The main
responsibility of this engine is to control and track the status of the business processes defined in
the Workflow hierarchies. This engine is primarily accessed through the BRApi and can be called
from other engines in order to check Workflow status during process execution. The Workflow
Engine provides a very rich event model allowing each Workflow process to be evaluated and
reinforced with customer specific business logic if required (see Appendix 2: Event Listing).
Stage Engine
The Stage Engine performs the task of sourcing and transforming external data into valid analytic
data points. The main responsibility of this engine is to read source data (files or systems) and
parse the information into a tabular format. This allows the data to be transformed or mapped to
valid Members defined by the Finance Engine. The Stage Engine is an in-memory, multi-
threaded engine that provides the opportunity to interact with source data as it is being parsed
and transformed. In addition to parsing and transforming data, the Stage Engine also has a
sophisticated calculation that enables data to be derived and evaluated based on incoming
source data. The Stage Engine provides quality services to source data by validating, mapping,
and executing Derivative Check Rules.
Finance Engine
The Finance Engine is an in-memory financial analytic engine. The main responsibility of this
engine is to enrich and aggregate base data cells into consolidated multi-Dimensional
information. The Finance Engine provides the opportunity to define sophisticated financial
calculations through centralized Business Rules as well as member specific Business Rules
(Member Formulas). It works concurrently with the Stage Engine to validate incoming
intersections and works with the Data Quality Engine to execute Confirmation Rules which are
used to validate analytic data values.
Presentation Engine
The Presentation Engine provides extensive data visualization services to the Platform. The
Presentation Engine is made up of the following component engines: Cube View Engine,
Dashboard Engine, Parameter Engine, Book Engine and Extensible Document Engine. The
Presentation Engine is responsible for managing and delivering content to the end user as well as
providing a development environment for custom user interface elements. This engine enables
the MarketPlace application development capabilities and continues to evolve with each product
release. Like the Data Management Engine, the Presentation Engine interacts with and can call
the services of all other engines in the product.
Scaling Engine
This feature will be made available in a future release.
The Scaling Engine provides services that will determine whether the customer wants to Scale
their Server Set or Database Elastic Pool on the Platform. This is only available to Cloud (Azure)
and does not pertain to On-Premise solutions. For example, the customer must be utilizing Azure
Scale Set and/or SQL Server Elastic Pool functionality. This provides the ability to create or
delete a VM and/or increase/decrease database resources based on the logic that is designated
in the System Extender Business Rules to meet the customer needs.
BRApi
The BRApi is common across all Business Rules, engines and APIs being run, so it is not an
engine itself. A BRApi function runs outside of the other engines and can orchestrate certain
functions from within other engines. In other words, a BRApi function can be run from one engine (e.g.
Parser) to tell other engines (e.g. Finance) to execute their own APIs (e.g.
API.Data.GetDataCellUsingMemberScript). For another example, while the
API.Data.GetDataCell function is available from within the Finance engine, a similar BRApi called
GetDataCellUsingMemberScript can be run from any engine if given the appropriate arguments.
A common use is BRApi.ErrorLog.LogMessage from any engine.
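A minimal illustration of that common use (the message text is arbitrary):
BRApi.ErrorLog.LogMessage(si, "Finished processing the data load.")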
There are two broad business rule classifications: shared business rules and item specific
business rules. Each engine in the system may support one or both business rule classifications.
Whenever a processing sequence is executed in the platform, the particular engine(s) involved
evaluates how and what business logic is associated with the process. This may include shared
business rules (named and event handlers) as well as item specific business rules (member
formulas, logical expressions, and confirmation rules).
NOTE: Shared business rules can be written in either VB.NET or C#, item specific
business rules can be written in VB.NET only.
For example, when creating a one-off rule without any reusable value to other components in the
system, write an Item Specific Business Rule directly on the platform component because it
requires a very specific piece of business logic. Another example, which is more common when
creating calculation logic for an analytic model, is to write a Member Formula that directly
associates a calculation with a Dimension Member. This improves system clarity and
maintainability.
Item Specific Rules, in particular Member Formulas, can have a positive performance impact
because they allow calculations to be broken down into formula passes and processed in a
parallel (multi-threaded) fashion. The same formulas can be written in a Shared Finance Business
Rule, but the calculations will always execute in the serial manner defined in the rule.
An Item Specific Business Rule hides its boilerplate code sections, such as:
l Formula Header
l Formula Footer
These hidden sections (i.e. Regions) keep the formula / expression as readable as possible. In a
Shared Business Rule, these sections are visible, which makes the rule more verbose. The idea
behind the Item Specific Business Rule is to create discrete code blocks that are easy to manage
and have limited interdependencies. If one knows how to write a Shared Business Rule, then
she/he also knows how to write an Item Specific Business Rule and vice versa.
Item Specific Rules are categorized into three types: Member Formulas, Complex Expressions,
and Confirmation Rules. These relate to the platform engine with which they are associated.
Member Formulas
A Member Formula is assigned to a Dimension Member and executes within the Finance Engine
during a Cube processing sequence. See Formula Guide in "About the Financial Model" on
page 2 for more information on processing sequences. Member Formulas provide the same level
of syntax and logic capability that exist when writing a Finance Shared Business Rule, however
custom consolidation, elimination, and translation logic cannot be written. Member Formulas are a
great choice for writing logic limited to calculations based on a single Member and calculations
that do not span Dimensions. If Member Formulas are written with these constraints in mind, then
the Dimension Member and its formula can be reused in different Cubes without having
dependencies on other Dimensions. This does not mean that a Member Formula cannot look at
other Dimensions. Referencing Dimension Members outside of the specific Dimension where the
formula exists will limit the reusability of the Dimension, or require all referenced Dimensions be
used together in any new Cube.
Member Formulas are written directly on a Dimension Member within the Dimension Library.
Navigate to the specific Member’s Formula property and click the ellipsis in order to store a
Member Formula.
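As an illustration only, a stored calculation on a hypothetical GrossMargin Member could consist of a single line such as the following; the Account names are assumptions and not taken from this guide’s examples:
'Hypothetical stored calculation: derive Gross Margin from two other Accounts
api.Data.Calculate("A#GrossMargin = A#NetSales - A#COGS")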
Complex Expressions
A Complex Expression is a Business Rule assigned to Data Source Dimensions, Derivative
Rules, and Transformation Rules and executes within the Stage Engine during a transformation
processing sequence. Complex Expressions provide the same level of syntax and logic capability
that exist when writing a Stage Shared Business Rule. The primary reason for using a Complex
Expression rather than a Stage Shared Business Rule is the logic being written has no reusability.
Complex Expressions isolate the logic by associating it directly with a specific item.
Apply Complex Expressions to Derivative Rules by selecting the individual rule requiring custom
logic and setting the Logical Operator. Clicking the Edit Rule Formulas toolbar button opens the
Logical Expression Editor dialog and allows the user to either select a Shared Derivative Business
Rule, write a Complex Expression, or use a Pre-Built Expression.
Both Shared Derivative Business Rules and Derivative Complex Expressions result in the exact
same compiled Business Rule code. The exception is a Complex Expression is only executed for
the rule to which it is applied, and a Shared Derivative Rule is shared and can be called by many
rules.
Similarly, clicking the Edit Rule Formulas toolbar button opens the Logical Expression Editor dialog
and allows the user to either select a Shared Conditional Business Rule or write a Complex Expression.
Business Rules and Conditional Complex Expressions result in the exact same compiled
Business Rule code. The exception is a Complex Expression is only executed for the rule to which
it is applied, and a Shared Conditional Rule is shared and can be called by many rules.
Confirmation Rules
Confirmation Rules are called by the Data Quality Engine and Finance Engine. Apply Complex
Expressions to Confirmation Rules by selecting the individual Confirmation Rule and clicking the
Edit Rule Formulas toolbar button. This button opens the Rule Editor dialog and allows the
user to write a Complex Expression containing the Confirmation Rule logic. A Confirmation Rule is
only written on the specific rule to which it applies. Confirmation rules do not have an equivalent
Shared Business Rule because each Confirmation Rule requires specific logic.
TIP: Shared Finance Business Rules can be called from a Confirmation Rule. Create
standard helper functions in a Shared Finance Business Rule and call them from a
specific Confirmation Rule creating some reusable logic and improving the overall
Confirmation Rule infrastructure maintenance.
l Class: MainClass
l Function: Main
Function Prototypes
Each Business Rule has one standard entry point Function called Main. The Function
definition below represents the standard prototype used by the Main Function in each OneStream
Business Rule. The Main Function always has the same standard parameter layout, but the last
two parameters, API and ARGS, contain different object references based on the type of
Business Rule being executed.
Public Function Main
(
ByVal si As SessionInfo, → Connection Object Required to use API
ByVal globals As BRGlobals, → Global Variable Object Used to Share Values
ByVal api As Object, → Specific API object (Different for each Type)
ByVal args As Object → Arguments object (Different for each Type)
) As Object
This section describes how to reference a Shared Business Rule from within a Business Rule
and how to reference an external DLL from within a Business Rule.
When a Shared Business Rule is created, its public members can be referenced and executed by
other Shared and Item Specific Business Rules.
NOTE: Reference more than one Business Rule by creating a comma-separated list of
reference statements.
Syntax
BR\<Business Rule Name to Reference>
Example (Single Reference)
BR\OPS_PostalServiceHelper
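A hypothetical example of multiple references following the comma-separated format described in the NOTE above; the second rule name is invented for illustration:
Example (Multiple References)
BR\OPS_PostalServiceHelper, BR\OPS_RateTableHelper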
In the screenshot below, the SharedForecastSeeding Rule can be called from any other Finance
Rule because its Contains Global Functions For Formulas property is set to True.
NOTE: When a Finance Business Rule has its Contains Global Functions For Formulas
set to True, any changes made to the Business Rule causes a metadata status impact
and changes the Calculation Status to OK, MC. This dependency must occur because a
global rule can be used by a Member Formula calculation, and therefore can impact the
status of the Finance Engine’s data (analytic / Cube data).
Code Declaration
Once a reference is made to a Shared Business Rule, the Business Rule’s Public Methods
(Functions / Subs) can be called. To access the Shared Business Rule’s Public Methods, declare
an instance of the rule in the code using the Business Rule’s fully qualified Namespace. This
creates an object variable that references the Shared Business Rule and calls its Public Methods.
Example Declaration
'Declaring an object variable to reference a Shared Business Rule
Dim opsHelper As New OneStream.BusinessRule.DashboardExtender.OPS_
PostalServiceHelper.MainClass
Example Usage
'Executing a Function on the Referenced Business Rule Object Variable
Dim desc As String = opsHelper.GetFieldFromID(si, "Dashboard", "Name", dashName,
"Description")
l Complex logic requiring development tools only available within Microsoft Visual Studio
(Web Service Discovery and Interface Development)
3. Add a reference specification to the DLL in the Referenced Assemblies property of the
Business Rule using it.
Reference Specification
This section defines the syntax required to reference an external DLL by setting the Shared
Business Rule’s Referenced Assemblies property. There are three methods available for
referencing an external DLL.
Method 1
This method uses the XF\ prefix to create a reference to an external DLL located in the
BusinessRuleAssemblyFolder folder which is specified in the application server configuration file.
Syntax
XF\<External DLL Name to Reference>
Method 2
This method uses the file system path C:\DLLFolderName\ to create a reference to an external
DLL located on each application server.
NOTE: The same folder path and DLL must exist on all application servers. This
referencing method is not a best practice for custom business logic DLLs because it
creates a maintenance and update burden.
Using a file system path reference is a valid method when referencing an external DLL that
already exists on an application server. The DLL exists on the application server as part of the
operating system or another installed software component.
Syntax
C:\DLLFolderName\<External DLL Name to Reference>
Code Declaration
Once a reference is made to an External DLL from a Shared Business Rule, the Public Methods
(Functions / Subs) of that External DLL can be called. In order to access the Shared Business
Rule’s Public Methods, declare an Import to the Namespaces defined by the DLL, and then create
an instance of the desired class to utilize in the code.
Example Import
Imports YourNamespace.SubNamespace
Example Declaration
'Declaring an object variable to reference a class on the external DLL
Dim extHelper As New YourClass
Example Usage
'Executing a Function on the external DLL
Dim desc As String = extHelper.YourFunction("SomeParameter")
Method 3
This method uses a Windows environment variable to create a reference to an external DLL. All
standard Windows paths are supported, and the name is determined by .NET.
Syntax
%System%\DLLName.DLL
Example
%userprofile%\documents\WindowsBase.DLL
Below is an example of a Cube View Extender Business Rule for review and understanding:
See the OneStream API Overview Guide for more details on this Business Rule including the
various lists of potential options available for setting properties. Search in the below highlighted
section of the API Guide:
The breakdown of each section for this type of rule is below:
Args.FunctionType
Select Case args.FunctionType is the expression used when a certain process needs to be
isolated to run special logic. The Case statement in the example Business Rule above is
necessary to determine which of two operations is being performed to generate the Cube View
Report: GetReportOptions or FormatReportUIItem.
CVExtenderFunctionType.GetReportOptions
This is for retrieving and setting properties such as margins and the height of the title and footer. A
value of -1 means use the default value, and setting it is not necessary if that property is not being
overridden. These numeric settings are expressed in pixels, so they allow precise control.
Case Is = CVExtenderFunctionType.GetReportOptions
reportOptions.ReportMarginTop = 100
reportOptions.ReportMarginBottom = -1
reportOptions.PageHeaderTitlesHeight = 20
reportOptions.PageFooterHeight = -1
Return reportOptions
Notice in the rule above a new CVExtenderReportOptions object is declared so that a few
properties can be set for that object and then written back with the Return statement at the bottom
of the section. Without the Return statement, the properties do not get set for this particular
instance when this report is run.
These property settings do not change the saved properties for this Cube View; they only override
them at run time. Properties that can be set in a CVExtenderReportOptions object:
l PageFooterHeight
l PageHeaderTitlesHeight
l ReportMarginBottom
l ReportMarginTop
CVExtenderFunctionType.FormatReportUIItem
The key concept to grasp for how these Cube View Extender Business Rules apply to the
formatting of a report is that every item is looped through and is eligible to have its format changed
through logic. When a report is run, it plots every label, every line, everything seen on the page
one by one. Based on the current item being processed (i.e. Args.Report.CurrentUIItem), the
system has the context of many properties (which depend on the type of item it is, or UIItemType)
and the Cube View Column or Row to which it is related.
This is for formatting a specific Report User Interface Item, which could be a data cell, line, footer,
header, row header, column header or another item on the report. These object types are called a
UIItemType.
Case Is = CVExtenderFunctionType.FormatReportUIItem
Based on the UIItemType, apply logic that will set properties as appropriate and based on
conditions.
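For illustration, a minimal FormatReportUIItem case is sketched below; it checks the UIItemType
and applies a background color to data cell labels. The specific condition and color are arbitrary
examples built from the properties described in this section.
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
If uiItem.UIItemType = XFReportUIItemType.DataCellLabel Then
uiItem.BackgroundColor = XFColors.Yellow
End If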
Args.Report
Can set properties in the report view of the Cube View output. See below for some examples.
l CurrentUIItem
Example: Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
This is the key Report Arg as it returns an object called CVExtenderReportUIItem
o A CVExtenderReportUIItem can return:
o UIItemType
This is a key property. It reveals whether the Report Item is a Label, Page Header
or Footer Label, Row or Column Header Label, or a Data Cell Label (meaning
an amount or cell on the report). Once the type of Report Item being
analyzed is known, certain properties can be applied.
Example of this logic:
If uiItem.UIItemType = XFReportUIItemType.DataCellLabel Then…
o Whether the item has data (uiItem.XFHasData) if the type is
DataCellLabel
o The amount of the DataCellLabel (uiItem.XFAmount)
o Text stored and how formatted (uiItem.Text, uiItem.FontFamily or
uiItem.FontSize)
o The item’s Name (uiItem.Name)
o Change the colors (uiItem.TextColor, uiItem.BackgroundColor,
uiItem.BorderColor, etc.) with a statement such as
uiItem.BackgroundColor = XFColors.Yellow
o Change borders and lines (uiItem.BorderSides, uiItem.BorderLineStyle,
etc.)
o Whether the item can grow or shrink based on content, if it needs to stay
on one row, or how large that row or column should be (uiItem.CanGrow,
uiItem.CanShrink, uiItem.Padding or uiItem.WordWrap)
l Margin sizes
o api.Report.MarginLeft
o api.Report.MarginRight
o api.Report.PageWidthMinusMargins
Args.CubeView
These Args can be used to retrieve properties from a Cube View being processed when the report
is run, but really are here more for internal use. Some of these properties could be used as
conditions, when setting labels or other properties. Examples are:
l Paper Size
l Margins
l Titles
Args.PageInstanceInfo
This is for setting Dashboard Page State and is not related to Cube View Extenders.
Args.CustomSubstVars
This is not related to Cube View Extenders for the most part. It can provide the ability to
retrieve the name-value pairs of a custom Parameter applied when this Cube View was run and
the choice the user made. For example, if the user selects an Entity upon running the report,
MyEntity = Houston could be returned and used in a custom report; however, there are other
methods to apply this same information.
As shown in the previous example, a test can be performed to see if there is anything to be
rendered in this column or row with If Not cvRow Is Nothing Then, which is a good practice.
Conditional logic can be used before applying formatting to labels or data cells such as the Name
of the Cube View Row related to the CurrentUIItem:
If cvRow.Name.XFContainsIgnoreCase("Total") Then…
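For illustration, the two checks can be combined as sketched below; cvRow is assumed to have
been obtained as in the previous example, and the background color is an arbitrary choice.
If Not cvRow Is Nothing Then
If cvRow.Name.XFContainsIgnoreCase("Total") Then
uiItem.BackgroundColor = XFColors.Yellow
End If
End If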
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
End If
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
End If
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
Else
uiItem.SetPictureBoxImage
(FileSystemLocation.ApplicationDatabase,"Documents/Public/Standard.png",
TriStateBool.TrueValue)
End If
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
Else
uiItem.SetPictureBoxImage
(FileSystemLocation.ApplicationDatabase,"Documents/Public/Standard.png",
TriStateBool.TrueValue)
End If
End If
NOTE: CurrentPageInfo can also use CenterPosition in order to move the logo to the
center of the report.
In order to prevent the Center Subtitle from wrapping, apply a Cube View Extender function and
control the width.
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
End If
Result:
Function:
The function below is controlling the Center Subtitle’s width and putting a border around it.
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
End If
Result:
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
End If
Result:
If set to True, the Labels are changed to a Table structure which allows the row height to be
formatted across rows as text wrapping occurs within Row expansions.
Conditional Formatting
Use Case: Format cells based on the cell content. In the example below, the function will highlight
any value less than $500,000 and place the text NODATA in any cell without a value.
Function:
Case Is = CVExtenderFunctionType.FormatReportUIItem
Dim uiItem As CVExtenderReportUIItem = args.Report.CurrentUIItem
If uiItem.UIItemType = XFReportUIItemType.DataCellLabel Then
If uiItem.XFHasData Then
If uiItem.XFAmount < 500000.0 Then
uiItem.BackgroundColor = XFColors.SteelBlue
End If
Else
uiItem.Text = "NODATA"
uiItem.TextColor = XFColors.DarkBlue
uiItem.FontSize = 9
End If
End If
Result:
Result:
Result:
To use page state in a Dashboard Extender Rule, get the Page Instance ID using
args.PageInstanceInfo.PageOrDlgInstanceID
l BRApi.State.DeletePageOrDlgState
l BRApi.State.DeleteAllStateForPageOrDlg
l BRApi.State.SetPageOrDlgState
l BRApi.State.GetPageOrDlgState
Pinpoint Example
This places a clickable pinpoint at each geographical location. A Parameter value can be
included in the string in order to generate an action upon clicking the pinpoint.
NOTE: See Business Rule Client Image Types for a list of available status images.
Result:
Ellipses Example
This places a clickable ellipse at each geographical location. A Parameter value can be included
in the string in order to generate an action upon clicking the ellipse.
Result:
Polylines Example
This creates a continuous line composed of several pre-determined line segments.
First define the coordinates for each polyline segment. The string then defines the following:
(Polyline name, reference to the local variables above, line color, line thickness, Parameter Value,
hover color, hover thickness)
Dim polylines As New List(Of XFMapPolyline)
Dim points As New List(Of XFMapPoint)
points.Add(New XFMapPoint(20, 20))
points.Add(New XFMapPoint(30, 25))
points.Add(New XFMapPoint(40, 15))
polylines.Add(New XFMapPolyline("Polyline1", points, XFColors.Black.GetHexString(), 3,
"polylineParam1", XFColors.Green.GetHexString(), 6))
Polygons Example
This creates a polygon shape which outlines a specific location on the map.
First define the coordinates to apply to the polygon. The string defines the following:
(Polygon name, reference to the local variables above, polygon color, opacity, stroke, stroke
thickness, Parameter Value, hover color, hover opacity, hover stroke, hover stroke thickness)
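For illustration, a polygon definition modeled on the polyline example above might look like the
sketch below. The XFMapPolygon constructor is assumed to accept the parameters in the order
listed above; the polygon name, coordinates, colors, and parameter value are hypothetical.
Dim polygons As New List(Of XFMapPolygon)
Dim polygonPoints As New List(Of XFMapPoint)
polygonPoints.Add(New XFMapPoint(20, 20))
polygonPoints.Add(New XFMapPoint(30, 25))
polygonPoints.Add(New XFMapPoint(40, 15))
polygonPoints.Add(New XFMapPoint(25, 10))
polygons.Add(New XFMapPolygon("Polygon1", polygonPoints, XFColors.Blue.GetHexString(), 0.5,
XFColors.Black.GetHexString(), 2, "polygonParam1", XFColors.Green.GetHexString(), 0.8,
XFColors.Black.GetHexString(), 4))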
Result:
If args.DataSetName.XFEqualsIgnoreCase("GanttDataSet") Then
'Create a DataSet from the XFGanttTaskCollection
'----------------------------------------------------------------------------
'Define the properties that we need to set for each task
Dim now As DateTime = DateTime.UtcNow
Dim taskName As String = String.Empty
Dim taskTitle As String = String.Empty
Dim taskDesc As String = String.Empty
Dim wfStatusType As WorkflowStatusTypes = WorkflowStatusTypes.Unknown
Dim imageSource As XFImageFileSourceType = XFImageFileSourceType.ClientImage
Dim imageNameOrPath As String = ""
Dim taskStartTime As DateTime = now
Dim taskEndTime As DateTime = now.AddDays(2)
Dim taskDeadline As DateTime = now.AddDays(5)
Dim isMilestone As Boolean = False
Dim percentComplete As Double = 0
Dim isHighlighted As Boolean = False
Dim taskParameters As String = String.Empty
Dim dependencies As New List(Of String)
Dim children As New List(Of XFGanttTaskItem)
'Create Task 1
taskName = "MyTask1"
taskTitle = "MyTask1 Title"
taskDesc = "MyTask1 Description"
wfStatusType = WorkflowStatusTypes.Unknown
imageSource = XFImageFileSourceType.ClientImage
imageNameOrPath = XFClientImageTypes.StatusGrayBall.Name
taskStartTime = now
taskEndTime = now.AddDays(3)
taskDeadline = now.AddDays(6)
isMilestone = False
percentComplete = 25
isHighlighted = False
taskParameters = "MyParam=HiTask1"
dependencies = New List(Of String)
children = New List(Of XFGanttTaskItem)
Dim task1 As New XFGanttTaskItem(taskName, taskTitle, taskDesc, wfStatusType,
imageSource,
imageNameOrPath, taskStartTime, taskEndTime, taskDeadline, isMilestone,
percentComplete,
isHighlighted, taskParameters, children, dependencies)
The Direct Load Workflow includes supporting BRApis to aid in programmatic Business Rule
development.
Determine Workflow Types
Details for Transformation and Intersection errors relative to the current Direct Workflow process.
Direct Load’s in-memory processing only supports 1000 errors per load/import.
The Direct Load Workflow has two settings to manage summarizing Stage records as Row or
Blob. The Blob method does not physically write records to the StageSummaryTarget Table.
These BRApis are built to automatically determine the storage method and retrieve the records.
• As DataTable = BRApi.Import.Data.ReadSummaryTargetDataTableTimeRange(si,
wfClusterPk, cubeStartTimeId, cubeEndTimeId)
PowerShell is built into Windows Server 2008 and Windows 7, provided as an optional feature
during installation. You can use Windows Task Scheduler to automate PowerShell script
execution.
l Windows PowerShell ISE: An integrated script editor you can use to type PowerShell
commands and to edit and run PowerShell script files. These are text files with a ps1
extension.
l Windows PowerShell: A command line execution tool similar to a DOS prompt. You can use
this tool to run commands or script files but you cannot create or modify scripts.
2. Create or alter the PowerShell execution and IDE configuration files, so the script engine
understands how to use the .Net Framework v4.0.
l powershell.exe.config
l powershell_ise.exe.config
<?xml version="1.0"?>
<configuration>
<startup useLegacyV2RuntimeActivationPolicy="true">
<supportedRuntime version="v4.0.30319"/>
</startup>
</configuration>
Learning PowerShell
Microsoft provides extensive resources to help IT professionals leverage PowerShell. For more
information, see: http://technet.microsoft.com/en-us/scriptcenter/powershell.aspx
OneStreamClientAPI
LogonInfo
Type
LogonInfo
SI
Type
SessionInfo
Authentication
Logon
Parameters
string webServerUrl
string userName
string password
XFClientAuthenticationType clientAuthenticationType
Return Value
LogonInfo
Logoff
Parameters
None
Return Value
None
OpenApplication
Parameters
string Application
Return Value
LogonInfo
LogonAndOpenApplication
Parameters
string webServerUrl
string userName
string password
string application
XFClientAuthenticationType clientAuthenticationType
Return Value
LogonInfo
EncryptPassword
Parameters
string clearTextPassword
XFClientAuthenticationType clientAuthenticationType
Return Value
string
DataManagement
ExecuteSequence
Parameters
string sequenceName
string customSubstVarsAsCommaSeparatedPairs
Return Value
DataMgmtResult
ExecuteStep
Parameters
string dataMgmtGroupName
string stepName
string customSubstVarsAsCommaSeparatedPairs
Return Value
DataMgmtResult
DataProvider
GetAdoDataSetForCubeViewCommand
Parameters
string CubeViewName
bool dataTablePerCubeViewRow
CubeViewDataTableOptions dataTableOptions
string resultDataTableName
Dictionary<string, string> customSubstVars
bool throwExceptionOnError
Return Value
DataSet
GetAdoDataSetForMethodCommand
Parameters
XFCommandMethodTypeId xfCommandMethodType
string methodQuery
string resultDataTableName
Dictionary<string, string> customSubstVars
bool throwExceptionOnError
Return Value
DataSet
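As an illustrative sketch only, the following shows how the methods listed above might be called
from client code. The OneStreamClientApi class name and the grouping of methods under
Authentication and DataManagement objects are assumptions based on the listing above; the
server URL, credentials, application name, sequence name, and substitution variable are
placeholders.
'Choose the XFClientAuthenticationType value appropriate for your environment
Dim authenticationType As XFClientAuthenticationType
'Create the client API object (class name assumed)
Dim clientApi As New OneStreamClientApi()
'Log on and open an application (returns a LogonInfo object)
Dim logonResult As LogonInfo = clientApi.Authentication.LogonAndOpenApplication("https://yourserver/OneStreamWeb", "UserName", "Password", "YourApplication", authenticationType)
'Execute a Data Management Sequence with a custom substitution variable
Dim result As DataMgmtResult = clientApi.DataManagement.ExecuteSequence("YourSequenceName", "MyParam=MyValue")
'Log off when finished
clientApi.Authentication.Logoff()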
OneStream Web API must be installed on a web server. It also must be configured for external
authentication providers supporting OAuth2.0/OpenID Connect authorization protocol. Identity
Providers currently supported are Okta, Azure AD and PingFederate.
OneStream Web API is API client agnostic. It accepts and outputs data in JSON format making it
possible for every API client application that supports this format to also interact with this service.
One of the most widely used API clients is Postman, a Windows app. For more information about
how to configure OneStream Web API to interact with Postman, see the autogenerated
documentation at http(s)://[servername]:[port]/onestreamapi.
l POST api/Authentication/LogonAndReturnCookie
Used primarily by the Enablement Team to verify Web API installation completed
successfully. Returns a one-time cookie value that holds authentication state or a message
indicating failure along with a proper HTTP code.
l POST api/DataManagement/ExecuteSequence:
Executes a Data Management Sequence and returns a success/failure message along with
a proper HTTP code.
l POST api/DataManagement/ExecuteStep
Executes a Data management Step and returns a success/failure message along with a
proper HTTP code.
l POST api/DataProvider/GetAdoDataSetForAdapter
Returns a JSON representation of a DataSet for a given Dashboard Adapter or a failure
message along with a proper HTTP code.
l POST api/DataProvider/GetAdoDataSetForCubeViewCommand
Returns a JSON representation of a DataSet for a given Cube View or a failure message
along with a proper HTTP code.
l POST api/DataProvider/GetAdoDataSetForSqlCommand
Returns a JSON representation of a DataSet for a given SQL Query or a failure message
along with a proper HTTP code. Administrator role is required for this functionality.
l POST api/DataProvider/GetAdoDataSetForMethodCommand
Returns a JSON representation of a DataSet for a given pre-defined list of method
commands used by XFDataProvider to fill a DataSet or a failure message along with a
proper HTTP code. Administrator role is required for this functionality.
You can extract applications as an XML or zip file. If the file size is larger than 2 GB, you must use
a zip file.
When importing, if the extracted XML file is larger than 2 GB, you must use a zip extract to load
the file successfully.
1. Enter a user name and a range of time in order to find Dimension Member and Dimension
Member Relationship changes. Leave the User field empty in order to find changes by all
users.
For example, in the application’s Dimension Library, a Member was created, another
Member was deleted, and a relationship was deleted. Once the user and time filters are
entered, click OK, and these changes will be highlighted.
The hierarchy will indicate where changes were made. The example below indicates there
was a partial change to the Dimensions.
7. Search
Search for a specific Member in the selected Dimension.
8. Dimension Hierarchy
Scroll through the Dimension hierarchies in order to see where changes occurred and/or
manually select or de-select Members or Relationships to extract.
NOTE: Right-click on any Parent Member under Relationships and click Select All
Descendants in order to select all Child Members within the hierarchy and include
them in the extraction.
11. Any deleted Members found within the User/Time Parameters will display here. Click the
plus sign to manually add additional Members to delete. To exclude a particular entry from
the xml, select the line to exclude and click the minus sign.
14. Users can manually add additional relationships to delete by clicking the plus sign and
indicating the Parent and Child. To exclude a particular entry from the xml, select the line to
exclude and click the minus sign.
2. Deleted Members
3. New Relationships
4. Deleted Relationships
NOTE: The processing order is important because if any errors occur during the xml file
load, OneStream applies as many modifications as it can up to the point of the error. For
example, an xml file contains ten Member changes, three deleted Members and one
new relationship. If an error occurs when trying to delete the first Member, the ten
Member changes will still take place because they are processed first during the xml
load. Any modifications in the xml prior to the error will occur and any after the error will
not.
If a user receives an error during the load process, the error must be resolved in order to complete
the metadata migration.
a. Clear the Member’s data in the destination application and reload the xml file.
b. Create a new xml extract excluding that Member from the file.
c. Edit the xml file to exclude the Member and the action.
a. Create the security group in the destination application and reload the xml file.
b. Create a new xml extract excluding this Member and its changes from the file.
c. Edit the xml file to exclude the Member and the action.
a. Create the FX Rate Type in the destination application and reload the file.
b. Create a new xml extract excluding this Member and its changes from the file.
c. Edit the xml file to exclude the Member and the action.
a. Make the Member modifications in the source application and extract an xml file
without invalid characters.
Project File
The Application Designer must first manually define an XML file to support the export of objects as
an XF Project File. The file is saved with the .xfProj file extension in a local project folder, which
could also be under a version control system.
Sample File:
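For illustration, a minimal .xfProj file following the structure described below might look like this;
the folder paths, item names, and zip file name are hypothetical.
<xfProject TopFolderPath="MyOneStreamProject" DefaultZipFileName="MyOneStreamProject.zip">
<projectItems>
<projectItem ProjectItemType="Dashboard" FolderPath="Dashboards" Name="MyDashboard" IncludeDescendants="true"/>
<projectItem ProjectItemType="BusinessRule" FolderPath="BusinessRules" Name="MyHelperRule" IncludeDescendants="true"/>
<projectItem ProjectItemType="CubeViewGroup" FolderPath="CubeViews" Name="MyCubeViewGroup" IncludeDescendants="true"/>
</projectItems>
</xfProject>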
File Structure:
l xfProject: The root node to start a .xfProj which contains two attributes:
o TopFolderPath: Will create and define the starting folder location of where the
specified files are extracted to.
o DefaultZipFileName: Will create a standard default file name for .zip file extracts.
l projectItems: A list structure containing the project items needed to extract (no attributes
needed).
l projectItem: The item reflecting what is needed to extract from OneStream or load from the
file system. It has 4 attributes:
o ProjectItemType:
o BusinessRule
o Cube
o CubeViewGroup
o CubeView
o CubeViewProfile
o DashboardMaintenanceUnit
o DashboardFile
o DashboardString
o DashboardParameter
o DashboardGroup
o DashboardAdapter
o DashboardComponent
o Dashboard
o DashboardProfile
o DataManagementGroup
o DataManagementStep
o DataManagementSequence
o DataManagementProfile
o DataSource
o Dimension
o TransformationRuleGroup
o TransformationRuleProfile
o FolderPath: The name of the sub folder where the project item type is extracted to.
o Name: The name of the project item.
o IncludeDescendants: Default is true and only affects these project item types:
o CubeViewGroup
o DashboardGroup
o DashboardMaintenanceUnit
o DataManagementGroup
File Extract
The .xfProj file is placed in a local folder, such as the user’s desktop. The defined folderpath
folders will be generated here as the target location for application exports and loads. There are
two file extract options available on the Windows App.
l .zip: The export option will collect all the objects defined in the .xfProj file as a Zip file to the
location of the .xfproj file.
l File: The standard export will export all the objects defined in the .xfProj file to the folderpath
locations defined in .xfproj file.
4. (Optional) From the OneStream Windows Client, select or de-select Extract to Zip as
required.
Example:
Zip Extract
The zip file extract will create an application zip file containing all the objects defined.
NOTE: If you select Replace, it will only remove files that differ for CubeViewGroups,
DataManagementGroups, DashboardMaintenanceUnits, and DashboardGroups. For all
other items (such as business rules or extensibility rules), if you select Replace, it will act
as a Merge.
Zip Load
The zip file load functions as any other application file load. The contents of the file are merged
into the application. The zip file load is not supported by alternative merge or replace file load
options.
Cubes
Cubes are organization structures that contain data. They control how data is stored, calculated,
translated, and consolidated based on dimensions assigned to the cube. While flexible and
designed to hold multiple types of data, they are generally designed for specific purposes. An
application can have several Cubes that share Dimensions, time profiles, business rule functions,
and data. In this section, you will learn about cube dimension data, time profiles, and other cube-
specific characteristics.
Dimensions
There are three types of Dimensions available: Customizable, Derived and System Dimensions.
Customizable Dimensions have no preset Members and can be free form. These include Entity,
Scenario, Account, Flow, and UD1…UD8 (also known as User Defined Dimensions 1 through 8).
System Dimensions are non-customizable Dimensions and cannot be changed. They are pre-
defined as part of the system. These include Consolidation, Time, View and Origin.
Dimensions can be viewed by Members, Orphans, or in a Hierarchy. All Members will appear in
an alpha-numerical list. Use the search button to search for specific Members. This search
will also produce every hierarchy in which a Member appears. It is possible for different settings
to be set based on each Parental roll up. Orphans are a list of Members not assigned to any
Dimension hierarchy.
Root Dimensions
Every Dimension hierarchy has a None Member, which means no selection applies for this
Dimension or it is not applicable. For example, a setting of None in the Intercompany Dimension
would be appropriate for an account entry of cash.
Entity Default
Each User Defined Dimension has an additional Member called EntityDefault used to assign
attributes to an Entity. This is set on the specific Entity in the Vary by Cube Type settings for each
of the User Defined Dimensions. This allows an Entity to have a specific default and reduce the
need to map every Entity to a common tag such as Region or Division as it pertains to data import
and form-based data entry. A user can select the EntityDefault Member without knowing the
specific Entity setting for each of these UD Dimensions. However, using this setting has a slight
impact on consolidation time because there will be more intersections in the financial model.
The Dimension Library allows for all the customizable Dimensions to be defined for your business
needs. Dimensions can be shared across multiple Cubes. The defining of Dimensions is
extremely important in order to allow sharing between Cubes.
Restricted Characters
Dimensions, Members, and Dashboards cannot be named using the following restricted
characters:
l /
l |
l !
l @
l %
l #
l ,
l ;
l ^
l *
l +
l -
l =
l \
l ?
l >
l “
l [
l ]
l {
l }
l &
Reserved Words
These reserved words cannot be used on structural application components, like Cubes and
Dimensions. We recommend avoiding these reserved words in the application.
l Account
l All
l Cons
l Consolidation
l Default
l DimType
l Entity
l EntityDefault
l Flow
l Origin
l IC
l None
l Parent
l POV
l Root
l RootAccountDim
l RootEntityDim
l RootFlowDim
l RootScenarioDim
l RootUD1Dim
l RootUD2Dim
l RootUD3Dim
l RootUD4Dim
l RootUD5Dim
l RootUD6Dim
l RootUD7Dim
l RootUD8Dim
l Scenario
l Time
l UD1 – UD8
l UD1Default
l Unknown
l View
l WF
l Workflow
l XFCommon
Create Dimension
Use this to create a new Dimension
Save Dimension
Use this to save changes to the selected Dimension
Rename Dimension
Use this to change the name of the Dimension.
Move Dimension
Use this to move the dimension. You can move a dimension up one level above a parent or below
a sibling. If the move is invalid, you will not be able to move it. Integrity of Member relationships is
evaluated and enforced.
Create Member
Use this to create a new Member under the selected Dimension
Save Member
Use this to save changes to a selected Member
Search Hierarchy
Use this to search Member hierarchies within a Dimension.
Collapse Hierarchy
Use this to collapse a Member hierarchy within a Dimension.
Navigate to Security
This icon appears in all Dimension Security properties and when clicked it navigates to the
Security screen. This is an easy way to make changes to Security Users or Groups before
assigning them to specific Dimensions.
Clone Member
This creates a new Member using the same settings as the selected Member. New Members can
be cloned and positioned as a First Child, Last Child, Previous Sibling, or Next Sibling of the
selected Member.
Note: Formula Types and Formulas do not copy to the new Member.
Delete Member
This deletes the selected Member from the Dimension Library.
Remove Relationships
Use this to remove the copied Member(s) from their current relationship without moving them to a
new one. If the copied Member is no longer a part of the Dimension structure, it will be placed
under Orphans.
Member Filter
Builds a list to see specific Members. See Member Filter Builder Dialog for more details.
Grid Settings
Cube Type
Members can change based on Cube Type. Specifies the Members to look at in the grid view.
Scenario Type
Members can change based on Scenario Type. Specifies the Members to look at in the grid view.
Time
Members can change based on the Time Member. Settings can be turned on or off and the
formulas will change in order to look at a specific time frame.
Entity Dimension
The Entity Dimension is different from all others. In a multi-Cube application, the Entity Dimension
links everything together.
Dimension Type
This indicates what Dimension is currently being used (e.g., Entity).
Dimension
This indicates the Dimension name (e.g., Houston).
Member Dimension
The Dimension to which it is a Member (e.g., HoustonEntity Dimension).
Name
The name of the Member in the Dimension (e.g. Houston Heights).
Default Description
A description of the Member in the Dimension. Refer to Report Alias descriptions for Members.
Security
Display Member Group
This group can see that this Entity exists within a list of Entities.
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered, making it easier to find and select
the desired Group. Once the Group is selected, press CTRL and double-click. This will
enter the correct name into the appropriate field.
The settings for these are used to specify a comma separated list of category names that will be
processed for the Entity. If these settings are left empty, then all categories will be used in the
corresponding settings on the Data Access tab of the Cube Administration page. See Data
Access under "Cubes" on page 452 for more details.
Settings
Currency
The local currency of a particular Entity.
Is Consolidated
If set to True, the data from this Entity’s children is consolidated (i.e., this Entity will equal the total
of its children).
If set to False, the data will not be consolidated. The use case for setting Is Consolidated to
False is to use the Parent Entity strictly for grouping purposes. Setting Is Consolidated to
False also helps with consolidation performance because the consolidation will not be performed
at the Parent Entity.
Is IC Entity
If set to True, this will make the Entity an Intercompany Entity. An Entity cannot post
intercompany transactions to intercompany accounts unless this option is True. This is only required
for Base Entities where the intercompany intersections roll up and eliminate. If set to False, this
will not be an Intercompany Entity.
Flow Constraint
This is the Flow Dimension constraint. This Entity can only use the Members with this child or
Member under a selected Parent Member.
IC Constraint
This is the Intercompany Dimension constraint. This Entity can only use the Members with this
child or Member under a selected Parent Member. Setting Entity constraints will define the data
intersections as green, no-input cells.
IC Member Filter
This is an additional way to limit intercompany partners of a particular Entity. The IC Member
Filter can make a list and the Entity can only have intercompany transactions with this list of
partners, and those lists of partners are the only ones that can have a transaction with this Entity.
This provides additional protection to the intercompany transaction.
See Equity Pickup in "About the Financial Model" on page 2 for more details on this feature.
In Use
If set to True, the Entity is in use. If set to False, the ability to use the Entity can be turned off
based on Time. This keeps historical data available and is designed to be used when an Entity
becomes inactive or is sold. Once an Entity is no longer in use, it will be ignored during
consolidation and all intersections including this Entity will be invalid.
Allow Adjustments
If set to True (default), the Journals Module is enabled for the Entity to enter adjustments to the
AdjInput Origin member. When set to False, the Journals Module is disabled for the Entity.
However, when False, adjustment to the AdjInput Origin member is still allowed on Accounts
having the Account Adjustment Type of “Data Entry” which is used in designs where adjustments
are performed using Form data entry rather than the Journals Module. To prevent input to
AdjInput on Accounts set as “Data Entry” Adjustment Type, NoInput Rules or Data Cell
Conditional security can be used. This setting can be applied as a True/False default or it can
vary by Scenario Type and/or Time.
Text1…Text8
This is open for custom attributes used for multiple purposes such as Business Rules, Member
Filters or Transformation Rules. The value can be changed in the tag over time as the business
changes or by Scenario Type.
This is done with the Add Relationship for Selected Member option within the Entity
Dimension. In addition, this icon can be used when a Member must be moved out of its current
hierarchy and inserted into another one.
Dimension Type
This indicates what Dimension is currently being used. (e.g., Entity)
Dimension
This indicates the name of the Dimension.
Member Dimension
This displays the name of the Member. (e.g., CorpAccounts, Scenarios, CorpEntities, etc.)
Member Name
This displays the name of the Member as it was defined. (e.g., Actual, Budget, Flash, etc.)
Position
First Sibling
Last Sibling
Sibling Member
This will show a list of all the siblings in the current hierarchy available to move based on the
setting in the Position field.
Default Parent
Parent Sort Order
This setting is used to determine a default Parent when evaluating Member lists (e.g., in Cube
Views). If a Parent is not explicitly specified, the Entity’s Parent with the lowest sort order is used.
Percent Ownership
This is an ownership setting that can be used by Business Rules if need be. By itself, the setting
has no effect on the consolidation.
Ownership Type
This is an ownership setting that can be used by Business Rules if need be. By itself, the setting
has no effect on the consolidation.
Full Consolidation
Normal setting for Entities that fully consolidate into a Parent.
Holding
This is used to designate the Parent/Child relationship as a holding company situation.
Equity
This is used to help Business Rules determine the value to increase the equity method of
Accounting for an investment.
Non-Controlling Interest
This is used for Business Rules to determine the minority interest portion of an Entity into the
consolidation.
Custom 1…Custom 5
Open for custom use in Business Rules.
Text 1-8
Use to define custom attributes to modify aspects of business rule, transformation rule and
member filter capabilities. These custom attributes act as variable placeholders, activated at run-
time to customize views of, or ways to interact with, data. You can change these values by
scenario type or to suit evolving business needs.
Dimension Type
Identifies the Dimension it currently is such as Account…UD8.
Name
The current name of the Dimension
Description
The current description of the Dimension
Access Group
This security group has access to the Dimension
Maintenance Group
This security group has access and can make changes to the Dimension.
Source Type
Standard
Normal metadata Member must match a Cube.
Business Rule
Connects to a Business Rule so that a set of Members does not have to match a Cube.
XBRL
Connects to an XBRL taxonomy.
Source Path
Enter the Business Rule name or the full File System Path for the XBRL Taxonomy’s Link
Definition File.
‘Internal/XBRL/Taxonomies/EntryPoints/….’
Scenario Dimension
Scenario types offer great flexibility and the following 24 pre-set types are provided in each cube.
Each type contains an unlimited number of scenarios. Dimensions are assigned to cubes and can
differ by scenario type. You can show scenarios as a part of a hierarchy for organizational
purposes. They do not consolidate.
l Actual
l Administration
l Budget
l Control
l Flash
l Forecast
l FXModel
l History
l LongTerm
l Model
l Operational
l Plan
l Sustainability
l Target
l Tax
l Variance
l ScenarioType1-8
Security
These groups grant members access and rights to a scenario:
l Read and Write Data Group: Can view and modify data.
l Calculate from Grids Group: Can calculate, translate, and consolidate from a cube view or
form.
l Manage Data Group: Membership in this group is required to run Data Management steps
such as custom calculate or reset scenario. This prevents unauthorized users from
launching steps which could alter or clear data unintentionally.
Workflow
Under Scenario, you can use the following properties to control the type of periods displayed to
end users when they load data.
Use in Workflow
Set to False to omit the Scenario from the workflow view in OnePlace, making it unavailable to
users setting the workflow point of view. To display the scenario, set this field to True.
NOTE: Data can still be entered with forms and the Excel Add-In to a hidden scenario.
Monthly
Use to set the workflow periods to monthly, which could be 12 to 16 months depending on the
application. Once defined, you cannot change this setting.
Quarterly
Use to set the workflow periods to four periods, such as Q1, Q2, Q3, Q4. Once defined, you
cannot change this setting.
Half Yearly
Use to set the workflow periods to two periods, such as H1, H2. Once defined, you cannot change
this setting.
Yearly
This sets the workflow periods to one period, such as 2021. Once defined, this setting cannot be
changed.
Range
Allows an Administrator to define a custom range that is displayed as a single time period
including the start and end time. As data loads, each period displays, such as 11 – Jan and 11 – Dec.
Once defined, this setting cannot be changed. The next set of properties only become available
when the Range option is chosen.
Workflow Time
To define the workflow time, click the ellipsis to the right and choose a time. Year or year and
month can be defined.
Settings
Scenario Type
This property groups similar scenario types in order to share settings or Business Rules. A
scenario type can contain many scenarios.
NOTE: The Input Frequency property works together with the Workflow Tracking
Frequency property. For example, if the Input Frequency in 2017 is Monthly and the
Workflow Tracking Frequency is All Time Periods, the workflow displays the following for
this scenario in 2017:
If the Input Frequency varies by year, the Workflow Tracking Frequency updates the workflow
view accordingly. For example, if the Input Frequency is Weekly in 2018, the Workflow Tracking
Frequency still displays All Time Periods as shown:
Default View
This is the standard default view for calculations, member formulas, and clearing calculated
data. The current selections are YTD and Periodic.
Zero No Data
The following properties determine how to handle zero no data. Periodic places a 0 value in the
period without data. YTD places a 0 value in the current YTD month, causing the period to negate
values from prior months.
Consolidation View
This is the standard setting for the view of the consolidation. The current selections are YTD and
Periodic. Typically, the Periodic setting is used especially if work is completed by period.
However, all numbers are stored as YTD. Select YTD to enhance consolidation performance.
This property can be changed and will update for the next consolidation.
NOTE: If a Consolidation View is Periodic and no data is loaded to the IC accounts for a
specific month, custom elimination rules are required in the
FinanceFunctionType.ConsolidateElimination section of a business rule attached to the
cube to calculate expected elimination results. This requires storing the C#Share
member, which impacts overall consolidation performance.
The following behavior applies to YTD consolidated results when the Consolidation View property
on a Scenario is set to Periodic and the Percent Consolidation Relationship Properties of Entities
vary by time period, also known as “Org by Period.”
When the Consolidation Algorithm Type property on a Cube is set to "Standard,” the Scenario
Consolidation View property is Periodic, and Org by Period Entity Relationship Properties are
used, the Share Consolidation member is displayed consistently for both YTD and Periodic on
reports.
When the Consolidation Algorithm Type property on a Cube is set to "Standard with Stored
Share," the Scenario Consolidation View property is Periodic, Org by Period Entity Relationship
Properties are used, and a time period's Percent Consolidation changes to 0.0 when varying over
time, the system displays stored calculated results rather than derived results.
NOTE: The best practice for this type of “Org by Period” design typically drives a
financial model design where the Consolidation Algorithm Type is set to “Custom.” In
rare cases, this Cube setting is set to Standard and, even more rarely, to Standard with
Stored Share. In both rare situations, this Share Member view will have changed.
Formula
This is the ability to use a formula to execute copying between Scenarios prior to the consolidation
of a Scenario. See Formula Guide "About the Financial Model" on page 2 for more details.
See "Formulas for Calculation Drill Down" in "About the Financial Model" on page 2 for more
information.
FX Rates
Use Cube FX Settings
If True, the rate type that is the default for the current cube is used. If False, custom rate
calculations are required.
Rate Type for Revenues and Expenses & Rate Type for Assets and Liabilities
Average Rate
The average currency rate of a period from the first day to the last day of the month.
Opening Rate
The currency rate at the beginning of the period.
Closing Rate
The currency exchange rate at the end of a period.
Historical Rate
The currency rate to use for a particular historical Account calculation, left open so a special
transaction can be valued on a specific date.
Rule Type for Revenues and Expenses & Rule Type for Assets and Liabilities
Direct
Calculation is direct with the current value and current rate.
Periodic
Calculation is weighted based on a period.
Hybrid Scenarios
For information about how the following properties impact hybrid scenarios queries and using
hybrid scenarios, see "Working With Hybrid Scenarios" on page 109
l Share Data from Source Scenario: shares a read-only cube data set from the source
Scenario to the target Scenario.
l Copy Input Data from Source Scenario: copies base level cube data from a source
Scenario to the target Scenario.
l Copy Input Data from Business Rule: copies base level cube data based on a Finance
business rule.
Enter the Source scenario member or business rule to indicate the source location.
End Year
This is the only property that controls Time. If a data query must end at a specific year, indicate
that year here. For example, if the query should not include 2021 or later, set the End Year to
2020 to exclude all future years. To query all years, leave this field empty; all years containing
data will be included in the results.
The database stores data records by year, each year having its own data table and containing the
data records for each period. When data is queried using a Hybrid Scenario, it occurs at the
database level, and only returns the periods containing data.
Member Filters
The members listed in a comma separated list are the only ones included in data query results.
This can include multiple dimension types, member expansions, or single members. If this field is
blank, all source data is included in query results.
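For illustration only, a Member Filters entry limiting the query to one base-level account hierarchy
and one UD1 hierarchy might look like the following; the member names and the .Base expansion
shown are hypothetical, and the notation follows the Pre-aggregated Members example later in
this section.
A#Sales.Base, UD1#TotalProducts.Base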
Member Filters to Exclude
The members listed here in a comma separated list are excluded from query results. This can
include multiple dimension types, member expansions or single members. If this includes
members from the Member Filters property, those members are excluded.
NOTE: You cannot use Data Unit dimensions in the Member Filters or Member Filters to
Exclude properties (Entity, Time, Consolidation, Scenario).
Pre-aggregated Members
Use this property to share or copy data from a parent member (source) to a base member (target).
For example, if you query the Top member of a large dimension such as a UD, the aggregated
total is calculated "on the fly" each time.
If the detail of a dimension is not needed, set the top member to a Base member to pre-aggregate.
This alleviates repetitive "on the fly" calculations for the same number.
UD1#Top=UD1#None, UD2#Top=UD2#None, UD3#Top=UD3#None, UD5#Top=UD5#None, UD6#Top=UD6#None
NOTE: Ensure that the base members shown above are included in the Member Filters
property and the parents are included in the Member Filters to Exclude property.
Options
Hybrid Source Data Options provide additional control when executing a hybrid share or copy.
The settings are optional and will vary. Options are name-value pairs. Ensure that the option
names, definitions and syntax are accurate and include any custom name-value pairs if a
business rule is used to copy. Create a comma separated list if you use multiple options.
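For illustration, a comma separated Options entry combining two of the name-value pairs
described below might look like the following (the values shown are examples only):
ExecuteCopyDuringCalc=True, RemoveZeros=True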
ExecuteCopyDuringCalc
A True / False property setting which is False by default. If True, Hybrid Source Data copy is
executed during the Data Unit Calculation Sequence (DUCS). If False, the Hybrid Source Data
copy will not run during the DUCS and a Calculate Data Management Step type is needed to
execute the Hybrid Source Data copy. On the Data Management step, set Execute Scenario
Hybrid Source Data Copy as True. The calculation defined on this Data Management step uses
the settings from the Hybrid Source Data properties to run the copy.
This setting is helpful for versioning or seeding where constantly copying unchanging numbers
adds unnecessary overhead to every calculation. This setting should be used with either of the
Copy binding types and is not applicable with the Share binding type.
CopyAsDurableData
A True / False property setting which is False by default. If False, copied data has a Storage Type
of "Calculation". Calculated data is cleared at the beginning of the DUCS (if the Scenario setting of
Clear Calculated Data During Calc is True) and will re-calculate. If this setting is True, copied
data has a Storage Type of "DurableCalculation". Durable data is not cleared at the beginning of
DUCS (regardless of the Scenario setting for Clear Calculated Data During Calc) and must be
cleared manually with a rule.
RemoveZeros
A True / False property setting that is False by default. If True, zeros are removed when copying
data if all periods within the year of the intersection being copied are 0.
SourceTimePeriodForYearlyResult
This property setting is used when the data results only require data from a specific source Time
member. Define that source time period here. For example:
SourceTimePeriodForYearlyResult=M6
The hybrid scenario’s Input Frequency must be Yearly, the only frequency that accepts a value
from one period. The source scenario’s Input Frequency can be more granular such as Weekly or
Monthly. If a source time period is not defined, the source Year value is used in the hybrid Year
Value.
Examples include a weekly period (Wx), a monthly period (Mx), a quarterly period (Qx) or a half
year period (Hx). You cannot specify a year. It will match the year between the source and target.
SourceViewForYearlyResult
This setting is used if the Input Frequency of the target scenario is Yearly and the source scenario
has a more granular Input Frequency. This setting can be Periodic or YTD. YTD is the default.
You can indicate if you want to copy / share the time period being copied / shared as a periodic or
YTD value.
Custom Settings
Text 1…Text 8
Enables you to use custom attributes for many purposes such as business rules, member filters
or transformation rules. The value can be changed in the tag over time as the business changes,
or by Scenario Type.
Account Dimension
Account Members are meant to be organized in a hierarchical fashion. The calculation engine will
aggregate these Members, and then perform math on them as they roll up based on their Account
Type (e.g. Revenue is positive and Expense is negative). Be sure to set Is Consolidated to False
for those items that do not necessarily need to be consolidated for performance purposes.
Security
Display Member Group
The group that can see that this Account exists within a list of Accounts.
Settings
AccountType
Account attributes to determine the behavior of the Accounts.
Group
This is only an organization Member and does not accumulate data. It is meant to only view and
organize data.
Revenue
Setting for Income Accounts to tag with a revenue attribute. Amount does not have to be negative
because of this attribute.
Expense
Setting for Expense Account to tag with an expense attribute.
Asset
Setting for Asset Accounts
Liability
Setting for Liability Accounts. Amount does not have to be negative because of this attribute
Flow
Setting to hold values that act like an Income Statement Account and have a periodic and year to
date value. This account does not translate.
Balance
Setting to hold values that act like a Balance Sheet Account that are at a particular time. This
account does not translate.
BalanceRecurring
Setting for a Balance Sheet Account that does not change over time such as an Opening Balance.
This account does not translate.
NonFinancial
Setting for Informational Accounts that are captured and not financial such as Headcount or
Square Footage. This is primarily used for legacy purposes such as upgrading from older
systems. This account does not translate. NonFinancial and Balance Account Types are similar
in that they are available for legacy purposes, however one difference between them is that a
NonFinancial data cell is not affected by the Flow Member’s Switch Type setting.
DynamicCalc
An Account that calculates on the fly and does not need other formulas to run in order to calculate.
An example of these types of Accounts are ratios that can be calculated as needed.
FormulaType
FormulaPass1…FormulaPass16
The formulas are included in the Account metadata and can be shared between Cubes. The
formula pass is to define when the calculation should run and whether it is dependent on a
calculation from other formulas to derive its value.
DynamicCalc
A Dynamic Calc formula computes a value for a single cell and runs every time the cell needs to
be displayed without storing the result.
For more details on using Formulas, refer to the "Formulas" on page 25.
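As a simple illustration of a Dynamic Calc formula, the sketch below returns a ratio calculated on
the fly; the account names and the ratio itself are hypothetical, and the expression assumes the
Finance API's api.Data.GetDataCell function.
Return api.Data.GetDataCell("A#NetIncome / A#TotalRevenue")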
Allow Input
If set to True (default), data input for the Account Member is allowed. This is typically set to False
if this Account has a formula. This will make the Account read-only and will not affect the formula.
If specific Scenarios or Entities need input, set to True and use the Cube’s Conditional Input
Settings to control input.
Is Consolidated
If set to Conditional (True if no Formula type (Default)) it determines which Accounts will be
part of the consolidation. If the Account has a Formula Type, the Account Member will be
calculated and consolidated only if the setting is True.
Is IC Account
If set to True, this account is identified as an Intercompany Account and allows transactions to be
processed based on the settings in the constraint section under IC Constraints and IC Member
Filter. If set to False, it will not be identified as an Intercompany Account.
Plug Account
Intercompany Plug Account to handle any non-eliminating transactions.
Aggregation
This can be used to turn off aggregation for specific Dimensions, preventing them from rolling up.
A Dimension Member might be used for informational purposes only and not need to aggregate.
Settings are True or False.
Standard
This is a basic Member with no special purpose other than to act as the default Workflow Channel
for Account Members and Workflow Profile Input Children.
NoDataLock
This is a special Member that only applies to a metadata Member (Account or UDx) and should
not participate in a Workflow Channel grouping scheme. This is the default value for any UDx
Member.
In Use
If set to True, the Account is in use. If set to False, the ability to use the Account can be turned off
based on Time. This keeps historical data available but allows the ability to close the Account
without having to use No Input rules.
Formula
An individual formula kept with the Account across Cubes and can vary by Time. For example, a
calculation changes over time, but the historical interpretation of that formula needs to be saved.
See the Formula Guide, in "About the Financial Model" on page 2 for more details.
3. Next, click the ellipsis button in the Stored Value field in order to input a formula.
5. Type the individual formula in the Formula Editor for the specific Account. Click the check
mark icon in order to ensure that the formula was compiled correctly.
Adjustment Type
This can limit the use of adjustments over time. Settings include Not Allowed, Journals and
DataEntry.
Text 1…Text 8
Open for custom attributes used for multiple purposes such as Business Rules, Member Filters or
Transformation Rules. The value can be changed in the tag over time as the business changes,
or by Scenario Type.
Relationship Properties
For information on General and Position within Parent see General Relationship Properties
Aggregation Weight
The Aggregation setting is available in all Account, Flow and UD Dimensions. This setting can
change for a Base Entity based on its Parent. If a Member is reused in a Dimension but its data
does not need to sum up more than once, set the weight to 0 for that relationship and the Member
will not aggregate to the top.
Flow Dimension
The Flow Dimension is like the eight User Defined Dimensions in that it can use Extensible
Dimensionality® and extend Dimensions across business units and Scenario Types with
flexibility. Its Members aggregate up just like the other User Defined Dimensions. The Flow
Dimension, however, has some additional settings that help with historical currency overrides and
eliminate many common custom scripts in order to perform calculations.
Security
Display Member Group
The group that can see that this Dimension exists within a list of Dimensions.
Settings
For Formula Type, Allow Input and Is Consolidated, see Account Dimension.
Switch Sign
The Flow Member is linked to an Account through Flow constraints. Depending on the Account
Type, this property would be set to True to switch the sign of data for the Flow Member or set to
False to keep the sign as is.
Switch Type
Switch the type of data based on the Account attribute, for example setting an Asset to a
Revenue. This is useful when treating roll forward Accounts as Income Statement Accounts in the
Balance Sheet. Settings are True or False.
Flow Processing
These settings are used for dollar override values. To be used, an Account must be flagged as
True with the UseAltInputCurrencyInFlow setting (except for
IsAlternateInputCurrencyForAllAccounts). The Flow Dimension can hold both values of a dollar
override, and the settings differ based on how this setting is used.
Security
Display Member Group
The group of users who can see that this Dimension exists within a list of Dimensions.
Settings
For Formula Type and Allow Input see "Account Dimension" on page 427.
Is Consolidated
If set to Conditional (True if no Formula Type and no Attribute (default)): The data from this
entity's children is consolidated; this entity will equal the total of its children.
If set to True (regardless of Formula Type and Attribute): The results of the dimension and the
attribute are consolidated. Consolidate and/or aggregate UD Attributes when they reference Entity
as the reference Dimension so that results can be viewed at the Parent Entity, instead of having
the Parent Entity use the same algorithm as the base entities.
If set to False: The data will not be consolidated. Set to False when using the Parent Entity strictly
for grouping purposes. Setting Is Consolidated to False also helps consolidation performance
because the consolidation will not be performed at the Parent Entity.
UD2-UD8 Constraint
A constraint is a setting that allows only certain Members to be used. If a Member is outside the
UD Constraints applied on the Account Dimension, its cube intersection will show as a red/invalid
intersection and any numbers in that intersection will not aggregate. Constraints applied on the
Entity and UD1 Dimensions will create a green, no input data intersection.
UD2-UD8 Default
This is the standard default Member that can be mapped for a setting and saves having to map to
each Member. If the setting says Default, the mapping will always go to that default.
The Account, Flow and User Defined Dimensions, as Related Dimension Types, support the
Attribute Members for “calc-on-the-fly” aggregations at parent members. The dynamically
generated results within the Attribute Members will be automatically aggregated to the parent
members. Being stored members, the Entity and Scenario, as Related Dimension Types, do not
support “calc-on-the-fly”. Attribute Member results on Entity or Scenario will only be available on
base members.
The use of User-Defined Attribute Members can impact application performance, particularly with
respect to Consolidation time. The impact is due to the dynamic generation of User-Defined
Attribute members' data intersections adding to the size of the final Data Unit. Therefore, the
potential intersections derived from User-Defined Attribute Members should be included in
application Data Unit analysis. As a guideline, application designs should evaluate the
performance impact of User-Defined Attribute Members once the number of such Members
approaches approximately 2,000 items.
l The Attribute members will not impact the size of the Data Unit in Consolidation.
l Values derived by Attribute members can be referenced by Business Rules and by member
formulas.
l The members are treated as standard dimensions and records in that they will be
processed within a Cube View’s Allow Sparse Suppression routine supporting large sparse
application model reporting.
l The results can be modified quickly and easily by modifying the definition of the reference
on the Attribute member or from a change on the properties of source member(s) even if
those properties (e.g. Text4) vary by Scenario Type or Time.
The model design and use of Attributes should consider if the feature is appropriate for the
application model. Here are some considerations:
l Attributes may not be appropriate in situations where reporting on the Attribute member
must be maintained with a high level of data integrity. This is due to the dynamic nature of
the Attribute where its results are based on properties of other members.
l Attribute results cannot be locked for data integrity. Although the underlying data being
referenced can be locked, modifying the definition of the Attribute or changing the
properties of the referenced source member may still impact the results.
l Since Attribute members do not store data, they can be deleted and are not subject to data
integrity restrictions if in use. Therefore, dynamic designs of reports and use in rules should
be considered.
l Attributes cannot be input or contain formulas; however, they can reference other input
members or calculated members as a source.
l Drill-Down based on the Attribute member intersection cannot be used to drill back to the
Stage Load Results; the Source Member for Data reference member defined on the
Attribute must be used instead.
Consolidate UD Attributes
Consolidate UD Attributes, when Entity is referenced as the reference Dimension, consolidates
the Attribute results. The results consolidate to the Parent Entities instead of the Parent Entities
using the same algorithm as the Base Entities.
l Numbers entered at the None Level of the Attribute Member are included when the
referenced Dimension is Entity.
An Attribute member can define its source reference by up to two physical dimensions using the
Related Dimension Type. The interaction between the two references is handled by the
Expression Type.
Expression Types
l Comparison 1 Only: This is used for an Attribute utilizing only a single Related Dimension
l Comparison 1 and Comparison 2: This is to utilize two Related Dimensions where the
results will be bound by meeting both Related Dimension conditions
l Comparison 1 Or Comparison 2: This is to utilize two Related Dimensions where the results
will need to meet one of the Related Dimension conditions
Is Attribute Member
True/False to enable the User Defined Member as an Attribute.
Expression Type
Used to utilize one or two Related Dimensions and the conditional relationship between the two to
return results.
Related Property
The Related Dimension can be evaluated on its Name, Description, Name or Description, Name
and Description, Text1-8. The Related Property will be evaluated only on the Base Members of
the Related Dimension Type.
l Name: Model4
l Name and Description: Model4 – Service Vehicle. Note that the hyphen OneStream
inserts when presenting the Name and Description must be included in the Comparison
Text when evaluating Property = Comparison Text.
Comparison Text
The text condition being evaluated against the defined Related Property as the Name, Description
or Text Field. When referencing a Text Field, the source Text Field can vary by Scenario and
Time. When referencing the Description, only the Default Description can be referenced.
Comparison Operator
Sets the evaluation method to compare the Related Property to the Comparison Text. This can
be done explicitly with “=” or “<>” as well as dynamically using the Starts With, Ends With,
Contains or Does Not Contain operators.
Once the Attribute members are defined, and any required Comparison Text is applied to the
Source Related Dimension Type, the data will render dynamically. Neither Consolidation nor
Calculation is required to render the results. Any change applied to the Source definition of the
Comparison text, such as a change to a Text property, will immediately be reflected in the results
on the Attribute members. Similarly, modifying the Attribute Member’s Setting Properties will
immediately change the Attribute results.
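For example (using illustrative names only), an Attribute Member on UD3 named ServiceProducts could be set up with Is Attribute Member = True, Expression Type = Comparison 1 Only, a Related Dimension Type of UD2, Related Property = Text4, Comparison Operator = Contains and Comparison Text = Service. The Attribute would then dynamically return the aggregated values of all base UD2 Members whose Text4 property contains Service, and any change to a Member's Text4 value would be reflected immediately.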
Consolidation Dimension
The Consolidation tree of Dimension Members is how the data rolls up from the local currency to
the final numbers. The tree gives opportunities to go from local currency to translated currency
showing any intercompany eliminations and allows for the Entity to adjust the tree before or after
the final numbers. Also included are all the currencies assigned to this application. See
Consolidation in "About the Financial Model" on page 2 for more information on the Consolidation
Dimension.
Time Dimension
Time is a fixed Dimension and is based on the Time Dimension Type associated with the
application. The Time Dimension type determines whether data is stored at a monthly or weekly
level and how many months are in a year, months are in a quarter, and weeks are in a month. The
Time hierarchy is driven by this and will display in the Point of View accordingly.
See Time Dimensions in "System Tools" on page 947 for more details on Time Dimension Types.
The Time hierarchy can follow a fiscal year, rather than being tied to the calendar year, by using the
fiscal year and Month1 (M1), Month2 (M2), etc. to designate the time frame.
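For example, if the fiscal year does not begin in January, 2012M1 refers to the first fiscal month of fiscal year 2012 rather than to calendar January.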
NOTE: There are many issues with rolling retained earnings and beginning balances for
corporations that do calendar reporting and fiscal year reporting in the same application.
For example, if a large corporation had two companies using different reporting years,
the books would have to be separate because each company is based on a different
method. In these instances, two applications would need to be used for each reporting
year.
View Dimension
The View is how to see the data or look at the text that was entered in certain Dimension
Members. Common calculations are included because the data is easily available, and it saves
having to create custom formulas. Also included is CalcStatus which shows the calculation status
for a Data Unit and several views into different types of Data Attachment comments, such as
Annotation, Assumptions, Footnote and VarianceExplanation.
View Members and their short names include:
l Periodic
l YTD
l Calculation Status (CalcStatus, CS)
l AuditComment (Audit)
l VarianceExplanation (VarExp)
l Trailing2MonthAverage-Trailing12MonthAverage
l YSumTD
l YearSumToDatePrior (YSumTDPrior)
l Trailing2MonthTotal-Trailing12MonthTotal (T2Total-T12Total)
Origin Dimension
Origin is the channel where data is loaded for each Workflow. Data can be a file import, data
entry, forms or journal entries that can be adjusted based on input currency or consolidated data.
Eliminated data is from eliminations at common Parents between Entities’ intercompany
transactions. See Consolidation in "About the Financial Model" on page 2 for more information on
Consolidation and Elimination.
Time Profiles
Time Profiles are used to create different Fiscal Years in order to apply them to different Cubes.
Use the Standard Time Dimension Profile if the Fiscal Year begins in January, otherwise create a
new Time Dimension Profile with the desired Fiscal Year start date. Once the Time Dimension
Profile is created, assign it to the specific Cube on which it will be used. Time Dimension Profiles
are included during Metadata load/extract. The default Time Dimension Profile for Cube Views is
determined by the Cube POV settings. The time display for the Workflow POV is determined by
the Cube assigned to the Workflow Profile. The Standard Time Dimension Profile is used for all
time varying properties such as Formulas, In Use, Text Fields, etc.
Time Dimension Profiles can also be specified in a Substitution Variable or Member Filter
Functions and used in Cube Views. See Substitution Variables for Cube Views or Dashboards
and Commonly Used Member Filter Functions for more details.
Profile
Name
The name of the Time Dimension Profile.
Description
Include a description for the Time Dimension Profile.
NOTE: For Fixed Weeks, each month is either 28 or 35 days except for the twelfth
month which will have an extra one or two days.
NOTE: There are many issues with rolling retained earnings and beginning balances for
corporations that do calendar reporting and fiscal year reporting in the same application.
For example, if a large corporation had two companies using different reporting years,
the books would have to be separate because each company is based on a different
method. In these instances, two applications would need to be used for each reporting
year.
Time Periods
Descriptions use |fy|, |fyfy|, |cy|, or |cycy| to include Fiscal or Calendar Year.
Default Year
Year Description
This displays the Year as either Fiscal Year or Calendar Year and defaults to |fyfy|.
Half Years
HY1, 2 Description
This displays the Half Year as either Fiscal Year or Calendar Year and defaults to H1 |fyfy|.
Quarters
Q1, 2, 3, 4 Description
This displays the Quarters as either Fiscal Year or Calendar Year and defaults to Q1 |fyfy|.
Months
M1-12 Description
This displays the Months as either Fiscal Year or Calendar Year and defaults to the format
Jan|cycy|.
Weeks
Short Description
This displays the week's short description and defaults to W1, W2, W3, etc.
Cubes
A Cube is the organizational structure that holds data. A Cube is flexible and can be designed for
a specific purpose or type of data. An Application can have multiple Cubes that share Dimensions
and data.
For more information on Cubes and Extensible Dimensionality®, see Extensible Dimensionality®
in "About the Financial Model" on page 2.
Cube Properties
Name
Input field for creating a new Cube.
Description
Input field for description of a new Cube.
Cube Type
This is an optional setting that creates tags for different types of Cubes, which may be used to
separate and capture types of data without affecting other Cubes. It allows Default and Constraint
settings that apply to certain Dimensions, such as Entity, to vary by Cube Type. The Cube Type
names are arbitrary and have no functional differences; they simply represent the different kinds of
Cubes that may be created.
Standard
Cube used for normal Consolidation
Tax
Cube specifically used for Tax
Treasury
Cube specifically used by Treasury
Supplemental
Cube specifically used to capture supplemental data
What If
Cube specifically used to capture various What If Scenarios.
Security
Access Group
A user can see the object and read its contents.
Maintenance Group
A user can see the object, create new objects in groups, edit, and delete them.
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a Cube.
Click and begin typing the name of the Security Group in the blank field. As the first
few letters are typed, the Groups are filtered, making it easier to find and select the
desired Group. Once the Group is selected, hold CTRL and double-click. This will
enter the correct name into the appropriate field.
Workflow
Is Top Level Cube for Workflow
If additional Cubes are used that roll their data into the main Cube via Extensible Dimensionality®,
other Cubes should be set to False. Each Cube listed as True for this setting can be set as a (top
level) Cube Root Profile for Workflow Profiles, meaning that Cube can have different Workflows
than the others.
For example, in order to have Budget, Plan and Forecast Scenario Types follow a different
Workflow Profile structure than any other Scenario Types, add a suffix such as Plan to these three
Scenario Types. When the Workflow Profiles are built, there will be two Cube Root Profiles. If the
main Cube is called GolfStream, and there is a suffix of Plan for the three Scenarios Types listed
above, and no suffix for the other Scenarios Types, two Cube Root Profiles will be created called
GolfStream and GolfStream_Plan.
Calculation
Custom
The Consolidation will utilize custom Business Rules to calculate amounts for the Share and
Elimination Consolidation Members using the Finance Function Types of Consolidate Share and
Consolidate Elimination.
Standard
This is the default Translation algorithm. Amounts for a foreign currency Consolidation Member
are generated from the Local Consolidation Member using the FX rate tables.
Custom
The translation will be run entirely through Business Rules assigned to the Cube. It is assumed
that the Business Rule will calculate translated amounts from data in the Local Consolidation
Member.
Business Rules
Custom Business Rules can be attached to a Cube. Eight different Business Rules can be
defined. See Business Rules in "Application Tools" on page 779 for more details. This method of
definition allows for extreme flexibility to share or not share certain Business Rules between
Cubes. See Consolidation in "About the Financial Model" on page 2 to see the calculation order
of Member formulas vs. Business Rules attached to each Cube.
BusinessRule1…BusinessRule8
These settings are limited to attaching the Finance Business Rules only.
NOTE: Click in order to navigate to the Business Rules screen. This is useful when
changes need to be made to a specific Business Rule before assigning it to a Cube.
FX Rates
Default Currency
This setting is the default reporting currency for the Cube. This is used for FX rate triangulation if
the Cube currency is the common currency. This is also used for Intercompany Matching’s
reporting currency.
Rate Type for Revenues and Expenses & Rate Type for Assets and Liabilities
Note that these are the default settings for this Cube and can be overridden at the Scenario-level
with the same settings.
Average Rate
The average currency rate of a period from the first day to the last day of the month.
Opening Rate
The currency rate at the beginning of the period.
Closing Rate
The currency exchange rate at the end of a period.
Historical Rate
The currency rate to be used for a specific historical Account calculation, typically for a special
transaction to be valued on a specific date.
Rule Type for Revenues and Expenses & Rule Type for Assets and Liabilities
Note that these are the default settings for this Cube and can be overridden at the Scenario-level
with the same settings.
Direct
Calculate direct with current value and current rate.
Periodic
Calculate periodic value translation method. This method considers the translation rates for prior
time periods and calculates a form of average.
NOTE: Click in order to navigate to the FX Rates screen. This is useful when
changes need to be made to FX Rates before assigning them to a Cube.
Cube Dimensions
Cube Dimensions determine what metadata Dimension Member trees will be used as a default
and for a specific Scenario Type. The Dimensions are defined in the Dimension Library. There is
a default Scenario Type and a specific Scenario Type. Different Dimension Members can be
used for different Scenario Types. For example, a different Region Dimension may be used for
Budget that has more detail (through Extensible Dimensionality®) than the Dimension being used
for Actual.
Cube Reference
This is for sharing Cubes through Extensible Dimensionality®. Multiple Cubes can be linked
together and extended for different purposes. See Entity Dimension for more details. These
options appear in a Top-Level Cube for Workflow where there are references in the assigned
Entity Dimension to other Entity Dimensions. In the example below for the GolfStream Cube, the
CorpEntities Dimension has references to the five other Entity Dimensions following it.
Therefore, in this example, the GolfStream Cube needs to be told which Cube to find data for
those other Entity Dimensions.
GolfStream acts as a Super-Cube that consolidates data from five other Cubes because the
Houston Entity and its children are referenced within the CorpEntities Dimension and the Cube
References are updated to point to the Houston Cube for their data (shown below).
Data Access
These are security settings that control access to the Entity:
A typical use case is to use the Read-Only and Read-Write security group settings on Entities and
Scenarios to specify the users that need access to any data for those Dimensions. Then, Cube
Data Access Security can be used to further refine which data certain users can access. For
example, to restrict read and write access by Cost Center (which may be set up as UD1 in the
application), add entries in Cube Data Access Security that specify which users have access to
certain cost centers.
First, choose a User Group, the level of access, and then enter a Member Filter. For example, a
User Group that includes Senior Management and Human Resources can have All Access to
actual compensation figures (S#Actual, A#[Total Compensation].Tree), but everyone else will
have No Access.
Note that each of these Data Cell Access Security rules either grants or takes away access. This
depends on the Action, Behavior and Access Level and the order in which the rule appears in the
list.
General
Category
This is an optional Category name by which access rules can be named and grouped. If these
categories are created, more than one can be applied to an Entity’s security settings. If the
category is left blank in the Entity’s security settings, then all these rules will apply.
Description (Optional)
Description for the rule.
Security
Access Group
This is the group of users to which particular security roles apply. It can be an actual named
security group or refer to an Entity or Scenario group. The first four options refer to the Entity’s
Read Data Group, Read Data Group 2, Read Write Data Group or Read Write Data Group 2. The
5th and 6th group are the Scenario Read Data Group or Read Write Data Group. For example, if a
user is in the Read Data Group for an Entity, and he/she needs to be given access to Product
Sales data for that Entity, the rule would be set up as follows:
All Access groups from the 7th Access Group down are the full list of security groups from the
specific Framework database.
Action
Actions
There are three cases that will drive different behaviors and access levels for this particular Data
Cell Access Security rule in relation to other rules that came before or after it in the list. First, it
depends on whether the user trying to query or update data is in a particular User Group and
second, it depends on if the cell of data in question falls within a certain Member Filter. These are
the three cases:
Behavior
There are eight possible behaviors that coincide with the three action cases. For example, the
"Increase Access…" rules will increase access while going down the list of rules. The rules in the
list will continue to be evaluated until either the end of the list is reached or a Behavior that
includes the word "…Stop" is reached.
Access Level
No Access
Cannot read or write to the cell.
Read Only
Can read the cell.
All Access
Can read and write to the cell.
These properties work in conjunction with the security that is placed on an Entity. Refer to the
Security section under Entity Dimension to get a better understanding of how this works.
Category
This is an optional Category name by which access rules can be named and grouped. If these
categories are created, more than one can be applied to an Entity’s security settings. If the
category is left blank in the Entity’s security settings, then all of these rules will apply.
Description (Optional)
Description for the rule.
Action
There are two cases that will drive different behaviors and access levels for this particular Data
Cell Conditional Input rule in relation to other rules that came before or after in the list. This
depends on whether the cell of data in question falls within a certain Member Filter. These are the
two cases:
Based on the Action case, a series of Behaviors and Access levels will apply. See Action under
Data Cell Access Security for more information on Access Level choices.
Member Filters
These are the areas of the Cube that are affected by this rule.
Category
This is an optional Category name by which access rules can be named and grouped. If these
categories are created, more than one can be applied to an Entity’s security settings. If the
category is left blank in the Entity’s security settings, then all these rules will apply.
Description (Optional)
Description for the rule.
Security
See Data Cell Access Security for a description.
Action
See Data Cell Access Security for a description.
Member Filters
These are the areas of the Cube that are affected by this rule.
See Security Best Practices in "Implementing Security" on page 322 for more information on Data
Access Security.
Integration
This section controls the Dimensions that can be updated via the Import Input Type in Workflow
and the columns of data that appear in the Stage database when imported data is viewed. It can
determine the order of processing Dimensions and which Dimensions are turned on for data
integration. If one Dimension has to process before another (rare case), the transformation
sequence could be changed.
Cube Dimensions
This is a list of Dimensions to which a particular Cube can be mapped.
Transformation Sequence
Order in which a Dimension’s Transformation Rules will be processed for a given Scenario Type.
Enabled
When a Dimension is Enabled, it will appear in the Stage area and data can be imported into it for
a particular Scenario Type.
Special Dimensions
There are also special fields that can be mapped to the Stage area but cannot be mapped to the
Cubes. They can be drilled back from the Cube or used in Derivative Rules. The use of these
fields does increase processing times. Here are the choices:
Label
This would be the description for any given Account that was related to a particular line of data. It
is imported just for reference purposes.
Source ID
This is one of the keys to the data in the Stage and should be enabled. It can be mapped in via a
Data Source and can be set to a particular value in a file, the file name, or even a tab name from
an Excel file. Max length is 100 characters.
TextValue
This is to store large amounts of textual data. Max size is 2GB of text.
Attribute Dimensions
These 20 Dimensions can each store 100 characters of text.
Alias
This controls the column header that appears for this Dimension when viewing data in the Import
Input Type in Workflow.
Transformation Sequence
This is the order in which a Dimension’s Transformation Rules will be processed for a given
Scenario Type.
Enabled
When a Dimension is Enabled, it will appear in the Stage area and data is allowed to be imported
into it for a particular Scenario Type.
The settings in the Global Point of View essentially give a Cube View the currency rates that can
be input or viewed in the system.
Opening Rate
The currency rate at the beginning of the period.
Closing Rate
The currency exchange rate at the end of a period.
Historical Rate
The currency rate to be used for a specific historical Account calculation, typically for a special
transaction to be valued on a specific date.
Time
Select Time frame
Source Currency
Select type of currency
Destination Currency
Select Destination Currency for viewing in this Cube View
Column Axis
This drop down defines what to display in the columns.
Example of FX Rates:
Lock FX Rates
Use the Lock FX Rates feature to lock FX Rate Types. The feature prevents or allows changes to
all rates of an FX Rate Type. This may help eliminate mistakes and provides Audit and Task
Activity information around FX Rate activities, adding integrity to the FX Rate Type data.
The application locks the FX Rate Type by Time only, not by details of source or destination
currency.
Security Administrators have full rights and all functionality including locking and unlocking, by
default. To assign non-administrators lock and unlock rights requires standard security group
assignments in Application > Security Roles.
Non-administrators:
l Must be granted access to Application User Interface Roles / FxRatesPage. This will grant
View access to the Rates and the Locking Page.
l Access to the FX Rate Types, creating and deleting, and modifying/loading rates has not
changed. Users must be granted Application Security Roles / ManageFXRates.
l Users that will lock rates manually or via BRAPI Rules will require Application Security
Roles / LockFxRates.
l Users that will unlock rates manually or via BRAPI Rules will require Application Security
Roles / UnLockFxRates.
The Administrator User Group by default has both roles. Other users can be assigned
permissions for lock only, unlock only or both. You do not need to be able to manage the rates;
lock/unlock is a separate role.
Users need security rights to each of these roles to lock and unlock.
NOTE: Users can have access to either LockFXRates or UnlockFXRates. They do not
need access to both.
l Audit FX Rate Lock Table: track the current status of the Rate Type as to who performed
the unlock or lock.
2. In the task list you will see which users have locked or unlocked rates and the time.
4. In FX Rate Types, select a rate type from the list or enter a type in the search box.
5. In Time, select a year or expand the year to view the full hierarchy.
6. You can navigate the Time structure using the expansion icons. If you select an item using
the selection “check box”, it will set the Lock status as a Lock or Unlock icon. Also, any lock
that appears blue indicates that a descendant beneath it has a differing “Lock” status.
l If all the descendants are in the same lock state, the padlock is black.
8. You can right-click to apply to a hierarchy and choose Lock Descendants inclusive to lock
all within the time frame or Unlock Descendants inclusive to unlock all the within the time
frame.
9. Click OK. The spreadsheet will populate with the new data.
10. The FX Rates Grid is displayed. If the rate type and period is locked, you will see a green
background and a lock icon in the corner of the cell.
NOTE: If you change the lock or unlock rate types in the spreadsheet, when you
Refresh the application and return to the Rates grid, the data will be updated to reflect
your changes in the spreadsheet.
Member Filters
A Member Filter can be used to filter the data down to what an administrator wants to see by
creating a list of restricted Members. Member Filters contain multiple Member Scripts separated
by commas.
A Member Script is a brief statement typed to query one defined set of Dimensional Members.
Members can be specified for any or all Dimensions and the primary Dimension can also specify a
Member Expansion Formula (e.g., .Descendants).
Here is an example of a simple Member Script that returns the year 2012:
T#2012
Here is an example of a Member Script with a Member expansion that returns all the Income
Statement Accounts.
A#[Income Statement].Descendants
If one or just a few Dimensions in the Member Script are qualified, the remaining Dimensions are
pulled either from the Cube View POV, the Global POV, or Workflow, Time, and Scenario.
Separate each Dimension qualified in the Member Script with a colon. Here is an example of a
fully qualified Member Script:
Cb#GolfStream:E#Houston:P#Texas:C#Local:S#Budget:T#2012M3:V#YTD:A#60000:F#None:O#Forms:IC#None:U1#Sales:U2#HybridXL:U3#Northeast:U4#SportsCo:U5#None:U6#None:U7#None:U8#None
Here is an example of a Member Filter made up of three different Member Scripts that returns the
Actual, Budget and Forecast Scenarios.
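S#Actual, S#Budget, S#Forecast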
Cube
Cb#GolfStream
The specific Cube being referenced.
Entity
E#Houston
The specific Entity being referenced.
Parent
P#Texas
The Parent of the Entity being referenced. This is important because Entities can be rolled up into
multiple hierarchies.
Consolidation
C#Local
The specific Member of the Consolidation Dimension being referenced.
Scenario
S#Actual
The specific Scenario being referenced.
Time
T#2012Q1
Specific time period in focus. There are several selections available relative in nature to certain
boundaries like the time period set in the POV, the selected Workflow time period, or the Global
time period. More of these constants are explained in the Member Script Constants section later
in this document.
T#POV
This returns the current period of the selected year in the Cube POV.
T#W1
This returns the first week of the selected year in the Cube POV (this only applies to weekly
applications).
T#M1
This returns the first month of the selected year in the Cube POV.
T#Q1
This returns the first quarter of the selected year in the Cube POV.
T#H1
This returns the first half of the selected year in the Cube POV.
View
V#YTD
The specific view of the requested information.
Account
A#60000
The specific Account being referenced.
Flow
F#BegBal
Specific Flow Member being referenced. There is a dedicated Flow Dimension.
Origin
O#Import
The Origin of the data being referenced, indicating whether it was imported, entered as a Journal,
or entered on a Form.
UD1 to UD8
U2#[Fairway Woods] or U3#TotalRegions
Specific Members from the eight User Defined Dimensions are being referenced. If all eight of
these Dimensions are not being used, they still need to be qualified in the POV, in a Cube View, or
in the Member Script to get a proper intersection. Each User Defined Dimension has a Member
called None or the reserved word All that can be used (e.g. U5#All) in some areas of the product
such as Intercompany Reporting settings.
Workflow (WF)
Refer to the time period or Scenario currently selected in the Workflow module. Examples are
T#WF and S#WF.
Global
The Application has a Global Time and Global Scenario setting found under the Application
Tab|Tools|Application Properties. These settings can be used optionally and can be referenced
(e.g., T#Global and S#Global). An example of using this as a reference is to build a Cube View
and have Global Time and everything before it appear in columns and the Actual Scenario and
everything after it appear as the Forecast Scenario.
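A sketch of such a column Member Filter (assuming Actual and Forecast Scenario names and a monthly application) might be:
T#Global.AllPriorInYearInclusive:S#Actual, T#Global.AllNextInYear:S#Forecast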
Member Expansions
To illustrate these expansion concepts, the following examples will use a portion of the Entity
structure from GolfStream Corporation, which is a golf supply company. This is the management
rollup, but there is also a geographical rollup where Houston rolls up to Texas, Carlsbad rolls up to
California, Frankfurt rolls up to Europe, etc.
l Total GolfStream
  o Clubs
    o NA Clubs
      o Canada Clubs
        o Montreal
        o Quebec City
      o US Clubs
        o Augusta
        o Carlsbad
        o Houston
    o Frankfurt
  o Golf Balls
    o Europe Golf Balls
    o NA Golf Balls
  o Accessories & Apparel
  o Course Management
  o Corporate
Add these expansions after a Member name to return the desired Members like this:
E#Houston.Ancestors.
TIP: Use square brackets, [ ], to reference any name with a space. For example, E#[NA
Clubs].
Member
This lists the Member requested. E#Frankfurt.Member is the same as stating E#Frankfurt.
Base
E#[NA Clubs].Base queries only the base level Entities, excluding any aggregate Members:
l Montreal
l Quebec City
l Augusta
l Carlsbad
l Houston
Children
E#Clubs.Children returns the first level children in a flat list:
l NA Clubs
l Frankfurt
ChildrenInclusive
E#[NA Clubs].ChildrenInclusive returns both NA Clubs and its children:
l NA Clubs
o Canada Clubs
o US Clubs
This can be chained together with another .Children statement to see the next level as well. This
may also be referred to as Children.Children elsewhere in the documentation.
l NA Clubs
  o Canada Clubs
    o Montreal
    o Quebec City
  o US Clubs
    o Augusta
    o Carlsbad
    o Houston
Descendants
E#[NA Clubs].Descendants lists every Member under NA Clubs in a flat, non-hierarchical list
excluding NA Clubs:
l Canada Clubs
l Montreal
l Quebec City
l US Clubs
l Augusta
l Carlsbad
l Houston
DescendantsInclusive
E#[US Clubs].DescendantsInclusive lists every Member under US Clubs in a flat, non-hierarchical
list including US Clubs:
l Augusta
l Carlsbad
l Houston Heights
l South Houston
l Houston
l US Clubs
Tree
E#[US Clubs].Tree returns the specified Member and all Members below it in a hierarchy:
This expansion automatically includes expandable rows.
l US Clubs
o Augusta
o Carlsbad
o Houston
TreeDescendants
E#Clubs.TreeDescendants lists every Member under Clubs in a hierarchical tree, excluding Clubs:
l NA Clubs
  o Canada Clubs
    o Montreal
    o Quebec City
  o US Clubs
    o Augusta
    o Carlsbad
    o Houston
l Frankfurt
TreeDescendantsInclusive
E#[US Clubs].TreeDescendantsInclusive lists every Member under US Clubs in a hierarchical
tree including US Clubs:
l US Clubs
o Augusta
o Carlsbad
o Houston
Parents
E#Houston.Parents returns the direct Parents of the given Member regardless of how many
hierarchies to which the Member belongs:
The Parent is derived from the Cube View’s POV setting by default. If a Member is used in
multiple hierarchies, specify a specific Parent using the following syntax in order to override it:
E#Houston.Base:P#USClubs or E#Houston.TreeDescendants:P#Texas.
NOTE: P# works differently when using it with an expandable Tree filter because that
filter processes the children as individual queries, so the P# will only apply to the top-
level Members. Use E#Houston.TreeDescendants:P#Texas instead, or specify a
Parent on the Cube View’s POV settings.
Ancestors
E#[NA Clubs].Ancestors returns all Members up the chain from NA Clubs:
l Total GolfStream
l Clubs
Branch
Expand multiple Members of a given expansion by finding specific items and then performing
additional expansions on those items. Refer to the Samples Tab in the Member Filter Builder for
an example of the syntax.
l Total GolfStream
  o Clubs
    o NA Clubs
    o Frankfurt
  o Golf Balls
    o Europe Golf Balls
    o NA Golf Balls
  o Accessories & Apparel
  o Course Management
  o Corporate
This is also used with Quick Views in order to expand several hierarchies at one time.
Find
Find will apply Member expansions to a nested subset of results:
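For example (a sketch following the chained syntax used in the Remove example below), E#[Total GolfStream].Children.Find(Clubs).Children lists the children of Total GolfStream and, within those results, expands Clubs to show its children, NA Clubs and Frankfurt.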
FindAt
This returns a specific Member of a given expansion using a zero-based position index and
performing an additional expansion on the specific item. Refer to the Samples Tab in the Member
Filter Builder for an example of the syntax.
l Total GolfStream
o Clubs
o Golf Balls
First
First will find the first items in the list of results and allow additional Member Filters to be applied:
l Clubs
o NA Clubs
o Frankfurt
l Golf Balls
l Course Management
l Corporate
Last
Similar to First, Last will find the last items in the list of results and allow additional Member Filters
to be applied.
Keep
Keep will search the results from a Member Filter and only keep certain values:
E#[Total GolfStream].Children.Keep(Clubs, [Golf Balls]).Children will list all of the first level
children of Total GolfStream, only keeping Clubs and Golf Balls and then show their children:
l Clubs
o NA Clubs
o Frankfurt
l Golf Balls
o Europe Golf Balls
o NA Golf Balls
Remove
This will remove some of the Members from the results:
E#[Total GolfStream].Children.Remove(Corporate).Find(Clubs).Children.Find([NA
Clubs]).Children will list several Entities and then remove the Corporate Entity from the results:
l Clubs
  o NA Clubs
    o Canada Clubs
    o US Clubs
  o Frankfurt
l Golf Balls
l Course Management
List
This will create a list of specific Members:
An Indent Level can also be specified when defining a list of Members. Refer to the Samples Tab
in the Member Filter Builder for an example of the syntax.
l Clubs
o Golf Balls
o Course Management
Where
The Where clause in a Member Filter will allow further qualification of the results. Use AND, OR
and Parentheses to provide detailed conditions for including Members. Use the IN or NotIN
qualifiers to see if Members belong to a certain list:
See Member Expansion Where Clause later in this section for more examples.
Options
Use this to reference Sub-Cubes and specify how Dimensions should be processed. Options
must immediately follow the expansion function for which they are being used. For example,
A#19999.Children might return a different list when looking at an extended dimension associated
with a different cube.
Combined Expressions
Stack multiple Member Expressions to display Children and Parent Members. The example
below is using the List and Find expressions to display Base and Parent Members.
l Clubs
l Montreal
l Quebec City
l Augusta
l Carlsbad
l California
l US Clubs
l Houston Heights
l South Houston
l Frankfurt
l Corporate
l Frankfurt
For more Member Expansion examples, refer to the Samples Tab in the Member Filter Builder.
In Cube Views, this functionality can be set in the Member Filter Builder using the Member
Expansion Functions tab for Rows (and/or) Columns.
In Quick Views, this can be set in Preferences under the Quick View Double-Click Behavior
section in the Default Expansion for Rows (and/or) Columns.
This example will demonstrate TreeDescendantsR being used in a Quick View. This Gross
Income view below has been created using Next Level expansion (available in a Quick View):
To change the double-click behavior, select Preferences under the Administration menu. In the
Quick View Double-Click Behavior section, select the drop-down list for Default Expansion For
Rows and select TreeDescendantsR. The result is that rows expand in the reverse direction from
the originating account when double-clicking:
AllPriorInYear
This returns all the time periods before the specified time period excluding the specified time for its
frequency. T#2012M6.AllPriorInYear, T#WF.AllPriorInYear return the previous periods in that
year, but not the specified period.
AllPriorInYearInclusive
This returns all the time periods before the specified time period including the specified time for its
frequency. T#2012M4.AllPriorInYearInclusive returns periods 2012M1 through 2012M4.
AllNextInYear
This returns all the time periods after the specified time period not including the specified time for
its frequency. T#2012M8.AllNextInYear returns periods 2012M9 through 2012M12.
AllNextInYearInclusive
This returns all the time periods after the specified time period including the specified time for its
frequency. T#2012M8.AllNextInYearInclusive returns periods 2012M8 through 2012M12.
Weeks
This returns all the weeks associated with the specified time filter. T#2017M2.Weeks returns all
the weeks in M2. T#2017.Weeks returns all the weeks in 2017. If the POV Time is 2017M7,
T#POV.Weeks returns all the weeks in M7.
Months
This returns all the months associated with the specified time filter. T#2017.Months returns all the
months in 2017. If the POV Time is set to 2017Q2, T#POV.Months returns all the months in Q2.
MonthsInQuarter
This returns the months in the specified quarter. If the Workflow period is 2012M2,
T#WF.MonthsInQuarter will return 2012M1, 2012M2, and 2012M3.
MonthsInHalfYear
This returns the months in the half year of the specified period. If the Global period is 2012M2,
T#Global.MonthsInHalfYear will return 2012M1 through 2012M6.
MonthsInYear
This returns all of the months in the year of the specified period. If the POV period is 2012M7,
T#POV.MonthsInYear will return 2012M1 through 2012M12.
Quarters
This returns the quarters associated with the specified year. T#2017.Quarters will return 2017Q1,
2017Q2, 2017Q3, 2017Q4.
QuartersInHalfYear
This returns the quarters in the half year of the specified period. If the Workflow period is 2012M3,
T#WF.QuartersInHalfYear will return 2012Q1 and 2012Q2.
QuartersInYear
This returns all of the quarters in the year of the specified period. T#2012M7.QuartersInYear will
return 2012Q1, 2012Q2, 2012Q3 and 2012Q4.
HalfYears
This returns the half years associated with the specified year. T#2017.HalfYears returns 2017H1
and 2017H2.
HalfYearsInYear
This returns the half years in the year of the specified period. If the Global period is 2012M5,
T#Global.HalfYearsInYear will return 2012H1 and 2012H2.
Prior1-Prior12
This returns the prior period(s) in relation to the specified period. T#2010M12.Prior12 will return
the 12 months prior to 2010M12 not including 2010M12. T#2017W40.Prior12 will return the 12
weeks prior to 2017W40 not including 2017W40.
Next1-Next12
This returns the next period(s) in relation to the specified period. T#2010M12.Next12 will return
the 12 months after 2010M12 not including 2010M12. T#2017W40.Next12 will return the 12
weeks after 2017W40 not including 2017W40.
WFTimePeriods
This returns the time period(s) associated with the Workflow Profile. When using the Standard
Workflow Tracking Frequency, it returns the single time period associated with the selected
Workflow Unit. When using other Workflow Tracking Frequencies, it returns the range of time
periods for the selected Workflow Unit between WFStartTime and WFEndTime.
WFCalculationTimePeriods
This returns the time periods needed to be calculated for a Workflow Unit. For example, if a range-
based Workflow Unit spans two years, it would return the last period of the first year and the last
period of the second year. Executing a calculation using those two time periods would cause all
24 time periods to be calculated because the calculation engine automatically calculates all prior
periods in a year. WFCalculationTimePeriods is intended to be used in Data Management Steps
and Business Rules.
MemberDim
U2#AllProducts.Descendants(MemberDim='HoustonProducts') filters the Member list to the
products in the HoustonProducts Dimension under UD2.
InUse
A#Root.Children(InUse=True) filters the results to only the Members in use.
AccountType
A#Root.Base(AccountType = Revenue) filters the results to only the Members under the Account
Type of Revenue.
Formula
F#Root.TreeDescendants(Formula <> '') filters out Members with an empty value for the Formula
property.
Choices:
UserInReadDataGroup
UserInReadDataGroup2
UserInReadWriteDataGroup
UserInAnyDataSecurityGroup
UserInReadWriteDataGroup2
A#Root.Children(Text1='Blue') filters the results to only the Members whose Text1 property is
Blue.
A#Root.Children(Text5 <> '') filters out the Members that have an empty value for the Text5
property.
This is the list of supported operators. Operators such as < are not applicable to the text-based
fields.
>=
<=
<>
>
<
StartsWith
Contains
DoesNotContain
EndsWith
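For instance, a sketch using illustrative values: A#Root.Base(Text1 StartsWith 'B') would return the base Accounts whose Text1 property begins with B, while A#Root.Base(Text1 = 'Blue') requires an exact match.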
For more Where Clause examples, refer to the Samples Tab in the Member Filter Builder.
Time Functions
When using the POV as a point of reference, use the following constants in the Member Scripts. In
the constants below, N refers to the number of periods desired.
T#PovFirstInYear
This returns the first period in the year portion of the POV period. It is the same as T#POVM1,
T#POVQ1, or T#POVH1 based on the Input Frequency setting of the given Scenario. T#M1,
T#Q1, or T#H1 work the same way as well.
T#PovLastInYear
This returns the last period in the year portion of the POV period.
T#POVFirstInMonthPriorN
This refers to the POV month and returns N months prior. If the POV is 2017M10 and
T#POVFirstInMonthPrior5 is requested, 2017M5 is returned.
T#POVFirstInHalfYearPriorN
This refers to the POV sub-period's half year and goes back N periods from the first period in that
half year. If the period is 2013M5 and T#POVFirstInHalfYearPrior2 is requested, 2012M11 is
returned.
T#POVFirstInYearPriorN
This refers to the POV year and goes back N periods from the first period in that year. If the period
is 2013M5 and T#POVFirstInYearPrior1 is requested, 2012M12 is returned.
T#WFFirstInYear
This returns the first period in the year portion of the Workflow period. This is the same as
T#WFM1, T#WFQ1, or T#WFH1 based on the Input Frequency settings of the given Scenario.
T#WFLastInYear
This returns the last period in the year portion of the Workflow period.
T#GlobalFirstInYear
This returns the first period in the year portion of the Global period. This is the same as
T#GlobalM1, T#GlobalQ1, or T#GlobalH1 based on the Input Frequency setting of the given
Scenario.
T#GlobalLastInYear
This returns the last period in the year portion of the Global period.
For more Time Function examples, refer to the Samples Tab in the Member Filter Builder.
T#Year(|!MyTimeParam!|)Period(|!MyTimeParam!|)
The Year function returns the specified year and the Period function returns the period without the
year. The result is a valid time Member because they are combined. If the user selected 2012M6,
it looks like this after the Parameter substitution.
T#Year(2012M6)Period(2012M6)
T#2012M6
This is an example showing last year’s values for the same month:
T#YearPrior1(|!MyTimeParam!|)Period(|!MyTimeParam!|)
This is an example showing last month’s values for the same year:
T#Year(|!MyTimeParam!|)PeriodPrior1(|!MyTimeParam!|)
T#Year(|!MyTimeParam!|).Base
T#Year(2012M6)Period(2012M6) = 2012M6
The 2012M6 could have started as a substitution variable such as |!MyTimeParam!| or |PovTime|
T#Year(2012M6)M2 = 2012M2
T#YearPrior1(2012M6)Period(2012M6) = 2011M6
T#YearPrior2(2012M6)Period(2012M6) = 2010M6
T#YearPrior2(2012M6) = 2010
T#Year(2012M6)PeriodPrior3(2012M6) = 2012M3
T#Year(2012M6)PeriodPrior8(2012M6) = 2011M10
The PeriodPrior8 caused it to change the year too.
T#Year(2012M6)PeriodPriorInYear8(2012M6) = 2012M1
The PeriodPriorInYear8 does not change the year.
If the Year section is skipped, the year comes from the Period section.
T#PeriodPrior3(2012M6) = 2012M3
T#Year(): T#Year(|!TimeParam!|)Period(|!TimeParam!|)
T#YearPrior1(): T#YearPrior2(|WFTime|)
T#YearNext1(): T#YearNext3(|GlobalTime|)
T#Period(): T#Year(|!TimeParam!|)Period(|!TimeParam!|)
T#PeriodPrior1(): T#PeriodPrior1(|WFTime|)
T#PeriodNext1(): T#PeriodNext2(|GlobalTime|)
T#PeriodPriorInYear1(): T#PeriodPriorInYear2(2012M6)
T#FirstPeriodInQuarter(): T#FirstPeriodInQuarter(|WFTime|)
T#FirstPeriodInQuarterPrior1(): T#FirstPeriodInQuarterPrior1(2012M6)
T#FirstPeriodInQuarterNext1(): T#FirstPeriodInQuarterNext2(|POVTime|)
T#FirstPeriodInHalfYear(): T#FirstPeriodInHalfYear(|WFTime|)
T#FirstPeriodInHalfYearPrior1(): T#FirstPeriodInHalfYearPrior1(2012M6)
T#FirstPeriodInHalfYearNext1(): T#FirstPeriodInHalfYearNext2(|WFTime|)
T#FirstPeriodInYear(): T#FirstPeriodInYear(|WFTime|)
T#FirstPeriodInYearPrior1(): T#FirstPeriodInYearPrior1(|!TimeParam!|)
T#FirstPeriodInYearNext1(): T#FirstPeriodInYearNext2(|PovTime|)
T#Quarter(): T#Quarter(|WFTime|)
T#QuarterPrior1(): T#QuarterPrior1(2012M6)
T#QuarterNext1(): T#QuarterNext2(|POVTime|)
Weekly Functions
The following Time Functions apply to weekly applications. These examples are based on a 52
Week 445 calendar.
T#FirstPeriodInMonth(): T#FirstPeriodInMonth(2017W43)
Returns: 2017W40
T#FirstPeriodInMonthPrior1(): T#FirstPeriodInMonthPrior3(|WFTime|)
Workflow Time: 2017W1
Returns: 2016W50
T#FirstPeriodInMonthNext1(): T#FirstPeriodInMonthNext2(2017W49)
Returns: 2017W50
T#Month(): T#Month(2017W40)
Returns: 2017M10
T#MonthPrior1(): T#MonthPrior3(2017W44)
Returns: 2017M8
T#MonthNext1(): T#MonthNext5(|POVTime|)
POV Time: 2017W9
Returns: 2017M8
Changing the Time Label in Headers When Using These Time Functions
When using this Time function to change a header label in a Cube View, use the :Name(“”)
function and substitute what will be seen. For example, to show the value from the last period from
the year prior to the Global Time period, use this function:
T#YearPrior1(|GlobalTime|)M12
2011M12 will return in the header of the Cube View, such as a column header. In order to return
Year End 2011, use the Name function in this way:
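A minimal sketch (with the label text hard-coded for illustration rather than derived from a variable):
T#YearPrior1(|GlobalTime|)M12:Name("Year End 2011")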
Substitution Variables
Substitution Variables can be used to exchange values at run time for Parameters presented to
the user before a Dashboard runs, for Members in the POV, for Global variables for Scenario or
Time, or for Workflow variables for Time, Scenario or other Workflow Profile attributes.
By adding Desc to the end of many of these substitution variables, the Description for the Member
will be returned instead of the Member name:
NOTE: If there is an index out-of-range, the Cube View will display an error showing the
data POV as invalid. When the Cube View is passed to the Pivot Grid control, it
attempts to resolve the error column. It cannot, so in the case of the Time dimension it
will instead display the default time member as "Name - Description". To resolve the
issue in the Pivot Grid, you must first correct the issue in the source Cube View.
Substitution Periods are defined within a given Year. The number of sub periods is limited to the
number of months or weeks within the given application year, so going prior or next within a given
year stops at the year boundary. For example, PovSubPeriodPrior14 in a monthly application will
not go past M1, and PovSubPeriodNext will not go past M12. In a weekly application,
PovSubPeriodPrior will not go past W1 and PovSubPeriodNext will not go past W53.
POVTimeDimProfile
This returns the Time Dimension Profile name associated with the user’s current Cube POV.
POVCurrency
This returns the actual currency for an Entity based on the Consolidation Dimension setting in the
POV. For example, if Consolidation is set to Local for the Manchester Entity, GBP will return,
which is its currency. If Consolidation is set to Translated and the Parent is set to US, USD will
return.
POVTimePriorInYearN
This returns the prior (or more) time period if it is still in the current year.
POVTimePriorN
This returns the prior (or more) time period.
POVTimeNextInYearN
This returns the next (or more) time period if it is still in the current year.
POVTimeNextN
This returns the next (or more) time period.
POVYear
This returns the Year portion of a POV Time period.
POVYearPriorN
This returns the Year portion of a Time period for the previous year where N is the number of
years.
POVYearNextN
This returns the Year portion of a Time period for the upcoming year where N is the number of
years.
POVSubPeriod
This returns the Week, Month, Quarter, or Half Year portion of a Time period. If the POV is
2012M5, M5 is the SubPeriod.
POVSubPeriodPriorN
This returns the previous Week, Month, Quarter, or Half Year portion of a Time period.
POVSubPeriodNextN
This returns the next Week, Month, Quarter, or Half Year portion of a Time period.
POVSubPeriodNum
This returns the sub period’s number for the current POV, so if the POV is set to M3, the number 3
is returned.
POVTimeFirstInYear
This is the same as T#POVFirstInYear, T#W1, T#M1, T#Q1, or T#H1 based on the Input
Frequency of the Scenario.
POVTimeLastInYear
This returns the last time period in the year based on the Input Frequency of the Scenario.
WFProfile
This provides the current Workflow Profile name.
WFProfileIndex
This provides a numeric value of the Workflow Profile in the hierarchy.
WFProfileLastDescendantIndex
This provides an index of bottom descendants in the Workflow Profile tree.
WFProfileKey
WFReviewProfileKeys
WFInputParentProfileKeys
WFImportProfileKeys
WFFormProfileKeys
WFJournalProfileKeys
WFScenario
This returns the Scenario in the Workflow View.
WFScenarioDesc
This returns a Description of the Scenario in the Workflow View.
WFScenarioID
This returns the Numeric ID of the Scenario in the Workflow View.
WFTime
This returns the Time period associated with the current Workflow Unit.
WFTimeID
This returns the Numeric ID of the Workflow Unit Time period.
WFTimePriorInYearN
This returns the prior (or more) Workflow time period if it is still in the current year.
WFTimePriorN
This returns the prior (or more) time period.
WFTimeNextInYearN
This returns the next (or more) time period if it is still in the current year.
WFTimeNextN
This returns the next (or more) time period.
WFYear
This returns the year portion of the Workflow Unit.
WFYearPriorN
This returns the Year portion of a Time period for the previous year where N is the number of
years.
WFYearNextN
This returns the Year portion of a Time period for the upcoming year where N is the number of
years.
WFSubPeriod
This returns the Week, Month, Quarter, or Half Year portion of a Time period.
WFSubPeriodPriorN
This returns the previous Week, Month, Quarter, or Half Year portion of a Time period.
WFSubPeriodNextN
This returns the next Week, Month, Quarter, or Half Year portion of a Time period.
WFSubPeriodNum
This returns the period’s number for the current Workflow Period, so if the Workflow is set to M3,
the number 3 is returned.
WFTimeFirstInYear
This is the same as T#WFFirstInYear, T#W1, T#M1, T#Q1, or T#H1 based on the Input
Frequency of the Scenario.
WFTimeLastInYear
This returns the last time period in the year based on the Input Frequency of the Scenario.
WFStartTime
This is the Workflow Start Time entry for this Scenario.
WFStartTimeDesc
This provides a Description of the Workflow Start Time entry for this Scenario.
WFStartTimeShortDesc
This provides a Short Description of the Workflow Start Time entry for this Scenario.
WFEndTime
This is the Workflow End Time entry for this Scenario.
WFEndTimeDesc
This provides the Description of the Workflow End Time entry for this Scenario.
WFEndTimeShortDesc
This provides the Short Description of the Workflow End Time entry for this Scenario.
WFCubeRoot
This returns the very top level of the Workflow Profile hierarchy.
WFCube
This returns the Cube related to this Workflow Profile.
WFTimeDimProfile
This returns the Time Dimension Profile name associated with the user’s current Workflow POV.
WFEntityDim
This returns the Dimension in play for the Workflow Entities assigned to this Workflow Profile.
WFScenarioDim
This returns the Scenario Dimension for the Cube assigned to this Workflow Profile.
WFAccountDim
This returns the Account Dimension associated with the Cube for this Workflow Profile.
WFFlowDim
This returns the Flow Dimension for the Cube assigned to this Workflow Profile.
WFUD1Dim-WFUD8Dim
This returns the UD1 Dimension-UD8 Dimension for the Cube assigned to this Workflow Profile.
GlobalTime
This is the Global Point of View Time period from Application Properties.
GlobalScenarioDesc
This provides the Descriptions for the Global Scenario.
By adding Desc to the end of many of these substitution variables, the Description for the Member
will be returned instead of the Member name:
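For example, assuming a Scenario named Actual whose Description is "Actual (Working)" (hypothetical values), |WFScenario| returns Actual while |WFScenarioDesc| returns Actual (Working).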
The following Substitution Variables will return the Cube View Name, Cube Name, or Dimension
Name associated with the specific Cube View as well as a Description.
CVCurrency
This returns the actual currency for an Entity based on the settings in the specific Cube View.
CVTimePriorInYear
CVTimePrior
CVTimeNextInYear
CVTimeNext
CVYear
CVYearPrior
CVYearNext
CVSubPeriod
CVSubPeriodPrior
CVSubPeriodNext
CVSubPeriodNum
CVTimeFirstInYear
CVTimeLastInYear
Null
|Null| is empty text. |Null| is mostly used in Cube-level security and the Stage parser. It can also
be used within a comma-separated list of Parameter values when you need to set a value to be an
empty string. However, |Null| is not a commonly used substitution variable.
Space
This is used to replace a string with nothing or use a space along with the Name function in a
Member Filter. For example, in order to make a Column Header or Row Header blank, use
T#POV:Name(“ ”) or T#POV:Name(|space|). Both will produce the same result.
UserName
This provides the current user name.
UserText1-4
This allows the reference of the Text1 through Text4 properties related to a User account:
|UserText3|.
AppName
This provides the application name.
DateTimeForFileName
This returns the current date and time: 20131208_102540.
DateForFileName
This returns the current date: 20131208.
DateLong
This returns the current date: Sunday, December 08, 2013.
DateMMDDYYYY
This returns the current Date as Month, Day, Year: 12/08/2013.
DateDDMMYYYY
This returns the current Date as Day, Month, Year: 08/12/2013.
DateYYYYMMDD
This returns the current Date as Year, Month, Day: 2013/12/08.
DateTimeHHMMSS
This returns the current Date/Time as Hour, Minutes, Seconds: 11:00:19.
DateTimeForFileNameUTC
DateForFileNameUTC
DateLongUTC
DateMMDDYYYYUTC
DateDDMMYYYYUTC
DateYYYYMMDDUTC
DateTimeHHMMSSUTC
The versions of these variables with the UTC suffix return the same results, but in Coordinated Universal Time.
MFTime
MFTimeDesc
MFTimeShortDesc
MFYear
MFSubPeriod
MFSubPeriodNum
The following Substitution Variables work with the XFMemberProperty function in order to retrieve
any Dimension Member Name being used within a Member Filter. Add Desc to any of these
Substitution Variables in order to display the Member Description instead of the Member Name.
See XFMemberProperty under Commonly Used Member Filter Functions for more details on
using these Substitution Variables.
|MFCube|
|MFEntity|
|MFParent|
|MFConsolidation|
|MFScenario|
|MFTime|
|MFView|
|MFAccount|
|MFFlow|
|MFOrigin|
|MFIC|
|MFUD1|-|MFUD8|
Loop1-4Variable
Loop1-4DisplayVariable
Loop1-4Index
Variable1-10
See "Presenting Data With Books, Cube Views and Other Items" on page 576.
For more GetDataCell examples, refer to the Samples Tab in the Member Filter Builder.
Parameter/Parameter Display
Use one of these buttons in order to enter a custom Parameter reference that comes from either a
Form or Dashboard. The Parameter Display Substitution Variable is only used when working with
a Delimited List Parameter where the Display Item will be displayed instead of the Member name.
Business Rules
Business Rules can be passed in a Cube View Member Filter in order to do complex calculations
on the Members referenced in the Cube View. Set up the Finance Business Rule to indicate the
name of the function and any name-value pairs to reference in the Member Filter. A different
action can be performed based on the definition of the name-value pair. In the example below,
the Business Rule is going to read the current time period from the rule and get the value for the
prior year based on the Member script.
Name-Value Pair
The Name-Value Pair in this string is Field1. This needs to be referenced in the Member Filter
and defined. Based on the Name-Value Pair, the rule can run different actions.
Select Case api.FunctionType
    Case Is = FinanceFunctionType.DataCell
        'Only handle the function name passed from the Cube View Member Filter
        If args.DataCellArgs.FunctionName.Equals("PYMonthForCol", StringComparison.InvariantCultureIgnoreCase) Then
            'Read the name-value pair and build a Member Script for the prior year
            Dim ms1 As String = args.DataCellArgs.NameValuePairs("Field1")
            Dim priorYearTimeName As String = api.Time.GetNameFromId(api.Time.GetPriorYearPeriodId())
            Dim memberScript As New System.Text.StringBuilder
            memberScript.Append(ms1)
            memberScript.Append(":T#")
            memberScript.Append(priorYearTimeName)
            Return api.Data.GetDataCell(memberScript.ToString)
        End If
End Select
Example
GetDataCell("BR#[BRName = XFR_CVDataCellHelperNew, FunctionName = PYMonthForCol,
Field1 =A#60999]"):Name("Sales Last Year")
NOTE: See the Samples tab in the Member Filter Builder for more examples on this
function.
In the Business Rule, define the List Name and the name-value pairs:
Select Case api.FunctionType
    Case Is = FinanceFunctionType.MemberList
        Dim listName As String = args.MemberListArgs.MemberListName
        Dim entityList As String = args.MemberListArgs.NameValuePairs("EntityList")
        Dim objMemberListHeader = New MemberListHeader(args.MemberListArgs.MemberListName)
        Dim objMembers As New List(Of Member)
        If listName.Equals("EntityParentList", StringComparison.InvariantCultureIgnoreCase) Then
            'Populate objMembers using the EntityList name-value pair (body elided in this example)
        End If
        If listName.Equals("AlphaSortList", StringComparison.InvariantCultureIgnoreCase) Then
            'Sort objMembers alphabetically (body elided in this example)
        End If
        'Return list
        Return New MemberList(objMemberListHeader, objMembers)
End Select
XFMemberProperty
This function allows users to specify a Dimension property and display the Member Property
selection as a row or column header on a Cube View. Use this function with the Name() and
GetDataCell() portion of a Member Filter.
The following name-value pair settings can be used for this function:
l DimType
Dimension name such as Entity, Account, etc.
l Member
Dimension Member name
l Property
The exact Dimension property name
l VaryByCubeType
Use this if the property varies by a specific Cube
l VaryByScenario
Use this if the property varies by Scenario such as =Actual, =Budget, etc.
l VaryByTime
Use this if the property varies by a specific Time Period such as =2016M1, =2016M5, etc.
l TimeDimProfile
This can be set to CV, WF, any Time Dimension Profile name, or a Cube name using the
CB# qualifier
Example Syntax:
T#2015.Base:Name(XFMemberProperty(DimType=Time, Member=|MFTime|,
Property=Description, TimeDimProfile=|CVTimeDimProfile|))
The example above uses the |MFTime| Substitution Variable in order to reference the
Members in the T#2015.Base Member Filter. See Member Filter Substitution Variables for
more details on these Substitution Variables. It also uses a Substitution Variable for
TimeDimProfile in order to point to whatever Time Dimension Profile is being used on the
Cube View.
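A similar pattern can be used for a property that varies by Scenario. The following is an illustrative sketch only; the Account filter and the Text1 property are assumptions, not taken from this guide:
A#Root.Base:Name(XFMemberProperty(DimType=Account, Member=|MFAccount|, Property=Text1, VaryByScenario=Actual))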
NOTE: For more examples on using this function, refer to the Samples tab in the
Member Filter Builder.
1. Member Filter
This area is where the Member Filter will be built. Type in this section or use the dialog to
help fill it out.
2. Member Selection
There is a button for each Dimension here that will launch the appropriate selection dialog.
5. Time Functions
These only apply to the Time Dimension, such as T#POVPrior1. Double-click on a Time
Function to add it.
6. Substitution Variables
Double-click on a system wide Substitution Variable to add it.
7. Samples
Refer to this tab for examples on how to build Member Expansions, Time Functions, Where
Clause Expressions, GetDataCell Expressions, and Custom Member List Expressions.
8. Expansion
These buttons are commonly used Member Expansions. Click the Expansion to add it.
9. Workflow
These buttons are commonly used Workflow Member Expansions used in Cube Views that
point to a Report, Form, or Dashboard and are affiliated with a specific Workflow Profile.
10. Other
These buttons are commonly used Member Filter Functions which allow the user to create
calculated columns and rows or use a custom Parameter to store Member lists.
Workflow
A primary feature of the OneStream platform is the ability to tailor and optimize all aspects of an
application to best fit the requirements of the model design, such as Consolidation, Planning,
Forecasting or Operational models. In this section you will learn about the various import
methods, workflow analysis, and the blend workflow.
l Standard: Highly durable and auditable, stored details that target Finance Engine Cubes.
The BI Blend Engine is a key element of Analytic Blend models as it is a read-only aggregate
storage model. The purpose is to support reporting on large volumes of data that is not
appropriate to store in a traditional Cube, such as transaction or operational data.
The Blend Type Import rationalizes the source data into a structure that is uniform and
standardized for reporting by leveraging Cube Dimensions, deriving the metadata and
aggregation points for the resulting BI Blend relational tables. This enables the transaction
content stored in the relational tables to be aligned with the Finance Engine Cube data through
common metadata and aggregation points for Analytic Blend Reporting. Refer to the BI Blend
Design and Reference Guide for more detail.
The benefits of efficient Workflow structure using partitioning when working with large data
sources are:
The Direct Load Workflow is designed for data that has a high frequency of change and does not
demand durability for audit and history.
Common Uses
l Data Integrations where the OneStream Metadata and Source System are mirrored, allowing "* to *" Transformation rules to pass through all records, minimizing the need for Drill-Down or Transformation analysis.
l Data that is “disposable” in nature. Typically, this may be data that has a high frequency of
changes and may only be valid for a short time. Perhaps only valid for one to seven days.
l High-volume data loads, such as nightly batch loading, where optimal performance is
desired. Such data is commonly deleted and reloaded frequently.
l Extended Application data moves, where data from a detailed application feeds a
summarized target application.
Important Limitations
A key differentiator of the Direct Load is that it does not store source and target records in Stage
Database tables. This, by design, will eliminate the audit and historical archiving of Workflow
Activity. Another limitation of the in-memory Workflow is that the Drill-Down feature will not function to support analysis of records between the Finance and Stage tables.
l Direct Load Type does not support historical audit of Workflow activity, such as Import and Transformation Rule history.
l Direct Load Type does not support Re-Transform, as Import records are not stored. Data must be re-loaded.
l Transformation and Validation analysis and map correction is limited to 1000 error records
per load.
l Data files cannot load to Time and Scenarios beyond the current Workflow Scenario and
Time. The data record's Time and Scenario being loaded must match the Workflow
Scenario and Time.
Direct Load is an in-memory Workflow with only a single step for the data integration process: Load and Transform. The Direct Load Execution Status screen provides statistics to analyze the Workflow’s performance. These key statistics help determine whether the Workflow design follows best practices for optimizing application performance.
l Detailed Row Count: The total number of Data Source and Derivative Rule records.
l Summary Row Count: The total number of records summarized in the Transformation
process.
l Loaded Row Count: The recorded number of records loaded to the Finance Engine target
Cube, which should always equal the Summary Row Count.
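As a hypothetical illustration of how these statistics relate: a load containing 50,000 Data Source records plus 2,000 Derivative Rule records would show a Detailed Row Count of 52,000; if those rows summarize to 8,000 unique target intersections, the Summary Row Count would be 8,000 and the Loaded Row Count should also be 8,000.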
The Direct Load Workflow’s in-memory processing means Transformation and Validation errors are not stored in a table. The total number of errors that can be processed and presented in Validation is limited to 1,000 records. If the total number of errors exceeds 1,000, the data must be re-loaded to re-execute the Transformation and Validation process and generate the next batch of errors, at a maximum of 1,000 records per load.
Integrations with highly complex mappings may benefit from a "development" Standard Workflow used to finalize the core Transformation Rules. A Standard Workflow supports pageable Validation and Intersection Error analysis, as well as the ability to Re-Transform source data, which the Direct Load does not. The Standard Workflow also provides Drill-Back from the Finance Engine to Stage, which may streamline the data validation process. Once the core Transformation Rules are developed, a "production" Direct Load Workflow can be used, managing only the Validation exceptions.
l Tightly coupled source metadata to OneStream metadata allows the use of "* to *" Mask maps to easily associate source data to cube data (see the example after this list).
l Integrations with complex mapping may best make use of a Standard Workflow as
“development” to provide transparency and as a platform for debugging.
l Blob: Summarized records are not stored directly in the StageSummaryTargetData, but as
a serialized Byte array stored in the StageDirectLoadInformation.
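As a simple illustration of a "* to *" pass-through (following the Mask wildcard behavior described under Mapping Types), a Mask rule such as the following passes every source value through unchanged:
Rule Expression: *
Target Value: *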
The BI Blend Settings contain core properties used to design and structure the relational tables
created by the BI Blend Engine.
Data Controls: Defines the core data source and output structure and design of the relational tables.
BI Blend also solves for use cases that are not pure analytic reporting problems. By leveraging
OneStream hierarchies, along with BI Blend configuration settings, it is possible to aggregate on a
few dimensions (Entity or Account as an example) while including transaction information (Invoice
number) that is not associated with a cube. The ability to combine the dimensional structure with
transaction details allows for selective enrichment of transactional data. Refer to the BI Blend
Design and Reference Guide for more detail.
Workflow Channels
Workflow Channels allow the process to be locked down to a more granular level than the
standard Workflow Profiles. This is an additional setting that can be configured to one additional
Dimension. For example, data input can be locked down by product.
Standard
This is a basic Member without any special purpose other than to act as the default Workflow
Channel for Account Members and Workflow Profile Input Children.
NoDataLock
This is a special Member that only applies to a metadata Member (Account or UDx) that should
not participate in a Workflow Channel grouping scheme. This is the default value for any UDx
Member.
AllChannelInput
This is a special Member that only applies to a Workflow Profile Input Child (Import, Forms or Adj)
and indicates the Workflow Profile can control data input processes for any Workflow Channel.
To set this, go to the Application Tab > Tools > Application Properties. See
Workflow in "Workflow" on page 131 for more details on this process.
Workflow Profiles
Workflow Profiles are the foundation of the data loading process, which is where Data Sources
and Transformation Rules are assigned, and the anchors for the review and certifications. This
section will go through more detail about all the configuration settings available. See Workflow in
"Workflow" on page 131 for more details on how the Workflow function works.
TIP: It is best to lay out the organization first in Excel to view and review how the
structure looks. Once confirmed, create the same structure in OneStream.
TIP: There are a variety of combinations for building out the Workflow Profiles. Below
are a couple from some best practices.
Fully Integrated
This is the complete flow from data loading to certification in the same structure.
Shared Services
Data loading and data certification are integrated; however shared services are separated from
the certification.
Separation of Duties
Data load and certification each have their own structures.
Review: Reviewer or Certification
Base Input: Data Load
Parent Input: Parent Adjustments
Rename Profile
Use this to rename a Workflow Profile or Workflow Profile Input Type
Save
Use this to save any changes made to selected Workflow Profile
Move Up
Use this to move sibling profiles up in the hierarchy
Move Down
Use this to move sibling profiles down in the hierarchy
Navigate
This icon appears in various fields and when clicked it navigates to a section that coincides with
the Workflow Profile property. For example, if this icon is clicked in the Cube Name setting, the
Cube screen will open allowing the user to make any changes needed before assigning a Cube to
a Workflow Profile.
Use the Workflow search tool to filter down to the specific Workflow Profile or Workflow Input Type
desired. The Cube Root defaults to the last one selected and displays the associated Workflow
Profiles. In order to see another Workflow structure hierarchy, select a different Cube Root.
Right-click on any Workflow Profile name in order to expand or collapse all the selected Workflow
Profile’s descendants.
Import
Import is typically used to load a GL data file or a OneStream configured Excel Template. The
Import Origin can be configured by one of the following:
(Import, Validate, Load), (Import, Validate, Load, Certify), (Import, Validate, Process, & Certify),
(Import, Validate, Process, Confirm, Certify) (Central Import), (Workspace), (Workspace, Certify),
(Import Stage Only), (Import, Verify Stage Only), (Import, Verify, Certify Stage Only)
Form
Form is used to load data either through a Form template or a pre-configured Excel XFSetCell file.
The Form Origin can be configured by one of the following:
(Form Input), (Form Input, Certify), (Form Input, Process, Certify), (Form Input, Process, Confirm,
Certify), (Pre-Process, Form Input), (Pre-Process, Form Input, Certify), (Pre-Process, Form Input,
Process, Certify), (Pre-Process, Form Input, Process, Confirm, Certify), (Central Form Input),
(Workspace), (Workspace, Certify)
Journal
Journal is used to load journal adjustment data through a journal template. The Journal Origin
can be configured by one of the following:
(Journal Input), (Journal Input, Certify), (Journal Input, Process, & Certify), (Journal Input,
Process, Confirm, Certify), (Central Journal Input), (Workspace), (Workspace, Certify)
Parent Adjustment
If a top side adjustment is needed, do so with a Parent Input Workflow Profile either through a
journal or form. Both will update the AdjInput Member in the Origin Dimension.
Review
These Workflow Profiles do not take input, but are meant for reviewing, confirming and
certifying results from the lower input Workflow Profiles. Since the Review Workflow Profiles
cannot load data, the only tasks available for a Review Workflow Profile are (Process, Certify),
(Process, Confirm, Certify)
See Workflow Tasks in "Using OnePlace Workflow" on page 1003 of the Reference Guide for
more details on each of these task types.
Profile Properties
The first tab is the primary configuration tab and is available for all types of profiles. It is where
security and other objects are attached.
General
Name
Name of profile.
Description
Brief description of profile.
Security
Access Group
Controls the user or users that will have access to the Workflow Profile at run time to view results.
Maintenance Group
Controls the user or users that will have access to maintain and administer the Workflow Profile.
A user must be a Member of the Access and Workflow Execution Group to perform the Lock
action for a Workflow Profile. If a user has access to the Lock action, then they also have access
to the Lock Descendants action for all Workflow Profiles below it in the hierarchy. This will happen
even if the user does not have Membership in the same security settings in the descendants’
profiles.
NOTE: Click and begin typing the name of the Security Group in the blank field. As the
first few letters are typed, the Groups are filtered making it easier to find and select the
appropriate Group. Once the Group is selected, press CTRL and double-click to enter
the correct name into the appropriate field.
Workflow Settings
In this section the administrator can assign a Workflow Channel and a Workflow name. It is
important to understand the tasks of each input type in order to properly assign a Workflow name.
Workflow Channel
This option ties to the Workflow Channels found under the Application Tab/Workflow/Workflow
Channels. This allows for an additional layer of security that is defined through the Dimension Library (e.g., Accounts or UD1 – UD8). Click the drop-down to display all configured Workflow
Channels. By default, this is set to Standard. This is an option on each load channel (such as
Import, Forms, and Adjustments).
Workflow Name
The Workflow name controls the tasks the users need to complete in the Workflow. There are a
variety of combinations based on the type of Workflow being designed. These tasks can vary by
Scenario and Input Type. For example, the Workflow Name Import, Validate, Load is set for the
Houston Workflow Profile Import Input Type for the Actual Scenario.
When working in the Houston Workflow and loading data in the Actual Scenario, complete the
following tasks:
See Workflow Tasks in OnePlace Workflow for more details on these tasks.
TIP: The next three configurations are for configuring Workflow Profiles that may load as
MTD. The entire Workflow for the year must be loaded consistently for Zero No Data.
Matching Parameters
Click on the ellipsis button in this field to configure the matching settings.
TIP: Make sure the (Unassigned) box is unchecked, otherwise the fields will not be
available to update.
Currency Filter
Enter the reporting currency.
View Filter
Enter how the Intercompany data should display in the Workflow (such as V#YTD or V#Periodic).
Suppress Matches
Check this to apply suppression.
Matching Tolerance
If the matching tolerance must be on the penny, leave this at 0.0, otherwise add a tolerance
threshold for the offset amounts.
Entity Filter
The Workflow Entities associated with the Intercompany matching. E#Root.WFProfileEntities automatically points to the Entities assigned to the Workflow.
Partner Filter
Enter a Member Filter specifying the Partner Entities.
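For example, a hypothetical matching configuration (all values illustrative) might be: Currency Filter = USD, View Filter = V#YTD, Suppress Matches = True, Matching Tolerance = 5.0, Entity Filter = E#Root.WFProfileEntities, Partner Filter = E#Root.Base.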
Calculation Definitions
The second tab is where the administrator assigns the Calculation Definitions for the Workflow
Profile. This will determine the type of calculation/consolidation that occurs when a user selects
Process Cube during the Workflow. Multiple Entities can be entered and ordered accordingly.
See Using Calculation Definitions in "Workflow" on page 131 in the Design Guide for more details.
Auto-assigning Entities can be done through (Assigned Entities) or (Load Entities). The
Confirmed check box controls which Entities are tested by Confirmation Rules.
TIP: By right clicking on any line item, a user can insert or delete a row, save, or export
data.
Entity Assignment
The third tab is only available when clicking on the Cube Root Workflow Profile. This is where the
actual Entity gets tied to the Workflow Profile. There are two sections to this tab:
Unassigned Entities
This is the search window to find Entities to attach to a profile. This only becomes enabled for the
data loading profiles. The search engine uses "contains" logic to find Entities. In other
words, type the whole word or part of the word and it will search through the complete list looking
for that combination of characters. More than one Entity may be chosen. Lastly, once an Entity
has been attached, it will no longer appear in the search window.
Central Input
For corporate configurations there is an option called Central Input. This can be used for
situations where corporate does the final adjustment after the data has been loaded by the sites.
The Workflow Channel can be configured with Central Input which will then display a grey check mark
for that Workflow Channel. This allows corporate to make updates and adjustments at the top
level because the Workflow owns the Entity. All activity is tracked in the audit history.
Workspace
OneStream can have a custom Workflow Profile that displays a Workspace style Dashboard.
Workflow Templates
Workflow Templates are useful when building a series of Base Input Workflow Profiles with similar
settings. Design a template as generic or customized as desired and then apply the template to
the new Base Input Workflow. On the Workflow Templates screen, there is a (Default) Template
that can be used or click to create a new one. Once the template is created, it will look similar
to a Base Input Workflow Profile and include the default input types.
l Disable input types by Scenario (e.g., disable Journals for the Budget Scenario if that input
type is not used)
The goal of the template is to make as many common changes and updates as possible which will
save time and clicks during the actual Workflow build. Once a template is completed, click to
navigate back to the Workflow Profile page.
NOTE: After a template is applied to a Base Input Workflow Profile, any changes made
to the existing input types (Import, Forms, Journals) cannot be applied to the Workflow
Profile. If a new Input Type is added to the template, this can be applied to an existing
Workflow Profile. From the Workflow Profile screen, select . Select the template
with new input types and the Workflow Profiles to which the changes will apply.
Confirmation Rules
Confirmation Rules are used as a control to check the validity of the processed data. The rules
can be setup to act as an error to the process or show a warning message. If the rule was setup
as an error and it failed within the process, the user would not be able to proceed within the
Workflow. The rules will process individually for each Entity associated with the Workflow Profile.
Description
A field for a more detailed description of the rule group.
Security
Access Group
This controls the user or users that have access to the rule group within the Workflow.
Maintenance Group
This controls the user or users that have access to maintain and administer the rule group.
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a
Confirmation Rule. Click and begin typing the name of the Security Group in the
blank field. As the first few letters are typed, the Groups are filtered making it easier to
find and select the desired Group. Once the Group is selected, click CTRL and Double
Click. This will enter the correct name into the appropriate field.
Settings
Scenario Type Name
The rule group can be made available to a specific Scenario Type or all Scenario Types.
Order
The order in which the rules will process and display within the group when the Workflow is
processed.
Rule Name
A name given to the rule. This name will be seen in the Workflow, so it is best to give it a
descriptive, purposeful name.
Frequency
This option will dictate how often the rule is required to run in the Workflow Profile.
Monthly
This runs the rules every month. If this is for a weekly application, they will run the last week of
each month.
Quarterly
This runs the rules every quarter, or four times a year.
Half Yearly
This runs the rules two times a year; once in June and December.
Yearly
This runs the rules once a year in December.
Member Filter
This turns on the Frequency Member Filter. Filters can then be defined in that section.
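For example, to require a rule only in September, a Frequency Member Filter such as T#WFYearM9 could be used (the same pattern shown later for Journal Frequency).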
Rule Text
A description of the rule. This will also be seen in the Workflow. The Rule Text should be a textual
description of the Rule Formula associated with this rule.
Action
A drop down list containing the options Warning (Pass) or Error (Fail). If the data being evaluated
does not pass the rule, these options dictate how to handle the problem. If a rule is associated
with the Warning action, a warning message will display to alert the user, but the process will not
stop, and the user will be allowed to Certify if there were no errors in other rules. If a rule is
associated with the Error action, an error message will display, and the rule will have failed. The
user will not be able to proceed further until all failures have been addressed and/or resolved. If
No Action is associated with the rule, the value for the given rule will just be displayed during
confirmation. This data can be used by the user for informational purposes.
Failure Message
This message will be displayed to the user if an error occurs. This field will only be used if the rule
is associated with the Error Action. The message should stipulate the error again and possible
options of what the user should look for or do to resolve it.
Warning Message
This message will be displayed to the user if the line item was associated with the Warning Error
Action. The warning will alert the user to the issue.
NOTE: Messages longer than 2000 characters will be truncated, with an ellipsis (…) as the last three characters.
TIP: In order to edit the Rule Formula, right click on the line item that needs to be edited
and select Edit Rule Formulas. This will open the Rule Formula Editor and the rule logic
can be created or maintained.
Click the highlighted check icon to ensure the formula compiled successfully.
The right-click feature also allows the user to insert or delete a row, save, and export data.
Confirmation Rule Profiles
Once the Confirmation Rule Groups are created, they are organized into Confirmation Rule Profiles. The Profiles are then assigned to Workflow Profiles. Choose the icon to create a
new Profile. To assign a Rule Group to a Profile, choose the icon. This allows the user to
select which Groups will be in the Profile.
Under the Rule Profile Settings, choose the Cube Name and Scenario Type where this Profile can
be viewed and used when designing a Workflow Profile. Assigning a Rule Profile to a Workflow
Profile is done in the Application Tab| Workflow Profiles| Data Quality Settings section.
Certification Questions
Certification Question maintenance is the area where the repository of questions is maintained.
The questions are answered by the assigned users and act as the certification to the data load
process. Once the Certification Question Groups are created, they are organized into
Certification Question Profiles. The Profiles are then assigned to Workflow Profiles. See
Confirmation Rule Profiles.
Description
The field for a more detailed description of the group.
Security
Access Group
This controls the user or users that have access to the rule group within the Workflow.
Maintenance Group
This controls the user or users that have access to maintain and administer the rule group.
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a
Certification Question. Click and begin typing the name of the Security Group in the
blank field. As the first few letters are typed, the Groups are filtered making it easier to
find and select the desired Group. Once the Group is selected, click CTRL and Double
Click. This will enter the correct name into the appropriate field.
Settings
Scenario Type Name
The rule group can be made available to a specific Scenario Type or to all Scenario Types.
Order
The order in which the questions will appear to the user.
Name
A descriptive name given to the question.
Category
A drop down list of question types or categories. Select the best option for the question type.
Risk Level
Assign a risk level to the question. This will dictate the importance of the question as it pertains to
being answered correctly.
Frequency
This option will dictate how often the question is required to be answered and when it will appear
to the user.
Monthly
This displays the questions every month. If this is for a weekly application, they will display the
last week of each month.
Quarterly
This displays the questions for every quarter, or four times a year.
Half Yearly
This displays the questions two times a year; once in June and December.
Yearly
This displays the questions once a year in December.
Member Filter
This turns on the Frequency Member Filter. Filters can then be defined in that section.
Question Text
This is the question the user will answer. The question should be phrased to elicit a Yes or No
response. The user is given a field to explain his/her answer in free text.
Response Optional
Check this box in order to make the question optional.
Deactivated
This deactivates the question so it no longer appears to the user. All historical
responses will be preserved if the question is not deleted.
Deactivated Date
Select the date the question was or is to be deactivated.
TIP: By right clicking on any line item, a user can insert or delete a row, save, or export
data.
Data Collection
Data Sources are built to act as a blueprint for the types of imports that need to be done and to
define how the data should be parsed and imported. These include fixed files and delimited files,
which use connectors to pull the data directly from the source. Data sources also have
dimensions associated with them along with specific properties. In this section you will learn about
these data sources and how to leverage them to build your data collection.
Data Sources
Data sources are blueprints of the types of imports required and define how to parse and import
data. Data sources can be a fixed width file, a delimited file, a connector, or a Data Management Export sequence that pulls data from a source system. Once built, you can assign a data source to
one workflow profile or to many workflow profiles sharing a common file format.
Associate a source file with a data source by clicking in the upper toolbar. The file opens in
the top area of the screen and you can select fields and functions to build Data Source dimension
definitions.
Name
The name of the Data Source.
Description
The field for a detailed description of the Data Source
Security
The security properties are standard across all data types.
Access Group
Members of the assigned group have the authority to access the Data Source
Maintenance Group
Members of the assigned group have the authority to maintain the Data Source.
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered making it easier to find and select
the desired Group. Once the Group is selected, click CTRL and Double Click. This will
enter the correct name into the appropriate field.
Settings
The Settings are standard across all data types.
Cube Name
The Cube associated with this Data Source which will dictate the available Dimensions that can
be used.
Scenario Type
This allows the profile to be assigned to a specific Scenario Type or All Scenario Types. If the
Data Source is assigned to a specific Scenario Type, it will only be available when assigned to the
Workflow Profile.
Type
This defines the source file structure. The Type can be Fixed, Delimited, Connector, or Data
Management Export Sequence. Details on these types can be found below.
Tabular
This will have a line or lines specific to a single intersection with one amount.
Matrix
This will have multiple amounts on a given line using rows and columns to determine the
intersection that corresponds to each amount.
Fixed Files
Fixed files are in a columnar format with data in predefined columns.
Connector Settings
Connector Name
A drop down list of available Connector Type Business Rules. The rules will be built containing the
code to connect to and pull data from a given source system or database.
Delimited Files
Delimited files have the fields in the file separated by a common character.
Quote Character
In delimited files, the fields are often put in quotes in case the delimiter is also a valid character in
one of the field Members. This option specifies what quote character is being used in the source
file.
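For example, a hypothetical comma-delimited record using the double quote (") as the Quote Character might look like the line below, where the comma inside "Cash, Petty" is treated as part of the field rather than as a delimiter:
"1000","Houston","Cash, Petty","2013M12",1250.00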
Connector Settings
Connector Name
A drop down list of available Connector Type Business Rules. The rules will be built containing the
code to connect to and pull data from a given source system or database.
Connectors
Connectors pull data directly from a source system or database. See Connector Data Source in
"Collecting Data" on page 164 for more information.
Connector Settings
Connector Name
A drop down list of available Connector Type Business Rules. The rules will be built containing the
code to connect to and pull data from a given source system or database.
For more information, see "Business Rules" in "Application Tools" on page 779.
Source Dimensions
Each Data Source will have Source Dimensions assigned to it. The Source Dimensions are added
to the Data Source by clicking the button in the toolbar along the top. Once the button is
clicked, a dialog box will appear with a drop-down list of available Dimensions. These Dimensions
correspond with the Dimensions of the Cube assigned to the Data Source Properties.
Settings
Data Type
Not all data types will be available for every Source Dimension.
DataKey Text
This will read the value from the file as defined in the position settings.
Text
This will read the value from the file as defined in the position settings.
Stored Text
This will override the position settings and force the value to be a constant value for every line.
Matrix Text
This will have multiple amounts on a given line using rows and columns. This will determine the
intersection that corresponds with each amount when more than one column contains the same
Dimension.
Label
This will read the value from the file as defined in the position settings.
Stored Label
This will override the position settings and force the value to be a constant for every line.
Numeric
This defines the numeric amount field for the Data Source. This field will be read and stored as a
number, not as text or string.
Position Settings
Position Settings are the definition of where the Source Dimension will be found in the source file.
For both Fixed Width and Delimited Files, there are tools in the toolbar and an attached file that
assist in populating these values. Highlight the specific area to assign it to a Dimension (for a
Delimited File, it only needs to be a portion of the column). The highlight will appear in Red.
When the defined area is selected, click the button. This will commit the selection to the
Dimension and the corresponding values will be populated in either the Start Position and Length
fields for Fixed Width or the Column Number field for Delimited. To clear this selection without
A Fixed Data Source with a start position of 20 and a length of five will start with the 20th character
and include the next five characters.
Connector Settings
Source Field Name (Connector and Data Management Export Sequences Only)
Source field names will be provided by the Connector Business Rule assigned to the Connector
Data Source. These field names will either be explicitly listed out in the Business Rule, or
dynamically returned from a SQL query. Source field names for Data Management Sequences
are provided within OneStream and will always contain the same list.
Complex Expression
This selection is used when .NET scripting is needed for the Dimension, but not needed
elsewhere. The script used in the Complex Expression will only be available within that
Dimension.
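A minimal sketch of the kind of .NET logic a Complex Expression might contain is shown below. It assumes the expression exposes the parsed source field as args.Value and returns the text to use for the Dimension; both that naming and the "CC-" prefix being stripped are illustrative assumptions to verify against the OneStream API documentation.
'Assumes args.Value holds the parsed source field (verify the argument object in the API documentation)
Dim sourceValue As String = args.Value
If sourceValue.StartsWith("CC-", StringComparison.InvariantCultureIgnoreCase) Then
    'Remove the three-character "CC-" prefix before the value is used for this Dimension
    Return sourceValue.Substring(3)
End If
Return sourceValue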
Business Rule
This selection is used when .NET scripting is needed for the Dimension and the script is available
in the Business Rule Library.
Logical Expression
This is the name of the Business Rule assigned to the Dimension when Business Rule is selected
for Logical Operator.
Static Value
This is an override setting which allows a hardcoded value to be assigned to a Dimension rather
than being read from a file or Data Source.
Substitution Settings
Substitution Old Value (Find)
If the value entered in this field is encountered in the Dimension, it will be replaced with what is
entered in the Substitution New Value.
Matrix Settings
This setting is only available when Matrix is selected for the Data Structure Type.
Numeric Settings
These settings are only available in the Amount Source Dimension which will help with the
formatting and properties of the amount values.
Thousand Indicator
Enter the character used to separate thousands in the value. For example, for the value 1,000 the Thousand Indicator is ",". This can also be done by highlighting the character in the file and clicking the button.
Decimal Indicator
Enter the character used to separate decimals in the amount value. This can also be done by highlighting the character in the file and clicking the button.
Currency Indicator
Enter the currency symbol for the respective currency. This can also be done by highlighting the symbol in the file and clicking the button.
Factor Value
The amount being imported is factored by the value entered in this field.
Rounding
The available options for Rounding are Not Rounded and the values 1 – 10. Not Rounded will not
round the values. If a value between 1 and 10 is selected, the value will be rounded to the
corresponding digit.
Zero Suppression
If the import process should not include zero values, set this to True. To import 0 values, set this
to False.
Bypass Settings
Bypass allows an administrator to look for a specific value in a column or an entire line. If a value is found, that line will not be processed. In order to set up the Bypass Dimension, highlight the value to skip. Click the button to skip the value only if it is found in the exact position. Click the button to skip the value if it is found anywhere on the line.
Bypass Type
Contains at Position
This switch will tell the Data Source to skip an entire line of a file if the Bypass Value is found at the
specified location in the Position Settings section.
Bypass Value
The value defined will indicate an entire line should be skipped when found in a specific location,
or anywhere on the line.
TIP: Users can create a Bypass within a Fixed Data Source for blank spaces by
specifying the Position Settings and entering double square brackets around the
specified number of blank spaces in the Bypass Value. This can be used if an import is
going to encounter an area in the Data Source containing blank spaces in the location
specified.
Transformation Rules
Transformation Rules help map data from source systems to the financial model. The different
Member Script concepts and how they relate to different Transformation Rule types are described
below. These are listed in the order in which the rules run during the Validate step in Workflow.
Create Group
Use this to create a Transformation Rule Group under each Dimension
Create Profile
See Transformation Rule Profiles
General Settings
General
Name
The name of the Transformation Rule Group
Description
A short description of the rule or where it is used
Security
Access
Members of this group will have access to the Transformation Rule Group
Maintenance
Members of this group have the authority to maintain the Transformation Rule Group
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered making it easier to find and select
the desired Group. Once the Group is selected, click CTRL and Double Click. This will
enter the correct name into the appropriate field.
Settings
Cube Dimension Name
The specific Dimension to which the Rule Group is assigned.
Mapping Types
Mapping Types allow the data to be mapped in different ways with the possibility of using
conditional rules, wild cards, ranges and others.
One-to-One
One-to-One mapping allows one source Dimension Member to be mapped or transformed to one
target Dimension Member explicitly. No Member Scripts are used.
Source Value
This is the value for the related Cube Dimension in a defined data field.
Description (optional)
This is a description of the mapping rule.
Target Value
The Dimension Library Member to which the Source Value is being mapped or transformed.
Order (Optional)
The One-to-One mapping rules are processed in the default alphanumeric sort order. The Order field allows a value to be assigned to a record, enabling a custom sort order of the mapping table.
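For example (hypothetical values), a One-to-One rule with Source Value 11000 and Target Value CashOnHand maps every record whose source Account field is 11000 to the CashOnHand Account Member.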
Composite
Composite mapping is a way to do a map conditionally. Dimensional tags may be used to include
another Dimension in the evaluation.
A#[199?-???*]:E#[Texas]
In the example, any similarly formatted Account starting with 199 combined with an Entity of Texas would be
mapped to a specific Target Entity.
Range
A range mapping gives the upper limit and lower limit of a range. Any Member that falls within this
range will be mapped to the corresponding target Member.
If an administrator wants the range of source Accounts from 11202 to 11209 to map to Account
12000, then enter 11202~11209 under Rule Expression with ~ as the separator.
List
List mapping allows the user to create a delimited list of Members that all map to the same target.
If an administrator wants the list of Accounts 41137, 41139 and 41145 to map to Account 61000,
enter 41137;41139;41145 under Rule Expression with ; as the separator.
Mask
Wildcards are used in Mask mappings. The wildcard characters for these mappings are * and ?.
The * character is used to represent any number of characters. 27* would capture 270, 2709, or 27XX-009. The ? character acts as a placeholder for a single character. 27?? would capture 2709, but would not capture 27999 or 2700-101.
Double-sided wildcards can be used as well in Mask Transformation Rules. For example, *000*
would capture any account number with character(s) before and after the 000 sequence.
The following properties are standard for Composite, Range, List and Mask Mapping Types.
Rule Name
This is a unique name assigned to each mapping rule that will also determine the default sort and
processing order.
Description (Optional)
The description of the mapping rule.
Rule Expression
This is where the specific source field processing rule is placed. For example, 27* would capture
270, 2709, or 27XX-009.
Target Value
The Dimension Library Member to which the Source Value is being mapped or transformed. The
use of wild cards to define Target Values is not recommended. The following exceptions apply to
Target Value wild card usage:
l The * character is not supported when used as a prefix (left side) to a Target Value.
l The * character used as a suffix (right side) will yield the Source value.
Logical Operator
This provides the ability to extend a normal mapping rule with VB.NET scripting functionality.
Expression Type:
None (Default)
No script is assigned or employed for this related Transformation Rule.
Business Rule
This selection is used when .NET scripting is needed for the Dimension and the script is available
within the Business Rule Library.
Complex Expression
This selection is used when .NET scripting is needed for the Dimension but will not need to be
used elsewhere. The script used in the Complex Expression will only be available within that
Dimension.
Order (Optional)
The Order field allows a value to be assigned to a record which will allow a custom sort order of
the mapping table.
Derivative
Derivative Rules apply logic to Stage Data. The output of Derivative Rules is to generate
additional records in the Stage Environment. The two types of Derivative Rules are Source and
Target.
Rule Name
A unique name given to a particular Derivative Rule.
Description
A detailed description used in the Label field of the data imported.
l BlendUnit All
l BlendUnit Base
l BlendUnit Parent
Additional information on the use and design elements of BI Blend can be found in the BI Blend
Design and Reference Guide.
Rule Expression: A#[11*]=Cash
Expression Type: None
Derivative Type: Final
Notes: Accounts that start with 11 will be aggregated up to a new Account called Cash and stored in Stage.

Rule Expression: A#[12*]=AR
Expression Type: None
Derivative Type: Interim
Notes: Accounts that start with 12 will be aggregated up to a new Account called AR, but not stored.

Rule Expression: A#[1300-000;Cash]=CashNoCalc
Expression Type: None
Derivative Type: Interim (Exclude Calc)
Notes: The Derivative Account called Cash will not be included here because the calc is excluded. The CashNoCalc interim Account will be created as an aggregate of Account 1300-000, but not stored.

Rule Expression: A#[1310-000;Cash]=CashIncludeCalc
Expression Type: None
Derivative Type: Interim
Notes: The two Accounts (1310-000 and Cash) will be aggregated to equal the new Derivative Account called CashIncludeCalc because the calc is included. Note the use of the semicolon (;) as a list separator.
The following rules create additional rows in the Stage Area when importing data based on logic.
Rule Expression: A#[1000~1999]<<New_:E#[Tex*]=TX
Expression Type: None
Derivative Type: Applies to All Types
Notes: This rule creates a new row in the Stage for any Account that falls between 1000 and 1999 in the source data, but adds a prefix to it. Account 1010 will create a new row for Account New_1010. The end of the rule syntax shows each Entity name starting with "Tex" will be created as the Entity called TX in these new Stage rows.

Rule Expression: A#[2000~2999]>>_:Liability:U2#[*]=None:U3#[*]=None:U4#[*]=None
Expression Type: None
Derivative Type: Applies to All Types
Notes: This rule creates a new row in the Stage for every Account between 2000 and 2999 with a suffix. Account 2300 will come into a new row as 2300_Liability. The rest of the rule means all UD2, UD3 and UD4 Dimension Members will be set as the None Member.

Rule Expression: A#[3000~3999]@3:E#[Tex*]@1,1
Expression Type: None
Derivative Type: Applies to All Types
Notes: This rule takes the first three digits of each Account between 3000 and 3999 to create new rows in Stage. Each Entity starting with Tex will be shown as "T" since the @1,1 syntax starts at the first position of the string and looks one character to the right.
Logical Operator
Logical Operator provides the ability to extend a normal mapping rule with VB.Net scripting
functionality.
None
This is the default and no changes will be made.
Business Rule
A Business Rule will run on the resulting Derivative Member data. This Business Rule must have
Derivative as its type.
Complex Expression
Write a script here instead of a shared Business Rule and it will run against the resulting
Derivative Member’s data.
Multiply
This will multiply the resulting Derivative Member’s value by another specified value.
Divide
This will divide the resulting Derivative Member’s value by another specified value.
Add
This will add another specified value to the resulting Derivative Member’s value.
Subtract
This will subtract a specified value from the resulting Derivative Member’s value.
Create If > x
If the resulting Derivative Member’s value is greater than a specified value, it will be created.
Create If < x
If the resulting Derivative Member’s value is less than a specified value, it will be created.
Derivative Type
Derivative types determine if the resulting Derivative Member will be created, not created, or if the
Member will be calculated.
Interim
This will not be stored in the Stage area and cannot be mapped to a target Member. It can be
used within other subsequently run Derivative Rules.
Final
This will be stored in the Stage area and available to be mapped to a target Member.
Check Rule
This is a custom validation rule that uses the same syntax as Member Filters and can be applied
to the source data during the Validation task of the Workflow.
Order
The Order field allows a value to be assigned to a record which will allow a custom sort order of
the mapping table.
Lookup
This Transformation Rule type is very versatile in its configuration. It can be utilized as a lookup table for formulas and Business Rules, or as a simple lookup.
Transformation Rule Profiles
Once the Transformation Rule Groups are created, they are organized into Transformation Rule Profiles. The Profiles are then assigned to Workflow Profiles. Choose the icon to create a
new Profile. Choose the icon to assign a Rule Group to a Profile. This allows the user to
select which Groups will be in the Profile.
Under the Rule Profile Settings, choose the Cube Name and Scenario Type where this Profile can
be viewed and used when designing a Workflow Profile. Assigning a Rule Profile to a Workflow
Profile is done in the Application Tab > Workflow Profiles > Import > Integration Settings section.
Form Templates
Form Templates can be set up to allow manual data entry. The entries can be done in a Cube View, from an Excel file, or from the Spreadsheet feature (OneStream Windows App only). Each
Form Template Group has a specific Cube View, Dashboard or Excel File assigned to it. Forms
can also be loaded via an Excel or CSV template. See Loading Form Data in "Collecting Data" on
page 164 for more details on creating these templates.
Description
A short description of the Template Group such as how or where it is used
Security
Access Group
Members of this group have access to the Form Template Group
Maintenance Group
Members of this group have the authority to maintain the Form Template Group
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a Form
Template Group. Click and begin typing the name of the Security Group in the blank
field. As the first few letters are typed, the Groups are filtered making it easier to find and
select the desired Group. Once the Group is selected, click CTRL and Double Click.
This will enter the correct name into the appropriate field.
General
Name
Name of the Form Template
Description
This allows for a more descriptive definition of the Form Template
Form Type
Cube View
Select this to utilize a Cube View for the form’s data entry method.
Dashboard
Select this to utilize a Dashboard for the form’s data entry method.
Cube View/Dashboard
Select the Cube View or Dashboard that will be associated with this Form Template. Click
and begin typing the name of the Cube View or Dashboard in the blank field. As the first few
letters are typed, the names are filtered making it easier to find and select the one desired. If the
name is unknown, expand a Cube View or Dashboard Group and scroll through the list to select
the correct one. Once the Cube View or Dashboard is selected, click CTRL and Double Click.
This will enter the correct name into the appropriate field.
An Excel file may be used for data entry when the Excel File option is selected. Click the button to upload an Excel file. Once selected, the file name will appear in the Excel File space. Click the button to delete an uploaded Excel file, the button to download a copy of the uploaded Excel file, or the button to open the uploaded Excel file.
Workflow
Form Requirement Level
Not Used
This setting is used if the Form is no longer in use and shows the form as Deprecated in the
Workflow.
Optional
This setting will allow the user to enter data via the Form if desired.
Required
This setting will make the Form mandatory for any process to which it is assigned
Form Frequency
Monthly
This allows the form to display every month. If this is for a weekly application, they will display the
last week of each month.
Quarterly
This allows the form to display every quarter, or four times a year
Half Yearly
This allows the form to display two times a year; once in June and December
Yearly
This allows the form to display once a year in December
Member Filter
This turns on the Frequency Member Filter. Filters can then be defined in that section.
See Applying Literal Value Parameters to Form Templates in "Collecting Data" on page 164 for
more details on the Literal Parameter Value feature.
Literal Value
The Value will be hard coded.
Input Value
This allows the user to enter or change the value.
Delimited List
This provides a distinct list of values populated in the Parameter Type.
Member List
This will produce a flat list of Members to show to the user.
Member Dialog
Similar to Member list, this allows the user to select a Member, but through a pop-up Member
selection dialog which also has search capabilities. This is more appropriate for a Dimension such
as Accounts or Entities where the user can choose a base or Parent Member by traversing a
hierarchy.
User Prompt
This prompts the user based on the question entered here.
Default Type
Set the default value so it is not blank.
Dimension Type
Choose the Dimension being prompted
Member Filter
Allows the ability to put in a defined filter. (e.g., E#Root.WFProfileEntities)
Form Template Profiles
Once the Form Template Groups are created, they are organized into Form Template Profiles. The Profiles are then assigned to Workflow Profiles. Choose the icon to create a new Profile.
Choose the icon to assign a Group to a Profile. This allows the user to select which Groups
will be in the Profile.
Assigning a Profile to a Workflow Profile is done in the Application Tab| Workflow Profiles| Form
Settings, or Journal Settings for Journal Template Profiles.
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a Form
Template Profile. When assigning Security Groups to Form Template Profiles or
Journal Template Profile, click and begin typing the name of the Security Group in
the blank field. As the first few letters are typed, the Groups are filtered making it easier
to find and select the desired Group. Once the Group is selected, click CTRL and
Double Click. This will enter the correct name into the appropriate field.
Journal Templates
Create pre-set Journal Templates in Journal Template Groups. Once the Journal Template
Groups are created, they are organized into Journal Template Profiles and then assigned to
Workflow Profiles. For more details on Profiles, see Form Template Profiles.
Journal details can also be loaded via an Excel or CSV template. See Loading Journal Data in
"Collecting Data" on page 164 for more details on creating these templates.
Description
A short description of the Template Group, such as how or where it is used.
Security
Access Group
Members of this group have access to the Journal Template Group.
Maintenance Group
Members of this group have the authority to maintain the Journal Template Group.
NOTE: Click in order to navigate to the Security screen. This is useful when
changes need to be made to a Security User or Group before assigning it to a Journal
Template Group. Click and begin typing the name of the Security Group in the blank
field. As the first few letters are typed, the Groups are filtered making it easier to find and
select the desired Group. Once the Group is selected, click CTRL and Double Click.
This will enter the correct name into the appropriate field.
General
Name
Name of the journal template.
Description
A detailed description of the journal template.
Journal Template
Journal Frequency
All Time Periods
This allows the journal to display every period.
Monthly
This allows the journal to display every month. If this is for a weekly application, it will display the
last week of each month.
Quarterly
This allows the journal to display every quarter, or four times a year.
Half Yearly
This allows the journal to display twice a year, in June and December.
Yearly
This allows the journal to display once a year in December.
Member Filter
This turns on the Frequency Member Filter and then the filters can be defined in that section. For
example, if this journal only needs to be completed in September, use the filter T#WFYearM9.
The following settings control what the user can modify when creating an instance of a Journal.
Settings are True or False.
Journal
Journal Type
The options for Journal Type are Standard, Auto Reversing, or Allocation. When posting an Auto
Reversing Journal, the auto reversal journal is automatically created in the next time period and
set to the Approved state. The Auto Reversal Journal has all the debits and credits reversed.
When un-posting an Auto Reversing Journal, check to make sure the auto reversal is not posted
first. If it is not, delete the Auto Reversal Journal from the next time period and un-post. An
Allocation Journal can be set up to perform simple or more intricate allocations, such as creating
the weighting of the allocation, previewing the actual allocation entries, and un-posting them if
they need to be run again.
Balance by Entity
Debits and Credits in a multi-Entity Journal must balance for each Entity.
Unbalanced
Balance check will not be performed. This is normally used for one-sided journals.
Is Single Entity
If True, the Entity name is entered in the Journal POV and all Journal Lines relate to this one
Entity. If False, the Cube, Entity and Parent columns must be filled out for every line in the Journal
instance.
Point of View
In order to limit the amount of setup for every Journal Line, the items that remain constant (e.g.
Flow = None) can be set in the Journal POV instead of in every line Item.
TIP: A Journal Template can be repeated on a regular basis if values are placed in the
journal lines and journal’s settings require repeating upon a certain frequency.
Once a Book is saved, it can be viewed and utilized in different areas of OneStream. A Book is
stored as a file, so if it is saved in the OneStream File Share, it can be viewed from OneStream’s
File Explorer. PDF Books can be used within other Books and can be viewed in Dashboards by
using the Book Viewer or File Viewer Dashboard Components (See Dashboards for more
information on these Dashboard Components). Another way to view Report Books is by running a
Data Management Sequence. Attach a Report Book file to an Export File Step and the Book will
be processed and exported to the OneStream File Share. (See Data Management in "Application
Tools" on page 779 for more details on this feature.)
NOTE: Embedding fonts in a PDF Report Book significantly increases the size of the
PDF file. Use the PDF Embedded Fonts to Remove property in the Application Server
Configuration File to specify the fonts to not embed. This will reduce the size of PDF
files and control the resolution during Report Book PDF generation. This property is for
Report Books only.
The default setting is: Arial; Calibri; Segoe UI; Tahoma; Times New Roman; Verdana.
Open Book
Click the arrow and select the location of a saved book.
Close Book
Use this to close a previously saved book.
Save As
Use this to save a new book by clicking on the arrow and selecting a location.
When saving a Report Book, specify the Book Type in the New File Name field. Report Books can
be saved using the following:
Excel Books
ReportBookName.xfDoc.xlBook
PDF Books
ReportBookName.xfDoc.pdfBook
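For example, a Report Book named Q1Close (a hypothetical name used here only to illustrate the pattern) would be saved as Q1Close.xfDoc.xlBook for an Excel Book or Q1Close.xfDoc.pdfBook for a PDF Book.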
Add Item
Click the arrow and select a Book Item to add.
See below for detailed descriptions of each Book Item.
Remove Item
Use this to remove the selected Book Item.
Object Lookup
See Object Lookup.
Book Properties
Determine Parameters from Content
This automatically determines the required input Parameters from the Book’s content items. For
large books, a performance gain can be achieved by setting this to False and manually specifying
all the required Parameter names in the Required Input Parameters property.
File
File Source Type
URL
Display a file from an internal or external web page.
Output Name
Enter a name to be used as the File’s name when used in a Zip Book. If the .pdf extension is
included for a Word or Excel file, the File will be converted into a PDF. This property is optional
and does not apply to PDF Books.
Output Name
Enter a name for the Cube View which will display on the Excel worksheet tab. This is optional.
Maximum characters are 31.
Report
Report Type
Cube View
Select this if the report is deriving from a Cube View.
Dashboard Chart
Select this if the report is deriving from an Application Dashboard Chart.
Dashboard Report
Select this if the report is deriving from an Application Dashboard Report Component.
Output Name
Enter a name to be used as the Report’s file name when used in a Zip Book. This property is
optional and does not apply to PDF Books.
Loop
A Loop is a sequence of instructions that will continually run a process as many times as is
defined in the Loop Definition. For example, a Book can be set up to loop through all the Base
Entities under a particular hierarchy and generate an instance of the same Cube View Report for
each Entity.
Loop Type
Dashboard Parameter
Select this to use a pre-configured Dashboard Parameter.
Member Filter
Select this to use specific Dimension Members.
Loop Definition
Example
(Houston, Clubs, [Houston Heights])
Example
The ParamSalesRegion Parameter returns a list of all Sales Regions within the application
resulting in the Book’s Loop Variables using the same list.
Example
E#Frankfurt, E#Houston, E#Montreal
This Loops over each Entity and performs the process three times.
E#[NA Clubs].Base
This Loops over each Base Entity under NA Clubs and performs the process for however many
Base Entities there are.
Loop Variables
|Loop1-4Variable|
This allows all Book items located under the Loop’s hierarchy to reference the Loop Definition’s
values by name. Use |Loop2Variable| through |Loop4Variable| to create nested Loops within a
Loop.
|LoopDisplay1-4Variable|
This allows all Book items located under the Loop’s hierarchy to reference the Loop Definition’s
values when using a Delimited List Dashboard Parameter Loop Type.
|Loop1-4Index|
This assigns a number to the values in the Loop Definition, beginning with the number one, which
can be referenced in the Book items in the Loop’s hierarchy. Use |Loop2Index| through
|Loop4Index| to create nested Loops within a Loop.
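For illustration (a sketch assuming the Member Filter Loop Type examples above), a Report title or Member Script inside the Loop hierarchy could reference the current Entity by name:
Example: Income Statement for |Loop1Variable|
Example: E#[|Loop1Variable|]:A#Sales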
If Statement
If Statements determine how Book items within the hierarchy are processed.
Statement
Enter a conditional statement using Parameters to determine whether the Book items will be
processed.
If Frankfurt is in the Loop Definition for the Loop1Variable, the Book items in this hierarchy will
process for Frankfurt.
If the person running the Book has a user name of Administrator, the Book items in this hierarchy
will be included in the Book.
If the person running the Book has a user name of Administrator or JSmith, the Book items in this
hierarchy will be included in the Book.
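As a hedged illustration of the syntax (following the (|Parameter|=[Value]) form shown in the Else example later in this section, and assuming |UserName| is available as a substitution variable), the first two cases above might be written as:
(|Loop1Variable| = [Frankfurt])
(|UserName| = [Administrator])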
Else/Else If Statement
Else and Else If Statements determine how Book items within the hierarchy are processed. An If
Statement is needed in order to use an Else or Else If Statement.
Statement
Enter a conditional statement using Parameters to determine whether the child items will be
processed. See examples under If Statement in the previous section.
Example:
If Statement: (|Loop1Variable|=[Frankfurt])
If Loop1Variable is Frankfurt, run the Report and File located under the If Statement.
Else
Else (In all other cases) run the Report and File located under the Else Statement.
Change Parameters
This section allows changes to be made to the output of a Cube View being used in a Report
Book. A Loop must be used in order to use Change Parameters. When a Change Parameters
type is encountered in a Loop, the Loop Variable is updated to use the next Loop Variable.
For example, a Loop may be used to loop through a list of Entities and run a Report for each. The
Change Parameters Book Item should be added here to pass along the appropriate Loop
Variable (e.g. |Loop2Variable|) which applies to each Entity included in the Loop.
Change Workflow
Set to True in order to change settings for Cube Views or Reports driven by Workflow information.
Workflow Profile
Specify the Workflow Profile name to replace the original Workflow Profile referenced in the Cube
View or Report.
Workflow Scenario
Specify the Workflow Profile Scenario Type to replace the original Workflow Profile Scenario Type
referenced in the Cube View or Report.
Workflow Time
Specify the Workflow Profile Time to replace the original Workflow Profile Time referenced in the
Cube View or Report.
Change POV
Set to True in order to change the Book’s POV without having to change the actual Cube View or
Report.
NOTE: In order to use this feature, the POV tab in the Cube View cannot be completed.
Member Script
Click the ellipsis button in order to launch the Member Filter Builder and enter a Member
Script to change the POV. The example below is changing the POV for both the Entity and
Account.
Example: E#[|Loop1Variable|]:A#Sales
Change Variables
This serves as a placeholder that can store up to ten variables. This is valuable when If
Statements are used.
Variable Values
Enter a comma separated list of name value pairs to change the values of the predefined
variables named Variable 1-10.
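For example (hypothetical values, assuming a name=value form for each pair): Variable1=Budget, Variable2=2024M3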
Change Parameters
This allows a user to specify a value for a custom Parameter.
Parameter Values
Click the ellipsis button in order to select the desired Parameter. Enter a comma separated
list of name value pairs to override the custom Parameter’s values.
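For example (hypothetical Parameter names, assuming a name=value form for each pair): ParamScenario=Budget, ParamEntity=Houston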
This displays what page is currently being previewed in the Report Book. To navigate to a specific
page, enter the page number and click Enter.
/ First/Last
Use these buttons to navigate to the first or last page of the Report Book.
Previous/Next
Use these buttons to navigate forward or backward one page.
This combines the Report Book’s pages and treats them as one content item. Use this feature in
conjunction with saving or printing an entire Report Book. If this box is checked, a user can save
the entire Report Book as one PDF file, print the entire book, and navigate page by page using the
blue navigation arrows. If this box is unchecked, only the current page will save or print, and black
navigation icons will be enabled.
Refresh
Use this button to refresh the Report Book and select new Parameter values.
Close
Use this to close the Report Book in the Preview Screen.
Open
Use this to open a Report Book from a PC’s desktop or folder.
Save
Use this to save the current Report Book Page. To save the entire Report Book as one PDF file,
ensure the Combine All Items check box is checked.
Print
Use this to print the current Report Book page. In order to print the entire Report Book, ensure the
Combine All Items check box is checked.
Pan
Use this button to scroll through a Report Book by clicking anywhere on the screen and moving
the mouse.
Text Selection
Use this to select portions of the Report Book. This feature allows users to copy and paste.
Highlight a portion of the Report Book, select Ctrl+C and then Ctrl+V.
Use this to find specific keywords in a Report Book. Type the word in the Find Filter and click
Enter. Use the navigation arrows to Find Previous and Find Next.
Cube Views
A Cube View is used to query Cube data and present it to the user in a variety of ways.
Create Group
A Cube View Group is where all Cube Views are organized. This is the first step to create new
Cube Views.
Create Profile
Use this to create Cube View Profiles, which are where Cube View Groups are organized.
Save
Use this to save any changes.
Search
Use this to search for Cube Views.
Move Up
Use this to move the selected row or column up.
Move Down
Use this to move the selected row or column down.
Object Lookup
See Object Lookup.
Navigate to Security
This icon appears in the Cube View Group and Profile Security properties and when clicked it
navigates to the Security screen. This is an easy way to make changes to Security Users or
Groups before assigning them to specific Cube View Groups and Profiles.
l First Page: The user can always navigate directly to the first page.
l Percentage: The percentage displays how much of the report has been rendered up to the
current page.
l Evaluation 3 – Paging: Once Paging begins, the Cube View evaluates the rows attempting
to return a minimum of 20, to a maximum of 2000 un-suppressed rows. In the case of
nested dimensions on rows, the evaluation starts on the left most dimension expansion, as
defined in the Cube View. After a maximum processing time of 20 seconds, the first page of
the Cube View will be returned for display containing only the rows which completed
processing during the time constraint. For this reason, Cube View pages are not a fixed
number of rows. The rows are ultimately determined by their time to process. This also
relates directly to the Percentage display, as each page is generated by processing time
requirements, and the last page is not known while the Cube View is executing. Therefore,
this percentage is not intended to be a precise measurement.
When a row is defined with nested Dimensions, the Paging evaluation is performed on the left-most dimension, this being Account in the example below. For each expansion of the left-most dimension (e.g. Net Sales, Operating Sales, Returns and Allowances), the Paging will not progress to the next sibling until all the records from all the other nested dimension expansions have been returned. Therefore, in the example, all the records for Net Sales, by the base UD2 Products, will be evaluated and returned first. Once all the intersections of Net Sales are known, the Paging evaluation will restart on the next sibling member of the left-most Dimension, Operating Sales in the example.
The purpose of Cube View Paging is to protect the server from “runaway” Cube Views, that is, Cube Views processing very large numbers of cell intersections. Paging also returns data to the user in a more timely manner, for example during large suppression operations.
You can customize a specific Cube View’s Paging to tailor how its rows are presented. Paging looks at the definition of the Cube View rows and determines how many rows will be generated. If the result exceeds the limit (for example, 2000 rows), multiple pages are created. However, these settings should be used with care, as they can affect the performance of both the Cube View and the application.
l Number of rows in the Cube View: how many rows are written before paging starts.
l Time to run: for a large Cube View, processing stops after 20 seconds and paging starts at that point. Depending on how many rows can be generated in 20 seconds, the first page of a 2000 or 5000 row Cube View may show only 5 rows.
The options in the Paging section allow you to tailor this behavior for a specific Cube View.
4. Scroll to the Paging section and change either of the following options:
The application will never split up a hierarchy in the nested expansions. Assume you
have Services, and within it are IT, Marketing, and Sales hierarchies as parents. If
you enter 20 for Max Seconds To Process, but only half of Sales can be output
within those 20 seconds, the application will give you IT and Marketing on page 1 and
Sales on page 2.
Also, if there are members you want to display that take a long time to run, you can
adjust the seconds to display more members within the pages, but keep in mind that
you will need to wait longer for the first page to display.
3. If you do not want gridlines to show in Excel, set Display Gridlines to False.
Description
A short description of how the Group is used, or what it contains.
Security
Access
Members of this group can access the Cube Views within the Cube View Group.
Maintenance
Members of this group have the authority to maintain the Cube Views within the Cube View Group.
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered making it easier to find and select
the desired Group. Once the Group is selected, click CTRL and Double Click. This will
enter the correct name into the appropriate field.
Advanced Tab
The Advanced Tab uses the same Cube View properties found in the Designer Tab; however,
each property’s location is organized differently. If a user is working in the Advanced Tab, refer to
the information below in order to locate the desired Cube View property.
Name
Description
Is Visible In Profile
Is Shortcut
Shortcut Cube View Name
Literal Parameter Values
Can Modify Data
Can Calculate/Translate/Consolidate
Include Cell Attachment Status
Page Caption
Default Header/Cell Format
Allow Sparse Row Suppression
Column/Row Sharing
Paper
Landscape
Margin Left, Top, Right, Bottom
Auto Fit to Page Width
Title/Subtitles
Footer
Advanced Tab|Headers
The following properties are located under the Headers Tab under the Advanced Tab.
Column/Row Name
Can Modify Data
Column/Rows Sharing
Header Format
Cell Format
List Parameter
Enable Report Navigation Link (Rows Tab Only)
Dashboard to Open in Dialog (Rows Tab Only)
Primary Dimension Type
Member Filter
Indent Level
Column/Row Suppression
Column/Row Overrides
POV
Choose a single Member for each Dimension listed on the Point of View Slider, which will then be locked in for this Cube View. Once selected, users cannot change that Member at run time. If a Cube View is needed to enter data for the Budget Scenario only, select Budget for the Scenario Dimension. If a value is not selected, the Member selected in the Application POV will be used.
NOTE: The default Time Dimension Profile for Cube Views is determined by the Cube
POV setting.
TIP: Use the drag and drop capabilities to drag the Cube POV from the POV context
pane to the POV Slider in the Cube View Designer. This will populate each field with the
current POV and allow the user to make changes as necessary.
TIP: When designing a Cube View as the basis for a Form Template, choose base-level
Members in the Point of View. Data input at aggregated Member intersections is not
allowed. If the Entity Default settings are being used (see the section on the Entity Dimension)
for User Defined Dimensions, enter the value EntityDefault for the Member name and it
will be found in the Cube View.
General Settings
Sharing
An administrator can share either all rows / columns or just specified rows / columns from another
Cube View. This sharing can save the user a significant amount of time and provide efficiencies
and uniformity when building and managing Cube Views.
All Columns/Rows
If this option is chosen, enter the Cube View name(s) from where the rows/columns will be shared.
Specified Columns/Rows
If this option is chosen, a single column/row from the source Cube View can be referenced when
designing columns/rows. Refer to the Sharing Tab under the Rows and Columns Slider to specify
the Cube View name and its row/column name.
Common
Is Visible in Profiles
When this property is set to False, it allows the ability to create Cube Views that are only used as a
row or column source for other Cube Views. In these cases, the Cube View does not need to be
assigned to a Profile, Excel, Workflow, Dashboards or OnePlace.
Page Caption
The Page Caption appears at the top of the Data Explorer Grid when viewing the output of a Cube
View. If this is left blank, it will get its value from the Cube View Description. If that is blank, it will
display the Cube View Name.
Is Shortcut
This determines whether the Cube View is a shortcut. Settings are True or False. See Cube
Views in "Presenting Data With Extensible Documents" on page 235 for more details on using
Cube Views as shortcuts.
Shortcut Cube View Name
This property is used when the Cube View is a Shortcut. Click and begin typing the name of the Cube View in the blank field. As the first few
letters are typed, the names are filtered making it easier to find and select the one desired. If the
name is unknown, expand a Cube View Group and scroll through the list to select the correct one.
Once the Cube View is selected, click CTRL and Double Click. This will enter the correct name
into the appropriate field.
Allow Sparse Row Suppression
The default setting is False, which does not allow Sparse Row Suppression. If set to True, Sparse
Row Suppression will be turned on. See Using Sparse Filtering In Cube Views in "Presenting
Data With Extensible Documents" on page 235 for more information on this feature.
Header Text
By default, a Cube View with standard row and column headers to represent rows (e.g. Accounts)
and columns (e.g. Time) is shown. All the Header settings control the row and column headers
presented when the Cube View is run if the default is not desired.
For each Dimension, determine whether the Name, Description, or Name and Description
(default) will be displayed when the Cube View is run.
Header Size
Column Header Heights
Column Header Heights allow column headers to align with the bottom of a report. The default
setting is -1, so for bottom alignment to work in column headers, the height of the column header
needs to be explicitly set. This is helpful for wrapped headers, which may cause the columns to
expand.
NOTE: If a Cube View Row Header width is not specified, the system will loop over all
the expanded headers and use their text and font to determine the row header width
automatically. The automatic row width maximum is a half-page. Any text longer than a
half page is wrapped.
Header Overrides
By default, the Dimensions shown for row and column headers are based on the Member
expansions specified in the Rows and Columns Slider. The Dimensions to show for row
and column headers can also be selected manually.
The Report Column Index for Row Headers can be used on a report to display the row headers
after a specified column instead of on the left-hand side of the page. The default value is -1, but setting
this to a positive number will change the Cube View column index. Specifying a number
greater than the existing number of columns will revert the row headers back to the left-hand side
of the report. Hidden columns are not included in the column count.
TIP: If a Cube View report exceeds the width of the page, a page break will automatically be
inserted at the appropriate column and the row headers will be repeated on the following
page. If Report Column Index for Row Headers or Auto Fit to Page Width is enabled,
paging to the right will not replicate the row headers.
Report
Custom Report Task
Advanced formatting can be applied to Cube Views in the form of a Cube View Extender Business
Rule or an Inline Formula. See Cube View Extender under Business Rules in "Application Tools"
on page 779 for more details on advanced formatting options for Cube Views.
No Task
Select this if there are no Cube View Business Rules or Inline Formulas running on the report.
Business Rule
Click the ellipsis and select the desired Cube View Extender Business Rule. This property is only
enabled when Execute Cube View Extender Business Rule is selected in the Custom Report Task
property.
Formula
Click the ellipsis to include an inline formula for advanced formatting. This property is only enabled
when Execute Cube View Extender Inline Formula is selected in the Custom Report Task
property.
NOTE: An inline formula operates the same as a Cube View Extender Business Rule,
but is attached to the specific Cube View and cannot be shared with other Cube Views.
Extender Business Rules can be shared amongst several Cube Views. Inline formulas
are automatically included when extracting Cube Views. Business Rules will need to be
selected separately in order to include them in the extract.
See Cube View Extender: Advanced Cube View Formatting in "Implementing Security" on
page 322 for examples on how to use this rule.
Paper
Specify a paper size for the printed report. Some examples of this are Legal, Letter, or A4. The
Default setting is Letter.
Landscape
When set to True, the resulting Data Explorer Report will be viewed in Landscape; if set to False, it
will be viewed in Portrait.
Navigation Links
Linked Cube Views
Enter a comma separated list of Cube Views that will be available for the user to open when
viewing a Cube View in the Data Explorer Grid. The Cube Views specified in this field will apply to
the entire Cube View and be available when a user right-clicks on any Cube View data cell. In
order to assign Linked Cube Views to a specific row or column, see Rows and Columns later in
the Cube View section. See Linked Cube Views under Cube Views in "Presenting Data With
Extensible Documents" on page 235 for more details on how to configure this feature.
Report Header
1. Title
This overrides the Page Caption, Description and Name properties of the Cube View. In the
example above, the entry was set as Income Statement for |CVEntity|
2. Formatting Icons
These formatting icons contain drop down menus of the most common Cube View
formatting properties. See Cube View Formatting for more details on this feature.
3. Default Formatting
Select the Default cell in order to create default formatting for the entire Cube View’s
Headers and Cells. See Cube View Formatting for more details on this feature.
4. Columns
Select a specific Column in order to make changes.
5. Rows
Select a specific Row in order to make changes.
6. Row/Column Cells
Select a cell in order to make changes.
7. Tabs
The tabs available depend on the Row/Column selection.
Member Filters
A maximum of four different Dimensions can be nested as rows and two different Dimensions can
be nested as columns in a Cube View.
NOTE: When nesting Cube View columns, if the top column header is longer it will span
the nested column headers when displayed on a Cube View Report.
Dimension Type
Select the desired Dimension Type from the drop-down list. This indicates which Dimension Type
will display on the selected row or column.
Member Filter
Enter a Member Filter to express the specific Members needed from the Dimension.
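For example, reusing the filter shown in the Loop Definition examples earlier, E#[NA Clubs].Base would display every Base Entity under NA Clubs on the selected row or column.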
Formatting
Data
Cell Type
Text Box
This makes the Cube View cell numerical. Only numerical values can be entered in the data cell
at run time. This is the Default setting.
Combo Box
Select this when a List Parameter is used in a Cube View data cell. Enter the name of the List
Parameter in the property below.
Date
This enables a calendar in the Cube View cell and allows users to select a specific date in the data
cell.
Date Time
This enables a calendar and a time display and allows users to select a specific date and time in a
data cell.
NOTE: Selecting Now in the calendar enters the current date or current date and time.
List Parameter
Enter a Parameter here to create a drop-down menu of choices in a Cube View data cell. See List
Parameters in "Using Parameters" on page 220 for more details on this function.
See Parameters in the Application Dashboards section for more information on Parameters and
their functionality.
NOTE: Row settings override column settings which override Cube View settings.
Suppression
These settings help suppress columns/rows that contain invalid data (red cells), no data, or
zeroes. This rule can optionally be applied to Parent rows that have children which adds a slight
amount of processing time.
Sparse Row Suppression is available on the Rows and Columns Slider|Data Tab when a column
cell is selected. This feature provides significant performance improvements for Cube Views
using multiple nested row Dimensions. Data Unit evaluations are performed to query all
intersections of actual data instead of inspecting each row individually to see whether they carry
data. The default setting is True. If the setting is False for any column, then Sparse Row
Suppression cannot be used for the entire Cube View. If a column is set to True (but determine
sparse rows using other columns), then Sparse Row Suppression is used, but that column’s Data
Units aren’t considered when evaluating data. Typically, this setting is used for columns that use
Dynamic Calc formulas or a GetDataCell script, and other columns are used for the Data Unit
queries. The Use to Determine Row Suppression Cube View property has been created to increase
performance on large Cube Views. This Column property allows the designer to better define
how to apply Row Suppression. In multi-column Cube Views, this setting can be used to
specify which Columns should be used to determine Row Suppression. Limiting this
evaluation to the smallest possible number of Columns will reduce the total number of cells which
are evaluated for suppression.
See Using Sparse Filtering In Cube Views in "Presenting Data With Extensible Documents" on
page 235.
The Allow Insert Suppressed Member feature is only available on Cube Views and through Form
Templates. It is not available in Excel or the Spreadsheet Tool.
You can enable the Allow Insert Suppressed Member feature with either of the following row expansion
settings:
l Innermost: Allow visibility of the row expansion that is at the bottom level.
l Nested: Allow visibility of the row that is directly beneath the top level.
Administrator Tasks
1. From the Application tab, under Presentation, click Cube Views.
2. Expand Cube View Groups, expand a specific cube view group and select a cube view.
4. Select a row.
User Tasks
1. From the OnePlace tab, select Workflow.
TIP: When you see an orange flag in the lower right corner of a cell, it indicates
that there is some action for you to do. In this case, when you right-click on a cell
with an orange flag you can then choose Insert Suppressed Member.
5. The Edit Row Expansions window opens. The areas that are active are editable.
8. Click the arrow to add one or more members to the Result List.
The Edit Row Expansions window now shows the new member that launches a new cube
view; the value has changed to the selected member.
9. Click OK.
The insert suppressed member, in this example, New Jersey, opens.
10. Enter data for the insert suppressed row member. In this example, we added 3,000.00 to
Jan 2018.
NOTE: The Insert Suppressed Member setting is active for the entire row.
12. If the feature is not available on a row, the option is grayed out.
14. Right-click the row to undo suppression and select Allow Insert Suppressed Member.
Sharing
Share Settings from Another Cube View
When this is set to True, a single row or column from another Cube View can be shared. The
Specified Columns/Rows property under the General Settings Slider|Sharing must be selected in
order to perform this function.
Row/Column Overrides
Four overrides can be applied against different columns or rows and can be used to change the
display for a given column or row. In this example, two columns, one for the Flash Scenario (Col1) and one for
Budget (Col2), are needed. If Row1 is being used by the Net Sales Account (e.g. A#60999), and it
needs to be compared to the Flash Net Sales which is a different account (e.g. A#FlashNetSales),
then a column override would be used.
Row/Column Range
Type a column name to display a different value in the Cube View. In this example, Col1 would be
typed in order to show a different value for Flash Scenario Column.
Member Filter
Type the Member Filter to override the value for the columns or rows. In this example, type
A#FlashNetSales to override the results from A#60999.
The override results will display the value from A#FlashNetSales in Col1 for the Flash Scenario.
Cell Format
See Cube View Formatting.
List Parameter
See List Parameter under Data earlier in this section.
Report Footer
Cube View Formatting
A Cube View’s formatting can be selected for the entire Cube View, specific rows or columns,
headers, and individual cells. The formatting options allow for the number formats, percentage
signs, scaling, currency symbols, colors, fonts, and font size to be specified and unique to the
administrator’s needs. For example, to make all the cells yellow in an entire Cube View, specify
that under the Default Cell Format to make one row bold, specify that in the Rows Cell Format, or
to specify different number formats and colors for positive and negative numbers in columns,
specify that in the Columns Cell Format.
In order to apply custom formatting beyond the Cube View Designer properties, users can apply
advanced formatting via a Cube View Extender Business Rule. See Cube View Extender under
Business Rules in "Application Tools" on page 779 for more details on advanced formatting
options for Cube Views.
When using the icon options in the formatting toolbar, note the following:
Check boxes with an orange box indicate the Use Default selection, a check mark indicates True, and an empty check box indicates False.
The available properties located in each toolbar icon are listed below. For more details on these
properties, see the Header Format and Cell Format sections below.
Find the following properties in the Text drop down menu when a Row or Column Header is
selected.
l Font
l Text Color
l Text Size
l Bold
l Italic
The following properties become available in the Text drop down menu when a Row or Column
Cell is selected.
l Number Format
l Scale
l Flip Sign
l Show Currency
Find the following properties in the Border drop down menu when a Row or Column Header is
selected.
l Background Color
l Gridline Color
Find the following properties in the Column drop down menu when a Column Header is selected.
l Is Column Visible
l Column Width
Excel
The green icons in the toolbar represent Excel formatting which is used when a Cube View is
displayed in an Excel spreadsheet.
Find the following properties in the Text drop down menu when a Column or Row Header is
selected.
l Text Color
l Horizontal/Vertical Alignment
l Indent Level
l Wrap
The following properties become available in the Text drop down menu when a Row or Column
Cell is selected.
l Number Format
l Use Scale
The following properties become available in the Border drop down menu when a Row or Column
Header or Cell is selected.
l Background Color
l Border Colors
l Border Lines
The following properties become available in the Column drop down menu when a Column
Header is selected.
Column Width
Reporting
The orange icons in the toolbar represent Report formatting which is used when a Cube View is
published into a polished report.
The following properties become available in the Text drop down menu when a Column or Row
Header is selected.
l Text Color
l Text Alignment
l Text Size
l Underline
The following properties become available in the Text drop down menu when a Column or Row
Cell is selected.
The following properties become available in the Border drop down menu when a Column or Row
Header is selected.
l Background Color
l Border Lines
The following properties become available in the Border drop down menu when a Column or Row
Cell is selected.
The following properties become available in the Lines drop down menu when a Row Header or
Cell is selected.
l Color
l Padding
l Thickness
The following properties become available in the Column drop down menu when a Column
Header is selected.
Column Width
Header Format
The following properties are for a Cube View’s Header formatting which includes Default, Row
and Column Headers. All these properties are found under the Rows and Columns Slider, but not
all properties are available for each Header.
In order to set Default settings, which apply to the entire Cube View’s Headers and Cells, select the Default cell.
All these properties can be found by clicking the icon located in the Cube View Formatting Tab.
TIP: Click to filter by column or row names. This is helpful when working with Cube
View reports with a large number of columns and rows.
General
Custom Parameters
Click the ellipsis in order to select and assign a custom Parameter such as a Cube View Style to
the Cube View Header. See Cube View Styles in "Using Parameters" on page 220 for more
details on this feature.
RowExpansionMode
Control expansion of nested rows by selecting Collapse All or Expand All. This will determine how
rows will be displayed in the Data Explorer grid. Use Default sets this property to False.
ShowDimensionImages
Set this to False to hide the Dimension icons in the Data Explorer Grid row and column headers.
The Default setting will display the Dimension icons.
IsColumnVisible
If set to False, this setting will hide specific columns. A clever use of this property is to set this at
run time with a Parameter in order to show and hide detail.
ColumnWidth
Enter a numerical value for column width. If Header Label exceeds column width, the Header will
automatically wrap the text.
IsRowVisible
If set to False, this setting will hide specific rows. A clever use of this property is to set this at run
time with a Parameter in order to show and hide detail.
Font
FontFamily
Any .Net font installed on the Excel client’s operating system is included in the font family.
FontSize
The default font size is 11.
Bold
This determines whether the font will be bold. Settings are True or False.
Italic
This determines whether the font will be italicized. Settings are True or False.
Colors
TextColor, GridLinesColor and BackgroundColor
There are about 140 built-in colors (e.g., TextColor=White, TextColor=Transparent,
TextColor=Aqua, …) Select a color from the list of options. Any color can also be specified by
using the ARGB values (Alpha (opacity), Red, Green, Blue). (e.g., TextColor=#FF0000FF is
another way to get Blue.)
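For example (additional illustrative values), TextColor=#FFFF0000 specifies an opaque Red, and TextColor=#80FF0000 specifies a semi-transparent Red.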
Excel
Cube Views can be immediately exported to Excel via the button in Data Explorer, or can be
inserted into an Excel sheet through the Cube View Connections function in the Excel Add-in.
Apply formatting in this section of the Cube View in order to impact how the data will appear in
Excel.
You can apply outline levels to both rows and columns. Use this feature to:
l Add more robust reporting options because you can use both row and column outline
levels.
ExcelMaxOutlineLevelOnRows
Up to six outline levels can be used for the creation of collapsible and expandable Excel Groups of
rows when exporting. Decide how many outline levels are needed and enter the number here.
This relates to the Excel Indent Level setting that is applied in the Header Format on each Row, so
the number entered here should equal the highest setting applied on the Rows. Example entry is
3.
ExcelExpandedOutlineLevelOnRows
When a Cube View is exported to Excel, this setting controls to which Outline Level the file is
initially opened. The default is 1, which means each grouping is fully collapsed. Example entry is
2.
ExcelMaxOutlineLevelOnCols
Up to six outline levels can be used for the creation of collapsible and expandable Excel Groups of
columns when exporting. Decide how many outline levels are needed and enter the number here.
This relates to the Excel Indent Level setting that is applied in the Header Format on each column,
so the number entered here should equal the highest setting applied on the columns.
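As with rows, an example entry is 2.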
ExcelExpandedOutlineLevelOnCols
When a Cube View is exported to Excel, this setting controls to which Outline Level the file is
initially opened. The default is 1, which means each grouping is fully collapsed.
ExcelColumnWidth
This will be the default Excel Column Width in pixels unless overridden in Columns.
ExcelOutlineLevelCol
The default Excel Outline level for Columns is 1 unless specified here or overridden in Columns.
ExcelOutlineLevel
The default Excel Outline level for Rows is 1 unless specified here or overridden in Rows.
Alignment
ExcelHorizontalAlignment/ExcelVerticalAlignment
These control how the data will be aligned horizontally and vertically using Excel’s standard
alignment options. Use the following link for tips on this function.
Excel Alignment
ExcelIndentLevel
The number of characters to indent the text or value.
ExcelWrapText
This determines if text like Headers will be wrapped in the output. Settings are True or False.
Colors
For ExcelTextColor and ExcelBackgroundColor, choose a color from the drop-down options.
Borders
Choose a line design or color from the drop-down options to determine how the lines on the Left,
Top, Right, and Bottom Borders will look.
Reporting
ReportSeparateBandPerCubeViewRow
This is a setting to optimize performance for large Cube Views that dynamically generate
a Data Explorer Report with many row definitions with unique formatting. The default setting is (Use
Default), which is a conditional setting. When a Data Explorer Report is run from a
Cube View, background temporary in-memory database tables are created, one for each Row Name in the
Cube View. If there are more than 100 row definitions, fewer report bands (separate
temporary in-memory tables) are created, combining rows where formatting is the same.
Setting this to True will ensure a separate table per Cube View row definition, though this may
impact performance with hundreds or thousands of row definitions.
If set to False, row definitions of the same format will always be combined into fewer temporary in-
memory tables.
ReportRowHeight
This sets the height of a row on a report by number of pixels.
ReportTopLinesOnFirstRowOnly
If the Row definition results in multiple rows, a Top Line will be placed on the first line only and not
on each row.
ReportBottomLinesOnLastRowOnly
If the Row definition results in multiple rows, a Bottom Line will be placed on the last line only and
not on each row.
ReportBandHeightBeforeFirstRow
This setting allows a number of pixels (a pixel is about the width of a hair) to be entered to add as a
pad before the first row where a Row definition starts.
ReportBandHeightAfterLastRow
This setting allows a number of pixels (a pixel is about the width of a hair) to be entered to add as a
pad after the last row where a Row definition ends.
NOTE: The top and/or bottom report bands used for lines and spacing are removed if
there are no data rows. Row padding is also removed for suppressed rows.
ReportRowPageBreak
This will apply a Page Break where appropriate for this row. For example, creating a Page Break
after the Total Assets line of a Balance Sheet report. Settings are None, After, Before, or Use
Default Option.
ReportRowContentTop
This specifies the vertical position of the row relative to the row above it. If it is set to zero, the top
of the row immediately follows the row above it with no vertical spacing.
ReportRowContentHeight
The height of the row in pixels.
Font
ReportFontSize
The point size of the font on the report if it differs from the font size displayed in Data Explorer.
ReportTextAllignment
This controls how the data will be aligned in a report.
Bottom: Center, Justify, Left, or Right
Middle: Center, Justify, Left or Right
Top: Center, Justify, Left, or Right
ReportUnderline
This creates a simple underline of report values. In order to control the format of the underline
including double underline, see the Top/Bottom Lines section below. Settings are True or False.
Colors
For ReportTextColor, ReportNegativeTextColor, ReportBackgroundColor, and
ReportNegativeBackgroundColor choose a color from the drop-down options.
Top/Bottom Lines
Top and Bottom Lines are used for underlines and overlines. The Line1 and Line2 options can be
used together to create a double underline or double overline. For the lines to have a small gap
between each column resulting in the example below, use the ReportBottomLine1PaddingLeft
property or one similar. The example below has a setting of 5 pixels. The color and thickness of
the lines can also be adjusted.
Borders
These settings allow a line to be drawn around the left, top, right, and bottom border of a given
cell. Settings are True or False.
Cell Format
This is a collection of settings for font formatting, text colors, cell background, and grid lines. This
also includes cell border settings for when a Cube View is generated into a Report.
The following properties are for Cube View Cell formatting which includes the Cube View Default
Settings, Rows, Columns, and Rows/Columns Overrides.
l General
l Amount
NumberFormat
This uses the Microsoft .NET standard Number Format syntax which also allows different formats
to be specified for positive and negative numbers separated by a semi-colon (not available in
Excel). For details, see Number Format Library
Common Examples
#,### ;(#,###);0 would show the number using a comma as the thousands separator, no degrees
of precision, Parentheses around negative numbers and a zero for null values. The pattern of this
format choice is Positive;Negative;Null values, each separated by a semicolon. After the first
#,### there is a space. This allows the numbers to line up with negative numbers due to the
Parentheses.
#,###.#% ;(#,###.#%);” – “ would show a percentage with a comma as the thousands separator,
one degree of precision, negative percentages in Parentheses and a dash for null values.
N2 would show the data as a numeric value with two degrees of precision. Negative numbers are
presented with a minus sign.
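As a quick worked illustration (values chosen arbitrarily): with #,### ;(#,###);0, the value 12345.6 displays as 12,346, the value -12345.6 displays as (12,346), and a null value displays as 0. With N2, the value -12345.6 displays as -12,345.60.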
NOTE: Click , select the desired format, and click CTRL and Double Click. This will
enter the correct format into the appropriate field.
ZeroOffsetForFormatting
This setting works together with NegativeTextColor, which is normally applied when a number is less than
zero. Entering an offset changes that threshold. For example, entering 100 would display sales of less than
100, rather than less than zero, in red. A valid setting here is any number other than zero.
Scale
-12 to +12 are the valid values for the scale. For example, to show a number in thousands, the
scale should equal 3, or for millions, the scale should equal 6.
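For example, with a Scale of 3, a stored value of 1,250,000 displays as 1,250.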
FlipSign
This will flip the display value from positive to negative or vice versa. This is particularly useful for
reports such as a Trial Balance where certain expense numbers are stored as positive or negative
and need to be shown on the report. Settings are True or False.
ShowPercentageSign
This determines whether a percentage sign will be displayed. Settings are True or False.
ShowCurrency
This shows the currency code (e.g. EUR). Settings are True or False. This is not available in
Excel.
Font
See Header Format
Colors
TextColor, GridLinesColor and BackgroundColor
There are about 140 built-in colors (e.g., TextColor=White, TextColor=Transparent,
TextColor=Aqua, …). You will be able to choose a color in the Cube View properties. These
colors are chosen under Cube View Properties, Rows, and Columns. Any color can also be
specified by using the ARGB values (Alpha (opacity), Red, Green, Blue). (e.g.,
TextColor=#FF0000FF is another way to get Blue.)
NegativeTextColor
Optionally override TextColor for negative numbers.
WritableBackgroundColor
Optionally override the data cells’ BackgroundColor for writeable data cells.
SelectedGridLinesColor
Optionally override the data cells’ GridLinesColor for selected data cells.
Excel
Amount
ExcelNumberFormat
Apply Excel number formatting in order to control how numbers appear when exporting to Excel.
A number format can have up to four sections of code separated by semicolons. These code
sections define the format for positive numbers, negative numbers, zero values, and text, in that
order.
<POSITIVE>;<NEGATIVE>;<ZERO>;<TEXT>
For example, use these code sections to create the following custom format. Note that this format
can include the underscore to create a space after the trailing positive number and can control the
color of the negative number to be Red:
#,##0.00_);[Red](#,##0.00);0.00
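As a worked illustration (values chosen arbitrarily): with this format, 1234.5 displays as 1,234.50 followed by a space the width of a parenthesis, -1234.5 displays as (1,234.50) in red, and zero displays as 0.00.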
l See the following for more information about Excel number formats:
Excel Number Formats
NOTE: Click , select the desired format, and click CTRL and Double Click. This will
enter the correct format into the appropriate field.
ExcelUseScale
This determines if Excel is going to abide by the Scale property. Settings are True or False.
Reporting
Amount
ReportNoDataNumberFormat
By default, NoData (i.e., null) cells are displayed as empty text in the Data Explorer and as zeroes in a
report. However, any .NET number format text can be specified to format those zero values
differently in the report. For example, show the word NODATA if desired by typing NODATA in
that setting. To display empty text, type # in that setting. If the cell’s number format setting already
does something with null values, this property does not need to be filled.
This feature provides the ability to generate numbers instead of text when exporting a Dashboard
report to Excel. However, it cannot be used for Calculation Status and Annotation data cells
because those features cannot be represented as numbers. In those cases, use column overrides
to display the values as text.
NOTE: Click , select the desired format, and click CTRL and Double Click. This will
enter the correct format into the appropriate field.
For Position, Font, Colors, Top Lines, Bottom Lines, and Borders see the Reporting section under
Header Format.
The Conditional Formatting follows the Cube View Processing order of operations as:
3. Column Formatting
4. Row Formatting
5. Column Overrides
6. Row Overrides
The combination of formats and overrides equals the format for the cell when rendered.
Formatting can be applied, and isolated to, the nested Expansion Levels on rows and/or columns
by using Property Filters found in the Conditional Formatting dialog box.
This feature is a Cube View formatting option. It is not available as a Standard Reports setting
found in Application Properties.
Formatting can be applied, and isolated to, the nested Expansions on rows and/or columns by
using Filters found in the Conditional Formatting dialog box.
l Text/Object – Text object to be tested. If the Text/Object result is a reserved word such as
“If”, “Else”, “ElseIf”, “Then”, “End If” or other Operators as “In”, the text must be enclosed in
square brackets: [ ].
l IsDerivedData – Test for derived data, commonly resulting from Scenario ZeroView settings.
Indent Level
The IndentLevel Property Filter will dynamically apply formatting based on defined rows or expansions.
Indentation is zero-based. The formatting can be applied to the Default or to Rows. This solution
can speed formatting for summary-level dimension members.
Apply the formatting to the Header section of the Cube View Default or to specific Rows.
“Traffic-Lighting” is data related and therefore applied as a Cell Format. The designer has a
choice to apply the conditional formatting to either a Row or a Column. The order of operations for
formatting can impact the decision. Row overrides are the final layer of formatting applied to a
Cube View and would not be impacted by other more general formatting.
1. Conditional Formatting can be applied to the rows or columns. A definition applied to the row
Formatting tab would apply to all columns. A row Override would isolate the formatting to a
specified column(s).
2. The CellAmount Filter is used within multiple If/ElseIf statements to define the various tests
required for the report.
A Filter can also be applied as a Default Cube View format, in which case the definition will apply to all
rows. This would require If/ElseIf type statements to support all rows.
1. Design or open a report which supports formatting for ExpandedRowNum, such as a ranking
report in the example.
2. Determine how the formatting should be applied, as a default, row or column. The example will
use the Cube View Default Formatting.
3. Cell Format is applied for Conditional Formatting using the ExpandedRowNum filter. Being
zero-based, and having to account for each row and its expansions:
a. Condition1 – Rows < 6 is defined because the text header row “Top10Title” initiates the
count at zero.
b. Condition2 – Rows > 11 and < 15 is defined because all the rows up to Row4,
“SalesBottom10”, reflect rows 0-11 on the Cube View. The conditional row references must
reflect all the Cube View Expansions.
1. In this example the formatting can be applied to the Cube View Default Cell Format since it
will globally apply to all rows.
HeaderDisplayText differs from the MemberName and MemberDescription Property Filter in that
it references the custom Name parameter in a Member Filter.
Dynamic Criteria can be applied to the Name and Description Property Filters to apply the
required formatting.
Parameter Formatting
Conditional Formatting definitions can be applied to a Cube View as a Literal Parameter. To
apply, select the Format string from the Cube View, Copy (Ctrl-C), select the Default Value
property of a Literal Value-type Parameter and Paste (Ctrl-V). Once saved, this Parameter can be
referred to in a Cube View Format. Per the example below, the reference of |!Param_Conditional_
PLTree!| in a Cube View Format would apply the associated Format string.
Cube View Profiles
Cube View Profiles are where Cube View Groups are organized and control where Cube View
functions are available in XF. Choose the icon to create a new Profile. Choose the icon to assign a
Group to a Profile. This allows the user to select which Groups will be in the Profile.
The Cube View Profile’s Visibility setting controls what Cube Views are viewed throughout the
application. This is the key property for all Cube View Profiles. Any combination of the following
settings is available:
OnePlace
This allows the Cube View Profile to be viewed in the Cube View section under the OnePlace Tab.
Dashboards
This allows the Cube View Profile to be used within a Dashboard Data Adapter and viewed in the
Dashboard section under the OnePlace Tab.
Excel
This allows the Cube View Profile to be downloaded into Excel through Cube View Connections.
Forms
This allows the Cube View Profile to be used as a Form Type under Form Templates.
Workflow
This allows the Cube View Profile to be attached to a Workflow Profile. This can be completed in
the Workflow Profiles section under Data Quality Settings.
Never
This is used for a Cube View Profile that is expired, or no longer being used.
Always
This allows the Cube View Profile to be viewed in all the options mentioned above.
Application Dashboards
Dashboards are contained in Dashboard Groups made up of Components, Data Adapters,
potentially Parameters, and Files. All these objects are managed and shared across Dashboard
Maintenance Units which are then assigned to Dashboard Profiles.
Dashboard Toolbar
Create Group
Use this to create Dashboard Groups, which are where all Dashboards are organized.
Create Profile
Use this to create a Dashboard Profile, which is where all Dashboard Groups are organized.
Create Dashboard
Use this to create a Dashboard.
NOTE: A Dashboard Maintenance Unit and a Dashboard Group need to be created first
for this icon to be available.
Multiple items can be selected at once using Shift+click or Ctrl+click, and items can be copied
and pasted multiple times.
NOTE: Users can Copy using right-click on the Dashboard Item or toolbar menu Copy
button.
Items can be pasted multiple times and each time the item is pasted a suffix of _Copy will be
added. For every additional instance of the Dashboard item that is pasted a suffix of _Copy
(instance#) will be added.
NOTE: Users can Paste using right-click on the Dashboard Item or toolbar menu Paste
button.
Create Parameter
Use this to create a Parameter to use in a Dashboard or Cube View.
Create File
Use this to store files to be used in a Dashboard.
Create String
Used with XFString functionality and displayed as a caption in a Cube View.
View Dashboard
Use this to run a Dashboard from the Application Tab.
Object Lookup
See Object Lookup.
Navigate to Security
This icon appears in the Dashboard Maintenance Group, Dashboard Group, and Profile Security
properties and when clicked it navigates to the Security screen. This is an easy way to make
changes to Security Users or Groups before assigning them to specific Dashboard Maintenance
Units, Groups, and Profiles.
Description
A short description of how the Dashboard Group is used, or what it contains.
Security
Access
Members of this group can access the Dashboards within the Dashboard Group.
Maintenance
Members of this group have the authority to maintain the Dashboards within the Dashboard
Group.
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered, making it easier to find and select
the desired Group. Once the Group is selected, press CTRL and double-click to enter
the correct name into the appropriate field.
Dashboard Properties
General
Name
The name of the Dashboard.
Description
A quick description of the Dashboard.
Page Caption
This is used to give a name to the Dashboard when it is displayed. The name will be displayed at
the top left of the Dashboard. If the Page Caption is left blank, then it will default to the name
given to the Dashboard.
Dashboard Group
The Dashboard group to which it belongs.
Formatting
Layout Type
Canvas
A Component docks to the top, bottom, right, or left of the Dashboard. The Component only
shows in the small space defined by the dock position and does not display fully on a screen.
Slide bars are used to view the Components.
Dock
A Component docks to the top, bottom, right, or left of the Dashboard. Use Dock Position to
determine how much of the screen will be used for this layout.
Grid
This allows the ability to make a Dashboard in a grid format by using rows and columns to add
Components, Lines, or Moveable Splitters.
Tabs
Each Component will display in its own tab.
Uniform
Each Component is docked, but all are equal in size.
Wrap
See Canvas
Display Format
This allows unique display formatting per Dashboard for background color, dialog width and
height, and border formatting. See Display Format Settings.
No Task
A Dashboard Extender Business Rule is not being used on this Dashboard.
Number of Rows
The number of rows the Dashboard will use for the grid layout.
Number of Columns
The number of columns the Dashboard will use for the grid layout.
Row/Column Type
Component
Use the Component option if a Content or Parameter Component is needed for the row or column
of the grid layout for the Dashboard.
Line
Use the Line option if a line is needed for the row or column on the Dashboard.
Moveable Splitter
Use the Moveable Splitter option if one is needed for the row or column on the Dashboard. This
allows the ability to move the grid up and down, or left and right.
Row/Column Height
Use Row/Column Height to customize the heights of rows and the widths of columns displayed
on the Dashboard. Auto expands the size based on the contents of the row. You can use an * to
assign a fractional share of the remaining space. For example, if there are three rows each set to
*, then each row will take a third of the space.
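As a further hypothetical illustration of these options, row heights of Auto, *, * would size the first
row to fit its contents and split the remaining Dashboard space evenly between the other two rows.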
InnerBackgroundColor
This is used to display the background color of a Dashboard when using Dashboards within other
Dashboards.
Dialog
DialogDisplayStyle
Controls how a Dashboard is sized when viewed as a Dialog Box. If the specified value exceeds
the current screen, the value will be changed to the maximum display setting for height and width
to fit to screen.
Specify Maximize or Normal. The default is Normal. Maximize opens using the Dialog Maximize
mode to full screen. Normal opens to the defined Dialog Height and Width settings.
DialogWidth
Customize the width of the dialog for the Dashboard. The dialog width supports up to 10,000. If
set to -1, the size will set at run time to the maximum width for the active screen.
DialogHeight
Customize the height of the dialog for the Dashboard. The dialog height supports up to 10,000. If
set to -1, the size will set at run time to the maximum height for the active screen.
EnableSystemClose
This is used to control the display and use of the Close button of the dialog. Setting this to False
will remove the Close button from the user interface.
Styles
TabControlStyle
Choose from Classic, No Border, or Rounded Corners to define how tabs appear in the
application. The tab control functionality remains the same.
Dashboard Tabs
There are three different tab styles that can be selected for dashboards: Classic, No Border, and
Rounded Corners. These tab styles are only available for use with dashboards and do not apply
to the rest of the product.
Select a Tab
1. Go to Application > Dashboards and select a dashboard.
2. Open the dashboard and, in Dashboard Properties > Formatting, make sure the Layout
Type is set to Tabs.
3. Go to Styles > TabControlStyle and click into the field to open a menu of options.
If the Layout Type is not set to Tabs, the field will be greyed out.
4. Select the desired tab style and click OK.
5. Click Save.
Border
BorderThickness
This is used to customize the border thickness for the Dashboard.
BorderCornerRadius
This is used to customize the border corner shape for the Dashboard. If left blank, the border will
be square.
BorderColor
This is used to customize the outer border coloring for the Dashboard.
BorderBackgroundColor
This is used to display the border background color of a Dashboard when using Dashboards
within other Dashboards.
Margin
BorderMarginLeft/Top/Right/Bottom
Used to customize the margin of the border on the Dashboard.
Padding
BorderPaddingLeft/Top/Right/Bottom
Used to customize the padding or spacing of the border on the Dashboard.
Dashboard Components
General
Name
The name of the Dashboard Component.
Description
A free form field to enter a description of the Dashboard Component.
Components
Move Up
Use this to move a Dashboard Component Up in the Components list.
Move Down
Use this to move a Dashboard Component Down in the Components list.
Position on Dashboard
Dock Position
Left/Top/Right/Bottom
Allows the ability to dock a Component in a specific position on the Dashboard.
Left
This sets the amount of Dashboard space to allocate to the left dock position.
Top
This sets the amount of Dashboard space to allocate to the top dock position.
Width
Enter a percentage to set the width of the Dashboard.
Height
Enter a percentage to set the height of the Dashboard.
Parameter Components
Parameter Components are objects such as a Button or List Box that, when selected, could drive
additional actions, such as opening a different Dashboard or running a Custom Calculate.
A Custom Calculate executed from a Dashboard Parameter Component allows the passing of a
complete set of name-value pairs that align to the Dimension Point of View (e.g.,
Entity=MyEntity, Time=2017M11, Consolidation=Local), and the Custom Calculation
processor will use those variables to initialize the Finance Engine Api POV Variables (e.g.,
api.Pov.Entity.Name will equal MyEntity). All api.Pov objects (Cube, Entity, Parent,
Consolidation, Scenario, Time, View, Account, Flow, Origin, IC, UD1, UD2, UD3, UD4, UD5, UD6,
UD7, UD8) are initialized and available to this function. This makes it possible to share Custom
Calculation rules with standard consolidation/calculation rules because both rule sets have a fully
initialized set of api.Pov variables. For Consolidation Business Rules, this POV is related to the
Data Unit. When using a Parameter Component with a Cube View on the same Dashboard,
api.POV is related to the 18-dimension POV that is available from each cell rendered. See the
section on Custom Calculate under Finance Function Types for more information.
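For example, a Parameter Component could pass the following name-value pair string (a
hypothetical sketch built from the pairs shown above; Scenario=Actual is an assumed value added
for illustration):
Entity=MyEntity, Time=2017M11, Consolidation=Local, Scenario=Actual
Inside the Custom Calculate Business Rule, api.Pov.Entity.Name would then equal MyEntity, and
the remaining api.Pov objects are initialized from the other pairs in the same way.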
Button
This displays a button on a Dashboard. Buttons can be used to initiate a variety of functions
including Business Rules, other Dashboards, the file explorer, Member selection, file upload
screen, etc. This Component can be used with Mobile and Application Dashboards.
Formatting
Text
The text to be displayed with the button. This is an optional field. If left blank, the Text field will
default to displaying the Name of the button. To prevent the default Name from displaying, enter
|SPACE|.
Tool Tip
The text to be displayed when hovering over the button.
Display Format
The formatting assigned to the control. See Component Display Format Properties.
URL
The image displayed on the button is based on a URL.
Dashboard File
The image displayed on the button is in a file stored in the Dashboard Maintenance Unit File
Section.
URL Syntax
Access a built-in toolbar image, or use an image from any web server. Example:
http://www.onestreamsoftware.com/img/onestream-logo.png
The following Excel properties are used to specify an Excel file from either an Extensible
Document or Report Book in order to display it as an Image or Button on a Dashboard.
Excel Sheet
Enter the name of the Excel worksheet to display.
See Image under Component Display Format Properties to position and align the Excel image as
needed.
Button
Button Type
Specify the functionality to be assigned for the selected button.
Standard
Displays a generic button that can have custom functionality assigned such as display a
Dashboard, execute a Business Rule, etc.
File Explorer
This sets the button to display the File Explorer dialog.
File Upload
This sets the button to display the File Upload dialog.
Select Member
This sets the button to display the Select Member dialog.
Action
Bound Parameter
This is the name of the Parameter the Component will represent. Piped variable names are not
available in this setting. The Component will use the Parameter to get its list of options. The
selection will specify a value for the Parameter when it is referenced elsewhere via piped variable
naming. Click and begin typing the name of the Parameter in the blank field. As the first few
letters are typed, the names are filtered, making it easier to find and select the one desired. If the
name is unknown, expand a Dashboard Maintenance Unit and scroll through the list to select the
correct one. Once the Parameter is selected, press CTRL and double-click to enter the correct
name into the appropriate field.
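For example (ParamEntity is a hypothetical Parameter name), a Combo Box bound to ParamEntity
supplies the user's selection wherever the piped variable |!ParamEntity!| is referenced elsewhere,
such as in a Cube View or in a Label Component's Text property.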
Save Action
Selection Changed Save Action
This determines what save action occurs when the Parameter Component's selection changes
(for example, when this button is pressed).
No Action
When the Parameter Components are changed, the changes will not be saved.
POV Action
Selection Changed POV Action
No Action
When Parameter Components are changed, no change will be made to the user’s POV.
Change POV
This setting will change the user’s POV to a specified POV when Parameter Components are
changed.
Change Workflow
This changes the user’s Workflow to a specified Workflow when Parameter Components are
changed.
NOTE: The Selection Changed POV Action must be set to something other than No
Action for this change to occur.
Server Task
Selection Changed Server Task
No Task
When Parameter Components are changed, no server task is executed.
See "About the Financial Model" on page 2 for an example of this calculation logic.
Calculate
Executes a Calculation when Parameter Components are changed.
Force Calculate
Executes a Force Calculate when Parameter Components are changed.
Translate
Executes Translate when Parameter Components are changed.
Force Translate
Executes Force Translate when Parameter Components are changed.
Consolidate
Executes Consolidate when Parameter Components are changed.
Force Consolidate
Executes Force Consolidate when Parameter Components are changed.
TIP: For more details on these Calculation Types, see Data Management in "Application
Tools" on page 779.
Dashboard Extender Business Rule Example (each enclosed with curly braces)
{Business Rule Name}{Function Name}{Optional Name-Value Pairs}
Finance Custom Calculate Business Rule Example (each enclosed with curly braces)
{Business Rule Name}{Function Name}{Optional Name-Value Pairs}
NOTE:
Name Value Pairs can have settings for Cube, Parent and any Dimension Type. It can
also have a setting for any Time Member Filter including CurrentPeriod,
AllPriorInYearInclusive or AllInYear. It can have any other name-value pair for use by
the Business Rule since they are passed into the rule using the CustomCalculateArgs
object.
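A hypothetical instance of this pattern (the rule name, function name, and values are illustrative
only; any extra pair, such as MyCustomSetting, reaches the rule through the CustomCalculateArgs
object as described above):
{MyCalculationRules}{RunCustomCalc}{Entity=MyEntity, Consolidation=Local, Time=2017M11,
MyCustomSetting=SomeValue}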
No Action
When Parameter Components are changed, no user interface action is executed.
Redraw
This repaints the screen without making a connection to the database to acquire updated data.
Refresh
This refreshes the Dashboard specified in the Dashboards to Redraw property, which makes a
connection back to the database to acquire all new data.
Close Dialog
This closes a dialog upon performing a specified action in a Dashboard.
Open Dialog
Opens a Dashboard in a modal window.
Dashboards to Redraw
Comma separated list of Embedded Dashboards to redraw when the user changes the selection
in this Parameter Component.
Dashboards to Show
Comma separated list of embedded Dashboards to make visible when the user changes the
selection in this Parameter Component. This is typically used when multiple Parameter
Components are needed to collect settings to display an Embedded Content Dashboard. After
the last selection has been made, that Parameter Component will show the previously hidden
Embedded Content Dashboard.
Dashboard to Hide
Comma separated list of embedded Dashboards to hide when Parameter Components are
changed.
Navigation Action
Selection Changed Navigation Action
No Action
No navigation action is performed when Parameter Components are changed.
Open File
When Parameter Components are changed by a user, a file will open. The file is specified in the
Selection Changed Navigation Arguments field.
Open Page
When Parameter Components are changed by a user, a new page will open. This is specified in
the Selection Changed Navigation Arguments field.
Open Website
When Parameter Components are changed by a user, a website will open in a new browser
window. This is specified in the Selection Changed Navigation Arguments field.
XFPage=Dashboard:MyDashboardName, OpenInNewXFPage=True
Specify the file source type (e.g., Dashboard, Application, System, FileShare or URL) and the file
name. Also use PinNavName=True/False or PinPOVName=True/False to open or close the
Navigation or POV panes.
FileSourceType=FileShare, FullFileName=Document/Public/NameofFile,
OpeninXFPageIfPossible=False
NOTE: Click and begin typing the argument string in the blank field. As the first few
letters are typed, the names are filtered, making it easier to find and select the one
desired. If the name is unknown, scroll through the list to select the correct one. Once
the argument string is selected, press CTRL and double-click to enter the correct
string into the appropriate field.
Chart (Advanced)
See Advanced Chart Examples in "Presenting Data With Extensible Documents" on page 235 for
some examples on the Diagram Types and associated properties.
Action
See Button
Diagram Type
Polar2D
This is used to display data as a circular graph and displays values on the basis of angles.
Radar2D
This is used to display data as a circular graph and has multiple axes along which data can be
plotted.
Simple2D
This is commonly used to compare percentage values of different point arguments in the same
series.
XY2D
This is commonly used to show filled areas on a diagram. Values can be used separately or
aggregated.
NOTE: The properties below do not apply to all of the Diagram Types.
This controls whether the toggle icon displays on the Dashboard at runtime.
Swap Axes
If set to True, this exchanges the X axis with the Y axis.
Domain Color
This is the chart’s background color.
Applying a number format after the colon displays the value in a desired format, for example
{V:#,###,0}. To display a value to two decimal places, enter {V:0.00}, where 0.00 is the Format
String. Another common example is {V:0,,M}, which will display the value in millions, or {V:0,T},
which will display the value in thousands.
Any text written within the curly braces after the Format String will be displayed as text.
Additionally, any text written before or after the curly braces will also be displayed as text. Refer to
MSDN's topic on Format Strings for additional information.
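As a worked illustration (assuming a hypothetical cell value of 12345678.9):
{V:0.00} displays 12345678.90
{V:#,###,0} displays 12,345,679
{V:0,,M} displays 12M
{V:0,T} displays 12346T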
Enable Animations
This controls how the chart renders at runtime. Set this to True to include chart animations upon
launching the Dashboard.
Show Border
Set this to True in order to display a border around the chart.
Show Legend
Set this to True in order to create and display a customized legend for the Dashboard chart.
Title
Enter the legend title.
Vertical/Horizontal Position
This controls where on the screen the legend will display.
Orientation
This determines whether the legend displays vertically or horizontally.
Show Border
Set this to True in order to display a border around the legend.
Crosshair Enabled
A crosshair is similar to a point label, but only displays when the user hovers over the data series.
Set this to True in order to enable the crosshair function.
Title
Enter a title for each axis.
DateTime
Use this if dates are included in the arguments. This causes the chart’s x-axis to be continuous,
meaning the x-axis ticks are independent of the point’s argument.
Double
This causes the chart’s x-axis to be continuous, meaning the x-axis ticks are independent of the
point’s argument.
String
This causes the chart to be discrete, meaning the axis ticks are dependent on the point’s
arguments.
Text Format
See Point Label Text Format.
Logarithmic
Setting this to True will force the axis values to follow a logarithmic pattern specified in the
Logarithmic Base property below.
Logarithmic Base
Set the Logarithmic Base value. For example, a logarithmic base of 10 yields the axis values 1,
10, 100, 1000, 10000, etc.
Minimum/Maximum Value
Specify the minimum and maximum range values.
Step
Specify the intervals for the tick marks.
Reverse Order
By default, the order of values on a chart begins at the bottom and works its way to the top, so the
first color in the legend is the bottom value's color. Setting this property to True, will reverse that
order. When the order is reversed, the values are displayed top to bottom, thus following the
order of the legend.
Interlaced
Set this to True for the chart’s domain color to alternate between the interlaced color. If this is set
on the x axis, the color will display horizontally. If this is set on the y axis, the color will display
vertically.
Interlaced Color
Enter the desired interlaced color.
Polar2D Types
PolarArea
The data series displays as a filled area on a circular diagram.
PolarLine
The data series displays in a continuous line on a circular diagram.
PolarPoint
The data series displays in a series of small circles on a circular diagram.
Radar2D Types
RadarArea
The data series displays as a filled area on a circular grid with multiple axes along which data can
be plotted.
RadarLine
The data series displays as a line on a circular grid with multiple axes along which data can be
plotted.
RadarPoint
The data series displays as a series of small circles on a circular grid with multiple axes along
which data can be plotted.
Simple2D Types
Funnel
A funnel chart displays a wide area at the top, indicating the total points' value, while other areas
are proportionally smaller.
NestedDonut
This is similar to a Doughnut Chart but compares the data with one doughnut nested in another
one.
PieAndDonut
This is used to compare the percentage values of different data points in the same series.
XY2D Types
AreaRange
This displays the data series as filled areas on a diagram, with two data points that define
minimum and maximum limits. This chart is used to illustrate the difference between start and end
values.
BarRangeOverlappedWaterfall
This displays either vertical or horizontal bars along the Y axis (the axis of values). Each bar
represents a range of data with two values. This chart type is used to show activity from different
data series one above another to compare.
BarRangeSideBySideWaterfall
This displays either vertical or horizontal bars along the Y axis (the axis of values). Each bar
represents a range of data with two values. This chart type is used to show activity from different
data series grouped by their settings.
BarSideBySide
This displays the data series as individual bars where the height of each bar is determined by the
data value.
BarSideBySideFull/SideStacked
This can stack different bars and combine them into groups displayed side-by-side across the
same axis value.
LineScatter
This is a type of Line Chart where the data points are connected by a continuous line.
LineStep
This is used to display to what extent values have changed for different points in the same series.
Point
This is used to show points from two or more different series on the same chart plot.
Stock
This is used to show variation in stock prices over the course of time.
NOTE: See Chart (Basic) under Content Components for all other XY2D Chart Types.
Model Display
Change the look of the series by choosing between three tiers (Basic, Moderate, Advanced) to
determine which preset model the chart series will use. Selecting Moderate or Advanced
provides a more enhanced view of the chart.
Show Markers
This is used in conjunction with Line Charts to display a circular marker at each end of the line.
Set this to True to display line markers.
Marker Size
Enter a value for the marker size. Values are entered in pixels.
Bar Width
This is used in conjunction with Bar Charts. Enter a value to control the width of each bar on the
chart.
Line Thickness
Enter a value measured in pixels to control the line thickness displayed in a Line Chart.
Business Rule
A Dashboard DataSet Business Rule from the chart’s Data Adapter is being used as the data
source. Utilizing a Business Rule gives more control to the chart’s data series and provides
additional customized settings.
Suppress Zeros
If set to True, this removes all results that are zero.
All Rows
The chart will include all rows.
First Row
The chart will only include the first row.
Annotation
The legend text comes from the Annotation View Member in the Cube View instead of the default.
The Data Adapter must include this information for this option.
Assumptions
The legend text comes from the Assumptions View Member in the Cube View instead of the
default. The Data Adapter must include this information for this option.
Audit Comment
The legend text comes from the AuditComment View Member in the Cube View instead of the
default. The Data Adapter must include this information for this option.
Footnote
The legend text comes from the Footnote View Member in the Cube View instead of the default.
The Data Adapter must include this information for this option.
Variance Explanation
The legend text comes from the VarianceExplanation View Member in the Cube View instead of
the default. The Data Adapter must include this information for this option.
Check Box
This provides a small box that when selected is filled with a check mark. This Component can be
used with Mobile and Application Dashboards.
Formatting
See Button
Action
See Button
Cube View
Use this Component to attach a Cube View to a Dashboard. This Component is not compatible
with Mobile Dashboards.
Cube View
Show Header
Select True in order to display the Cube View’s Page Caption, select False in order to hide it.
Actions
For all Action settings and examples, see Button earlier in this section.
NOTE: In order to enable Component Actions, the Cube View must have Navigation
Link Parameter(s) configured; otherwise, the actions will not function.
Combo Box
This displays a drop-down list of strings. This Component can be used with Mobile and
Application Dashboards.
Formatting
See Button
Action
See Button
Gantt View
This is a graphical illustration of a schedule over time and is used for relational or hierarchical data
(non-Cube data). This Component can use a SQL query to query the data, but in order to fill these
Components with data, a Dashboard Data Set Business Rule must be used within a Data
Adapter. The Business Rule must use an XFGanttTaskCollection object that helps the user
convert a normal data source into the object-based data source required to feed these new
hierarchical Components. See Gantt View Component Business Rule Example in "Implementing
Security" on page 322 for an example of the Dashboard Data Set Business Rule. This
Component is not compatible with Mobile Dashboards.
Action
See Button
Show Header
When set to True, the grid Name/Description will display as the Component’s header on the
Dashboard, when set to False, it will be hidden.
Grid View
This displays data from a Data Adapter in a grid. Similar to the SQL Table Editor, but data is not
editable. This Component is not compatible with Mobile Dashboards.
NOTE: The Grid View Dashboard Component should not be used for large data sets
because it does not use paging. In that case, a SQL Table Editor Component should be
used to display the data to the end user; it uses paging to return the full data set.
Deselect each row individually or click the Deselect All toolbar button.
Formatting
See Button
Action
See Button
Grid View
Table Name
The name of the table created in the data Parameter.
Show Header
When set to True, the grid Name/Description will display as the Component’s header on the
Dashboard, when set to False, it’s hidden.
When set to True, Column Settings will show on the Dashboard; if set to False, they are hidden.
Allow Column Reorder
When set to True, you can drag columns into a different order; if set to False, you cannot reorder
them.
Save State
Enables User Preferences for Column: Order, Visibility, Filtering, Sorting and Width. If this setting
is True, then the column activity can be changed and saved via the Grid View Column Format
section.
When Vary Save State By is enabled, the related Dashboard Component will have the additional
reset option of Reset All States, which can be used to clear the user's Save States across all the
Vary Save State By parameters.
Description
Column description to be displayed. By default, the column name from the table is displayed.
OneStreamClientImage
This replaces the data in the column with the associated image.
WorkflowStatusImage
This replaces the data in the column with the associated Workflow status indicator image.
IsVisible
Setting to override the default Columns Visible setting. Settings are True, False, or Use Default.
ParameterName
The assigned Parameter name to be used to store the Parameter value from the specified
column.
Background Color
Background color to be displayed on the selected column. Choose color from drop down options.
IsGroupable
This determines whether the column can be used in the Group panel on the grid. Settings are
True or False.
IsFilterable
This setting, located at the top of the column, turns the filter button on and off. Settings are True or
False.
ShowDistinctFilters
This setting turns the filter option on and off, which enables users to click check boxes to select
filtered Members. Settings are True or False.
ShowFieldFilters
This enables and disables advanced filtering on the specified column. Settings are True or False.
IsSortable
This enables and disables sorting on the specified column. Settings are True or False.
IsMultilineText
This setting allows columns to display data on multiple rows if the column is not wide enough to
display the full value. Settings are True or False.
TIP: The data will wrap on string spacing. If there are no blank spaces, the column data
will not wrap. Additionally, keyed fields cannot be wrapped.
DataFormatString
Specify a number and date format to the data in the column. For example, mm/dd/yyyy will return
the current Month/Day/Year using a slash. MM-dd-yyyy will return the Month-Day-Year using a
dash. N0 will return a number without a decimal point, and #,###,0 will return a number without a
decimal and a comma depicting the thousandth place. See Application Properties in "Application
Tools" on page 779 for more examples of number formats.
Width
Specifies the default column width to be displayed.
If Save State is set to True, changes to the headers and columns will be saved.
If Save State is set to False, you cannot drag and drop the headers when you view the dashboard.
You can click Column Settings, which opens the Column Settings window, to hide columns and
change the order in which the columns are shown on the dashboard.
Reset State
When you view a dashboard, you can right-click and select Reset State to return the Grid View
Component to its default settings.
Sort Column
When you click a column header, the column is sorted. If Save State is set to True, the changes
to the sort order will be saved.
Filter
When a new filter is applied to the grid, the filter icon is orange to show that it’s been changed.
List Box
This is a rectangular control that displays a list of strings to be selected by the user at run time.
This Component is not compatible with Mobile Dashboards.
Formatting
See Button
Action
See Button
Map
The Map Component is used to display specific locations on a geographical map via a Dashboard
DataSet Business Rule. Within each location, users can pass in Parameters in order to perform
specific actions and display additional data. Map Components can only display data via a
Dashboard DataSet Business Rule using the XFMapItemCollection objects. Define the data set
in the Business Rule, assign the rule to a Data Adapter, and the Data Adapter to the Map
Component. See Map Dashboard Component Business Rule Examples in "Implementing
Security" on page 322 for examples on using this component.
Action
See Button.
This controls whether the toggle icon displays on the Dashboard at runtime.
Display Type
Humanitarian
This displays all countries/states/cities in English. The geographical location labels display
smaller than Standard and Transport.
Standard
This displays all countries/states/cities in the respective local language. The map displays
geographical location labels and road lines.
Transport
This displays all countries/states/cities in the respective local language. The map displays major
cities with large labels.
Zoom Level
Enter the zoom value to which the map will open upon running the Dashboard.
Center Latitude/Longitude
Enter the GPS coordinates to use upon launching the Dashboard.
NOTE: To define the exact GPS coordinates, refer to a GPS coordinate website.
Member Tree
This is a hierarchical collection of labeled items represented in a tree node. This Component is
not compatible with Mobile Dashboards.
Formatting
See Button
Action
See Button
Formatting
See Button
Action
See Button
Supplied Parameter
This is a control that holds a Parameter value for use on a Dashboard, but it will not be displayed.
This Component can be used with Mobile and Application Dashboards.
Bound Parameter
See Button
Text Box
This provides a box for entering text. This Component is not compatible with Mobile Dashboards.
Formatting
See Button
Action
See Button
Multiline
Set this to True for word wrap or click Enter and go to the next line. Set this to False for one string
of text.
Tree View
This provides a graphical illustration of hierarchal data. This is used for relational or hierarchical
data (non-Cube data).
Action
See Button
SQL Table Editor
This displays data from a database table in a grid where the data can be viewed and edited
(unlike the Grid View, which is read-only). The SQL Table Editor uses paging to return the full
data set, so it can be used with large tables.
Action
See Button
Application
Select if the table to be displayed is located in the application database.
Framework
Select if the table to be displayed is located in the framework database.
External
Select if the table to be displayed is located in an external database.
Table Name
The name of the table being displayed in the control.
Where Clause
The SQL string where clause used to pull data from the table.
Order by Clause
The SQL string to order the results of the table.
NOTE:
The Order by Clause property does not allow SQL functions, such as YEAR(effectiveDate).
When run inside a dashboard, the Table Editor's column headers can be clicked to change the
sort order to ascending, descending, or no sort, so SQL functions cannot be supported in the
ordering.
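A hypothetical illustration of these two properties (the table and column names are assumptions,
not product fields):
Where Clause: RegionName = 'NorthAmerica' AND FiscalYear = 2022
Order by Clause: AccountName ASC, FiscalYear DESC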
Show Title
Show or Hide the text in the Title Header.
Allow Inserts
If set to True, new rows of data can be added to existing data in the table. If set to False, new rows
cannot be added.
Allow Deletes
If set to True, rows of data can be deleted from the existing data in the table. If set to False, rows
cannot be deleted.
Allow Multiselect
If set to True, multiple records can be selected by item selection or by using a selection check
box. The selected items will be passed to the defined Bound Parameter field as a comma-delimited
list in the form: Item1, Item2, Item3. If the three values A, B, and C,D are selected (where C,D is a
single value), the resulting Bound Parameter string is A, B, "C,D". If set to False, only a single
Bound Parameter value can be passed at a time and the selection boxes will be deactivated in the
user interface.
Save State
If set to True, user settings on the Component are retained. User preferences for Columns saved
will be for: Order, Visibility, Filtering, Freeze Bar, Sorting and Widths.
NOTE: Reset Save State Back To Default: Right-click on the Dashboard to enable the
“Reset State” to return back to its Component Properties settings.
When Vary Save State By is enabled, the related Dashboard Component will have the additional
reset option of Reset All States, which can be used to clear the user's Save States across all the
Vary Save State By parameters.
Server Task
Create Table if Necessary
The option to create the table if it does not already exist. Settings are True or False.
TIP: The data will wrap on string spacing. If there are no blank spaces, the column data
will not wrap. Additionally, keyed fields cannot be wrapped.
DataFormatString
Specify a number/date format to the data in the column. For example, mm/dd/yyyy will return the
current Month/Day/Year using a slash. MM-dd-yyyy will return the Month-Day-Year using a dash.
N0 will return a number without a decimal point, and #,###,0 will return a number without a
decimal and a comma depicting the thousandth place. See Application Properties in "Application
Tools" on page 779 for more examples of number formats.
Width
This specifies the default column width to be displayed.
Description
Column description to be displayed. By default, the Column name from the table is displayed.
IsVisible
Setting to override the default Columns Visible setting. Settings are True, False, or Use Default
AllowUpdates
This is a setting to override the Default For Allow Column Updates setting for each individual
column. If set to True, users can modify data displayed in the grid. The Use Default setting uses
the Default For Allow Column Updates setting.
ParameterName
The assigned Parameter name to be used to store the Parameter value from the specified
column.
DefaultValue
The default value to be entered when allowing new records to be added. This helps ensure that
invalid blank cells are not created.
IsMultilineText
If set to True, this will allow columns to display data on multiple rows if the column is not wide
enough to display the full value.
• Allow Multiselect will generate the Bound Parameter as a comma delimited list.
• Selection methods include the Select All check box, selecting a range using the Shift key, and
selecting by item using Ctrl-Click.
• Deselecting can be done by item or by using the Deselect All toolbar button.
• Applying a Column Filter after selections are activated is not allowed; applying a column filter
will clear any existing selected records. Filter the column first; once filtered, selections can be
performed.
• The Defer Refresh button can be used to manually control the execution of tasks defined in the
Actions properties of the SQL Table Editor Component. This user-controlled execution of
actions may provide better performance within complex dashboards.
NOTE: Defer Refresh remains checked after the Refresh button is clicked in SQL
Table Editor.
NOTE: Multi-select will retain selected items after column filtering and sorting by utilizing
table’s primary key columns. If the referenced table does not contain a primary key
column(s), then the selection will not be retained across pages or after filtering or sorting
columns.
The Column Order button can be selected by the user to modify the presentation of the SQL Table
Editor. These are user-based preferences which will be saved only if Save State is set to True.
The Column Filters can be set by the user and will be preserved if Save State is set to True.
However, the Column Sort will not be preserved once the SQL Table Editor is deselected or the
dashboard is closed.
When Save State is set to True, the Freeze Bar will retain its position if you drag it to a certain
column.
Content Components
BI Viewer
This is an interactive Dashboard engine used to quickly create a Business Intelligence (BI)
visualization of data from existing or new Data Adapters, allowing the user to visualize and
analyze data.
BI Dashboards are a powerful tool that allow the user to dynamically add data and build
Charts, Gauges, Maps, Grids, Cards, and more. This BI Component integrates well with the Cube
View MD Data Adapter and, at design time, allows the user to add Calculated Fields to the Data
Source to create new fields and measures, design a Drill Down path, and apply Multi-Select
filtering and Conditional Formatting for dynamic viewing at runtime. It also provides the ability to
customize Palette Colors.
• Data Adapter – used to source the data and render the dataset
Additional information on the use and design elements of BI Viewer can be found in the BI Viewer
Design and Reference Guide.
Book Viewer
This is used to display a Report Book in a Dashboard. This Component is not compatible with
Mobile Dashboards.
Dashboard File
Display a Book stored in a Dashboard Maintenance Unit File section.
Show Header
Select True in order to display a header derived from the Component’s Name or Description.
Chart (Basic)
This is used to display a chart in a Dashboard. This Component can be used with Mobile and
Application Dashboards.
Chart Type
Area
An area chart or area graph displays quantitative data graphically and is based on the line chart.
The area between the axis and the line is commonly emphasized with colors, textures, and
hatchings. Commonly, two or more quantities are compared with an area chart. Area charts are
used to represent cumulative totals using numbers or percentages over time. Use the area chart
to show trends over time among related attributes.
Bar
A bar graph is a chart that uses vertical bars to show comparisons among categories. One axis of
the chart shows the specific categories being compared, and the other axis represents a discrete
value.
Bubble
A bubble chart is a variation of a scatter chart in which the data points are replaced with bubbles,
and an additional Dimension of the data is represented in the size of the bubbles.
CandleStick
A candlestick chart is a combination of a line and bar chart, in that each bar represents the range
of price movement over a given time interval.
Doughnut
A doughnut chart is functionally identical to a pie chart, except for a blank center and the ability to
support multiple statistics at once.
Horizontal Bar
A horizontal bar graph is a chart that uses horizontal bars to show comparisons among
categories. One axis of the chart shows the specific categories being compared, and the other
axis represents a discrete value.
Line
A line chart displays information as a series of data points connected by straight line segments.
Pie
A pie chart is a circular chart divided into sectors, illustrating numerical proportion.
Range
The range chart displays a range of data by plotting two Y values per data point, with each Y
value being drawn as a line chart. The range between the Y values can then be filled with color.
Scatter
A scatter chart displays numerical values along the horizontal and the vertical axis, combining
these values into single data points that are displayed in uneven intervals.
Spline
A spline chart is a specialized form of a line chart. Unlike conventional charts which connect data
points with straight lines, a spline chart draws a fitted curve through the data points.
Spline Area
A spline area chart is a specialized form of an area chart. Unlike conventional charts which
connect data points with straight lines, a spline chart draws a fitted curve through the data points.
Spline Range
The spline range chart displays a range of data by plotting two Y values per data point, with each
Y value drawn as a line chart. The range between the Y values can be filled with color.
Stacked Area
The stacked area chart stacks two or more data series on top of one another.
Stacked Bar
A stacked bar chart stacks multiple data points in each bar on the chart instead of a single data
point.
Stacked Line
A stacked line chart has lines that do not intersect because they are cumulative at each point.
Stacked Spline
A stacked spline chart has lines that do not intersect because they are cumulative at each point.
Stick
A stick chart is a combination of a line and bar chart.
Waterfall
A Waterfall chart is a visualization of the sequence of positive and negative values that arrive at a
final value.
Show Legend
If set to True, a legend will appear in the chart containing a list of the variables and an example of
each one's appearance. This allows the data from each variable to be identified in the chart.
Legend Title
The text to be displayed under the legend.
Legend Position
The location of the legend in relation to the chart. The options are Bottom, Left, Top, Right.
Chart X/Y Axis
Title
The title for the horizontal axis of the chart.
Minimum Value
Manually set the X & Y Axis starting value. This can only be set if Use Automatic Range is False.
Maximum Value
Manually set the X & Y Axis maximum value. This can only be set if Use Automatic Range is
False.
Step
Manually set the change in values displayed on the X & Y Axis. This can only be set if Use
Automatic Range is False.
Chart Data
Cube View
A Cube View from the chart’s Data Adapter is being used as the data source.
Suppress Zeroes
If set to True, this removes all results that are zero.
All Rows
The chart will include all rows.
First Row
The chart will only include the first row.
Column List
A specified list of columns used as X Axis types for the chart. These columns are entered in the
Column List for X-Axis field.
All Columns
All columns are used as X-Axis Members.
Column List
A specified list of columns used as Data Sources for the chart. These columns are entered in the
Column List for Data field.
All Columns
All columns are used as Data Sources.
Default
The legend text comes from the row headers.
Annotation
The legend text comes from the Annotation View Member in the Cube View instead of the default.
The Data Adapter must include this information for this option.
Assumptions
The legend text comes from the Assumptions View Member in the Cube View instead of the
default. The Data Adapter must include this information for this option.
Audit Comment
The legend text comes from the AuditComment View Member in the Cube View instead of the
default. The Data Adapter must include this information for this option.
Footnote
The legend text comes from the Footnote View Member in the Cube View instead of the default.
The Data Adapter must include this information for this option.
Variance Explanation
The legend text comes from the VarianceExplanation View Member in the Cube View instead of
the default. The Data Adapter must include this information for this option.
Chart (Advanced)
This Component can be used with Mobile and Application Dashboards to display charts.
Series Type
Waterfall
A waterfall chart (also known as a Walk or Bridge chart) is a type of chart that provides a visual
story of the net changes of values between two identified points, such as starting and ending
values in income statements, balance sheets, and operational expenses. You can build a
waterfall chart with data supplied in a particular format from a Data Adapter. You can also convert
existing BarRangeSideBySideWaterfall charts to the new waterfall chart type.
Waterfall charts display net changes of a single value per bar. They provide positive variance
values above the bar and negative variance values below the bar, as shown in the following
image.
When you select the Waterfall type chart, the Waterfall Series Properties section provides
configurations specific to the Waterfall chart:
• Start Bar Color: Sets the color of the first bar in the series.
• Total Bar Color: Sets the color of the bar that represents a Total.
• Rising Bar Color: Sets the color of all bars whose values are positive (but are not Start or
Totals).
• Falling Bar Color: Sets the color of all bars whose values are negative (but are not Start or
Totals).
• Total Included in Series: Indicates if the data source includes a total value (a sum). If set
to TRUE, the chart identifies the last value in the data source as a Total bar on the chart. If
set to FALSE, the chart auto-calculates a sum and adds the result as a Total bar on the
chart.
• Include Subtotals: Indicates auto-calculated subtotals added to the waterfall chart. If set to
TRUE, you can indicate where you would like to place the subtotal bar(s). If FALSE, no
subtotals are defined. Default is False.
• Subtotal Indexes: Enter a comma separated list of number(s) to indicate where you would
like your subtotal bar(s) to be placed. For example, to display a subtotal after your first delta
value, you would enter 1. You can also enter 2,4 for example, which would display a
subtotal after the second delta value and also after the fourth delta value. These numbers
count the delta values.
• Subtotal Labels: Enter a comma separated list of labels for each subtotal. Each label
needs to have unique text. The number of labels must equal the number of subtotal indexes
specified. If nothing is listed, defaults to “Subtotal 1,” “Subtotal 2,” “Subtotal 3,” and so forth
(see the example after this list).
• Subtotal Bar Color: Sets the color of bar(s) that represent a subtotal. Defaults to
XFDarkBlueBackground.
• Label Position: Sets the position of the bar labels. Default is Auto. Options include:
  ◦ Auto: Labels are placed above the bar for rising values and below the bar for falling
values.
  ◦ Center: Labels are placed in the center of the bars.
  ◦ InsideEnd: Labels are placed on the inside bottom of the bar for falling values and on
the inside top of the bar for rising values.
  ◦ InsideStart: Labels are placed on the inside bottom of the bar for rising values and
on the inside top of the bar for falling values.
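For example (a hypothetical configuration; the label text is illustrative only), setting Subtotal
Indexes to 2,4 and Subtotal Labels to Gross Margin,Operating Income would insert a Gross Margin
subtotal bar after the second delta value and an Operating Income subtotal bar after the fourth
delta value.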
To create a waterfall chart:
1. Go to Application > Dashboards.
2. Expand Dashboard Maintenance Units and then expand the appropriate maintenance
unit.
3. Click the Components label and then, from the toolbar, click Create Dashboard
Component.
4. In the Create Dashboard Component dialog, click Chart (Advanced) and then click OK.
5. In the Name field, type a name for the new waterfall chart.
6. Scroll down to the Series Properties section and select Waterfall in the Type field.
7. Click Save.
If you have an existing BarRangeSideBySideWaterfall chart type, you can change it to waterfall by
selecting Waterfall in the Type field.
Data Explorer
This is a Component linked to a Cube View and displayed in the standard Data Explorer view
which is seen when a Cube View is launched from within the Analysis area on the OnePlace Tab.
To use this, name the Component and add a Data Adapter linked to a Cube View. This
Component is not compatible with Mobile Dashboards.
Data Explorer Report
This is a component linked to a Cube View and displayed in Report Viewer as if generated on the
fly from a Data Explorer Window. To use this, name the Component and add a Data Adapter
linked to a Cube View. This Component can be used with Mobile and Application Dashboards.
See Report in the Cube Views section.
File Viewer
This is used to present files such as PDF’s stored on the OneStream file share. This Component
can be used with Mobile and Application Dashboards.
There are only three situations where users can see these files:
URL
Display a file from an internal or external web page.
Dashboard File
Display a file stored in a Dashboard Maintenance Unit File section.
Show Header
Select True in order to display a header derived from the Component’s Name or Description.
NOTE: An Extensible Document is a Text, Word, PowerPoint, or Excel file that uses
Parameters in its content. The file name must contain .xfDoc before the extension.
Example: StatusReport.xfDoc.docx
Image
This is used to display an image on a Dashboard. This Component can be used with Mobile and
Application Dashboards.
Formatting
See Label
Image
URL
The image displayed is located on a URL.
Dashboard File
The image displayed is located in a file stored in the Dashboard Maintenance Unit File Section.
URL Syntax
Access a built-in toolbar image, or use an image from any web server
The following Excel properties are used to specify an Excel file from either an Extensible
Document or Report Book in order to display it as an Image or Button on a Dashboard.
Excel Sheet
Enter the name of the Excel worksheet to display.
See Image under Component Display Format Properties to position and align the Excel image as
needed.
Label
This is used to display text strings on a Dashboard. This Component can be used with Mobile and
Application Dashboards.
Formatting
Text (use {1}, {2} for Data Table Cells) (Label Component Only)
The text to display in the Label. Use {1}, {2} to reference cells from the associated Data Adapter’s
Data Table.
Example: Sales and Profit for |!EntityParam!| are {1} and {2}
Tool Tip
The text to display when a user’s mouse hovers over the Component.
Display Format
The formatting assigned to the control. See Component Display Format Properties
The following properties become available if a number is specified in the Data Table Cells field.
First Row
The specified row is the first row of data from the Data Adapter.
Number Format
The formatting applied to a value returned to the label placeholder if a string is not returned.
Logo
This is used to display a logo on a Dashboard. This Component can be used with Mobile and
Application Dashboards.
Formatting
See Label
Password Box
The Password Box displays a text box in which the text is hidden, so as you type, your text
displays as bullets. If you are entering a password, for example, the password characters are
hidden.
When you create a new Password Box, you can enter text for a Tool Tip that displays when you
hover over the text box.
1. Go to Application > Dashboards.
2. Expand Dashboard Maintenance Units, expand the appropriate maintenance unit, and
then click the Components label.
3. Click Create Dashboard Component. The Create Dashboard Component window opens.
4. In the Create Dashboard Component dialog, click Password Box and then click OK.
5. In Component Properties, enter the Name; this field is required. You can also add text for
Tool Tip, which appears when you hover over the text field.
6. Click Save.
7. The new Password Box component now appears in the Components list.
8. Add the Password Box component to the Dashboard and then run the Dashboard.
9. When the Dashboard is generated, you see the text box and, on hover, the Tool Tip.
Pivot Grid
The Pivot Grid is a component within the Dashboard Maintenance Unit that allows you to create a
pivot table, utilizing data from any existing and new Data Adapters to perform multi-dimensional
analysis. At runtime, the user can customize the layout of the report using simple drag-and-drop
operations and conditional formatting based on their analysis requirements. In the Pivot Grid, data
can be summarized, calculated fields (text and decimal) can be added and displayed in a cross-
tabular format that can be sorted, grouped and filtered. The Pivot Grid also supports drill-down (to
view the underlying Data Adapter Results Table). The resulting output can be printed or exported
to various file formats (such as PDF, XLS and XLSX).
All Data Adapters are supported when using the Pivot Grid. Utilizing Cube View MD
(Multi-Dimensional) as the source of a Data Adapter will return the selected Cube View as a
Multi-Dimensional Fact Table and provide an easy-to-use source table for the Pivot Grid. The
results of the Cube View MD are the Dimensions (Entity, Consolidation, Scenario, Time, View,
Account, Flow, Origin, IC, UD1-UD8) as columns.
Component Properties
Name
The name of the Pivot Grid.
Description
A quick description of the Pivot Grid.
Maintenance Unit
The Maintenance Unit to which the Component belongs.
Component Type
The type of Component
Pivot Grid
Row Fields
Enter a comma separated list of Column Names (from the Results Table) to be placed in the Row
Area of the Pivot Grid by default.
Column Fields
Enter a comma separated list of Column Names (from the Results Table) to be placed in the
Column Area of the Pivot Grid by default.
Data Fields
Add a Measure (from the Results Table) to be assigned as the default Measure. Multiple
measures can be added using a comma “,” to separate the column names.
Field Groups
Enter a comma separated list of Column Names (from the Results Table) to be grouped together
as a default.
Save State
Enables the Save Button on the Pivot Grid when set to True and Disables the Save button when
set to False.
When copying a Dashboard that contains a Pivot Grid or Large Data Pivot Grid, the Pivot Grid and
Large Data Pivot Grid component layouts are saved per dashboard location, not by component.
The next time each dashboard is run, the Pivot Grids should have different saved layouts.
Additional information on the use and design elements of the Pivot Grid can be found in the Pivot
Grid documentation.
Report
Text Editor
This is used to display documents created with the Text Editor feature in a Dashboard. It also can
process Extensible Documents when embedded in a Dashboard. This Dashboard Component is
only compatible with the OneStream Windows App. If using any version other than the
OneStream Windows App to see files created with the Text Editor feature within a Dashboard,
information will not display, and an error message appears telling the user that Text Editor
documents are not available in the OneStream browser version. See Text Editor in "Application
Tools" on page 779 for more information.
Dashboard File
Display a file stored in a Dashboard Maintenance Unit File section.
Show Ribbon
Select True to enable users to see the Ribbon below the Menu Bar, select False to hide the
Ribbon.
Text Viewer
This is used to view text and rich text documents similar to those created in Microsoft Word. This
is available in the OneStream Windows App version. Documents cannot be created or edited
using this tool.
Spreadsheet
This is used to display spreadsheet files created with the Spreadsheet feature in a Dashboard.
See Spreadsheet in "Application Tools" on page 779 for more information.
NOTE: Some configuration changes are necessary when a Spreadsheet referenced in a
Dashboard contains functions such as XFGetCell or XFSetCell that reference a custom
Parameter (e.g., ParamEntity) driven by that Dashboard.
In this case, use the XFGetDashboardParameterValue function, which is available in the
Excel Add-In and Spreadsheet, wherever the XLSX file would otherwise reference that
custom Dashboard Parameter value directly in functions such as XFGetCell or XFSetCell.
Spreadsheet
Dashboard File
Display a file stored in a Dashboard Maintenance Unit File section.
Show Ribbon
Select True to enable users to see the Ribbon below the Menu Bar, select False to hide the
Ribbon.
State Indicator
This is used to indicate a specific status on a Dashboard. This Component is not compatible with
Mobile Dashboards.
Formatting
See Label
States
Lamp
Off, Green, Yellow, Red
Arrow
An Up, Down, Left, or Right arrow is displayed
Smile
Very Happy, Happy, Neutral, Sad
Traffic Lights
Off, Green, Yellow, Red
(Not Used)
The State is not used.
Minimum Amount
The minimum amount for the State to be selected.
Maximum Amount
The maximum amount for the State to be selected.
Range
The amount between two values for the State to be selected.
Minimum Amount
The value used in State tests for minimum amount.
Maximum Amount
The value used in State tests for maximum amount.
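For example (the values are illustrative), a State defined with a Minimum Amount of 0 and a
Maximum Amount of 100 is selected when the tested value falls between those two amounts.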
Text Items
The text values used in the types: Equals Text Item, Starts with Text Item, Ends with Text Item &
Contains Text Item.
Web Content
This is used to display a URL or file embedded in a OneStream Dashboard. It can be used with
Mobile and Application Dashboards.
NOTE: This component uses Microsoft Edge WebView2 Runtime to embed the web
content. It must be installed on the client device. If the client device does not have it
installed, you will receive an error when accessing a dashboard that is configured with a
Web Content component.
URL
The URL to a web page. This can be internal or external.
Dashboard File
The image displayed on the button is based on a file stored in the Dashboard Maintenance Unit
File Section.
Show Header
Select True in order to display a header derived from the Component’s Name or Description.
Large Data Pivot Grid
The Large Data Pivot Grid is a Dashboard Component that supports connecting to external tables
or large database tables for “pivot” style analytic reporting. The Large Data Pivot Grid allows the
designer to integrate data, found in external tables, seamlessly into the OneStream environment
through a Dashboard for analytic reporting. The Large Data Pivot Grid’s Paging feature and
Server Based Processing enables the component to manage very large data sets/tables.
In some cases, use of this component will require server configuration settings to allow users the
ability to access the specific database table.
Where Clause
Used to assign global filters to focus the returned results from the source table.
Data Field Aggregation Types
List of comma-separated key value pairs that specify one aggregate function type per data field.
The supported types are Sum, Average, Min and Max.
Page Size
Number of records returned in a page. Defaults to 500 with a maximum value of 3000.
Save State
Enables the Save Button on the Large Data Pivot Grid when set to True and disables the Save
button when set to False.
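For example, assuming the source table contains columns named ScenarioName, EntityName, and
Amount (illustrative names only), a Where Clause of ScenarioName = 'Actual' AND EntityName =
'Houston' limits the returned records, and a Data Field Aggregation Types entry pairing the Amount
column with the Sum function (for example, Amount=Sum; the exact key value syntax shown is
illustrative) totals that field.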
General
Custom Parameters
Click the ellipsis in order to select and assign a custom Parameter such as a Cube View Style to
the Cube View Header. See Cube View Styles in "Using Parameters" on page 220 for more
details on this feature.
IsVisible
This determines whether the Component is visible on the Dashboard. Settings are True, False, or
Use Default.
HorizontalAlignment
This determines how the Component should be aligned horizontally on the area allocated in the
Dashboard.
Left
The Component is aligned on the left of the allocated area.
Center
The Component is aligned in the center of the allocated area.
Right
The Component is aligned on the right of the allocated area.
Stretch
The Component is stretched to fill the allocated area.
Use Default
Use the default setting for horizontal alignment.
VerticalAlignment
This determines how the Component should be aligned vertically on the area allocated in the
Dashboard.
Top
The Component is aligned on the top of the allocated area.
Center
The Component is aligned in the center of the allocated area.
Bottom
The Component is aligned on the bottom of the allocated area.
Stretch
The Component is stretched to fill the allocated area.
Use Default
Use the default setting for vertical alignment.
Width
The width of the control.
Height
The height of the control.
BorderThickness
The thickness of the border used when displaying the control.
FontFamily
The font type used on the Component (e.g., Arial, Times New Roman, etc.).
FontSize
The font size used on the control text.
Bold
If set to True, the font will be bold.
Italic
If set to True, the font will be italicized.
Color
The color of the text displayed in the control.
TextColor
Choose a color from the drop-down options for the control text.
BackgroundColor
Choose a color from the drop-down options for the text background.
BorderColor
Choose color from the drop-down options for the border of the control.
HoverColor
Choose color from the drop-down options to change the hover color of a button Dashboard
component.
Image
HorizontalContentAlignment
This determines how the Image’s content should be aligned horizontally on the area allocated in
the Dashboard.
VerticalContentAlignment
This determines how the Image’s content should be aligned vertically on the area allocated in the
Dashboard.
ImageStretch
This setting allows the image to be re-sized when being displayed.
None
This does not re-size the image.
Fill
The image stretched to fill the control.
Uniform
This preserves the aspect ratio of the image.
UniformToFill
This scales the source image to fill the bounds of the image object while preserving its aspect
ratio and keeps the source image centered within the image object.
Label
LabelPosition
The location where the control label is placed.
None
This does not display a label for the control.
Left
This displays the label to the left of the control.
Top
This displays the label above the control.
Right
This displays the label to the right of the control.
Bottom
This displays the label below the control.
Use Default
Use the default label display location
LabelFontFamily
The font type to use on the label (e.g., Arial, Times New Roman, etc.).
LabelFontSize
The font size to use on the label.
LabelBold
If set to True, the font is bold.
LabelItalic
If set to True, the font is italicized.
Colors
LabelZeroOffsetForFormatting
This holds a number with the default value of 0.0. When a Label is being used to display a
number, and the number is greater than the number in this field, it will be displayed using the color
associated with LabelTextColor. Otherwise, the number will be displayed using the color
associated with LabelNegativeTextColor.
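For example, with the default LabelZeroOffsetForFormatting of 0.0, a Label value of 125.00 (an
illustrative number) is displayed using the LabelTextColor, while a value of -125.00 is displayed
using the LabelNegativeTextColor.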
LabelTextColor
Choose a color from the drop-down options for the label text.
LabelNegativeTextColor
Choose a color from the drop-down options for the label’s negative text.
Embedded Components
Embedded Dashboard
When a Dashboard is created in the Maintenance Unit, an Embedded Dashboard is also created.
These Embedded Dashboards are used for sharing across Dashboard Groups in the same
Maintenance Unit.
Data Adapters
Data Adapters specify the kind of data used within a Dashboard. Once the Data Adapter is
configured and pointing to the appropriate data, attach it to a Dashboard Component in order to
display it on a Dashboard.
General Properties
Name
The name of the Data Adapter.
Description
A quick description of the Data Adapter.
Maintenance Unit
The Maintenance Unit to which the Data Adapter belongs.
Cube View
Choose a Cube View as the source of a Data Adapter. Additional options can be selected here to
include supplemental information for the resulting tables. However, adding on to what is
defaulted may have a slight impact on performance:
Cube View
This command type allows for a pre-configured Cube View to be the Data Source for a
Dashboard. Click and begin typing the name of the Cube View in the blank field. As the first
few letters are typed, the names are filtered making it easier to find and select the one desired. If
the name is unknown, expand a Cube View Group and scroll through the list to select the correct
one. Once the Cube View is selected, click CTRL and Double Click. This will enter the correct
name into the appropriate field.
Include Title
At the creation of the Data Adapter, the default is set to False. When set to True, the title will be
displayed from the Report section of the Cube View as the title for the Dashboard. Settings are
True or False.
Include… POV
If set to True, the POV information for the Cube, Entity and all other Dimensions are included. Use
these if the report or Dashboard needs this information.
Cube View MD
Choose a Cube View MD (Multi-Dimensional) as the source of a Data Adapter. This Command
Type will return the selected Cube View as a Multi-Dimensional Fact Table versus the reporting
table that is returned by the Cube View Command Type. The results of the Cube View MD are
Dimensions (Entity, Consolidation, Scenario, Time, View, Account, Flow, Origin, IC, UD1-UD8) as
columns. This simplifies the report building process in the BI Designer, Pivot Grid, and Dashboard
development. There are additional Loop Parameter options that can be selected here to include
incremental information from the modified Cube View definition in the resulting tables. Adding on
to what is defaulted, however, may have a slight impact on performance.
Results Table from the Cube View MD Data Adapter example above.
Cube View
This command type allows for a pre-configured Cube View to be the Data Source for a
Dashboard. Click and begin typing the name of the Cube View in the blank field. As the first
few letters are typed, the names are filtered making it easier to find and select the one desired. If
the name is unknown, expand a Cube View Group and scroll through the list to select the correct
one. Once the Cube View is selected, click CTRL and Double Click. This will enter the correct
name into the appropriate field.
For example, tbl_OperatingExpenses can be used to identify the Results Table Name and the
Name of the Data Adapter (OperatingExpenses_CVMD in this example) associated with this
table.
Entity
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Entity. When set to Name, the Entity Name will be displayed
in the results table from the Cube View. When set to Description, the Entity Description will be
displayed in the results table from the Cube View.
Consolidation
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Consolidation. When set to Name, the Consolidation Name
will be displayed in the results table from the Cube View. When set to Description, the
Consolidation Description will be displayed in the results table from the Cube View.
Scenario
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Scenario. When set to Name, the Scenario Name will be
displayed in the results table from the Cube View. When set to Description, the Scenario
Description will be displayed in the results table from the Cube View.
Time
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Time. When set to Name, the Time Name will be displayed
in the results table from the Cube View. When set to Description, the Time Description will be
displayed in the results table from the Cube View.
View
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the View. When set to Name, the View Name will be displayed
in the results table from the Cube View. When set to Description, the View Description will be
displayed in the results table from the Cube View.
Account
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Account. When set to Name, the Account Name will be
displayed in the results table from the Cube View. When set to Description, the Account
Description will be displayed in the results table from the Cube View.
Flow
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Flow. When set to Name, the Flow Name will be displayed
in the results table from the Cube View. When set to Description, the Flow Description will be
displayed in the results table from the Cube View.
Origin
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the Origin. When set to Name, the Origin Name will be
displayed in the results table from the Cube View. When set to Description, the Origin Description
will be displayed in the results table from the Cube View.
IC
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the IC. When set to Name, the IC Name will be displayed in the
results table from the Cube View. When set to Description, the IC Description will be displayed in
the results table from the Cube View.
UD1-UD8
At the creation of the Data Adapter, the default is set to Name And Description. This will display
both the Name and Description of the UD1-UD8. When set to Name, the UD1-UD8 Name will be
displayed in the results table from the Cube View. When set to Description, the UD1-UD8
Description will be displayed in the results table from the Cube View.
Loop Parameters
This section allows changes to be made to the output of a Cube View definition being used in a
table for reporting. The Loop Parameter filters the results and considers the additional parameters
to be passed to the Cube View definition and adds those results to the table accordingly.
The parameter(s) override the POV. For example, if the Entity POV is set to CT and the loop filter
parameters are set to NY, MA, and NJ, then data for those Entities will be returned and NOT for
CT. A Loop must be used in order to change parameters.
For example, a Loop Parameter may be used to loop through a list of Entities in the Cube View
definition and return multiple Entities for that specific Cube View. The Dimension Type and
Member Filters should be added here to pass along the appropriate Loop (e.g. Dimension
Type=Entity, Member Filter= E#US.Base) which applies to each Entity included in the Loop.
NOTE: It is recommended to not loop on any Dimensions that already exist in the Cube
View’s rows or columns.
At the creation of the Data Adapter, the defaults for Dimension Types 1 and 2 are set to (Not
Used) and Member Filters 1 and 2 are greyed out. This will display the results without any
consideration of additional parameters to pass to the query. When the Dimension Types are set
along with the Member Filters, the results will consider the additional parameters to be passed to
the Cube View definition and add those results to the table accordingly.
Dimension Type 1
The Dimension Type containing the list of Members. e.g., Entity or Account
Member Filter 1
Enter a Member Filter here to determine what is seen in the Parameter.
At the creation of the Data Adapter, the default for each Dimension Type 1 is set to (Not Used) and
Member Filter 1 is greyed out. This will display the results without any consideration of additional
parameters to pass to the query. When the Dimension Types are set along with the Member
Filters, the results will consider the additional parameters to be passed to the Cube View definition
and add those results to the table accordingly.
The name of the Dimension containing the list of Members. Start typing in the blank field OR click
the ellipsis button in order to launch the Member Script Builder and enter a Member Script to
change the Cube View definition. The example below is changing the POV for the Products.
Example: UD2; U2#Top.Base
Dimension Type 2
At the creation of the Data Adapter, the default for Dimension Type 2 is set to (Not Used) and
Member Filter 2 is greyed out. This will display the results without any consideration of additional
parameters to pass to the query. When the Dimension Types are set along with the Member
Filters, the results will consider the additional parameters to be passed to the Cube View definition
and add those results to the table accordingly.
Member Filter 2
Enter a Member Filter here to determine what is seen in the Parameter.
At the creation of the Data Adapter, the default for Dimension Type 2 is set to (Not Used) and
Member Filter 2 is greyed out. This will display the results without any consideration of additional
parameters to pass to the query. When the Dimension Types are set along with the Member
Filters, the results will consider the additional parameters to be passed to the Cube View definition
and add those results to the table accordingly.
The name of the Dimension containing the list of Members. Start typing in the blank field OR click
the ellipsis button in order to launch the Member Script Builder and enter a Member Script to
change the Cube View definition. The example below is changing the POV for the Products.
Example: UD2; U2#Top.Base
Dimension Leveling
The Dimension Level property setting in the Cube View MD Data Adapter displays dimensional
data as a hierarchical tree in the BI Viewer. Dimension leveling allows you to display data in a
hierarchical structure into which you can drill down to view child data. After the data is leveled, you
can use the BI Viewer to view data as a tree, a pivot table, a chart, or a grid.
Prerequisites
To use dimension leveling to display hierarchical data, you must first create a cube view MD data
adapter with data.
1. Inside the OneStream application, click the Application Dashboards tab at the bottom of
the screen.
6. In the Command Type field, click the drop-down arrow and select Cube View MD.
7. Click the Edit button at the far right of the Cube View field.
8. In the Object Lookup dialog box, select the appropriate Cube View and then click OK.
9. In the Dimension to Level field, click the drop-down arrow and select the appropriate
leveling option. With this step, you are leveling on this dimension, which in this case is
Entity.
l Outermost Row - hierarchy uses the first row and first level (of row) definition.
l Outermost Column - hierarchy uses the first column and first level (of column)
definition.
l Both - hierarchy uses the first row and first level definition and uses the first column
and first level (of column) definition.
NOTE: When the Data Adapter is run, the Data Table will generate the
additional columns for the levels, including columns to determine the
status of the level.
12. Test the data adapter by clicking Test Data Adapter. The Data Preview dialog box
displays the table data.
NOTE: When Data Adapter runs, the Data Table generates the additional
columns for the levels including columns to determine the status of the level.
c. In the Create Dashboard Component dialog box, click BI Viewer and then click OK
d. On the Component Properties tab, in the Name field, type the name of the new
component.
g. In the Add Data Adapter dialog box, select the appropriate data adapter and then
click OK.
h. Click Save.
b. In the Data Source field, click the drop-down arrow and select the data source from
which to pull data. This is usually a table.
c. Drag each dimension level that you want to view from the Table view into the Data
Items column under Dimensions.
d. If you want to filter items based on whether they are base items, drag the
RowNumberIsBase item into the Data Items column but under Hidden Data Items.
This item will not display in the resulting dashboard but will be available for you to
filter on if necessary.
e. From the BI Designer ribbon, select the type of dashboard item you want to view in
the dashboard. For example, select Filter Element > Tree View. In the resulting tree
view in the dashboard notice that you can expand parent members down to their
base child members.
If you add a Grid, all levels of the parent entity display in their own columns. If you
want to see a pivot table, select Pivot from the ribbon and notice that each level is
expandable similar to tree view.
Conclusion
With Dimension leveling, you can view data in an easy-to-understand, hierarchical format. With
the BI Viewer, you can design various dashboard items in which to view the dimension leveled
data.
Method Query
TIP: To view an example Method Query, leave the Method Query field blank, click Save, and then
run Test Data Adapter; the returned results include an example of the required syntax.
Method Type
BusinessRule
Use the Business Rule option when creating a custom rule to incorporate within a Method Query.
The Business Rule is used as the first set of {} within the Method Query.
CertificationforWorkflowUnit
This lists all Certification Questions for the particular Workflow Unit.
Example Method Query: {Workflow Profile Name}{Scenario Name}{Time Name}{Include
Descendants}{} or {Dallas}{Actual}{2011M2}{true}{}.
ConfirmationforWorkflowUnit
This lists the Confirmation Rules results for a particular Workflow Unit.
Example Method Query: {Workflow Profile Name}{Scenario Name}{Time Name}{Include
Descendants}{Filter} or {Montreal}{Actual}{2011M6}{true}{}
DataUnit
This returns all rows of data related to the specified Data Unit (i.e. Cube, Entity, Parent,
Consolidation Member, Scenario, Time and View).
Example Method Query: {Cube}{Entity}{Parent}{Cons}{Scenario}{Time}{View} {True}{Empty
String or Filter Expression}
DataUnitComparison
This returns all rows from two different Data Units specified for comparison purposes.
Example Method Query: {Cube1}{Entity1}{Parent1}{Cons1}{Scenario1}{Time1}
{Cube2}{Entity2}{Parent2}{Cons2}{Scenario2}{Time2}{View}{True}{True}{Empty String or Filter
Expression}
Excel
This returns data sourced from an Excel file.
FormsStatusForWorkflowUnit
This lists detailed information about the Forms for a particular Workflow Unit.
Example Method Query: {Workflow Profile Name}{Scenario Name}{Time Name}{Form Status}
{Filter} or {Houston}{Actual}{2011M1}{All}{}
Groups
This returns the Group ID, Name, Description and whether or not this is an Exclusion Group.
Method Query Example: {GroupName = 'FinanceGroup'}
GroupsforUsers
This selects User properties and all of the Groups to which the user belongs. It returns the same
group properties as the Groups Method Query.
Method Query Example: {UserName = 'Administrator'}{}
ICMatchingforWorkflowUnit
This returns a detailed Intercompany Matching Report table for the given Workflow Unit and
several other Parameters. The Parameters here override what is already set up in the Workflow
Profile.
Method Query Example: {Workflow Profile Name}{Scenario Name}{Time Name}{Plug Account
Override}{Suppress Matches Override}{Tolerance Override}{Filter} or {Flint}{Actual}{2011M1}
{Empty String or A#MyPlugAccount}{Empty String or true/false}{Empty String or 0.0}{Empty
String or Filter Expression}.
ICMatchingPlugAccountsforWorkflowUnit
This returns the list of Intercompany Plug Accounts for a given Workflow Profile and Scenario
Type configured for the Workflow Profile.
JournalforWorkflowUnit
This lists the Journals entered for a given Workflow Unit.
Example Method Query: {Workflow Profile Name}{Scenario Name}{Time Name}{Journal Status}{Filter} or
{Frankfurt}{Actual}{2011M3}{All}{}
Members
This returns Dimension ID, Member information such as ID, Name and Description, and a few
other properties for the chosen Dimension and Member Filter.
{Account}{MyAccountDim}{A#Root.TreeDescendants}
{Empty String or Filter Expression}
UserCubeSliceRights
This lists each user’s Data Access settings on a given Cube.
Example Method Query: {UserName}{CubeName}{Filter} or {AllUsers}{AllCubes}{}.
UserEntityRights
This returns the Cubes and Entities the user has access to according to the security settings
under Entities.
Method query example: {UserName}{CubeName}{Filter} or {US Clubs Controller}{AllCubes}{}.
Users
This returns all properties associated with the chosen User Name.
Method query example: {UserName = 'Administrator'}
UserScenarioRights
This returns all accessible Scenarios and many related Scenario properties for the chosen User
Name filter and Cube.
Method query example: {AllUsers}{AllCubes}{Empty String or Filter Expression}
UserinGroups
This returns a list of Users and selected User properties for the chosen User Group.
Method Query Example: {GroupName = 'FinanceGroup'}{}
UserWorkflowProfileRights
This lists the rights assigned to users for Workflow Profiles.
Method query example: {User Name}{Workflow Cube Name}{Workflow Profile Type}{Filter} or
{Administrator}{GolfStream}{AllProfiles}{}.
WorkflowandEntityStatus
This returns properties for Workflow status, status code/description, last executed step, date/time
information, completed steps, and data status for the chosen Workflow Unit including both the
Workflow Profile level and individual Entity level.
Method query example: {MyWorkflowProfileName}{Actual}{2011M1}
{AllProfiles}{Descendants}{Empty String or Filter Expression}
WorkflowCalculationEntities
This lists the Entities that appear under Calculation Definitions for this Workflow Profile.
WorkflowConfirmationEntites
This lists the Entities that appear under Calculation Definitions with a Confirmed check box for this
Workflow Profile.
WorkflowProfileandDescendantEntities
This creates a list of Entities and all descendants located under Entity Assignment for this
Workflow Profile.
WorkflowProfileEntities
This creates a list of Entities located under Entity Assignment for this Workflow Profile.
WorkflowProfileRelatives
This lists related Workflow Profiles based on certain criteria.
Method query example: {Workflow Profile Name}{Scenario Name}{Time Name}{Workflow Profile
Type}{Relative Type}{Include Requesting Profile}{Filter} or {GolfStream}{Actual}{2011M1}
{AllProfiles}{Descendants}{true}{}.
WorkflowProfiles
This lists the Workflow Profiles.
Method query example: {WorkflowProfileType}{Filter} or {AllProfiles}{Type = 'InputAdjChild'}.
WorkflowStatus
This lists the status, lock status and last step completed of a given Workflow Unit.
Method query example: {Workflow Profile Name}{Scenario Name}{Time Name}{Workflow Profile
Type}{Relative Type}{Filter} or {Houston}{Actual}{2011M1}{AllProfiles}{Descendants}{}
WorkflowStatusTwelvePeriod
This returns status value and text summary for 12 months for a given Workflow Profile, Scenario
and Year. For Workflow Profile Type, options are AllProfiles, CubeRoot, Default, Review,
BaseInput, InputImportChild, InputFormsChild, InputAdjChild and ParentInput.
Method query example: {MyWorkflowProfileName}{Actual}{2011}{AllProfiles}
{Descendants}{Empty String or Filter Expression}
Method Query
The query run for this Data Adapter.
WorkflowLockHistory
Displays all Lock history details in a report for a given Workflow Profile with the ability to filter by
Scenario, Time, Workflow Profile, Origin, Channel, User and Lock Status.
Method query example:{Workflow Profile Name}{Scenario Name}{Time Name}{Workflow Profile
Type}{Relative Type}{Filter}
Journal Status
Approved, Posted, Rejected, Submitted
SQL
A SQL query against either the Application or Framework database can be written as a Data
Source. Reference substitution variables such as |WFProfile| from within the SQL statement.
Database Location
Application
The current OneStream Application database where Stage and financial Cube data resides.
Framework
The connected OneStream Framework database where security and log data resides.
External
Any other database outside of OneStream.
SQL Query
The SQL statement run for this Data Adapter.
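A hypothetical example is shown below; the table and column names are illustrative only and do
not refer to actual OneStream system tables. The |WFProfile| substitution variable is resolved
before the query is executed.
SELECT EntityName, AccountName, Amount
FROM XFC_MyCustomStageTable
WHERE WorkflowProfileName = '|WFProfile|'
ORDER BY EntityName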
BI Blend Adapter
The purpose of the BI Blend Adapter is to provide Dashboard designers with a pre-defined
interface for querying BI Blend tables. The data sourced by the BI Blend Adapter can be used to
design standard OneStream Dashboards.
The BI Blend Workflow process is designed to generate large external database tables formatted
in a column store index optimized for analytic reporting. The overall number of records that may
be generated by the BI Blend process may be too large for the Dashboard Components to
process. Therefore, the designer should manage the returned dataset by defining an appropriate
Where Clause to retrieve a “slice” of the BI Blend table which is suitable for the BI Blend Adapter.
Table Info
This defines the name of the BI Blend table to be queried. This label supports the use of
Parameters and Substitution Variables.
Group By
This defines the source database columns, and their order, which are to be returned to the
adapter. These labels must match the names specified on the database. Any field to be pivoted
must be in the Group By list. The following aggregation function types are supported for the
returned data fields:
l Sum
l Min
l Max
l Avg
l Count
The required syntax defines each Aggregation Type result as a unique key:
AggregationType = [databaseColumnName, Type]
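For example, following the syntax above and assuming a source column named Amount (an
illustrative name), two aggregations could be defined as: SumAmount = [Amount, Sum],
AvgAmount = [Amount, Avg].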
Where Clause
This is used to “pre-filter” the table. The BI Blend Adapter will query the source table and return all
the results to the client for processing, such as for use in a Pivot Grid. The size of the query must
be managed to ensure the overall performance of the Dashboard report is optimized. The Where
Clause uses standard SQL syntax to define the filters which are applied to the query.
l DatabaseColumn = ‘Text’
l Stage Summary Target Data: Query designed to automatically determine the method
used to manage Summary Target Data, as Row or Blob, and appropriately return the
results.
Method query example: {Workflow Profile Name}{Scenario Name}{Time Name}{Filter}
Parameters
Parameters prompt the user for values that help filter data in the resulting Dashboard. They are
not required for Dashboards, but once administrators learn how to use them, they will likely add
several to their Dashboards. See "Using Parameters" on page 220 for more details on how to
use Parameters in Dashboards and Cube Views.
Description
A quick description of the Parameter.
User Prompt
This is a free form message that will appear to users giving them a specific instruction.
Maintenance Unit
A Parameter is created and then assigned to a Dashboard Maintenance Unit. This cannot be
changed once the Parameter is created.
Sort Order
This organizes and puts the Parameters in order.
Input Value
This Parameter will prompt the user for a value to type in by hand.
Default Value
For the Input Value Parameter, leave the Default Value blank. This will allow a user to manually
key in an input value.
Literal Value
A Literal Value Parameter specifies a fixed Parameter value to replace a custom substitution
variable name.
Default Value
This will allow a user to manually key in an input value.
Delimited List
This is a list of options that mean something to the user but are associated with Values that are
linked back to the Cube.
Default Value
This will allow a user to manually key in an input value.
Bound List
This is a list of Members created by using a predefined Method Query or entering a specific
expression to get the Members wanted in the list.
For example, to list all Entities from a Dimension, enter a Method Query using the Members
Method Type, specifically:
{Entity}{CorpEntities}{E#Root.TreeDescendants}{}
Default Value
This will allow a user to manually key in an input value.
Custom
This is the custom option to customize the format string. When Custom is chosen, the Result
Custom Format String is enabled.
For example, if a Parameter's value is Blue and the custom format string is "The color is {0}.", then
the display text would be The color is Blue.
Command Type
Method
This Parameter will use a Method Query
SQL
This Parameter will use a SQL Query.
If Method is chosen for Command Type, the following properties are available:
Method Type
See Method Query under Data Adapters for details on these options.
Method Query
The query run for this Parameter.
Display Member
Determines the returned display of the member. Enter the field as “Name” or “Description” to
display results as required.
Value Member
Enter the name of the Member as it appears in the application.
If SQL is chosen for Command Type, the following properties become available:
Framework
The connected OneStream Framework database where security and log data reside.
External
Any other database outside of OneStream.
SQL Query
The SQL statement run for this Parameter.
TIP: If the Method Query syntax is unknown while building Parameters, simply leave it
blank and click this button in order to run the Parameter and get a sample set of
results. This will reveal what the syntax is and include an example.
Member List
This is a list of Members displayed based on a Member Filter for a Dimension.
Default Value
This will allow a user to manually key in an input value.
Display Member
Determines the returned display of the member. Enter the field as “Name”, “Description” or
“Name and Description” to display results as required.
Cube
The name of the Cube containing the list of Members. Click and begin typing the name of the
Cube in the blank field. As the first few letters are typed, the names are filtered making it easier to
find and select the one desired. Once the Cube is selected, click CTRL and Double Click. This
will enter the correct name into the appropriate field.
Dimension Type
The Dimension Type containing the list of Members. e.g., Entity or Account
Dimension
The name of the Dimension containing the list of Members. Click and begin typing the name
of the Dimension in the blank field. As the first few letters are typed, the names are filtered making
it easier to find and select the one desired. If the name of the Dimension is unknown, expand the
correct Dimension Type, and scroll through the list. Once the Dimension is selected, click CTRL
and Double Click. This will enter the correct name into the appropriate field.
Member Filter
Enter a Member Filter here to determine what is seen in the Parameter.
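For example, a Member Filter of E#TotalGolfStream.Base (the member name is illustrative) would
populate the Parameter with all base Entities under TotalGolfStream.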
Member Dialog
This is a list of Members displayed in a Dimension hierarchy based on the Member Filter for the
Dimension. See Member List for property definitions.
TIP: Dashboard-based Parameters are very similar to those used by Forms. If a Form is
run and it references a Parameter by name and the name cannot be found in that Form
Template, it will look for a parameter of the same name under Dashboard Parameters
and use it. If a Dashboard Parameter was already created, it can be used for
Dashboards and Forms.
Files
Administrators use the File option to upload images and documents to create personalized,
company-specific Dashboards. Items such as PDF report books and Word or Excel Extensible
Documents can also be stored and used in Dashboards.
Strings
Strings are set up as an object type under a Dashboard Maintenance Unit. These Strings can be
used with XFString functionality and displayed in a Cube View as a Caption or Heading, for
example. See section on Report Alias for details on use.
Dashboard Profiles
Once a Dashboard Group is completed, it is organized into a Dashboard Profile. The Profiles are
then assigned to various functions throughout XF. Choose the icon to create a new Profile.
Choose the icon to assign a Group to a Profile. This allows the user to select which Groups
will be in the Profile.
The Dashboard’s Visibility setting controls what Dashboards are viewed throughout the
application. This is the key property for all Dashboard Profiles. The options are as follows:
Never
This is used for Dashboards that are expired, or no longer being used.
Always
This allows the Dashboard to be available in OnePlace and Workflow.
OnePlace
This allows the Dashboard to be viewed in the Dashboard section under the OnePlace Tab.
Workflow
This allows the Dashboard to be attached to a Workflow Profile. This can be completed in
Workflow Profiles under the Integration Settings and Data Quality Settings.
Previewing Dashboards
Once in preview mode, Parameters may need to be entered, and then click OK to get the
Dashboard to run. Click to go back to Parameters and run the Dashboard again under a
different input value.
The example above shows a Dashboard with two Chart Components and one Data Explorer
Component.
In the example above, there are three Components listed in the Dashboard Components tab in
the order in which they appear on the screen.
Component 1 has the Dock Position set to Bottom and Height to 65%.
Component 2 has the Dock Position set to Left and Width to 60%.
Component 3 has the Dock Position set to any position, such as Right, and all other settings were
left blank. The Dashboard will automatically dock this last Component to fill the rest of the
available space.
Set layout type to Tabs in order to view each Component in its own tab.
By right clicking on a Profile under OnePlace Dashboards on the left in the Navigation Pane, a
user can export all the Dashboards in the Profile out to a combined PDF file. Right click and
choose Export All Reports in this Profile… and then choose Combined PDF File or PDFs in Zip
File. This is a quick and easy way to produce Report books. The only exception is that currently
graphs and charts do not export to PDF in these mass exports.
Table Views
Accessed via the OneStream Windows Application Spreadsheet, Table Views provide a
client-based tool to support Dashboard forms. Table Views are not intended as an alternative to
other tools, such as the SQL Table Editor or Grid Viewer Dashboard Components.
Key Use
l Utilize client-side functionality, found in the Spreadsheet tool, such as calculations and
pick-list validation lists
Design Considerations
l Controlling elements must be designed into the Table View Business Rule by the creator to
ensure data integrity, security and performance
l Spreadsheet support of Table Views depends upon the number of rows and row content
l The Spreadsheet Control does not support paging, therefore all rows and content must be
returned
Overview
A Table View definition for the Windows Application Spreadsheet Tool is defined in a Business
Rule. The Administrator designing the rule can define the rows and columns which should be
returned to the Spreadsheet from the source table presented in the Table View.
The Table View Business Rule can collect data from multiple data sources. For example, a single
Spreadsheet worksheet can display a Table View which collects data from two or more sources.
The Administrator has full control over the write back “save” process through Business Rules.
When designing the Table View Business Rule, the BRAPI Authorization functions should be
designed into the Business Rule to control access to viewing or modifying the data. This can
be applied to the entire table or to specific cells. A workbook can contain multiple Table Views.
These can be on the same worksheet or across worksheet pages.
A single Business Rule file can be used to define multiple Table Views by referencing the Business
Rule argument, TableViewName. Additionally, table data cells within the Spreadsheet are managed
using user defined named ranges (XFTV_*).
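For example, a worksheet range named XFTV_SalesInput (the suffix is a designer-chosen identifier
and is illustrative only) marks the cells managed by a particular Table View.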
NOTE: When browsing for Table Views Business Rule, only Spreadsheet Type Rules
will be displayed in the Object Lookup dialog.
User Culture
A User’s culture is set in the Culture property field in each User’s configuration. This is located
within the Security section in the System tab under Administration.
When the User is selected, a property grid will display. The Preferences tab within the property
grid will contain an option for Culture which can be selected from the dropdown field. These
options are controlled in the OneStream Server Configuration Utility and additional languages can
be added there as needed.
In this example, select the Dimension tab (e.g. Account Dimension shown below). Next select the
desired Dimension members (e.g. 61000 – Gross Income) to be updated. In the Descriptions field
under the Member Properties tab, enter the translated description (e.g. Résultat d'exploitation) next
to the language (e.g. French) being updated.
From the Application tab, select Cube Views, then select an existing Cube View from the Cube
View Groups area (or create a new one). From the Designer tab click on General Settings and
then select the Header Text box.
The Culture setting is associated with how the Member Description will be displayed in the Cube
View. This setting can be set multiple ways to display in different languages associated with the
application. Selecting the ellipsis (…) at the end of the Culture field will provide options to define
how the Members are displayed in the Cube View:
Current (default) – When the Culture field is blank, it reacts the same as Current: the current User’s
Culture setting is used to apply a Report Alias during Cube View rendering.
Invariant will display the Members in their Default Description as defined in the Member
Properties of that item.
Selecting other cultures (e.g. fr-FR, as shown below) will display Members in their respective
languages for any User running the Cube View.
In the example below, we selected a Culture of “fr-FR” and saved the Cube View to display the
Members in French.
The results display the Account Members in French for the given Cube View.
The Culture setting can also be set as a prompt to provide the option to change the language each
time the Cube View renders. The user can type in a Parameter to be used to prompt the user for a
culture that will override the user’s culture setting established in their security setup.
Below, the Parameter |!Enter Culture!| was entered in the Culture field and the Save button was
selected.
Run the Cube View via the Open Data Explorer button
The user is prompted to enter the Culture, which they can type in. Alternatively, there could be a
Parameter set up under Application Dashboards which is a delimited list of applicable Cultures for
this Application.
The user can modify the Culture to reflect English by typing in “en-US” and clicking the OK button.
Example: In the Application tab, select Dashboards under the Presentation section. In the
Dashboard Maintenance Units section, we selected the “Test Maint Unit” group.
This adds a new String called MyAliasReportingString. This will be used in our Cube View
example to display the Page Caption of the Cube View report in multiple languages based on the
User’s default culture setting and via a parameter to prompt the user for a specific culture to
render that respective language.
Below are the Descriptions for the Languages set forth in this String:
In the Cube View within the Designer tab under the Common section there is a Page Caption
property in the General section. The Page Caption section will be updated with an XFString
function to call a String from the Dashboard Maintenance Units.
Using a parameter within the Page Caption in this instance will prompt the user to type in the
culture of the language they wish the report caption to render.
The results shown below reflect the page caption of “Gross Income Report” in French:
For example, in the Cube View via Data Explorer, click the Export To Excel button.
This will render the Cube View in Excel in the respective language.
Using Cube View Connections in Spreadsheet and Excel Add-in will also render the Cube View in
the language that is established. Open a new or existing Spreadsheet and select Cube Views >
Cube View Connections in the OneStream menu.
Click the ellipsis in the next Cube View Connection dialog box.
Search for your Cube View (Object Type) and then select and click OK.
In the Cube View Connection dialog box, check your settings and click OK.
The Cube View selected contains prompts for Culture and Time Periods; enter those
accordingly and click OK.
This Cube View is now added. Click Close to view the results.
The results show the Cube View connection with the Language updates (Report Alias) applied in
the Cube View.
Object Lookup
Use this dialog throughout the application to look up an object to assign rather than having to
remember a specific name or format.
When designing a Cube View Header, click the Object Lookup icon and select Parameters (with
Pipes) from the Object Type menu. If the name of the specific Parameter is known, begin typing it
in the filter at the top of the dialog. Select the desired Cube View Styles Parameter, such as
DefaultHeader, and click Copy to Clipboard or click CTRL/Double Click to copy.
Next, click into the empty field where the copied Parameter needs to go, and click CTRL+V. This
will paste the Parameter into the Cube View field.
Then paste it into the Cube View Name For Sharing All Rows property.
Example
{XF}{Application}{Chart}{Waterfall}
{XF}{Application}{ChartReport}{Waterfall}
NOTE: Use this for all Chart (Advanced) Dashboard Components.
{XF}{Application}{ItemType}{CubeViewName}
Example
{XF}{Application}{CubeViewReport}{BalanceSheetSummary}
{XF}{Application}{ItemType}{FilePath}
Example
{XF}{Application}{ExcelFile}{Documents/Users/jsmith/Favorites/VarianceReport.xfDoc.xlsx}
NOTE: For Excel Named Range Item Types, the Excel Named Range Name is
configured in the formatting string. (e.g., ExcelNamedRange=TotalAssets)
PDF
To insert a PDF into an Extensible Document, the following string needs to be updated and
pasted into the Title field when configuring the image:
{XF}{Application}{ItemType}{FilePath}
Example
{XF}{Application}{FileViaPDF}{Documents/Users/jsmith/Favorites/IS.pdf}
Report
To insert a Report Dashboard Component into an Extensible Document, the following string
needs to be updated and pasted into the Title field when configuring the image:
{XF}{ItemLocation}{ItemType}{ReportComponentName}
Example
{XF}{Application}{Report}{UserTaskActivity}
Options
The following options are available when formatting Extensible Document Image Content. Each
Image Type has a formatting string which is located in the Object Lookup Dialog under Extensible
Document Settings|Insert Content in Office Image. The user can copy and paste the desired
formatting string into the Description field when configuring the image. If changes need to be
made to the string, delete the current option and replace it with the correct one. The list below
covers all the formatting options for every Image Type.
Item Location
Application, System
Item Type
Chart, ChartReport, CubeViewReport, ExcelFile, FileViaPDF, Report
FillMode Options
Width, Height, LargestSide, SmallestSide
Anchor Options
BottomCenter/Left/Right, MiddleCenter/Left/Right, TopCenter/Left/Right
Cropping Options
This allows a user to narrow in on a portion of the image before other settings are applied. The
default setting is 0 which means cropping is not being used.
CropLeft, CropTop, CropWidth, CropHeight
XFCell
XFCell is a retrieve function used mainly in text documents such as Microsoft Word or
PowerPoint. The specific Dimension details provided in the function obtain a single cell of data
from OneStream and display the updated value on an Extensible Document at run-time.
Examples of common XFCell formulas are provided below. See Extensible Document
Framework in "Presenting Data With Extensible Documents" on page 235 for more details on how
to configure XFCells in an Extensible Document.
The following example pulls data for a specific Entity and Account:
XFCell(E#US:A#Sales)
NOTE: Any Dimensions not specified in the formula will come from the user’s POV.
Culture Invariant is a default culture not associated with a specific country. It is typically used
when users need to convert a number to a string, but do not want the result to be different if one
user is running the string on a French PC and another user is using an English PC.
XFCell(E#US:A#Sales, Culture=Invariant)
Include Member Scripts in XFCells in order to retrieve data for specific Dimension Members.
Culture User is based on the computer’s Windows settings and the User settings.
XFCell(Memberscript, Culture=User, NumberFormat=N3,DisplayNoDataAsZero=True, Scale=3,
FlipSign=True, ShowPercentageSign=False)
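For example, a fully qualified retrieval might look like the following (the member names are
illustrative): XFCell(E#Houston:S#Actual:T#2011M3:A#Sales, Culture=User, NumberFormat=N2,
FlipSign=False).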
Example
BR#[BRName=FMK_Helper, FunctionName=GetUserSetting, SolutionID=MBL, FieldName=
[Period]]
Application Tools
Various application tools provide you the ability to manage how you work with the OneStream
application. These tools include setting application security roles, setting application properties,
and scheduling data management sequences. There are a wide variety of tools at your disposal to
help configure the application according to your needs. In this section you will learn how to use
these tools and others to manage the application environment.
Administer Application
This role allows a user to administer the application and load zip files. This is useful when multiple
applications exist in one environment and different groups of administrators/users need to
administer separate applications.
Administrator Database
This application-level role is intended for a few people that are allowed to mass delete metadata
and data, primarily using the database page.
This role type is unlike most other role types because Administrators are not automatically given
access to operations that require this role.
Open Application
This allows the user to see and open the application.
Modify Data
This allows the user to modify data. A user without this role is effectively read-only throughout
this application.
Manage Metadata
This allows a user to edit metadata under the Dimension Library for this application.
Manage FX Rates
This allows a user to update FX Rates.
Manage Data
This allows users to manage data in all aspects, including, but not limited to, exporting data and
clearing data through Data Management. This is typically an administrator function.
Below are the specific Application-Level User Interface Roles and what they control:
BookAdminPage
This gives access to the Book Designer screen located in |Application|Presentation|. This is
typically restricted to administrators, or any users who create Report Books.
FX Rates Page
This gives access to the FX Rates screen located in |Application|Cube|. This is typically
restricted to administrators.
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered making it easier to find and select
the desired Group. Once the Group is selected, click CTRL and Double Click. This will
enter the correct name into the appropriate field.
Application Properties
This is where default properties are set for the application and for properties that differ by
Scenario Type.
General Properties
Global Point of View
These are enabled when forcing Global Scenario and Global Time through Transformation
settings. An initial value should be configured even if the Transformation setting is not being
used.
Global Scenario
This is the default Scenario users will see when looking at Workflow.
Global Time
This is the default Time users will see when looking at Workflow.
Company Information
Company Name
Place company name here for it to appear on automatically generated reports from Cube Views.
Workflow Channels
UD Dimension Type for Workflow Channels
The Origin Dimension controls data load, but in some cases other User Defined Dimensions
require their own layer of locking. For example, a company plans by Entity by Product. One Entity
can have five products done by different people. Each channel can be locked separately to
protect that layer of data instead of locking the entire Entity.
Formatting
Number Format
This shows the format for numeric values displayed throughout the application. Configure to
show additional degrees of precision to the right of the decimal point. This setting can be
overridden through Cube View formatting.
N0
This setting will not show any decimals or zeroes
N1-N6
These settings will show the corresponding number of decimal places. If N2 is chosen, two
decimals are displayed, N5 will display five decimals, etc.
NOTE: The N in the above settings indicates that these settings are international.
#,###,0\%
This returns 10,000% and -10,000%
#,###,0
This returns 10,000 and -10,000
The three sections in a number format, separated by semi-colons, represent the format for
positive numbers, negative numbers and zeros.
#,###,0;(#,###,0);0
This returns 10,000 and (10,000)
#,###,0.00%
This returns 10,000.00% and -10,000.00%
#,###,0.00
This returns 10,000.00 and -10,000.00
#,###,0.00;(#,###,0.00);0.00
This returns 10,000.00 and (10,000.00)
NOTE: In order to vertically align positive and negative numbers in reports where
parenthesis are used for negative values, include trailing spaces in the positive number
format. This will account for the trailing parenthesis used by negative numbers.
Example: #,###,0.00 ;(#,###,0.00);0.00
NOTE: Click , select the desired format, and click CTRL and Double Click. This will
enter the correct format into the appropriate field.
Currencies
All currencies used in the application must be listed here in order to be used on the Entity, to do
any translation of currency, or enter any rates. The list of currencies will include any available
currencies that are pre-Euro, or phased out currencies for historical data loading purposes. If a
now defunct or new currency is not listed and is needed for the application, call OneStream
Support.
Transformation
Enforce Global POV
If set to True, this will enforce the current Global Scenario and Time setting for all users, so they
cannot change their Workflow View. If the Global POV is enforced, an indicator will display on the
Import task during the Workflow. An indicator will also display on the Import task for periods prior
to the current Workflow year and for periods after the current Workflow year.
Certification
Lock after Certify
If set to True, this will auto lock after certification in the Workflow.
Dimension Properties
Time Dimension
Start Year
The starting year of the application.
End Year
The ending year of the application.
l Point of View
l Dimension Library
UD1-8 Description
Enter a generic description that best describes the purpose of each User Defined dimension. See
example above.
Standard Reports
These settings will be applied when auto-generating a report from a Cube View.
Logo
Height
Enter a numerical value to determine the Height of the logo (e.g. 105 pixels).
Bottom Margin
Enter a numerical value to determine the Bottom Margin size.
Title
Top Margin
Enter a numerical value to determine the Top Margin size.
Font Family
The font displayed in the Title of the report.
Font Size
Enter a numerical value to determine the size of the font.
Bold
If set to True, the Title will be bold in the report.
Italic
If set to True, the Title will be in italics in the report.
Text Color
Use the ellipsis icon to choose a text color for the report Title.
Header Labels
This is where the default Header Label properties are defined for all the reports in the application.
Top/Bottom Margin
Enter a numerical value to determine the Top/Bottom Margin size.
Font Family
The font displayed in the Header Labels of the report.
Font Size
Enter a numerical value to determine the size of the font.
Bold
If set to True, the Header Labels will be bold in the report.
Italic
If set to True, the Header Labels will be in italics in the report.
Text Color
Use the ellipsis icon to choose a text color for the Header Labels.
Header Bar
Background Color
Use the ellipsis icon to choose a Header Background color.
Line Color
Use the ellipsis icon to choose a Header Line color.
Footer
Text
An open field to enter footer text.
Font Family
The font displayed in the Footer.
Font Size
Enter a numerical value to determine the size of the font in the Footer.
Show Line
If set to True, the report will show a line in the Footer.
Show Date
If set to True, the report will show the date in the Footer.
Line Color
Use the ellipsis icon to choose the color of the line in the Footer.
Text Color
Use the ellipsis icon to choose a text color in the Footer.
Business Rules
Business Rules contain calculation logic configured to run against different parts of an application.
These rules are compiled with VB.NET or C# code and are created within a rich integrated
development environment, or IDE.
Download the OneStream API Overview Guide and OneStream API Details and Database
Documentation from MarketPlace for detailed Business Rule engine background, an API guide
and information on each database related to OneStream.
There are several areas in the product using the exact same rule syntax and applying it to how
data is imported, how the Cubes are calculated, and other operations. Once this syntax is
understood, logic can be written.
Business Rules
Business Rules are found under the Application Tab|Tools. There are nine types of Business
Rules as shown below. These can be stored, secured, and then assigned to multiple areas of the
product with the ability to re-use them.
Complex Expression
This logic can be created as a Business Rule or as a Complex Expression from within an
application artifact such as a Data Source. The syntax is the same with the only difference being
that a Business Rule can be shared across multiple application artifacts where a Complex
Expression is contained within the artifact.
Member Formula
The same Business Rule syntax can be applied to Member Formulas as well. This logic stays with
the Member and cannot be shared.
There are also three utility groups available when writing Business Rules:
BRAPI
BRAPI provides an application programming interface to commonly used functions across all
areas of the product where a Business Rule can be used.
API
The more specific API is provided as a parameter to a Business Rule and supplies functions specific to
the type of Business Rule being written. For example, when implementing a Business Rule for a
finance-oriented task, API refers to the functions used for processing calculation logic and other
capabilities related to processing a Cube’s data and metadata.
ARGS
Args represents the values passed to a Business Rule when the procedure is called; the calling
code supplies the arguments. For example, if a Parser Business Rule is assigned to
the Account Dimension, args will supply the Account Dimension data as well as a set of functions
available to use against that data. Different args will be provided depending on the type of
Business Rule used.
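To illustrate how these three work together, the following sketch assumes a Finance Business Rule; the function type branch, entity logging, and message text are illustrative only.

Select Case api.FunctionType
    Case Is = FinanceFunctionType.Calculate
        'api exposes functions specific to this rule type, such as the current Point of View.
        Dim entityName As String = api.Pov.Entity.Name
        'BRApi exposes general-purpose functions available to any Business Rule type.
        BRApi.ErrorLog.LogMessage(si, "Calculating entity: " & entityName)
        'args carries the values supplied by the calling engine for this invocation.
End Select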
An authorized user can encrypt a Business Rule using the Encrypt Business Rule button. To allow
this, set the proper access in the Encrypt Business Rules security role so the user can Encrypt and
Decrypt a rule from the Business Rule screen in the Application tab. Then go to the Business Rules
section (Application Tab>>Tools>>Business Rules) and refresh the screen; the Encrypt Business
Rule button appears in the menu.
Select the Business Rule to be encrypted, then click the Encrypt Business Rule button. An Encrypt
Business Rule dialog box displays, prompting the user for a password. Create and enter the
password, re-enter it in the box below, and click OK. The system encrypts the Business Rule,
displays the “Business Rule Is Encrypted” message text in the editor, places the editor in read-only
mode, and the Decrypt Business Rule button now displays in the menu bar.
NOTE: It is important to remember and record the password used for each encrypted
Business Rule, or the Business Rule will not be able to be decrypted for further changes
without the assistance of OneStream Support.
To decrypt a Business Rule, an authorized user clicks the Decrypt Business Rule button. The
Decrypt Business Rule dialog box displays, prompting the user for the password. Enter the
password and click OK. The system decrypts the Business Rule, displays the Business Rule text
editor, and returns the rule to read/write mode.
3. Click Business Rule Search. The Object Lookup dialog box opens.
4. Start typing the business rule name. All business rules that match your text display in the
list.
5. Select the business rule from the list and click OK.
Execute Extender
Use this to run the selected Extender Business Rule
Ctrl+M
Expand/Collapse all regions and methods. Click in the script after selecting the Business Rule in
order to use the hotkey.
l StatusGrayBall
l StatusWhiteBall
l StatusOrangeBall
l StatusBlueBall
l StatusRedBall
l StatusLightGreenBall
l StatusGreenBall
l StatusGrayCheckMark
l StatusGreenCheckMark
l StatusLockedWithCheckMark
l StatusLockedWithFolder
Type
The type of Business Rule (see below for a detailed description of each Business Rule Type)
For more details on this feature, see "About the Financial Model" on page 2.
Referenced Assemblies
Enter a list of referenced assembly names separated by semi-colons. To reference another
Business Rule, use the BR\ prefix, for example:
BR\SharedFunctionsBR
The referenced rule can then be used within the current rule. Example:
Dim sharedFinanceBR As New OneStream.BusinessRule.Finance.SharedFinanceFunctions.MainClass
Dim myResult As String = sharedFinanceBR.Test(si, api, args)
For more details on this feature, see Referencing a Business Rule from a Member Formula or
Business Rule in "About the Financial Model" on page 2.
To reference a DLL file, use the XF\ prefix, for example:
XF\ThirdPartyFunctions.dll
Otherwise, use no prefix and enter the full path and file name of any DLL file stored on the
application server(s) file system.
Is Encrypted
This will be set to True if the Business Rule has been encrypted or False if it has not been
encrypted.
Security
Access
Members of this group will have access to the Business Rule
Maintenance
Members of this group have the authority to maintain the Business Rule
NOTE: Click and begin typing the name of the Security Group in the blank field. As
the first few letters are typed, the Groups are filtered, making it easier to find and select
the desired Group. Once the Group is selected, press CTRL and double-click. This
enters the correct name into the appropriate field.
The image below explains the major regions and elements of the Business Rule editor.
Example APIs
api.FunctionType
The property checked when special logic needs to be run and a certain process needs to be
isolated, typically within a Select Case statement.
Translate
Additional logic that uses custom translation.
FXRate
Custom logic used to determine Foreign Exchange rates for any intersection.
Consolidate Share
Additional logic used during the custom calculation of the Share Member.
Consolidate Elimination
Additional logic used during the custom calculation of the Elimination Member.
The following Business Rule example will make all cells for the Account 6000 read-only. This
should be added to a Business Rule attached to a Cube.
Select Case api.FunctionType
    Case Is = FinanceFunctionType.ConditionalInput
        'Make all cells for Account 6000 read-only
        If api.Pov.Account.Name.XFEqualsIgnoreCase("6000") Then
            Return ConditionalInputResultType.NoInput
        End If
        Return ConditionalInputResultType.Default
End Select
Confirmation Rule
Special logic that runs with Confirmation Rules.
Data Cell
Named GetDataCell calculations that can be reused such as a Better/Worse calculation in Cube
Views.
Parser Rule
Parser Business Rules are used to evaluate and/or modify field values being processed by the
Stage Parser Engine as it reads source data. These Business Rules are written as Shared
Business Rules or Logical Expressions and applied to a Data Source Dimension.
Example API
args.Line
This will return the entire record being processed from the Data Source.
args.Value
This will return the value of the Dimension field that the Business Rule is assigned to in the Data Source.
Common Usage
l Custom parsing logic
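As an illustration of custom parsing logic, the sketch below assumes the rule is assigned to a Data Source Dimension and returns the value the Stage engine should use; the leading-zero handling is illustrative only.

'args.Value is the field value for the Dimension this rule is assigned to.
Dim fieldValue As String = args.Value
'args.Line could be inspected here if other fields on the record are needed.
If fieldValue.StartsWith("00") Then
    'Strip leading zero-padding from the source code (illustrative logic).
    Return fieldValue.TrimStart("0"c)
End If
Return fieldValue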
Connector Rule
Connector Business Rules are used to communicate with, collect data from, and drill back to
external systems. These Business Rules are written as Shared Business Rules and applied to a
Data Source.
See Connectors in "Collecting Data" on page 164 for more information on using Connectors.
Example API
args.ActionType
This will return one of the four available Connector action types.
ConnectorActionTypes.GetFieldList
This will run the SQL query to return the available field list to the Data Source for Dimension
assignment.
ConnectorActionTypes.GetData
This will run the SQL query to retrieve the source data and return it to the Stage based on the Data
Source Dimension assignment.
ConnectorActionTypes.GetDrillBackTypes
This will return a list of the specified drill back types. (e.g., File or Data Grid)
ConnectorActionTypes.GetDrillBack
This will run the required SQL query against the source system and will provide greater detail than
what was originally imported.
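A minimal skeleton of how a Connector rule typically branches on these action types is shown below; the bodies are omitted and the structure is a sketch only, not the full template generated by OneStream.

Select Case args.ActionType
    Case Is = ConnectorActionTypes.GetFieldList
        'Return the list of source fields available for Dimension assignment.
    Case Is = ConnectorActionTypes.GetData
        'Run the source query and return rows to the Stage engine.
    Case Is = ConnectorActionTypes.GetDrillBackTypes
        'Return the drill back types this connector supports (e.g., File or Data Grid).
    Case Is = ConnectorActionTypes.GetDrillBack
        'Run a detailed query against the source system for the selected data.
End Select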
Conditional Rule
Conditional Rules (mapping) are used to conditionally evaluate mapping criteria during the data
transformation process. These Business Rules are written as Shared Business Rules or Logical
Expressions and applied to a Transformation Rule definition.
They are only applicable to Transformation Rules with the type of Composite, Range, List, or
Mask either as a Business Rule or Complex Expression.
Example APIs
args.GetSource
This will return the source value for the specified Dimension.
args.GetTarget
This will return the mapped or transformed value for the specified Dimension.
args.OutputValue
This will return the originally mapped target value from the Transformation Rules.
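For example, a conditional expression might combine source and target values as in the sketch below, which assumes the expression returns True when the mapping should apply and that "A#" and "E#" address the Account and Entity fields (assumptions to verify against the API documentation).

'Apply this mapping only when the source Account starts with "5" and the
'already-mapped Entity target is "Houston" (values are illustrative).
If args.GetSource("A#").StartsWith("5") AndAlso args.GetTarget("E#").XFEqualsIgnoreCase("Houston") Then
    Return True
End If
Return False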
Derivative Rule
Derivative Rules (derive data prior to mapping) are used to evaluate and/or calculate values
during the data derivation process. These Business Rules are written as Shared Business Rules
or Logical Expressions and applied to a Derivative Rule definition.
They are only applicable to Transformation Rules with the type of Derivative either as a Business
Rule or Complex Expression.
Common Usage
l Calculate mathematical expressions
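For example, a derivative expression could calculate a value from a numeric source field, as in the sketch below; the "U1#" token, the 0.21 rate, and the string return value are illustrative assumptions.

'Derive an amount by applying a rate to a numeric source field (illustrative only).
Dim baseValue As Decimal = 0
If Decimal.TryParse(args.GetSource("U1#"), baseValue) Then
    Return (baseValue * 0.21D).ToString
End If
Return "0"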
Cube View Extender
NOTE: These rules do not apply to how a Cube View looks in the Data Explorer Grid
view.
The Extender Rule is used in conjunction with the Execute Cube View Extender Business Rule
setting on the Cube View. See Cube View Extender: Advanced Cube View Formatting in
"Implementing Security" on page 322 for examples on how to use this rule. See the OneStream
API Overview Guide as well as the OneStream API Details and Database Documentation Guide
for more details on this Business Rule.
Common Usage
The following are key uses for Cube View Extender Business Rules in formatting Reports.
Dashboard DataSet
DashboardDataSet Rules are used to create programmatic query results. This rule type
combines multiple types of data into a single result set using the full syntax capability of VB.Net.
These Business Rules are written as Shared Business Rules and applied to Dashboard Data
Adapters or Dashboard Parameters.
Common Usage
l Combine different types of data for a report
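A minimal sketch of such a rule is shown below; it assumes the Data Adapter calls a function named GetRegions (an illustrative name) and that the standard System.Data namespace is available to the rule.

If args.FunctionName.XFEqualsIgnoreCase("GetRegions") Then
    'Build a result set combining any data needed by the report (values are illustrative).
    Dim dt As New DataTable("Regions")
    dt.Columns.Add("Region", GetType(String))
    dt.Columns.Add("Amount", GetType(Decimal))
    dt.Rows.Add("North", 100D)
    dt.Rows.Add("South", 250D)
    Return dt
End If
Return Nothing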
Dashboard Extender
DashboardExtender Rules are used to perform a variety of tasks associated with custom
Dashboards and MarketPlace Solutions. These Business Rules can be thought of as multi-purpose
rules and make up the majority of the code written in a MarketPlace Solution. In addition,
they are written as Shared Business Rules and applied to Application Dashboard Parameter
Components (Buttons, Combo Boxes, etc.).
Common Usage
l Execute task when the user clicks a button
l Automate a Workflow
l Include Page State to store parameters and values about a specific Dashboard page
instance
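A minimal sketch of a button-driven task is shown below; the function name and log message are illustrative, and the exact properties available on args depend on the component, so verify them against the API documentation.

If args.FunctionName.XFEqualsIgnoreCase("RunMyTask") Then
    'Perform the custom task tied to this button (illustrative: write an Error Log entry).
    BRApi.ErrorLog.LogMessage(si, "Dashboard button clicked by " & si.UserName)
End If
Return Nothing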
Dashboard XFBRString
Dashboard XFBRString Rules are used to process conditional Dashboard Parameters. These
rules inspect and alter a Dashboard Parameter value using the full syntax capabilities of VB.Net or
C#. Dashboard String Functions are written as Shared Business Rules and called by using an
XFBR(BusinessRuleName, FunctionName, UserParam=[UserValue]) specification anywhere a
standard Dashboard Parameter is used. After a Dashboard BRString rule is created, create a
Dashboard Component to call the BRString using the following script.
BRString(brRuleName, funcNameWithinBRRule, optionalName1 = var1, optionalName2 =
var2)
The return value from the Business Rule will be used in the Dashboard Component.
NOTE: This Business Rule can be applied to any Dashboard or Cube View property
where a Parameter is used.
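A minimal sketch of an XFBRString function is shown below; the rule name MyXFBRStringRule, the function name GetCurrentYearCaption, and the returned text are all illustrative.

If args.FunctionName.XFEqualsIgnoreCase("GetCurrentYearCaption") Then
    'Return the text that will be substituted wherever the XFBR reference is used.
    Return "2022 Forecast"
End If
Return Nothing

A Dashboard or Cube View property would then reference it as XFBR(MyXFBRStringRule, GetCurrentYearCaption).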
Extensibility Rule
Extensibility Rules have these two types: Extender and Event Handlers. Extender Rules are the
most generalized type of Business Rule in the OneStream platform. Use these to write a simple
utility function or a specific helper function called as part of a Data Management Job. Event
Handlers are exclusively called before or after a certain operation occurs within the system.
Extender
This can be used to automate custom tasks like running external scripts, backups and report
publishing.
Available operations
Transformation Event Handler
This can be run at various points from Import through Load. Available operations.
l StartParseAndTransForm
l InitializeTransFormer
l ParseSourceData
l LoadDataCacheFromDB
l ProcessDerivativeRules
l ProcessTransformationRules
l DeleteData
l DeleteRuleHistory
l WriteTransFormedData
l SummarizeTransFormedData
l CreateRuleHistory
l EndParseAndTransForm
l FinalizeParseAndTransForm
l StartRetransForm
l EndRetransForm
l FinalizeRetransForm
l StartClearData
l EndClearData
l FinalizeClearData
l StartValidateTransForm
l ValidateDimension
l EndValidateTransForm
l FinalizeValidateTransForm
l StartValidateIntersect
l EndValidateIntersect
l FinalizeValidateIntersect
l LoadIntersect
l StartLoadIntersect
l EndLoadIntersect
l FinalizeLoadIntersect
l SubmitJournal
l ApproveJournal
l RejectJournal
l PostJournal
l UnpostJournal
l StartUpdateJournalWorkflow
l EndUpdateJournalWorkflow
l FinalizeUpdateJournalWorkflow
l SaveForm
l CompleteForm
l RevertForm
l StartUpdateFormWorkflow
l EndUpdateFormWorkflow
l FinalizeUpdateFormWorkflow
l StartProcessCube
l Calculate
l Translate
l Consolidate
l EndProcessCube
l FinalizeProcessCube
l PrepareICMatch
l StartICMatch
l PrepareICMatchData
l EndICMatch
l StartConfirm
l EndConfirm
l FinalizeConfirm
l SaveQuestionResponse
l StartSetQuestionairreState
l SaveQuestionairreState
l EndSetQuestionairreState
l StartSetCertifyState
l SaveCertifyState
l EndSetCertifyState
l FinalizeSetCertifyState
l StartSequence
l ExecuteStep
l EndSequence
l UpdateWorkflowStatus
l WorkflowLock
l WorkflowUnlock
Client Updater
The Client Updater is used to retrieve updated software from the OneStream server for the Excel
Add-In client program when the version being used does not match the version found on the
server being connected. Note that in order to perform the update, the user needs to be able to
write to the installation folder. From this page, the user can first save work and then restart using
elevated Windows privileges using the Restart OneStream as Windows Administrator button.
Information regarding the current version, the module being reviewed, the location of the
installation folder, and the version status all appear at the top of this window.
Client Module
This field is used to select which application (OneStream Excel Add-In) should be compared to the
version currently installed on the user’s desktop. Click the selection arrow and select the
appropriate application.
If the versions do not match, click the Update button, then click OK.
NOTE: If any versions of Excel are open, they must be closed before clicking OK to
proceed.
NOTE: A backup folder with files for the outdated version is automatically created and
saved as part of the update process. It can be found in the same location as the newly
updated version folder.
If this functionality has been disabled, the following message appears when trying to update those
applications. “The Client Updater has been disabled by your System Administrator. Please use
OneStream’s full client installation program, or see your System Administrator.”
Data Management
The Data Management module allows you to copy data or clear data for a Cube, Scenario, Entity,
and Time. In order to do this, each Data Management Group must contain a Sequence and a
Step. The Steps are created and specifically defined by Cube, Scenario, Entity, and Time. Once
the Step is defined and saved, it can be assigned to a Sequence. Once these Groups are created,
they are organized into Data Management Profiles.
Search
1. Click the binoculars to open the Select Data Management Object dialog box.
2. Select the Object type from the drop-down: All Items, Sequence, or Step.
Data Management
A Data Management Group allows you to create different groups each containing Sequences and
Steps. A Data Management Group can be assigned to multiple Data Management Profiles.
General
Name
The name of the new data group.
Description
A description of the new data group.
Sequences
A Sequence is an ordered series of one or more Data Management Steps which will execute in
the order in which the Steps are organized. Click to create a new Data Management
Sequence.
General
Name
The name given to the Data Management Sequence.
Description (Optional)
A description of the Data Management Sequence.
Task Activity
The Data Management sequence monitors the server’s CPU and evaluates other queued tasks in
order to make sure that they are processed in order as resources become available. If the server
resource utilization is greater than the limit, the job status will be set to Queued and the task
progress bar will stay on the queued task step until enough CPU resources are available to start
the task. This can be monitored in Task Activity.
Use Queueing
Set this to True to use queueing for this task in order to have better control of the application
server’s CPU utilization. This task will not start running until CPU utilization is below the specified
value, or until all previously queued tasks have been completed. The default is set to True.
NOTE: Do not set a value less than 10 or the task may never start.
When a task gets queued, the task progress dialog will stay on Task Queued. Open Task Activity
to monitor the queued task and the current server CPU utilization.
NOTE: Batch processing queue overrides other Workflow batch queue settings. The
batch processing queue does not apply to batch script, only the Batch screen in the
Workflow.
Value
The value passed into the related Parameter variable.
Steps
There are six built-in Data Management Step types.
Calculate
A Step can be created to use one of the built-in consolidation/calculation options available.
Name
The name of the Data Management Step.
Description (Optional)
The description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Calculation Type
Calculate
This executes a calculation when Parameter Components are changed. It runs calculations at the
Entity level within the Local Member of the Consolidation Dimension without translating or
consolidating.
Translate
This executes Translate when Parameter Components are changed. It runs the Calculate step
above at the Entity level and then translates data within the Translated Member of the
Consolidation Dimension for each applicable relationship.
Consolidate
This executes a Consolidate when Parameter Components are changed. This runs the Calculate
and Translate steps and then completes the calculations required all the way up the Consolidation
Dimension.
Cube
Specify the Cube where the consolidation/calculation will run.
Entity Filter
Specify the Entity, list of Entities or any combination of Entity hierarchies to be included in the
consolidation/calculation.
Parent Filter
If alternate hierarchies are used, a Parent may be specified in order to be included in the
consolidation/calculation.
Consolidated Filter
Specify the Consolidation Member or Members to be included in the consolidation/calculation.
Scenario Filter
Specify the Scenario Member or Members to be included in the consolidation/calculation.
Time Filter
Specify the Time Member or Members to be included in the consolidation/calculation.
Clear Data
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Cube
Select the Cube where the consolidation/calculation will run.
Entity Filter
Select the Entity, list of Entities or any combination of Entity hierarchies to be included in the
consolidation/calculation.
Scenario Filter
Select the Scenario Member or Members to be included in the consolidation/calculation.
Time Filter
Select the Time Member or Members to be included in the consolidation/calculation.
Copy Data
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Source Cube
Select the source Cube for the data copy.
Source Scenario
Select the source Scenario Member or Members to be included in the data copy.
Source View
Select the View Member to be included in the data copy. This selection allows users to copy YTD
data into a Periodic Member and vice versa.
Destination Cube
Select the destination Cube for the data copy.
Destination Scenario
Select the destination Scenario Member or Members to be included in the data copy.
Destination View
Select the destination View Member to be included in the data copy.
NOTE: If the Source and Destination View fields are left blank, the data will copy to the
same View Member. (e.g., YTD will copy to YTD or Periodic will copy to Periodic)
Custom Calculate
The typical use of the Custom Calculate Step is for speed of calculations during data entry and
flexibility. For instance, a user could make on-the-spot changes in a Form, run this Custom
Calculate and quickly experience What-if analysis based on the limited amount of data on the
Form. Instead of running a full Calculate or Consolidation on a Data Unit, the Custom Calculate
Data Management Step can be used to run a calculation on a slice of data within one or many
Data Units. The calculation could be executed from within Data Management, by clicking Save on
a Form in Workflow (through a Forms Event Handler) or related to a button on a Dashboard being
used to enter Budget data, to state a few examples.
This type of calculation does not create audit information for each data cell affected and will
therefore run faster than using the Copy Data Data Management Step type.
The user executing the Step must be a member of the Scenario’s Manage Data Group or the Step
will fail. This helps prevent unauthorized users from launching Steps like this, which could alter or
clear data unintentionally.
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Data Units
These are the Data Unit settings for what will be affected by calculation being applied. A single
Cube and Scenario can be affected by this Step. A single or multiple Entities, Entity Parents,
Consolidation members and Time Periods can be affected through Filter settings for Entity,
Parent, Consolidation and Time in this section using the Member Filter Builder dialog. The related
Business Rule below will run once for each of the number of Data Units specified here. These
Data Unit members (e.g. Entity, Scenario, Time, etc.) can be referenced in the related Business
Rule with functions such as API.Pov.Entity.Name or similar, but otherwise will not need to be
mentioned within the context of the Business Rule. Upon running, this will reset Calculation Status
of the affected Data Units to needing calculation.
Point of View
These are the single member entries of dimensions not in the Data Unit to be affected by the
calculation. Settings are for View, Account, Flow, Origin, IC and UD1-UD8. These values are
provided in the Step as a convenience and to be referenced from within the Business Rule. For
instance, the UD1 member listed here in the POV setting could be referenced from within the
Business Rule as api.Pov.UD1.Name. This can make the Business Rule flexible since the same
rule could be used against multiple Data Management Custom Calculate Steps, but run differently
based on the POV setting.
Business Rule / Function Name
The name of the Finance-Type Business Rule and contained Function to run when this Step runs.
This allows the user to specify settings such as Durable Storage within the rule. A simple example
of a Business Rule which calculates data with Durable Storage is displayed below:
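The sketch below assumes a Finance Business Rule responding to the Custom Calculate function type; the account names are illustrative, and the Boolean argument on api.Data.Calculate is assumed to be the Durable storage flag (confirm the exact overload in the OneStream API Details and Database Documentation).

Select Case api.FunctionType
    Case Is = FinanceFunctionType.CustomCalculate
        'Calculate the data slice defined by the Step and store the result as Durable data
        '(the True argument is assumed to be the Durable storage flag).
        api.Data.Calculate("A#TaxExpense = A#PretaxIncome * 0.25", True)
End Select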
NOTE: When a Calculation or Consolidation runs on this same Data Unit after this Data
Management Step is run, the data saved as calculated by this Step will be cleared
unless it is saved with a Storage Type of Durable. ClearCalculatedData is the first step in
the standard Calculation Sequence that runs during a Calculation or Consolidation on
Cube data. Durable data will be ignored during a Calculate, or even a Force Calculate or
Consolidate, unless a ClearCalculated function is used within the Business Rule or
Member Formula to purposely clear the Durable data. However, if a calculation
recalculates data marked as Durable, it will be replaced by the newly calculated data.
If this calculation within a Custom Calculate Data Management Step is to be replicated within a
separate Member Formula or Business Rule, it is suggested that both refer to the same saved
Finance Business Rule function by name. To refer to this logic, see Defining a Reference to a
Shared Business Rule in "Implementing Security" on page 322.
See Examples of Key Functions in "About the Financial Model" on page 2 for examples of using
API.Data.Calculate and API.Data.ClearCalculated to store and clear Durable data as desired.
Parameters
Refer to any parameters inside the Business Rule by listing them here in name-value pairs in this
fashion:
Name1=Frankfurt, Name2=[Houston Heights]
Custom Parameters can also be used with the |!ParameterName!| syntax, which will result in a
prompt to the user at run time. Example:
Name3=|!myParam!|
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Business Rule
Select from the available application Business Rules to run custom scripts or procedures.
Parameters (Optional)
This field is provided to pass Parameters or variables into the selected Business Rule.
Export Data
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Include Zeroes
If set to True, zero amount records will be included when exporting.
Cube
Specify the Cube where consolidation/calculation will run.
Entity Filter
Specify the Entity, list of Entities or any combination of Entity hierarchies to be included in the
consolidation/calculation.
Parent Filter
If alternate hierarchies are used, a Parent may be specified in order to be included in the
consolidation/calculation.
Consolidated Filter
Specify the Consolidation Member or Members to be included in the consolidation/calculation.
Scenario Filter
Specify the Scenario Member or Members to be included in the consolidation/calculation.
Time Filter
Specify the Time Member or Members to be included in the consolidation/calculation.
Combinations of Data Filters (use #All for all stored base-level data)
Use standard Member Filter functionality to select specific data required for all Dimensions listed
below:
Export File
Use the Export File Data Management Step to export an Extensible Document or Report Book to
OneStream’s File Share.
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
NOTE: An Extensible Document is a Text, Word, PowerPoint, or Excel file that uses
Parameters in its content. The file name must contain .xfDoc before the extension.
Example: StatusReport.xfDoc.docx
Parameters
Enter a comma separated list of name value pairs.
Example: ParameterName1=ValueName1, ParameterName2=[Value Name2]
Export Report
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Step Type
The type of Data Management Step chosen.
Object Type
Indicate whether a Dashboard or Dashboard Profile object is to be exported.
Object Name
The name of the Dashboard or Dashboard Profile being used.
Reset Scenario
Similar to the Clear Data Step, except it clears additional related application data, yet does not
create audit information for each data cell affected and therefore runs faster. It clears data within a
range (including parent Entity data), audit information, Workflow Status and Calculation Status as
if it never existed. This is intended for Administrator use. The user executing must be a member of
the Scenario’s Manage Data Group or the Step will fail. This helps prevent unauthorized users from
launching Steps like this, which could alter or clear data unintentionally. It is recommended to only
change the Manage Data Group from Nobody to a select, exclusive User Group before running and
then change it back to Nobody afterwards. Also, ensure that your application database is backed up
before performing a Reset Scenario. Note that a Reset Scenario will clear even data marked as
Durable.
Name
The name of the Data Management Step.
Description (Optional)
A description of the Data Management Step.
Scenario
Choose one Scenario to be reset.
When running a Sequence in Data Management, go to System Tab > Tools > File Explorer to
find any file export.
Go to File Share > Applications > Choose an application > DataManagement > Export >
Username > Choose the latest folder.
Spreadsheet
The OneStream Windows App Spreadsheet feature enables users to incorporate spreadsheet
functionality experienced by users of the OneStream Excel Add-in without needing to have Excel
loaded on their computer. It can be used for ad hoc querying/reporting, analysis, data entry, and
formatted reports. The Spreadsheet feature enables users to stay within the OneStream Windows
App while being able to use functionality similar to what they know and love in Excel. Similar to
the Excel Add-in, the Spreadsheet feature leverages OneStream’s re-usable Cube Views for fast
and easy analysis. See "Navigating the Excel Add-In" on page 1097 for more details on this
feature.
l Change point-of-view, interact with Forms, assign Cube Views, drill through to source data
and update Workflow status
l Eliminate risk of errors and duplication of efforts with standardized and centralized
spreadsheet controls
l Safely edit, update and analyze data as spreadsheet forms respect Application and
Workflow and security
l Eliminate spreadsheet maintenance when metadata changes because Cube Views are
read dynamically
l Sheet-based calculations remain intact even when rows or columns are added
The OneStream Windows App Spreadsheet feature can perform most of the tasks accomplished
by the OneStream Excel Add-in, with some limitations. These limitations include, but are not
limited to:
l Macros
l Solver
l Document properties
l Preview format does not update for format types Effects (Superscript, Subscript,
Strikethrough)
l Shift+End
l Ctrl+PageUp / Ctrl+PageDn correctly changes the tab but does not reset tab focus in
spreadsheet
l CTRL+N creates a new spreadsheet, however if an existing spreadsheet was opened with
unsaved changes, it closes the existing spreadsheet without saving edits made since last
save
Some functionality is currently not supported in the Spreadsheet feature. Known unsupported
functionality at this time is listed below:
l Undo/Redo
The following Excel Column Charts are incompatible and unavailable with the Spreadsheet
feature:
l Stacked 3-D
l Stacked Cylinder
l Stacked Cone
l Stacked Pyramid
The following Excel Line chart is incompatible and unavailable with the Spreadsheet feature:
l 3-D Line
The following Excel Bar charts are incompatible and unavailable with the Spreadsheet feature:
The OneStream Windows application spreadsheet feature does not require that Excel be loaded
on your computer. It can be used for querying and reporting, analysis, data entry, and formatted
reports. The Spreadsheet feature allows you to remain in the Windows application using Excel-like
functionality.
Chart Types
Microsoft Excel 2016 introduced several chart types that help you visualize financial, statistical,
and hierarchical data. They include:
l Waterfall
l Histogram
l Pareto
l Funnel
l Sunburst
l Treemap
You can add charts to a OneStream spreadsheet worksheet in the same manner as any other
chart type. Chart styles allow you to quickly change chart appearance. Styles change the
background fill, specify the color of the data series, and apply different shape effects and outlines
to the chart.
To apply a predefined chart style to one of these chart types, right-click the chart and select one
of the available styles.
Box and whisker charts are commonly used in statistical analysis. For example, you could use a
box and whisker chart to compare medical trial results or teachers' test scores.
Waterfall
A waterfall chart shows a running total as values are added or subtracted. It is useful for
understanding how an initial value (for example, net income) is affected by a series of positive and
negative values. The columns are color coded so you can quickly tell positive from negative
numbers. The initial and the final value columns often start on the horizontal axis, while the
intermediate values are floating columns.
If your data includes values that are considered Subtotals or Totals, such as Net Income, you can
set those values so they start on the horizontal axis at zero and don't float. To do this:
2. In the dialog box, select the checkbox for value(s) in the dataset to be set as Total.
3. To make the column float again, clear the Set as total checkbox.
Histogram
Data plotted in a histogram chart shows the frequencies within a distribution. Each column of the
chart is called a frequency bin.
Pareto
A pareto chart is a sorted histogram that contains columns sorted in descending order along with
a line representing the cumulative total percentage.
Funnel
Funnel charts show values across multiple stages in a process. Typically, the values decrease
gradually, causing the bars to resemble a funnel. For example, you could use a funnel chart to
show the number of sales prospects at each stage in a sales pipeline.
Sunburst
The sunburst chart displays hierarchical data and can be plotted when empty (blank) cells exist
within the hierarchical structure. Each level of the hierarchy is represented by one ring or circle
with the innermost circle as the top of the hierarchy. A sunburst chart without hierarchical data
(one level of categories), looks similar to a doughnut chart. However, a sunburst chart with
multiple levels of categories shows how the outer rings relate to the inner rings. The sunburst
chart is most effective at showing how one ring is broken into its contributing pieces.
Treemap
The treemap chart provides a hierarchical view of your data and an easy way to compare different
levels of categorization. The treemap chart displays categories by color and proximity and can
show data which might be difficult to display with other chart types. The treemap chart is plotted
when empty (blank) cells exist within the hierarchical structure. Treemap charts are good for
comparing proportions within the hierarchy.
Conclusion
These chart types help you visualize financial, statistical, and hierarchical data inside the
Windows application spreadsheet component. They are ideal for those who prefer working inside
the Windows application rather than Microsoft Excel.
Retrieve Functions
Retrieving and changing data can be done by using functions. To see the functions and their
Parameters, open the Spreadsheet feature and select the Formulas tab. Within the Spreadsheet
feature, select Insert Function and then select User Defined in the Select a category list. See
Retrieve Functions in "Navigating the Excel Add-In" on page 1097 for more details on this feature.
Open
File type to open
Local Folder
Select a file to be opened from a location on the local computer/network.
File System
Select a file to be opened from a location within the File Explorer.
NOTE: Displayed files can also be opened from here, or from OnePlace |Documents by
right clicking and selecting one of the three options – “Open in Spreadsheet Page”;
“Open” (opens file directly in Excel if the application is found on the local computer) or
“Open With…” (user specifies program).
Save As
File type to be saved as.
Local Folder
Select location on local computer/network to save a file to.
File System
Select location within File Explorer to save a file to.
Application Dashboard
Select Application Dashboard location to save a file to.
OneStream Ribbon
When using the Spreadsheet feature, there will be a OneStream menu item and a ribbon. See
OneStream Ribbon "Navigating the Excel Add-In" on page 1097 for an explanation of the items
within Spreadsheet that function the same as those in the Excel Add-In. Items documented below
function differently than they do in the Excel Add-In, or do not exist in the Spreadsheet Tool.
Analysis
The File Explorer option does not exist in the Spreadsheet feature because files are automatically
stored; they do not exist within Excel.
Administration
The Display Context Pane option does not exist in the Spreadsheet feature because it is not
needed. The OneStream task pane automatically appears when using the Spreadsheet feature.
Under Preferences, the General options are not needed in the Spreadsheet for Windows Only
tool. These options manage how Microsoft Sign In is handled, how Macros are handled, and how
data is refreshed in an open workbook. The Spreadsheet feature does not utilize this
functionality.
Task Scheduler
Task Scheduler provides the ability to schedule data management sequences that execute a data
management step within the application. If the sequence doesn’t have a step, the job will fail.
Grid View
Go to Application > Tools > Task Scheduler.
The default view is the calendar view. You can change the view to a grid view which is initially
blank until you’ve scheduled sequences.
The default is to Show Tasks for all Users, but you can click the option to turn it off. Once you’ve
created tasks, they will show in the grid.
Whatever view you are currently on is the view you will come back to the next time you go to the
page.
Next Start Time
The next time the task is scheduled to run.
Expire Date/Time
The time and date the task will expire and no longer run.
Invalidate Date/Time
If enabled by an administrator, the time and date the task will be suspended and not run until
validated.
You can filter on any of the fields in the grid that have the filter icon and you can filter on multiple
selections.
You can group by column name by dragging and dropping the group into the header bar.
Double-click on a task or click Edit to open Edit Task. The only option that you cannot change is
the Name in the Task tab.
If you are not the Administrator, you will not have the rights to change the Enabled by Manager
check box.
l Click to Delete.
Calendar View
Provides a view of the tasks in the calendar.
If there is more than one user showing, the tasks are color coded when Show Tasks for All Users
is selected.
You can view your own tasks, or all users’ tasks if Show Tasks for All Users is checked.
You can go backward and forward within the different views, and you can view by today, work
week, week, month, timeline, and agenda.
When activated, you can choose to view the calendar by time scales or working hours.
When you hover over a task on the calendar, you will see the information specific to the task,
including User Name, Task Name, Sequence Name, Schedule, State, Expire Date/Time and, if
enabled, the Invalidate Date/Time.
You can also create a new task, edit an existing task, or delete a task from the calendar view.
Click New Task or select a task in the calendar and click Edit Task or Delete Task.
You can also double-click on the task to edit or you can right-click to edit or delete the task.
Once the job runs, you can see the status of the job in Task Activity.
The Description is the name of the task separated by a hyphen followed by the sequence.
Logon Activity
1. Go to System > Logon Activity.
2. In the Client Module column, you can see that the log in was through Scheduler.
Security Roles
1. Go to Application > Tools > Security Roles > Application User Interface Roles.
2. You must have the TaskSchedulerPage role to see the Task Scheduler page. You can only
view all user tasks with this role.
The TaskScheduler role allows you to create new tasks, edit your own tasks, and validate tasks if
the setting is on. You can view all user tasks but only edit your own. You will not have access to
load and extract. You cannot change the task name.
The ManageTaskScheduler role allows you to create your own tasks, view every task no matter who
created it, edit your own tasks and other users’ tasks, and delete or disable your own tasks. You
cannot enable or disable a task that is not your own, but you can disable the user’s task in the
Administration section. You can load and extract. You cannot change the task name.
If you are a manager, you can load additional task scheduler files.
Load
1. Go to Application > Tools > Load/Extract.
3. Click Open.
4. Click Load.
Extract
1. Go to Application > Tools > Load/Extract.
2. Click Extract.
1. Click Create Scheduled Task. The New Task dialog box opens.
2. Click Task.
b. The Description.
c. Start Date/Time, which you can enter or click the drop-down to select. This should be
set to the time you want the task to start running. If you don't specify a time, the value
defaults to the current date and time.
d. Select the data management Sequence either by scrolling through the list or entering
the sequence in the Filter field.
e. When you select a sequence, if there are Parameters that have been set up for the
sequence, they will show. You can add additional validated parameter settings, if the
parameters are not valid, the job will fail.
f. If you click OK before you schedule it, the default is to run it one time.
4. Click Schedule.
a. One Time triggers the task to run once based on the time in the Start/Date Time: field
on the Task tab.
b. Minutes triggers the task to run on a recurring basis from 5 – 180 minutes. You can
set tasks to run during predetermined times by typing the start time in the Time From:
field, then typing the stop time in the To: field. By default, the Time from: field is
unchecked.
If you create a task, for example, that is set to run every 30 minutes, starting at
2:30pm and ending at 5:30pm, the first run of the task occurs at 2:30pm and runs
every 30 minutes during that timeframe.
NOTE: Calendar entries are created in the Calendar view even though they
may fall outside of the selected run time frame. This means, for example,
that a task scheduled to run every 30 minutes between the hours of 2:00pm
and 5:00pm, will display all day every 30 minutes.
d. Weekly you can choose how many times it recurs and the days that it runs.
e. Monthly you can choose how many times it recurs and the days that it runs.
6. Click Advanced to set the number of times to retry a task if it fails; the maximum is three.
7. Click OK. The new task is added to the Grid View and the Calendar View.
Text Editor
This is used to create, edit and view text documents like those created in Microsoft Word. This
component only works in the OneStream Windows App version.
The OneStream Windows App Text Editor feature can perform many of the tasks that can be
completed in Microsoft Word, with some limitations. These limitations include, but are not limited
to:
l Mail Merge
l Spelling
l Insert Fields
The following shortcut keys function differently in the Text Editor feature than they do in Microsoft
Word. The Text Editor functionality is listed below:
l CTRL+N – Used to create a new blank document. If an existing document exists and has
not been saved, using this keystroke combination will close the existing document without
saving and create a new document.
l ALT+I – Used to increase the letter/number increments within an outline or numbered list
Common
New
This creates a new text document. If clicked when an open document has unsaved changes, a
window appears asking if the current changes should be saved before the new document is
created.
Open
File type to open
Local Folder
Select a file to be opened from a location on the local computer/network
File System
Select a file to be opened from a location within the File Explorer
NOTE: Displayed files can also be opened from here or from OnePlace | Documents by
right clicking and selecting one of the three options – “Open in Text Editor Page”; “Open”
(opens file in Compatibility mode, directly in Word if the application is found on local
computer,) or “Open With…” (user specifies program).
Save
Save changes to the open file using the current file name. Only available after the file has been
given a name.
Save As
File type to be saved as
Local Folder
Select location on local computer/network to save a file to
File System
Select location within File Explorer to save a file to
Application Dashboard
Select Application Dashboard location to save a file to
Quick Print
Send the document to the default printer without changing any printer/printing properties.
Print
Displays the Print dialog box for setting options to print the displayed document.
Print Preview
Displays the Print Preview dialog box that shows how the printed document looks. The Print Preview
Ribbon associated with this button remembers the setting used last that dictates if it is hidden or
displayed. The next time Print Preview is opened, the Ribbon is initialized the same way it was
left.
Clipboard
Paste
Inserts copied data into the document
Cut
Removes and transfers selected data from the document to the clipboard for placement in a
different location.
Copy
Copies selected data to the clipboard for inclusion in a different location.
Paste Special
Displays the Paste Special dialog box for additional pasting options.
Font
Provides options for changing the text found in the document. Examples include bold, italic,
underline and text color.
Paragraph
Provides options for formatting text in the document. Examples include bullets, numbering,
indenting and paragraph alignment
Editing
Find
Enter text to be located within the document.
Replace
Enter information to replace located text within the document.
Pages
Page
Insert a page break at the current location within the document.
Tables
Table
Inserts a table at the current location within the document.
Table Styles
This toolbar within the Design tab only displays when working within a table.
Shading
Used to add color to the background of the selected cells.
Borders
Used to customize the borders of the selected cells. Used in combination with the Pen Color
button.
Draw Borders
This toolbar within the Design tab only displays when working within a table.
Pen Color
Used in combination with the Borders button. This button changes the color of the border lines
selected when using the Borders button. Click this button first to select the border color, then use
the Borders button to define where border lines should be displayed.
Table
This toolbar within the Layout tab only displays when working within a table.
Select
Used to select the current cell, column, row or entire table.
View Gridlines
Used to show or hide the gridlines within a table. When turned on, the gridlines only appear where
the display of cell borders has been turned off.
Properties
Used to display the Table Properties dialog box. Advanced formatting options such as margins,
alignment, text wrapping, borders and shading can all be managed from here.
This toolbar within the Layout tab only displays when working within a table.
Delete
Used to delete cells, rows, columns or the entire table.
Insert Above
Used to add a new row directly above the selected cell.
Insert Below
Used to add a new row directly below the selected cell.
Insert Left
Used to add a new column directly to the left of the selected cell.
Insert Right
Used to add a new column directly to the right of the selected cell.
Insert Cells
Used to insert a single cell into the table. The Insert Cell dialog box appears, with options to shift
cells right, shift cells down, insert an entire row or insert an entire column.
Merge
This toolbar within the Layout tab only displays when working within a table.
Merge Cells
Used when two or more cells are selected to join/merge them into one cell.
Split Cells
Used to split the selected cell(s) into smaller cells. The Split Cells dialog box displays and the
number of new columns and rows needed can be entered.
Split Table
Used to split the table selected into two tables. The row that the selected cell(s) belongs to will
become the first row of the new table that is created.
Cell Size
This toolbar within the Layout tab only displays when working within a table.
AutoFit
Used to automatically resize the width of the column based on the text within. Options include
autofitting each cell based on the content within it, autofitting the table to take up the width of the
window and setting the columns to a fixed width.
Alignment
This toolbar within the Layout tab only displays when working within a table.
Various options are available to adjust the alignment of text with the table cells. Examples include
align top left, center left, bottom left, top center, center, bottom center, top right, center right,
bottom right. Cell margins and spacing can be set using the Cell Margins button.
Illustrations
Picture
Inserts a picture file at the current location within the document. Standard picture file options are
available to select from.
Arrange
This toolbar within the Format tab only displays when an inserted picture has been selected.
Wrap Text
Changes the way text wraps around the selected object. Six different wrap options are available
including Square, Tight, Through, Top and Bottom, Behind Text and In Front of Text.
Position
Positions the selected object on the page. Text is automatically set to wrap around the object.
Bring to Front
Brings the selected object forward so that it is hidden by fewer objects in front of it. Three options
are available including Bring Forward, Bring to Front and Bring in Front of Text.
Send to Back
Sends the selected object backward so that it is hidden by the objects in front of it. Three options
are available including Send Backward, Send to Back and Send Behind Text.
Links
Bookmark
Creates a bookmark for selected text and assigns a name to that specific area of the document.
Hyperlinks can be made to move directly to that location.
Hyperlink
Creates a link to a webpage, a file, an application, an email address or a place in the same
document.
Header
Insert a Header into the document or go to the Header section if a Header already exists within the
document.
Footer
Insert a Footer into the document or go to the Footer section if a Footer already exists within the
document.
Page Number
Inserts the current page number, wherever the cursor is located within the Header/Footer areas of
the document.
Page Count
Inserts the total number of pages in the document, wherever the cursor is located within the
Header/Footer areas of the document.
Navigation
This toolbar within the Design tab only displays when working within the Header/Footer areas of
the document.
Go to Header/Go to Footer
Activates the Header/Footer section on the page so it can be edited.
Show Previous
If the document has been broken into sections, this navigates to the previous section’s
Header/Footer.
Show Next
If the document has been broken into sections, this navigates to the next section’s Header/Footer.
Link to Previous
Creates a link to the previous section so the Header/Footer in this section contains the same
content as the previous section.
Options
This toolbar within the Design tab only displays when working within the Header/Footer areas of
the document.
Close
This toolbar within the Design tab is only displayed when working within the Header/Footer areas
of the document.
Text
Text Box
Inserts a text box into the document.
Symbols
Symbol
Inserts standard characters and those not found on the keyboard into the document.
Page Setup
Margins
Used to set pre-defined or custom margin sizes for the entire document, or the current section.
Orientation
Used to change the pages between landscape and portrait layouts.
Size
Used to set the paper size for the current section.
Columns
Used to split the text into two or more columns.
Breaks
Used to insert page, column or section breaks (Next, Continuous, Even or Odd).
Line Numbers
Used to insert line numbers in the margins on each line of the document. Options include
restarting numbering on each page, each section, suppressing numbers for current paragraph
and custom settings.
Page Setup
Opens the Page Setup dialog box where margins, orientation, paper size, headers/footers and
header/footer placement can all be set/edited.
Page Background
Page Color
Used to choose a background color for all the pages in the document.
Table of Contents
Table of Contents
Used to create a Table of Contents for the document.
Add Text
Used to add the current paragraph into the Table of Contents
Update Table
Used to update the Table of Contents so that all entries reference the correct page number.
Captions
Insert Caption
Used to add a caption to a picture or another image. It is used to describe the object associated
with it and appears below the object. Captions can be created for the following items:
Figures Caption
Used to add a caption to a figure or picture within the document.
Tables Caption
Used to add a caption to a table within the document.
Equations Caption
This option is not used or supported because equations cannot be created in the OneStream
Windows App Text Editor.
Table of Figures
Includes a list of all the Figures in the document that have captions associated with them.
Table of Tables
Includes a list of all the Tables in the document that have captions associated with them.
Table of Equations
This option is not used or supported because equations cannot be created in the OneStream
Windows App version of Text Editor.
Update Table
Used to update the Table of Figures to include all entries within the document.
Protect
Protect Document
Used to add a password to the document, so that when protected, only users who know the
password can edit the document or add comments. There are options for both of these, and only
one can be implemented at a time.
Unprotect Document
Used to unprotect a document that has been locked for editing. Click the button and provide the
previously created password to unprotect the document.
Comment
New Comment
Used to add comments to the document. Automatically numbers the comments as they are
added to the document. The comment located at the top of the document is numbered 1 and
additional comments are numbered sequentially down the document based on their position. If a
comment is added or deleted, remaining comments in the document are automatically
renumbered based on their location within the document.
Delete
Used to delete comments in the document. Options include the following:
Delete Comment
Used to delete the selected comment only.
Tracking
View Comments
Used to show Comments that have been added to the document. They will be displayed in a
column on the right side of the document.
Reviewing Pane
Used to show/hide Comments within the document in a separate window. The window opens on
the left side of the document and can be used to select comments.
Document Views
Draft View
Used to view the document as a draft for quicker editing. Certain features, such as
Headers/Footers are not visible when using this view.
Print Layout
Used to show how the document will look on the printed page.
Simple View
Used to show the document as a simple memo. Page Layout features are ignored in this view.
Show
Horizontal Ruler
Used to display the Horizontal Ruler, which is used to measure and line up objects in the
document.
Vertical Ruler
Used to display the Vertical Ruler, which is used to measure and line up objects in the document.
Zoom
Zoom Out
Used to change the view to see more of the page at a reduced size.
Zoom In
Used to change the view to get a close-up view of the document.
Spell Checking
A text spell check feature is available only when using the Windows Application. This feature is
inactive by default. To enable the Spell Check feature, users must have access to Application /
Tools / Text Editor. In the Text Editor Tool, the Review ribbon allows the user to activate Spell
Check using the Spell Check button.
The Spell Check feature is enabled for English Culture only. The culture is determined by each
user’s culture assigned in OneStream User Security. The culture is assigned to the OneStream
application on the Application Server Configuration Utility as “en-US”. Users with cultures other
than English (United States) will not have Spell Check available.
To access the Spell Check options, double-click to select the error, then right-click to expose the
Spell Check menu choices. The “Ignore” option is only retained for the current session; closing
and re-opening edit mode will re-check any previously ignored items.
The Options button allows the user to modify the Spell Check behavior within the current task
session. The settings are not persisted as a user preference.
Extract
Choose an item from the drop-down list, click the Extract icon to start the extract process, then
name the output file.
Load
After browsing to the file, click the Load icon to initiate the process.
Application Properties
This covers all the Application Properties found under Application | Tools | Application Properties.
After choosing this option, the screen will display Items To Extract and by default All is chosen.
Workflow Channels
This covers all the Workflow Channels found under Application | Workflow | Workflow Channels.
The screen will display Items To Extract and by default All is chosen.
Metadata
This has multiple sections such as Business Rules, Time Dimension Profiles, Dimensions, and
Cubes. The metadata can be found under Application | Cube | Dimension Library. OneStream
can search for Dimension Member and Relationship changes by username and time stamp. For
more details on the Find Modified Items feature, see Extracting and Loading Dimensions in
"Implementing Security" on page 322.
Cube Views
This has multiple sections such as Groups and Profiles which go together. Cube Views can be
found under Application | Presentation | Cube Views. The screen will display Items To Extract
and by default All is chosen. Choose specific items in each section, or turn the sections on or off.
Data Management
This has multiple sections such as Groups and Profiles which go together. Data Management can
be found under Application | Tools | Data Management. The screen will display Items To Extract
and by default All is chosen. Choose specific items in each section, or turn the sections on or off.
Application Dashboards
This has multiple sections such as Maintenance Units and Profiles. Under the Groups,
Dashboard Components, Dashboard Adapters, and Dashboard Parameters go together.
Dashboards can be found under Application | Presentation | Dashboards. The screen will display
Items To Extract and by default All is chosen. Choose specific items in each section, or turn the
sections on or off.
Confirmation Rules
This has multiple sections such as Groups and Profiles which go together. Confirmation Rules
can be found under Application | Workflow | Confirmation Rules. The screen will display Items To
Extract and by default All is chosen. Choose specific items in each section, or turn the sections on
or off.
Certification Questions
This has multiple sections such as Groups and Profiles which go together. Certification Questions
can be found under Application | Workflow | Certification Questions. The screen will display Items
To Extract and by default All is chosen. Choose specific items in each section, or turn the sections
on or off.
Transformation Rules
This has multiple sections such as Business Rules, Groups, and Profiles which go together.
Transformation Rules can be found under Application | Data Collection | Transformation Rules.
The screen will display Items To Extract and by default All is chosen. Choose specific items in
each section, or turn the sections on or off.
Data Sources
This has multiple sections such as Business Rules and Data Sources which go together. Data
Sources can be found under Application | Data Collection | Data Sources. The screen will display
Items To Extract and by default All is chosen. Choose specific items in each section, or turn the
sections on or off.
Form Templates
This has multiple sections such as Groups and Profiles which go together. Form Templates can
be found under Application | Data Collection | Form Templates. The screen will display Items To
Extract and by default All is chosen. Choose specific items in each section, or turn the sections on
or off.
Journal Templates
This has multiple sections such as Groups and Profiles which go together. Journal Templates can
be found under Application | Data Collection | Journal Templates. The screen will display Items
To Extract and by default All is chosen. Choose specific items in each section, or turn the sections
on or off.
Workflow Profiles
This has multiple sections such as Workflow Profiles and Workflow Profile Templates. Workflow
Profiles can be found under Application | Workflow | Workflow Profiles. The screen will display
Items To Extract and by default All is chosen. Choose specific items in each section, or turn the
sections on or off. When loading Workflow Profiles via an XML file extracted previously or from a
different Application, the Load process clears old property settings if they are not specified in the
loaded XML file. This approach ensures that any property edits made to the Workflow Profile
upon extracting it are honored when the same XML file is re-loaded.
Extensibility Rules
This only exports the Extensibility Rules in the Business Rules section; all others are exported
with the object to which they are tied. (e.g., if there is an Account description rule under the Parser
section, it will be exported with the Data Source.) Extensibility Rules can be found under
Application | Tools | Business Rules | Extensibility Rules. The screen will display Items To Extract
and by default All is chosen. Choose specific items in each section, or turn the sections on or off.
FX Rates
Rates have their own Cube. This has multiple sections such as All FX Rate Types and All Time
Periods. FX Rates can be found under Application | Cube | FX Rates. The screen will display
Items To Extract and by default All is chosen. Choose specific items in each section, or turn the
sections on or off.
FX Rate CSV
This is a CSV extract of the application FX Rates which contains FxRateType, Time,
SourceCurrency, DestCurrency, Amount, HasData. FX Rates can be found under Application |
Cube | FX Rates.
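For illustration only, an extract might look similar to the following; the rate types, periods, currencies, and amounts shown are hypothetical, and the column order follows the list above.
FxRateType,Time,SourceCurrency,DestCurrency,Amount,HasData
AverageRate,2022M1,EUR,USD,1.1320,True
ClosingRate,2022M1,EUR,USD,1.1280,True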
l Native, with your password stored in OneStream. In this case you are configured only with
native authentication.
l External, configured to authenticate through one of these external identity providers (IdPs):
o Azure AD
o Okta
o PingFederate
o SAML 2.0
o LDAP
o MSAD
For information about configuring these providers, see the Installation Guide.
For information about how group-based assignment to system security roles determines a user's
access to artifacts and capabilities, see "Manage System Security" on page 927.
l Define them manually in System Security. See "Creating and Managing Groups" on
page 907 and "Creating and Managing Users " below.
l Load them in a bulk import using an XML load file that contains user and group properties
and parameters. We suggest building this file using sample security Excel template
provided with the Sample Templates MarketPlace solution. See "Loading Users" on
page 906 and "Loading Groups" on page 914.
l Use APIs. See "Managing Users and Groups Using BRApi Functions" on page 936.
In this topic:
Requirements
To let non-Administrators create and manage users and groups, grant them the required security roles. This involves assigning the group to which the non-Administrator belongs to the required roles.
2. Click Edit by these roles to assign the group in which the user is a member:
To revoke the ability to manage users, groups and roles, remove assignees from the relevant role.
See "Manage System Security" on page 927.
l The Administrator cannot be disabled and is unaffected by inactivity thresholds that disable
users who try to log in after a specific period of time elapses.
Creating Users
1. Click System > Security > Users.
4. From User Type, select the license type purchased that governs the user's access to
artifacts and associated OneStream offerings:
l View: They can access data, reports, and dashboards in a production environment
and associated database, but cannot load, calculate, consolidate, certify, or change
data.
l Restricted: They cannot use some MarketPlace Solutions features such as Lease,
Account Reconciliation and more due to contractual limitations.
l Third Party Access: They can access applications with a third-party application by
logging in using a named account. They cannot change data, modify artifacts or
access the Windows application or a browser-based application.
l Financial Close: They can use Account Reconciliation and Transaction Matching
MarketPlace solutions.
5. Set Is Enabled to True to activate the user. Select False to deactivate the user.
6. The information in Status will reflect the user's activity, such as their latest login. Inactivity
Threshold displays the number of days a user can remain active in the system without
logging in. The user receives an error if they try to log in after the specified number of days
elapses. See "Defining an Inactivity Threshold" on page 904.
7. Read "About User Authentication" below, then "Specify Authentication Settings" below.
l To use native authentication: Select Not Used and enter the user's password in
Internal Provider Password. The first time the user logs in, they can change their
password.
b. In External Provider User Name, enter the user name in the IdP. For
example, if a user's name in Azure AD is [email protected], enter
[email protected]. This name must be unique and match the user
name in the IdP.
2. In Culture, select the user's locale. Supported locales and languages are specified during
OneStream server configuration. See "International Settings" on page 938.
3. In Grid Rows Per Page specify how many rows to display on grids before a page break.
Consider the rate of connectivity and screen resolution.
4. Use Custom Text to personalize aspects of functionality given the user's responsibilities.
For example, you could define a text field to:
l Filter a distribution list or to provide text and images for the user's default workflow
profile.
5. In Group Membership, click Add Groups to include the user in the groups that provide
access to the features and tools that the user needs.
If the appropriate group does not exist, define it. See "Creating Groups" on page 910.
Managing Users
Perform the following tasks to edit, copy, delete, and otherwise manage users. You can also manage users with API functions.
1. Ensure that you have the necessary security role. See "Requirements" on page 899.
l Copy Selected Item: Create a new user based on the selected user's settings.
See"Copying Users" below.
l Show all parent groups for user: See the groups in which the user is a member.
Since group assignment to system security roles and user interface roles determines
what a user can do, use this option to identify their access to artifacts.
4. Click Save.
Copying Users
1. Click System > Security > Users.
4. Select Copy References made by parent groups to add the user to the original user's
groups except exclusion groups.
5. Click OK and modify other settings as needed. See "Creating Users" on page 900.
6. Click Save.
Disabling Users
Administrators can disable users, preventing them from logging in by:
l Modifying a user's account to set Is Enabled to False. See "Disabling a User Manually"
below.
l Specifying an inactivity threshold to deactivate all users - native and external - that try to log
in after a particular amount of time elapses. This threshold applies to users in the
Administrator group, but not the Administrator. Users that try to log in after the threshold
expires receive an error. See "Defining an Inactivity Threshold" below.
4. Click Save.
2. Select File > Open Application Server Configuration File and browse to:
OneStreamShare\Config\XSFAppServerConfig.xml.
4. In Logon Inactivity Threshold, enter the number of days after which inactive users who try to log in receive an error.
Enabling Users
1. Select System > Security.
4. Click Save.
3. In Group Membership, select the group and click Remove Selected Item .
4. Click Save.
For example, if a dashboard's design includes a button that uses a BRApi function to support
adding users, users that click the button are validated to ensure they are in the Administrator
group or associated with the ManageSystemSecurityUsers role.
l Extract users in the sample GolfStream application, modify the xml export file to specify
user and group properties, and then load the file. See "Loading Users" below.
Administrators must load users and groups into a new application or one without existing users
and groups. If user accounts exist, loaded users are validated by comparing their current security
settings to those in the xml load file.
Requirements
By default, only Administrators can load and extract users. To let other users load and extract
users, grant them these system security roles:
l ManageSystemSecurityUsers
l ManageSystemSecurityGroups
l ManageSystemSecurityRoles
Loading Users
1. Click System > Tools > Load/Extract.
3. Click Load.
4. When processing finishes, review the user list and user settings to ensure the loaded users
are correctly defined.
Extracting Users
You can extract users, groups, and security roles to an xml export file.
3. In Items to Extract, select Users. Select Export Unique IDs to also extract each user's
ID.
4. Perform a task:
l Click Extract and Edit to view and modify the data in an XML Editor.
In this topic:
Requirements
To let non-Administrators create and manage users and groups, grant them group-based access
to the required security roles.
2. Click Edit by these roles to assign the group in which the user is a member:
To revoke the ability to manage users, groups and roles, remove assignees from the relevant role.
See "Manage System Security" on page 927.
You can define nested, hierarchical groups to best suit your organizational entities, workflows and
reporting structures. Nested groups contain lower-level child groups. In this case, child groups -
and the users that they contain - inherit the access defined for the parent group.
Because groups providing access to data and artifacts may contain many groups, removing just a few users from the group hierarchy - to reflect a corporate reorganization, for example - can be time consuming. To handle the reorganization, create an exclusion group with the users involved and apply it to the roles they no longer need.
l Child groups, nested in higher-level parent groups, can access the tools and artifacts that the parent group can, given the parent group's assignment to system security roles (roles). This access is inherited in group hierarchies, from a parent level downward to child groups.
l Removing child groups from parent groups revokes access to the tools and artifacts that the
parent group provides, based on its assignments to roles.
l In an exclusion group, access to artifacts is determined based on the exclusion order you specify, regardless of a user's membership in a group. To ensure that users who are in several groups cannot access artifacts but everyone else can, put:
o The groups to which the users belong at the top, set to "Allow Access".
o The individual users below the groups, set to "Deny Access".
Creating Groups
1. If you are not an Administrator, be sure you meet the "Requirements" on page 908.
5. In Group Membership specify the users or child groups to add to the group you are
defining, or select a parent group to which to add your new group. Perform any task:
l To include users and groups in the new group, click Add User | Add Child Group and select the user or group. Members can access the artifacts and tools for each system security role to which the group is assigned.
l To revoke group membership, select users and groups and click Remove. This revokes their access to the tools and artifacts that group membership provides based on role assignments.
l To nest the group in a parent group, click Add Groups in Parent Groups That Contain This Group and select a higher-level group. Members in this parent group inherit its access and permissions based on role assignments.
6. Click Save.
3. Enter an intuitive name and description that indicates who is being restricted by this group.
For example, to omit a department, consider Everyone-But-<Department name>.
4. In Group Membership click Add Child Groups or Add Users to specify who to include in
the group.
5. To prevent members from accessing artifacts to which the group has role-based access, set particular users or groups to Deny Access. Otherwise, set members to Allow Access.
6. Use the arrows to order the exclusions carefully, because access is granted based on the order you specify, regardless of a user's membership in a group.
For example, if Amelia and Bob are in the Frankfurt Controller group, the order below does
not restrict them from artifacts, even though they are listed first, because they are in the
Frankfurt Controller group.
To ensure Amelia and Bob cannot access artifacts, put Frankfurt Controller first, set to
Allow Access. Put Amelia and Bob below the group, set to Deny Access.
7. Click Save.
Managing Groups
1. If you are not an Administrator, meet the "Requirements" on page 908.
l Copy Selected Item: Create a group based on the existing group's settings. Select
Copy child user and group references to add the same users or groups to the new
group.
l Show All Groups: View the lower level groups that are nested in the group.
5. In Group Membership, change who is in the current group and modify the parent groups
associated with the current group. Perform any task:
l To withdraw users and groups, select them and click Remove. This revokes their
access to artifacts and tools based on the system security roles (roles) to which the
group is assigned.
l To add users and groups, click Add User | Add Child Group. This grants them access to the artifacts and tools the group provides based on its role assignments.
l To withdraw the current group from a parent group, select the parent and click Remove. This revokes access to the artifacts and tools provided by the parent group based on its role assignments.
l To include the current group in another parent group, click Add Groups and select
the group.
l To include a user or group, click Add User or Add Group and select the user or
group to add.
l To withdraw the current group from a parent group, select the parent and click Remove. This revokes access to the artifacts and tools provided by the parent group based on its role assignments.
l To include the current group in a parent group, click Add Groups and select the
group.
l Extract the groups in the sample GolfStream application, modify the xml export file to
specify your group properties, and then load the file. See "Loading Groups" on the next
page.
Administrators must perform the load for a new or empty application that does not contain users or
groups. To enable other users to load and extract groups, assign them the necessary security
roles. See "Requirements" on page 906.
Requirements
To enable other users to load and extract groups, Administrators must assign the group in which
these users are members, access to these roles:
l ManageSystemSecurityGroups
l ManageSystemSecurityUsers
l ManageSystemSecurityRoles
Loading Groups
1. If you are not an Administrator, see "Requirements" on the previous page.
4. Click Load.
5. When processing finishes, review the Groups list and individual group settings to ensure
the loaded groups are correctly defined.
Extracting Groups
1. If you are not an Administrator, see "Requirements" on the previous page.
5. Perform a task:
l Click Extract and Edit to modify the extracted data in an XML Editor.
Manage System Dashboards - Use to manage all System Dashboards regardless of access to
certain Dashboards. This role links the SystemPane Role and the System User Interface Roles
section, meaning this security role must include this group in order to be active.
Manage System Database Files - There are two file systems: one in the Framework database (the System database) and one in each Application database. Users with the ManageSystemDatabaseFiles and ManageApplicationDatabaseFiles roles have read and write access to all folders and files in these file systems, respectively. Other users can be given read and/or write access to specific folders in the database file systems using a folder’s security
settings. This option ties the FileExplorerPage role from the System User Interface Roles section,
meaning this security role must include this group in order to be active.
Manage File Share - The File Share is a Windows folder that Application Servers can read and write. It is configured in the XFAppServerConfig.xml file using the FileShareRootFolder setting. The File Share is a server-side storage area where external systems or IT can stage and upload files. Users with the ManageFileShare role can edit these folders and files using File Explorer. This option ties the FileExplorerPage role from the System User Interface Roles section, meaning this security role must include this group in order to be active.
There are two system roles that manage six categories of configurations. The application maintains an audit trail so changes to the configurations are accessible. Upon install, the application server configurations are enabled by default. To mitigate misuse of configuration settings, Customer Support can disable full features, sections, and property changes via XML/App config.
System Roles
Two roles exist to grant Administrators and advanced IT users the ability to change server
configurations:
l Default to Nobody
l System User Interface Role should be “view only” to all users assigned
System Configurations
There are six categories of system configurations:
l General
l Environment
l Memory
l Multithreading
l Recycling
An IIS restart is not required to capture changes as they are automatically applied every two
minutes. Saved changes are tracked via the Audit tab, which is available for each server
configuration.
When enabled, permitted users can adjust system configurations in app by navigating to System > Administration > System Configuration.
General Menu
Use Detailed Error Logging: When true, stack trace information is shown. When false, stack
trace information is not shown.
User Inactivity Timeout (minutes): Set the number of minutes of inactivity before a user is timed out.
Task Inactivity Timeout (minutes): Set the number of minutes of inactivity before a task is timed out.
Logon Inactivity Threshold (days): Set the Logon Inactivity Threshold (days) to the number of
days of inactivity before the user can no longer access the system. Set to -1 to disable the setting.
Task Scheduler Validation Frequency (days): Set the number of days in which the Task
Scheduler validation runs.
Culture Codes: Set the appropriate code for display settings using the standard Microsoft locale designation for each language (e.g., en-US).
White List File Extensions: When blank, any file type can be saved to a root folder and then uploaded. To restrict the types of files supported in File Explorer, click the box and then the ellipsis to add the allowed file types.
Num Seconds Before Logging Slow Formulas: Set the number of seconds before slow
formulas are logged. This will enable logging of formulas and impact consolidation performance.
Disable when no longer required.
Number Seconds Before Logging Get Data Cells: Set the number of seconds before Get Data Cells are logged. The default is 180 and should only be increased for debug purposes.
Log Cube Views: When set to True, a log is created in Task Activity when a Cube View is
opened, a report is run or an export to Excel is completed in the data explorer. The intention of this
feature is to analyze data analysis performance.
Log Quick Views: When set to True, a log is created in Task Activity when a new Quick View is
created or when rows/columns are shifted/moved around. The intention of this feature is to
analyze data analysis performance.
Log Get Data Cells Threshold: This logs the calls to GetDataCells and
GetDataCellsUsingScript. It includes context information such as the Excel file name or the Cube
View name. It only creates logs if the number of Data Cells being requested is equal to or greater
than the value provided in this field.
Max Number of Expanded Cube View Rows: Set the max number of rows displayed when
using an expanded Cube View.
Max Number of Unsuppressed Rows Per Cube View Page: The default value of 2000 is determined by the settings in the OneStream Server Configuration Utility. The maximum value is 100,000 rows. If the Cube View performs well but you want more rows to display on the first page (for example, 2,500 rows so that a particular item in the tree appears on the first page), increase this value.
Max Number Seconds To Process Cube View: This setting impacts paging behavior. The default value of 20 seconds is determined by the OneStream Server Configuration Utility. The maximum value is 600 seconds.
Environment Menu
The configurations allow the Administrator to tailor the environment login process for the user by providing a custom label or by requiring acceptance of a logon agreement upon each login.
Environment Name: Enter the name to be displayed (in white) for the environment. You can
enter up to 150 characters.
Environment Color: Specify a provided environment color or enter a hex value to display the
name on a colored background.
Logon Agreement Type: To display a specific message after a user logs on, select Custom and enter the message text.
Recycling Menu
Auto Recycle Num Running Hours: Default is 24, which means once a day, the server will
recycle. Automatic Recycling allows Application Servers a chance to recycle, which is a
recommended practice. These first four settings control this behavior.
Auto Recycle Start Hour UTC: Default is 5, which means 05:00 UTC time. This is the earliest
time in a day when a server can automatically recycle. It is best to set this and the End Hour to be
a range of time with the lowest amount of Application Server activity.
Auto Recycle End Hour UTC: Default is 7, which means 07:00 UTC time. This is the latest time
in a day when a server can automatically recycle.
Auto Recycle Max Pause Time (minutes): Default is 30. This means that when it is time to
recycle a server automatically, it will first pause from accepting more server tasks, but allow for
existing assigned tasks to complete processing for 30 minutes before recycling. If there are no
active tasks for this server, it will recycle when the time comes.
An IIS restart is not required to capture changes as they are automatically applied every two
minutes. Saved changes are tracked via the Audit tab, which is available for each server
configuration.
Internal Menu
Configure Database Server Connection, Security, and Timeouts settings from this menu.
External Menu
Configure Database Server Connection, Security, Database, and Timeouts settings from this
menu.
Custom
Configure Database Server Connection, Security, Database, and Timeouts settings from this
menu.
For External connections only, the Connection String is visible without displaying the password, and the customer can edit it.
l The Connection String can be modified without changing the password; when saved with “?”, the original password is not overridden.
l The Connection String password can be modified and, upon save, will display as “?” in the UI.
o Connection Strings using the following keywords will be masked in the UI: pwd; pass; password.
o All others will display the password and need to be addressed separately; the majority of connection string passwords are handled by the masking above (see the illustration that follows).
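For illustration only (the server, database, and user names shown are hypothetical), a connection string saved with a masked password displays in the UI similar to:
Server=MySqlServer01;Database=ExternalStagingDb;User ID=OneStreamSvc;Password=?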
Pre-existing External Connections, when moving over to Customer Connections, should use the
previous name to avoid breaking any connections with Task Scheduler, Business Rules, etc.
Audits
Configuration setting changes are tracked via the Audit tab. View changes by navigating to
System > System Configuration > Audit.
System Database Tables audits are viewed by navigating to System > Tools > Database >
SystemConfig.
The Environment will show what settings the server is currently using. The server updates every 2 minutes. Here the user can see what settings are being picked up. The “Last Updated” column shows the date and time settings were changed. Regardless of the type of connection (Internal, External, Custom), any available configurations that can be changed will appear here.
Environment audits are viewed by navigating to System > Tools > Environment > Application Server Sets.
By default, Administrators have access to all security roles. Assigning other groups to roles does not remove an Administrator's complete system access. Users with Manage System Security roles can access the System User Interface Roles of SystemAdministrationLogon and SystemPane.
Roles are not exclusionary or limiting. If granted, users can get additional functionality through
their membership in groups having corresponding role assignments. See "System Security Roles"
on page 915.
For example:
l A member of the IT team who is not an administrator might need to manage the system
security roles.
l Create users.
l Modify users.
l Delete users.
l Disable users.
l General
l Authentication
l Preferences
l Custom Text
Limitations
Users with the Manage System Security role cannot create, modify, or delete administrators,
directly or indirectly. Also, they cannot:
l Delete themselves.
l Assign the Everyone or Nobody groups that require Administrator level privileges.
The controls limiting Manage System Security users' capabilities are enforced during the Load and Extract process. Validation occurs by comparing the current state of security in the target environment to the changed state determined by the processing of the source file. Therefore, the validation of XML loads for Manage System Security users requires that security is pre-existing to determine the changed state.
For example, although Manage System Security users cannot create Administrators, if the current
Administrator Groups existed in the target environment prior to the XML load, then the XML will
pass the validation and will be loaded. However, when an empty or new environment exists with
no pre-existing users and groups, then an Administrator would need to perform the load.
BRApi
You can manage user and group system security using BRApi functions such as CopyUser,
DeleteUser and CopyGroup. These are controlled by the assigned Manage System Security role.
See "System Security Users and Groups" on page 935 for more information.
For example, if a dashboard is created to insert new users, and a dashboard button executed a
BRApi to insert a user, the system validates that the user clicking the button is in the
Administrators Group or has role permission to ManageSystemSecurityUsers.
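A minimal sketch of such a dashboard button rule is shown below, following the general OneStream Dashboard Extender pattern. The namespace, the args type, and the SaveUser argument list are assumptions for illustration and should be verified against the BRApi documentation; the role validation described above is performed by the system when SaveUser is called.
Namespace OneStream.BusinessRule.DashboardExtender.AddUserButton
	Public Class MainClass
		Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As DashboardExtenderArgs) As Object
			Try
				'Build the new user (property names follow the Create a User example later in this guide).
				Dim objUser As User = New User()
				objUser.Name = "NewUser"
				objUser.Description = "Created from a dashboard button"
				'Save the user. The argument list is an assumption; the system validates that the calling user
				'is in the Administrators group or holds the ManageSystemSecurityUsers role.
				BRApi.Security.Admin.SaveUser(si, objUser, True, Nothing)
				Return Nothing
			Catch ex As Exception
				Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
			End Try
		End Function
	End Class
End Namespace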
Combined Roles
When granted access to more than one of these roles, you gain more functionality within the scope of the designed capabilities and restrictions. For example, if you have both the users and groups roles, you can copy a user and also add a user to a group. Certain functionality requires assignment of combined roles, such as Load and Extract.
l Exposes the Contents Folder in File Explorer\FileShare under Application and System.
AccessFileShareContents
l Exposes the Contents Folder in File Explorer\FileShare under Application and System.
l Allows only the ability to see the Contents folder and its sub-files, and allows download.
RetrieveFileShareContents
l The Contents Folder is not exposed to the user in the File Share for Application or System
using the File Explorer.
l All files are accessible through the OneStream application, such as through Dashboards
and Business Rules.
See Application Security Roles in Application Tools for more information on Application-based
security roles.
Encrypt System Business Rules - This allows a user to encrypt/decrypt a rule from the
Business Rule screen in the System tab, if the user is in the role.
See Business Rules under Application Tools for more information on Business Rule Encrypt
(Decrypt) functionality and use.
View All Logon Activity - Based on the security group chosen, this will allow a user access to the Logon Activity section found under System Tab | Logging | Logon Activity. The user will have access to view all users, but will not be able to log them off. By default, only administrators have access to this.
View All Error Log - Based on the security group chosen, this will allow a user access to the Error Log section found under System Tab | Logging | Error Log. By default, only administrators have access to this.
View All Task Activity - Based on the security group chosen, this will allow a user access to the Task Activity section found under System Tab | Logging | Task Activity. By default, only administrators have access to this.
SystemAdministrationLogon
This is for the System Administration application and set to Administration by default. When a
security group is assigned to this role, it becomes available in the Application list during logon.
SystemPane
Based on the security group chosen, this role lets you access the System Tab found at the bottom
left of the screen. By default, only administrators have access to this tab.
ApplicationAdminPage
Based on the security group chosen, this role lets you access the Application tab at the bottom left
of the screen. By default, only administrators can access this tab.
SecurityAdminPage
Based on the selected security group, this role lets you view but not modify security artifacts and
settings other than your password on System > Administration. By default, only administrators
have access to this section.
SystemDashboardAdminPage
Based on the security group chosen, this will allow a user access to the Dashboard section found
under System Tab > Administration > Dashboards. This section ties to the
ManageSystemDashboards which is under the System Security Roles in the above section. By
default, only administrators have access to this.
ApplicationServersPage
Based on the security group chosen, this will allow a user access to the Application Servers
section found under System Tab >Tools > Application Servers. By default, only administrators
have access to this.
DatabasePage
This role lets you access the Database Page on System > Tools. By default, only system
administrators have access to this section.
FileExplorerPage
Based on the selected security group, this role lets you access the File Explorer on System
>Tools. This section ties to the ManageSystemDatabaseFiles and ManageFileShare roles. By
default, only administrators can access the File Explorer.
SystemLoadExtractPage
Based on the security group chosen, this role lets you access Load/Extract on System > Tools but
you cannot actually import or extract items. By default, only administrators have access to this
section.
ErrorLogPage
Based on the security group chosen, this role lets you access Error Log on System > Logging. By
default, only administrators can access this section.
LogonActivityPage
Based on the selected security group, this role lets you access Logon Activity on System >
Logging. You can view all users, but cannot log them off. By default, only administrators have
access to this section.
TaskActivityPage
Based on the selected security group, this role lets you access Task Activity in System > Logging.
By default, only administrators can access this section.
TimeDimensionPage
Based on the security group chosen, this will allow a user access to the Time Dimension on
System > Tools. By default, only administrators have access to this section.
Every user must be assigned a user ID. Users can be added as native users or as references to
users stored in external repositories (e.g. Active Directory). Users can be externally authenticated
with these standard providers.
l LDAP
l MSAD
l Okta
l PingFederate
l Azure AD
l SAML
For information about external authentication with standard providers, see the Installation and
Configuration Guide.
BRApi.Security.Admin.GetUser
BRApi.Security.Admin.GetUser
BRApi.Security.Admin.SaveUser
BRApi.Security.Admin.RenameUser
BRApi.Security.Admin.DeleteUser
BRApi.Security.Admin.CopyUser
BRApi.Security.Admin.GetGroupsAndExclusionGroups
BRApi.Security.Admin.GetGroups
BRApi.Security.Admin.GetGroup
BRApi.Security.Admin.GetGroupInfoEx
BRApi.Security.Admin.SaveGroup
BRApi.Security.Admin.RenameGroup
BRApi.Security.Admin.DeleteGroup
BRApi.Security.Admin.CopyGroup
BRApi.Security.Admin.GetExclusionGroups
BRApi.Security.Admin.GetExclusionGroup
BRApi.Security.Admin.SaveExclusionGroup
BRApi.Security.Admin.RenameExclusionGroup
BRApi.Security.Admin.DeleteExclusionGroup
BRApi.Security.Admin.CopyExclusionGroup
BRApi.Security.Admin.GetSystemRoles
BRApi.Security.Admin.GetApplicationRoles
BRApi.Security.Admin.GetRole
BRApi.Security.Admin.CopyExclusionGroup
Examples
Get a UserInfo object and change the User Description
End If
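A fuller version of this example might look like the following sketch. The user name "JDoe" is hypothetical, and the GetUser and SaveUser argument lists are assumptions to verify against the BRApi documentation.
Dim objUserInfo As UserInfo = BRApi.Security.Admin.GetUser(si, "JDoe")
If (Not objUserInfo Is Nothing) Then
	'Change the description on the underlying User object.
	objUserInfo.User.Description = "Updated by business rule"
	'Persist the change (assumed argument list).
	BRApi.Security.Admin.SaveUser(si, objUserInfo.User, False, Nothing)
End If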
Get a Group and UserInfo object and add the Group to the User's list of parent Groups
If (Not objUserInfo.ParentGroups.ContainsKey(objGroupInfo.Group.UniqueID)) Then
	parentGroupIDs.Add(objGroupInfo.Group.UniqueID)
End If
End If
End If
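A fuller version of this example might look like the following sketch. The user and group names are hypothetical, and the collection types and the GetUser, GetGroup, and SaveUser argument lists are assumptions to verify against the BRApi documentation.
Dim objUserInfo As UserInfo = BRApi.Security.Admin.GetUser(si, "JDoe")
Dim objGroupInfo As GroupInfo = BRApi.Security.Admin.GetGroup(si, "FrankfurtControllers")
If (Not objUserInfo Is Nothing) Then
	If (Not objGroupInfo Is Nothing) Then
		'Start from the user's current parent group IDs (assumed to be keyed by Guid).
		Dim parentGroupIDs As New List(Of Guid)(objUserInfo.ParentGroups.Keys)
		If (Not objUserInfo.ParentGroups.ContainsKey(objGroupInfo.Group.UniqueID)) Then
			parentGroupIDs.Add(objGroupInfo.Group.UniqueID)
			'Persist the updated parent group list (assumed argument list).
			BRApi.Security.Admin.SaveUser(si, objUserInfo.User, False, parentGroupIDs)
		End If
	End If
End If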
Create a User
Dim objUser As User = New User()
objUser.Name = "NewUser"
Create a Group
Dim objGroup As Group = New Group()
objGroup.Name = "NewGroup"
objGroupInfo.Group = objGroup
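To finish these two examples, the new objects would still need to be saved. A possible completion is sketched below; objGroupInfo is assumed to be a GroupInfo declared earlier in the example, and the SaveUser and SaveGroup argument lists are assumptions to verify against the BRApi documentation.
'Save the new user (assumed argument list).
BRApi.Security.Admin.SaveUser(si, objUser, True, Nothing)
'Save the new group wrapped in its GroupInfo (assumed argument list).
BRApi.Security.Admin.SaveGroup(si, objGroupInfo, True)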
International Settings
Culture-aware number and text formatting can be used with:
l Server components
For consistency, we recommend that each user’s Culture setting match the Windows Regional
settings of their primary computer.
2. Click Format and set properties to reflect regional preferences. For example:
3. Click Location and set the Home Location to reflect the operating system’s location such
as Canada, Germany or Australia.
Allow Database Creation Via UI must be True in the Application Server Configuration.
Application Name
Enter the name of the new application.
Description
Provide a brief description (optional).
The new application displays in the Application list when you log on.
Select an existing application and click the test checkbox to verify an active connection.
System Dashboards
System Dashboards are similar to Application Dashboards but differ in that you can use
OneStream Framework database objects such as security and users. Application Dashboards
use Application-level data stored in Cubes.
See Application Dashboards in "Presenting Data With Books, Cube Views and Other Items" on
page 576 for information about designing Dashboards.
Logging
Each logging section displays content - that you can sort and filter - in a grid. To sort, click a column
heading and select ascending or descending.
Navigate pages by clicking page numbers, the forward button, the back button and so on. These
buttons are to the bottom left of a grid. To export data, right-click, select Export and then an
export file type.
Logon Activity
Use the following to identify who is logged on and who logged off.
Logon Status
This shows when users logged on and logged off.
User
This shows the user ID.
Application
This shows the application the user is logged into.
Client Module
This will display items such as Excel.
Client Version
This will display the version of the application you are using.
Client IP Address
This will display the end user's IP address.
Logon Time
A time stamp of when the user logged in.
Logoff Time
A time stamp of when the user logged off.
Task Activity
You can access task activity to identify user actions in two ways:
o Click Task Activity at the top right section of the web client.
o Administrators: Select System > Logging > Task Activity.
Refresh
This refreshes the Task Activity in order to see any changes made.
Task Type
This shows the user’s activity. (e.g., Consolidate, Process Cube, Load and Transform, Clear
Stage Data, etc.)
Description
This shows the details of the activity. (e.g., the POV, Multiple Data Units, etc.)
Duration
This displays the length of time the activity took.
Task Status
This shows the status of the activity. (e.g., Completed, Failed, etc.) Canceling the task will
transition it from Canceling to Canceled.
User
This shows the user ID.
Application
This shows the application to which the task was processed.
Server
This shows the application server being utilized.
Start Time
The starting time stamp of the activity.
End Time
The ending time stamp of the activity.
Queued CPU
This provides the % CPU utilization for when the task was initiated.
Start CPU
This provides the % CPU utilization when the job began from the queue.
For more details on Task Queuing, see Data Management in "Application Tools" on page 779.
Within the grid, there are two icons on the left side of the row. If highlighted, you can drill down by clicking them. The first icon shows child steps within a particular task that has run. The second shows detailed information about the error.
Error Logs
Administrators can use the following to evaluate errors on System > Logging.
Description
Displays a brief description of the error.
Error Time
Indicates when the error occurred.
Error Level
Displays the error type such as Unknown, Fatal, or Warning.
User
Displays the user ID.
Application
Identifies the application a user is logged onto.
Tier
Displays the application tier such as App Server, Web Server or Client.
App Server
Identifies the application server to which a user was connected when they encountered an error.
System Tools
There are a variety of system tools that can be used to manage the OneStream application. These include system Business Rules, the Database page, tools that allow you to check the overall health of the application environment, and the File Explorer, to name a few. In this section you will learn how to use these and other tools to manage the application system.
Under each Case statement, use these rules and related Args and BRApis to check the current
Server Set capacity, query metrics about a Server Set or Azure Database, and identify the impact
of the Server Sets volume or level of Azure Database deployed. See "Azure Database
Connection Setting and Server Sets" in the Installation and Configuration Guide.
Example starting point of empty System Extender Business Rule upon creation:
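(The skeleton below is an illustrative sketch based on the general OneStream business rule pattern; the exact namespace, args type, and function-type names are assumptions to verify in the Business Rules editor.)
Namespace OneStream.BusinessRule.SystemExtender.ScaleSetManagement
	Public Class MainClass
		Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As SystemExtenderArgs) As Object
			Try
				Select Case args.FunctionType
					'Add a Case statement for each supported function type and use the related Args and
					'BRApi calls to check Server Set capacity and Azure Database metrics.
					Case Else
						'No action for unhandled function types.
				End Select
				Return Nothing
			Catch ex As Exception
				Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
			End Try
		End Function
	End Class
End Namespace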
Database
The Database screen allows System Administrators to view all of OneStream’s database tables
and provides tools for managing stored data and other information.
Tables
This gives read-only access to all data tables in the database and can be used for tasks such as debugging issues without direct access to the database, or reviewing deletion logging.
Tools
Data Records
Enter a Member Filter in order to view data for the entire system.
Environment
This section can be used to check the overall health of the environment, which contains Web
Servers, Mobile Web Servers, Application Server Sets and Database Servers. This will check the
connection status as well as the configuration.
The Environment page is designed to give both IT and power business users a way to manage and optimize their applications and the environment they run under. Using the Environment page, the user can monitor the environment, isolate bottlenecks, look at properties and configuration changes, and scale in/out application servers and database resources if needed. They can also customize what data to collect in log files, save collection metrics files, and replay collected performance data in many ways.
To access the Environment page, select System > Tools > Environment.
NOTE: The Environment page is only accessible via the OneStream Windows App.
Monitoring Environments
Instead of logging onto the server to collect metrics, use the Monitoring page to access real-time
Key Performance Indicators (KPIs) across an environment. Click System > Tools >
Environment > Monitoring to:
l Open: Access a metrics and configuration setting file from the File System or a local folder.
l Save As: Save metrics and configurations locally or in the File System.
l Settings: Specify these types of KPI metrics to monitor and the monitor frequency:
o Environment
o Application Servers
o Database Servers
o Server Sets
l Refresh Automatically Using a Timer: Retrieve metrics based on the Play Update
Frequency interval setting.
l Metric and Task Time Range: Indicate how much historical data to retrieve and depict on
the Performance Chart. Applying a time range can help identify the cause of a server event
or issue.
l Y-Axis: If Auto Range box is selected, the system sets the range for the Min and Max
values on the Y-Axis. If cleared, you can set the Min and Max ranges on the Y-Axis.
l Filter Type: Specify the type of servers for which to collect metrics.
l Result List: Displays selected Source List items for which to collect metrics.
l Filter Type: Refine the types of application servers, database servers or server sets for
which to collect metrics.
l Source List: Displays the list of filtered application servers, database servers or server
sets.
These settings, shown below in the Environment Monitoring section of the Application Server Configuration File, determine which metrics are collected and how often.
Web Servers
This section lists all the web servers that are participating in that environment. Each web server will display its configuration and its audited settings.
Configuration
Sample Web Server config settings
Identity Provider
Used to specify how often each server updates its record that it is alive and responding to user
input.
Name
This is the name of the server as it is defined in the web configuration file.
WCF Address
The full URL address of the server.
WCF Connection
This examines the status of the connection. Ok = connected
Audit
Sample Web Server audit setting
Property Type
Type of server property in the hardware or software that was changed.
Property Name
The name of the property in the hardware or software that was changed.
Value From
Displays the original value of the property.
Value To
Displays the new value of the property.
Description From
Displays the original description of the setting if available.
Description To
Displays the new description of the setting if available.
Timestamp From
Displays the original date and time of the setting if available.
Timestamp To
Displays the new date and time of the setting.
User From
Displays the original user name if available.
User To
Displays the user name of the person that made the change if available.
Pause
Used to pause any request to a WCF Address connection. This connection could be either an Application Server or a Load Balancer. This can be set in the web configuration file.
Resume
Used to resume any request to a WCF Address connection. This connection could be either an Application Server or a Load Balancer. This can be set in the web configuration file.
Name
This is the name of the server as it is defined in the web configuration file.
WCF Address
The full URL address of the server.
WCF Connection
This examines the status of the connection. Ok = connected
You can specify the following settings at the Server Set level using the Server Configuration
Utility, but can override them at the individual Application Server level.
l Process Queued Consolidation Tasks: If true, the server can process consolidation tasks.
l Process Queued Data Management Tasks: If true, the server can process data
management tasks.
l Process Queued Stage Tasks: If true, the server can process stage tasks.
l Queued Tasks Require Named Application Server: If true, the server only runs
assigned tasks.
l Azure Resource Group Name: (Azure Scale Set Only) The resource group name for the
set.
l Azure Scale Set Name: (Azure Scale Set Only) The scale set name in the resource group.
l Can Change Queuing Options On Servers: If true, Administrators can change the
queuing behavior - impacting stage, consolidation and data management tasks - of specific
servers.
l Can Pause or Resume Servers: If set to true, then the user can pause and/or resume the
server from the Environment page via the Pause and Resume buttons.
l Can Stop or Start Servers (Azure Scale Set Only): If true, users can stop / restart the
server on the Environment page.
l Maximum | Minimum Capacity (Azure Scale Set Only): A fail-safe setting to identify the maximum | minimum number of servers.
l Process Queued Consolidation Tasks: If true, the server can process consolidation
tasks.
l Process Queued <feature> Tasks: If true, the server can process tasks such as data management and stage tasks.
l Queued Tasks Require Named Application Server: If true, the server only runs its
assigned tasks.
l Server Names for Standard Server Sets (Supports *? Wildcards): Specify server
names if the standard server set type is used.
l Server Set Provider Type: Specify the external authentication provider used, such as
Azure.
l System Business Rule Name (Azure Scale Set Only): The name of the extender business
rule for Azure scale set and database scaling.
Pause
Pause will stop the server from seeking more tasks to run from the queue but will let tasks that have already started finish.
Resume
Server will resume accepting tasks from the queue.
Stops the server, but while in this state it will continue to incur Azure compute charges. The public and internal IP addresses will be preserved.
Stopping this way means the cost of the virtual machine will not be charged, but the public and internal IP addresses will be deleted.
From the database connection items, the user can expand the SQL server elastic pool if one is
configured, view hardware and server properties, view audit info, look at SQL server metrics, and
run some diagnostic commands to track down performance issues.
Behaviors
Sample database connection Behavior tab:
The Behavior tab appears if SQL Server on Azure and an Elastic Pool are configured and used. Using the Behavior tab, you can increase or decrease the resources available to an Elastic Pool based on resource needs. When rescaling Elastic Pool DTUs, database connections are briefly dropped. This is the same behavior that occurs when rescaling Elastic Pool DTUs for a single database (not in a pool).
Configuration
The configuration tab will display SQL server configuration properties.
Hardware
The hardware tab will display hardware related information pertaining to SQL server. Sample
hardware properties:
Audit
The audit tab will display any changes to the SQL Server properties. Sample configuration audit
report:
Performance
The Performance tab will display metrics pertaining to SQL Server:
Diagnostics
The Diagnostics tab allows the user to run SQL diagnostic commands to determine performance issues on the database instance.
l The Deadlock SQL command lists any deadlocks on the SQL Server instance.
l Top SQL Commands lists the top SQL statements using the ordering selected by the user (Total Logical Reads, Total Logical Writes, Total Worker Time).
Configuration
The Configuration tab contains the application-specific information:
Audit
The audit tab will show if any of the application configuration settings have been changed.
Diagnostic
The Diagnostics tab will show a report of current schema table fragmentation. Sample of table fragmentation on the Framework database showing all tables beyond 70% fragmented:
Azure Configurations
(Azure Only or if Azure Elastic Pool Being Used)
Environment Monitoring
The Environment Monitoring section configures how and how often metrics are collected. What is
available for metrics or monitoring is based on configurations made in the OneStream Server
Configuration Utility. See Installation and Configuration Guide.
Used to specify the address of the recycle management service. The protocol for the address
should be set to however the service is deployed (https or http) and the port (default is 50002).
The asterisk will force the service to use the fully qualified domain name of the executing server.
Used to specify how often the app servers should be restarted. The default is 24.0. Use 0.0 to turn
off auto recycling. Fractional numbers may be used in this setting.
Used to specify the time period within each day that the server can be recycled. This setting is only used if Number of Running Hours Before Automatic Recycle is set to 24.0. The default setting is 5. With that value, the application servers will be recycled between 05:00 UTC and the End Hour for Automatic Recycle setting.
Used to specify the time period within each day that the server can be recycled. This setting is only used if Number of Running Hours Before Automatic Recycle is set to 24.0. The default setting is 7. With that value, the application servers will be recycled between the Start Hour for Automatic Recycle setting and 07:00 UTC.
Used to specify the maximum number of minutes that the server should pause its ability to run
newly queued tasks before recycling. The default setting is 30. This setting is the time period that
the server will use to allow previously started tasks to finish. If there are no previously running
tasks on the application server, the setting is ignored.
Specifies how often the system runs its system monitoring checks (e.g., deadlocks).
Detailed Logging
If true, then log whenever we enter and exit the metric collection and the Active System check.
Detailed Logging
If true, then log whenever we enter and exit the Task Load Balancing function and what was run.
The Azure section is used to set the Azure resource group name and the scale set name as it
exists on Azure. The Scaling options are used to set the scaling capacity and the Scaling Type
(e.g. Manual or BusinessRule). The System Business Rule Name must be set if the Scaling Type
is set to Business Rule.
The Behaviors section is used to determine how this scale set is used: whether it can be called by the web server or used as a processing server for consolidations and Stage Load.
The General section is used to determine whether this scale set is used and who its provider is (e.g., Azure, External …).
File Explorer
Documents and saved Point of View settings can be stored in the Application database.
Upload a specific file into any folder. If a user has Administrator rights or is in the
ManageFileShare security role, he/she can upload and delete files in the Harvest folder.
Import or Extract from the Application database and the System database. These are
sent to the main product folder created during the initial configuration (e.g. C:\OneStream).
A common Cube Point of View can be stored for continued reuse by multiple users. The POV can
be stored for all users in the Public area or for the specific user in the User directory. To create a
saved, named POV, right click the Cube POV node under Point of View within the Context Pane:
File Share
File Share is a self-service directory that temporarily stores files before they are imported into the
Application or System Database. Files stored in an Import folder are accessible only through that
Workflow Profile.
Permissions
Administrators grant permissions for folders and individual files by adding a user to the associated
group:
Content Folder
Both the Application and System folders contain an auto-generated folder named Content,
intended to store files larger than 300 MB.
Permissions
The Application-specific and System Content folders are managed by the following system
security roles:
l Non-Administrators can be assigned rights to modify or access the Content folders, or can be given limited rights to them.
Uploads
By default, when this flag is set to True, authorized users can upload or edit files or folders in the
File Share using the OneStream File Explorer (required to use Content folder). When set to False,
no users will have upload, edit file or edit folder access to the File Share using the OneStream File
Explorer. However, normal access rules will apply for browsing files and folders in the File System
section.
Users will receive a security error when attempting to write or edit files and folders using the API or other non-File Explorer methods.
Windows Application
l Uploads (Content): Up to 2 GB
l Downloads (Content): Up to 2 GB
Overview
OneStream lets you save documents or upload documents from your device into the
application. These documents are stored in one of three file storage root folders:
l Application Database: displays stored documents for the current application only.
l System Database: displays stored documents for the entire system without affecting the
current application.
l File Share: this is a self-service directory that temporarily stores files before they are
imported into the Application or System Database.
File Explorer is the product tool inside the Windows Application that lets you browse saved
documents or upload new ones.
For example, a spreadsheet can be saved to any of the three root folders. A Cube POV can be
saved to either the application or system folders.
The Dashboard component “File Viewer” also launches the File Explorer component.
1. Go to Start > OneStream Software > OneStream Server Configuration Utility, right-
click and select Run as Administrator.
4. In the Application Server Configuration Settings section, locate Whitelist File Extensions
and click the ellipses (...) on the far-right of the field.
5. In the dialog box, click the plus sign (+) then type the whitelisted file extension. For
example, .txt.
8. Click Save to save the changes to the Application Server Configuration File.
NOTE: Cloud customers should contact support to have this configuration change
made for them.
Also, if you enter any of the following characters, they will be automatically removed. For
example, typing .txt (with a period) becomes txt.
/ ^ = '
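As a rough conceptual sketch only (written in Python, not OneStream's actual implementation), the entry cleanup and the resulting upload check could look like the following; the function names are illustrative, and the stripping of the leading period is an assumption based on the .txt example above:

# Conceptual sketch of whitelist normalization and the upload check.
# The removed characters come from the list above; stripping the leading
# period is an assumption based on the ".txt becomes txt" example.
REMOVED_CHARS = set("/^='")

def normalize_extension(entry: str) -> str:
    cleaned = "".join(ch for ch in entry if ch not in REMOVED_CHARS)
    return cleaned.lstrip(".").strip().lower()

def is_upload_allowed(file_name: str, whitelist: set) -> bool:
    extension = file_name.rsplit(".", 1)[-1].lower() if "." in file_name else ""
    return extension in whitelist

whitelist = {normalize_extension(e) for e in [".txt", "csv", "XLSX"]}
print(is_upload_allowed("forecast.xlsx", whitelist))  # True
print(is_upload_allowed("macro.xlsm", whitelist))     # False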
3. Browse to the file you would like to upload and click OK.
If the extension of the file is whitelisted, it will be uploaded. If not, you will receive an error
message.
Troubleshooting
After a whitelist is defined, new files whose extension is not in the whitelist will be restricted from
uploading.
If you upload a file that contains an exploit, malware, or a malicious script or macro, other users
can download the file and infect their machine.
Conclusion
Whitelisting file extensions allows only certain types of documents to be saved in the File
Explorer, which helps reduce the risk of introducing malicious file types into the system.
Documents
Public
Files available to everyone with access.
Security Access
Enables a user to access an object and read the content.
Maintenance (Group)
Enables a user to view an object and create, modify and delete objects in Groups. If in a
Maintenance Group, a user does not need to be in the Access Group. The Maintenance Group
also controls user profile contents.
Users
Private documents only available to named users.
Internal
Files used internally that cannot be modified.
Tip: Only System Administrators have access to this portion of the tool.
Extract
Choose an item from the drop-down list, click the Extract icon to start the extract process, then
name the output file.
Load
After browsing to the file, click the Load icon to initiate the process.
Security
This covers System Roles, Users, Security Groups, and Exclusion Groups, which can be found
under the System Tab| Administration| Security. The screen will display Extract Unique IDs. If
this box is checked, the unique IDs OneStream assigns to each user will be extracted with the
security. When moving security changes from one OneStream environment to another, trying to
load the security with the unique IDs into the destination environment can result in an error if some
of the records already exist. If this is the case, uncheck this box and extract without the unique
IDs. Items to Extract also displays and by default All is chosen. Choose specific items in each
section or turn the sections on or off.
System Dashboards
This has multiple sections such as Maintenance Units and Profiles. Under Maintenance Units, the
Groups, Dashboard Components, Dashboard Adapters, and Dashboard Parameters all go
together. System Dashboards can be found under the System Tab| Administration| Dashboards.
The screen will display Items To Extract and by default All is chosen. Choose specific items in
each section or turn the sections on or off.
Error Log
This covers the Error Log found under System Tab| Logging| Error Log. The screen will display
an Extract All Items check box along with a Start Time and End Time. If the check box is
not checked, choose the Start and End Time. Click the Extract button and define
where the file will be saved.
Task Activity
This covers the Task Activity found under System Tab| Logging| Task Activity. The screen will
display an Extract All Items check box along with a Start Time and End Time. If the check box is
not checked, choose the Start and End Time. Click the Extract button and define
where the file will be saved.
Logon Activity
This covers the Logon Activity found under System Tab| Logging| Logon Activity. The screen will
display an Extract All Items check box along with a Start Time and End Time. If the check box is
not checked, choose the Start and End Time. Click the Extract button and define
where the file will be saved.
Time Dimensions
Applications can have a monthly or weekly Time Dimension. Time Dimension Types determine if
an application uses a weekly, monthly or 12/13 period frequency, and the type of calendar (such
as 445 or 454). In a new application, a custom Time Dimension Type can be used so users can
specify the number of months in a quarter and the number of weeks in a month. A short sketch of
the 445 and 454 week patterns follows the list of types below.
After you create a Time Dimension, an XML file is generated. New applications use this file to
implement the desired Time Dimension. If a Time Dimension Type is not assigned to a new
application database, the default Standard Type is used.
Note: Applications created prior to 4.1.0 can be converted from a monthly application to a weekly
application using the Database Configuration Utility. Contact Support for assistance.
l Time Dimension Type Standard: Creates a Monthly Time Dimension and stores data by
month in the data tables. Applications created prior to version 4.1.0 use this type.
l StandardUsingBinaryData: Creates a Monthly Time Dimension and stores data in a
binary data table. Use this type if you may need to later convert an application to a Weekly
Time Dimension.
l M12_3333_W52_445: Creates a 12 Month, 4 quarter, 52 Week, 445 calendar Time
Dimension.
l M12_3333_W52_454: Creates a 12 Month, 4 quarter, 52 Week, 454 calendar Time
Dimension.
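As referenced above, the 445 and 454 types differ only in how the 13 weeks of each quarter are split across its three months. A conceptual Python sketch of those week patterns (illustrative only, not the product's calendar engine):

# Conceptual sketch of the week-per-month patterns behind the calendar types above.
# Each quarter holds 13 weeks, split 4-4-5 or 4-5-4, and the pattern repeats in all
# four quarters for a 52-week year.
CALENDAR_PATTERNS = {
    "445": [4, 4, 5],
    "454": [4, 5, 4],
}

def weeks_per_month(calendar_type: str) -> list:
    """Return the number of weeks in each of the 12 months for the given pattern."""
    return CALENDAR_PATTERNS[calendar_type] * 4

for name in CALENDAR_PATTERNS:
    months = weeks_per_month(name)
    print(name, months, "total:", sum(months))  # total is always 52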
A Workflow Unit is an individual period within the selected year and Scenario combination for a
particular Workflow Profile. Twelve Workflow Units or periods are always displayed for selection,
but only one can be selected and active at a given time. Each available Workflow Origin step,
as well as the collective data confirmation and certification process, must be completed for a
single Workflow process to be finalized.
Right-Click Options
Right-click a Workflow month, Workflow Input Type, or Dependent Status cell to display the
following options. Not all options are available for every object.
Displays the Workflow Status of each Origin process (Import, Forms, and Journals). See
Dependent Status under Certify for more details on this feature.
Right-click any Workflow channel. This provides an audit for every Workflow task’s process
including the date and time of the process, the user performing the process, how long it took for
the process to complete, and any errors that occurred during the process. It also provides audit
history through Lock History > Workflow Lock/Unlock.
Lock/Lock Descendants
Unlock/Unlock Descendants
This will navigate to the Transformation Rules screen and allow a user to fix any Transformation
Rule errors that occur during the Workflow process.
This will clear all files and data within the current import. If you select All, you will get a warning
message before you are able to proceed.
This will clear all imported data from the Cube, but the data still remains in the Stage.
This allows a user to unlock and uncertify ancestors, or lock and certify descendants.
This allows a user to preserve data if changes need to be made and restore the preserved data if
necessary.
A description can be added to a Workflow Profile during its design. Descriptions can only be
added to Workflow Profiles using the Default Scenario Type. Scenario Type specific descriptions
cannot be used.
Workflow Tasks
This simplifies the data collection, consolidation, and certification process. Importing data,
entering a form, making a journal entry, and signing off on reports is completed through the
Workflow process.
Import
This process step gives the end user the ability to import data into the system. This can be through
a defined Data Source or a Data Connector. First click on the Import Workflow Origin located
under the active month, then click Import in the task bar.
After a user clicks this icon, they are prompted to either search for a file on the drive or initiate
the Data Connector. The system will then import the data into the Stage engine. The file will be
parsed into a clean tabular format with information on the Amounts, the Source ID, and each
Dimension. Once the data is loaded successfully, the Import task will change from blue to green.
Load Method
Upon clicking Load and Transform, a dialog will appear with four Load Options:
Replace
This will clear all data for the previous file that correlates with the specific Source ID and replace it
with the new file’s data. This can be done even if the previous data has already been loaded into
the Cube. Once the file is re-loaded, the user will need to complete all Workflow Tasks and load
the new data into the Cube.
Replaces all Workflow Units in the selected Workflow View (if multi-period), forcing a replace of
all time values in the multi-period Workflow View.
Replaces all Workflow Units in the selected Workflow View and all Source IDs. The delete runs in
a background thread while the new file parse or connector execution is running.
NOTE: This load method must always be used to delete ALL Source IDs. If the
workflow uses multiple Source IDs for partial replacement during a load, this method
cannot be used.
Append
This is used when additional rows are added to a source file and need to be loaded into the Stage.
This will not change any of the data already loaded for the source file, it will only add rows that
were not included in the previous file load.
This is used after data has already been loaded, but some changes have been made and the
calculations need to be repeated.
When this icon displays, the Global POV is enforced and data loading is limited to the Global
POV. The Enforce Global POV setting is controlled under Application|Tools|Application
Properties.
When this icon displays, data cannot be imported to time periods prior to the current Workflow
year. The Allow Loads Before Workflow View Year setting is controlled under
Application|Tools|Application Properties.
When this icon displays, data cannot be imported to time periods after the current Workflow year.
The Allow Loads After Workflow View Year setting is controlled under
Application|Tools|Application Properties.
Right-Click Options
View Source Document
This will open the source file imported into the application.
This will open a processing log and provide information on when and how the source file was
imported.
Drill Back
Using a SQL Connector allows a user to drill back to a source system and show detailed records
from a document, PDF, website, etc. within the application. This option is only available if data was
loaded using a Connector Data Source. For more details on this feature, see Drill Back in
"Collecting Data" on page 164.
Export
This will export the data into an Excel XML, CSV, Text, or HTML file.
Validate
After the data is loaded, the user will then validate the map and intersections. During the Validate
step two specific actions happen. First, OneStream will check to make sure each piece of data
has a map. Next, OneStream will check to make sure the combination of Dimensions can be
loaded into the Cube based on the constraints defined in the system (e.g. using Intercompany).
Click the green Validate task in the taskbar.
Click this icon to validate the data. If there are no errors, the Validate task will change from blue to
green. If there are any errors, Validate will turn red and the errors will need to be fixed before
proceeding.
Transformation Errors
The errors will be listed by the Dimension with the error, such as a new account that was added,
but not mapped. The user will be able to see the Source Value and the (Unassigned) Target
Value. The system will already know the Dimension and pull up the possible targets. On the right,
the user will be able to use the search box and click the funnel/filter button to find the correct
account. The default will be a One-To-One Transformation Rule. For more information on these
rules, see Transformation Rules in "Data Collection" on page 547.
Save any changes made and then click the Retransform icon; this will automatically re-validate
the data. If there is an intersection validation error, it will now appear. If there are no
intersection errors, the Validate Workflow step will change from blue to green.
Intersection Errors
Intersection Errors mean something is not correct with the entire intersection of data. For
example, a Customer Dimension may be mapped to a Salary Grade Account. This does not
make sense and the system will identify this prior to the load. To fix this error, click on the bad
intersection and drill down to see the GL/ERP Account; (see Drill Down in "Using OnePlace Cube
Views" on page 1020 for more details). Right-click to View Transformation Rules and from there,
investigate each rule for each Dimension in order to find which one does not make sense. Each
Dimension will be on the left, and each Target Value Column needs to be investigated. To fix a
Transformation Rule here, select the Target Value to correct, right-click and select Edit Rule.
Choose the appropriate Target Value and fix the intersection error. Click Save and close the next
two windows. Click the Validate icon again and if it was fixed correctly, the Validate task will
change from blue to green.
See Right-Click Options under the Import Task for details on the right-click functions.
Load
This simply loads the cleansed data from the Stage to the Analytic Engine. Click the Load task in
the taskbar.
Next, click the Load Cube icon. This now loads the data into the Consolidation Engine and
completes the Import process.
A green check will appear next to the Import Origin Workflow Step.
See Right-Click Options under the Import Task for details on the right-click functions.
Pre-Process
The Pre-Process task is the same as the normal Process task. This can be added as a first step in
the Form channel process. Please refer to the Process description further down in this section for
more details.
Input Forms
Once in this step, there will be two options under Workflow Forms: Required and Optional. The
required Forms will have to be completed before the end user can move on from this step. Click
on the specific Form, and manually enter the required data.
Attachments
This allows the user to attach supplemental files to specific Forms.
Once a Form is completed, click Save and then the Complete Form icon.
Note: A drop-down time period menu will display for a form with a defined time filter containing
multiple periods. If a Form does not have a time filter defined, this will be a standard button.
This will re-open a Form and clear the data. Once the Form is updated, Complete Form must be
clicked again.
Note: A drop-down time period menu will display for a form with a defined time filter containing
multiple periods. If a Form does not have a time filter defined, this will be a standard button.
When each Form is completed, click Complete Workflow and a green check will appear for the
Forms Origin Workflow Step.
This will re-open the Workflow. To make changes to the Form, Revert Form must also be clicked.
Make any necessary changes to the Forms and click Complete Form/Workflow again.
Show Report
This shows the Form in a Cube View Report.
Export to Excel
This exports a copy of the Form into Excel.
Cell Detail
Cell Detail can be entered on a Cube View used for a data entry form, or on a Cube View or Quick
View in Excel. Cell Detail is available on any writeable O#Forms or O#BeforeAdj Member.
See Right-Click Options in "Using OnePlace Cube Views" on page 1020 for details on the right-
click options in Forms.
Workflow Icons
Confirmation Rules Pass: a green circle with a white checkmark.
Confirmation Rules Fail: a red circle with a white exclamation mark.
Journals
Input Journals
Once in this step, there will be two options under Workflow Journals: Required and Optional. The
required Journals will have to be completed before the end user can move on from this step.
Use this to create a Journal using the selected template. Manually enter the required Journal data
and click Save.
Search
Type a keyword from a journal entry name and quickly navigate to the specific entry.
Multi-select for multi journal deletion is not supported. Journals must be deleted one at a time.
Once a Journal is completed, select Quick Post or Post. When all required Journals and any
optional Journals are finished, click Complete Workflow and a green check will appear for the
Journals Origin Workflow Step.
Depending on the security configuration, there are multiple options for Journals. If full security is
in place, the end user will be able to create and submit. The approver will approve or reject, then
the end user can post and complete the Workflow.
Process
Once in this step, click Process Cube in order to process the loaded data.
This has Calculation Definitions built behind it. This icon performs No Calculate and all the
standard Calculate, Translate, and Consolidate options. Furthermore, this can be configured to
filter by Entity for Reviewer level processes. Once the Cube is processed, the Process task will
change from blue to green and move to the Confirm task.
Confirm
This step runs the Confirmation Rules defined for this particular Workflow. This will immediately
inform users if they have passed or failed the data quality rules. The two types of statuses for this
step are Warning or Error. Warning means the user is outside of the threshold, but it will not stop
the process. Error means the user is outside of the threshold and this will stop the process and
turn this step to red. If anything has failed, revisit one or many of the previous steps to make sure
the data is accurate and complete. Once the data has passed all quality rules, the Confirm task
will change from blue to green and move to the Certify task.
Certify
This is typically the final step in the Workflow process. This certifies and completes the phase of
the Workflow.
Some questions may need to be answered regarding the processes. Click each set of questions
under the Questionnaires area. Answer the questions by clicking in the response cell and
selecting the correct answer. Comments can also be added in order to explain the answers. The
status will be displayed on the right. When this is completed, click on the Set Questionnaire
Status icon and select Completed and then OK.
Once each group of questions is completed, the Set Certification Status icon will be enabled, click
this and select Certify in order to certify the data as complete and accurate. This will give the final
green check for the month being processed and the data can now be trusted as complete and
accurate by any stakeholder that is analyzing this information.
This is a one-click option to expedite the Certify process. No questions need to be answered.
This will give the final green check for the month being processed and the data can now be trusted
as complete and accurate by any stakeholder that is analyzing this information.
This data can now be used for Consolidations by users or managers responsible for this
Workflow. They can look at data as it moves up and perform their own top side adjustments,
confirmations and certifications at as many levels as is appropriate for the organization.
Click on Dependent Status to see the status of each required Workflow task to ensure they are all
completed. This will display the Workflow Profile name and all input types, the Workflow Channel,
the status of each input type, the last step completed for each input type, the percentage of each
step that is OK, In Process, Not Started, and steps with errors, and a record of when the last
activity took place for each step. See Right-Click Options for details on these right-click functions.
Import (Stage Only), Import, Validate (Stage Only), Import, Validate, Certify (Stage Only)
This setting in the Workflow limits the data load process to only the (Import), (Import, Validate), or
(Import, Validate, Certify) step. When the POV is set to this Workflow, only these steps will be
available. No data will be loaded to the Cube.
Multi-Period Processing
Click on the Year in the Navigation Pane to enact Multi-Period Processing options:
From here, perform multiple Workflow tasks for one to many time periods, as shown above.
Analysis Pane
In each Workflow step at the single period process within the monthly Process, Confirm and
Certify tasks, there will be an Analysis Pane under the Status Pane.
To view the Cube Calculation Status, click on Cube Views| Calculation Status to show a data grid
presenting the Calculation Status of the current active Workflow POV. This will be available at the
total Monthly (not an Origin process) Review, Process, or Certify tasks. See Calculation Status in
"About the Financial Model" on page 2 for more details on this feature.
Intercompany Matching
IC Matching shows any Intercompany discrepancies. If the button is red in the status
column, click the item to see the details. Each Intercompany counterparty will be visible and again
in red will be counterparties with an Intercompany variance. By clicking on the counterparty,
details including the Reporting Currency, Entity Currency and Partner Currency will be visible.
Select the Intercompany Partner to see the status of the Intercompany issue and any annotations
the partner may have made. Select the Difference row to see both parties’ statuses and
annotations.
This allows the user to update the status by selecting Not Started, Loaded, Adjusting, Disputed,
Finalized. The user may also include comments for the counterparty to see.
This allows the user to see the Workflow status of the counterparty.
Drill Down
This will allow the user to drill down on the Dimensions in order to get more information about the
Intercompany data.
See Drill Down in "Using OnePlace Cube Views" on the next page for more details on this feature.
See Intercompany Eliminations in "About the Financial Model" on page 2 for more details on
Intercompany.
To copy Cube View cells from a Data Explorer Grid to an Excel spreadsheet, hold CTRL,
select the desired cells, and then press CTRL+C. Navigate to an Excel spreadsheet, select a cell,
and press CTRL+V; this will paste the cells into Excel. This can also be done from an Excel
spreadsheet into a Data Explorer Grid.
While viewing these reports, users can right click on any cell in order to learn more about any
given number or piece of data. For more information on creating Cube Views, see Cube Views in
"Presenting Data With Books, Cube Views and Other Items" on page 576. For information on how
to use Cube Views, or advanced uses, see Cube Views in "Presenting Data With Extensible
Documents" on page 235.
Toolbar
Consolidate
This consolidates the data in the Cube View.
Translate
This translates the data in the Cube View.
Calculate
This performs a calculation on the Cube View data.
Row Suppression
This suppresses the data based on the Cube View’s row suppression settings.
Suppress Rows
This suppresses any data rows with zeroes, no data, etc. regardless of the Cube
View suppression settings.
Unsuppress Rows
This removes any data suppression set via the Cube View suppression settings and
displays all data rows including those with no data, zeroes, etc.
NOTE: The actions above vary based on the user’s security settings and restriction
properties set on the Cube View.
Show Report
Export to Excel
Users can open multiple Excel exports from a Cube View without being prompted to rename or
save the file. A version number will change with each export in sequence.
Select Parameters
This allows the user to select new Parameters and view the Cube View data differently.
NOTE: This is based on the Parameters set for this Cube View. This feature will not
apply to all Cube Views
This opens the Cube View Application page where changes can be made to the Cube View
design.
When viewing a Cube View with a large number of rows, use the search filter at the top of the
screen in order to navigate to the desired row. Type a keyword into the search filter and click the
Find Next Row icon in order to navigate to the preferred row. Continue clicking the icon in order to
navigate to any row with the keyword included in the row header name. This also works when
rows are collapsed.
Use <Ctrl+C> or <Ctrl+V> to copy and paste values into Cube View data cells. Select the desired
data cell, click <Ctrl+C> to copy the cell value and then select another cell to paste <Ctrl+V> the
value. Hold down <Ctrl> to select multiple cells and paste the value into the selected cells
simultaneously.
Right-Click Options
Expand/Collapse
If nested rows are used in a Cube View, right-click on any row header in order to collapse or
expand the selected row content. This feature works with the RowExpansionMode property in the
Cube View designer and controls how users view Cube Views in the data grid. See Rows and
Columns under Cube Views in "Presenting Data With Books, Cube Views and Other Items" on
page 576 for more details.
Calculate/Translate/Consolidate
Cube Views can be set to enable processing of the Cube View data. This includes Calculate,
Translate and Consolidate. There are also options to force these calculations and log the activity
in the Task Activity. Each of these options can be enabled or disabled on an individual Cube View.
For more details on these options, see Launching a Consolidation in "About the Financial Model"
on page 2.
Spreading
Spreading can be done from a Cube View while viewing it in the Data Explorer grid, in the
Spreadsheet feature and Excel Add-in. User experience may differ slightly across these three
user interfaces and more functions may be available in the Spreadsheet feature than others. This
functionality provides the ability to spread values over selected cells.
NOTE: The Spreading Dialog can be left open while entering data. If the dialog is left
open while users are spreading values on the grid, it will update to the respective
spreading behavior.
Spreading Type
Fill
This fills each selected data cell with the value in the Amount to Spread property.
Clear Data
This clears all data within the selected cells.
Factor
This takes the selected cell’s value and multiplies it by the rate specified.
Accumulate
This takes the first selected cell’s value and multiplies it by the rate specified. It then takes that
value, multiplies it by the specified rate and places it in the second cell selected, and does this for
all selected cells. For example, four cells are selected and the first cell has a value of 900.
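To complete the example with an assumed rate of 2 (the rate and the placement of the first result are assumptions for illustration only, not values from the product):

# Hedged sketch of the Accumulate behavior described above: four selected cells,
# the first holding 900, and an assumed rate of 2. Each cell compounds the
# previous result by the rate.
def accumulate(first_value: float, rate: float, cell_count: int) -> list:
    values = [first_value * rate]            # first selected cell times the rate
    for _ in range(cell_count - 1):
        values.append(values[-1] * rate)     # each later cell compounds the prior result
    return values

print(accumulate(900, 2, 4))  # [1800, 3600, 7200, 14400]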
Even Distribution
This takes the Amount to Spread and distributes it evenly across the selected cells.
Proportional Distribution
This takes the selected cell’s value, multiplies it by the specified Amount to Spread, and then
divides it by the total sum of all selected cells. If all the cells have a zero value, the Amount to
Spread will behave like an Even Distribution.
445 Distribution
This takes the Amount to Spread and distributes it with a weight of 4 to the first two selected cells
and a weight of 5 to the third cell.
454 Distribution
This takes the Amount to Spread and distributes it with a weight of 4 to the first selected cell, a
weight of 5 to the second cell and a weight of 4 to the third.
544 Distribution
This takes the Amount to Spread and distributes it with a weight of 5 to the first selected cell and a
weight of 4 to the second and third cells.
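The Even, Proportional, and 445/454/544 types above all split the Amount to Spread according to a set of weights. A minimal Python sketch of that arithmetic (illustrative only, not the product's spreading engine):

# Even uses equal weights, Proportional uses the existing cell values as weights,
# and the 445/454/544 types use the fixed patterns across three selected cells.
def distribute(amount: float, weights: list) -> list:
    total = sum(weights)
    if total == 0:                            # all zero cells: fall back to an even split
        weights, total = [1.0] * len(weights), float(len(weights))
    return [amount * w / total for w in weights]

amount = 1300
print(distribute(amount, [1, 1, 1]))          # Even: three equal shares
print(distribute(amount, [100, 300, 600]))    # Proportional: [130.0, 390.0, 780.0]
print(distribute(amount, [4, 4, 5]))          # 445: [400.0, 400.0, 500.0]
print(distribute(amount, [4, 5, 4]))          # 454: [400.0, 500.0, 400.0]
print(distribute(amount, [5, 4, 4]))          # 544: [500.0, 400.0, 400.0]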
Spreading Properties
Amount to Spread
Specify the value to spread over the selected cells. The value defaults to the last cell selected.
The way the amount in this field spreads varies by Spreading Type.
Clear Flags
Select this to clear any flagged cells.
Apply
Enter spreading values in the dialog and select Apply to perform spreading without closing the
dialog. The dialog closes upon clicking OK or Cancel.
In order to select multiple data cells, hold the Control Key (Ctrl) on the keyboard, and click on all
cells that apply. Spreading occurs across rows first and then down the columns. Once spreading
has been performed using the Spreading Dialog, users can type values on specific Members
without having to launch the dialog again. This spreading function applies the settings from the
last time the Spreading Dialog was used, so all Spreading Types and flagged cells can be utilized.
Even Distribution is the default Spreading Type if there is nothing currently set in the Spreading
Dialog.
Users can select multiple Cube View data cells and type over the primary cell in order to apply
spreading. If the primary selected cell is a Parent Member, a value can still be typed over it if
multiple cells are selected. Any locked base-level data cells are automatically flagged when
applying spreading.
When spreading across multiple time periods on one data row, users can select the Parent Time
Member and automatically apply spreading to all its Base Time Members without having to
manually select them.
For example, double-click on the Q1 Parent Member and the periods associated with Q1 will be
selected.
Press <Enter> and the spreading will automatically apply to the periods associated with the Quarter.
Allocation
Form data Allocations can be completed on a Cube View in the Data Explorer Grid, a Cube View
in Excel, and Quick Views.
Allocation Type
Clear Data
This clears all Form data for the specified Destination POV.
NOTE: Users can clear data for Dimension Members that are not displayed on the
Form.
Even Distribution
This provides an even distribution across the Destination Members.
Source POV
The Source POV determines the source intersection and applies the value from this intersection
to the allocation. The Source POV defaults to the last cell selected. Users can also select a data
cell and drag and drop the cell’s POV into this property in order to update it. Any Members not
included in the Source POV will default to the user’s Cube POV.
Destination POV
The Destination POV determines the intersection where the allocation will take place. A Member
for each Dimension must be specified for the allocation to take place in the correct intersection.
Users can select a data cell and drag and drop a POV into this field in order to display all
Dimensions. If the Destination POV is similar to the Source POV, specify as many Dimensions as
necessary and the remaining Members will come from the Source POV, or leave this field blank
and all POV Members will be based on the Source POV. Any Members not included in the Source
or Destination POV will default to the user’s Cube POV.
445 Distribution
A 445 Distribution takes the source amount and applies a weight of 4 to the first two specified
destination intersections and then a weight of 5 to the third intersection. This applies the
allocation across rows first and then moves down the column.
454 Distribution
A 454 Allocation takes the source amount and applies a weight of 4 to the first destination
intersection, a weight of 5 to the second and then a weight of 4 to the third intersection. This
applies the allocation across rows first and then moves down the column.
544 Distribution
A 544 Allocation takes the source amount and applies a weight of 5 to the first destination
intersection and then a weight of 4 to the second and third intersections. This applies the
allocation across rows first and then moves down the column.
For Source and Destination property definitions, see Even Distribution above.
Weighted Distribution
This applies a weighted value to each specified Destination Member. The weights are determined
in a Weight Calculation Script which uses the specified Dimension intersections’ cell values.
For Source and Destination property definitions, see Even Distribution above.
It then applies the weighted values to the intersections specified in the Destination POV/Member
Filter combination.
These are system Substitution Variables and are determined by the following:
|Weight|
This is the weight value applied to each Member in the Destination POV.
|TotalWeight|
This multiplies the weight by the number of allocated intersections. For example, if there is a
weight of 5 being allocated across 12 intersections, the |TotalWeight| would be 60.
Identify specific Dimension Members separated by a colon in order to determine the |Weight| and
|TotalWeight|. Any Dimensions not specified in this field will come from the Destination POV.
Users can drag and drop a POV from a data cell in order to jumpstart the calculation script. In
order to apply the Members specified in the Member Filter property, delete that particular
Dimension in the script.
Example
Destination POV
Cb#Houston:E#[Houston Heights]:C#USD:S#Actual:T#2011M5:V#Periodic:A#56000:F#None:O#Forms:I#None:U1#None:U2#None:U3#None:U4#None:U5#[IFRS Adj]:U6#None:U7#None:U8#None
Member Filter
T#2011M7,T#2011M8,T#2011M9
A#26000:O#Top
The remaining Dimensions are determined by the Destination POV and Member Filter.
Cb#Houston:E#[Houston Heights]:C#USD:S#Actual:V#Periodic:A#56000:F#None:O#Forms:I#None:U1#None:U2#None:U3#None:U4#None:U5#[IFRS Adj]:U6#None:U7#None:U8#None
The Time Dimension was removed from the script in order to use the three Time Members in the
Member Filter.
In the example above, the Weight Calculation Script is identifying three intersections of
data. OneStream uses the sum of those three intersections as the |TotalWeight| and each
individual intersection as the |Weight|. This determines how to spread the Source Amount
amongst the Destination Members.
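A hedged sketch of how those pieces might fit together, assuming the A#26000:O#Top members shown above belong to the Weight Calculation Script; the parsing helpers are illustrative Python only, not OneStream API calls:

# Illustrative only: combine the Destination POV, the Member Filter time members,
# and the assumed script members into the three weight intersections described above.
def parse_pov(pov: str) -> dict:
    """Split 'Cb#Houston:E#[Houston Heights]:...' into a {dimension: member} map."""
    return dict(token.split("#", 1) for token in pov.split(":"))

destination_pov = parse_pov(
    "Cb#Houston:E#[Houston Heights]:C#USD:S#Actual:V#Periodic:A#56000:"
    "F#None:O#Forms:I#None:U1#None:U2#None:U3#None:U4#None:U5#[IFRS Adj]:"
    "U6#None:U7#None:U8#None"
)
member_filter = ["T#2011M7", "T#2011M8", "T#2011M9"]
script_members = parse_pov("A#26000:O#Top")   # assumed to be the weight script members

weight_intersections = []
for time_member in member_filter:
    intersection = dict(destination_pov)      # unspecified dimensions fall back to the POV
    intersection.update(script_members)       # script members override matching dimensions
    dim, member = time_member.split("#", 1)
    intersection[dim] = member                # one intersection per Member Filter entry
    weight_intersections.append(intersection)

for i in weight_intersections:
    print(i["T"], i["A"], i["O"])             # 2011M7/M8/M9 against A#26000 and O#Top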
Enter a value in order to create an even distribution to all specified Destination Members.
Advanced
Advanced Allocations are similar to Weighted Distributions, but allow users to override two
destination Dimensions, control how the weights are calculated, and offset amounts for the
Source and Destination Members. See Form Allocations in "Collecting Data" on page 164 for an
Advanced Allocation example.
For Source and Destination property definitions, see Even Distribution above.
This determines how the Weight Calculation Script result is calculated. The default calculation is
|SourceAmount| * (|Weight|/|TotalWeight|).
Example:
(|SourceAmount| * (|Weight|/|TotalWeight|)) *1.5
This will calculate the weighted value for each specified intersection, multiply it by 1.5 and apply
that value to the destination data cell.
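A worked sketch of that calculation with illustrative numbers (in the product the weights come from the Weight Calculation Script intersections; the amounts below are assumptions for the example):

# Worked sketch of the default weighted allocation and the modified script above:
# allocated value = |SourceAmount| * (|Weight| / |TotalWeight|), optionally scaled.
source_amount = 120000.0
weights = [10.0, 30.0, 60.0]       # one |Weight| per destination intersection
total_weight = sum(weights)        # |TotalWeight| is the sum of the weight intersections

default_allocation = [source_amount * (w / total_weight) for w in weights]
print(default_allocation)          # [12000.0, 36000.0, 72000.0]

# The modified script in the example scales each weighted value by 1.5.
modified_allocation = [source_amount * (w / total_weight) * 1.5 for w in weights]
print(modified_allocation)         # [18000.0, 54000.0, 108000.0]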
Set this to True in order to translate the destination currency if it differs from the currency on the
Form.
Offset
Specify specific Members to offset when doing an allocation. Transferring values out of the
Source POV requires an entry in the Source Transfer POV while entries in Source Transfer Offset
POV and Destination Offset POV will ensure that entries are balanced by updating a related
Account.
In the example above, Monterey is transferring their IT Services Expense to four other Entities.
If a balanced entry is not required, the method below can be used by configuring the Destination
and Source Transfer POV.
NOTE: Both the expense and liability have been removed from Monterey and
transferred to the other four Entities:
Data Attachments for Selected Cell/Data Attachments for Selected Data Unit
Any cell in any grid can contain a data attachment. To attach a file, right click on any cell in a data
grid and select Data Attachment for Selected Cell. Click the upper left Create Data Attachment
icon and select an Attachment Type from the drop-down menu, give it a title, select a file to
attach or simply type in text, and click OK. The attachment will now appear in the Data
Attachments box.
TIP: This attachment can be viewed from any data grid that employs a Workflow Unit,
Scenario, and Period. To view all data attachments for any Workflow Unit, Scenario and
Period, right click on any cell and select Data Attachments For Selected Data Units. This
ensures that all data attachments become valuable analysis tools. This is part of the
View Dimension because each is given a Type and the data attachments can be
reported on as a whole and included in other types of reports.
The Spell Check feature is enabled for English Culture only. The culture is determined by each
user’s culture assigned in OneStream User Security. The culture is assigned to the OneStream
application on the Application Server Configuration Utility as “en-US”. Users with cultures other
than English (United States) will not have Spell Check available.
The user must right-click the triggered item to activate the Spell Check options where suggested
solutions are presented. Choosing the "Ignore" option is retained only for the current session;
closing and re-opening edit mode will re-check any previously ignored items.
Choosing Check Spelling will check the spelling of the text box content from any starting point.
The Options button will allow the user to modify the Spell Check behavior within the current task
session. The settings are not persisted as user preference.
Cell Detail
Cell Detail can be entered on a Cube View used for a data entry form, or on a Cube View or Quick
View in Excel. Cell Detail can also be loaded via an Excel or CSV template. See Loading Cell
Detail in "Collecting Data" on page 164 for more details on creating these templates. Cell Detail is
available on any writeable O#Forms or O#BeforeAdj Member. In order to disable this by account
or any other intersection, use a Conditional Input Business Rule. See Finance Business Rules in
"Application Tools" on page 779 for an example of this rule.
YTD/Periodic
This determines how the data is being entered in the Form. This only applies to Income
Statement Accounts. If a YTD Form is being used and a Periodic line item is entered, the Form
will calculate the YTD value and store it accordingly
These icons add, delete, and move Cell Detail records in the grid.
Clear All
This removes all Cell Detail records in the grid but does not remove any stored numerical data
from the Forms Member in the Cube.
1. Amount
Enter the amount for the Cell Detail Record Or Records.
2. Aggregation Weight
Enter an aggregation weight in order to calculate a cell item using simple multiplication. For
example, entering a -1 will reverse the value in the amount column (see the sketch after this list).
3. Classification
The user can select a Classification from a drop list on each line item, where the list of
Classifications is defined using a Dashboard Parameter specifically named as
CellDetailClassifications. Once this Dashboard Parameter is created, Cell Detail will
recognize the classifications without having to assign the Parameter to any Cube View or
Form.
Note: Users can add additional value items to a Parameter, or change existing items,
however, this will not change any classification assigned and stored to a line item in a form.
4. Description
Enter text information about the Cell Detail.
5. This displays the values that are going to be stored in the Origin Members based on what
was stored in the Cell Detail form.
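A minimal sketch of that roll-up, assuming each Cell Detail record contributes its amount multiplied by its aggregation weight to the value stored on the Forms Member (the record layout and the roll-up rule are assumptions based on the multiplication example above):

# Hedged illustration only: Cell Detail records rolling up to one stored value.
cell_detail_records = [
    {"amount": 5000.0, "weight": 1.0,  "description": "Base salaries"},
    {"amount": 1200.0, "weight": 1.0,  "description": "Bonus accrual"},
    {"amount": 300.0,  "weight": -1.0, "description": "Reclass to another account"},
]

stored_value = sum(rec["amount"] * rec["weight"] for rec in cell_detail_records)
print(stored_value)   # 5900.0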
Right click on any cell and select Cell POV Information in order to see a detailed summary of the
selected Members related to this intersection. All the major properties of these Members can be
seen from this dialog. The full Member Script and formula syntax to get to this value is also
displayed. Users can use the clipboards to copy and paste the retrieve functions. The XFGetCell
formula can be copied in this dialog and used in an Excel file to retrieve this specific data in Excel.
The XFCell formula can be copied in this dialog and used in a text file, such as Word or
PowerPoint, to retrieve this specific data in an Extensible Document. See Extensible Document
Framework in "Presenting Data With Extensible Documents" on page 235 for more details on this
retrieve function.
Interact with the Point of View in the Context Pane for the Cube View’s bolded Dimensions. When
these Dimensions are changed, the Cube View’s results will change. In the example below the
Cube View allows the Scenario and Time to change.
Cell Status
Right click and select Cell Status in order to view status properties and Dimension information
about a specific cell.
Drill Down
When choosing Drill Down from the right-click menu in a Cube View or Quick View, another tab
will open with the Drill Down results. Drill Down works the same whether a user is drilling from the
Data Explorer grid in OneStream, from the Excel Add-In, or from OneStream Spreadsheet. An
administrator might want to know what makes up Net Income for all Entities in Europe across all
product groups, and with the Drill Down option this can be accomplished.
TIP: Drill into any cell on a data grid or in Excel; it does NOT need to be a base level
number.
The resulting screen shows the drilled back intersection in the Drill Down History section. The
white cells show base amounts, meaning drilling cannot go further. The green cells can continue
to be drilled.
Download Results
These icons allow a download of the drill results into either a CSV or Excel file.
The following options become available when a user right-clicks on any field. Not all options will
be available for every cell.
Cell Status
Select this to see information about the cell such as if the Members have children, if it is
calculated, the lock status, etc.
Calculation Inputs
This gives details on the formula source accounts for a specific account.
TIP: OneStream keeps a bread crumb trail of all drill actions in the right side of the top
drill screen under the heading Selected Action. Users will never get lost in the data as
they can always start over from the top and begin a drill again.
Import Drill
Right click on the Origin Member Import and select All Aggregated Data. Right click again on the
Origin Channel and click Load Results for Imported Cell and then select Navigate to Source Data
to get down to the GL Account level. Right click on any cell in the source system data line to get
down to the source document that created that line item.
Forms Drill
Right click on the Origin Channel Forms and select All Aggregated Data Below Cell. Right click
again on the Origin Channel and click Audit History for Forms or Adjustment Cell. This is the
specific line item that created the drilled-on value.
TIP: To see each line item from the Form used to create the drilled-on value, click the
button View All Submitted Data Cells. This is now showing all the line items from the
Form used to create the actively drilled line item.
TIP: To see each line item from the Journal used to create the drilled-on value, click the
button View All Submitted Data Cells. This is now showing all the line items from the
Journal used to create the actively drilled line item.
To generate a smoothly formatted presentation of the Cube View, click this button while in
preview mode and a report similar to what one would see in Dashboards will open. Control column
widths and row heights from within the Cube View. See Application Tab| Application Properties in
order to show a company name and logo on all reports. There are also several application-wide
settings for these Data Explorer reports under Application Properties and under the Application
Tab| Presentation| Cube Views.
In order to print a Cube View, first generate a Report as shown above. Once the Report is
generated, print and export that view to other Formats, such as PDF, HTML, RTF (for Word),
CSV, Text, XPS, MHT or Excel.
Select a Dashboard and see a complete and organized series of reports, grids, charts, and graphs
all combined. A user may be prompted to enter Parameters in order to make the Dashboard
relevant. The Parameters are pre-defined when the Dashboard is created in the Application Tab.
NOTE: For OneStream Windows App, you must right-click the bookmark to jump to that
location.
Dashboard Toolbar
Select Parameters
Use this to select specific Parameters when viewing a Dashboard
Edit Dashboard
If security permits, select this to launch the Application Tab and make changes to the Dashboard
properties and its components.
Printing Options
Select to print a single Dashboard report from the web. Click the down arrow and select
Print via PDF to turn the report into a PDF first and then print.
In order to save and print an entire Dashboard Book as one PDF, right-click on the Dashboard
name and select Combined PDF File or PDFs In Zip File.
For more information on building Dashboards, see "Application Dashboards" on page 648 in
Presentation.
Storing Documents
An administrator can save public documents or templates for users to access for their close
process. These documents are saved in the System Tab| Documents where only administrators
have access. However, a user can access these public documents in OnePlace.
NOTE: Right click on any document or folder and select File Explorer in order to upload
files from the OnePlace or Systems tab. See File Explorer in "System Tools" on
page 947 for more details on this feature.
NOTE: Files created with either the Spreadsheet or Text Editor feature are also visible
here. These files can be opened by right-clicking the file and selecting one of the three
options.
l Open: opens the file in its related Microsoft application, if installed on the local PC
l Open With…: allows the user to select which application they want to use to open the file
NOTE: The File Explorer option is also available by right clicking a file. Files can also be
uploaded or opened from this window.
l Change point-of-view, interact with Forms, assign Cube Views, drill through to source data
and update Workflow status.
l Eliminate risk and duplication with standardized and centralized spreadsheet controls.
l Sheet based calculations remain, even when rows or columns are added.
l You can enter formulas in the Cube View and retain formulas while making changes to the
sheet or workbook.
Log on
1. Click Logon.
2. In the Server Address field, click to add or select the URL for the server. The Manage
Connections window opens.
NOTE: If you use the External Provider Sign In, the following Authentication
window opens.
3. Enter the URL of the server and a description in the Description field.
NOTE: After connections have been added, they will be visible in the list.
8. Click Logon and then in the Application field, click the arrow and select the Application.
If using OneStream and the Excel Add-In needs to be updated, the executable file will need to be
un-installed and re-installed. See Installing the Excel Add-In Client Package in Installation and
Configuration in the Installation and Configuration Guide for more information.
1. Documents can be found in the OnePlace tab in the Navigation Pane. There are Public and
User-based documents at the Application and System level. You can load and launch Excel
templates.
2. Forms in Workflow can have an optional Excel template attached that you can launch
when filling out data. This can be an alternate method to do analysis or enter values.
3. Data Attachments can be linked to either a single cell or an entire Data Unit and contain text
and file attachments. A reader of the comment can launch attachments.
Task Pane
The Task Pane includes three tabs: POV, Quick Views, and Documents.
Point of View
The three points of view are Global POV, Workflow POV, and Cube POV. When a cell is
selected from a Cube View, the bolded Dimensions can be changed and will impact the data.
Documents
You can open Application or System Documents for Public or Users from the File Explorer.
Quick Views
You can pivot, drill back, create data sets, and design workbooks to quickly analyze data. You can
enter members to generate a Quick View or even create a Quick View from another Quick View.
Once you have narrowed down the data set, it can be saved and used again. See Create or
Modify Quick Views using Type in Functionality for more details on how to create and modify
Quick Views by entering Dimension Members.
This option allows the ability to select a saved Quick View. It will then highlight the Quick View
selected.
Select Member
Click Select Member in the Quick View POV to select a dimension member.
Filter
Click Filter to launch the Member Filter Builder and query several Dimension Members at once. Click Apply in the Member Filter Builder to see the results prior to closing the Builder dialog. See Member Filters in "Cubes" on page 400 for more details on Member Filters and how to use the Member Filter Builder.
Column Dimensions
The dimensions for the column view are defined and you can drag and drop from the Quick View POV or Row Dimensions.
Row Dimensions
The dimensions in the row view are defined and you can drag and drop from the Quick View POV or Column Dimensions.
Create Quick View
When the window opens, you can enter a name or select the default. Refers To is where the quick view shows in the Excel sheet. You can insert or delete rows or columns so when it's created, existing fields aren't overwritten.
Rebuild Quick View
This will rebuild a Quick View when using the Type-in feature of Quick Views. When additional Members are added to or extended past the range of the existing Quick View, highlighting those new fields and selecting the button will rebuild the Quick View and add the new Members and data accordingly.
Rename Selected Quick View
Select a Quick View and choose this icon to rename it.
Delete Selected Quick View
Select a Quick View and choose this icon to delete it.
Edit Quick View Options
The following properties appear once the icon is selected.
Insert Or Delete Rows/Columns When Resizing Cube View
This will move existing data either by row or column, so when the new Quick View is created, it does not overwrite the existing fields. Settings are True or False.
Name
This will use the name given to the Column or Row Dimension.
Description
This will use the description given to the Column or Row Dimension.
Short Description
This will use the short description given to the Column or Row Dimension.
Data Style
This is where the data style is set for the values in a Quick View. Choose any of the default Excel versions, or create a custom version through Excel and attach it here. For more information on Styles, see Style Types.
Undo Changes and Revert to Prior Quick View Settings
Clicking on this icon will undo any changes not wanted. OneStream will remember up to 100 previous actions.
Select Member
This icon can be used to search for Members within a Dimension Type and Dimension. This allows the resulting Members to be displayed in the Hierarchy, and multi-selected Members can be added to a Quick View. See more details below on the use of this button within Quick Views.
Keep Only
This icon will clear everything except the selected items.
Tip: Use the Control key to keep more than one item.
Remove Only
This icon will clear only the selected items.
Next Level
This icon has the same function as double-clicking on a row. It will go down to the next level.
All Tops
This icon will go back to the Tops of the Dimensions.
3. Then highlight the area and select Create Quick View. If using the Excel Add-in, select the
Quick Views tab.
Use a comma separated member list to build the Quick View in the single row and column cell to
generate the expected results.
Use the Member Expansion Functions to build the Quick View using comma separated members
in the current cell to generate the expected results.
NOTE: If suppression options are on, certain items may not display.
3. You can also insert a new row if needed or in the blank row above the Quick View enter the
#MemberName.
1. In the row below the existing dimension type the member name 69000, or in the column
next to the existing dimension type member name 2018M2.
1. In the row below the existing dimension add the new member names 60000, 61000, 54500.
1. In the row below the existing dimension add the new member names: 60999, 43000,
61000, 54500, 62000, 69000.
2. For example, change the Account dimension to the Entity dimension using E#Houston
Heights and add the Operating Sales A#60000 above the time dimension.
1. Enter E#Houston Heights, South Houston and add Operating Sales A#60000 above the
time dimension.
3. The two entities that were comma separated in a single cell in the two rows and the account
in both columns.
1. Enter three new members 60999, 43000, 61000. A dimension# prefix isn’t necessary if it’s
the same dimension.
2. Then highlight the new members and click Rebuild Quick View.
3. This results in three new members 60999, 43000, 61000 replacing 60000 and 54500.
1. Enter three new members 60999, 43000, 61000 in the same cell.
A dimension# prefix isn't necessary if using the same dimension; if member names are numeric,
you need to use a space after the comma or start the cell with an apostrophe, e.g.,
'60999,43000,61000 (see the parsing sketch after these steps).
2. Then highlight the new members and click Rebuild Quick View.
3. This results in three new members 60999, 43000, 61000 in addition to 60000, 61000,
54500, 69000.
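A conceptual sketch (not the add-in's actual parser) of how a typed-in cell of comma-separated members could be tokenized, defaulting to the current dimension when no Dimension# prefix is given and ignoring a leading apostrophe:

# Illustrative only: split a type-in cell into (dimension, member) pairs.
def parse_type_in(cell_text: str, default_dim: str) -> list:
    members = []
    for token in cell_text.lstrip("'").split(","):
        token = token.strip()
        if "#" in token:
            dim, member = token.split("#", 1)
        else:
            dim, member = default_dim, token     # same dimension, no prefix needed
        members.append((dim, member))
    return members

print(parse_type_in("'60999,43000,61000", "A"))
# [('A', '60999'), ('A', '43000'), ('A', '61000')]
print(parse_type_in("E#Houston Heights, South Houston", "E"))
# [('E', 'Houston Heights'), ('E', 'South Houston')]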
Notice that the Account selected in the Quick View by default is "63100 – Interest Income".
The "Select Member" button will default to the respective dimension type, dimension, and member.
The cursor will default to the selected member in the Quick View Hierarchy. The user can
now multi-select members in the Hierarchy tab by using Ctrl+click or Shift+click, then
selecting the right arrow button to add these members to the Result List.
3. Highlight the new records added and click the Rebuild Quick View button.
The results return the members that were multi-selected in the Quick View.
NOTE: To allow the Member Selector to transpose the selections to Columns, the user
should first highlight the blank columns, outside the current QuickView range. The
selections in the Member Selector will fill to this range without overwriting the current
QuickView Column, rather than applying down rows. Once added, select the active
QuickView and new column range as documented to enable the QuickView Rebuild
function for Type-In.
NOTE: The Select Member Dialog button on the Quick View toolbar control will include
the dimension key in the result list and on OK. Separately, the Select Member Dialog
button on the ribbon will not include the dimension key in the result list and on OK. Users
should refer to the Select Member Dialog button on the ribbon when building any
XFGet… function. The context menu will include the dimension key or not, depending on
whether the cell is on a Quick View.
2. Switch to the search tab and type in the member name for the search and click Search.
To change the double-click behavior, select Preferences under the Administration menu. In the
Quick View Double-Click Behavior section, select the drop-down list for Default Expansion For
Rows and select TreeDescendantsR.
In this example the result is the reverse direction of the originating account on expansion for rows
when using the double click.
Logon
This displays the current user and application. A user can logon to a different application by
clicking this icon.
Data
Refresh Workbook
This pulls down updated data from the server and refreshes the entire Excel workbook.
Refresh Worksheet
This pulls down updated data from the server and only refreshes the selected worksheet.
Submit Workbook
After editing data in Excel, click this icon to send it back to OneStream. This icon will send data
back for every tab in the Excel workbook.
Submit Sheet
After editing data in Excel, click this icon to send it back to OneStream for Cube Views, Quick
Views, XFSetCells, and Table Views on the active sheet.
Numeric Precision
Decimals are automatically truncated after the ninth character in a cell or a function.
Calculation
Consolidate/Translate/Calculate
If permission is granted, these calculations can be performed on the selected cell.
Analysis
Quick Views
Create a Quick View
This will create a new Quick View in the worksheet’s selected cell.
For more details on this feature, see Quick View in "Getting Started with the Excel Add-In " on
page 1059.
Cube Views
Add a Cube View to an Excel sheet.
2. From this window, the cube views added to an Excel workbook can be managed. You can
add, remove, edit, or go to styles. Click Add, to add a new Cube View.
4. Resize Initial Column Widths Using Cube View Settings is the default setting. If you
uncheck it, you can change the columns and save the Cube View. However, if you go back
into the same Cube View Connection, the check box will be enabled and you’ll need to
uncheck it again to keep your new cube view settings.
5. Select whether there needs to be inserted or deleted rows and/or columns when resizing.
This setting will move around other content in the sheets if the size of the Cube View
changed since the last refresh.
6. You can select Include Cube View Header to add header rows to the spreadsheet.
7. Retain Formulas in Cube View Content allows you to enter formulas in the Cube View
(Excel or Spreadsheet) and retain those formulas pre and post submission of the sheet or
workbook. When the sheet or workbook is refreshed, the formulas will remain. If the value
resulting from the formula is different than the value in the OneStream database, the cell will
initially become a dirty cell and the cell format will turn yellow.
NOTE: When using external Excel workbooks, or after any updates to referenced
sheets within the same workbook, you must Refresh Sheet to visualize the dirty
cells and then Submit Sheet, unless Dynamically Highlight Evaluated Cells is
turned on.
8. After the Cube View is added, it will appear on the sheet. If formatting was applied to the
Cube View (see Cube Views in Presentation), the formatting will come forward into the
Excel sheet. Otherwise, apply Excel Styles. These styles are stored in the Excel sheet and
can be copied from workbook to workbook. For more information on Excel Styles, see
"Styles" on page 1135.
NOTE: To copy Excel spreadsheet cells into a Data Explorer Grid on the web, hold
CTRL, select the desired cells, and then press CTRL-C. Navigate to the Data
Explorer Grid, select a cell, and press CTRL-V to paste the cells into the grid. This
can also be done from a Data Explorer Grid into an Excel spreadsheet.
The Retain Formulas in Cube View Content feature allows users to plan, budget, or forecast using
the familiar functionality of Excel while still submitting data back to the OneStream database.
Use the Retain Formulas in Cube View Content feature to enter formulas in the Cube View (Excel
or Spreadsheet) and retain those formulas before and after submission of the sheet or workbook.
When the sheet or workbook is refreshed, the formulas will remain. If the value resulting from the
formula differs from the existing value in the OneStream database, the cell will initially become a
dirty cell and its format will turn yellow.
NOTE: When using external Excel workbooks, or after any updates to referenced
sheets within the same workbook, you must Refresh Sheet to visualize the dirty cells
and then Submit Sheet, unless you’ve turned on Dynamically Highlight Evaluated Cells
in the cube view.
Retain Formulas in Cube View Content also supports links to other Excel worksheets or to
worksheets in other Excel workbooks. To enable the feature:
1. From the OneStream menu, select Cube Views > Cube View Connections.
2. Click Add in the Cube View Connection window or click Edit if you already have a cube
view.
3. Select the Retain Formulas in Cube View Content check box and click OK.
4. Add the Cube View, if one is not already selected, and click Close.
Dynamically Highlight Evaluated Cells saves you a step because the cell changes without
requiring a refresh. This feature evaluates all the cells in the spreadsheet and identifies the values
in the cube view that have changed relative to their original values in the database.
Excel users who want to continue working in Excel can log in through the OneStream menu,
update the cube view content, and submit it to the database without leaving Excel. You can also
perform these tasks in Spreadsheet within the application.
You can retain formulas in Cube View content that reference values within a function, within an
existing workbook, within a sheet, within other sheets, in external workbooks, and in external
renamed worksheets in Excel. Spreadsheet also offers this functionality, but it does not allow you
to point cell references to external workbooks.
Click Refresh Sheet to see all changes within the cube view content and then click Submit
Sheet, or activate Dynamically Highlight Evaluated Cells so the cell updates automatically.
When a formula value in the cube view changes because a referenced cell or a related function is
modified, and that value differs from what is in the database, a dirty cell is created. This means the
value of the cell no longer matches the value in the database, and the cell will change color.
The number of cells with formulas in the cube view determines the amount of time it takes to
update the cells. You can turn the feature on or off and only use Refresh Sheet to update the
values in the cells. Changes will show very quickly, no matter the size of the worksheet, when
using Spreadsheet.
3. Click Add.
7. Then click Dynamically Highlight Evaluated Cells so you can see the changes as they
are made.
8. Even if you don’t activate the dynamically highlight evaluated cells feature, you can click
Refresh Sheet after you make changes to see them.
9. If you’re prompted, click OK once you’ve selected the parameters for the cube view.
10. Once the cube view has been added, you can click Edit to review, if needed.
11. Make changes to the sheet and press <Enter> to see the updated cell, which will change
from white to yellow.
Use Cases
These use cases are for both Excel and Spreadsheet unless otherwise noted.
These use cases cover the placement of formulas and cell references. Retain Formulas can
reference the following types of formulas; in all instances, the formula will remain after refresh
and/or submission.
Cell references to cells on other sheets. These can also be factored by another value.
Referenced cell(s) in another saved workbook, which can also be factored by another value. (This
applies to Excel only.)
Best Practices
Well-Formed Grid
It is suggested to create a “Well-Formed Grid” (Root.List or Comma Separated List) in Cube
Views. When using a “Well-Formed Grid”, the Excel/Spreadsheet relative (=C2) and absolute
(=$C$2) formulas will be retained.
However, when using these relative and absolute references within an Excel/Spreadsheet formula,
users can use either the cell reference or text within the formula, depending upon how members
will be added or removed:
l Pivoting the existing Dimensions of the Cube View will break formulas.
l Changing the “structure” of the Cube View grid in the rows or columns will also break the
formulas. For example, if you have Account, Entity, UD3 as the dimensions used in the row
and switch it to UD3, Entity, Account, it will break the formulas.
l Users can change the POV to select a new dimension. This will change the Cube View
results but retain the existing formulas that were established. At this point, the user can
choose to use, modify, or delete the existing formulas. If the original formulas are modified
or deleted, the last action will be saved.
l Linking a white cell (writeable cell) to another cell in a different workbook will work ONLY
in Excel and NOT in Spreadsheet.
l Prior to establishing links to an external workbook, the user should save the external
workbook being referenced.
l When the user renames the referenced file or saves it under a new name, the user will need
to update the links to the newly created file. Update the links on the spreadsheet
BEFORE doing a refresh or submit.
l Formulas with cell references (VLOOKUP, INDEX(MATCH()), etc.) that return errors (#N/A,
#ERROR, etc.) or non-numeric data will not retain the formula and will return to the original
value from the Cube View; this error text cannot be converted into a number, so the formulas
will not be retained.
l If a Dimension Member Name is renamed (e.g., “52200 – Rent” becomes “52200 – Rent
Commercial”), the formula will break.
File Explorer
Use this option to upload and download files.
Create Folder
This creates a new folder under the selected folder on the left-hand side of the File Explorer pane.
Upload File
This uploads the selected file and allows the user to save.
Data Attachments
This pulls up the Data Attachments dialog to show existing comments or attachments on a
selected cell, or to allow data attachment edits.
Cell Detail
Enter Cell Detail for a Cube View or Quick View data cell. See Cell Detail in "Using OnePlace
Cube Views" on page 1020 for more details on this function.
Drill Down
Drill down on a specific cell in order to see more details or gather more information. See Drill
Down in "Using OnePlace Cube Views" on page 1020 for more details on this function.
General
Convert to XFGetCells
This will convert an existing Quick View into an XFGetCells. After clicking this option, OneStream
will prompt with the following: Are you sure you want to convert all of the data in Quick View ‘Name
of the Quick View’ to XFGetCells? By clicking OK, the Quick View definition will be deleted and
converted to XFGetCells.
Object Lookup
Use the Object Lookup to insert objects from OneStream into Excel such as Foreign Exchange
Rate Types when building formulas. If creating an Extensible Document in Excel, users can also
use the Object Lookup to insert Parameters, Substitution Variables, or Image Content. See
Object Lookup in "Presenting Data With Books, Cube Views and Other Items" on page 576 for
more details on this feature.
Select Member
Select a Dimension Type from the drop-down list in order to view the Members of that Dimension.
Select a Member of the hierarchy, and the Member name will display in the selected cell.
Spreading
Allows users to select the type of spreading used to spread data values over several columns
or rows without having to type each cell’s value.
Spreading Types
Spreading (Even)
This distributes the active cell amount evenly across all selected cells.
Spreading (445)
This distributes the active cell amount using a weighted 445 pattern across all selected cells.
Spreading (454)
This distributes the active cell amount using a weighted 454 pattern across all selected cells.
Spreading (544)
This distributes the active cell amount using a weighted 544 pattern across all selected cells.
Spreading (Factor)
This multiplies all cells in the data range by the specified rate.
Spreading (Fill)
This fills all cells in the data range with a specified value.
Spreading (Proportional)
This distributes a value across all cells in the data range in proportion to the existing values in the
cells.
Spreading (Accumulate)
This starts with the active cell amount and cumulatively multiplies it by the specified rate.
Spreading (Clear)
This clears all data that was previously entered in the data range.
Flag Selected Cells
Flags selected cells so the original amount in the cell is retained during the spreading process.
Clear Flags
Select this to clear any flagged cells.
Administration
Preferences
General
Enable Microsoft Sign In
Set this to True if Azure is used for authentication to sign in to the Excel Add-In. Setting this
property to True enables the Microsoft Sign In button on the login dialog, allowing users to enter
their Azure credentials. Set this to False to disable the Microsoft Sign In button; users will then be
prompted to enter their username and password.
If set to True, this enables Excel macros for OneStream API calls. The default is False.
If set to True, this will force a data refresh on the opened workbook. The default is False.
This is for the Excel Add-In only, not the Spreadsheet feature in the OneStream Windows App. The
default is True, which calculates only formulas and Excel functions in the active sheet. Set this to
False to revert to a full calculation of all workbooks and all sheets.
NOTE: Performance is best when Excel is set to use Manual Calculation Mode.
NOTE: Setting this to True may result in incompatibility issues with other Excel Add-ins.
For the following properties, see Quick View in "Getting Started with the Excel Add-In" on
page 1059:
Default Display Settings for New Quick Views
Default Suppression Settings for New Quick Views
Excel Calculation
The Excel Calculation icon has the option of Automatic, Automatic Except for Data Tables, and
Manual. It is recommended that the Calculation be set to Manual when using OneStream
spreadsheets because the Automatic setting results in an Excel re-calculation every time a
OneStream’s interactive workbook changes data (e.g., when navigating a Quick View). However,
this is not forced because a user might prefer Excel’s Automatic calculation, especially when there
is not a significant amount of OneStream data in the workbook.
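If you prefer to manage this setting with a macro, the following is a minimal sketch using standard
Excel VBA (this is an Excel setting, not a OneStream API call):
Sub SetManualCalculation()
    ' Switch Excel to Manual Calculation Mode so the workbook is not
    ' recalculated every time OneStream refreshes data
    Application.Calculation = xlCalculationManual
End Sub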
Right-Click Options
When working with a Cube View in Excel, the following right-click options are available:
Quick View
See Quick View in "Navigating the Excel Add-In" on page 1097.
Expand
Select the cell of a Member and choose how to view its data.
AllTops
This returns the Top of the given Dimension.
AllBase
This returns all Base Members of the given Dimension regardless of what Member is selected.
All
This returns all Members in a given Dimension.
NextLevel
This returns the next level of Members under the selected Member.
KeepOnly
This will only keep the selected Members.
Parents
This returns the direct Parents of the selected Member, regardless of how many hierarchies the
Member belongs to.
Ancestors
This returns all Members up the chain from the selected Member.
Children
This returns the first level of Children under the selected Member.
ChildrenInclusive
This returns the selected Member and its first level of Children.
Descendants
This returns every Member under the selected Member in a list, not a hierarchy.
DescendantsInclusive
This returns the selected Member and every Member under it in a list, not a hierarchy.
TreeDescendants
This returns every Member under the selected Member in a hierarchy.
TreeDescendantsInclusive
This returns the selected Member and every Member under it in a hierarchy.
Base
This returns the Base level for the selected Member.
Paste POV
This allows a user to Paste a POV into a selected cell in order to change the data within that Quick
View.
Clear POV
This will clear the POV for the selected Quick View.
For the following properties, refer to Quick View in "Navigating the Excel Add-In" on page 1097:
Undo
Redo
Options
Refresh
Select Member
Select a Dimension Type from the drop-down list in order to view the Members of that Dimension.
Select a Member of the hierarchy, and the Member name will display in the selected cell.
Convert to XFGetCells
See General
Cell Detail
See Analysis
Data Attachments
See Analysis
Cell Status
This returns a long list of properties about a given cell.
Drill Down
See Analysis
Spreading
This allows users to enter data into an aggregate Member, like an annual time period, and spread
values over several columns or rows without having to type in each cell’s values.
Spreading Type
Fill
This fills each selected data cell with the value in the Amount to Spread property.
Clear Data
This clears all data within the selected cells.
Even Distribution
This takes the Amount to Spread and distributes it evenly across the selected cells.
445 Distribution
This takes the Amount to Spread and distributes it with a weight of 4 to the first two selected cells
and a weight of 5 to the third cell.
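For example, with hypothetical values, an Amount to Spread of 1,300 distributed using the 445
pattern across three cells results in 400, 400, and 500.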
454 Distribution
This takes the Amount to Spread and distributes it with a weight of 4 to the first selected cell, a
weight of 5 to the second cell and a weight of 4 to the third.
544 Distribution
This takes the Amount to Spread and distributes it with a weight of 5 to the first selected cell and a
weight of 4 to the second and third cells.
Factor
Multiply all cells by the specified rate.
Accumulate
This takes the first selected cell’s value and multiplies it by the specified rate. It then takes that
value, multiplies it by the rate, places the result in the second selected cell, and continues in this
way for all selected cells. For example, if four cells are selected, the first cell has a value of 900,
and the rate is 2, each successive cell holds twice the value of the one before it.
Proportional Distribution
This takes each selected cell’s value, multiplies it by the specified Amount to Spread, and then
divides it by the total sum of all selected cells. If all the cells have a zero value, the Amount to
Spread behaves like an Even Distribution.
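For example, with hypothetical values, if three selected cells hold 200, 300, and 500 (a total of
1,000) and the Amount to Spread is 2,000, the cells become 400, 600, and 1,000.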
Spreading Properties
Amount to Spread
Specify the value to spread over the selected cells. The value defaults to the last cell selected.
The way the amount in this field spreads varies by Spreading Type.
Rate (Factor and Accumulate Spreading Types Only)
Enter a rate to multiply by a cell value.
3. Click Add.
6. Either right-click to open Format Cells or use the Home menu to choose formatting.
7. After formatting, click OneStream > Cube Views > Selection Styles.
8. Enter a Name and Range to apply to the current Selection, Entire Row, or Entire Column.
Then click OK.
10. Save the file to save the formatting to the cube view.
2. Click Excel Styles to select a style. If a style is not selected the Default Cube View Format is
used.
6. In Selection Styles, you will see styles that are no longer Active. Click Activate to enable
them.
7. You can delete a style from the cube view, but it will still be available in the current workbook.
Merging Styles
You can use styles created in other workbooks in Excel only.
Conditional Formatting
Use Conditional Formatting in Cube Views to visually explore and analyze data. You can highlight
cells or ranges of cells, identify key values, and represent data using data bars, color scales, and
icon sets that correspond to specific variations in the data. If any formats exist prior to applying
Conditional Formatting, they will be retained if the range of cells containing the conditional
formats does not meet the conditions of the rule. All styles from the cube view and the selection
styles previously applied to that range are overridden by conditional formatting.
3. Go to OneStream and click Refresh Sheet to see that your changes have been applied.
4. If you make a change that is different than the value in the database, the cell will change to
pale yellow, until you refresh or submit.
5. If you submit, it will revert to the formatting that was in the cube view since it is no longer
greater than 6000.
6. If you make a change to a cell that has conditional formatting and a selection style,
when you submit, it will convert back to the selection style since it is no longer greater than 6000.
7. To add icons, go to Home > Conditional Formatting > Icon Sets and select the icons to use; in
this example, select the arrows.
9. You can also create, edit, delete, and view all conditional formatting rules.
Styles
The same standard Styles are used in Excel; however, if you want to create a new style in order to
change the format of how the numbers are displayed, see the example that follows.
Finally, under Type, enter the custom formatting. This example formats values in Millions: (#,,).
Click OK.
Now that a new style has been created in Excel, it can be assigned to a Quick View.
To add the formatting, click on the Edit Quick View Options in the Quick View Tab on the right
side of the screen.
Named Regions
Bringing a Cube View into Excel creates several Named Regions that you can select, refer to, and
use with Styles. Named Regions are created for the Cube View, column headers, row headers,
and data sections.
If there are multiple named columns or rows in the Cube View, go to the intersection-based
Named Regions and use different formatting to differentiate sections. For example, a Total row is
separated from detailed data. This combination of Named Regions and Styles generates a nicely
formatted report:
Retrieve Functions
Retrieving and changing data can be done by using functions. To see the functions and their
Parameters, open Excel and select the Formulas tab. Select Insert Function and select
OneStreamExcelAddIn.XFFunctions where it says to Select a category.
The output of the function will look something like this: =XFGetCell(A1,A2,A3,A4,A5)
Functions such as XFGetCell provide a separate Parameter to specify each Dimension Member
without using the Member Script syntax (e.g., E#CT:A#Sales would not be used; CT and Sales
would be entered in the correct Parameters for those Dimensions).
NOTE: If a field within the function is unneeded, enter a double quote to ignore it.
XFGetCell
This function retrieves data based on the Parameters supplied. Each Parameter needs to be
defined.
XFGetCell(NoDataAsZero, Cube, Entity, Parent, Cons, Scenario, Time, View, Account, Flow,
Origin, IC, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8)
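Example (the cube, member, and time names shown are placeholders; replace them with members
from your application):
=XFGetCell(True, "GolfStream", "Houston", "", "Local", "Actual", "2022M3", "YTD", "Sales",
"None", "Top", "None", "None", "None", "None", "None", "None", "None", "None", "None")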
XFGetCell5
This has the same functionality as XFGetCell except it limits the User Defined Dimensions to five
instead of eight.
XFGetFXRate
This function retrieves rates from the system. Each Parameter needs to be defined.
XFGetFXRate(DisplayNoDataAsZero, FXRateType, Time, SourceCurrency, DestCurrency)
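Example (the rate type, time, and currency codes shown are placeholders):
=XFGetFXRate(True, "AverageRate", "2022M3", "EUR", "USD")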
XFGetCalculatedFxRate
This function directly retrieves an exchange rate even if only the inverse rate exists in the system.
XFGetMemberProperty
This function retrieves any Dimension Member property from the Member Properties tab in the
Dimension Library. Note there are no spaces used when defining property name.
XFGetMemberProperty(“DimTypeName”,“MemberName or Script”,“PropertyName”,
“VaryByCubeTypeName”,“VaryByScenarioTypeName”,“VaryByTimeName”)
NOTE: If the function does not need to vary by Cube Type, Scenario, or Time, enter a
double quote in order to ignore it.
Example: Retrieving an Account Formula that only occurs in the Budget Scenario
XFGetMemberProperty(“Account”,”51000”,”Formula”,””,”Budget”,””)
XFGetRelationshipProperty
This function retrieves any Dimension relationship property from the Relationship Properties tab
in the Dimension Library.
XFGetRelationshipProperty(“DimTypeName”,“ParentMemberName or
Script”,“ChildMemberName or
Script”,“PropertyName”,“VaryByScenarioTypeName”,“VaryByTimeName”)
NOTE: If the function does not need to vary by Cube, Scenario, or Time, enter a double
quote in order to ignore it.
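Example (the dimension, member, and property names shown are placeholders): retrieving a
relationship property between a parent Entity and a child Entity that does not vary by Scenario or
Time
XFGetRelationshipProperty("Entity","TotalGolfStream","Houston","PercentConsolidation","","")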
XFGetHierarchyProperty
This function determines whether or not a Dimension Member has children and returns True or False.
XFGetHierarchyProperty(“DimTypeName”,”DimName”,“MemberName or
Script”,“PropertyName”,”PrimaryCubeName”,”ScenarioTypeNameForMembers”,
”MergeMembersfromReferencedCubes”)
XFGetDashboardParameterValue
This function is available in the Excel Add-In and Spreadsheet. Use it when an XLSX file contains
a function such as XFGetCell or XFSetCell (or similar) that references a custom parameter value
(e.g., ParamEntity) defined on the Dashboard that embeds this Spreadsheet as a Component. The
practice is to use XFGetDashboardParameterValue to fetch the text of that Parameter, or its
default value, and place it in a cell on the Spreadsheet (e.g., B1). The cell that uses a retrieve
function such as XFGetCell then references that cell (i.e., B1).
XFGetMemberInfo
This function retrieves the description in the system. Each Parameter needs to be defined.
XFGetMemberInfo(MemberInfoType, DimTypeName, MemberName, NameorDesc,
NameandDesc)
XFInternalGetDataFromServer
This function returns True or False. It does not take any arguments.
XFSetCell
This function saves data to the amount field based on the Parameters supplied. Each Parameter
needs to be defined.
XFSetCell(CellValue, StoreZeroAsNoData, Cube, Entity, Parent, Cons, Scenario, Time, View,
Account, Flow, Origin, IC, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8)
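Example (the value and member names shown are placeholders; replace them with members from
your application):
=XFSetCell(1000, True, "GolfStream", "Houston", "", "Local", "Budget", "2022M3", "Periodic",
"Sales", "None", "Forms", "None", "None", "None", "None", "None", "None", "None", "None",
"None")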
XFSetFXRate
This function saves rates to the system. Each Parameter needs to be defined.
XFSetFXRate(Value, StoreZeroAsNoData, FXRateType, Time, SourceCurrency,
DestinationCurrency)
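Example (the rate value, rate type, time, and currency codes shown are placeholders):
=XFSetFXRate(1.0845, True, "AverageRate", "2022M3", "EUR", "USD")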
XFGetCellUsingScript
XFGetMemberInfoUsingScript
XFSetCellUsingScript
All of the functions that have …UsingScript are based on a Member Script (e.g.,
A#Sales:E#Texas). The multiple Parameters provide the ability to specify multiple portions of the
full Member Script using different Excel cells. All of the Member Scripts in the function Parameters
combine to create one Member Script. It will then use the combined Member Script to retrieve the
data cell.
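For example (illustrative member scripts only; see the function’s Parameter list in Excel’s Insert
Function dialog for the exact arguments), if one Parameter references cell A1 containing
E#Houston:A#Sales and another references cell A2 containing S#Actual:T#2022M3, the
Parameters combine into the single Member Script E#Houston:A#Sales:S#Actual:T#2022M3,
which is then used to retrieve the data cell.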
XFGetCellUsingScriptEx
XFGetMemberInfoUsingScriptEx
XFSetCellUsingScriptEx
All of the functions that have …Ex have many more Parameters to use for combining Member
Scripts (e.g., commonly used when creating another version of a function that has extra
Parameters). Ex would also be used to combine many Member Script Parameters.
XFSetCellLocalForms
XFGetCellLocalAdjInput5
XFGetCellLocalForms5
XFGetCellLocalImport5
XFGetCellLocalOTop5
XFGetCellTransAdjInput5
XFGetCellTranForms5
XFGetCellTransImport5
XFGetCellTransOTop5
XFSetCellLocalForms5
These functions use the Consolidation and Origin Dimensions. For example,
XFSetCellLocalForms uses the Local Consolidation Member and the Forms Origin Member. The
number five at the end of a function limits the User Defined Dimensions to five instead of eight.
XFGetCellUsingScriptVolatile
XFGetCellVolatile
XFGetFXRateVolatile
XFGetMemberInfoUsingScriptExVolatile
XFGetMemberInfoUsingScriptVolatile
XFGetMemberInfoVolatile
XFSetCellUsingScriptExVolatile
XFSetCellUsingScriptVolatile
XFSetCellVolatile
XFSetFXRateVolatile
In some cases, Excel requires a volatile function for proper refreshing, for example, some Excel
Charts that reference calculated cells.
XFInternalPrepareCalculationStep
XFInternalSendDatatoServer
XFInternalSetConnectionInfo
All of the functions that begin with XFInternal only work for internal processes.
A Sub procedure begins with a Sub statement, followed by the tasks to be performed, and ends
with an End Sub statement. The following snippet of VBA code represents the structure of a Sub
procedure:
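Sub ProcedureName()
    ' ProcedureName is a placeholder; the statements that perform the tasks go here
End Sub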
OneStream Functions are used with Sub procedures to automate data submission from
XFSetCells formulas.
l Logoff()
l RefreshXFFunctions() -- Refer to the following example:
Sub RefreshXFFunctions()
Set xfAddIn = Application.COMAddIns("OneStreamExcelAddIn")
If Not xfAddIn Is Nothing Then
If Not xfAddIn.Object Is Nothing Then
Call xfAddIn.Object.RefreshXFFunctions
End If
End If
End Sub
l RefreshXFFunctionsForActiveWorksheet()
l RefreshQuickViews()
l RefreshQuickViewsForActiveWorksheet()
l RefreshCubeViews()
l RefreshCubeViewsForActiveWorkSheet()
l ShowParametersDlg()
l ShowParametersDlgForActiveWorksheet()
l SubmitXFFunctions () -- Automates the data loading process and eliminates the need to
open the Excel files individually and submit data manually. Using a VBA routine, files with
XFSET functions that are linked to other cells, sheets, and files can be programmatically
submitted to OneStream. This procedure calls only XFSetCells. Refer to the following
example:
Sub SubmitXFFunctionsTest()
Set xfAddin = Application.COMAddIns("OneStreamExcelAddIn")
If Not xfAddin Is Nothing Then
If Not xfAddin.Object Is Nothing Then
Call xfAddin.Object.SubmitXFFunctions
Call xfAddin.Object.RefreshXFFunctions
End If
End If
End Sub