Dynamic Creation and/or Content Modification of Spotfire Output (DXP Files): Proposed Integration Solution with Pipeline Pilot
What are Spotfire Output (or DXP) files? Software and Use: TIBCO Spotfire is a comprehensive software platform that allows customers to access, analyze and create dynamic reports on their data. A report that is created or opened using Spotfire is referred to as an Analysis Document and, when saved, is given the extension “.DXP”. This DXP file not only contains metadata, but also references to the data itself and to the various other components that are part of the document, such as pages, filtering, bookmarks, etc.
TIBCO Spotfire Architecture – TIBCO Spotfire Automation Services: Spotfire Automation Services provides a platform and tools for automating TIBCO Spotfire without user interaction and without a visible user interface.
TIBCO Spotfire Developer (SDK): TIBCO Spotfire Developer has a complete set of application programming interfaces (APIs) for not only configuring and automating the platform, but also extending it with entirely new capabilities. The SDK provides the required Spotfire developer resources: the Spotfire Extension Project Template, development assemblies, example projects, and the Package Builder application that wraps the extensions for deployment. A Spotfire extension is the smallest functional unit added to the platform. It is developed in Visual Studio® and is included in a Spotfire add-in, enabling versioning, licensing, deployment and loading.
TIBCO Spotfire APIs & Integration with Pipeline Pilot (PP Custom Extension)
Automation Services - Terminology
Task: An Automation Services Task is the building block for creating a complex Job. Automation Services comes with a number of predefined tasks such as Apply Bookmark or Send Email, but since the platform is extensible, additional tasks may be supplied by other vendors or by the end user’s IT organization.
Job: An Automation Services Job is a collection of Tasks to be executed in a sequential manner. A Job is stored in a Job File.
Job File: A Job File is a human-readable XML file defining a Job.
Job Builder: The Automation Services Job Builder is an application that runs within Spotfire Professional and provides an easy-to-use interface for creating and editing Jobs.
Spotfire Automation Services - Architecture: The Job is normally created in the Job Builder and saved as an XML Job File. This file is then sent to the Automation Services Web Service for execution – from the Job Builder window (for testing), using the provided ClientJobSender tool, or by your own custom tool. For each Job in XML format sent to the Automation Services Web Service for execution, a Job Executor process is started. This internal Job Executor logs on to a Spotfire Analytics Server using credentials stored in the Spotfire.Dxp.Automation.Launcher.exe.config file, executes the tasks one by one, and exits when a task fails or when all tasks have completed successfully. The PP Custom Task will be used when creating the Job and saving the Job File. The ClientJobSender tool will be used by Pipeline Pilot to send the Job File to the Automation Services Web Service.
Automation Services – Scheduling a Job: To programmatically activate a Job File, Spotfire provides an application (Spotfire.Dxp.Automation.ClientJobSender.exe) that accepts two arguments. The first argument is the URL of the web server where Automation Services is installed, and the second argument is the path to the Job File containing the custom task(s). For example:
First Argument: http://localhost/SpotfireAutomation/JobExecutor.asmx
Second Argument: pathToJobFile/jobfilename.xml
"c:\temp\dxpdata\Spotfire.Dxp.Automation.ClientJobSender.exe" "http://laptopdayton/SpotfireAutomation/JobExecutor.asmx" "c:\temp\dxpdata\DXPRequestJob.xml"
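As a minimal sketch, a caller (for example a Pipeline Pilot component shelling out to the operating system) could launch ClientJobSender programmatically with these two arguments. The executable path, server name and Job File path below are the demo values from this document and are illustrative only.

```csharp
using System;
using System.Diagnostics;

// Minimal sketch: launch ClientJobSender with the Automation Services URL and
// the path to a saved Job File, then wait for it to finish. The paths and the
// server name are the demo values from this document and are illustrative only.
class JobLauncher
{
    static int Main()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"c:\temp\dxpdata\Spotfire.Dxp.Automation.ClientJobSender.exe",
            Arguments = "\"http://laptopdayton/SpotfireAutomation/JobExecutor.asmx\" " +
                        "\"c:\\temp\\dxpdata\\DXPRequestJob.xml\"",
            UseShellExecute = false
        };

        using (var process = Process.Start(startInfo))
        {
            process.WaitForExit();   // block until the Job has been submitted and executed
            return process.ExitCode; // a non-zero exit code is assumed to indicate failure
        }
    }
}
```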
Solution – Develop Communication Message: An XML-formatted message will be defined to enable Pipeline Pilot to specify details for the type of DXP file to generate, or the updates to make to an existing DXP file. The PP Custom Task will also be aware of this custom-developed XML message format and will translate it into API calls to create the DXP file dynamically. Spotfire Automation Services / PP Custom Task: capable of creating a new DXP file or modifying an existing DXP file by using the Spotfire APIs. Pipeline Pilot Protocol: capable of activating the PP Custom Task that is part of Spotfire Automation Services, using the Spotfire-provided ClientJobSender tool.
Solution – Transmitting / Receiving Messages: The exchange involves the Pipeline Pilot Protocol, the PP Custom Task in Spotfire Automation Services, and two shared folders. (1) Pipeline Pilot formats an XML message requesting a DXP file and places it in the request folder (DXPRequest.xml); the details for the type of DXP file to create, the data file to use and other required information are stored in this file. (2) Pipeline Pilot activates the PP Custom Task job, using the Spotfire-provided ClientJobSender tool, to process its request for a DXP file. (3) The PP Custom Task retrieves the DXPRequest.xml file, interprets the instructions from Pipeline Pilot and uses the APIs to create/modify a DXP file. (4) The resulting DXP (filename.dxp) is stored in the results folder. (5) Pipeline Pilot retrieves the resulting DXP file. Both the REQUEST FOLDER and the RESULTS FOLDER are accessible to both Spotfire and Pipeline Pilot.
Implementing the Solution – DXP Request Message Format. DXP Request Message: An XML-formatted message structure will be defined for Pipeline Pilot to request customized DXP files from the custom Spotfire task. Both Pipeline Pilot and the custom Spotfire task will need to be aware of the message structure. Communication Medium: Some form of communication mechanism is required between Pipeline Pilot and the Spotfire custom task. One option is to use a common folder or folders to store the requests and the results; Pipeline Pilot and the Spotfire application would each monitor these folder(s) periodically or when activated.
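The final message format is still to be defined, but as a minimal sketch, assuming element names like those in the Proof-of-Concept request file shown later, the message could be modelled by a hypothetical PPDxpRequest class and written into the shared request folder as follows (class and property names are illustrative, not part of any defined format):

```csharp
using System.IO;
using System.Xml.Serialization;

// Hypothetical model of the DXP request message; the element names mirror the
// demo request file (dxpRequest, xDataColumnName, yDataColumnName, plotDataFile,
// dxpResultFile), while the class and property names are illustrative only.
[XmlRoot("PPDXPRequest")]
public class PPDxpRequest
{
    [XmlElement("dxpRequest")]      public int RequestType { get; set; }   // e.g. 2 = scatter plot in the demo
    [XmlElement("xDataColumnName")] public string XColumn { get; set; }
    [XmlElement("yDataColumnName")] public string YColumn { get; set; }
    [XmlElement("plotDataFile")]    public string PlotDataFile { get; set; }
    [XmlElement("dxpResultFile")]   public string ResultFile { get; set; }
}

public static class RequestWriter
{
    // Serialize a request and drop it into the shared request folder under the
    // file name the custom task is expected to look for.
    public static void Write(PPDxpRequest request, string requestFolder)
    {
        var serializer = new XmlSerializer(typeof(PPDxpRequest));
        using (var stream = File.Create(Path.Combine(requestFolder, "DXPRequest.xml")))
        {
            serializer.Serialize(stream, request);
        }
    }
}
```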
Implementing the Solution – Spotfire Related. Extension: A custom extension will be developed as a Spotfire add-in using the Spotfire SDK. Task: The custom Spotfire add-in is then registered as an Automation Services Task. Job Builder: The Automation Services Job Builder will then be used to define a Job consisting of this custom task, and a Job File will be saved.
Implementing the Solution – Pipeline Pilot Related. Pipeline Pilot Components: Components will be developed to support this functionality. At a minimum: a component that can create a Spotfire analysis request file using the predefined message structure; another component to move the analysis request file to the predefined Spotfire folder and activate the Automation Services Job; and a component that will retrieve the resulting analysis document (.DXP file) that the custom task has created. A rough sketch of this file exchange appears below.
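The sketch below, under the shared-folder assumption above, shows how the request-staging and result-retrieval components might behave. The folder paths, polling interval and time-out are assumptions; activating the Job via ClientJobSender is shown in the earlier sketch.

```csharp
using System;
using System.IO;
using System.Threading;

// Illustrative sketch of two of the Pipeline Pilot components: one stages a
// prepared request file in the shared request folder, the other polls the
// shared results folder until the analysis document written by the custom
// task appears. All paths and timings are assumptions for illustration.
public static class DxpFolderExchange
{
    public static void StageRequest(string preparedRequestFile, string requestFolder)
    {
        // Copy the prepared request into the shared request folder under the
        // name the custom task looks for.
        File.Copy(preparedRequestFile, Path.Combine(requestFolder, "DXPRequest.xml"), overwrite: true);
    }

    public static string WaitForResult(string expectedResultFile, TimeSpan timeout)
    {
        var deadline = DateTime.UtcNow + timeout;
        while (DateTime.UtcNow < deadline)
        {
            if (File.Exists(expectedResultFile))
                return expectedResultFile;                 // Spotfire has produced the DXP file
            Thread.Sleep(TimeSpan.FromSeconds(5));         // poll the results folder periodically
        }
        throw new TimeoutException("No DXP file appeared in the results folder in time.");
    }
}
```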
Proof-Of-Concept Demo – Communication Between Spotfire and Pipeline Pilot: Three folders were created that are accessible to both Pipeline Pilot and Spotfire. Folder named DXPRequests: the folder in which the XML-formatted analysis requests are stored by Pipeline Pilot and retrieved by the custom task in Spotfire. Folder named DXPResults: the folder in which the Spotfire-generated analysis document (.DXP file) is stored and retrieved by Pipeline Pilot. Folder named DXPData: a folder used for storing sample data files and the preconfigured XML-formatted requests used in the demo.
Proof-Of-Concept Demo – Spotfire Setup: The custom task was registered as a Spotfire Automation Services task. Using the Job Builder, a Job File was created that consisted of our custom Automation Services task. The resulting XML-formatted Job File was saved in the DXPData folder (DXPRequestJob.xml). A copy of Spotfire.Dxp.Automation.ClientJobSender.exe was also saved in the DXPData folder.
Proof-Of-Concept Demo – Pipeline Pilot Setup: Two sample analysis request files were created and stored in the DXPData folder. The contents of one of the files is shown below. (This request file is for the demo only; the format of the final message is still to be developed.)
<?xml version="1.0" encoding="utf-8"?>
<PPDXPRequest xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <dxpRequest>2</dxpRequest>
  <xDataColumnName>Cost</xDataColumnName>
  <yDataColumnName>Sales</yDataColumnName>
  <plotDataFile>c:\temp\dxpdata\SalesData.txt</plotDataFile>
  <dxpResultFile>c:\temp\dxpresults\SalesDataScatterPlot.dxp</dxpResultFile>
</PPDXPRequest>
The custom task is hard-coded to recognize that a dxpRequest code of 2 is for a scatter plot, that the data source is the file specified in the plotDataFile node, and that the resulting DXP file is to be stored as specified in the dxpResultFile node.
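On the Spotfire side, the custom task has to read this file back and branch on the hard-coded request code. A minimal sketch of that interpretation step, reusing the hypothetical PPDxpRequest class from the earlier sketch, might look like this (only the demo code 2 is handled):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Sketch of the custom task's interpretation step: read DXPRequest.xml from the
// shared request folder and branch on the hard-coded request code. PPDxpRequest
// is the hypothetical model class sketched earlier in this document.
public static class RequestInterpreter
{
    public static PPDxpRequest Read(string requestFolder)
    {
        var serializer = new XmlSerializer(typeof(PPDxpRequest));
        using (var stream = File.OpenRead(Path.Combine(requestFolder, "DXPRequest.xml")))
        {
            return (PPDxpRequest)serializer.Deserialize(stream);
        }
    }

    public static void Dispatch(PPDxpRequest request)
    {
        switch (request.RequestType)
        {
            case 2:
                // Scatter plot: use request.PlotDataFile as the data source,
                // request.XColumn / request.YColumn for the axes, and save the
                // document to request.ResultFile (see the later slides).
                break;
            default:
                throw new NotSupportedException(
                    "Unsupported dxpRequest code: " + request.RequestType);
        }
    }
}
```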
Proof-Of-Concept Demo (Cont.) – Pipeline Pilot Setup: Three components were created: one to execute a command line that copies a predefined request file from the DXPData folder to the DXPRequests folder; one to execute a command line that activates the Automation Services Job containing our custom task; and one to open the resulting analysis (.DXP) file stored in the DXPResults folder by Spotfire.
Proof-Of-Concept Demo – Pipeline Pilot Setup: Using these simple components, the following protocol was created.
Proof-Of-Concept Demo Execution Sequence: 1) Create and Submit Analysis Request (file). On activating the protocol, the user is asked to select one of the two predefined analysis request files. The selected request file is copied to the DXPRequests folder and renamed to dxprequest.xml – the file name that the custom task on Spotfire will look for.
Proof-Of-Concept Demo Execution Sequence: 2) Activate Automation Services Job. Once the request file is copied to the DXPRequests folder, another component activates the Automation Services Job on Spotfire to process this request. (There are other options for activating the Automation Services Job besides the method used in this demo.) The command line executed by this component is:
"c:\temp\dxpdata\Spotfire.Dxp.Automation.ClientJobSender.exe" "http://laptopdayton/SpotfireAutomation/JobExecutor.asmx" "c:\temp\dxpdata\DXPRequestJob.xml"
Proof-Of-Concept Demo Execution Sequence: 3) Execute Analysis Instructions (Spotfire). The XML-formatted instructions from the file and the Spotfire APIs are used for modifying an existing analysis document or for creating a new one. In our example the contents of the XML request file are as follows:
<?xml version="1.0" encoding="utf-8"?>
<PPDXPRequest xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <dxpRequest>2</dxpRequest>  <!-- a request type of 2 is taken to mean a scatter plot -->
  <xDataColumnName>Cost</xDataColumnName>  <!-- column name for the X data values -->
  <yDataColumnName>Sales</yDataColumnName>  <!-- column name for the Y data values -->
  <plotDataFile>c:\temp\dxpdata\SalesData.txt</plotDataFile>  <!-- location of the source data -->
  <dxpResultFile>c:\temp\dxpresults\SalesDataScatterPlot.dxp</dxpResultFile>  <!-- location for the resulting analysis document -->
</PPDXPRequest>
Proof-Of-Concept Demo (Cont.): When the Automation Services Job is activated, it executes the code of the custom task and also provides access to the running TIBCO Spotfire instance. A document opened in a running instance of TIBCO Spotfire is referred to as an Analysis Document. The document not only contains metadata, but also references to the data itself and to the various other components that are part of the document, such as pages, filtering, bookmarks, etc. Since a data file was specified, the custom task uses the Open method of the Analysis Application with the specified data file. We now have a default analysis document that can be customized with the requested visualization. In this request file the requested visualization is a scatter plot, so the available method(s) are used to add a scatter plot visualization to the analysis document, using the data file and the X and Y data column names. The analysis document is then saved using the name/location specified in the request file.
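As a very rough sketch of this visualization step: the Spotfire.Dxp class names, members and the save call below are assumptions based on the public Spotfire API and must be verified against the SDK version actually deployed; PPDxpRequest is the hypothetical request class from the earlier sketches.

```csharp
using Spotfire.Dxp.Application;
using Spotfire.Dxp.Application.Visuals;
using Spotfire.Dxp.Data;
using Spotfire.Dxp.Data.Import;

// Rough sketch only: exact Spotfire.Dxp types, members and the save call are
// assumptions and should be checked against the SDK documentation in use.
public static class ScatterPlotBuilder
{
    public static void Build(AnalysisApplication application, PPDxpRequest request)
    {
        Document document = application.Document;

        // Load the requested data file as a new data table (assumed data source type).
        DataTable table = document.Data.Tables.Add(
            "RequestData", new TextFileDataSource(request.PlotDataFile));

        // Add a page and a scatter plot bound to the requested X and Y columns.
        Page page = document.Pages.AddNew("Requested Scatter Plot");
        ScatterPlot scatter = page.Visuals.AddNew<ScatterPlot>();
        scatter.Data.DataTableReference = table;
        scatter.XAxis.Expression = "[" + request.XColumn + "]";
        scatter.YAxis.Expression = "[" + request.YColumn + "]";

        // Save the analysis document to the location named in the request
        // (the exact save overload is an assumption).
        application.SaveAs(request.ResultFile, new DocumentSaveSettings());
    }
}
```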
Proof-Of-Concept Demo Execution Sequence: 4) Receive Requested Analysis (.DXP File). Once control is returned, the resulting analysis file will have been stored in the DXPResults folder by the custom task executed by the Spotfire Automation Services Job.
Proof-Of-Concept Demo Execution Sequence: 5) View Analysis Document (.DXP file). Finally, the analysis file is opened by a component executing the following command line: c:\temp\dxpresults\salesdatascatterplot.dxp. Note: the plot was created using the default template provided by Spotfire’s API. The filters, the page and the Details-on-Demand panel were preconfigured but could be customized. In this case the data file and the names of the X and Y columns were provided.
Phase-II: Gather a list of all possible DXP file requests that are required from Pipeline Pilot. Using this data, determine an XML-formatted message structure that can support all of the requests. Determine whether these requests can be implemented using the Spotfire APIs. Develop a library of Pipeline Pilot components that an end user could use with or without modifications. Develop the custom task to interpret these DXP requests.
Spotfire Integration & Dynamic Output creation

  • 1. Dynamic Creation and/or Content Modification of Spotfire output (DXP files) Proposed Integration Solution with Pipeline Pilot
  • 2. What are Spotfire Output (or DXP) files? Software and Use TIBCO Spotfire is a comprehensive software platform that allows customers to access, analyze and create dynamic report(s) on their data. A report that is created/opened using Spotfire is referred to as an Analysis Document and when saved is given an extension of “ .DXP ”. This DXP file not only contains a series of metadata information, but it also contains references to the data itself and to various other components being part of the document, such as pages, filtering, bookmarks etc.
  • 3. TIBCO Spotfire Architecture TIBCO Spotfire Automation Services Spotfire Automation Services provides a platform and tools for automating TIBCO Spotfire without user interaction and visible user interface .
  • 4. TIBCO Spotfire Developer (SDK) TIBCO Spotfire Developer has a complete set of application programming interfaces (APIs) for not only configuring and automating the platform, but extending it with entirely new capabilities as well . The SDK provides required Spotfire developer resources: the Spotfire Extension Project Template, development assemblies, example projects, and the Package Builder application wrapping the extensions for deployment A Spotfire extension is the smallest functional unit added to the platform. It is developed in Visual Studio® and is included in a Spotfire add-in, enabling versioning, licensing, deployment and loading.
  • 5. TIBCO Spotfire APIs & Integration with PP (PP Custom Extension) Pipeline Pilot
  • 6. Automation Services - Terminology Task An Automation Services Task is the building block for creating a complex Job. Automation Services comes with a number of predefined tasks such as Apply Bookmark or Send Email , but since the platform is extendible, additional tasks may be supplied by other vendors or by end user’s IT organization. Job An Automation Services Job is a collection of Tasks to be executed in a sequential manner. A Job is stored in a Job File. Job File A Job File is a human-readable XML defining a Job. Job Builder The Automation Services Job Builder is an application that runs within Spotfire Professional and provides an easy-to-use interface for creating and editing Jobs
  • 7. Spotfire Automation Services - Architecture The Job is normally created in the Job Builder and saved as an XML Job File . This file is then sent to the Automation Services Web Service for execution - from the Job Builder window (for testing), - or using the provided ClientJobSender tool, - or by your own custom tool. For each Job in XML format sent to the Automation Services Web Service for execution, a Job Executor process is started. This internal Job Executor logs on to a Spotfire Analytics Server using credentials stored in the Spotfire.Dxp.Automation.Launcher.exe.config file, executes the tasks one by one and then exits when a task fails or all tasks have completed successfully The PP Custom Task will be used when creating a Job and saving the Job file . The ClientJobSender tool will be used by Pipeline Pilot to send the Job file to the Automation Services Web Service . (Pipeline Pilot) ( PP Custom Ext. )
  • 8. Automation Services – Scheduling a Job To programmatically activate a Job file, Spotfire has provided an application that excepts two arguments. The first parameter is the URL to the web server where the Automation Services is installed and the second argument is the path to the Job file, consisting of the custom task(s). (Spotfire.Dxp.Automation.ClientJobSender.exe) For example: First Argument: http://localhost/SpotfireAutomation/JobExecutor.asmx Second Argument: pathToJobFile/jobfilename.xml &quot;c:\temp\dxpdata\Spotfire.Dxp.Automation.ClientJobSender.exe&quot; &quot;http://laptopdayton/SpotfireAutomation/JobExecutor.asmx&quot; &quot;c:\temp\dxpdata\DXPRequestJob.xml&quot;
  • 9. Solution – Develop Communication Message Using Spotfire provided ClientJobSender tool. An XML formatted message will be defined to enable Pipeline Pilot to specify details for the type of DXP file to generate or the updates to make to an existing DXP file. The PP Custom Task will also be aware of this custom developed XML formatted message and will be translate them into APIs calls to create the DXP file dynamically. Spotfire Automation Services PP Custom Task Capable of creating a new DXP file or modifying an exiting DXP by using the Spotfire APIs Pipeline Pilot Protocol Capable of activating the PP Custom Task that is part of the Spotfire Automation Services.
  • 10. Solution – Transmitting / Receiving Messages Spotfire Automation Services PP Custom Task Pipeline Pilot Protocol Activate the PP Custom Task job using Spotfire provided ClientJobSender tool. Pipeline Pilot formats and places a XML formatted message requesting a DXP file. The details for the type of DXP file to create, the data file to use and other required information is stored in this file. Pipeline Pilot activates the PP Custom Task to process it’s request for a DXP file. The PP Custom Task retrieves the DXPRequest.xml file, interprets the instructions from Pile Pilot and using the APIs create/modify a DXP file. The resultant DXP is stored in the Results folder. Pipeline Pilot retrieves the resulting DXP file. REQUEST FOLDER Accessible to both Spotfire and Pipeline Pilot. DXPRequest.xml 1 3 1 2 3 2 RESULTS FOLDER Accessible to both Spotfire and Pipeline Pilot. filename.dxp 5 4 4 5
  • 11. Implementing the Solution – DXP Request Message format DXP Request Message: An XML formatted message structure will be defined for Pipeline Pilot to request customized DXP files from the Custom Spotfire task. Both Pipeline Pilot and the Custom Spotfire Task will need to be aware of the message structure. Communication Medium: Some form of communication mechanism is required between Pipeline Pilot and the Spotfire Custom Task. One option is to use a common folder or folders to store the requests and the results. Pipeline Pilot and the Spotfire App would each monitor these folder(s) periodically or when activated.
  • 12. Implementing the Solution – Spotfire Related Extension: A custom extension will be developed as a Spotfire Add-in using the Spotfire SDK. Task: The custom Spotfire Add-in is then registered as an Automation Services Task. Job Builder : The Automation Services Job Builder will then be used for defining a Job consisting of this custom task and a Job File will be saved.
  • 13. Implementing the Solution – Pipeline Pilot Related Pipeline Pilot Components: Components will be developed to support this functionality. As a minimum: a component that can create a Spotfire analysis request file using the predefined message structure; another component to move the analysis request file to the predefined Spotfire folder and activate the Automation Services Job; and another component to retrieve the resulting analysis document (.DXP file) that the custom task has created.
  • 14. Proof-Of-Concept Demo Communication Between Spotfire and Pipeline Pilot: Three folders were created that are accessible to both Pipeline Pilot and Spotfire. Folder named DXPRequests: the folder in which the XML-formatted analysis requests are stored by Pipeline Pilot and retrieved by the custom task in Spotfire. Folder named DXPResults: the folder in which the Spotfire-generated analysis document (.DXP file) is stored and retrieved by Pipeline Pilot. Folder named DXPData: the folder used for storing sample data files and the preconfigured XML-formatted requests used in the demo.
  • 15. Proof-Of-Concept Demo Spotfire Setup: The custom task was registered as a Spotfire Automation Services task. Using the Job Builder, a Job file was created that consisted of our custom Automation Services task. The resulting XML-formatted Job File was saved in the DXPData folder (DXPRequestJob.xml). A copy of Spotfire.Dxp.Automation.ClientJobSender.exe was saved in the DXPData folder.
  • 16. Proof-Of-Concept Demo Pipeline Pilot Setup: Two sample analysis request files were created and stored in the DXPData folder. The contents of one of the files are shown below. (This request file is for the demo only; the format of the final message is still to be developed.)
<?xml version="1.0" encoding="utf-8"?>
<PPDXPRequest xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <dxpRequest>2</dxpRequest>
  <xDataColumnName>Cost</xDataColumnName>
  <yDataColumnName>Sales</yDataColumnName>
  <plotDataFile>c:\temp\dxpdata\SalesData.txt</plotDataFile>
  <dxpResultFile>c:\temp\dxpresults\SalesDataScatterPlot.dxp</dxpResultFile>
</PPDXPRequest>
The custom task is hard-coded to recognize that a dxpRequest code of 2 requests a scatter plot, that the data source is the file specified in the plotDataFile node, and that the resulting DXP file is to be stored at the location specified in the dxpResultFile node.
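Illustrative sketch (C#, not from the original slides): the request above maps naturally onto a small class that a task could deserialize with the standard .NET XmlSerializer and then dispatch on the request code. The real custom task's parsing code is not shown in the slides; the class and property names here simply mirror the XML element names, and the file path is the demo's assumed location.

using System;
using System.IO;
using System.Xml.Serialization;

public class PPDXPRequest
{
    // Property names mirror the element names in the sample request file.
    public int dxpRequest { get; set; }
    public string xDataColumnName { get; set; }
    public string yDataColumnName { get; set; }
    public string plotDataFile { get; set; }
    public string dxpResultFile { get; set; }
}

class RequestReader
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(PPDXPRequest));
        PPDXPRequest request;
        using (var stream = File.OpenRead(@"c:\temp\dxprequests\dxprequest.xml"))  // assumed demo path
        {
            request = (PPDXPRequest)serializer.Deserialize(stream);
        }

        // In the demo, a request code of 2 is hard-coded to mean "scatter plot".
        if (request.dxpRequest == 2)
        {
            Console.WriteLine("Scatter plot of " + request.yDataColumnName +
                              " vs " + request.xDataColumnName +
                              " from " + request.plotDataFile +
                              " -> " + request.dxpResultFile);
        }
    }
}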
  • 17. Proof-Of-Concept Demo (Cont.) Pipeline Pilot Setup: Three components were created: one to execute a command line that copies a predefined request file from the DXPData folder to the DXPRequests folder; one to execute a command line that activates the Automation Services Job containing our custom task; and one to open the resulting analysis (.DXP) file stored by Spotfire in the DXPResults folder.
  • 18. Proof-Of-Concept Demo Pipeline Pilot Setup: Using these simple components, the following protocol was created.
  • 19. Proof-Of-Concept Demo Execution Sequence: 1) Create and Submit Analysis Request (file) On activating the protocol, the user is prompted to select one of the two predefined analysis request files. The selected request file is copied to the DXPRequests folder and renamed to dxprequest.xml, which is the file name that the custom task on Spotfire looks for.
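Illustrative sketch (C#, not from the original slides): the copy-and-rename step above, assuming the demo's c:\temp folder layout. The source file name used as a default is hypothetical.

using System.IO;

class SubmitRequest
{
    static void Main(string[] args)
    {
        // The selected, preconfigured request file (e.g. one of the two demo files in DXPData).
        string selectedRequest = args.Length > 0
            ? args[0]
            : @"c:\temp\dxpdata\ScatterPlotRequest.xml";  // hypothetical demo file name

        // Copy it into the shared requests folder under the fixed name the custom task looks for.
        File.Copy(selectedRequest, @"c:\temp\dxprequests\dxprequest.xml", overwrite: true);
    }
}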
  • 20. Proof-Of-Concept Demo Execution Sequence: 2) Activate Automation Services Job Once the request file is copied to the DXPRequests folder, another component activates the Automation Services Job on Spotfire to process this request. (There are other options for activating the Automation Services Job besides the method used in this demo.) The command line executed by this component is:
"c:\temp\dxpdata\Spotfire.Dxp.Automation.ClientJobSender.exe" "http://laptopdayton/SpotfireAutomation/JobExecutor.asmx" "c:\temp\dxpdata\DXPRequestJob.xml"
  • 21. Proof-Of-Concept Demo Execution Sequence: 3) Execute Analysis Instructions (Spotfire) The XML-formatted instructions from the file and the Spotfire APIs are used to modify an existing analysis document or to create a new one. In our example the contents of the XML request file are as follows:
<?xml version="1.0" encoding="utf-8"?>
<PPDXPRequest xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <dxpRequest>2</dxpRequest> (a request type of 2 is interpreted to mean a scatter plot)
  <xDataColumnName>Cost</xDataColumnName> (column name of the X data values)
  <yDataColumnName>Sales</yDataColumnName> (column name of the Y data values)
  <plotDataFile>c:\temp\dxpdata\SalesData.txt</plotDataFile> (location of the source data)
  <dxpResultFile>c:\temp\dxpresults\SalesDataScatterPlot.dxp</dxpResultFile> (location for the resulting analysis document)
</PPDXPRequest>
  • 22. Proof-Of-Concept Demo (Cont.) When the Automation Services Job is activated, it executes the code of the custom task and also provides access to the running TIBCO Spotfire instance. A document opened in a running instance of TIBCO Spotfire is referred to as an Analysis Document. The document not only contains metadata, but also references to the data itself and to various other components that are part of the document, such as pages, filtering, bookmarks, etc. Since a data file was specified, the custom task uses the Open method of the Analysis Application with the specified data file. We now have a default analysis document that can be customized with the requested visualization. In this request file the requested visualization is a scatter plot, so the available method(s) are used to add a scatter plot visualization to the analysis document using the data file and the X and Y data column names. The analysis document is then saved using the name/location specified in the request file.
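The custom task's code is not reproduced in the slides; the following C# fragment is only a rough sketch of what adding the requested scatter plot might look like against the Spotfire SDK. All namespaces, types, and members referenced here (Spotfire.Dxp.Application, Document, Page, ScatterPlot, the AddNew/Expression members, and the Open/Save calls mentioned in the comments) are assumptions that should be checked against the SDK documentation for the Spotfire version in use.

using Spotfire.Dxp.Application;            // Document, Page (assumed namespace)
using Spotfire.Dxp.Application.Visuals;    // ScatterPlot (assumed namespace)

static class ScatterPlotBuilder
{
    // Sketch only: given an analysis document whose data table was loaded from the
    // requested data file, add a scatter plot wired to the requested X and Y columns.
    public static void AddScatterPlot(Document document, string xColumn, string yColumn)
    {
        Page page = document.Pages.AddNew();           // assumed API for adding a page
        page.Title = "PP Generated Scatter Plot";

        ScatterPlot plot = page.Visuals.AddNew<ScatterPlot>();  // assumed visual-creation API

        // Axis expressions use Spotfire's bracketed column syntax.
        plot.XAxis.Expression = "[" + xColumn + "]";
        plot.YAxis.Expression = "[" + yColumn + "]";

        // Opening the data file and saving the resulting DXP are done through the
        // Analysis Application's Open/Save APIs, as described in the slide above;
        // the exact overloads are version-specific and omitted here.
    }
}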
  • 23. Proof-Of-Concept Demo Execution Sequence: 4) Receive Requested Analysis (.DXP File) Once control is returned, the resulting analysis file will have been stored in the DXPResults folder by the custom task that was executed by the Spotfire Automation Services Job.
  • 24. Proof-Of-Concept Demo Execution Sequence: 5) View Analysis Document (.DXP file) Finally, the analysis file is opened by a component executing the following command line: c:\temp\dxpresults\salesdatascatterplot.dxp Note: The resulting plot was created using the default template provided by Spotfire’s API. The filters, the page and the Details-on-Demand were preconfigured but could be customized; in this case only the data file and the names of the X and Y columns were provided.
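Illustrative sketch (C#, not from the original slides): executing the .dxp path as a command simply lets Windows resolve the file association and open the document in TIBCO Spotfire. A minimal equivalent of that final step:

using System.Diagnostics;

class OpenResult
{
    static void Main()
    {
        // UseShellExecute lets the OS launch the application associated with .dxp files.
        Process.Start(new ProcessStartInfo
        {
            FileName = @"c:\temp\dxpresults\salesdatascatterplot.dxp",
            UseShellExecute = true
        });
    }
}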
  • 25. Phase-II Gather a list of all possible DXP file requests that are required from Pipeline Pilot. Using this data, determine an XML-formatted message structure that can support all the requests. Determine whether these requests can be implemented using the Spotfire APIs. Develop a library of Pipeline Pilot components that an end user could use with or without modifications. Develop the custom task to interpret these DXP requests.