An introduction to the core features of the Globus data management service. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Greg Nawrocki.
GlobusWorld 2021 Tutorial: The Globus CLI, Platform and SDK
An introduction to the Globus command line interface and the SDK for accessing Globus platform services. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Greg Nawrocki.
Introduction to Globus (GlobusWorld Tour West)
This document introduces Globus, which provides fast and reliable data transfer, sharing, and platform services across different storage systems and resources. It does this through software-as-a-service that uses existing user identities, with the goal of unifying access to data across different tiers such as HPC, archival storage, cloud, and personal resources. Key features include secure file transfer, in-place data sharing without making copies, access control, and tools for building automations and integrating with science gateways. It also discusses options for handling protected data, such as health information, with additional security controls and business associate agreements.
Globus is a non-profit data management service that allows users to transfer, share, and access data across different storage systems and platforms through software-as-a-service. It has transferred over 1.34 exabytes of data and aims to unify access to research data across different tiers of storage through connectors, APIs, and user interfaces. Globus ensures secure data transfers and sharing by using user identities, access controls, encryption, and audit logging without storing user credentials or data.
GlobusWorld 2021 Tutorial: Globus for System Administrators
An overview of installing and configuring Globus endpoints on your storage system. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Vas Vasiliadis.
GlobusWorld 2021 Tutorial: Building with the Globus Platform
Using Globus platform services like Search and Flows to build data portals, science gateways and data commons that facilitate data discovery and collaboration. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Vas Vasiliadis.
Instrument Data Orchestration with Globus Search and Flows
This document discusses various Globus services for instrument data orchestration, including the Timer service, platform services, authentication, search, transfer, flows, and the upcoming Trigger service. The Timer service allows for scheduled and recurring transfers. Platform services provide comprehensive data and compute orchestration. Authentication is handled by Globus Auth. Search allows for data description and discovery. Transfer moves and shares data. Flows automate distributed research tasks. Triggers will start flows based on events.
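As a rough illustration of the scheduled-transfer capability described above, the sketch below uses the globus-sdk 3.x TimerClient/TimerJob interface to repeat a transfer every 24 hours. The tokens, collection UUIDs, paths, and job name are placeholders, not values from the presentation.

```python
# A sketch of a scheduled, recurring transfer, assuming the globus-sdk 3.x
# TimerClient / TimerJob interface. Tokens, collection UUIDs, and paths are placeholders.
from datetime import datetime, timedelta, timezone

import globus_sdk

TRANSFER_TOKEN = "TRANSFER_ACCESS_TOKEN"  # placeholder
TIMER_TOKEN = "TIMER_ACCESS_TOKEN"        # placeholder
SRC = "SOURCE_COLLECTION_UUID"            # placeholder
DST = "DESTINATION_COLLECTION_UUID"       # placeholder

tc = globus_sdk.TransferClient(authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN))

# Describe the transfer to repeat: mirror an instrument's output directory.
tdata = globus_sdk.TransferData(tc, SRC, DST, label="nightly instrument sync",
                                sync_level="checksum")
tdata.add_item("/instrument/output/", "/archive/instrument/", recursive=True)

# Wrap it in a Timer job that starts now and repeats every 24 hours.
timer = globus_sdk.TimerClient(authorizer=globus_sdk.AccessTokenAuthorizer(TIMER_TOKEN))
job = globus_sdk.TimerJob.from_transfer_data(
    tdata,
    start=datetime.now(timezone.utc),
    interval=timedelta(days=1),
    name="nightly-instrument-sync",
)
print("Created timer job:", timer.create_job(job)["job_id"])
```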
Globus Command Line Interface (APS Workshop)
The document provides information about using the Globus Command Line Interface (CLI) to automate data transfers and sharing. It discusses installing the CLI and basic commands such as searching for endpoints, listing files, and submitting transfers. It also covers more advanced topics like managing permissions, batch transfers, and notifications, and gives examples of automation scripts that use the CLI to move data between endpoints and share it with other users based on permissions. The final section walks through a shell script that automates moving data from an instrument to a shared guest collection and setting permissions so another user can access it.
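As a concrete illustration of that automation pattern, here is a minimal Python sketch that drives the Globus CLI via subprocess. The endpoint UUIDs, paths, run directory, and collaborator identity are placeholders, and the exact options of the permission command should be checked against your CLI version's --help output.

```python
# A sketch of the script pattern described above, driving the Globus CLI from Python.
# Endpoint UUIDs, paths, and the collaborator identity are placeholders; check
# `globus endpoint permission create --help` for the exact options in your CLI version.
import subprocess

SRC = "SOURCE_ENDPOINT_UUID"               # placeholder instrument endpoint
SHARE = "GUEST_COLLECTION_UUID"            # placeholder guest (shared) collection
COLLABORATOR = "collaborator@example.edu"  # placeholder identity

def run(cmd):
    """Run a globus CLI command and return its stdout."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# See what the instrument produced.
print(run(["globus", "ls", f"{SRC}:/instrument/output/"]))

# Copy the data to the guest collection where it can be shared.
print(run([
    "globus", "transfer", "--recursive",
    f"{SRC}:/instrument/output/", f"{SHARE}:/incoming/run-001/",
    "--label", "instrument-to-share",
]))

# Grant the collaborator read access to the new directory (flags assumed as noted above).
print(run([
    "globus", "endpoint", "permission", "create",
    f"{SHARE}:/incoming/run-001/",
    "--permissions", "r", "--identity", COLLABORATOR,
]))
```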
This document provides an overview of Globus Data Portals and the Django Globus Portal Framework. Globus Data Portals allow users to securely discover, access, and act on research data through a customizable web interface. The Django Globus Portal Framework enables researchers to quickly build portals by leveraging Globus services like Search, Transfer, and Auth for features like authenticated access, data discovery, and integration with custom applications. It discusses key portal capabilities and architecture, and provides a link to a demo of building a new portal using the framework and an example repository.
Making Storage Systems Accessible via Globus (GlobusWorld Tour West)
1. Globus Connect Server makes storage systems accessible via Globus: the software is installed on the storage system and connects it to the Globus service.
2. To set up access, you first register a Globus Connect Server, install the software, set up an endpoint, and create a storage gateway and mapped collection.
3. You can then associate the endpoint with a Globus subscription to manage access and share data by creating guest collections.
Introduction to the Globus Platform (APS Workshop)
This document discusses the Globus Platform Services API and SDK. It provides an overview of the Globus Auth API for user authentication and file sharing capabilities. It also summarizes the Globus Transfer API and Python SDK for integrating file transfer and access management into applications. Several methods for tasks like endpoint search, file operations, task submission and management are covered at a high level.
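The sketch below illustrates the SDK calls summarized above using globus-sdk 3.x: a native-app login, endpoint search, a directory listing, and transfer task submission. The client ID, collection UUIDs, and paths are placeholders rather than values from the slides.

```python
# A sketch of the SDK calls summarized above, using globus-sdk 3.x: native-app login,
# endpoint search, a directory listing, and transfer task submission. The client ID,
# collection UUIDs, and paths are placeholders.
import globus_sdk

CLIENT_ID = "YOUR_NATIVE_APP_CLIENT_ID"  # placeholder: register at developers.globus.org
TRANSFER_SCOPE = "urn:globus:auth:scope:transfer.api.globus.org:all"

# Interactive native-app login to obtain a Transfer token.
auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow(requested_scopes=TRANSFER_SCOPE)
print("Log in at:", auth_client.oauth2_get_authorize_url())
tokens = auth_client.oauth2_exchange_code_for_tokens(input("Auth code: "))
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]
tc = globus_sdk.TransferClient(authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token))

# Endpoint search.
for ep in tc.endpoint_search("tutorial", limit=5):
    print(ep["display_name"], ep["id"])

# File operations: list a directory on a collection (placeholder UUID and path).
src = "SOURCE_COLLECTION_UUID"
dst = "DESTINATION_COLLECTION_UUID"
for entry in tc.operation_ls(src, path="/data/"):
    print(entry["type"], entry["name"])

# Task submission and management.
tdata = globus_sdk.TransferData(tc, src, dst, label="SDK example")
tdata.add_item("/data/file1.txt", "/incoming/file1.txt")
task_id = tc.submit_transfer(tdata)["task_id"]
tc.task_wait(task_id, timeout=300, polling_interval=15)
print("Final status:", tc.get_task(task_id)["status"])
```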
Connecting Your System to Globus (APS Workshop)
Connecting a storage system to Globus involves:
1. Registering a Globus Connect Server to get credentials.
2. Installing Globus Connect Server packages and setting up an endpoint and data transfer node.
3. Creating a POSIX storage gateway to make the storage accessible and define authentication policies.
4. Adding a mapped collection to provide access to files in the storage system via Globus.
5. Associating the endpoint with a Globus subscription to enable additional features.
Data Orchestration at Scale (GlobusWorld Tour West)
This document discusses how the Globus platform can be used for instrument data orchestration at scale. It describes how Globus Auth provides foundational identity and access management, Globus Transfer enables data transfer and sharing, Globus Search allows for data description and discovery, and Globus Flows can automate multi-step data and computing workflows. Specific capabilities and use cases are outlined, such as federated cancer registry queries across multiple institutions using Globus services while maintaining local data control.
Simple Data Automation with Globus (GlobusWorld Tour West)
Greg Nawrocki's document discusses how Globus provides tools for data automation programming including a command line interface, timer service, REST APIs, and Python SDK. These tools allow users to create integrated ecosystems of research data services and applications while managing security and authentication through Globus Auth. Specific examples are given for using the command line interface, timer service, REST APIs, and Python SDK to automate tasks like file transfers, scheduled jobs, and accessing endpoints. Resources for learning more and code examples are also provided.
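To complement the CLI and SDK routes mentioned above, here is a minimal sketch of calling the Globus Transfer REST API directly with the requests library. The bearer token, collection UUID, and path are placeholders.

```python
# A sketch of the raw REST route mentioned above, calling the Globus Transfer API
# directly with requests. The bearer token, collection UUID, and path are placeholders.
import requests

TOKEN = "TRANSFER_ACCESS_TOKEN"  # placeholder: obtained via Globus Auth
BASE = "https://transfer.api.globus.org/v0.10"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Find endpoints/collections matching a search string.
resp = requests.get(f"{BASE}/endpoint_search",
                    params={"filter_fulltext": "tutorial", "limit": 5},
                    headers=HEADERS)
resp.raise_for_status()
for ep in resp.json()["DATA"]:
    print(ep["display_name"], ep["id"])

# List a directory on a collection (placeholder UUID).
collection = "COLLECTION_UUID"
resp = requests.get(f"{BASE}/operation/endpoint/{collection}/ls",
                    params={"path": "/data/"}, headers=HEADERS)
resp.raise_for_status()
for entry in resp.json()["DATA"]:
    print(entry["type"], entry["name"])
```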
Globus and Dataverse: Towards Big Data Publication
Globus is a non-profit service that aims to increase research efficiency through sustainable software. It unifies access to data across disparate systems like cloud storage, repositories, and research instruments. Globus Connectors support secure data sharing and transfer between these systems. The document proposes using Globus and Dataverse together for big data publication, allowing faceted searches of a Dataverse repository, authorization controls, and asynchronous bulk data transfer using Globus for larger datasets. A demonstration of this combination is available online.
An overview of developments in the Globus platform during 2020-2021, presented at a webinar hosted by Internet2. Includes an overview of Globus Connect Server v5, cloud storage connectors, and platform services for developers (e.g., Globus Search and Globus Flows).
This document provides information about installing and configuring Globus Connect Server (GCS) version 4. It discusses how GCS makes local storage accessible via Globus, installing GCS on an Amazon EC2 instance, common configuration options like restricting file paths and enabling sharing, and using subscriptions to create managed endpoints.
Introduction to the Globus Platform (GlobusWorld Tour - UMich)
1) The Globus platform provides services for fast and reliable data transfer, sharing, and file management directly from storage systems via software-as-a-service using existing identities.
2) Globus can be used as a platform for building science gateways, portals and other web applications in support of research through APIs for user authentication, file transfer, and sharing capabilities.
3) The document provides an introduction to the Globus platform and its capabilities, including code samples, and walks through using the APIs via a Jupyter notebook to search for endpoints, manage files and tasks, and integrate Globus into other applications.
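A brief sketch of the task-management side of such a notebook walkthrough, using globus-sdk 3.x: listing recent transfer tasks, inspecting one, and cancelling it. The access token and task UUID are placeholders.

```python
# A sketch of the task-management side of that walkthrough (globus-sdk 3.x): list
# recent transfer tasks, inspect one, and cancel it. The token and task UUID are
# placeholders.
import globus_sdk

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("TRANSFER_ACCESS_TOKEN")  # placeholder
)

# List the caller's most recent transfer tasks.
for task in tc.task_list(limit=10):
    print(task["task_id"], task["status"], task["label"])

# Inspect a single task and its event stream (placeholder task ID).
task_id = "TASK_UUID"
task = tc.get_task(task_id)
print(task["status"], task["files_transferred"], "files transferred")
for event in tc.task_event_list(task_id, limit=5):
    print(event["time"], event["description"])

# Cancel the task if it is still running and no longer needed.
if task["status"] == "ACTIVE":
    tc.cancel_task(task_id)
```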
Globus Endpoint Setup and Configuration - XSEDE14 Tutorial
This document provides an overview of how to create and manage Globus endpoints. It discusses installing and configuring Globus Connect Server to set up an endpoint on an Amazon EC2 server. The key steps are to install Globus Connect Server, run the setup process, and configure options in the configuration file like making the endpoint public or enabling sharing. Advanced configuration topics covered include using host certificates, single sign-on with CILogon, restricting file paths for transfers and sharing, and setting up multiple Globus Connect Server instances for load balancing.
Leveraging the Globus Platform in Web Applications (CHPC 2019 - South Africa)
This document discusses how to leverage the Globus platform in web applications. It describes the Globus platform as providing secure and reliable data orchestration and transfer capabilities. It outlines several key Globus services like Auth, Transfer, and Helper Pages and provides examples of how they can be used to build applications. It also summarizes the Globus APIs and provides code examples of accessing endpoints, submitting tasks, and more.
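As a minimal sketch of the web-application integration described above, the snippet below shows the Globus Auth authorization-code flow with globus-sdk 3.x. The client ID, secret, and redirect URI are placeholders; in a real portal these functions would live in the login and callback routes of your web framework.

```python
# A sketch of the Globus Auth authorization-code flow a web application would use
# (globus-sdk 3.x). The client ID, secret, and redirect URI are placeholders; in a real
# portal these functions live in the web framework's login and callback routes.
import globus_sdk

CLIENT_ID = "PORTAL_CLIENT_ID"          # placeholder: app registered with Globus Auth
CLIENT_SECRET = "PORTAL_CLIENT_SECRET"  # placeholder
REDIRECT_URI = "https://example.org/globus/callback"  # placeholder
TRANSFER_SCOPE = "urn:globus:auth:scope:transfer.api.globus.org:all"

auth_client = globus_sdk.ConfidentialAppAuthClient(CLIENT_ID, CLIENT_SECRET)

def login_url() -> str:
    """Step 1: redirect the user's browser to Globus Auth."""
    auth_client.oauth2_start_flow(REDIRECT_URI, requested_scopes=TRANSFER_SCOPE)
    return auth_client.oauth2_get_authorize_url()

def handle_callback(code: str) -> globus_sdk.TransferClient:
    """Step 2: exchange the authorization code for tokens and build a client."""
    tokens = auth_client.oauth2_exchange_code_for_tokens(code)
    transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]
    return globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token)
    )
```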
Globus: Research Data Management as Service and Platform (PEARC17)
Scientists have embraced the use of specialized cloud-hosted services to perform data management operations. Globus offers a suite of data and user management capabilities to the community, encompassing data transfer and sharing, user identity and authorization, and data publication. Globus capabilities are accessible via both a web browser and REST APIs. Web access allows Globus to address the needs of research labs through a software-as-a-service model; the newer REST APIs address the needs of developers of research services, who can now use Globus as a platform, outsourcing complex user and data management tasks to Globus cloud-hosted services. Here we review Globus capabilities and outline how it is being applied as a platform for scientific services. Presentation by Steve Tuecke from The University of Chicago. Steve is Globus Founder and Project Lead.
We provide a summary review of Globus features targeted at those new to Globus. We demonstrate how to transfer and share data, and install a Globus Connect Personal endpoint on your laptop.
Managing Protected and Controlled Data with Globus
This document discusses how Globus can be used to manage protected and controlled data with high assurance. It describes features for restricting data handling according to standards like NIST 800-53 and 800-171. Compliance focuses on access control, configuration management, maintenance, and accountability. Restricted data passed to Globus does not include file contents. The initial release includes a new web app, Globus Connect Server v5.2, and Connect Personal. High assurance capabilities include additional authentication, application instance isolation, encryption, and detailed auditing. Subscription levels like High Assurance and BAA provide these features.
Introduction to Globus for New Users (GlobusWorld Tour - Columbia University)
This document provides an introduction to Globus for new users. It discusses how Globus can be used to move, share, describe, discover, and reproduce research data across different storage locations and computing resources. It highlights key Globus features like transferring files securely between endpoints, sharing data with collaborators, and building applications and services using the Globus APIs. The document also covers sustainability topics like Globus usage metrics, subscription plans, and support resources.
Automating Research Data Management at Scale with Globus
Research computing facilities, such as the national supercomputing centers, and shared instruments, such as cryo electron microscopes and advanced light sources, are generating large volumes of data daily. These growing data volumes make it challenging for researchers to perform what should be mundane tasks: move data reliably, describe data for subsequent discovery, and make data accessible to geographically distributed collaborators. Most employ some set of ad hoc methods, which are not scalable, and it is clear that some level of automation is required for these tasks.
Globus is an established service from the University of Chicago that is widely used for managing research data in national laboratories, campus computing centers, and HPC facilities. While its intuitive web app addresses simple file transfer and sharing scenarios, automation at scale requires integrating Globus data management platform services into custom science gateways, data portals and other web applications in service of research. Such applications should enable automated ingest of data from diverse sources, launching of analysis runs on diverse computing resources, extraction and addition of metadata for creating search indexes, assignment of persistent identifiers, faceted search for rapid data discovery, and point-and-click downloading of datasets by authorized users — all protected by an authentication and authorization substrate that allows the implementation of flexible data access policies for both metadata and data alike.
We describe current and emerging Globus services that facilitate these automated data flows while ensuring a streamlined user experience. We also demonstrate Petreldata.net, a data management portal and gateway to multiple computing resources, that supports large-scale research at the Advanced Photon Source.
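As a small illustration of the "describe data for subsequent discovery" step in such automated flows, the sketch below pushes one metadata record into a Globus Search index and then queries it back with globus-sdk 3.x. The index UUID, token, subject, and metadata fields are placeholders invented for the example.

```python
# A sketch of the "describe data for subsequent discovery" step: push one metadata
# record into a Globus Search index and query it back (globus-sdk 3.x). The index UUID,
# token, subject, and metadata fields are placeholders invented for the example.
import globus_sdk

INDEX_ID = "SEARCH_INDEX_UUID"  # placeholder
sc = globus_sdk.SearchClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("SEARCH_ACCESS_TOKEN")  # placeholder
)

# One metadata entry describing a dataset; visible_to controls who may discover it.
entry = {
    "ingest_type": "GMetaEntry",
    "ingest_data": {
        "subject": "globus://SOURCE_COLLECTION_UUID/experiments/run-001/",
        "visible_to": ["public"],
        "content": {
            "title": "Example instrument run",
            "instrument": "example-beamline",
            "date": "2024-01-01",
            "file_count": 1284,
        },
    },
}
print("Ingest task:", sc.ingest(INDEX_ID, entry)["task_id"])

# Later, collaborators discover the record with a free-text query.
for hit in sc.search(INDEX_ID, q="instrument run")["gmeta"]:
    print(hit["subject"])
```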
We demonstrate Globus capabilities from the perspectives of an end-user researcher. This is an introduction to the many features of the Globus web application and command line interface. Beyond simple file transfers, you will learn how to securely share data with collaborators, set up personal endpoints and manage linked identities for unified access to storage resources.
Presented at a workshop at KU Leuven on July 8, 2022.
Globus is a non-profit service that aims to increase research efficiency by unifying access to disparate storage systems and simplifying secure data sharing. It allows users to easily, securely, and reliably transfer data between different resources like HPC systems, cloud storage, instruments, and personal computers. Globus also provides APIs and SDKs to help researchers build data-centric applications and automate workflows. Funding comes partly from government grants, with subscriptions enabling additional features and supporting ongoing operations.
We provide a summary review of Globus features targeted at those new to Globus. We demonstrate how to transfer and share data, and install a Globus Connect Personal endpoint on your laptop.
Presented at a workshop at Oak Ridge National Laboratory on June 22, 2022.
Globus: A Data Management Platform for Collaborative Research (CHPC 2019 - South Africa)
Globus is a non-profit data management platform developed by the University of Chicago to increase the efficiency of data-driven research. It allows researchers to easily transfer large datasets, securely share data with collaborators across different storage systems, and automate the movement of data from instruments and compute facilities. Globus sees growing adoption with over 120 subscribers and has transferred over 768 petabytes of data.
Facilitating Collaboration with Globus (GlobusWorld Tour - STFC)
This document discusses how Globus services can facilitate collaboration and data sharing through automated workflows. It describes how Globus Auth enables authentication for shared access to endpoints. APIs and command line tools allow applications to programmatically manage permissions and transfer data. JupyterHub can be configured with Globus Auth to provide tokens for accessing remote Globus services within notebooks. This enables collaborative and distributed data analysis. The document also outlines how Globus services can support automated publication of datasets through search, identifiers, and metadata.
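A minimal sketch of the programmatic permission management mentioned above (globus-sdk 3.x): resolve a collaborator's Globus identity and grant read access to a folder on a guest collection. The tokens, collection UUID, path, and username are placeholders.

```python
# A sketch of programmatic permission management (globus-sdk 3.x): resolve a
# collaborator's Globus identity and grant read access to a folder on a guest
# collection. Tokens, UUIDs, the path, and the username are placeholders.
import globus_sdk

auth = globus_sdk.AuthClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("AUTH_ACCESS_TOKEN")  # placeholder
)
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("TRANSFER_ACCESS_TOKEN")  # placeholder
)

GUEST_COLLECTION = "GUEST_COLLECTION_UUID"  # placeholder
collaborator = auth.get_identities(usernames="collaborator@example.edu")["identities"][0]

rule = {
    "DATA_TYPE": "access",
    "principal_type": "identity",
    "principal": collaborator["id"],
    "path": "/shared/run-001/",
    "permissions": "r",  # read-only
}
print("Created access rule:", tc.add_endpoint_acl_rule(GUEST_COLLECTION, rule)["access_id"])
```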
Globus is a SaaS platform for research data management that allows researchers to transfer, share, and publish research data directly from their storage systems via a web or command line interface. It provides reliable and secure data transfers between instruments, compute facilities, and personal computers. Researchers can select files to share, set access permissions for users or groups, and assemble data sets with metadata. Collaborators can then access and download shared files through Globus without a local account. The document discusses how Globus works, its target markets, business model, management tools, and representative subscribers. It also describes how Globus can integrate with storage systems like Spectra to provide a consistent interface and optimized data transfers across storage hierarchies.
Globus High Assurance for Protected Data (GlobusWorld Tour - Columbia University)
This document discusses Globus High Assurance for managing protected data such as PHI and PII. It describes new Globus features for restricted data handling including additional authentication assurance, session isolation, and encryption. Globus Connect Server version 5.2 adds support for high assurance mapped and guest collections on various storage systems. Subscription levels of High Assurance and BAA provide tools to share data while meeting compliance requirements.
Simplified Research Data Management with the Globus Platform
Overview of the Globus research data management platform, as presented at the Fall 2018 Membership Meeting of the Coalition for Networked Information (CNI), held in Washington, D.C., December 10-11, 2018
Introduction to the Globus SaaS (GlobusWorld Tour - STFC)
This document summarizes a presentation about the Globus data management platform. It includes an agenda covering an introduction to the Globus Software as a Service and Platform as a Service, automating research data workflows, facilitating collaboration, and building services. There are demonstrations of file transfers, data sharing, publication, and high assurance endpoints. The sustainability model is discussed, with standard and high assurance subscriptions, branded websites, premium storage connectors, and identity providers. Support resources like documentation, email lists, and professional services are also mentioned.
We provide an overview of the Globus platform features, and demonstrate several data management features. Serving as an introductory session suitable for new users, we use the Globus web app to show data transfer and sharing, use of Globus Connect Personal for laptop/desktop access and introduce the Globus Command Line Interface for interactive and scripting.
This material was presented at the Research Computing and Data Management Workshop, hosted by Rensselaer Polytechnic Institute on February 27-28, 2024.
Introduction to Data Transfer and Sharing for Researchers
We will provide a summary review of Globus features targeted at those new to Globus. We will present various use cases that illustrate the power of Globus data sharing capabilities.
Introduction to Globus (GlobusWorld Tour - UMich)
This document provides an agenda for a Globus World Tour event taking place on Monday and Tuesday. On Monday, there will be sessions on introductions to Globus for new and administrative users. On Tuesday, sessions will focus on developing with Globus, including building research data portals, automating workflows, and working with instrument data. The document also provides background information on Globus and how it aims to make research data movement, sharing, and synchronization easy, reliable and secure for researchers.
Scalable Data Management: Automation and the Modern Research Data Portal
Globus is an established service from the University of Chicago that is widely used for managing research data in national laboratories, campus computing centers, and HPC facilities. While its interactive web browser interface addresses simple file transfer and sharing scenarios, large scale automation typically requires integration of the research data management platform it provides into bespoke applications.
We will describe one such example, the Petrel data portal (https://petreldata.net), used by researchers to manage data in diverse fields including materials science, cosmology, machine learning, and serial crystallography. The portal facilitates automated ingest of data, extraction and addition of metadata for creating search indexes, assignment of persistent identifiers, faceted search for rapid data discovery, and point-and-click downloading of datasets by authorized users. As security and privacy are often critical requirements, the portal employs fine-grained permissions that control both visibility of metadata and access to the datasets themselves. It is based on the Modern Research Data Portal design pattern, jointly developed by the ESnet and Globus teams, and leverages capabilities such as the Science DMZ for enhanced performance and to streamline the user experience.
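As an illustration of the faceted search such a portal exposes, the sketch below issues a structured Globus Search query with a filter and a facet using globus-sdk 3.x. The index UUID and field names are placeholders chosen for the example, not the Petrel portal's actual schema.

```python
# A sketch of the faceted search behind such a portal, issuing a structured Globus
# Search query with a filter and a facet (globus-sdk 3.x). The index UUID and field
# names are placeholders, not the Petrel portal's actual schema.
import globus_sdk

INDEX_ID = "SEARCH_INDEX_UUID"  # placeholder
sc = globus_sdk.SearchClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("SEARCH_ACCESS_TOKEN")  # placeholder
)

query = {
    "q": "crystallography",
    "limit": 10,
    # Only return records matching the facet value the user selected.
    "filters": [
        {"type": "match_any", "field_name": "instrument", "values": ["example-beamline"]}
    ],
    # Ask the service to compute facet buckets for the portal's sidebar.
    "facets": [
        {"name": "By instrument", "type": "terms", "field_name": "instrument", "size": 10}
    ],
}
result = sc.post_search(INDEX_ID, query)
for hit in result["gmeta"]:
    print(hit["subject"])
for facet in result.data.get("facet_results", []):
    print(facet["name"], [(b["value"], b["count"]) for b in facet["buckets"]])
```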
Globus Compute with IRI Workflows - GlobusWorld 2024
As part of the DOE Integrated Research Infrastructure (IRI) program, NERSC at Lawrence Berkeley National Lab and ALCF at Argonne National Lab are working closely with General Atomics on accelerating the computing requirements of the DIII-D experiment. As part of the work the team is investigating ways to speed up the time to solution for many different parts of the DIII-D workflow, including how they run jobs on HPC systems. One of these routes is looking at Globus Compute as a way to replace the current method for managing tasks, and we describe a brief proof of concept showing how Globus Compute could help to schedule jobs and be a tool to connect compute at different facilities.
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Earth System Grid Federation
The Earth System Grid Federation (ESGF) is a global network of data servers that archives and distributes the planet's largest collection of Earth system model output for thousands of climate and environmental scientists worldwide. Many of these petabyte-scale data archives are located in proximity to large high-performance computing (HPC) or cloud computing resources, but the primary workflow for data users consists of transferring data and applying computations on a different system. As part of the ESGF 2.0 US project (funded by the United States Department of Energy Office of Science), we developed pre-defined data workflows, which can be run on demand and are capable of applying many data reduction and data analysis operations to the large ESGF data archives, transferring only the resulting products (e.g., visualizations, smaller data files). In this talk, we will showcase a few of these workflows, highlighting how Globus Flows can be used for petabyte-scale climate analysis.
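As a rough sketch of how one of these pre-defined workflows might be started programmatically, the snippet below runs a deployed Globus Flow with globus-sdk 3.x. The flow UUID, token, and input fields are placeholders; a real ESGF flow defines its own input schema.

```python
# A sketch of starting one of these pre-defined workflows programmatically with
# globus-sdk 3.x. The flow UUID, token, and input fields are placeholders; a real ESGF
# flow defines its own input schema.
import globus_sdk

FLOW_ID = "FLOW_UUID"  # placeholder: a deployed Globus Flow
flow = globus_sdk.SpecificFlowClient(
    FLOW_ID,
    authorizer=globus_sdk.AccessTokenAuthorizer("FLOW_ACCESS_TOKEN"),  # placeholder
)

# The body must match the flow's input schema; these fields are illustrative only.
run = flow.run_flow(
    body={
        "source_collection": "ESGF_COLLECTION_UUID",
        "dataset": "example-climate-dataset",
        "operation": "annual_mean",
        "destination_collection": "RESULTS_COLLECTION_UUID",
    },
    label="climate reduction example",
)
print("Run ID:", run["run_id"], "status:", run["status"])
```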
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
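A minimal sketch of that unified programming interface from the researcher's side, using the Globus Compute SDK's Executor. The endpoint UUID is a placeholder for an endpoint deployed by a system administrator, and the function is a toy example.

```python
# A sketch of the unified programming interface from the researcher's side, using the
# Globus Compute SDK's Executor. The endpoint UUID is a placeholder for an endpoint
# deployed by a system administrator; the function is a toy example.
from globus_compute_sdk import Executor

ENDPOINT_ID = "COMPUTE_ENDPOINT_UUID"  # placeholder

def matrix_trace(n: int) -> float:
    """Toy function executed remotely; imports are resolved on the remote worker."""
    import numpy as np
    return float(np.trace(np.random.default_rng(0).random((n, n))))

with Executor(endpoint_id=ENDPOINT_ID) as gce:
    futures = [gce.submit(matrix_trace, n) for n in (100, 200, 400)]
    for fut in futures:
        print(fut.result())  # blocks until the remote task completes
```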
Globus Connect Server Deep Dive - GlobusWorld 2024
We explore the Globus Connect Server (GCS) architecture and experiment with advanced configuration options and use cases. This content is targeted at system administrators who are familiar with GCS and currently operate—or are planning to operate—broader deployments at their institution.
Innovating Inference - Remote Triggering of Large Language Models on HPC Clusters
Large Language Models (LLMs) are currently the center of attention in the tech world, particularly for their potential to advance research. In this presentation, we'll explore a straightforward and effective method for quickly initiating inference runs on supercomputers using the vLLM tool with Globus Compute, specifically on the Polaris system at ALCF. We'll begin by briefly discussing the popularity and applications of LLMs in various fields. Following this, we will introduce the vLLM tool, and explain how it integrates with Globus Compute to efficiently manage LLM operations on Polaris. Attendees will learn the practical aspects of setting up and remotely triggering LLMs from local machines, focusing on ease of use and efficiency. This talk is ideal for researchers and practitioners looking to leverage the power of LLMs in their work, offering a clear guide to harnessing supercomputing resources for quick and effective LLM inference.
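As a rough sketch of the remote-triggering pattern described in this talk, the snippet below ships a vLLM inference function to a Globus Compute endpoint and collects the results locally. The endpoint UUID, model name, and prompt are placeholders, and the endpoint's environment is assumed to have vllm installed with GPU access; the actual Polaris setup may differ.

```python
# A sketch of the remote-triggering pattern described above: a function that runs vLLM
# inference is shipped to a Globus Compute endpoint and invoked from a local machine.
# The endpoint UUID, model, and prompt are placeholders, and the endpoint environment
# is assumed to have vllm installed with GPU access; the actual Polaris setup may differ.
from globus_compute_sdk import Executor

ENDPOINT_ID = "HPC_COMPUTE_ENDPOINT_UUID"  # placeholder

def generate(prompts, model="facebook/opt-125m", max_tokens=64):
    """Runs on the remote endpoint; all imports happen on the worker."""
    from vllm import LLM, SamplingParams
    llm = LLM(model=model)
    outputs = llm.generate(prompts, SamplingParams(temperature=0.8, max_tokens=max_tokens))
    return [out.outputs[0].text for out in outputs]

with Executor(endpoint_id=ENDPOINT_ID) as gce:
    future = gce.submit(generate, ["Summarize the benefits of HPC for LLM inference:"])
    for text in future.result():
        print(text)
```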
Providing Globus Services to Users of JASMIN for Environmental Data Analysis
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
First Steps with Globus Compute Multi-User Endpoints
In this presentation we will share our experiences around getting started with the Globus Compute multi-user endpoint. Working with the Pharmacology group at the University of Auckland, we have previously written an application using Globus Compute that can offload computationally expensive steps in the researcher's workflows, which they wish to manage from their familiar Windows environments, onto the NeSI (New Zealand eScience Infrastructure) cluster. Some of the challenges we have encountered were that each researcher had to set up and manage their own single-user globus compute endpoint and that the workloads had varying resource requirements (CPUs, memory and wall time) between different runs. We hope that the multi-user endpoint will help to address these challenges and share an update on our progress here.
Enhancing Research Orchestration Capabilities at ORNL
Cross-facility research orchestration comes with ever-changing constraints regarding the availability and suitability of various compute and data resources. In short, a flexible data and processing fabric is needed to enable the dynamic redirection of data and compute tasks throughout the lifecycle of an experiment. In this talk, we illustrate how we easily leveraged Globus services to instrument the ACE research testbed at the Oak Ridge Leadership Computing Facility with flexible data and task orchestration capabilities.
Understanding Globus Data Transfers with NetSage
NetSage is an open privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage traditionally has used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks world wide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for Flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several different example use cases that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to the Globus users?
How to Position Your Globus Data Portal for Success: Ten Good Practices
Science gateways allow science and engineering communities to access shared data, software, computing services, and instruments. Science gateways have gained a lot of traction in the last twenty years, as evidenced by projects such as the Science Gateways Community Institute (SGCI) and the Center of Excellence on Science Gateways (SGX3) in the US, The Australian Research Data Commons (ARDC) and its platforms in Australia, and the projects around Virtual Research Environments in Europe. A few mature frameworks have evolved with their different strengths and foci and have been taken up by a larger community such as the Globus Data Portal, Hubzero, Tapis, and Galaxy. However, even when gateways are built on successful frameworks, they continue to face the challenges of ongoing maintenance costs and how to meet the ever-expanding needs of the community they serve with enhanced features. It is not uncommon that gateways with compelling use cases are nonetheless unable to get past the prototype phase and become a full production service, or if they do, they don't survive more than a couple of years. While there is no guaranteed pathway to success, it seems likely that for any gateway there is a need for a strong community and/or solid funding streams to create and sustain its success. With over twenty years of examples to draw from, this presentation goes into detail for ten factors common to successful and enduring gateways that effectively serve as best practices for any new or developing gateway.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. Geological Survey
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivering, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership includes the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of the Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed and provide relevant project progress.
Developing Distributed High-performance Computing Capabilities of an Open Sci...
COVID-19 had an unprecedented impact on scientific collaboration. The pandemic and its broad response from the scientific community has forged new relationships among public health practitioners, mathematical modelers, and scientific computing specialists, while revealing critical gaps in exploiting advanced computing systems to support urgent decision making. Informed by our team’s work in applying high-performance computing in support of public health decision makers during the COVID-19 pandemic, we present how Globus technologies are enabling the development of an open science platform for robust epidemic analysis, with the goal of collaborative, secure, distributed, on-demand, and fast time-to-solution analyses to support public health.
The Department of Energy's Integrated Research Infrastructure (IRI)
We will provide an overview of DOE’s IRI initiative as it moves into early implementation, what drives the IRI vision, and the role of DOE in the larger national research ecosystem.
Listen to the keynote address and hear about the latest developments from Rachana Ananthakrishnan and Ian Foster who review the updates to the Globus Platform and Service, and the relevance of Globus to the scientific community as an automation platform to accelerate scientific discovery.
Enhancing Performance with Globus and the Science DMZ
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Extending Globus into a Site-wide Automated Data Infrastructure
The Rosalind Franklin Institute hosts a variety of scientific instruments, which allow us to capture a multifaceted and multilevel view of biological systems, generating around 70 terabytes of data a month. Distributed solutions, such as Globus and Ceph, facilitate storage, access, and transfer of large amounts of data. However, we still must deal with the heterogeneity of the file formats and directory structure at acquisition, which is optimised for fast recording rather than for efficient storage and processing. Our data infrastructure includes local storage at the instruments and workstations, distributed object stores with POSIX and S3 access, remote storage on HPCs, and tape backup. This can pose a challenge in ensuring fast, secure, and efficient data transfer. Globus allows us to handle this heterogeneity, while its Python SDK allows us to automate our data infrastructure using Globus microservices integrated with our data access models. Our data management workflows are becoming increasingly complex and heterogeneous, including desktop PCs, virtual machines, and offsite HPCs, as well as several open-source software tools with different computing and data structure requirements. This complexity demands that data be annotated with enough detail about the experiments and the analysis to ensure efficient and reproducible workflows. This talk explores how we extend Globus into different parts of our data lifecycle to create a secure, scalable, and high-performing automated data infrastructure that can provide FAIR[1,2] data for all our science.
1. https://doi.org/10.1038/sdata.2016.18
2. https://www.go-fair.org/fair-principles
Globus Compute with Integrated Research Infrastructure (IRI) workflows
As part of the DOE Integrated Research Infrastructure (IRI) program, NERSC at Lawrence Berkeley National Lab and ALCF at Argonne National Lab are working closely with General Atomics on accelerating the computing requirements of the DIII-D experiment. As part of the work the team is investigating ways to speed up the time to solution for many different parts of the DIII-D workflow, including how they run jobs on HPC systems. One of these routes is looking at Globus Compute as a way to replace the current method for managing tasks, and we give a brief proof of concept showing how Globus Compute could help to schedule jobs and be a tool to connect compute at different facilities.
Reactive Documents and Computational Pipelines - Bridging the Gap
As scientific discovery and experimentation become increasingly reliant on computational methods, the static nature of traditional publications renders them progressively fragmented and unreproducible. How can workflow automation tools, such as Globus, be leveraged to address these issues and potentially create a new, higher-value form of publication? LivePublication leverages Globus’s custom Action Provider integrations and Compute nodes to capture semantic and provenance information during distributed flow executions. This information is then embedded within an RO-crate and interfaced with a programmatic document, creating a seamless pipeline from instruments, to computation, to publication.
Proactive Vulnerability Detection in Source Code Using Graph Neural Networks:...Ranjan Baisak
As software complexity grows, traditional static analysis tools struggle to detect vulnerabilities with both precision and context—often triggering high false positive rates and developer fatigue. This article explores how Graph Neural Networks (GNNs), when applied to source code representations like Abstract Syntax Trees (ASTs), Control Flow Graphs (CFGs), and Data Flow Graphs (DFGs), can revolutionize vulnerability detection. We break down how GNNs model code semantics more effectively than flat token sequences, and how techniques like attention mechanisms, hybrid graph construction, and feedback loops significantly reduce false positives. With insights from real-world datasets and recent research, this guide shows how to build more reliable, proactive, and interpretable vulnerability detection systems using GNNs.
Scaling GraphRAG: Efficient Knowledge Retrieval for Enterprise AIdanshalev
If we were building a GenAI stack today, we'd start with one question: Can your retrieval system handle multi-hop logic?
Trick question, b/c most can’t. They treat retrieval as nearest-neighbor search.
Today, we discussed scaling #GraphRAG at AWS DevOps Day, and the takeaway is clear: VectorRAG is naive, lacks domain awareness, and can’t handle full dataset retrieval.
GraphRAG builds a knowledge graph from source documents, allowing for a deeper understanding of the data + higher accuracy.
Download Wondershare Filmora Crack [2025] With Latesttahirabibi60507
Copy & Past Link 👉👉
http://drfiles.net/
Wondershare Filmora is a video editing software and app designed for both beginners and experienced users. It's known for its user-friendly interface, drag-and-drop functionality, and a wide range of tools and features for creating and editing videos. Filmora is available on Windows, macOS, iOS (iPhone/iPad), and Android platforms.
Exceptional Behaviors: How Frequently Are They Tested? (AST 2025)Andre Hora
Exceptions allow developers to handle error cases expected to occur infrequently. Ideally, good test suites should test both normal and exceptional behaviors to catch more bugs and avoid regressions. While current research analyzes exceptions that propagate to tests, it does not explore other exceptions that do not reach the tests. In this paper, we provide an empirical study to explore how frequently exceptional behaviors are tested in real-world systems. We consider both exceptions that propagate to tests and the ones that do not reach the tests. For this purpose, we run an instrumented version of test suites, monitor their execution, and collect information about the exceptions raised at runtime. We analyze the test suites of 25 Python systems, covering 5,372 executed methods, 17.9M calls, and 1.4M raised exceptions. We find that 21.4% of the executed methods do raise exceptions at runtime. In methods that raise exceptions, on the median, 1 in 10 calls exercise exceptional behaviors. Close to 80% of the methods that raise exceptions do so infrequently, but about 20% raise exceptions more frequently. Finally, we provide implications for researchers and practitioners. We suggest developing novel tools to support exercising exceptional behaviors and refactoring expensive try/except blocks. We also call attention to the fact that exception-raising behaviors are not necessarily “abnormal” or rare.
Get & Download Wondershare Filmora Crack Latest [2025]saniaaftab72555
Copy & Past Link 👉👉
https://dr-up-community.info/
Wondershare Filmora is a video editing software and app designed for both beginners and experienced users. It's known for its user-friendly interface, drag-and-drop functionality, and a wide range of tools and features for creating and editing videos. Filmora is available on Windows, macOS, iOS (iPhone/iPad), and Android platforms.
2. Globus delivers…
Fast and reliable data transfer, sharing, and platform services… directly from your own storage systems… via software-as-a-service using existing identities, with the overarching goal of...
3. Unifying access to data across tiers
Research computing HPC, desktop workstations, mass storage, instruments, personal resources, public cloud, and national resources.
4. Globus SaaS / PaaS: Research data lifecycle
Step 1: Researcher initiates a transfer request from an instrument or compute facility, or the transfer is requested automatically by a script or science gateway.
Step 2: Globus transfers the files reliably and securely.
Step 3: Researcher selects files to share, selects a user or group, and sets access permissions.
Step 4: Globus controls access to shared files on existing storage; no need to move files to cloud storage!
Step 5: Collaborator logs in to Globus and accesses the shared files from a personal computer; no local account required; download via Globus.
Step 6: The Globus Command Line Interface, API sets, Python SDK and Action Providers give you a platform…
Step 7: … for building science gateways, portals and automations.
Step 8: Streamlining research workflows and ensuring that those who need access to the data have it.
Transfer, Share, Build, Search, Automate: use a web browser or platform services, access any storage, and use an existing identity.
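To make Steps 1 and 2 concrete, here is a minimal sketch using the Globus Python SDK (assuming a recent globus-sdk v3). The client ID, collection UUIDs, and file paths are placeholders for illustration; substitute your own registered app and collections.

import globus_sdk

# Placeholders -- substitute your own native app client ID and collection UUIDs.
CLIENT_ID = "YOUR_NATIVE_APP_CLIENT_ID"
SRC_COLLECTION = "SOURCE_COLLECTION_UUID"
DST_COLLECTION = "DESTINATION_COLLECTION_UUID"

# Log in with an existing identity via the native app flow.
auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow(requested_scopes=globus_sdk.scopes.TransferScopes.all)
print("Please visit:", auth_client.oauth2_get_authorize_url())
code = input("Paste the authorization code here: ").strip()
tokens = auth_client.oauth2_exchange_code_for_tokens(code)
transfer_tokens = tokens.by_resource_server["transfer.api.globus.org"]

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_tokens["access_token"])
)

# Build and submit the transfer; checksum verification and in-transit
# encryption line up with the security features described on the next slide.
task_data = globus_sdk.TransferData(
    source_endpoint=SRC_COLLECTION,
    destination_endpoint=DST_COLLECTION,
    verify_checksum=True,
    encrypt_data=True,
)
task_data.add_item("/share/godata/file1.txt", "/~/file1.txt")
task = tc.submit_transfer(task_data)
print("Submitted transfer, task ID:", task["task_id"])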
5. Globus core security features
• Access control
– Identities provided and managed by the institution
– Globus is the identity broker; no access to or storage of user credentials
– Fine-grained access control
• Data remain at institutions, not stored by Globus
• Data does not flow through the Globus service but directly between endpoints and their collections
• Integrity checks of transferred data
• High availability and redundancy
• Encryption of user files and Globus control data
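The fine-grained access control bullet corresponds to permission (ACL) rules on a guest collection. Below is a minimal sketch of granting a collaborator read-only access, assuming an already authenticated TransferClient as in the previous sketch; the folder path and UUIDs are illustrative placeholders, not fixed values.

import globus_sdk

def share_folder(tc: globus_sdk.TransferClient,
                 guest_collection_id: str,
                 collaborator_identity_id: str) -> str:
    # Grant read-only access to one folder on a guest collection.
    # The data stay on your storage; Globus only stores the permission rule.
    rule = {
        "DATA_TYPE": "access",
        "principal_type": "identity",
        "principal": collaborator_identity_id,
        "path": "/shared-results/",   # illustrative folder on the collection
        "permissions": "r",           # read-only; "rw" would also allow writes
    }
    result = tc.add_endpoint_acl_rule(guest_collection_id, rule)
    return result["access_id"]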
8. Endpoints, Collections and Globus Connect
• Globus Connect Server
– Multi-user Linux systems
– https://docs.globus.org/globus-connect-server/
• Globus Connect Personal
– Personal workstations and laptops
– https://www.globus.org/globus-connect-personal
– OS-specific instructions
o https://docs.globus.org/how-to/
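Once a collection is available through Globus Connect Server or Personal, it can be discovered and browsed from the SDK as well as the web app. A small sketch, again assuming an authenticated TransferClient and a collection your identity is allowed to list:

import globus_sdk

def find_and_list(tc: globus_sdk.TransferClient, keyword: str) -> None:
    # Search for endpoints/collections matching a keyword, then list the
    # root directory of the first match.
    hits = list(tc.endpoint_search(keyword, limit=5))
    for ep in hits:
        print(ep["id"], ep["display_name"])
    if hits:
        for entry in tc.operation_ls(hits[0]["id"], path="/"):
            print(entry["type"], entry["name"])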
10. Manage Protected Data
Higher assurance levels for HIPAA and other regulated data
• Support for protected data such as health-related information
• Share data with collaborators while meeting compliance requirements
• Includes a BAA option
11. Globus for high assurance data management
• Restricted data handling
– PII (Personally Identifiable Information)
– CUI (Controlled Unclassified Information)
– PHI (Protected Health Information)
• University of Chicago security controls
– NIST SP 800-53
– Superset of NIST SP 800-171
• Business Associate Agreements (BAAs) will be between the University of Chicago and our subscribers
– The University of Chicago has a BAA with Amazon
12. High Assurance features
• Additional authentication assurance
– Per-storage-gateway policy on the frequency of authentication with a specific identity for access to data (timeout)
– Ensure that the user authenticates with the specific identity that gives them access within the session (decoupling linked identities)
• Session/device isolation
– Authentication context is per application, per session (~browser session)
• Enforces encryption of all user data in transit
• Audit logging
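In the SDK, these high assurance policies typically surface as API errors that carry authorization_parameters. The sketch below shows one way an application might detect that re-authentication with a specific identity is required; it assumes a recent globus-sdk v3, and the exact fields returned depend on the collection's policy.

import globus_sdk

def submit_with_reauth_check(tc: globus_sdk.TransferClient,
                             task_data: globus_sdk.TransferData):
    # Submit a transfer and surface any session re-authentication
    # requirements imposed by a high assurance collection.
    try:
        return tc.submit_transfer(task_data)
    except globus_sdk.TransferAPIError as err:
        params = err.info.authorization_parameters
        if params:
            # The service indicates which identity must re-authenticate and why;
            # a real app would send the user back through a login flow here.
            print("Re-authentication required:", params.session_message)
            print("Required identities:", params.session_required_identities)
            return None
        raise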