Globus Professional Services

What we do

Globus’s Professional Services team specializes in integrating Globus data management capabilities with research and education applications. Our team members have decades of experience contributing to project plans and grant proposals involving the construction of novel research applications and systems. We most often build web apps or portals, but have also contributed to desktop and supercomputer applications. Using Globus services and APIs, we can supercharge an application’s ability to import, export, share, and automate data. We can also simplify an application’s account management and permissions using Globus’s federated identity and access management (IAM) platform.


Sample Engagements and Use Cases

Portal Development

NIH CFDE

Challenge: Build a data submission system that enables NIH initiatives to declare and maintain their data holdings, so that a combined catalog and web portal can be built on top of them.

What we did: Used Globus Groups to register and authorize data submitters. Deployed Globus Connect Server in AWS for data uploads and retrospective access. Defined a CFDE data ingest flow using Globus Flows. Developed a lightweight command-line interface (CLI) for Globus authentication, data upload, and initiating the ingest flow.
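
As a rough illustration of the ingest flow step, here is a minimal, hypothetical Globus Flows definition. The action URL is the public Globus Transfer action provider; the input names, endpoint references, and paths are placeholders, not CFDE's actual flow.

```python
# A minimal, hypothetical Globus Flows definition for a data ingest flow.
# The ActionUrl is the public Globus Transfer action provider; all input
# parameter names and paths below are illustrative placeholders.
ingest_flow_definition = {
    "StartAt": "TransferSubmission",
    "States": {
        "TransferSubmission": {
            "Type": "Action",
            "ActionUrl": "https://actions.globus.org/transfer/transfer",
            "Parameters": {
                # ".$" keys pull values from the flow's runtime input
                "source_endpoint_id.$": "$.input.source_endpoint_id",
                "destination_endpoint_id.$": "$.input.destination_endpoint_id",
                "transfer_items": [
                    {
                        "source_path.$": "$.input.source_path",
                        "destination_path.$": "$.input.destination_path",
                    }
                ],
            },
            "ResultPath": "$.TransferResult",
            "End": True,
        }
    },
}
```

A real ingest flow would add further states (validation, catalog registration, notification) after the transfer step; the flow is registered once with the Flows service and then started by the CLI for each submission.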

Results: Authorized personnel from eleven national-scale NIH research initiatives are now able to update their data holdings in CFDE’s searchable public web catalog. Biomedical researchers now have a unified catalog to find high-value NIH data relevant to their research.

Globus services used:

  • Auth API
  • GA4GH passports
  • Groups API
  • HTTPS
  • Transfer
  • Flows
  • Globus Compute

Data Publication

DOE Argonne

Challenge: Build a lightweight data portal for publishing datasets generated at Argonne’s Advanced Photon Source (APS). Simplify data publishing for beamline operators.

What we did: Interviewed operators to understand their publication processes. Helped install Globus on local storage systems and obtain allocations at the Argonne Leadership Computing Facility (ALCF) for data processing. Drafted Globus Flows definitions for the data processing, metadata extraction, and publishing steps. Developed the Django Globus Portal Framework to enable discovery, inspection, and transfer of datasets.
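
The publishing step ultimately lands dataset metadata in a Globus Search index, which the portal then queries for discovery. A minimal sketch of an ingest document in Globus Search’s GMetaEntry format follows; the subject URL and content fields are illustrative placeholders, not the actual APS metadata schema.

```python
# Minimal Globus Search ingest document (GMetaEntry format). The subject,
# visibility, and content fields are placeholders, not the APS schema.
gmeta_entry = {
    "ingest_type": "GMetaEntry",
    "ingest_data": {
        # "subject" uniquely identifies the dataset within the index
        "subject": "https://example.org/aps/dataset/0001",
        # which Globus identities (or "public") may see this entry
        "visible_to": ["public"],
        # free-form metadata extracted by the flow's extraction step
        "content": {
            "title": "Example beamline scan",
            "date": "2023-01-01",
            "files": ["scan_0001.h5"],
        },
    },
}
```

A flow’s publish step POSTs a document like this to the Search index; the portal built with the Django Globus Portal Framework renders search results from the same index.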

Results: Argonne APS beamline operators can now use the ALCF for near-real-time dataset processing and publication. Beamline researchers can easily transfer their data to their home institutions or to the next step in the data’s lifecycle.

Globus services used:

  • Auth API
  • Transfer
  • Guest collections
  • SDK
  • HTTPS
  • Flows
  • Globus Compute (formerly funcX)
  • Search API
  • Groups API

Data Automation

NSF ACCESS

Challenge: Enable thousands of researchers at hundreds of campuses to transfer their research data (1 GB–50 TB) to and from ACCESS systems at a dozen supercomputing centers.

What we did: Provided ACCESS system operators with a Globus Connect Server configuration that uses ACCESS’s identity and access management (IAM) system. Worked with campus research computing (RCC) teams to enable Globus on hundreds of campus systems. Wrote ACCESS-specific Globus documentation for public use. Developed and delivered ACCESS-specific Globus training. Provided researchers with funcX configuration files for ACCESS computing systems.
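
Under the hood, each of these transfers is submitted to the Globus Transfer API as a JSON task document (the SDK’s TransferData helper builds it for you). A minimal sketch, with placeholder endpoint UUIDs and paths:

```python
# Sketch of a Globus Transfer "transfer" task document. Endpoint UUIDs,
# the submission ID (obtained from the API), and paths are placeholders.
transfer_document = {
    "DATA_TYPE": "transfer",
    "submission_id": "SUBMISSION-ID-FROM-API",
    "source_endpoint": "SOURCE-COLLECTION-UUID",
    "destination_endpoint": "DESTINATION-COLLECTION-UUID",
    "DATA": [
        {
            "DATA_TYPE": "transfer_item",
            "source_path": "/projects/demo/results.tar",
            "destination_path": "/~/results.tar",
            "recursive": False,
        }
    ],
    # sync level 3: re-transfer only files whose checksums differ,
    # useful for resuming large campus-to-center transfers
    "sync_level": 3,
}
```

Once submitted, the Transfer service manages the task asynchronously, retrying on transient failures, which is what makes multi-terabyte campus-to-ACCESS transfers practical without babysitting.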

Results: Researchers with ACCESS allocations can easily transfer their application data between ACCESS systems, campus systems, and other shared research facilities.

Globus services used:

  • Transfer
  • HTTPS
  • Timers
  • SDK
  • Share

Have a new research project? Consult with us to see how we can help you simplify your research data management, or even accelerate your time to discovery.

Contact our team at outreach@globus.org


Related Content

User Story

Building a portal for the Human BioMolecular Atlas Program

University of Pittsburgh

The University of Pittsburgh together with the Pittsburgh Supercomputing Center are one of five funded components contributing to the infrastructure...