TPG Open Data Connector

Connector Details

  • Type: Virtual machines, Single VM, BYOL
  • Runs on: Google Compute Engine
  • Last Update: 24 October, 2024
  • Category:

Overview

The TPG Open Data Connector integrates with the TPG Open Data API, providing access to public transport datasets from Transports Publics Genevois (TPG) in Geneva, Switzerland. The connector acts as a proxy that streamlines data retrieval, supporting actions for querying catalogs, datasets, and records, and for exporting data in formats such as JSON, CSV, Parquet, GPX, and DCAT. It leverages ODSQL for filtering, sorting, and pagination, enabling transport-related applications to work with real-time and historical data.
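
The code sketches in the action sections below illustrate the request pattern with plain Python. They are minimal sketches, not a documented interface: they assume the connector simply forwards HTTP GET requests to Opendatasoft-style Explore endpoints under the base URL configured in CONNECTOR_ENV_TPGOPENDATA_BASE_URL, and the endpoint paths, parameter handling, and response parsing shown are assumptions made for illustration. The example below uses the listExportFormats action as a simple connectivity check.

  import os
  import requests

  # Assumption: the connector proxies GET requests to the TPG Open Data API
  # under the base URL configured via this environment variable.
  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")

  # Connectivity check: list the export formats available for the whole catalog
  # (the listExportFormats action; "/catalog/exports" is an assumed proxy path).
  response = requests.get(f"{base_url}/catalog/exports", timeout=30)
  response.raise_for_status()
  for link in response.json().get("links", []):
      print(link)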

Integration Overview

This document provides a detailed guide to each integration point of the TPG Open Data Connector, covering its purpose, configuration, and workflow support.

Supported Integration Action Points

  • getDatasets: Query catalog datasets.
  • listExportFormats: List export formats for the catalog.
  • exportDatasets: Export a catalog in the desired format.
  • exportCatalogCSV: Export a catalog in CSV.
  • exportCatalogDCAT: Export a catalog in RDF/XML (DCAT).
  • getDatasetsFacets: List facet values for datasets.
  • getRecords: Query dataset records.
  • listDatasetExportFormats: List export formats for a specific dataset.
  • exportRecords: Export a dataset in the desired format.
  • exportRecordsCSV: Export a dataset in CSV.
  • exportRecordsParquet: Export a dataset in Parquet.
  • exportRecordsGPX: Export a dataset in GPX.
  • getDataset: Show dataset information.

Detailed Integration Documentation

2.1 Get Datasets

Action getDatasets
Purpose Retrieves available datasets from the TPG catalog, optionally filtered and sorted using parameters. Useful for accessing transport datasets for applications such as route planning or mobility analytics.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required: None
  • Optional:
    • select (string) – Fields to return (e.g., dataset_id,title).
    • where (string) – ODSQL condition (e.g., records_count > 1000).
    • order_by (string) – Sorting (e.g., records_count DESC).
    • limit (integer, default: 10, max: 100) – Number of results.
    • offset (integer, default: 0) – Pagination offset.
    • refine (string) – Refine by facet (e.g., publisher:"TPG").
    • exclude (string) – Exclude by facet.
    • lang (string, default: en) – Language for labels.
    • timezone (string, default: UTC) – Timezone for date fields.
    • group_by (string) – Group results.
    • include_links (boolean, default: true) – Include navigation links.
    • include_app_metas (boolean, default: false) – Include application metadata.
Output
  • Successful: Returns:
    • links – Array of navigation links.
    • datasets – Array of dataset objects (including metadata and fields).
  • Failure: Returns an error object (e.g., message, error_code: ODSQLError).
Workflow Example
  • Configure the connector with the base URL.
  • Execute getDatasets with where="records_count > 1000" and limit=20.
  • Process the response to list transport datasets.
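
A minimal Python sketch of the workflow just described. It assumes getDatasets maps to a GET on an Opendatasoft-style /catalog/datasets path under the configured base URL; the path and response handling are illustrative assumptions.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")

  # getDatasets: datasets with more than 1000 records, 20 per page.
  response = requests.get(
      f"{base_url}/catalog/datasets",              # assumed proxy path for getDatasets
      params={"where": "records_count > 1000", "limit": 20, "offset": 0},
      timeout=30,
  )
  response.raise_for_status()
  payload = response.json()

  # The documented response carries "links" and "datasets"; the exact nesting of
  # each dataset entry depends on the upstream API, so list the entries as-is.
  print("datasets returned:", len(payload.get("datasets", [])))
  for entry in payload.get("datasets", []):
      print(entry)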

2.2 List Export Formats

Action listExportFormats
Purpose Lists available export formats for the TPG catalog, allowing users to determine supported output types for bulk data export.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required: None
  • Optional: None
Output
  • Successful: Returns:
    • links – Array of export format links.
  • Failure: Returns an error object (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute listExportFormats.
  • Review available export formats.
  • Use selected format in subsequent export operations.

2.3 Export Datasets

Action exportDatasets
Purpose Exports datasets from the TPG catalog in a selected format, enabling bulk data acquisition for transport analysis or reporting.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • format (string) – Export format (e.g., csv, json).
  • Optional:
    • select (string) – Fields to return.
    • where (string) – Filtering condition.
    • order_by (string) – Sorting.
    • group_by (string) – Group results.
    • limit (integer) – Number of results.
    • offset (integer, default: 0) – Pagination offset.
    • refine (string) – Refine on facet.
    • exclude (string) – Exclude on facet.
    • lang (string, default: en) – Language.
    • timezone (string, default: UTC) – Timezone.
Output
  • Successful: Returns a downloadable file.
  • Failure: Returns an error object (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportDatasets with format=json and where="records_count > 1000".
  • Download and store the exported file.
  • Use the file for transport reports and analytics.
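
A sketch of the export step, assuming exportDatasets maps to a GET on /catalog/exports/{format} (an assumed proxy path) and that the response body is the exported file.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")

  # exportDatasets: export the filtered catalog as JSON and store it locally.
  response = requests.get(
      f"{base_url}/catalog/exports/json",          # assumed proxy path for exportDatasets
      params={"where": "records_count > 1000"},
      timeout=120,
  )
  response.raise_for_status()
  with open("tpg_catalog_export.json", "wb") as fh:
      fh.write(response.content)
  print("saved", len(response.content), "bytes")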

2.4 Export Catalog CSV

Action exportCatalogCSV
Purpose Exports the TPG catalog in CSV format, with specific configuration options for delimiters and quoting. Supports customized CSV exports for downstream data processing and import into spreadsheet applications.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required: None
  • Optional:
    • delimiter (string, enum: [;, ,, \t, |], default: ;) – Field delimiter.
    • list_separator (string, default: ,) – Separator for multivalued strings.
    • quote_all (boolean, default: false) – Force quoting of all string fields.
    • with_bom (boolean, default: true) – Include Unicode BOM for compatibility.
Output
  • Successful: Returns a downloadable CSV file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportCatalogCSV with optional delimiter=,.
  • Save the returned CSV file.
  • Import into spreadsheet applications for analysis or reporting.
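
A sketch of the CSV export with custom delimiter and quoting options, assuming exportCatalogCSV maps to a GET on /catalog/exports/csv (an assumed proxy path); parsing with the standard csv module is only a convenience for previewing the result.

  import csv
  import io
  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")

  # exportCatalogCSV: comma-delimited, no BOM, quote every string field.
  response = requests.get(
      f"{base_url}/catalog/exports/csv",           # assumed proxy path for exportCatalogCSV
      params={"delimiter": ",", "quote_all": "true", "with_bom": "false"},
      timeout=120,
  )
  response.raise_for_status()

  # Preview the first few rows before handing the file to a spreadsheet tool.
  reader = csv.DictReader(io.StringIO(response.content.decode("utf-8")), delimiter=",")
  for i, row in enumerate(reader):
      print(row)
      if i == 2:
          break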

2.5 Export Catalog DCAT

Action exportCatalogDCAT
Purpose Exports the TPG catalog described with the DCAT vocabulary (RDF/XML), enabling metadata interchange and catalog integration using standard linked-data vocabularies.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dcat_ap_format (string) – DCAT format variant (e.g., -ap).
  • Optional:
    • include_exports (string) – Comma-separated dataset export formats to include (e.g., csv,json,geojson).
    • use_labels_in_exports (boolean, default: true) – Use human-readable field labels in exports.
Output
  • Successful: Returns a downloadable RDF/XML (DCAT) file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportCatalogDCAT with optional include_exports=csv,json.
  • Save the returned RDF/XML file.
  • Use the metadata for catalog integration or semantic web consumption.

2.6 Get Datasets Facets

Action getDatasetsFacets
Purpose Enumerates facet values for TPG datasets (publisher, topic, etc.), optionally refined by parameters. Useful to implement guided navigation and filters in large result sets.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required: None
  • Optional:
    • facet (string) – Facets to enumerate (e.g., publisher).
    • refine (string) – Refine by facet (e.g., publisher:"TPG").
    • exclude (string) – Exclude by facet (e.g., publisher:"TPG").
    • where (string) – ODSQL condition (e.g., records_count > 1000).
    • timezone (string, default: UTC) – Timezone for date-based facets.
Output
  • Successful: Returns a JSON object with:
    • links – Array of navigation links.
    • facets – Array of facet enumerations (each with name, and facets array containing count, state, name, value).
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute getDatasetsFacets with facet=publisher.
  • Process the response to display facet values and counts.
  • Use facets to refine transport dataset searches in the UI.
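
A sketch of facet enumeration for building UI filters, assuming getDatasetsFacets maps to a GET on /catalog/facets (an assumed proxy path); the response fields used below follow the output description above.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")

  # getDatasetsFacets: enumerate publisher values and their counts.
  response = requests.get(
      f"{base_url}/catalog/facets",                # assumed proxy path for getDatasetsFacets
      params={"facet": "publisher"},
      timeout=30,
  )
  response.raise_for_status()

  for facet in response.json().get("facets", []):
      print(facet.get("name"))
      for value in facet.get("facets", []):
          print("  ", value.get("name"), "-", value.get("count"))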

2.7 Get Records

Action getRecords
Purpose Performs a query on TPG dataset records, optionally filtered and sorted. This helps users retrieve specific transport data records from a dataset.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional:
    • select (string, e.g., name,population)
    • where (string, e.g., population > 1000000)
    • group_by (string)
    • order_by (string, e.g., population DESC)
    • limit (integer, default: 10, max: 100)
    • offset (integer, default: 0)
    • refine (string)
    • exclude (string)
    • lang (string, default: en)
    • timezone (string, default: UTC)
    • include_links (boolean, default: true)
    • include_app_metas (boolean, default: false)
Output
  • Successful: Returns a JSON object with:
    • total_count – Total number of records (integer).
    • results – Array of record objects.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute the getRecords action with dataset_id=geonames-all-cities-with-a-population-1000 and optional where="population > 1000000".
  • Review the response to obtain record data.
  • Use the records for transport route analysis.
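
A sketch of a record query, assuming getRecords maps to a GET on /catalog/datasets/{dataset_id}/records (an assumed proxy path); total_count and results follow the output description above.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")
  dataset_id = "geonames-all-cities-with-a-population-1000"

  # getRecords: cities above one million inhabitants, name and population only.
  response = requests.get(
      f"{base_url}/catalog/datasets/{dataset_id}/records",   # assumed proxy path
      params={"where": "population > 1000000",
              "select": "name,population",
              "order_by": "population DESC",
              "limit": 20},
      timeout=30,
  )
  response.raise_for_status()
  data = response.json()
  print("total matching records:", data.get("total_count"))
  for record in data.get("results", []):
      print(record)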

2.8 List Dataset Export Formats

Action listDatasetExportFormats
Purpose Lists available export formats for a specific TPG dataset. This helps users identify supported formats for dataset exports.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional: None
Output
  • Successful: Returns a JSON object with:
    • links – Array of export format links.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute the listDatasetExportFormats action with dataset_id=geonames-all-cities-with-a-population-1000.
  • Review the response to identify available export formats.
  • Use the formats for dataset exports.

2.9 Export Records

Action exportRecords
Purpose Exports a TPG dataset in the desired format, optionally filtered using various parameters. This enables bulk data retrieval from a dataset in a specified format.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
    • format (string, e.g., csv, json)
  • Optional:
    • select (string, e.g., name,population)
    • where (string, e.g., population > 1000000)
    • order_by (string, e.g., population DESC)
    • group_by (string)
    • limit (integer)
    • refine (string)
    • exclude (string)
    • lang (string, default: en)
    • timezone (string, default: UTC)
    • use_labels (boolean, default: false)
    • compressed (boolean, default: false)
    • epsg (integer, e.g., 4326)
Output
  • Successful: Returns a file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute the exportRecords action with dataset_id=geonames-all-cities-with-a-population-1000, format=json, and optional where="population > 1000000".
  • Save the exported file.
  • Use the data for further processing.
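
A sketch of a filtered dataset export, assuming exportRecords maps to a GET on /catalog/datasets/{dataset_id}/exports/{format} (an assumed proxy path).

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")
  dataset_id = "geonames-all-cities-with-a-population-1000"

  # exportRecords: export matching records as JSON and store them locally.
  response = requests.get(
      f"{base_url}/catalog/datasets/{dataset_id}/exports/json",   # assumed proxy path
      params={"where": "population > 1000000"},
      timeout=120,
  )
  response.raise_for_status()
  with open(f"{dataset_id}.json", "wb") as fh:
      fh.write(response.content)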

2.10 Export Records CSV

Action exportRecordsCSV
Purpose Exports a TPG dataset in CSV format, with specific configuration options for delimiters and quoting. Supports customized CSV exports for dataset records.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional:
    • delimiter (string, enum: [;, ,, \t, |], default: ;)
    • list_separator (string, default: ,)
    • quote_all (boolean, default: false)
    • with_bom (boolean, default: true)
Output
  • Successful: Returns a file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportRecordsCSV with dataset_id=geonames-all-cities-with-a-population-1000 and optional delimiter=,.
  • Save the CSV file.
  • Import the data into analysis tools.

2.11 Export Records Parquet

Action exportRecordsParquet
Purpose Exports a TPG dataset in Parquet format, with options for compression. Supports efficient storage and querying of large transport datasets.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional:
    • parquet_compression (string, enum: [snappy, zstd], default: snappy)
Output
  • Successful: Returns a file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportRecordsParquet with dataset_id=geonames-all-cities-with-a-population-1000 and optional parquet_compression=zstd.
  • Save the Parquet file.
  • Use the data in big data processing frameworks.
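
A sketch of a Parquet export with zstd compression, assuming exportRecordsParquet maps to a GET on /catalog/datasets/{dataset_id}/exports/parquet (an assumed proxy path).

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")
  dataset_id = "geonames-all-cities-with-a-population-1000"

  # exportRecordsParquet: request zstd-compressed Parquet and save it to disk.
  response = requests.get(
      f"{base_url}/catalog/datasets/{dataset_id}/exports/parquet",   # assumed proxy path
      params={"parquet_compression": "zstd"},
      timeout=120,
  )
  response.raise_for_status()
  with open(f"{dataset_id}.parquet", "wb") as fh:
      fh.write(response.content)
  # The file can then be loaded with any Parquet-aware tool, for example
  # pandas.read_parquet(...) if pandas and pyarrow are installed.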

2.12 Export Records GPX

Action exportRecordsGPX
Purpose Exports a TPG dataset in GPX format, with options for name and description fields. Supports geospatial data export for GPS and transport route applications.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional:
    • name_field (string) – Field for GPX name attribute.
    • description_field_list (string) – Fields for GPX description.
    • use_extension (boolean, default: true) – Use extension tag for attributes.
Output
  • Successful: Returns a file.
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute exportRecordsGPX with dataset_id=geonames-all-cities-with-a-population-1000 and optional name_field=name.
  • Save the GPX file.
  • Use the data in mapping or navigation tools.

2.13 Get Dataset

Action getDataset
Purpose Returns information about a specific TPG dataset, including metadata and endpoints. Provides detailed dataset metadata for transport application integration.
Configuration Ensure the connector is configured with the base URL via the CONNECTOR_ENV_TPGOPENDATA_BASE_URL environment variable.
Parameters
  • Required:
    • dataset_id (string, e.g., geonames-all-cities-with-a-population-1000)
  • Optional:
    • select (string, e.g., metas,title)
    • lang (string, default: en)
    • timezone (string, default: UTC)
    • include_links (boolean, default: true)
    • include_app_metas (boolean, default: false)
Output
  • Successful: Returns a JSON object with:
    • dataset_id – Dataset identifier (string).
    • metas – Object with default metadata (records_count, modified, etc.).
    • fields – Array of field objects (label, type, name, etc.).
    • features – Array of features (e.g., analyze, geo).
  • Failure: Returns error details (e.g., message, error_code: ODSQLError).
Workflow Example
  • Execute getDataset action with dataset_id=geonames-all-cities-with-a-population-1000.
  • Process the response to obtain dataset metadata.
  • Use the information to configure data queries.
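
A sketch of a metadata lookup, assuming getDataset maps to a GET on /catalog/datasets/{dataset_id} (an assumed proxy path); the keys used below follow the output description above, though the exact nesting of metas may vary.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")
  dataset_id = "geonames-all-cities-with-a-population-1000"

  # getDataset: fetch dataset metadata, fields, and features.
  response = requests.get(f"{base_url}/catalog/datasets/{dataset_id}", timeout=30)
  response.raise_for_status()
  info = response.json()

  print("dataset:", info.get("dataset_id"))
  print("metas:", info.get("metas", {}))           # default metadata: records_count, modified, ...
  print("features:", info.get("features", []))
  for field in info.get("fields", []):
      print("  field:", field.get("name"), "-", field.get("type"))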

Workflow Creation with the Connector

Example Workflow: Public Transport Route Planning

Retrieve Datasets
  • Use the getDatasets action with optional refine=publisher:"TPG" to fetch a list of available transport datasets.
  • Identify target datasets (e.g., geonames-all-cities-with-a-population-1000).
Query Dataset Information
  • Execute the getDataset action with dataset_id=geonames-all-cities-with-a-population-1000 to fetch dataset metadata.
  • Process the response to understand fields and features.
Fetch Records
  • Use the getRecords action with dataset_id=geonames-all-cities-with-a-population-1000 and optional where="population > 1000000" to retrieve records.
  • Display the records in a user interface.
Perform Data Export
  • Use the exportRecords action with dataset_id=geonames-all-cities-with-a-population-1000, format=csv, and optional where="population > 1000000" to export data.
  • Integrate the exported data into route planning or archiving applications.

This workflow enables applications to provide users with accurate transport dataset exploration, querying, and export capabilities, enhancing mobility planning and decision-making in Geneva.
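
An end-to-end sketch of this workflow in Python, combining the four steps above. As in the earlier sketches, the endpoint paths are assumed Opendatasoft-style proxy paths under the configured base URL, not a documented interface.

  import os
  import requests

  base_url = os.environ["CONNECTOR_ENV_TPGOPENDATA_BASE_URL"].rstrip("/")
  dataset_id = "geonames-all-cities-with-a-population-1000"   # example dataset from this guide

  # 1. Retrieve datasets published by TPG (getDatasets).
  catalog = requests.get(f"{base_url}/catalog/datasets",
                         params={"refine": 'publisher:"TPG"', "limit": 20}, timeout=30)
  catalog.raise_for_status()

  # 2. Inspect the chosen dataset's metadata and fields (getDataset).
  info = requests.get(f"{base_url}/catalog/datasets/{dataset_id}", timeout=30).json()
  print("fields:", [f.get("name") for f in info.get("fields", [])])

  # 3. Fetch matching records for display (getRecords).
  records = requests.get(f"{base_url}/catalog/datasets/{dataset_id}/records",
                         params={"where": "population > 1000000", "limit": 100},
                         timeout=30).json()
  print("records matched:", records.get("total_count"))

  # 4. Export the same selection as CSV for downstream tools (exportRecords).
  export = requests.get(f"{base_url}/catalog/datasets/{dataset_id}/exports/csv",
                        params={"where": "population > 1000000"}, timeout=120)
  export.raise_for_status()
  with open(f"{dataset_id}.csv", "wb") as fh:
      fh.write(export.content)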

Pricing

Request a Quote

Support

For technical support, please contact us at:

custom-connectors-support@isolutions.sa
