Data Sources

Adding Data Sources ..


Last updated 7 months ago

Data Source Configuration Guide

Before integrating your data sources, it's essential to collect all the necessary configuration details. This guide outlines the key pieces of information required to establish a connection to your data sources. Your database administrator (DBA) will be a valuable resource in providing this configuration information.

  • URI (Uniform Resource Identifier): This unique identifier is used to locate your data source. You'll typically need a username and password to authenticate your connection.

  • Driver: Ensure you have the appropriate driver for your data source. This is crucial for enabling your application to communicate with the database.

  • Credentials: Username and password are fundamental for authentication.

  • Host Name and Port Number: These are required to point your application towards the right server and port where your database is running.

  • Key Store and Trust Store Configuration:

* Key Store Type and Location: Specifies the type (e.g., JKS) and location of the key store file.

* Key Store Password: The password to access the key store.

* Trust Store Type and Location: Defines the type and location of the trust store file.

* Trust Store Password: The password required to access the trust store.

* Cipher Suite: A list of encryption algorithms supported for SSL/TLS connections.

  • Encryption Type:

* Encryption Only: Data is encrypted during transmission but does not require authentication.

* Encryption with Server and Client Authentication: Both the client and server authenticate each other, providing a higher level of security.

  • SSL Configuration: For secure connections, SSL configuration details are essential, including the cipher suite and the key and trust store information.

  • Data Source Type: Identifying the type of data source (e.g., SQL database, NoSQL database, file system) is crucial for selecting the correct driver and configuration settings.

  • Configuration Method: Decide on the approach for configuring your data source: direct credentials, SSL, or a URI.

Gathering this information beforehand will streamline the process of adding your data sources. Remember, your DBA is your go-to person for obtaining most of this configuration detail.
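To make the relationship between these details concrete, here is a minimal sketch of how a host, port, and database name combine into a JDBC-style URI. All values below are placeholders, not real connection details; substitute the values supplied by your DBA.

```shell
# Placeholder values -- substitute the details supplied by your DBA.
DB_HOST="db.example.com"
DB_PORT="5432"
DB_NAME="mydatabase"

# A typical JDBC URI assembled from those parts (PostgreSQL shown here).
JDBC_URI="jdbc:postgresql://${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "$JDBC_URI"
```

The same pattern applies to other database types; only the `jdbc:<type>` prefix and default port change.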

For Amazon Web Services (AWS) data source types, a configuration method isn't specified. You must have information such as AWS region, account number, IAM username, access key ID, and secret access key to configure these data source types.


Accessing Your Catalog

To access your catalog, please follow these steps:

  1. Open the Google Chrome web browser and click the Data Catalog bookmark.

  2. Enter the following email and password, then click Sign In.

  • Username: data_steward@hv.com

  • Password: Welcome123!

Security Advisory: Handling Login Credentials

For enhanced security, it is strongly recommended that users avoid saving their login details directly in web browsers. Browsers may inadvertently autofill these credentials in unrelated fields, posing a security risk.

Best Practice

• Disable Autofill: To mitigate potential risks, users should disable the autofill functionality for login credentials in their browser settings. This preventive measure ensures that sensitive information is not unintentionally exposed or misused.

  1. Click the Management -> Resources tile.

  2. Click Add Data Source.

  3. Specify the following basic information for the connection to your data source:

  • Data Source Name: Specify the name of your data source. This name is used in the Data Catalog interface, so it should be something your Data Catalog users recognize. Names must start with a letter and may contain only letters, digits, and underscores; white space in names is not supported.

  • Data Source ID (Optional): Specify a permanent identifier for your data source. If you leave this field blank, Data Catalog generates a permanent identifier for you. You cannot modify the Data Source ID for this data source after you specify or generate it.

  • Description (Optional): Specify a description of your data source.

  • Data Source Type: Select the database type of your source. You are then prompted to specify additional connection information based on the file system or database type you are trying to access.

After you have specified the basic information, specify the following additional connection information based on the file system or database type you are trying to access.

  • Affinity: This default setting specifies which agents should be associated with the data source in a multi-agent deployment.

  • Configuration Method: Select Credentials or URI as the configuration method.

Configuration Method: Credentials

* Username/Password: Credentials that provide access to the specified database.

* Host: The address of the machine where the database server is running. It can be an IP address or a domain name.

* Port: The port number on which the database server is listening for incoming connections. For example, the default port is 5432 for PostgreSQL, 1433 for Microsoft SQL Server, and 3306 for MySQL.

Configuration Method: URI

* Username/Password: Credentials that provide access to the specified database.

* Service URI: For example, Server=myServerAddress;Database=myDatabase;User Id=myUsername;Password=myPassword;Port=1433;Integrated Security=False;Connection Timeout=30;.

  • Driver: Select an existing driver or upload a new driver to ensure that the communication between the application and the database is efficient, secure, and follows the required standards.

  • Database Name: The name of the database on the server that you want to connect to.
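The Service URI shown above uses a semicolon-separated key=value style. As a sketch, standard shell tools can pull the individual fields back out of such a string; the connection string here is the hypothetical example from the table, not a live endpoint.

```shell
# Hypothetical SQL Server-style connection string from the example above.
CONN="Server=myServerAddress;Database=myDatabase;Port=1433"

# Split on ';' and extract each key=value field.
SERVER=$(printf '%s\n' "$CONN" | tr ';' '\n' | sed -n 's/^Server=//p')
DATABASE=$(printf '%s\n' "$CONN" | tr ';' '\n' | sed -n 's/^Database=//p')
PORT=$(printf '%s\n' "$CONN" | tr ';' '\n' | sed -n 's/^Port=//p')

echo "server=$SERVER database=$DATABASE port=$PORT"
```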


Connect to Demo Data Sources

Follow the steps below to connect to one of the demo datasets. In this workshop we're going to connect to the Synthea dataset, stored on a PostgreSQL database:

  1. To install the 'Synthea' demo datasource, click on the PostgreSQL tab below:

SyntheaTM is an open-source tool that generates synthetic patient data, simulating individuals' complete medical histories. This encompasses medications, allergies, encounters, and social health determinants for each mock patient.

The generated data is free from legal and privacy concerns.

To watch the videos please copy and paste the website URL into your host Chrome browser.

Create a connection to the Synthea dataset, then ingest the database schema.


Follow the steps below to connect and ingest the schema metadata:

Test Connection and Ingest Metadata Schema ..

After you have specified the detailed information according to your data source type, test the connection to the data source and add the data source.

  1. Enter the following details to connect to: PostgreSQL business_apps_db (Synthea) database.

  • Data Source Name: postgresql:synthea

  • Data Source ID: leave blank to autogenerate

  • Description: Demo dataset of patients' medical records

  • Data Source Type: PostgreSQL

  • Affinity: Default

  • Configuration Method: Credentials

  • Username: sqlreader

  • Password: 2Petabytes

  • *Host: pdc.pdc.lab

  • Port: 5432

  • **Driver: postgresql-42.7.1.jar

  • Database Name: business_apps_db

*Enter the server IP address or FQDN.

**PDC does not ship with any database drivers.

  2. Click Test Connection to test your connection to the specified data source.

  3. Check the workers for any issues.

Before completing and saving your new data source setup, you must run the Ingest Schemas process. This step imports the database schema and associated metadata into the system.

  4. Click Ingest Schema, select the 'synthea' schema, and then click Ingest Schemas.

While you have the option to select all schemas, it is advisable to exclude system-related schemas that are not relevant to your requirements.

  5. (Optional) Enter a Note for any information you need to share with others who might access this data source.

  6. Click Create Data Source to establish your data source connection.

PDC does not ship with JDBC drivers. You will need to download the required driver from the vendor site.
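As a sketch, the PostgreSQL JDBC driver used in this workshop can be fetched from the vendor's site. The URL pattern below is an assumption based on jdbc.postgresql.org's published layout; verify the current version on the vendor site before downloading.

```shell
# Assumed vendor URL pattern -- confirm on https://jdbc.postgresql.org first.
DRIVER_VERSION="42.7.1"
DRIVER_URL="https://jdbc.postgresql.org/download/postgresql-${DRIVER_VERSION}.jar"

# Download into the workshop's driver folder (uncomment to run):
# mkdir -p ~/Workshop--Pentaho-Data-Catalog/Database-Drivers
# curl -fL -o ~/Workshop--Pentaho-Data-Catalog/Database-Drivers/postgresql-${DRIVER_VERSION}.jar "$DRIVER_URL"
echo "$DRIVER_URL"
```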

To upload JDBC drivers

  1. Click Manage Drivers.

  2. Click Add New.

  3. Select Database type: POSTGRES.

The postgresql-42.7.1.jar driver is located at:

~/Workshop--Pentaho-Data-Catalog/Database-Drivers

  4. Click Add Driver.

Install & Configure pgAdmin4

pgAdmin is an open-source administration and development platform for PostgreSQL.

  1. Ensure all the existing packages are up-to-date.

sudo apt update && sudo apt upgrade -y
  2. Install the public key for the pgAdmin4 repository.

curl -fsS https://www.pgadmin.org/static/packages_pgadmin_org.pub | sudo gpg --dearmor -o /usr/share/keyrings/packages-pgadmin-org.gpg
  3. Create the repository configuration file.

sudo sh -c 'echo "deb [signed-by=/usr/share/keyrings/packages-pgadmin-org.gpg] https://ftp.postgresql.org/pub/pgadmin/pgadmin4/apt/$(lsb_release -cs) pgadmin4 main" > /etc/apt/sources.list.d/pgadmin4.list && apt update'
  4. Choose your preferred mode for the pgAdmin4 installation.

Recommended to install desktop mode only.

• For both desktop and web modes:

sudo apt install pgadmin4

• For desktop mode only:

sudo apt install pgadmin4-desktop

• For web mode only:

sudo apt install pgadmin4-web

Connect to Synthea database

  1. Start pgAdmin desktop.

  2. Click the Add New Server button and enter the information for your remote server.

Field
Setting

Name

Synthea

Host name

localhost

Port

5432

Username

sqlreader

Password

2Petabytes

  3. View the data in the synthea schema.
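If you prefer the command line to pgAdmin, the same connection can be checked with the psql client. This is a sketch only: it assumes the psql client is installed and reuses the lab credentials from the table above.

```shell
# Connection settings from the table above (lab values).
PGHOST="localhost"; PGPORT="5432"; PGDATABASE="business_apps_db"; PGUSER="sqlreader"
export PGHOST PGPORT PGDATABASE PGUSER

# With psql installed, query the synthea schema directly (uncomment to run):
# PGPASSWORD=2Petabytes psql -c 'SELECT count(*) FROM synthea.patients;'
echo "$PGHOST:$PGPORT/$PGDATABASE"
```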

DBeaver Community is a free cross-platform database tool for developers, database administrators, analysts, and everyone working with data. It supports all popular SQL databases like MySQL, MariaDB, PostgreSQL, SQLite, Apache Family, and more.

  1. The easiest way to install DBeaver CE is to use Snap.

sudo snap install dbeaver-ce

To create a connection to the Synthea PostgreSQL database:

  1. Select the PostgreSQL database and click Next.

  2. Enter the following connection details:

  • Connect by URL: jdbc:postgresql://pdc.pentaho.example:5432/business_apps_db

  • Username: sqlreader

  • Password: 2Petabytes

  3. Click Test Connection and, when prompted, download driver version 42.7.2.

  4. Click Finish.

  5. Click OK.

  6. Expand Databases -> Schemas.

AdventureWorks database supports standard online transaction processing scenarios for a fictitious bicycle manufacturer - Adventure Works Cycles.

Scenarios include:

  • Human Resources - HumanResources

  • Contact Information - Person

  • Manufacturing - Production

  • Purchasing - Purchasing

  • Sales - Sales

To watch the videos please copy and paste the website URL into your host Chrome browser.

Create a connection to the AdventureWorks2019 dataset, then ingest the schema.


Follow the steps below to connect and ingest the schema metadata:

Test Connection and Ingest Metadata Schema ..

After you have specified the detailed information according to your data source type, test the connection to the data source and add the data source.

  1. Enter the following details to connect to: MSSQL AdventureWorks2019 database.

  • Data Source Name: mssql:adventureworks2019

  • Data Source ID: leave blank to autogenerate

  • Description: Demo dataset of a fictitious bicycle manufacturer

  • Data Source Type: Microsoft SQL Server

  • Affinity: Default

  • Configuration Method: Credentials

  • Username: sqlreader

  • Password: 2Petabytes

  • Host: pdc.pdc.lab

  • Port: 1433

  • Driver: mssql-jdbc-9.2.1.jre15.jar

  • Database Name: AdventureWorks2019

  2. Click Test Connection to test your connection to the specified data source.

Before completing and saving your new data source setup, you must run the Ingest Schemas process. This step imports the database schema and associated metadata into the system.

  3. Click Ingest Schema, select the following 5 schemas, and then click Ingest Schemas.

While you have the option to select all schemas, it is advisable to exclude system-related schemas that are not relevant to your requirements.

  4. (Optional) Enter a Note for any information you need to share with others who might access this data source.

  5. Click Create Data Source to establish your data source connection.

On Linux, you can access the MSSQL AdventureWorks2019 database with Azure Data Studio.

  1. Ensure all the existing packages are up-to-date.

sudo apt update && sudo apt upgrade
  2. Install the required dependencies.

sudo apt install libunwind8
  3. Install the .deb package downloaded from the official website.

cd ~
sudo dpkg -i ./Downloads/azuredatastudio-linux-<version string>.deb

Connect to AdventureWorks2019 database

  1. Start Azure Data Studio.

azuredatastudio
  2. Select Connections (the first icon in the left menu).

  3. Select SQL Login.

  4. Enter the following details:

  • Connection type: Microsoft SQL Server

  • Input type: Parameters

  • *Server: localhost,1433

  • Authentication type: SQL Login

  • User name: sqlreader

  • Password: 2Petabytes

  • Database: AdventureWorks2019

  • Encrypt: Mandatory (True)

  • Trust server certificate: True

  • Server group: <Default>

  • Name (optional): AdventureWorks2019

*Enter server IP address or FQDN.


  5. Click Connect.
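Note the Server field's host,port syntax, which is specific to Microsoft tools. A small sketch of splitting that value, useful when scripting connections; the value is the lab default from the table above.

```shell
# Azure Data Studio style "host,port" server value.
SERVER="localhost,1433"
ADS_HOST="${SERVER%,*}"   # text before the comma
ADS_PORT="${SERVER#*,}"   # text after the comma
echo "$ADS_HOST $ADS_PORT"
```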


Arlojet database is an airline demo dataset. You can query the data based on:

  • Passengers

  • Ticketing

  • Weather

  • Aircraft

  • Catering

To watch the videos please copy and paste the website URL into your host Chrome browser.

Create a connection to the Arlojet dataset, then ingest the schema.



Follow the steps below to connect and ingest the schema metadata:

Test Connection and Ingest Metadata Schema ..

After you have specified the detailed information according to your data source type, test the connection to the data source and add the data source.

  1. Enter the following details to connect to: MySQL arlojet database.

  • Data Source Name: mysql:arlojet

  • Data Source ID: leave blank to autogenerate

  • Description: Demo dataset of airline / passenger data

  • Data Source Type: MySQL

  • Affinity: Default

  • Configuration Method: Credentials

  • Username: sqlreader

  • Password: 2Petabytes

  • Host: pdc.pdc.lab

  • Port: 3306

  • Driver: mysql-connector-j-8.2.0.jar

  2. Click Test Connection to test your connection to the specified data source.

Before you finalize and save your new data source configuration, you must run the Ingest Schemas process. This process loads fundamental information about the database schema and related metadata into the system.

  3. Click Ingest Schema, select the 'arlojet' schema, and then click Ingest Schemas.

Although you can select all schemas, it is recommended to exclude system-related schemas that are unnecessary for your needs.

  4. (Optional) Enter a Note for any information you need to share with others who might access this data source.

  5. Click Create Data Source to establish your data source connection.

Install & Configure MySQL Workbench

MySQL Workbench is a graphical MySQL database management tool.

  1. Ensure all the existing packages are up-to-date.

sudo apt update && sudo apt upgrade
  2. Install MySQL Workbench.

sudo snap install mysql-workbench-community

Connect to MySQL Workbench

  1. Select “Applications” from the menu.

  2. Search for the MySQL workbench application, and then launch it.

  3. Edit the default connection.

  1. Enter the following connection details:

  • Connection Name: arlojet

  • Username: sqlreader

  • Password: 2Petabytes

  • Default Schema: arlojet

  5. Click Test Connection.

  6. Click Close.


Connect to Arlojet database

  1. Check for the arlojet database.

  2. Select the Schemas option and expand Tables.
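The same check works from the mysql command-line client. This is a sketch only: it assumes the mysql client is installed and reuses the lab connection settings from this section.

```shell
# Connection settings from this section (lab values).
MYSQL_HOST="pdc.pdc.lab"; MYSQL_PORT="3306"; MYSQL_USER="sqlreader"; MYSQL_SCHEMA="arlojet"

# With the mysql client installed, list the schema's tables (uncomment to run):
# mysql -h "$MYSQL_HOST" -P "$MYSQL_PORT" -u "$MYSQL_USER" -p -e "SHOW TABLES IN $MYSQL_SCHEMA;"
echo "$MYSQL_HOST:$MYSQL_PORT/$MYSQL_SCHEMA"
```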

MinIO is a high-performance, Kubernetes-native object storage service that is designed for cloud-native and containerized applications. It is open-source and allows enterprises to build Amazon S3-compatible data storage solutions on-premises, integrating smoothly with a wide range of cloud-native ecosystems.

The object store hosts the following demo datasets:

  • Banking - chat bot data

  • Football

  • IoT Sensor

To watch the videos please copy and paste the website URL into your host Chrome browser.

Create a connection to the datasets in the MinIO object store, then scan the data.



Follow the steps below to connect and ingest the schema metadata:

Test Connection and Ingest Metadata Schema ..

After you have specified the detailed information according to your data source type, test the connection to the data source and add the data source.

  • Affinity: This default setting specifies which agents should be associated with the data source in a multi-agent deployment.

  • Region: The geographical location where AWS maintains a cluster of data centers.

  • Endpoint: The location of the bucket. For example, s3.<region containing S3 bucket>.amazonaws.com.

  • Access Key: The user credential used to access data in the bucket.

  • Secret Key: The password credential used to access data in the bucket.

  • Bucket Name: The name of the S3 bucket in which the data resides. For S3 access from non-EMR file systems, Data Catalog uses the AWS command-line interface to access S3 data; these commands send requests using access keys, which consist of an access key ID and a secret access key. For HDFS file systems, you must specify the logical name of the cluster root, defined by dfs.nameservices in the hdfs-site.xml configuration file. For MapR file systems, you must identify the root of the file system with maprfs:///.

  • Path: The directory where this data source is included.

  1. Enter the following details to connect to the MinIO 'IoT Sensor' object store.

  • Data Source Name: minIO:sensor

  • Data Source ID: leave blank to autogenerate

  • Description: Demo IoT sensor data

  • Data Source Type: AWS S3

  • Affinity: Default

  • Region: us-east-1

  • Endpoint:

  • Access Key: minioadmin

  • Secret Key: minioadmin

  • Bucket Name: iot-sensors-data-lake

  • Path: /
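Outside the catalog UI, the same bucket can be inspected with the AWS CLI pointed at the MinIO endpoint. This is a sketch: the endpoint address is an assumption for this lab environment, and it assumes the AWS CLI is installed; the demo keys come from the table above.

```shell
# Assumed lab endpoint -- check your environment's MinIO address.
ENDPOINT="http://172.17.0.1:9000"
BUCKET="iot-sensors-data-lake"

# With the AWS CLI installed, list the bucket contents (uncomment to run):
# AWS_ACCESS_KEY_ID=minioadmin AWS_SECRET_ACCESS_KEY=minioadmin \
#   aws --endpoint-url "$ENDPOINT" s3 ls "s3://${BUCKET}/" --recursive
echo "s3://${BUCKET}/"
```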

  2. Click Test Connection to test your connection to the specified data source.

Before you finalize and save your new data source configuration, you must run a scan of the data source. This process loads metadata and related information into the system.

  3. Click Scan Files.

Because this is a flat-file data source, a Lite scan retrieves only the file metadata.

  4. (Optional) Enter a Note for any information you need to share with others who might access this data source.

  5. Click Create Data Source to establish your data source connection.

MinIO

The MinIO Console displays a login screen for unauthenticated users. The Console defaults to a username and password prompt for a MinIO-managed user.

  1. Either click the bookmark or enter the MinIO Console URL, then log in with the following credentials.

  • Username: minioadmin

  • Password: minioadmin


Managing Objects

The Object Browser lists the buckets and objects the authenticated user has access to on the deployment.

After logging in or navigating to the tab, the object browser displays a list of the user’s buckets, which the user can filter.

  1. Select 'Buckets' from the left hand menu.

  2. Browse 'banking-data' bucket to show a list of objects in the bucket.

  3. Highlight banking77.csv.

The user can perform actions on the bucket’s objects, depending on the policies and permissions that apply. Example actions the user may be able to perform include:

• Rewind to a previous version

• Create prefixes

• View deleted objects

• Upload objects

• Download objects

• Share

• Preview

• Manage legal holds

• Manage retention

• Manage tags

• Inspect

• Display versions

• Delete

