How do I set up data connectors?
November 8, 2018 12:40AM
I need to set up connectors for Sharepoint, Hana, SFTP, Hive, and Oracle. What is the process? What information will I require? How do I go about this?
November 8, 2018 1:09AM
edited November 8, 2018 6:29PM
For on-premise deployments, your Paxata Administrator will provision any data connectors required. For those accessing our cloud version, you can work with your Customer Success Manager to get any data connectors provisioned. If you are not sure who that is, you can send an email to:
Once a data connector is provisioned, you can use it to connect to one or many different systems by setting up a data source configuration file. To configure your connector, you'll need to gather the following information:
- What is the hostname of the server we are connecting to?
- What is the port that will be listening to the request?
- Is there a server name, database name, or some other name we need?
- Are there specific folders, schemas, directories, or buckets that we should directly access?
- What user account & password will be used for authentication purposes?
- Who can access this connection? Are there security restrictions we should introduce?
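The checklist above can be sketched as a simple configuration record. Note that the field names below are illustrative assumptions for this example, not Paxata's actual data source configuration schema:

```python
# Hypothetical data source configuration sketch -- the field names are
# illustrative assumptions, not Paxata's actual schema.
connector_config = {
    "hostname": "db.example.com",    # server we are connecting to (assumed)
    "port": 1521,                    # port listening for the request (assumed)
    "database": "SALES",             # server/database name, if any (assumed)
    "root_path": "/exports/daily",   # folder/schema/bucket to access (assumed)
    "username": "paxata_svc",        # account used for authentication (assumed)
    "password": "********",          # keep real secrets in a vault, not plain text
    "allowed_groups": ["analysts"],  # who can access this connection (assumed)
}

# Sanity-check that no required field was left blank before handing the
# details to whoever provisions the connector.
missing = [key for key, value in connector_config.items() if value in (None, "")]
print(missing)  # -> []
```

Gathering these answers up front means the provisioning conversation with your Administrator or CSM only has to happen once.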
If you need to set up a data source configuration to any Hadoop file store (e.g., CDH or HDP) or Hadoop database (e.g., Hive), additional setup is required when provisioning the data connector (typically, Hadoop configuration files must be deployed within Paxata). For cloud deployments, Paxata Expert Services will typically work with your internal Hadoop Administrators to provision the data connections before configuring the data source configuration files.
If you need to provision a connector to a database, there may be additional drivers and/or configurations required as well. Please work with your CSM on those details.
For specific requirements, you can check the following:
Azure Blob Storage (WASB)
Azure Data Lake Store (ADLS)
Google Cloud Storage
HDFS on CDH5 - Version: CDH 5.12-5.14
HDFS on HDP2 - Version: HDP 2.6.3
Network Share (SMB / Samba) - Version: SMB v2, v3
Most of the databases will only require 3 things:
- JDBC URL (consult your database documentation for details of the JDBC connection URL syntax)
- Username
- Password
Here is a list of the databases we are currently certified for. Click any link that appears for more specific details.
Azure SQL- Data Warehouse
Hive (CDH5) - Version: CDH 5.12-5.14
Hive (HDP2) - Version: HDP 2.6.3
IBM DB2 - Version: 10.x+
IBM Netezza - Version: 7.x+
MS Azure SQL
MS SQL Server - Version: 2012
MySQL - Version: 5.1
Oracle - Version: 11, 12
PostgreSQL - Version: 8.4
SAP HANA SPS - Version: 11
Teradata - Version: 15.0+
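For reference, JDBC connection URLs for several of the databases above typically follow the vendor-documented patterns below. The hosts, ports, and database names are placeholders; always confirm the exact syntax in your database's documentation:

```python
# Representative JDBC URL patterns for some certified databases.
# Hosts, ports, and database names are placeholders.
jdbc_examples = {
    "MySQL":      "jdbc:mysql://dbhost:3306/mydb",
    "PostgreSQL": "jdbc:postgresql://dbhost:5432/mydb",
    "Oracle":     "jdbc:oracle:thin:@dbhost:1521/ORCLPDB1",
    "SQL Server": "jdbc:sqlserver://dbhost:1433;databaseName=mydb",
    "Hive":       "jdbc:hive2://dbhost:10000/default",
}

# Every JDBC URL starts with the "jdbc:" scheme followed by a
# vendor-specific subprotocol.
for name, url in jdbc_examples.items():
    assert url.startswith("jdbc:")
    print(f"{name}: subprotocol = {url.split(':', 2)[1]}")
```

The vendor subprotocol (the token after `jdbc:`) tells the driver manager which driver to load, which is why the database drivers mentioned above must be provisioned alongside the connector.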
REST API: HTTP GET and POST requests with URL only.
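The two HTTP request shapes the REST connector accepts can be sketched with Python's standard library. The endpoint URL is a placeholder, and no request is actually sent here:

```python
# Sketch of URL-only GET and POST requests, as accepted by the REST
# connector. The endpoint is a placeholder; nothing is sent over the wire.
from urllib.request import Request

url = "https://api.example.com/v1/records?limit=100"

get_req = Request(url)                      # GET is the default method
post_req = Request(url, data=b"", method="POST")

print(get_req.get_method())   # -> GET
print(post_req.get_method())  # -> POST
```

Since the connector is driven by the URL alone, any parameters the endpoint needs (such as `limit` above) must be encoded in the query string.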
Microsoft Dynamics 365
Salesforce - Version: API v40
MicroStrategy - Version: 10.9
Tableau - Version: 9.3.7, 10.0.x, 10.5.x, Online