OracleDB as Source

UnifyApps enables seamless integration with Oracle databases as a source for your data pipelines. This article covers essential configuration elements and best practices for connecting to Oracle sources.

Overview

Oracle Database is widely used for enterprise applications including ERP systems, financial platforms, and customer management solutions. UnifyApps provides native connectivity to extract data from these Oracle environments efficiently and securely.

Connection Configuration

Parameter | Description | Example
--- | --- | ---
Connection Name | Descriptive identifier for your connection | "Production Oracle ERP"
Host Address | Oracle server hostname or IP address | "oracle-db.example.com"
Port Number | Database listener port | 1521 (default)
User | Database username with read permissions | "unify_reader"
Password | Authentication credentials | "********"
Database/Service Name | SID or service name of your Oracle instance | "PRODDB"
PDB Name | Name of the pluggable database (if using a CDB) | "SALES_PDB"
Schema Name | Schema containing your source tables | "FINANCE"
Connection Type | Method of connecting to the database | Direct, SSH Tunnel, or SSL

To set up an Oracle source, navigate to the Connections section, click New Connection, and select OracleDB Server. Fill in the parameters above based on your Oracle environment details.
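
If you want to sanity-check these details before saving the connection, a quick test with the open-source python-oracledb driver can confirm that the host, service name, and reader credentials work. This is only a hedged sketch, not part of UnifyApps itself; the hostname, service name, credentials, and schema below mirror the examples in the table and are placeholders for your own environment.

```python
# Minimal connectivity check with python-oracledb before filling in the
# UnifyApps connection form. All connection values are placeholders.
import oracledb

# Easy Connect string: host:port/service_name
dsn = "oracle-db.example.com:1521/PRODDB"

with oracledb.connect(user="unify_reader", password="********", dsn=dsn) as conn:
    with conn.cursor() as cur:
        # Confirm the reader account can see the source schema's tables
        cur.execute(
            "SELECT table_name FROM all_tables WHERE owner = :owner",
            owner="FINANCE",
        )
        for (table_name,) in cur:
            print(table_name)
```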

Server Timezone Configuration

When adding objects from an Oracle source, you'll need to specify the database server's timezone. This setting is crucial for proper handling of date and time values.

  1. In the Add Objects dialog, find the Server Time Zone setting

  2. Select your Oracle server's timezone (e.g., "India Time (+05:30)")

This ensures all timestamp data is normalized to UTC during processing, maintaining consistency across your data pipeline.
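
The sketch below illustrates that normalization in plain Python: an Oracle DATE carries no timezone, so the configured server timezone is attached first and the value is then converted to UTC. The IANA zone "Asia/Kolkata" stands in for the "India Time (+05:30)" option, and the timestamp itself is made up for illustration.

```python
# Illustrative UTC normalization: attach the configured server timezone to a
# naive timestamp, then convert to UTC for downstream processing.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

server_tz = ZoneInfo("Asia/Kolkata")  # stands in for "India Time (+05:30)"

# A naive timestamp as read from the source (no timezone information)
raw_value = datetime(2024, 3, 31, 18, 45, 0)

# Attach the server timezone, then normalize to UTC
normalized = raw_value.replace(tzinfo=server_tz).astimezone(timezone.utc)
print(normalized)  # 2024-03-31 13:15:00+00:00
```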

Ingestion Modes

Mode | Description | Business Use Case
--- | --- | ---
Historical and Live | Loads all existing data and captures ongoing changes | ERP system migration with continuous synchronization
Live Only | Captures only new data from deployment forward | Real-time sales dashboard without historical context
Historical Only | One-time load of all existing data | Financial quarter-end reporting or compliance snapshot

Choose the mode that aligns with your business requirements during pipeline configuration.
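
As a rough mental model of how the modes differ, the sketch below maps each one to two flags: whether a one-time backfill of existing data runs, and whether ongoing changes are captured. The enum and class names are illustrative assumptions, not part of the UnifyApps configuration surface.

```python
# Hedged sketch of how the three ingestion modes can be reasoned about when
# planning a pipeline; UnifyApps handles this internally.
from dataclasses import dataclass
from enum import Enum

class IngestionMode(Enum):
    HISTORICAL_AND_LIVE = "historical_and_live"
    LIVE_ONLY = "live_only"
    HISTORICAL_ONLY = "historical_only"

@dataclass
class PipelinePlan:
    run_backfill: bool      # one-time load of all existing rows
    capture_changes: bool   # ongoing change capture after deployment

def plan_for(mode: IngestionMode) -> PipelinePlan:
    return PipelinePlan(
        run_backfill=mode in (IngestionMode.HISTORICAL_AND_LIVE, IngestionMode.HISTORICAL_ONLY),
        capture_changes=mode in (IngestionMode.HISTORICAL_AND_LIVE, IngestionMode.LIVE_ONLY),
    )

print(plan_for(IngestionMode.HISTORICAL_AND_LIVE))
# PipelinePlan(run_backfill=True, capture_changes=True)
```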

CRUD Operations Tracking

Each database operation captured from an Oracle source is identified and logged as a distinct action:

Operation | Description | Business Value
--- | --- | ---
Create | New record insertions | Track new customer accounts or orders
Read | Data retrieval actions | Monitor query patterns and data access
Update | Record modifications | Audit changes to sensitive financial data
Delete | Record removals | Compliance tracking for record deletion

This comprehensive logging supports audit requirements and troubleshooting efforts.
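
The sketch below shows one way such operation tagging might look on a captured change event. The helper and field names are assumptions made for illustration, not the actual UnifyApps event schema.

```python
# Illustrative per-operation tagging for audit logs; field names are assumed.
from datetime import datetime, timezone

def log_change_event(operation: str, table: str, row_key: str) -> dict:
    """Tag a captured change with its operation type for audit trails."""
    assert operation in {"create", "read", "update", "delete"}
    return {
        "operation": operation,
        "table": table,
        "row_key": row_key,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

print(log_change_event("update", "FINANCE.GL_BALANCES", "ledger_id=101"))
```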

Supported Data Types

Category | Supported Types
--- | ---
Text | VARCHAR2, NVARCHAR2, CHAR, NCHAR, LONG
Numeric | NUMBER, FLOAT, BINARY_FLOAT
Date/Time | DATE, TIMESTAMP, TIMESTAMP WITH TIME ZONE
Binary | RAW, LONG RAW
Large Objects | CLOB, NCLOB, BLOB, BFILE
Row Identifiers | ROWID, UROWID

All common Oracle data types are supported, including specialized types for financial and transactional systems.
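
Before adding a table, you can check which of these types it actually uses by querying Oracle's data dictionary. The sketch below does this with the python-oracledb driver; the connection details and the FINANCE.GL_BALANCES table are placeholders.

```python
# Inspect a source table's column types via Oracle's ALL_TAB_COLUMNS view to
# confirm they fall into the supported categories above. Values are placeholders.
import oracledb

dsn = "oracle-db.example.com:1521/PRODDB"

with oracledb.connect(user="unify_reader", password="********", dsn=dsn) as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT column_name, data_type
              FROM all_tab_columns
             WHERE owner = :owner AND table_name = :table_name
             ORDER BY column_id
            """,
            owner="FINANCE",
            table_name="GL_BALANCES",
        )
        for column_name, data_type in cur:
            print(f"{column_name}: {data_type}")
```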

Common Business Scenarios

  1. Financial Data Integration

    • Connect to Oracle Financials to extract GL, AP/AR data

    • Ensure fiscal period timestamps are properly timezone-adjusted

    • Consider regulatory requirements for financial data transfers

  2. Customer Data Synchronization

    • Extract customer records from Oracle CRM or EBS

    • Map customer hierarchies and relationships

    • Maintain referential integrity during extraction

  3. Supply Chain Visibility

    • Connect to Oracle SCM for inventory and order data

    • Configure real-time syncing for critical inventory levels

    • Apply appropriate filtering for high-volume transaction tables
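
As an example of the filtering suggested in the supply chain scenario above, the sketch below bounds a high-volume transaction table to recent rows with a WHERE clause on its update timestamp. The table, columns, and connection details are placeholders, not a prescribed Oracle SCM schema.

```python
# Hedged sketch of a bounded extraction query for a high-volume table.
from datetime import datetime
import oracledb

dsn = "oracle-db.example.com:1521/PRODDB"
query = """
    SELECT order_id, item_id, quantity, last_update_date
      FROM scm.order_lines
     WHERE last_update_date >= :since
"""

with oracledb.connect(user="unify_reader", password="********", dsn=dsn) as conn:
    with conn.cursor() as cur:
        cur.execute(query, since=datetime(2024, 1, 1))
        rows = cur.fetchmany(1000)  # page through results rather than loading everything
        print(f"Fetched {len(rows)} rows")
```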

Best Practices

Category | Recommendations
--- | ---
Performance | Extract only necessary columns to minimize network load; use WHERE clauses to filter large tables; schedule bulk operations during off-hours
Security | Create read-only Oracle users with minimum permissions; use SSH tunneling for databases in protected networks; secure credentials using enterprise password policies
Data Governance | Document source-to-target mappings; maintain data lineage for compliance reporting; set up alerts for pipeline failures
Optimization | Index frequently queried columns in source tables; monitor and tune Oracle AWR reports for extraction queries; use partitioned tables for very large datasets
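
The security recommendation above usually starts with a dedicated read-only account. The sketch below shows how a DBA might provision one using python-oracledb; the grants are only a minimal starting point for historical extraction, change-capture setups may require additional privileges, and every name and credential shown is a placeholder.

```python
# Hedged sketch of provisioning a minimal read-only account, run by a DBA.
# Grants shown cover basic SELECT access only; adjust per your pipeline's needs.
import oracledb

admin = oracledb.connect(user="admin_user", password="********",
                         dsn="oracle-db.example.com:1521/PRODDB")

statements = [
    'CREATE USER unify_reader IDENTIFIED BY "<strong-password>"',
    "GRANT CREATE SESSION TO unify_reader",
    "GRANT SELECT ON finance.gl_balances TO unify_reader",  # one grant per source table
]

with admin.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
admin.close()
```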

By properly configuring your Oracle source connections and following these guidelines, you can ensure reliable, efficient data extraction while meeting your business requirements for data timeliness, completeness, and compliance.