Job Description
Roles & Responsibilities:
- Oracle Warehouse Builder (OWB), Oracle Workflow Builder, Oracle TBSS
- Oracle Warehouse Builder 9i (Client Version 9.0.2.62.3/Repository Version 9.0.2.0.0)
- Oracle Warehouse Builder 4
- Oracle Workflow Builder 2.6.2
- Oracle Database 10g, TNS for IBM/AIX RISC System/6000, Version 10.2.0.5.0 - Production
- More than 5 years of experience with Oracle Warehouse Builder (OWB) and Oracle Workflow Builder
- Expert knowledge of Oracle PL/SQL to develop everything from individual code objects to entire data marts.
- Scheduling with Oracle TBSS (creating and running DBMS_SCHEDULER jobs) and trigger-based scheduling for file sources driven by control files.
- Extensive knowledge of the entire life cycle of Change/Incident/Problem management using ServiceNow.
- Oracle Enterprise Manager 10gR1 (monitoring jobs and tablespace utilization).
- Extensive knowledge of fetching mainframe COBOL files (ASCII and EBCDIC formats) to the landing area, then formatting and loading these files into Oracle tables, with error handling, using SQL*Loader and external tables.
- Extensive knowledge of Oracle Forms 6 and its integration with OWB 4.
- Work closely with the business owner teams and functional/data analysts throughout the entire development/BAU process.
- Work closely with the AIX support and DBA support teams on access privileges, storage issues, etc.
- Work closely with the Batch Operations and MFT teams on file transfer issues.
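The DBMS_SCHEDULER-based scheduling mentioned above could be sketched as follows; the job name, procedure, and schedule here are hypothetical, not taken from the posting:

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'LOAD_SALES_MART',      -- hypothetical job name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ETL_PKG.LOAD_SALES',   -- hypothetical PL/SQL procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2', -- run nightly at 02:00
    enabled         => TRUE,
    comments        => 'Nightly data mart load');
END;
/
```

TBSS typically wraps jobs like this; the repeat_interval calendar syntax is what controls the schedule.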
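Loading an EBCDIC mainframe extract through an Oracle external table, as the file-handling bullet above describes, might look like this minimal sketch; the directory object, file name, character set, and column layout are illustrative assumptions:

```sql
-- Hypothetical external table over a fixed-length EBCDIC mainframe extract.
CREATE TABLE stg_customer_ext (
  cust_id   VARCHAR2(10),
  cust_name VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY landing_dir        -- assumed directory object
  ACCESS PARAMETERS (
    RECORDS FIXED 40
    CHARACTERSET WE8EBCDIC500          -- EBCDIC source; use the actual codepage
    BADFILE 'stg_customer.bad'         -- error handling: rejected records
    LOGFILE 'stg_customer.log'
    FIELDS (
      cust_id   POSITION(1:10)  CHAR(10),
      cust_name POSITION(11:40) CHAR(30)
    )
  )
  LOCATION ('customer.dat')            -- assumed landing-area file name
)
REJECT LIMIT UNLIMITED;
```

Rejected rows land in the bad file rather than failing the load, which is the error-handling pattern the bullet refers to.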
- Migration of Oracle to the Hadoop ecosystem:
- Must have working experience with Hadoop ecosystem components such as HDFS, MapReduce, and YARN.
- Must have working knowledge of Scala and Spark DataFrames to convert the existing code to Hadoop data lakes.
- Must have design and development experience in data pipeline solutions from different source systems (files, Oracle) to data lakes.
- Must have been involved in creating/designing Hive tables and loading and analyzing data using Hive queries.
- Must have knowledge of creating Hive partitions, dynamic partitions, and buckets.
- Must have knowledge of CA Workload Automation DE 12.2 for job creation and scheduling.
- Use Denodo data virtualization to provide the required data access to end users.
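The Hive partitioning and bucketing requirement above could be illustrated with a sketch like this; the table, columns, and bucket count are hypothetical:

```sql
-- Enable dynamic partitioning for the insert below
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Hypothetical partitioned and bucketed Hive table
CREATE TABLE sales_dl (
  order_id BIGINT,
  amount   DECIMAL(12,2)
)
PARTITIONED BY (load_dt STRING)
CLUSTERED BY (order_id) INTO 16 BUCKETS
STORED AS ORC;

-- Dynamic-partition insert: Hive derives load_dt from the last SELECT column
INSERT INTO TABLE sales_dl PARTITION (load_dt)
SELECT order_id, amount, load_dt
FROM   sales_staging;
```

Static partitions name the partition value in the INSERT; dynamic partitions, as here, let Hive create partitions from the data itself.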
Skills Required
AIX Support