Job Description

Data Engineer II
Summary:

The Data Engineer / Analyst will design, build, and implement a data foundation that supports a strategic analytics project driving growth for ADM. This involves liaising with functional teams that build or acquire the underlying data, with application owners to understand raw or source data structures, and with end users to accurately design strategic data marts. It also includes designing and building data pipelines and ETL/ELT integrations, constructing and modifying data marts, preparing (cleaning and transforming) data, and collaborating with extended team members to design impactful business solutions. The Data Engineer needs expert technical knowledge of SAP Datasphere, SQL, Python, and related tooling (ETL/ELT, PySpark, Power BI, SAC), deep expertise in data management techniques, the ability to drive effective support of data analysis and model management, and experience with IT project management practices.

Responsibilities:

  • Standardize data and reporting across businesses and teams to enable consistency and quality of key business data
  • Design and develop an enterprise data warehouse platform that conforms to consistent methodologies, standards, and industry best practices
  • Design, develop, test, and document data integrations, and assist with deployment, validation, and hypercare
  • Maintain and support SAP Datasphere analytical models and Azure integrations such as Azure Data Factory (ADF) and Databricks
  • Develop SQL views on standard tables to monitor scheduled remote tables and remote queries, and create task chains that send email alerts for any data-loading errors
  • Create SQL views that handle complex business logic, demonstrating strong hands-on SQL skills
  • Develop analytical models that make reporting available to SAC and Power BI
  • Develop task chains that run dataflow updates in sequential order and improve monitoring
  • Create multiple dataflows that move data from MSSQL tables to persistent tables in Datasphere, applying the required data cleansing logic (see the sketch after this list)
  • Develop procedures in the Database Explorer of the Datasphere space to convert ABAP program logic into persistent tables, streamlining reporting processes
  • Create and manage task chains for efficient data loading and monitoring using the DWF task chain monitor, perform unit testing of dataflows, and reconcile data with source systems to ensure accuracy
  • Set up connections between non-SAP sources and Datasphere
  • Integrate TPM data using Azure as a source, with replication flows to move the data into Datasphere
  • Develop replication flows for delta-enabled CDS views and move the data into Datasphere
  • Create models and dimensions in SAC
  • Create functional and technical design documents for Datasphere requirements
  • Transport Datasphere artifacts from one landscape to another using import/export folders
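
For illustration only, the sketch below shows the general shape of such a cleansing dataflow in PySpark: read from an MSSQL source over JDBC, apply basic cleansing, and persist the result for downstream consumption. The connection string, table names, and column names are hypothetical placeholders, not references to any ADM system.

```python
# Minimal sketch, assuming a hypothetical MSSQL source table and target table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mssql_cleansing_dataflow").getOrCreate()

# Read the raw source table over JDBC (placeholder connection details).
raw_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
    .option("dbtable", "dbo.sales_orders")   # hypothetical source table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Representative cleansing logic: trim the business key, drop rows without it,
# normalize the date column, and remove duplicates on the key.
clean_df = (
    raw_df
    .withColumn("order_id", F.trim(F.col("order_id")))
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .dropDuplicates(["order_id"])
)

# Persist the cleansed output to a staging table (hypothetical name) that a
# persistent table or analytical model could consume downstream.
clean_df.write.mode("overwrite").saveAsTable("staging.sales_orders_clean")
```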

Requirements:

  • Four-year Bachelor's degree or equivalent in IT, Computer Science, Science, Engineering, Statistics, Programming, Business Analytics, Mathematics, or a related field
  • At least 4 years of enterprise BI and analytics technical experience implementing data warehouses and data marts in an Agile environment
  • 4 years of recent experience with data integration, SAP Datasphere, ETL data replication, and data warehouse automation tools such as Microsoft Azure Data Factory, Databricks, and BEx
  • 4 years of recent experience in data processing using SQL, PySpark, and Python (a brief illustrative sketch follows this list)
  • Strong working knowledge of data modeling and data transformation in Datasphere
  • Extensive experience with remote tables and their semantic usage (relational dataset, fact, dimension, text, and hierarchy tables)
  • Hands-on knowledge of creating remote tables that consume data from S4 via CDS views and S4 tables
  • Experience using dynamic filters on remote tables to optimize data retrieval from S4 CDS views and S4 tables, ensuring efficient and targeted data access
  • Experience analyzing business requirements to determine which data should be stored in Datasphere
  • Experience creating complex data models using graphical views (relational dataset/fact/dimension) based on the functional design document, leveraging S4 and non-SAP data sources
  • Experience working in a DevOps / CI/CD framework
  • High accountability with a demonstrated ability to deliver
  • Strong communication skills including design documentation
  • Strong collaboration skills working with architecture, design and development teams
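
As a brief illustration of the SQL / PySpark / Python processing referenced above, the sketch below expresses the same aggregation once with the DataFrame API and once with Spark SQL; the table and column names are illustrative only.

```python
# Minimal sketch of equivalent DataFrame-API and Spark SQL aggregations.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sql_pyspark_sketch").getOrCreate()

orders = spark.createDataFrame(
    [("NA", "2024-01-01", 120.0), ("EU", "2024-01-01", 80.0), ("NA", "2024-01-02", 60.0)],
    ["region", "order_date", "net_sales"],
)

# DataFrame API: daily net sales per region.
daily_df = (
    orders.groupBy("region", "order_date")
    .agg(F.sum("net_sales").alias("net_sales"))
)

# Equivalent Spark SQL over a temporary view.
orders.createOrReplaceTempView("orders")
daily_sql = spark.sql(
    "SELECT region, order_date, SUM(net_sales) AS net_sales "
    "FROM orders GROUP BY region, order_date"
)

daily_df.show()
daily_sql.show()
```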
