Job Description

Data Pipeline Architecture and Development:
  • Architect and optimize scalable data storage solutions, including data lakes, warehouses, and NoSQL databases, supporting large-scale analytics.
  • Design and maintain efficient data pipelines using technologies such as Apache Spark, Kafka, Fabric Data Factory, and Airflow, based on cross-functional team requirements.
Data Integration and ETL:
  • Develop robust ETL processes for reliable data ingestion, utilizing tools like SSIS, ADF, and custom Python scripts to ensure data quality and streamline workflows.
  • Optimize ETL performance through techniques like partitioning and parallel processing.
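As a small illustration of the partitioning and parallel-processing techniques mentioned above, here is a minimal sketch in plain Python (not tied to SSIS, ADF, or Spark): rows are split round-robin into partitions and a per-partition transform runs concurrently. The `transform` step (normalizing an `amount` field to integer cents) is a hypothetical example, not part of the role description.

```python
from concurrent.futures import ThreadPoolExecutor

def partition(rows, num_partitions):
    """Split rows into roughly equal partitions (round-robin)."""
    return [rows[i::num_partitions] for i in range(num_partitions)]

def transform(partition_rows):
    """Hypothetical per-partition transform: normalize amounts to cents."""
    return [{**r, "amount_cents": int(round(r["amount"] * 100))}
            for r in partition_rows]

def run_parallel_etl(rows, num_partitions=4):
    """Partition the input, transform each partition concurrently,
    then flatten the results back into a single list."""
    parts = partition(rows, num_partitions)
    # Threads suit I/O-bound ETL steps; CPU-bound work would use processes.
    with ThreadPoolExecutor(max_workers=num_partitions) as pool:
        results = pool.map(transform, parts)
    return [row for part in results for row in part]
```

Real pipelines would get the same effect from Spark's native partitioning or an ADF parallel copy, but the shape of the idea is the same.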
Data Modeling and Schema Design:
  • Define and implement data models and schemas for structured and semi-structured sources, ensuring consistency and efficiency while collaborating with data teams to optimize performance.
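To make the schema-design bullet concrete, a minimal sketch of mapping a raw semi-structured record onto an agreed schema, with type coercion and defaults for missing fields. The `OrderEvent` schema and its fields are invented for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OrderEvent:
    """Hypothetical target schema for a semi-structured event feed."""
    order_id: str
    amount: float
    currency: str = "USD"          # default applied when the field is absent
    coupon: Optional[str] = None   # optional attribute, often missing

def normalize(record: dict) -> OrderEvent:
    """Coerce a raw record onto the schema: cast types and fill defaults,
    so downstream consumers see one consistent shape."""
    return OrderEvent(
        order_id=str(record["order_id"]),
        amount=float(record["amount"]),
        currency=record.get("currency", "USD"),
        coupon=record.get("coupon"),
    )
```

The same idea scales up to warehouse DDL or Spark `StructType` definitions; the point is a single, enforced contract between producers and consumers.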
Data Governance, Security, and Compliance:
  • Estab...

Apply for this Position

Ready to join Solanalytics? Submit your application below.