Job Description

Primary Role & Responsibilities:

The candidate's primary responsibilities will be to:
  • Design and maintain the organization's data architecture, including databases, data storage systems, and data integration processes.
  • Develop conceptual, logical, and physical data models that meet business requirements and optimize data flow and storage.
  • Design and implement ETL (Extract, Transform, Load) processes to integrate data from various sources into the centralized data repository.
  • Implement data governance frameworks, ensuring compliance with regulatory requirements, data privacy laws, and industry best practices.
  • Design, implement, and optimize database solutions (e.g., SQL, NoSQL) to ensure high performance, security, and reliability.
  • Evaluate new data technologies, tools, and software to enhance data architecture and improve business outcomes.
  • Work closely with data engineers, data scientists, business analysts, and other stakeholders to align data strategy with business objectives.
  • Create and maintain comprehensive documentation for data models, architecture designs, data flows, and data management processes.
  • Develop and enforce policies and procedures to ensure data security, integrity, and availability.


Required Soft Skills:

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.
  • Ability to translate business requirements into technical solutions.


Working Experience and Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.

Experience: 8-10 years of proven experience as a Data Architect or in a similar role, with a track record of designing complex data architectures.

Skills:

  • Expertise in database technologies (SQL, NoSQL, Hadoop, Cassandra, etc.).
  • Strong experience with cloud platforms (e.g., AWS, Azure, Google Cloud) for data storage and processing.
  • Proficient in data modeling tools and techniques.
  • Knowledge of ETL frameworks and data integration tools (e.g., Apache NiFi, Talend, Informatica).
  • Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Experience with programming languages like Python, Java, or Scala.
  • Knowledge of data governance and security practices.


Preferred Skills:

  • Experience with big data technologies such as Apache Spark, Kafka, or Flink.
  • Familiarity with machine learning or artificial intelligence frameworks for data analysis.
  • Knowledge of industry standards for data management (e.g., DAMA, TOGAF).
