Job Description

We are searching for a highly skilled and experienced **Lead Data Software Engineer** to design, build, and optimize scalable data solutions on modern cloud platforms such as BigQuery or Databricks. In this position, you will lead a team of Data Engineers, advocate for automation-first development practices, and enable impactful business insights through reliable data infrastructure and engineering excellence.
**Responsibilities**
- Lead and mentor a team of Data Engineers, fostering growth and technical expertise
- Design and implement scalable data pipeline architectures using Python and SQL with automation-first principles
- Build and maintain batch and real-time data processing solutions using BigQuery, Databricks, Apache Airflow, and DBT
- Architect end-to-end data infrastructure solutions that adhere to CI/CD and DevOps best practices
- Develop clean, reusable code that emphasizes maintainability, scalability, and fault tolerance
- Implement monitoring, alerting, and observability systems for data infrastructure reliability
- Collaborate with cross-functional teams to address complex technical challenges and align data solutions with business goals
- Develop data APIs, self-service tools, and automation frameworks to improve data accessibility and usability
- Optimize performance and cost-efficiency of data systems and pipelines on cloud platforms
- Ensure secure, compliant data handling by implementing governance frameworks and quality monitoring
**Requirements**
- BS/MS in Computer Science, Software Engineering, or a related field
- 5+ years of experience in production-grade data engineering with solid expertise in automation and full-stack data development
- At least 1 year of relevant leadership experience
- Proficiency with software engineering practices such as version control (Git), CI/CD workflows, and testing frameworks
- Expertise in Python and modern cloud data platforms such as Databricks or BigQuery
- Background in cloud-native architecture using AWS, GCP, or Azure with a focus on data processing pipelines
- Familiarity with containerization and orchestration tools like Docker and Kubernetes
- Capability to implement and manage data pipeline orchestration using Apache Airflow, DBT, or similar tools
- Skills in building APIs, event-driven services, and integrating microservices into data workflows
- Understanding of DataOps and DevOps practices, including infrastructure as code and automation workflows
- Knowledge of event-driven architectures and streaming platforms like Kafka or Kinesis
- Demonstrated experience building software solutions for automated data governance, quality monitoring, and data products
- English proficiency at a B2+ level
**Nice to have**
- Background in MySQL, Looker/Tableau, and analytics tools such as Amplitude or Segment
- Experience with MLOps for machine learning pipelines and model deployment
- Familiarity with real-time analytics platforms and data streaming technologies
- Working knowledge of basic Linux/Unix administration and shell scripting
- Expertise with cloud DevOps tools for CI/CD, monitoring, and infrastructure as code solutions like Terraform or Pulumi
**We offer**
- Career plan and real growth opportunities
- Unlimited access to LinkedIn learning solutions
- International Mobility Plan within 25 countries
- Constant training, mentoring, online corporate courses, eLearning and more
- English classes with a certified teacher
- Support for employee initiatives (Algorithms club, Toastmasters, Agile club and more)
- Enjoyable working environment (Gaming room, napping area, amenities, events, sports teams and more)
- Flexible work schedule and dress code
- Collaborate in a multicultural environment and share best practices from around the globe
- Hired directly by EPAM & 100% under payroll
- Statutory benefits (IMSS, INFONAVIT, 25% vacation bonus)
- Life and major medical expenses insurance, with dental & vision coverage (for the employee and direct family members)
- 13% employee savings fund, capped at the legal limit
- Grocery coupons
- 30-day December bonus
- Employee Stock Purchase Plan
- 12 vacation days plus 4 floating days
- Official Mexican holidays, plus 5 extra holidays (Maundy Thursday and Good Friday, November 2nd, December 24th & 31st)
- Monthly non-taxable allowance for electricity and internet bills
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
