Job Description

In this role, you will design, develop, and maintain both frontend and backend components of our reporting platform, ensuring seamless data flow and intuitive user experiences for clients and internal teams. You will architect and optimize end-to-end data pipelines using PySpark, Scala, Airflow, and Databricks, while leveraging AWS and in‑memory databases to support high‑performance processing at scale. Your work will involve building user interfaces, APIs, and backend services, troubleshooting production issues, and continuously improving system reliability and efficiency. You will collaborate closely with product managers, data engineers, and other cross-functional partners to translate business requirements into robust technical solutions, contributing actively to code reviews, design discussions, and the full software development lifecycle.

This role is onsite 4x a week in Downtown Denver.

Pay for this position is $112K to $135K.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to [email protected]. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
3 years of experience as a software engineer in large enterprise data environments

Proficient in PySpark, Scala, and data modeling

Proficient in Airflow, Databricks, AWS, and in-memory databases

Experience architecting data pipelines from end to end

Experience with data quality tools like Monte Carlo or similar

Prior experience in the AdTech industry
