Python Engineer - Adtech - 5+ Yrs Experience

Adelaide

Software Engineering
United States · Remote
Posted on Tuesday, May 16, 2023

Adelaide is looking for a Python Engineer to join our Product and Engineering team. You'll be joining a company making a positive impact on the digital media market, using evidence-based metrics to increase the transparency of media quality.

We’re remote for the foreseeable future, but you should expect to be in NYC a couple of times a year. For those in the area, we have a space in Manhattan to work from if interested.

Key Responsibilities:

  • Design, develop, and implement scalable and maintainable data processing pipelines (ETL and ELT) using AWS services such as Amazon Elastic MapReduce (EMR), Amazon Elastic Container Service (ECS), Amazon Redshift, AWS Lambda, AWS Glue, Amazon Kinesis Data Firehose, and AWS Step Functions.
  • Collaborate with cross-functional teams to define, design, and develop new features and improvements to our existing infrastructure and data pipelines.
  • Analyze and optimize data processing workflows for improved performance and efficiency.
  • Ensure the quality and reliability of data through thorough testing and validation.

What you'll learn:

  • How to maintain large-scale infrastructure and ETL pipelines across various products, partnerships, and teams
  • How to work across both Engineering and Data Science to ensure that stakeholders' needs are met
  • How to manage, support, and iterate upon robust machine learning models
  • How to understand and represent the measurement of true media quality and user engagement at an enormous scale
  • How the biggest brands and publishers in the world think about media measurement and technology

Qualifications:

  • 5+ years of experience as a Python Engineer, working with big data technologies and frameworks.
  • Proficiency in Python and experience with libraries such as PySpark and Pandas.
  • Strong understanding of AWS services related to big data processing, including but not limited to Amazon Redshift, AWS Lambda, AWS Glue, Amazon Kinesis Data Firehose, and AWS Step Functions, as well as Infrastructure as Code (IaC) tools like AWS CloudFormation and Terraform.
  • Experience with big data storage and processing frameworks such as Apache Spark or similar technologies.
  • Knowledge of data visualization and business intelligence tools such as Looker or similar platforms.
  • Strong problem-solving skills and ability to work independently or as part of a team.
  • Excellent communication and collaboration skills.