Date listed: 1 week ago
Remote: Yes
Found on:
We are transforming the way we interact with data to drive better decisions and outcomes.
You'll be working on a small remote team, working extensively with Python, SQL, Airflow, Airbyte, dbt, Snowflake, and Kubernetes.
Requirements:
* 3+ years of experience leading data engineering projects, including architectural decision-making and team leadership.
* Strong hands-on experience with cloud-based data solutions (AWS, GCP, or Azure).
* Expertise in Python, SQL, and Git for data processing and software development.
* Deep understanding of data structures, design patterns, and scalable data architectures.
* Experience with containerization (Docker, Kubernetes) and Infrastructure as Code (Terraform, CloudFormation, or similar).
* Proficiency in pipeline orchestration tools (Airflow, Prefect, Dagster) and data integration frameworks (Airbyte, dbt).
* Strong knowledge of data warehousing (Snowflake, BigQuery, Redshift) and real-time data streaming.
* A coaching mindset with a passion for sharing knowledge and upskilling the team.
If you're interested in applying, please fill out the application here: https://meq.bamboohr.com/hiring/jobs/94. Under the "Who referred you for this position?" question, please put "HN".