Date listed: 1 month ago
Remote: Yes
Found on:
We are transforming the way we interact with data to drive better decisions and outcomes.
You'll be working on a small remote team, working extensively with Python, SQL, Airflow, Airbyte, dbt, Snowflake, and Kubernetes.
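For a rough sense of what day-to-day work on that stack can look like, here is a minimal, hypothetical sketch (not part of the posting; the DAG name, schedule, and paths are assumptions, and it presumes Airflow 2.4+ with the dbt CLI available) of an Airflow DAG that kicks off a dbt build against the warehouse:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: run dbt models after upstream ingestion
# (e.g. Airbyte syncs into Snowflake) has landed.
with DAG(
    dag_id="daily_dbt_build",      # illustrative name, not from the posting
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # /opt/dbt is an assumed location for the dbt project and profiles.
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
```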
Requirements:
* 3+ years of experience leading data engineering projects, including architectural decision-making and team leadership.
* Strong hands-on experience with cloud-based data solutions (AWS, GCP, or Azure).
* Expertise in Python, SQL, and Git for data processing and software development.
* Deep understanding of data structures, design patterns, and scalable data architectures.
* Experience with containerization (Docker, Kubernetes) and Infrastructure as Code (Terraform, CloudFormation, or similar).
* Proficiency in pipeline orchestration tools (Airflow, Prefect, Dagster) and data integration frameworks (Airbyte, dbt).
* Strong knowledge of data warehousing (Snowflake, BigQuery, Redshift) and real-time data streaming.
* A coaching mindset with a passion for sharing knowledge and upskilling the team.
If you're interested in applying, please fill out the application here: https://meq.bamboohr.com/hiring/jobs/94. Under the "Who referred you for this position?" question, please put "HN".