Astronomer designed Astro, a modern data orchestration platform powered by Apache Airflow™. Astro enables companies to place Apache Airflow at the core of their data operations, providing ease of use, scalability, and enterprise-grade security to ensure the reliable delivery of mission-critical data pipelines.
We’re a globally distributed, rapidly growing, venture-backed team of learners, innovators, and collaborators. Our mission is to build an enterprise-grade product that makes it easy for data teams at Fortune 500s and startups alike to adopt Apache Airflow. As a member of our team, you will be at the forefront of the industry as we strive to deliver the world's data.
Your background may be unconventional; as long as you have the essential qualifications, we encourage you to apply. While having "bonus" qualifications makes for a strong candidate, Astronomer values diverse experiences. Many of us at Astronomer haven't followed traditional career paths, and we welcome it if yours hasn't either.
About this role:
Astronomer’s Data Science team is at the hub of everything that happens at our company — from dashboards for managing operations to models that identify new sales opportunities. We are looking for a full-time data engineer who will design and develop data pipelines and operational workflows. You will work within an agile, sprint-based team, using Airflow, SQL, and Python to build and orchestrate the data pipelines that power our business. You will also develop integrations between business applications like Salesforce and Zendesk.
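For illustration only (not an application requirement), a minimal sketch of the kind of Airflow pipeline this role builds and orchestrates might look like the following. The DAG name, task logic, schedule, and data are hypothetical placeholders, assuming Airflow 2.x with the TaskFlow API:

    # Illustrative sketch only: a minimal ETL-style Airflow DAG.
    # All names and values here are hypothetical placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_ops_pipeline():
        @task
        def extract():
            # Pull raw records from a source system (e.g., a CRM export).
            return [{"account_id": 1, "status": "active"}]

        @task
        def transform(records):
            # Apply a simple data-quality filter before loading.
            return [r for r in records if r["status"] == "active"]

        @task
        def load(records):
            # Load into the warehouse; connection and target are placeholders.
            print(f"Loading {len(records)} records")

        load(transform(extract()))


    example_ops_pipeline()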
What you get to do:
Orchestrate pipelines that perform ETL, reverse ETL, business processes, data quality checks, and operational analytics.
Work with a huge variety of different data sets — as part of real-world analytics projects that impact all of Astronomer’s operational systems.
Automate business operations with a combination of Airflow and product integrations from platforms including Salesforce, Zendesk, and Stripe.
Collaborate with stakeholders at all organizational levels to translate mission-critical business requirements into actionable analytics.
Help drive the development of a data orchestration and ETL/ELT framework for internal business operations and analytics.
As an internal user of our platform, as well as cloud and open-source technologies such as Snowflake, Sigma, and Fivetran, provide regular feedback to the product and engineering teams.
Develop tools and Airflow extensions to assist internal teams and the greater Apache Airflow community.
What you bring to the role:
Data engineering and ETL/ELT experience in a production environment
SQL experience
Python fluency
Strong analytical and problem-solving skills
Excellent interpersonal and communication skills, and enthusiasm for collaborating in a team-oriented environment
Bonus Points if you have:
Proficiency with modern data engineering tools (Apache Airflow, dbt, Snowflake, BigQuery, etc.)
Familiarity with data analysis, statistics, or machine learning
Bachelor's degree in computer science, information technology, information systems, or a related field OR 1–2 years of equivalent experience
The estimated salary for this role ranges from $100,000 to $130,000, depending on level and geography, along with an equity component and a comprehensive benefits package. This range is an estimate; actual compensation may vary based on skills, experience, and qualifications.
At Astronomer, we value diversity. We are an equal opportunity employer: we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Astronomer is a remote-first company.