Overview
At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers use Instacart every year to buy their groceries online, and the Data Engineering team builds the critical data pipelines that underpin the many ways data is used across Instacart to support our customers and partners.
About the Role
The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely, and accurate reporting. This is a high-impact, high-visibility role owning critical data integration pipelines and models across all of Instacart’s products. It is an exciting opportunity to join a key team shaping the company’s post-IPO financial data vision and roadmap.
About the Team
Finance data engineering is part of the Infrastructure Engineering pillar, working closely with the accounting, billing, and revenue teams to support the monthly/quarterly book close, retailer invoicing, and internal/external financial reporting. The team also collaborates closely with product teams to capture the critical data needed for financial use cases.
About the Job
- You will be part of a team with a large amount of ownership and autonomy.
- You will have large scope for company-level impact working on financial data.
- You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
- You will ship high-quality, scalable, and robust solutions with a sense of urgency.
- You will have the freedom to suggest and drive organization-wide initiatives.
About You
Minimum Qualifications
- 8+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
- Expertise in SQL and working knowledge of Python.
- Experience building high-quality ETL/ELT pipelines.
- Experience with data immutability, auditability, slowly changing dimensions, or similar concepts.
- Experience building data pipelines for accounting/billing purposes.
- Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
- Adept at communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
- A strong sense of ownership, and an ability to balance urgency with shipping high-quality, pragmatic solutions.
- Experience working with a large codebase on a cross-functional team.
Preferred Qualifications
- Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent work experience.
- Experience with Snowflake, dbt (data build tool), and Airflow.
- Experience with data quality monitoring/observability, either using custom frameworks or tools such as Great Expectations or Monte Carlo.
#LI-Remote