Description
WHAT YOU’LL DO
Join our team dedicated to building the data infrastructure and products that power impactful decision-making at Braze. Together, we shape data engineering strategy, optimizing data pipelines and architecture to drive business growth and enhance customer experiences.
Responsibilities:
- Lead the design, implementation, and monitoring of scalable data pipelines and architectures using tools like Snowflake and dbt
- Develop and maintain robust ETL processes to ensure high-quality data ingestion, transformation, and storage
- Collaborate closely with data scientists, analysts, and other engineers to design and implement data solutions that drive customer engagement and retention
- Optimize and manage data flows and integrations across various platforms and applications
- Ensure data quality, consistency, and governance by implementing best practices and monitoring systems
- Work extensively with large-scale event-level data, aggregating and processing it to support business intelligence and analytics
- Implement and maintain data products using advanced techniques and tools
- Collaborate with cross-functional teams including engineering, product management, sales, marketing, and customer success to deliver valuable data solutions
- Continuously evaluate and integrate new data technologies and tools to enhance our data infrastructure and capabilities
WHO YOU ARE
The ideal candidate for this role possesses:
- 5+ years of hands-on experience in data engineering, cloud data warehouses, and ETL development, preferably in a customer-facing environment
- Proven expertise in designing and optimizing data pipelines and architectures
- Strong proficiency in advanced SQL and data modeling techniques
- A track record of leading impactful data projects from conception to deployment
- Effective collaboration skills with cross-functional teams and stakeholders
- In-depth understanding of technical architecture and data flow in a cloud-based environment
- Ability to mentor and guide junior team members on best practices for data engineering and development
- Passion for building scalable data solutions that enhance customer experiences and drive business growth
- Strong analytical and problem-solving skills, with a keen eye for detail and accuracy
- Extensive experience working with and aggregating large event-level data
- Familiarity with data governance principles and ensuring compliance with industry regulations
- Experience with Kubernetes for container orchestration and Airflow for workflow management (preferred, but not required)
#LI-Remote