See yourself at Twilio.
Join the team as our next Staff Engineer, Platform Extensibility & Ecosystem.
Who we are & why we’re hiring
Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.
Although we're headquartered in San Francisco, we have a presence throughout South America, Europe, Asia, and Australia. We're on a journey to becoming a globally anti-racist, anti-oppressive, anti-bias company. At Twilio, we support diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.
About the job
This position is needed to create high-quality, modern solutions to complex engineering problems and help define the future of Customer Data Platforms. Our mission is to help businesses unlock customer insights and make better decisions faster and at lower cost by building a suite of industry-leading storage and compute services.
In this role, you will be responsible for designing and building a suite of platforms and services that form the basis of a Customer Data Platform capable of handling billions of events in near real time. You'll also be responsible for mentoring, sharing knowledge, and guiding the team's technical decisions to set us up for long-term success, both as a product and as a team.
More about the CDP products:
- https://segment.com/product/connections/functions
- https://segment.com/blog/introducing-functions
- https://segment.com/blog/use-functions-to-customize-your-data-pipeline
- https://segment.com/docs/connections/sources/about-cloud-sources
- https://segment.com/docs/connections/destinations
Responsibilities
In this role, you’ll:
- Design and build the next generation of the Platform Extensibility platform, processing billions of events and powering a wide range of use cases across Twilio Data and Applications
- Build high-performance data pipelines using Go and Kafka (a minimal sketch of this kind of work follows this list)
- Ship features that favor high availability and throughput, accepting eventual consistency
- Work hands-on with large-scale data systems and data pipelines
- Architect reliable distributed systems, with an emphasis on high-volume data management
- Support the runtime for safely invoking untrusted code at the edges of the pipeline
- Support the reliability and security of the integrations platform
- Build and optimize globally available and highly scalable distributed systems
- Participate in an on-call rotation to support our business-critical infrastructure
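The Go and Kafka bullet above points at the kind of streaming work this team does day to day. As a purely illustrative sketch (not Twilio's actual code, topics, brokers, or event schema), here is a minimal consumer built on Segment's open-source kafka-go client that reads tracking events from a topic and decodes them:

```go
package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/segmentio/kafka-go"
)

// event is a hypothetical, simplified shape for an incoming tracking event.
type event struct {
	UserID string                 `json:"userId"`
	Type   string                 `json:"type"`
	Props  map[string]interface{} `json:"properties"`
}

func main() {
	// Brokers, topic, and group ID are placeholders for illustration only.
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "cdp-example-consumer",
		Topic:   "tracking-events",
	})
	defer r.Close()

	for {
		// ReadMessage blocks until a message arrives and, because a GroupID
		// is configured, commits the offset after each successful read.
		m, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatalf("read: %v", err)
		}

		var e event
		if err := json.Unmarshal(m.Value, &e); err != nil {
			// A real pipeline would route malformed events to a dead-letter
			// topic instead of just logging and skipping them.
			log.Printf("skip malformed event at offset %d: %v", m.Offset, err)
			continue
		}

		log.Printf("user=%s type=%s partition=%d offset=%d", e.UserID, e.Type, m.Partition, m.Offset)
	}
}
```

A production pipeline would add batching, retries, dead-letter handling, and metrics; the sketch only shows the overall shape of a Go consumer at the ingestion edge.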
Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required
- 8+ years of experience writing production-grade code in a modern programming language.
- Strong theoretical fundamentals and hands-on experience designing and implementing highly available and performant fault-tolerant distributed systems.
- Experience programming in one or more of the following: Go, Java, Scala, or similar languages
- Well-versed in concurrent programming, along with a solid grasp of Linux systems and networking concepts.
- Experience operating large-scale, distributed systems on cloud infrastructure such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
- Hands-on experience with container orchestration frameworks (e.g., Kubernetes, EKS, ECS).
- Experience shipping services following a CI/CD development paradigm.
- Deep understanding of architectural patterns of high-scale web applications (e.g., well-designed APIs, high volume data pipelines, efficient algorithms)
- Domain expertise in the modern data stack, with experience developing cloud-based data solution components and architecture covering data ingestion, data processing, and data storage.
- Excellent written and verbal communication skills.
Desired
- Experience with data platforms and warehouses.
Location
This role will be remote and based in India.
What we offer
There are many benefits to working at Twilio, including things like competitive pay, generous time-off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you?
We like to solve problems, take initiative, pitch in when needed, and try new things. That's why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.
So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.