See yourself at Twilio
Join the team as our next Security Data Engineer on Twilio’s Information Security Team!
Who we are & why we’re hiring
Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.
Although we're headquartered in San Francisco, we have a presence throughout South America, Europe, Asia, and Australia. We're on a journey to becoming a global company that actively opposes racism and all forms of oppression and bias. At Twilio, we support diversity, equity & inclusion wherever we do business.
About the job
This position is needed to build and maintain Twilio's Security Data Lake, which centralizes security data from sources across a vast portfolio of security tooling. That data must be ingested, analyzed, and visualized to support a variety of executive reporting requirements and to monitor the health of core security operations.
As a Data Engineer, you will be responsible for building and managing security data infrastructure that supports efficient extraction, transformation, and loading of data from a variety of sources. You will also partner with cross-functional stakeholders across the organization to iteratively visualize key metrics and to ensure the completeness, quality, and accuracy of those metrics.
Responsibilities
In this role, you’ll:
- Integrate data from multiple sources including databases, APIs, logs, and security tools, ensuring data quality and consistency.
- Design, develop, and optimize Extract, Load, Transform (ELT) processes to move and transform data from source systems to target systems efficiently.
- Implement and maintain data warehouses or data lakes, optimizing for storage, query performance, and accessibility for analytics and reporting.
- Establish and enforce data governance policies and best practices to ensure data security, privacy, quality, and compliance with regulatory requirements.
- Monitor and optimize the performance of data pipelines, databases, and data processing frameworks to meet SLAs and handle growing data volumes.
- Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver solutions.
- Document data engineering processes, architecture, and best practices.
- Troubleshoot data-related issues, providing timely resolution and support to maintain data availability and reliability.
- Implement automation and orchestration tools to streamline data engineering workflows, reduce manual intervention, and improve efficiency.
Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having the “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required:
- 5+ years of experience in data engineering or a data analytics field.
- Proficiency in programming languages such as Python, SQL, and/or Scala.
- Experience with distributed systems, such as Hadoop, Spark, or other big data frameworks.
- Experience with cloud-based data platforms, such as AWS or Google Cloud Platform.
- Experience with modern data warehousing technologies, such as Snowflake or BigQuery.
- Experience with ELT/ETL processes and orchestration tools, such as Airflow.
- Experience in ingesting data from APIs and proficiency in parsing JSON, XML, and other data formats returned by APIs.
- Knowledge of data governance principles, data security, and compliance.
- Strong cross-team communication and collaboration skills.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
Desired:
- Experience with visualization tools such as Tableau, including knowledge of Tableau products such as Tableau Server, Tableau Public, and Tableau Desktop.
- Experience building data pipelines with dbt (data build tool).
- Experience with DevOps tools, such as Git, GitLab, and Terraform.
- Bachelor's degree in Computer Science, Engineering, or related field; Master's degree preferred.
Location
This role will be remote and based in Colombia.
Travel
We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may occasionally be required to travel to participate in in-person project or team meetings.
What We Offer
There are many benefits to working at Twilio: in addition to competitive pay, we offer generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you?
We like to solve problems, take initiative, and pitch in when needed, and we're always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.
So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!
If this role isn't what you're looking for, please consider other open positions.