Company Description
Spendesk is a 7-in-1 spending solution built for finance teams to make faster, smarter spending decisions. Founded in 2016, Spendesk is now one of the fastest-growing fintechs in Europe, with over 4,000 customers and an international team of 500+ employees based in Paris, Berlin, London, Hamburg, and remote. We’ve raised over €260M from leading investors and been named a French tech unicorn. And we’re not stopping there!
We are looking for a Data Engineering expert with an entrepreneurial, performance-driven mindset and strong experience implementing and delivering data analytics solutions to join us as a Data Engineer.
We offer you a flexible and dynamic environment with opportunities to go beyond your comfort zone to grow personally and professionally.
PROJECT
Spendesk Financial Services (SFS) is a payments institution that offers a platform for embedding financial services into the Spendesk product. SFS offers capabilities such as accounts management, KYC, card payments, wire transfers, etc. SFS projects can be divided into three major categories:
- To develop our core banking system to replace the legacy solution
- To migrate clients from legacy to the new core banking system
- To extend the core banking system with better payment capabilities
Job Description
- Contributing to investigations of new technologies and the design of complex solutions, supporting a culture of innovation with attention to security, scalability, and reliability, with a focus on building out our ETL processes
- Working with a modern data stack, producing well-designed technical solutions and robust code, and implementing data governance processes
- Collaborating and communicating professionally with the customer’s team
- Taking responsibility for delivering major solution features
- Participating in requirements gathering and clarification, proposing optimal architecture strategies, and leading the data architecture implementation
- Developing core modules and functions, designing scalable and cost-effective solutions
- Performing code reviews, writing unit and integration tests
- Scaling the distributed system and infrastructure to the next level
- Building a data platform on the AWS cloud
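The responsibilities above center on building out ETL processes. As a minimal, tool-agnostic sketch (all record shapes and names here are hypothetical illustrations, not Spendesk's actual stack), an extract-transform-load step in Python might look like:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    """A hypothetical card-payment record flowing through the pipeline."""
    account_id: str
    amount_cents: int
    currency: str

def extract(raw_rows):
    """Extract: parse raw dict rows (e.g. from object storage) into typed records."""
    return [Transaction(r["account_id"], int(r["amount_cents"]), r["currency"])
            for r in raw_rows]

def transform(txns):
    """Transform: aggregate spend per account, a typical warehouse rollup."""
    totals = {}
    for t in txns:
        totals[t.account_id] = totals.get(t.account_id, 0) + t.amount_cents
    return totals

def load(totals, sink):
    """Load: write aggregates to a sink (an in-memory stand-in for a warehouse table)."""
    sink.update(totals)
    return sink

raw = [
    {"account_id": "a1", "amount_cents": "1200", "currency": "EUR"},
    {"account_id": "a1", "amount_cents": "300", "currency": "EUR"},
    {"account_id": "b2", "amount_cents": "500", "currency": "EUR"},
]
warehouse = load(transform(extract(raw)), {})
# warehouse now holds total spend in cents per account
```

In a production pipeline each stage would read from and write to real systems (e.g. S3, Snowflake), but the extract/transform/load separation stays the same.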
Qualifications
- 3+ years of strong experience using Python for data pipelines and related tooling
- Familiarity with distributed data processing in Spark, including data pipeline optimization and workload monitoring
- Experience with Snowflake
- Proven track record of building data transformations in dbt (data build tool)
- Excellent command of data modeling and data warehousing best practices
- Developer-level proficiency with Looker
- Strong data domain background: knowledge of how data engineers, data scientists, analytics engineers, and analysts work, in order to collaborate closely with them and understand their needs
- Good written and spoken English communication skills
- Software engineering best practices: Testing, PRs, Git, code reviews, code design, releasing
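The testing practices listed above apply directly to data transformations. As an illustrative sketch (the function and its behavior are hypothetical, not part of this role's codebase), a unit-tested transformation might look like:

```python
def normalize_amount(amount_cents, fx_rate=1.0):
    """Hypothetical transformation: convert an integer cents amount
    into a float amount in a base currency, applying an FX rate."""
    if amount_cents < 0:
        raise ValueError("negative amount")
    return round(amount_cents * fx_rate / 100, 2)

# Unit tests as plain assertions; in practice these would live in a pytest suite.
assert normalize_amount(1250) == 12.50
assert normalize_amount(1000, fx_rate=1.1) == 11.00
try:
    normalize_amount(-1)
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError for negative input")
```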
WOULD BE A PLUS
- Data certifications in Data Engineering or Data Analytics
- Experience with Databricks and Airflow
- Experience with DAGs and orchestration tools
- Experience with developing Snowflake-driven data warehouses
- Experience with developing event-driven data pipelines
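Since DAGs and orchestration tools (such as Airflow) appear in the nice-to-haves, here is a minimal, tool-agnostic sketch of the underlying idea: tasks form a directed acyclic graph and run in dependency order. The task names are made up for illustration; real orchestrators add scheduling, retries, and parallelism on top of this.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how an orchestrator wires upstream/downstream operators.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A valid execution order that respects every dependency edge.
run_order = list(TopologicalSorter(dag).static_order())
```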