Company Description
We're Nagarro.
We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital mediums, and our people exist everywhere in the world (18,500+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates proceed in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further to succeed at an even higher level? Yes? You may be ready to join us.
Job Description
As a Senior Developer, you will work closely with cross-functional teams to deliver high-quality solutions in domains such as Supply Chain, Finance, Operations, Customer Experience, HR, Risk Management, and Global IT.
Responsibilities:
- Develop a comprehensive technical plan for the migration, including data ingestion, transformation, storage, and access control in Azure Data Factory and Azure Data Lake
- Design and implement scalable and efficient data pipelines to ensure smooth data movement from multiple sources using Azure Databricks (see the sketch after this list)
- Develop scalable and reusable frameworks for ingesting data sets
- Ensure data quality and integrity throughout the entire data pipeline, implementing robust data validation and cleansing mechanisms
- Work with event-based and streaming technologies to ingest and process data
- Provide technical guidance and support to the team, resolving any technical challenges or issues that may arise during the migration and post-migration phases
- Stay up to date with the latest advancements in cloud computing, data engineering, and analytics technologies, and recommend best practices and industry standards for implementing the data lake solution.
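As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of the kind of ingestion-and-cleansing job this role might build on Azure Databricks. The data lake paths, container names, and column names are hypothetical placeholders, not part of any actual Nagarro or client system.

```python
# Minimal, illustrative PySpark ingestion sketch (paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingestion").getOrCreate()

# Read raw CSV files landed by an upstream process (hypothetical ADLS Gen2 path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Basic validation and cleansing: drop rows missing the key, parse the timestamp,
# and remove duplicate records.
cleaned = (
    raw.dropna(subset=["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
)

# Write to the curated zone as Parquet, partitioned by ingestion date.
(
    cleaned.withColumn("ingest_date", F.current_date())
    .write.mode("append")
    .partitionBy("ingest_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```

On Databricks, the `spark` session is provided by the runtime, so the builder call simply returns the existing session; the rest of the sketch runs unchanged in a Databricks notebook.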
Qualifications
- 8 to 10 years of IT experience, with at least 4 years working with Azure Databricks.
- Experience in data modeling and source system analysis.
- Strong knowledge of PySpark, SQL, and data engineering using Python.
- Proficiency with Azure Data Factory, Azure Data Lake, Azure SQL DW, and Azure SQL.
- Ability to conduct data profiling, cataloging, and mapping for the design and construction of data flows.
- Experience with data visualization and exploration tools.
- Strong technical leadership and guidance skills.
- Excellent problem-solving skills, with the ability to translate business requirements into data solutions.
- Effective communicator, able to explain complex concepts to both technical and non-technical stakeholders.
- Collaborative team player with strong interpersonal skills.
- Ability to manage multiple projects and deliver quality results within deadlines.
Additional Information
The following will be considered an advantage but are not required:
- Microsoft Certified: Azure Data Engineer Associate
- Experience preparing data for Data Science and Machine Learning
- Knowledge of Jupyter Notebooks or Databricks Notebooks for Python development