Role: Senior Data Engineer
Experience Required: 8+ years
Mode of work: Remote
Skills Required: Azure Databricks, Azure Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners only (able to join by September 15, 2025); permanent or contract role
Responsibilities
- Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
- Gather and analyze data requirements from business stakeholders and identify opportunities for data-driven insights.
- Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks (see the illustrative sketch after this list).
- Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
- Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
- Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
- Provide technical guidance and expertise to junior data engineers and developers.
- Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
- Contribute to the continuous improvement of data engineering processes, tools, and best practices.
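
For illustration of the kind of pipeline work described above, here is a minimal sketch of a streaming ingestion job, assuming a Databricks runtime, an Azure Event Hubs source read through its Kafka-compatible endpoint, and a Delta table sink. The namespace, event hub name, schema, checkpoint path, and table name are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch (assumed names throughout): stream events from an Azure Event Hubs
# Kafka-compatible endpoint into a Delta table on a Databricks cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("eventhub-ingest").getOrCreate()

# Hypothetical Event Hubs namespace, event hub (topic), and connection string.
BOOTSTRAP = "my-namespace.servicebus.windows.net:9093"
TOPIC = "telemetry"
CONNECTION = "Endpoint=sb://..."  # keep in a Databricks secret scope in practice

# Assumed JSON payload schema for the incoming events.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

# Read the stream through the Kafka source, authenticating with the
# "$ConnectionString" / SASL PLAIN pattern used by Event Hubs' Kafka endpoint.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", BOOTSTRAP)
    .option("subscribe", TOPIC)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{CONNECTION}";',
    )
    .load()
)

# Parse the binary Kafka value into typed columns.
parsed = raw.select(
    F.from_json(F.col("value").cast("string"), schema).alias("e")
).select("e.*")

# Append the parsed events to a (hypothetical) bronze Delta table.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")
    .outputMode("append")
    .toTable("bronze.telemetry_events")
)
query.awaitTermination()
```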
Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience as a Data Engineer, Software Engineer, or in a similar role, with a focus on building cloud-based data solutions.
- Strong knowledge of and hands-on experience with the Azure cloud platform, Databricks, Event Hubs, Kafka, Spark, ETL pipelines, architecture, Python/PySpark, SQL, and Copilot Studio.
- Strong experience with cloud platforms such as Azure.
- Experience with big data systems, including Apache Spark and Kafka.
- Experience contributing to the architecture and design of large-scale distributed systems.
- Expertise in Databricks Lakehouse Platform, its architecture, and its capabilities.
- Experience building production pipelines using Databricks and Azure services.
- Experience with multiple coding languages, such as Python and SQL.