Data Engineer, Engineering at exl Service.com (India) Private Limited
exl Service.com (India) Private Limited · Noida, India · Hybrid
- Senior
- Office in Noida
Key Technologies & Skills:
- ~8 years of experience as a Cloud Architect with extensive expertise in designing and implementing scalable, secure cloud architectures on AWS, Azure, or GCP.
- Proven leadership in building enterprise-grade data platforms utilizing Databricks, Snowflake, Spark, and Data Lake/Warehouse solutions, supporting advanced analytics and AI/ML integrations.
- Research and development for custom architecture PoCs, including AI/ML model deployment, security enhancements, and scalable data workflows to meet evolving business needs.
- Hands-on experience in designing and deploying data pipelines with AWS (S3, Lambda, Glue, Data Pipeline), Databricks, and Snowflake for real-time streaming, batch processing, and analytics workloads.
- Expertise in cloud infrastructure provisioning using Terraform, CloudFormation, and Control Tower, ensuring multi-AZ, high availability, and multi-tenant architectures.
- Extensive experience in hybrid cloud architecture, enabling cloud-to-cloud integrations, cross-cloud data migration, and multi-cloud orchestration involving AWS, GCP, or Azure and on-prem systems.
- Proven ability to design and implement CI/CD pipelines using Jenkins, Git, Terraform, and CloudFormation, supporting automated deployment of applications and infrastructure.
- Development of SaaS solutions with secure provisioning, license management, and custom deployment architectures.
- Monitoring and logging expertise with Prometheus, Grafana, Elasticsearch, and Kibana, ensuring performance optimization, fault detection, and quick bug resolution.
- Data Architecture & Engineering: Design, develop, and optimize data pipelines; architect and implement data warehousing solutions on platforms such as AWS, Snowflake, and Databricks.
- Design scalable, distributed data processing systems using Apache Spark, Hadoop, or similar frameworks.
- Data Modeling & Storage: Develop and maintain logical and physical data models to support analytics, reporting, and ML use cases.
- Collaboration & Stakeholder Communication: Work closely with data scientists, analysts, and business teams to support data-driven decision-making; provide guidance on data best practices and performance tuning.
- Mentor and train junior data engineers and contribute to team growth.
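The pipeline work described above follows a standard extract-transform-load shape. As an illustrative sketch only (in the actual role this would run on AWS Glue or Databricks and load into Snowflake; the record fields here are hypothetical), the same shape in plain Python:

```python
# Toy batch ETL sketch. The extract/transform/load stages stand in for
# reading from S3, running a Glue/Databricks job, and writing to a warehouse.
import json

def extract(raw_lines):
    """Parse raw JSON lines into dicts (stand-in for reading from S3)."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Filter invalid rows and derive a revenue column."""
    out = []
    for r in records:
        if r.get("qty", 0) > 0 and "price" in r:
            r["revenue"] = r["qty"] * r["price"]
            out.append(r)
    return out

def load(records, warehouse):
    """Append to an in-memory 'table' (stand-in for a Snowflake load)."""
    warehouse.setdefault("sales", []).extend(records)
    return len(records)

raw = ['{"qty": 2, "price": 5.0}', '{"qty": 0, "price": 9.9}']
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
# loaded == 1: the zero-quantity row is dropped by the transform stage.
```

In a production pipeline each stage would additionally handle schema validation, idempotent re-runs, and dead-letter routing for malformed records.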
- Cloud Platforms: AWS, GCP, and Azure
- Data Platforms & Tools: Databricks, Snowflake, Hadoop, Spark, Data Lake, Data Warehouse
- DevOps & Automation: Terraform, CloudFormation, Kubernetes, Docker, CI/CD pipelines
- Monitoring & Logging: Prometheus, Grafana, ELK Stack
- Programming & Scripting: Python, Java, Bash, SQL
- Security & Compliance: SOC2, HIPAA, audit logs, security controls
Qualifications
Bachelor's or Master's degree in Engineering