- Senior
- Optional office in Chennai
Job Summary:
The Big Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure to process and analyze massive datasets. This role supports data-driven decision-making by ensuring data is clean, accessible, and optimized for analytics and machine learning applications.
Key Responsibilities:
- Data Architecture & Engineering:
- Design and implement big data solutions using Hadoop, Spark, and other distributed systems.
- Develop, construct, test, and maintain scalable data architectures and processing systems.
- Integrate data from various sources into centralized data lakes or warehouses.
- ETL & Data Pipeline Development:
- Build robust ETL pipelines for batch and real-time data ingestion.
- Optimize data flow and processing for performance and scalability.
- Collaboration & Integration:
- Work closely with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
- Collaborate with architects to ensure systems align with enterprise architecture.
- Monitoring & Maintenance:
- Monitor data systems for performance, reliability, and security.
- Troubleshoot and resolve data-related issues.
- Governance & Compliance:
- Implement data governance, privacy, and security protocols.
- Ensure compliance with regulatory standards.
Required Skills:
- Proficiency in big data technologies: Hadoop, Spark, Hive, Pig, Impala.
- Experience with NoSQL databases: MongoDB, Cassandra, HBase.
- Strong programming skills: Python, Java, Scala, or Shell scripting.
- Familiarity with data modeling, warehousing, and MPP platforms.
- Knowledge of cloud platforms (AWS, Azure, GCP) and data services.
- Experience with tools like Kafka, Flume, Airflow, and data visualization platforms.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 2–5+ years of experience in data engineering or big data roles.
- Certifications in Big Data technologies or cloud platforms are a plus.
Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Ability to manage multiple priorities and work independently.
Work Environment: Hybrid
Apply now