Skill: PySpark
Experience: 6 to 9 years
Location: AIA Kochi
Responsibilities
- Develop and maintain scalable data pipelines using Python and PySpark.
- Collaborate with data engineers and data scientists to understand and fulfill data processing needs.
- Optimize and troubleshoot existing PySpark applications to improve performance.
- Write clean, efficient, and well-documented code following best practices.
- Participate in design and code reviews.
- Design and implement ETL processes to extract, transform, and load data.
- Ensure data integrity and quality throughout the data lifecycle.
- Stay current with the latest industry trends and technologies in big data and cloud computing.