- Junior
- Office in Coimbatore
Bosch Global Software Technologies Private Limited is a wholly owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT, and Business Solutions. With over 28,200 associates, it is the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe, and the Asia Pacific region.
Job Description:
Responsibilities:
- Work responsibly and independently in the field of data engineering.
- Design, develop, and maintain data pipelines and ETL processes to ingest, process, transform and store large volumes of data from diverse sources.
- Collaborate with the business stakeholders to understand their data requirements and provide data-driven solutions and insights for decision making.
- Optimize and tune data pipelines and database performance for scalability, efficiency, and reliability.
- Implement and maintain data warehouses, data lakes, and other data storage solutions for efficient data retrieval and analysis.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Create data visualizations and reports to communicate findings effectively to technical and non-technical audiences.
- Stay updated with the latest data engineering technologies, tools, and best practices.
- Interest in Data (Analysis, Schema, Validation, Visualization)
- Should be able to handle both big data and small data.
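The pipeline responsibilities above follow the standard extract-transform-load pattern. A minimal sketch of that pattern, using only the Python standard library (with hypothetical sample records, and in-memory SQLite standing in for a real data warehouse), might look like:

```python
import sqlite3

# Hypothetical raw records, standing in for data ingested from diverse sources.
raw_orders = [
    {"order_id": "1", "amount": "250.00", "region": "south"},
    {"order_id": "2", "amount": "ninety", "region": "north"},  # fails validation
    {"order_id": "3", "amount": "120.50", "region": "south"},
]

def transform(records):
    """Validate and normalize records, dropping rows that fail quality checks."""
    clean = []
    for rec in records:
        try:
            amount = float(rec["amount"])  # data-quality check: amount must be numeric
        except ValueError:
            continue  # a production pipeline would quarantine bad rows instead
        clean.append((int(rec["order_id"]), amount, rec["region"].upper()))
    return clean

def load(rows, conn):
    """Load transformed rows into a warehouse table (here, in-memory SQLite)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 370.5
```

In practice the same extract/transform/load stages would be expressed with PySpark transformations and scheduled as Airflow tasks, but the structure, including validation before load, is the same.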
Required Skills:
- Proven experience as a Data Engineer, showcasing expertise in both domains.
- Proficiency in programming languages such as Python with proper coding standards.
- Strong knowledge of PySpark, Impala, Hive, and Tableau.
- Strong experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Knowledge of data modeling, data warehousing, and data architecture.
- Experience with big data processing frameworks (e.g., Hadoop, Spark).
- In-depth knowledge of orchestration and scheduling jobs for data engineering pipelines with Apache Airflow.
- Experience with Cloud-based data solutions (e.g., AWS, Azure, GCP) is advantageous.
- Experience with Jira, Git, and Bitbucket.
- Excellent problem-solving and analytical skills.
- Strong communication and presentation skills.
- Ability to work independently and collaboratively in a team environment.
Qualifications:
Bachelor’s or Master’s degree in Computer Science or a related field.
Additional Information: 4+
Apply now