
Data Engineer at Cronos Europa

Cronos Europa · Brussels, Belgium · Onsite


Responsibilities:

  • Develop, deploy, and maintain scalable data pipelines from REST APIs and databases using Python, PySpark, Azure Synapse, Knime, SQL, and ETL tools to ingest, transform, and prepare data.
  • Process and transform complex JSON and GIS data into structured datasets optimized for analysis and reporting, including parsing, transforming, and validating JSON data to ensure quality and consistency.
  • Load and organize processed data into cloud-based Azure Data Lake Storage, ensuring accessibility and performance.
  • Document ETL processes, metadata definitions, data lineage, and technical specifications to ensure transparency and reusability.
  • Collaborate with data analysts, BI developers, and business stakeholders to understand data requirements and deliver reliable, well-documented datasets aligned with organizational needs.
  • Implement data quality checks, logging, and monitoring within data pipelines to support maintainability and troubleshooting (see the illustrative sketch after this list).
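
Purely as an illustration of the kind of pipeline step described above, the sketch below fetches JSON from a REST endpoint, applies a basic data quality check with logging, and writes the valid records to a landing file. The endpoint URL, field names, and output path are hypothetical; the actual environment additionally involves PySpark, Azure Synapse, and Azure Data Lake Storage.

```python
# Illustrative sketch only: a minimal Python ingestion step of the kind described above.
# The endpoint URL, field names, and output path are hypothetical placeholders.
import json
import logging
from urllib.request import urlopen

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

API_URL = "https://example.org/api/v1/records"    # hypothetical REST endpoint
OUTPUT_PATH = "landing/records.jsonl"             # stands in for a data-lake landing zone
REQUIRED_FIELDS = {"id", "timestamp", "geometry"} # hypothetical schema


def fetch_records(url: str) -> list[dict]:
    """Call the REST API and parse the JSON payload into a list of records."""
    with urlopen(url) as resp:
        payload = json.load(resp)
    # Assumes the payload wraps records in an "items" array (hypothetical).
    return payload.get("items", [])


def validate(record: dict) -> bool:
    """Basic data quality check: all required fields must be present."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        log.warning("Dropping record %s: missing fields %s", record.get("id"), missing)
        return False
    return True


def main() -> None:
    records = fetch_records(API_URL)
    valid = [r for r in records if validate(r)]
    log.info("Fetched %d records, %d passed validation", len(records), len(valid))
    with open(OUTPUT_PATH, "w", encoding="utf-8") as fh:
        for record in valid:
            fh.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    main()
```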

Technical skills:

  • Excellent knowledge of data engineering tools such as Python, PySpark, and Azure Synapse Analytics
  • Excellent knowledge of working with REST APIs, including ingestion and parsing of JSON and GIS data
  • Excellent knowledge of Azure Data Lake Storage and Oracle databases
  • Ability to implement robust data quality checks, logging, and monitoring in ETL processes
  • Ability to document ETL workflows, metadata, and technical specifications clearly and consistently
  • Familiarity with DevOps and version control best practices, and experience with CI/CD pipelines
  • Experience working in an Agile/Scrum framework
  • Strong analytical and problem-solving skills
  • Good communication skills and the ability to participate in technical meetings

Profile:

  • Bachelor’s or Master’s degree in IT, Business Management, or a related field.
  • The following specific expertise is mandatory for the performance of the tasks:
  • At least 5 years of experience with Azure Data Lake Storage and Oracle databases
  • At least 5 years of expertise in developing data pipelines from REST APIs and in data integration (using tools such as PySpark, Python, Azure Synapse, SQL, and Knime)
  • At least 5 years of expertise in processing JSON and GIS data

Why Cronos Group?

We offer you:

  • An attractive salary package
  • An environment with a good work-life balance
  • The assurance of working with cutting-edge technologies in an entrepreneurial spirit
  • The opportunity to develop your skills through tailor-made training courses adapted to your needs
  • A good job in a friendly place

If you wish to join a dynamic, human-scale organization while working with the latest technologies, don't wait any longer and join Cronos!

Apply Now
