- Professional
- Office in Noida
Version 1 has celebrated over 28 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We’re also an award-winning employer reflecting how employees are at the heart of Version 1.
We’ve been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023 and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.
As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values, and using these we've seen significant growth across our practices. Our Digital, Data and Cloud team is preparing for the next phase of expansion, creating new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.
Job Description
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud hosted data platform for a key Version 1 customer.
The ideal candidate will have a proven track record as a senior, self-starting data engineer implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, such as Informatica, Talend, Azure Data Factory, Snowflake, Azure Synapse Analytics, or Amazon Redshift, to play an important role in developing and delivering early proofs of concept and production implementations.
You will ideally have experience building solutions using a variety of tools, and a proven track record of delivering high-quality work to tight deadlines.
Your main responsibilities will be:
- Design, develop, and maintain scalable data warehouse solutions using modern cloud-based platforms such as Snowflake, Azure Synapse Analytics, or Amazon Redshift to support enterprise-level analytics and reporting needs.
- Build efficient Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines using tools like Informatica, Talend, or Azure Data Factory; conduct thorough data validation to ensure accuracy and consistency; and support Business Intelligence (BI) reporting initiatives.
- Apply strong proficiency in SQL for querying and data manipulation, along with experience in dimensional data modeling, to optimize analytical performance.
- Use reporting and visualization tools such as Power BI or Tableau.
- Developing scalable and reusable frameworks for the ingestion and transformation of large data sets
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
- Working with other members of the project team to support delivery of additional project components (Reporting tools, API interfaces, Search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Bachelor's or Master's degree in Computer Science, Business Information Systems, Mathematics, or a related quantitative field
- 3+ years of experience in data warehouse (DWH) implementation
- Hands-on experience with Structured Query Language (SQL) and data quality management tools, and proficiency in facilitating data exchange through Application Programming Interfaces (APIs)
- Deep knowledge of operating systems and relational databases, with advanced SQL and Python skills
- Proficiency in data concepts related to curation, archival, analysis, visualization, and integrity
- Good knowledge of at least one business intelligence solution (e.g., Power BI)
- Strong analytical, critical-thinking, and problem-solving skills
- Excellent communication skills and team-oriented approach
- Experience leading a minimum of two mid-scale Data Warehouse projects
- Direct experience building data pipelines using Azure Data Factory and Databricks
- Experience building data warehouse solutions using ETL/ELT tools like Informatica, Talend, or Azure Data Factory
- Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions: standardization, transformation, rationalization, linking, and matching.
Nice to have
- Knowledge of data governance principles and metadata management practices is considered a valuable asset.
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability.
One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology.
We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat.
Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
Apply now