Hybrid Data Engineer at RADcube, LLC

RADcube, LLC · Indianapolis, United States · Hybrid

Description

Essential Duties/Responsibilities:

- Work closely with solution leads, project managers, data architects, data scientists, and data analysts on solution design, architecture, and implementation

- Perform extraction, transformation, and loading of data from a wide variety of data sources using various data engineering tools and methods

- Design and implement data solutions for operational and secure integration across systems

- Assist, with guidance and oversight from data architects, in creating database models, architecture designs, and documentation

- Conduct research and development and contribute to the long-term positioning of emerging technologies related to data sourcing, cleansing, and integration

- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code

- Improve operations by conducting systems analysis and recommending changes in policies and procedures

- Participate in requirements gathering and solution reviews, and explain technical complexities and business benefits in layperson's terms

Job Requirements: 

- Bachelor’s degree in Computer Science, Engineering or a similar field is required 

- 3+ years of data engineering, software engineering, or similar experience

- 2+ years of hands-on industry experience working with SQL on various relational database platforms (Microsoft, Oracle, Hana, Postgres, etc.)

- 2+ years of hands-on industry experience working with enterprise ETL/DW tools like Azure Data Factory, Redshift, Informatica, etc.

- Hands-on experience with aspects of data engineering design and implementation including data sourcing, data modeling of warehouses/marts/repositories, data integration/transformation/ETL, APIs, reporting, business intelligence and analytics 

- Hands-on experience with modern programming languages like Python, C#, JavaScript, etc.

- Hands-on experience with cloud platforms like AWS, Azure, GCP, etc. 

- Experience with Docker for containerization and Kubernetes for orchestration is a plus

- Collaborative team player who is detail-oriented and focused on solution quality and execution

- Progressive mindset, particularly around deployment models and emerging technologies
