
Hybrid Data Engineer at NationsBenefits, LLC

NationsBenefits, LLC · Hyderabad, India · Hybrid


About NationsBenefits:

At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization: transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.

As a Data Engineer, you will be responsible for requirement gathering, data analysis, and the development and implementation of orchestrated data pipeline solutions that support the organization's data-driven initiatives, ensure data accuracy, and enable data-driven decision-making across the organization. The ideal candidate will have 3-5 years of hands-on data engineering experience on high-performing teams. Expertise in DBT, Airflow, Azure Databricks, SQL, Python, PySpark, and automation is a must; knowledge of reporting tools is an added advantage.

Key Responsibilities:

  • 3 to 5 years of hands-on experience with DBT, Airflow, Azure Databricks, Python, PySpark, and SQL; candidates from the healthcare or fintech domain with an automation-first mindset are preferred.
  • Hands-on experience with data collection, data analysis, data modeling, and data processing using DBT, Airflow, Azure Databricks, PySpark, SQL, and Python.
  • Performance Optimization and Automation: Continuously monitor and optimize existing solutions; debug and resolve DAG failures.
  • Data Processing: Build robust data pipelines with CI/CD using the technology stack listed above.
  • Collaboration: Collaborate with cross-functional teams, including data scientists, business analysts, and stakeholders, to understand their data needs and deliver solutions.
  • Data Quality: Implement data validation and cleansing processes to ensure data accuracy, consistency, and reliability.
  • Influence: Propose the right solution for each use case and bring the team on board to adopt it.
  • Open to ad hoc data analysis and reporting/dashboard development: Perform exploratory data analysis, develop data visualizations, and generate actionable insights to support business decision-making.
  • Stay Current: Stay up to date with emerging trends and technologies in data engineering and analytics, and make recommendations for their adoption.

Requirements:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of hands-on experience with DBT, Airflow, Azure Databricks, PySpark, SQL, Python, and automation.
  • Flexibility to build data reports and dashboards using SQL, Python, and reporting tools.
  • Strong debugging and automation skills.
  • Strong understanding of data warehouse (DWH) and data lake concepts and methodologies.
  • Experience with cloud platforms such as Azure, AWS, or GCP.
  • Excellent communication, presentation, and interpersonal skills.
  • Knowledge of data quality, data validation, data security, and compliance standards is a plus.
  • Excellent problem-solving skills and attention to detail.