
Data Engineer at VieMed

VieMed · Lafayette, United States of America · Hybrid


Essential Duties and Responsibilities:

Engineering & Development:

  • Build and maintain scalable ETL/ELT pipelines for batch and streaming data.
  • Write high-quality SQL and Python for data transformations and integrations.
  • Optimize data models, queries, and workflows for performance and cost efficiency.
  • Partner with analysts and data scientists to deliver reliable datasets for BI, analytics, and ML.

Architecture & Infrastructure:

  • Contribute to the design of modern data warehouse and lakehouse solutions (Snowflake, Redshift, BigQuery, Databricks, etc.).
  • Implement data orchestration and transformation frameworks (Airflow, dbt, Prefect).
  • Support data infrastructure in the cloud (AWS/GCP/Azure), ensuring scalability, reliability, and security.

Mentorship & Collaboration:

  • Provide technical guidance and informal mentoring to less experienced data engineers.
  • Participate in code reviews, design discussions, and knowledge-sharing sessions.
  • Promote best practices in coding standards, documentation, and testing.

Quality & Operations:

  • Establish and maintain standards for data quality, observability, and reliability.
  • Troubleshoot pipeline and infrastructure issues in production environments.
  • Contribute to automation and monitoring efforts to reduce manual operations.

Minimum Qualifications:

Experience:

  • 4–6+ years of experience in data engineering or related fields.
  • Prior experience collaborating with and mentoring junior engineers.

Technical Skills:

  • Advanced SQL and query optimization.
  • Strong Python for data engineering (Pandas, PySpark, or similar).
  • Hands-on experience with data orchestration tools (Airflow, dbt, Prefect).
  • Familiarity with modern cloud-based data platforms (AWS Redshift, Snowflake, BigQuery, Synapse, Databricks).
  • Strong understanding of data modeling and schema design.

Preferred Knowledge, Skills and Abilities:

  • Experience with streaming technologies (Kafka, Kinesis, Pub/Sub).
  • Familiarity with CI/CD and version control workflows.
  • Exposure to data governance and security frameworks.


You will be expected to work during normal business hours, which are Monday through Friday, 8:00 a.m. – 5:00 p.m. Please note this job description is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the employee for this job. Duties and responsibilities may change at any time with or without notice.


