
Data Engineer - DBT at Charger Logistics Inc.

Charger Logistics Inc. · Brampton, Canada · Onsite


Charger Logistics Inc. is a world-class asset-based carrier with locations across North America. With over 20 years of experience providing the best logistics solutions, Charger Logistics has grown into a leading transport provider and continues to grow.

Charger Logistics invests time and support in its employees, giving them room to learn, grow their expertise, and work their way up. We are an entrepreneurial-minded organization that welcomes and supports individual ideas and strategies. We are seeking a skilled Data Engineer with strong DBT (data build tool) experience to join our modern data stack team. The successful candidate will apply DBT, Python, and SQL expertise to build scalable, maintainable data transformation pipelines that power our analytics and business intelligence initiatives.

Responsibilities:

  • Develop and maintain data transformation models using DBT for scalable analytics workflows
  • Build reusable, modular SQL transformations following DBT best practices and software engineering principles
  • Implement data quality tests and documentation within the DBT framework
  • Design and optimize complex SQL queries for data modeling and transformation
  • Create Python applications for data ingestion, API integrations, and pipeline orchestration
  • Collaborate with analytics teams to translate business requirements into robust data models
  • Implement version control workflows and CI/CD processes for DBT projects
  • Monitor data pipeline performance and implement optimization strategies
  • Establish data lineage tracking and impact analysis using DBT's built-in capabilities
  • Mentor team members on DBT development patterns and SQL optimization techniques
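To give a flavor of the data-quality work described above, the sketch below re-creates in plain Python the logic behind DBT's built-in `not_null` and `unique` schema tests (a test passes when the returned list is empty). This is an illustration only; the table and column names are hypothetical, not Charger Logistics data.

```python
# Illustrative sketch of the checks DBT's not_null/unique schema tests
# compile into SQL. All table/column names here are hypothetical.

def not_null(rows, column):
    """Return rows where the column is NULL; the test passes when empty."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values that appear more than once; the test passes when empty."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

shipments = [
    {"shipment_id": 1, "carrier": "Charger"},
    {"shipment_id": 2, "carrier": None},
    {"shipment_id": 2, "carrier": "Charger"},
]

print(not_null(shipments, "carrier"))    # → [{'shipment_id': 2, 'carrier': None}]
print(unique(shipments, "shipment_id"))  # → [2]
```

In a real DBT project these checks would be declared in a model's YAML file rather than hand-written, and DBT would generate and run the equivalent SQL against the warehouse.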

Requirements

Required Qualifications:

  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
  • 2+ years of hands-on experience with DBT (data build tool) development and deployment
  • Expert-level SQL skills including CTEs, window functions, and advanced analytical queries
  • Strong Python programming experience, particularly for data processing and automation
  • Experience with modern data warehouses (Snowflake, BigQuery, Redshift, or Databricks)
  • Solid understanding of dimensional modeling and data warehouse design patterns
  • Experience with version control (Git) and collaborative development workflows
  • Knowledge of data testing strategies and data quality frameworks

Preferred Qualifications:

  • DBT certification or demonstrated advanced DBT knowledge
  • Experience with cloud data platforms and their native services
  • Familiarity with workflow orchestration tools (Airflow, Prefect, Dagster)
  • Knowledge of data visualization tools (Looker, Tableau, Power BI)
  • Experience with streaming data processing frameworks
  • Understanding of DataOps and analytics engineering principles
  • Experience with Infrastructure as Code (Terraform, CloudFormation)

Technical Skills:

  • DBT: Model development, testing, documentation, macros, packages, deployment
  • SQL: Advanced querying, performance optimization, data modeling
  • Python: pandas, SQLAlchemy, requests, data pipeline frameworks
  • Data Warehouses: Snowflake, BigQuery, Redshift, or similar cloud platforms
  • Tools: Git, Docker, CI/CD pipelines, data orchestration platforms
  • Concepts: Dimensional modeling, data testing, analytics engineering, DataOps
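As a minimal sketch of the Python ingestion work implied by the skills above (standard library only; the source data and column names are made up for illustration), a typical pre-warehouse step normalizes raw column names and parses types before DBT staging models take over:

```python
import csv
import io
from datetime import date

# Sketch only: normalize raw column names and parse types before loading
# into the warehouse. The CSV content and column names are hypothetical.
raw_csv = "Shipment ID,Pickup Date\n1001,2024-01-05\n1002,2024-01-06\n"

def normalize(name):
    """Lowercase a raw header and replace spaces with underscores."""
    return name.strip().lower().replace(" ", "_")

rows = []
for record in csv.DictReader(io.StringIO(raw_csv)):
    row = {normalize(k): v for k, v in record.items()}
    row["shipment_id"] = int(row["shipment_id"])
    row["pickup_date"] = date.fromisoformat(row["pickup_date"])
    rows.append(row)

print(rows[0])  # → {'shipment_id': 1001, 'pickup_date': datetime.date(2024, 1, 5)}
```

In practice this step would more likely use pandas or a pipeline framework, with the cleaned output landing in a warehouse table that DBT models reference as a source.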

What You'll Build:

  • Scalable DBT models that transform raw data into analytics-ready datasets
  • Automated data quality tests and monitoring systems
  • Self-documenting data pipelines with clear lineage and dependencies
  • Reusable data transformation components and macros
  • Robust CI/CD workflows for data model deployment

Benefits

  • Competitive Salary
  • Healthcare Benefits Package
  • Career Growth