
Azure Data Warehouse Engineer at LingaTech

LingaTech · Harrisburg, United States of America · Hybrid

MUST be a PA Resident
Location: Harrisburg, PA
Position Type: Hybrid
Hybrid Schedule: Onsite as needed, at least one day a month
Contract Length: 10+ months


This role supports the modernization of the Enterprise Data Warehouse (EDW) by designing, developing, and implementing advanced cloud-based data solutions in Azure. The Architect / Azure DW Developer will modernize the data architecture, support reporting and analytics needs, and deliver scalable solutions that improve data-driven decision-making across the organization.

Required Skills:
  • 5 years of technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python (illustrated in the sketch following this list).

  • 5 years of experience with design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in SQL Server and Azure Synapse.

  • 5 years of experience producing ETL/ELT using SQL Server Integration Services (SSIS) and other tools.

  • 5 years of experience with SQL Server, T-SQL, scripts, and queries.

  • 5 years of experience as an Azure DevOps CI/CD Pipeline Release Manager, including design, implementation, and maintenance of robust, scalable pipelines.

  • 5 years of experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering.

  • 5 years of experience in data engineering, database file system optimization, APIs, and analytics as a service.

  • 5 years of experience with data mining architecture, modeling standards, reporting, and data analysis methodologies.

  • 4-year college degree in Computer Science or related field (advanced study preferred).
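
As a rough illustration of the Databricks / Delta Lake / Apache Spark / Python skill set above, the sketch below shows a minimal PySpark load into a Delta table. It is a sketch under stated assumptions, not part of the posting: the storage paths, the claims dataset, and the cleansing rules are hypothetical, and the Delta write assumes a Databricks-style environment where Delta Lake support is built in.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical example: land raw records into a curated Delta Lake table.
    spark = (
        SparkSession.builder
        .appName("edw-modernization-sketch")
        .getOrCreate()
    )

    # Read raw CSV dropped by an upstream Data Factory pipeline
    # (container and path names are assumptions for illustration).
    raw = (
        spark.read
        .option("header", "true")
        .csv("abfss://landing@examplestorage.dfs.core.windows.net/claims/")
    )

    # Light cleansing: trim string columns, stamp a load date.
    cleaned = (
        raw.select([F.trim(F.col(c)).alias(c) for c in raw.columns])
           .withColumn("load_date", F.current_date())
    )

    # Append to a Delta table, partitioned for downstream Synapse/reporting queries.
    (
        cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("load_date")
        .save("abfss://curated@examplestorage.dfs.core.windows.net/claims_delta/")
    )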

Duties:
  • Design, develop, test, and implement data lakes, databases, ELT programs, applications, and reports in Azure.

  • Modernize the EDW into Microsoft Azure Cloud using Databricks, Delta Lake, Synapse, Data Factory, Pipelines, Apache Spark, and Python.

  • Plan, organize, prioritize, and manage data warehouse development and modernization efforts.

  • Collaborate with business analysts, application developers, DBAs, and system staff to meet project objectives.

  • Gather and analyze business and technical requirements to design optimized data solutions.

  • Perform research on potential technologies and recommend solutions to support data modernization.

  • Support large-scale data processing, statistical analysis, and reporting for enterprise-wide initiatives.

  • Develop centralized data models and ensure compliance with federal and organizational data standards.

  • Build and maintain relationships with stakeholders, presenting technical solutions clearly to varied audiences.

  • Create, review, and maintain technical documentation, flowcharts, diagrams, test plans, and code reviews.

  • Implement and maintain CI/CD pipelines using Azure DevOps, including automation of builds, tests, and deployments (a minimal quality-gate sketch follows this list).

  • Conduct testing, quality assurance reviews, and performance optimization of implemented solutions.

  • Provide knowledge transfer, training, and procedural documentation for ongoing system maintenance.

  • Track progress, submit status reports, and ensure timely project deliverables.
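
As a hedged illustration of the CI/CD and quality-assurance duties above, the sketch below shows a small Python data-quality gate that an Azure DevOps release stage could run after a load. The Delta table path, the claim_id key column, and the checks themselves are hypothetical assumptions, not requirements from the posting.

    import sys
    from pyspark.sql import SparkSession, functions as F

    def run_checks(table_path: str) -> list:
        """Return a list of failed-check descriptions for the given Delta table."""
        # Assumes a Databricks-style environment with Delta Lake available.
        spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()
        df = spark.read.format("delta").load(table_path)
        failures = []
        # Check 1: the load must not be empty.
        if df.count() == 0:
            failures.append("table is empty")
        # Check 2: the key column must not contain nulls
        # (column name claim_id is an assumption for illustration).
        null_keys = df.filter(F.col("claim_id").isNull()).count()
        if null_keys > 0:
            failures.append(f"{null_keys} rows with null claim_id")
        return failures

    if __name__ == "__main__":
        failed = run_checks(sys.argv[1])
        if failed:
            print("Data-quality gate failed:", "; ".join(failed))
            sys.exit(1)  # non-zero exit fails the pipeline stage
        print("Data-quality gate passed")

A release stage would invoke the script with the table path as an argument; the non-zero exit code fails the stage and blocks promotion, which is one plausible way to automate the testing and deployment gating described above.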
