ETL Engineer - Integrations & Data at Onshore Outsourcing
Onshore Outsourcing · Glennville, United States of America · Hybrid
- Professional
- Office in Glennville
Job Title: ETL Engineer - Integrations & Data
Location: Remote (U.S. Only) or Glennville, GA
Employment Type: Full-Time
Seniority Level: Mid-Level
Job Overview:
Onshore Outsourcing is seeking an experienced and highly professional ETL Engineer. This role is responsible for designing, developing, and supporting data pipelines and system-to-system integrations, primarily using Informatica PowerCenter and SQL Server within a data warehousing environment. The Engineer will contribute to medium-sized projects with limited supervision, providing mentorship and technical guidance to other team members as needed. Responsibilities include coding, testing, and analyzing solutions, recommending standards, and supporting existing systems. The role requires a solid understanding of IT and business processes, effective communication across varied audiences, and the ability to inspire trust and collaboration. This role requires participation in an after-hours on-call rotation.
Key Responsibilities:
Core Competencies
- Ensures Accountability – Taking ownership of commitments and outcomes, both personally and across teams.
- Action Oriented – Demonstrating initiative and decisiveness in tackling challenges.
- Communicates Effectively – Clear, empathetic, and purposeful communication across all levels.
- Drives Results – Delivering outcomes consistently, even under pressure.
- Instills Trust – Building credibility through integrity, transparency, and reliability.
Develop/Code/Configure
- Design, build, and optimize scalable data pipelines for batch and real-time processing.
- Develop and maintain data integration workflows between internal systems and external platforms.
- Implement data quality checks, validation rules, and exception handling within pipelines.
- Collaborate with business analysts and stakeholders to translate data requirements into technical solutions.
- Create and maintain documentation for data flows, transformations, and system interfaces.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure compliance with data governance, security, and privacy standards.
- Work with large datasets and optimize SQL queries for performance and scalability.
- Support and enhance existing data warehouse solutions, including schema changes and performance tuning.
- Demonstrate understanding of dimensional modeling concepts including dimensions, conformed dimensions, facts, and aggregated facts.
- Apply Kimball methodology and star schema design principles in data warehouse development.
- Knowledge of Medallion architecture is preferred but not required.
Code Review
- Participate in code review with other developers and designers to confirm what is developed is in alignment with the design.
Unit Test
- Design and Write Unit Tests
  - Create test cases for individual components of data pipelines and integration logic.
  - Ensure coverage for both typical and edge-case scenarios.
- Validate Data Transformations
  - Test that data is correctly transformed according to business rules.
  - Verify mappings and calculations within the pipeline.
- Document Test Cases
  - Clearly describe the purpose and expected outcome of each test.
  - Make tests readable and maintainable for other developers.
- Handle Exceptions and Errors
  - Test error handling and logging mechanisms to ensure robustness under failure conditions.
Data Architecture Design – While design is not a core responsibility for this role, prior experience in design is valued and considered beneficial.
- Perform Data Analysis / Data Modeling & Warehousing
- Review requirements and designs, and analyze data sources
- Review and query data sources across various systems (requires familiarity with SQL/T-SQL)
- Create Data Mappings (source, target, and calculations needed) for source to Data Warehouse as well as Source to Source (example: system to system integrations)
- Review and interpret both logical and physical data models (ERDs).
Qualifications:
- 5+ years of direct experience with designing, developing, and supporting data pipelines and system-to-system integrations.
- Proven ability to design, build, and optimize scalable data pipelines for both batch and real-time processing, integrating diverse internal and external systems.
- Strong command of SQL/T-SQL for querying and performance optimization, with experience in dimensional modeling, Kimball methodology, and star schema design.
- Demonstrated experience implementing data validation, exception handling, and ensuring compliance with governance, security, and privacy standards.
- Skilled in performing data analysis, creating data mappings, and interpreting logical and physical models (ERDs) to support data warehouse and integration design.
- Experienced in unit testing, validating data transformations, handling exceptions, and documenting test cases to ensure reliability and maintainability.
- Strong ability to collaborate with analysts, developers, and stakeholders to translate business needs into technical solutions, while communicating effectively across teams.
- Direct experience monitoring, tuning, and troubleshooting data pipeline performance for scalability and reliability.
- Actively participates in peer code reviews to maintain quality, consistency, and adherence to design standards.
- Demonstrates ownership of deliverables and a purposeful mindset, ensuring timely and high-quality outcomes under pressure.
- Builds credibility through transparency, dependability, and adherence to best practices that foster trust across teams and stakeholders.
Compensation:
- Competitive total compensation package (base salary plus performance bonuses).
- Estimated total compensation: $90,000 annually, commensurate with experience and location.
Comprehensive Benefits:
We offer a comprehensive benefits package, including:
- Medical, dental, and vision insurance
- Life and AD&D insurance
- Short-term and long-term disability coverage
- 401(k) retirement plan with employer matching
- Generous paid time off (vacation, sick leave, holidays)
- Professional development and certification allowances
About Onshore Outsourcing:
Founded in 2005, Onshore Outsourcing provides world-class IT services from rural America, dedicated to revitalizing local communities and transforming economic landscapes through meaningful employment. Join our mission-driven, innovative team to deliver impactful solutions, engage with innovative Microsoft technologies, and shape a brighter future for our clients and communities.
How to Apply:
Interested candidates are encouraged to apply online. Please include a resume and a brief cover letter highlighting your relevant experience and passion for joining Onshore Outsourcing.
Apply now