
Sr Data Engineer (93326) at Freeman

Freeman · Atlanta, United States of America · Hybrid


Requisition ID 93326

 

About Us

Freeman is a global leader in events, on a mission to redefine live for a new era. With a data-driven approach and the industry’s largest network of experts, Freeman’s insights shape exhibitions, exhibits, and events that drive audiences to action. Its integrated full-service solutions leverage a 97-year legacy in event management, as well as new technologies, to deliver moments that matter.

Summary

We’re looking for an experienced, mission-driven Senior Data Engineer. The role reports directly to the Manager of Data Engineering and joins the company’s transformative IT team. You will develop and work with a highly motivated group of data engineers, data architects, and data contractors covering multiple disciplines, refining Freeman’s transformative systems to meet its customers’ needs and contributing to the strategic direction of the business.

 

This position will support our IT team. It is eligible for a hybrid schedule, generally requiring in-office and/or show-site work 2-3 days per week. The ideal candidate will be based in one of the following locations:

  • Atlanta, GA
  • Dallas, TX

Essential Duties & Responsibilities

This position focuses on the development of data platform solutions, sometimes building upon existing solutions for the enterprise and other times creating new, complex data solutions.

  • Development, testing, change & configuration management, and automated deployment of data management solutions – data ingestion, data transformation, data virtualization, data models, interfaces, databases, etc.
  • System integration across foundational data sources, the on-premises and cloud platforms they reside within, and other third-party data solutions.
  • Designing, building, and maintaining ETL/ELT data pipelines.
  • Working with large datasets, data lakes (lakehouse), and big data technologies, preferably cloud-based, such as Snowflake, Synapse, Databricks, or similar.
  • Use Git and Azure DevOps Pipelines for deployment automation of data solutions.
  • Design and develop data warehouses and data pipelines within Azure and/or AWS.
  • Act as a key contributor to the design and development lifecycle of analytic applications utilizing Snowflake, AWS, Microsoft Azure and BI technology platforms.
  • Participate in Agile ceremonies including daily stand-ups, sprint planning, retrospectives, and product demonstrations.
  • Produce efficient and elegant code that meets business requirements.
  • Author unit tests that adhere to code coverage guidelines.
  • Proactively communicate progress, issues, and risks to stakeholders.
  • Create and maintain technical documentation.
  • Perform other related duties as directed.

Education & Experience

  • Experience designing Data Lakes and Data Warehouses.
  • Experience with Snowflake, Azure Synapse, and Azure Databricks preferred.
  • Experience driving best practices around data engineering and the software development process.
  • Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques.
  • Experience with a range of AWS and/or Azure services, including infrastructure- and security-related services such as Azure AD, IaaS, containers, storage, networking, and Azure security, as well as Airflow, SNS, SQS, and S3.
  • Familiarity with designing and building cloud data platforms supporting both batch and real-time (event-based) architectures is a strong plus.
  • Familiarity with big data platforms and tools (e.g., Kafka), predictive modeling, and machine learning is a strong plus.
  • 2-5 years of hands-on experience designing and implementing large-scale distributed data architecture for BI and OLTP systems.
  • 2-5 years of hands-on experience designing and implementing large-scale data pipelines. 
  • 2-5 years of hands-on experience in Snowflake, AWS and Azure data services. 
  • 2-5 years of hands-on experience with data integration using ETL / ELT tools. 
  • Experience with cloud-based technologies, preferably AWS and Azure: Airflow, SNS, SQS, S3, ADLS Gen 2, Data Factory, Snowflake, Databricks, and Synapse Analytics.
  • Experience with one or more Python parallel processing libraries preferred.
  • Experience with one or more Python data analysis libraries preferred.
  • Experience with data integration through APIs and web services (SOAP and/or REST).
  • Experience using Azure DevOps and CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence.
  • Knowledge of SOA and microservices application architecture.
  • Ability to work in a fast-paced, collaborative team environment.
  • Excellent written and verbal communication skills and the ability to express ideas clearly and concisely.

What We Offer

Freeman provides benefits that aim to empower our people and their families to thrive mentally, physically, and financially. These are a handful of the programs and benefits our full-time people may be eligible for; specific benefits may vary across regions.

  • Medical, Dental, Vision Insurance
  • Tuition Reimbursement
  • Paid Parental Leave
  • Life, Accident and Disability
  • Retirement with Company Match
  • Paid Time Off

Diversity Commitment

At Freeman, our commitment to diversity and inclusion is helping us to create not only a great place to work, but also an environment where our employees, our customers and our communities around the world can reach their goals and connect with each other. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, veteran status and other characteristics protected by federal, state or local laws.

 

#LI-Hybrid

 
