
Remote GCP Data Engineer at Prodapt

Prodapt · Richardson, United States of America · Remote

Overview:

Prodapt is the largest and fastest-growing specialized player in the Connectedness industry, recognized by Gartner as a Large, Telecom-Native, Regional IT Service Provider across North America, Europe, and Latin America. With its singular focus on the domain, Prodapt has built deep expertise in the most transformative technologies that connect our world. Prodapt is a trusted partner for enterprises across all layers of the Connectedness vertical: it designs, configures, and operates solutions across their digital landscapes, network infrastructure, and business operations, and crafts experiences that delight their customers.

Today, Prodapt's clients connect 1.1 billion people and 5.4 billion devices and are among the largest telecom, media, and internet firms in the world. Prodapt works with Google, Amazon, Verizon, Vodafone, Liberty Global, Liberty Latin America, Claro, Lumen, Windstream, Rogers, Telus, KPN, Virgin Media, British Telecom, Deutsche Telekom, Adtran, Samsung, and many more. A "Great Place To Work® Certified™" company, Prodapt employs over 6,000 technology and domain experts in 30+ countries across North America, Latin America, Europe, Africa, and Asia. Prodapt is part of The Jhaver Group, a 130-year-old business conglomerate that employs over 30,000 people across 80+ locations globally.

We are looking for a GCP Data Engineer with 6-10 years of experience for one of our clients in Irving, Texas. The required skills for this role are listed below.

  • Cloud Technologies: Proficiency in Google Cloud Platform services, especially BigQuery, Dataflow, Cloud Storage, and Pub/Sub (see the query sketch after this list).
  • Programming Languages: Strong programming skills in languages such as Python, Java, or SQL.
  • Data Warehousing: Knowledge of data warehousing concepts and experience with data modeling techniques.
  • ETL Processes: Experience in designing and implementing ETL (Extract, Transform, Load) processes.
  • Big Data Technologies: Familiarity with big data tools and frameworks like Apache Spark, Hadoop, or similar technologies.
  • CI/CD: Understanding of Continuous Integration and Continuous Deployment practices for data solutions.
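
To make the BigQuery and Python skills above concrete, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client library. This is an illustration, not part of the role's actual codebase: the project, dataset, table, and column names are hypothetical placeholders, and authentication is assumed to come from Application Default Credentials.

    from google.cloud import bigquery

    # Client authenticates via Application Default Credentials.
    client = bigquery.Client()

    # Aggregate the last day of events from a hypothetical events table.
    query = """
        SELECT event_type, COUNT(*) AS event_count
        FROM `my-project.analytics.events`
        WHERE ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
        GROUP BY event_type
        ORDER BY event_count DESC
    """

    for row in client.query(query).result():
        print(f"{row.event_type}: {row.event_count}")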
Responsibilities:
  1. Data Pipeline Development: Design, build, and maintain scalable data pipelines to process large datasets using GCP services like Dataflow, BigQuery, and Pub/Sub (see the pipeline sketch after this list).
  2. Data Integration: Collaborate with data scientists and analysts to understand their data requirements and integrate various data sources.
  3. Data Modeling: Develop and maintain data models and architecture that support business intelligence initiatives and analytics.
  4. Monitoring and Optimization: Implement monitoring and logging of data processes to ensure data quality and optimize performance.
  5. Collaboration: Work closely with software engineers, data analysts, and other stakeholders to ensure seamless data flow and accessibility.
  6. Documentation: Prepare technical documentation for data systems and processes to facilitate knowledge sharing and maintainability.
  7. Security and Compliance: Ensure data security and compliance with relevant regulations and best practices in data governance.
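
As a concrete illustration of responsibility 1, below is a minimal sketch of a streaming pipeline using the Apache Beam Python SDK, which Dataflow executes: it reads messages from Pub/Sub, parses them as JSON, and appends them to a BigQuery table. The project, topic, table, and schema names are hypothetical placeholders, and a real deployment would pass the Dataflow runner and project via pipeline flags.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # streaming=True marks this as an unbounded (Pub/Sub) pipeline;
        # the runner, project, and region would normally come from flags.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                # Hypothetical topic name.
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")
                | "ParseJson" >> beam.Map(json.loads)
                # Hypothetical table and schema.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()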
Requirements:
  • Education: Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.
  • Experience: 6-10 years of experience in data engineering or related roles, with a focus on cloud environments.
  • Certifications: GCP certifications (like Google Cloud Professional Data Engineer) preferred.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.
  • Adaptability and eagerness to learn new technologies and methodologies.