
Remote Google-cloud-platform jobs in Tampa ∙ Page 1
4 remote and work-from-home jobs online



Hybrid DevSecOps and Collaboration Tools Engineer
Dark Wolf Solutions · Tampa, United States · Hybrid

Remote Cloud Alliance Director | Remote, USA
Optiv · Leawood, Kansas, US · Atlanta, Georgia, US · Tampa, Florida, US · Minneapolis, Minnesota, United States · Remote
Hybrid GCP Technical Lead
Qode · Tampa, United States · Hybrid
- Senior
- Office in Tampa
- Design, develop, and optimize BigQuery data warehouses, ensuring high performance and cost efficiency.
- Build and maintain scalable ETL pipelines using GCP services such as Cloud Dataflow, Cloud Dataproc, and Cloud Composer (Airflow).
- Implement data ingestion, transformation, and processing workflows using Python, SQL, and Spark.
- Work with Cloud Storage, Pub/Sub, and Cloud Functions to facilitate real-time and batch data processing.
- Ensure data security, governance, and compliance using IAM, encryption, and audit logging.
- Collaborate with Data Scientists and Analysts to optimize query performance and enable advanced analytics.
- Troubleshoot performance issues, optimize queries, and manage BigQuery cost and resource utilization.
- Implement CI/CD pipelines for data workflows using Terraform and Cloud Build.
- Monitor and maintain GCP infrastructure, ensuring reliability and scalability.
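The ETL responsibilities above (ingest, filter, transform, aggregate) can be sketched as a minimal batch transform in plain Python. This is a stand-in for the kind of step one would write as a Dataflow (Apache Beam) transform on GCP; the record schema and function names are hypothetical, chosen only for illustration.

```python
from collections import defaultdict

def parse_record(line):
    """Parse a raw 'user_id,amount' line into a dict, or None if malformed.
    (Hypothetical schema; a Beam DoFn in Cloud Dataflow would play this role.)"""
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    user_id, amount = parts
    try:
        return {"user_id": user_id, "amount": float(amount)}
    except ValueError:
        return None

def total_per_user(lines):
    """Batch transform: drop malformed rows, then sum amounts per user --
    the shape of a GroupByKey + CombinePerKey step in a pipeline."""
    totals = defaultdict(float)
    for rec in filter(None, map(parse_record, lines)):
        totals[rec["user_id"]] += rec["amount"]
    return dict(totals)

print(total_per_user(["a,1.5", "a,2.5", "b,3", "garbage"]))
# {'a': 4.0, 'b': 3.0}
```

In a real Dataflow job the same parse/filter/aggregate stages would run distributed over Cloud Storage or Pub/Sub input rather than an in-memory list.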
- 10+ years of experience working as a GCP Data Engineer or similar role.
- Strong expertise in Google BigQuery – performance tuning, partitioning, clustering, and cost optimization.
- Hands-on experience with ETL tools and frameworks, including Cloud Dataflow (Apache Beam), Cloud Dataproc (Spark), and Cloud Composer (Airflow).
- Proficiency in SQL, Python, and Shell scripting for data transformation and automation.
- Experience with Google Cloud Storage, Pub/Sub, and Cloud Functions.
- Strong understanding of GCP IAM, security best practices, and data governance.
- Familiarity with Terraform, Cloud Build, and CI/CD for infrastructure automation.
- Ability to work with large-scale datasets and real-time streaming data processing.
- Experience in data modeling, schema design, and optimization techniques.
- Strong analytical and problem-solving skills.
- GCP Professional Data Engineer Certification is a plus.
- Experience with Machine Learning workflows on GCP (Vertex AI, AutoML, TensorFlow on AI Platform) is a plus.
- Experience with Kubernetes (GKE) and containerized deployments is a plus.
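The BigQuery partitioning, clustering, and cost-optimization skills listed above come down to DDL like the following. This helper only assembles the statement as a string; the project, table, and column names are made up for illustration.

```python
def bq_create_table_ddl(table, date_col, cluster_cols):
    """Build a BigQuery DDL string with daily partitioning and clustering.
    Partition pruning on `date_col` is the main lever for controlling
    BigQuery scan costs; clustering further narrows bytes read per query."""
    return (
        f"CREATE TABLE `{table}` (\n"
        f"  {date_col} DATE,\n"
        "  user_id STRING,\n"
        "  amount NUMERIC\n"
        ")\n"
        f"PARTITION BY {date_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

print(bq_create_table_ddl("proj.sales.orders", "order_date", ["user_id"]))
```

Queries against such a table should always filter on the partitioning column (e.g. `WHERE order_date >= '2024-01-01'`) so BigQuery scans only the relevant partitions.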