Google Cloud Platform Data Engineer / Architect

Remote Full-time
Google Cloud Platform Data Engineer (Remote, Contract Only, W2/1099)

Detailed JD: Designs, builds, and maintains scalable data pipelines and architectures on Google Cloud, transforming raw data into accessible insights using tools such as BigQuery, Dataflow, and Cloud Storage, while ensuring quality, security, and performance and collaborating with analysts and data scientists to meet business needs. Key responsibilities include ETL/ELT, data warehousing, monitoring, and optimization of cloud data solutions, and call for skills in SQL, Python, and Google Cloud Platform services, along with strong analytical and communication abilities.

Responsibilities
Pipeline Development: Design, build, and maintain robust, scalable data pipelines (batch/streaming) using Google Cloud Platform services.
Architecture: Develop and manage data warehousing and data lake solutions (e.g., BigQuery, Cloud Storage).
Data Quality & Governance: Ensure data integrity, consistency, security, and compliance.
Collaboration: Work with data scientists, analysts, and business stakeholders to define requirements and deliver solutions.
Optimization: Monitor and optimize data infrastructure for performance, cost, and reliability.
Automation: Implement CI/CD, Infrastructure as Code (IaC), and automation for data workflows.

Core Skills & Qualifications
At least 3 years of experience with Google Cloud Platform data services (BigQuery, Dataflow, Dataproc, Cloud Storage, etc.).
Must have delivered at least 2 to 3 end-to-end projects as a data engineer using Google Cloud Platform services.
Strong understanding of database design and data modeling (relational, dimensional, NoSQL).
Expertise in data integration, ETL/ELT, and data pipeline development.
Knowledge of cloud security best practices, identity management, and networking.
Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).
Excellent communication, problem-solving, and leadership skills.

Google Cloud Platform Data Architect (Remote, Contract Only, W2/1099)

Detailed JD: Designs, builds, and manages scalable, secure data solutions on Google Cloud, focusing on big data, warehousing (e.g., BigQuery), pipelines (ETL/ELT), and governance. Collaborates with stakeholders to translate business needs into robust data strategies, ensuring performance, cost-efficiency, and compliance using Google Cloud Platform tools and cloud-native technologies. Key duties involve architecture design, security implementation, cost optimization, and mentoring teams.

Core Skills & Qualifications
At least 8 years of experience with Google Cloud Platform data services (BigQuery, Dataflow, Dataproc, Cloud Storage, etc.).
Must have delivered 4 to 5 end-to-end projects as a data architect using Google Cloud Platform services.
Strong understanding of database design and data modeling (relational, dimensional, NoSQL).
Expertise in data integration, ETL/ELT, and data pipeline development.
Knowledge of cloud security best practices, identity management, and networking.
Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).
Excellent communication, problem-solving, and leadership skills.

Key Responsibilities
Architecture & Design: Architect end-to-end data solutions (data lakes, warehouses, pipelines) using Google Cloud Platform services (BigQuery, Dataflow, Pub/Sub, etc.).
Data Modeling & Integration: Design data models, schemas, and ETL/ELT processes for data ingestion, transformation, and delivery.
Security & Governance: Implement security frameworks, access controls, and data governance policies (GDPR, HIPAA, CCPA).
Performance & Cost: Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
Collaboration & Leadership: Work with data engineers, analysts, developers, and business stakeholders to define requirements and provide technical guidance.
Documentation: Create detailed architecture diagrams, design documents, and technical specifications.
Emerging Tech: Evaluate and adopt new Google Cloud Platform features and cloud technologies (AI/ML, IoT, serverless).
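For a rough sense of the pipeline development and BigQuery loading work described above, the following is a minimal sketch of a batch pipeline using the Apache Beam Python SDK (the SDK that Dataflow runs); the project, bucket, table, and field names are hypothetical and are not part of this posting.

```python
# Minimal sketch only: assumes apache-beam[gcp] is installed and that the
# GCS paths, BigQuery table, and record fields below (all hypothetical) exist.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_record(line):
    """Parse one JSON line from Cloud Storage into a BigQuery-ready dict."""
    record = json.loads(line)
    # Hypothetical schema: keep only the fields the target table expects.
    return {"user_id": record["user_id"], "event": record["event"], "ts": record["ts"]}


def run():
    # Hypothetical options; on Dataflow you would also set runner, project,
    # and region instead of running locally with the default DirectRunner.
    options = PipelineOptions(temp_location="gs://example-bucket/tmp")

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRawEvents" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "ParseJson" >> beam.Map(parse_record)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```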

Similar Jobs

GCP / Python Engineer

Remote Full-time

Senior Software Engineer, Infrastructure and Developer Productivity, GCP

Remote Full-time

Sr Gen AI Engineer - TX, USA (Remote)

Remote Full-time

Cloud Engineer - Azure, AWS, GCP Certified

Remote Full-time

Senior/Staff Software Engineer (Generative AI-Big Data /10030, 10032)

Remote Full-time

Assistant General Counsel Governance and Securities

Remote Full-time

Generative AI Solution Engineer- Remote (Anywhere in the U.S.)

Remote Full-time

Artificial Intelligence (AI) Engineers

Remote Full-time

Director & Assistant General Counsel, Employment Law | Procore Technologies | Remote (United States)

Remote Full-time

Senior Staff GenAI/ ML Ops Engineer

Remote Full-time

Immediately Need PATIENT CARE TECHNICIAN - CARDIAC TELEMETRY, ONCOLOGY & HOSPICE (2 WEST) in Marietta, OH

Remote Full-time

Experienced Remote Data Entry Clerk and Typist – Flexible Work from Home Opportunity with Comprehensive Training and Competitive Compensation

Remote Full-time

[Remote] Help Desk Specialist

Remote Full-time

Compliance Analyst, Advertising Review

Remote Full-time

Experienced Live Chat Support Specialist - Delivering Exceptional Customer Experiences at blithequark

Remote Full-time

Oracle Finance Functional Solution Architect - Pittsburgh, PA (Remote)

Remote Full-time

Experienced Data Entry Clerk – Entry-Level Opportunity for Remote Work at blithequark in Los Angeles

Remote Full-time

Experienced Customer Success Specialist – Inventory and Order Management Software

Remote Full-time

Senior Consultant, OpenShift - FSI

Remote Full-time

Experienced Customer Success Representative – Remote Work Opportunity with arenaflex, Delivering Exceptional Travel Experiences to Valued Customers

Remote Full-time