Posted Apr 8, 2026

GCP Cloud Engineer (GCP + Python + Flask + BigQuery)

Job Title: GCP Programmer – Python (Remote)
Location: Remote
Duration: 6 months
Pay Rate: $65/hr on W2

Interview Process:
Visa sponsorship is currently not available. Only candidates authorized to work without restrictions will be considered.

Job Description

The focus will primarily be on:
• Investigating and resolving defects
• Enhancing and extending existing functionality
• Supporting operational stability and readiness
• Assisting with O&M, including follow-the-sun support, since much of our ingestion runs in the early morning and late night, outside normal working hours

We have built meaningful momentum over the past several weeks, and it is important that we maintain that pace. To do so, we need to distribute the workload across additional engineers rather than relying on a single resource.

The environment in which we operate includes strict security controls, as well as a sensitive source system from which we extract data. Because of this, careful and disciplined engineering practices are essential. Now that we have begun exposing downstream data products, the platform must be operated and maintained continuously. This includes monitoring, maintaining, and safely evolving the stack while ensuring that downstream consumers are not disrupted.
Frameworks & Libraries in Our Stack (Python Ingestion System)

Web Framework
• Flask 3.0 – Core web framework providing REST APIs and web dashboard capabilities
• Flask-Login – Session management and authentication decorators
• Authlib – Google OAuth 2.0 integration
• itsdangerous – Secure session token signing

Database
• psycopg2-binary – PostgreSQL adapter used for the job queue, node registry, and user/role storage
• SQLAlchemy – ORM and connection URL builder supporting Oracle and SQL Server
• oracledb – Oracle database driver
• pymssql – SQL Server adapter for source data extraction

Google Cloud
• google-cloud-bigquery – BigQuery table creation, DDL operations, and streaming inserts
• google-cloud-storage – GCS uploads used for CSV chunk backups
• google-cloud-secret-manager – Secure credential retrieval

Data Processing
• pandas – DataFrame manipulation for Oracle export processing
• openpyxl – Excel support for report generation

Scheduling & System
• APScheduler – Background job scheduler for replication and validation jobs
• psutil – Process and system metrics

Configuration
• PyYAML – YAML configuration file parsing

Runtime Infrastructure
• Oracle Instant Client 21.12 – Native libraries required by the oracledb driver
• Python 3.11 – Base runtime
• Gunicorn – Production WSGI server
• Podman / Docker – Containerization
• Kubernetes – Not the GKE flavor; we use Gardener

Top Skills & Years of Experience Required

Engineer Skills

Backend Development
• Python 3.x development, including threading, decorators, generators, and context managers
• Flask framework concepts, including routing, middleware, blueprints, and the request lifecycle
• REST API design and implementation
• SQL proficiency, including transactions, row-level locking, and window functions

Databases
• PostgreSQL (GCP PaaS), including connection pooling, migrations, query optimization, and FOR UPDATE SKIP LOCKED patterns
• Oracle familiarity, including connection configuration and SQL dialect differences
• SQL Server familiarity for source data extraction

Google Cloud Platform
• BigQuery, including table creation, DDL/DML operations, schema management, and dataset usage and configuration
• Google Cloud Storage, including object uploads and bucket management
• Secret Manager usage (consumption), including use in Kubernetes
• Service account usage and Application Default Credentials (WIF/WIP)
• General Kubernetes familiarity, as the stack runs on Kubernetes

DevOps & Infrastructure
• Docker / Podman container builds and runtime management
• CI/CD pipelines such as Azure DevOps or similar platforms
• Environment variable and secrets management
• Distributed system concepts, including heartbeats, distributed locking, and crash recovery (Kubernetes runtime)

Data Engineering
• ETL/ELT pipeline design (our framework)
• Chunked data processing for large dataset exports, particularly partitioning workloads across many scaled worker nodes
• Note: This is very different from the existing wellstack implementation; it does not run on VMs but on Kubernetes, so some understanding of distributed workloads is necessary
• Schema mapping between heterogeneous systems (Oracle or SQL Server to BigQuery)
• Row-count validation and data quality checks

Authentication & Security
• OAuth 2.0 and OpenID Connect authentication flows
• Role-based access control implementation
• Secure credential handling and Secret Manager integration

General Software Engineering
• Multithreading concepts, including locking and thread-safe shared state
• Background job scheduling patterns
• Structured logging design and per-job log isolation
• Robust error handling and graceful degradation in distributed systems
• Database migration authoring and maintenance

#GSKIT

Pay: $60.00 - $65.00 per hour

Work Location: Remote
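For candidates unfamiliar with the FOR UPDATE SKIP LOCKED pattern named under Databases, here is a minimal sketch of how a PostgreSQL-backed job queue typically claims work with it. The table name (`jobs`), its columns, and the function name are hypothetical illustrations, not this platform's actual schema:

```python
# Sketch: claiming one job from a PostgreSQL job queue with
# FOR UPDATE SKIP LOCKED. Table/column names are hypothetical.
CLAIM_JOB_SQL = """
UPDATE jobs
   SET status = 'running', claimed_by = %(worker)s, claimed_at = now()
 WHERE id = (
       SELECT id
         FROM jobs
        WHERE status = 'queued'
        ORDER BY priority DESC, created_at
        FOR UPDATE SKIP LOCKED   -- skip rows locked by other workers
        LIMIT 1
 )
RETURNING id, payload;
"""

def claim_next_job(conn, worker_id: str):
    """Atomically claim one queued job; returns (id, payload) or None.

    `conn` is any DB-API connection to PostgreSQL (e.g. psycopg2).
    Because locked rows are skipped rather than waited on, concurrent
    workers never block each other and never claim the same job twice.
    """
    with conn.cursor() as cur:
        cur.execute(CLAIM_JOB_SQL, {"worker": worker_id})
        row = cur.fetchone()
    conn.commit()
    return row
```

The point of the pattern is that the SELECT, the lock, and the status update happen in one statement, so a crashed worker's uncommitted claim simply rolls back and the job becomes claimable again.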
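The "chunked data processing" item under Data Engineering refers to splitting a large source extract into bounded ranges that can be fanned out across worker nodes. As a rough illustration of the idea (the function and its signature are invented for this sketch, not part of the stack):

```python
def plan_chunks(min_id: int, max_id: int, chunk_size: int):
    """Split an inclusive id range into (lo, hi) chunk boundaries.

    Each chunk covers at most `chunk_size` ids, so a large export can
    be distributed across worker nodes, with each worker extracting
    one bounded range (e.g. WHERE id BETWEEN lo AND hi).
    """
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    chunks = []
    lo = min_id
    while lo <= max_id:
        hi = min(lo + chunk_size - 1, max_id)  # clamp final chunk
        chunks.append((lo, hi))
        lo = hi + 1
    return chunks

# Example: 10 rows in chunks of 4 → [(1, 4), (5, 8), (9, 10)]
```

Per-chunk boundaries also give natural units for the row-count validation mentioned above: each worker's extracted count can be compared against a source-side count for the same range.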