Google Cloud Platform Setup, French Customer

  • Project management with relevant stakeholders
  • Cloud Landing Zone for Google Cloud Platform across 4 regions
  • The setup covered authentication & authorization, networking, security, and monitoring/logging, along with full documentation.


Google Cloud Dataflow Pipeline, South American Customer

  • Reading hundreds of millions of plain-text files from a GCS bucket into Cloud Pub/Sub using Cloud Dataflow.
  • Each Pub/Sub message also carries the complete path of the GCS object and its creation time.
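The production job ran on Dataflow at this scale; the per-object message logic can be sketched in plain Python with the GCS and Pub/Sub clients (bucket and topic names here are placeholders, not the customer's actual values):

```python
import json


def build_message(bucket: str, name: str, created_iso: str) -> bytes:
    """Build the Pub/Sub payload carrying the object's full path and created time."""
    return json.dumps({
        "gcs_path": f"gs://{bucket}/{name}",
        "created": created_iso,
    }).encode("utf-8")


def publish_bucket_listing(bucket_name: str, topic_path: str) -> None:
    """List every object in the bucket and publish one message per object.
    Requires google-cloud-storage / google-cloud-pubsub and GCP credentials,
    so the clients are imported lazily here."""
    from google.cloud import pubsub_v1, storage

    publisher = pubsub_v1.PublisherClient()
    for blob in storage.Client().list_blobs(bucket_name):
        data = build_message(bucket_name, blob.name, blob.time_created.isoformat())
        publisher.publish(topic_path, data)
```

In the real pipeline this listing and publishing is parallelized across Dataflow workers; the sketch only shows how each message is assembled.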

GCP setup and Docker image deployment on Google Compute Engine, Swedish Customer

  • Setup of GCP environment
  • Running the application on Google Compute Engine using a Container-Optimized OS image, with container images stored in Google Container Registry.


GCP CI/CD setup with application deployment on GKE, US Customer

  • Provisioning the infrastructure via Terraform.
  • Architecting, scripting & provisioning a Kubernetes cluster to run a SaaS application:
    • 2 publicly exposed services (WebApp & RestAPI),
    • 1 private service (RestAPI) consumed by the two public-facing services,
    • Persistent DB (PostgreSQL)
    • Shared Caching Layer (Redis)
    • Cloud Files
    • Cloud Functions that fire on changes to files on Cloud Files
    • Separate dev & a prod environment
  • Deploying the application using Cloud Build
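The Cloud Build deploy step can be sketched as a minimal cloudbuild.yaml; the image name, cluster name, region, and manifest path are illustrative placeholders, not the customer's actual values:

```yaml
steps:
  # Build and push the application image
  - name: gcr.io/cloud-builders/docker
    args: ["build", "-t", "gcr.io/$PROJECT_ID/webapp:$SHORT_SHA", "."]
  - name: gcr.io/cloud-builders/docker
    args: ["push", "gcr.io/$PROJECT_ID/webapp:$SHORT_SHA"]
  # Apply the Kubernetes manifests to the GKE cluster
  - name: gcr.io/cloud-builders/gke-deploy
    args: ["run", "--filename=k8s/", "--cluster=dev-cluster", "--location=us-central1"]
```

In practice one such configuration per environment (dev and prod) keeps the two deployment targets separate.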

Google Cloud function to transform video, Latvian Customer

  • Function gets automatically triggered by placing a video file in a Google Cloud Storage bucket.
  • Extracts format information from file metadata.
  • Calls Coconut API to produce videos and images based on the input file and places them into configurable locations.
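A minimal sketch of such a function, assuming a first-generation background Cloud Function triggered by `google.storage.object.finalize`; the Coconut endpoint, the output locations, and the payload shape are placeholders, not the actual integration:

```python
import json
import urllib.request


def extract_format(event: dict) -> dict:
    """Pull the fields the transcoding job needs from the GCS event payload."""
    return {
        "source": f"gs://{event['bucket']}/{event['name']}",
        "content_type": event.get("contentType", ""),
        # custom metadata set on the object, if any
        "metadata": event.get("metadata", {}),
    }


def transform_video(event, context):
    """Background Cloud Function entry point. The endpoint URL and output
    buckets below are assumptions for illustration only."""
    job = extract_format(event)
    body = json.dumps({
        "input": {"url": job["source"]},
        # output locations would come from configuration in practice
        "outputs": {"mp4": "gs://output-bucket/videos/",
                    "jpg": "gs://output-bucket/thumbs/"},
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://api.coconut.co/v2/jobs",  # assumed endpoint
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # fire the transcoding job
```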

Migration of app from Heroku to GCP, US Customer

  • The app, hosted on Heroku, had repeated performance issues and kept crashing; it was migrated to GCP App Engine.
  • Deployment of the application using GitHub and Cloud Build
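App Engine standard's Python runtime serves any WSGI callable named `app` exported from main.py, which is what makes a Heroku web process straightforward to port. A stdlib-only sketch of such an entry point (the real application logic is of course the customer's):

```python
def app(environ, start_response):
    """Minimal WSGI callable; App Engine's Python runtime serves an `app`
    object from main.py (via gunicorn by default)."""
    body = b"OK"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Alongside this, an app.yaml declaring the runtime replaces the Heroku Procfile, and Cloud Build triggers on GitHub pushes handle the deploy.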


Audit and enhancement of Apache Beam pipelines, US Customer

  • First streaming pipeline reading data from Kafka, processing it, and writing to BigQuery, GCS, and Pub/Sub.
  • Second streaming pipeline reading data from Pub/Sub and writing to Elasticsearch.
  • Enhancements included new transforms, JSON data processing, proper error handling, and handling schema changes.
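The error-handling enhancement follows Beam's common dead-letter pattern: elements that fail JSON parsing are diverted to a side output instead of failing the bundle. A pure-Python sketch of the routing logic (in the actual pipeline this sits inside a DoFn emitting tagged outputs):

```python
import json


def route_records(raw_records):
    """Split incoming records into parsed JSON and a dead-letter list,
    mirroring Beam's tagged-output / dead-letter error-handling pattern."""
    parsed, dead_letter = [], []
    for raw in raw_records:
        try:
            parsed.append(json.loads(raw))
        except json.JSONDecodeError as exc:
            # keep the raw payload and the reason, so failures can be
            # replayed or inspected later instead of being dropped
            dead_letter.append({"raw": raw, "error": str(exc)})
    return parsed, dead_letter
```

In the streaming pipelines, the dead-letter output would typically be written to a separate sink (e.g. a GCS path or Pub/Sub topic) for later inspection.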