Posts

Showing posts from May, 2021

Set Up and Configure a Cloud Environment in Google Cloud: Challenge Lab

Task - 1 : Create development VPC manually :-

gcloud compute networks create griffin-dev-vpc --subnet-mode custom
gcloud compute networks subnets create griffin-dev-wp --network=griffin-dev-vpc --region us-east1 --range=192.168.16.0/20
gcloud compute networks subnets create griffin-dev-mgmt --network=griffin-dev-vpc --region us-east1 --range=192.168.32.0/20

----------------------------------------------------------------------

Task - 2 : Create production VPC using Deployment Manager :-

gsutil cp -r gs://cloud-training/gsp321/dm .
cd dm
sed -i s/SET_REGION/us-east1/g prod-network.yaml
gcloud deployment-manager deployments create prod-network \
    --config=prod-network.yaml
cd ..

----------------------------------------------------------------------

Task - 3 : Create bastion host :-

gcloud compute instances create bastion --network-interface=network=griffin-…
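Before moving on, it helps to confirm both networks actually came up as expected. A minimal verification sketch, assuming the names and region used above:

# List the dev subnets and confirm their CIDR ranges
gcloud compute networks subnets list --network=griffin-dev-vpc --regions=us-east1

# Check that the Deployment Manager deployment completed
gcloud deployment-manager deployments describe prod-network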

Deploy to Kubernetes in Google Cloud: Challenge Lab

[GSP318] : Deploy to Kubernetes in Google Cloud: Challenge Lab :-

----------------------------------------------------------------------

Task - 1 : Create a Docker image and store the Dockerfile :-

gcloud auth list
gsutil cat gs://cloud-training/gsp318/marking/setup_marking.sh | bash
gcloud source repos clone valkyrie-app
cd valkyrie-app
cat > Dockerfile <<EOF
FROM golang:1.10
WORKDIR /go/src/app
COPY source .
RUN go install -v
ENTRYPOINT ["app","-single=true","-port=8080"]
EOF
docker build -t valkyrie-app:v0.0.1 .
cd ..
cd marking
./step1.sh

----------------------------------------------------------------------

Task - 2 : Test the created Docker image :-

cd ..
cd valkyrie-app
docker run -p 8080:8080 valkyrie-app:v0.0.1 &
cd ..
cd marking
./step2.sh

----------------------------------------------------------------------
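Since step2.sh grades against the running container, a quick manual check before running it can save a failed attempt. A sketch, assuming the container started above is still running on port 8080:

# Confirm the container is up
docker ps --filter "ancestor=valkyrie-app:v0.0.1"

# Any HTML response on the published port means the app is serving
curl -s http://localhost:8080 | head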

Build a Website on Google Cloud: Challenge Lab

GSP319 : Build a Website on Google Cloud: Challenge Lab :-

----------------------------------------------------------------------

Task 1: Download the monolith code and build your container :-

git clone https://github.com/googlecodelabs/monolith-to-microservices.git
cd ~/monolith-to-microservices
./setup.sh
cd ~/monolith-to-microservices/monolith
npm start
gcloud services enable cloudbuild.googleapis.com
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/fancytest:1.0.0 .

----------------------------------------------------------------------

Task 2: Create a Kubernetes cluster and deploy the application :-

gcloud config set compute/zone us-central1-a
gcloud services enable container.googleapis.com
gcloud container clusters create fancy-cluster --num-nodes 3
kubectl create deployment fancytest --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/fancytest:1.0.0
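The excerpt cuts off before the deployment is exposed. One way to put a LoadBalancer in front of it, assuming the monolith listens on port 8080 as in the npm start test above:

# Expose the deployment: port 80 externally, 8080 inside the pod
kubectl expose deployment fancytest --type=LoadBalancer --port 80 --target-port 8080

# Wait for an external IP to be assigned, then browse to it
kubectl get service fancytest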

Perform Foundational Data, ML, and AI Tasks in Google Cloud

Task 1: Run a simple Dataflow job

In this task, you have to transfer the data in a CSV file to BigQuery using Dataflow via Pub/Sub. First of all, you need to create a BigQuery dataset called lab and a Cloud Storage bucket named with your project ID.

1.1 Create a BigQuery dataset called lab

1. In the Cloud Console, click on Navigation Menu > BigQuery.
2. Select your project in the left pane.
3. Click CREATE DATASET.
4. Enter lab in the Dataset ID, then click Create dataset.
5. Run gsutil cp gs://cloud-training/gsp323/lab.schema . in the Cloud Shell to download the schema file.
6. View the schema by running cat lab.schema.
7. Go back to the Cloud Console, select the new dataset lab and click Create Table.
8. In the Create table dialog, select Google Cloud Storage from the dropdown in the Source section.
9. Copy gs://cloud-training/gsp323/lab.csv to Select file from GCS bucket.
10. Enter customers in “Table name” in the Destination section.
11. Enable Edit as text and copy the JSON dat…
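If you would rather script these console steps, the same dataset, bucket, and table can be created from Cloud Shell. A rough CLI equivalent of the steps above, assuming the bucket is named after the project ID:

# Create the dataset and the project-named bucket
bq mk lab
gsutil mb gs://$DEVSHELL_PROJECT_ID

# Download the schema, then load the CSV into lab.customers with it
gsutil cp gs://cloud-training/gsp323/lab.schema .
bq load --source_format=CSV lab.customers gs://cloud-training/gsp323/lab.csv lab.schema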

Create ML Models with BigQuery ML: Challenge Lab

GSP341 : Create ML Models with BigQuery ML: Challenge Lab :-

----------------------------------------------------------------------

Task 1: Create a dataset to store your machine learning models :-

// In Cloud Shell :-

bq mk austin
// Navigation Menu -> BigQuery.

----------------------------------------------------------------------

Task 2: Create a forecasting BigQuery machine learning model :-

// In BigQuery Console Query Editor :-

CREATE OR REPLACE MODEL austin.location_model
OPTIONS
  (model_type='linear_reg', labels=['duration_minutes']) AS
SELECT
    start_station_name,
    EXTRACT(HOUR FROM start_time) AS start_hour,
    EXTRACT(DAYOFWEEK FROM start_time) AS day_of_week,
    duration_minutes
FROM
    `bigquery-public-data.austin_bikeshare.bikeshare_trips` AS trips
JOIN
    `bigquery-public-data.austin_bikeshare.bikesha…
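Once training completes, you can check the model's quality before submitting. A sketch using ML.EVALUATE from Cloud Shell, with the model name created above:

# Report the model's evaluation metrics (mean absolute error, etc.)
bq query --use_legacy_sql=false 'SELECT * FROM ML.EVALUATE(MODEL `austin.location_model`)'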

Engineer Data in Google Cloud: Challenge Lab

GSP327 : Engineer Data in Google Cloud: Challenge Lab :-

----------------------------------------------------------------------

Task - 1 : Clean your training data :-

CREATE OR REPLACE TABLE
  taxirides.taxi_training_data AS
SELECT
  (tolls_amount + fare_amount) AS fare_amount,
  pickup_datetime,
  pickup_longitude AS pickuplon,
  pickup_latitude AS pickuplat,
  dropoff_longitude AS dropofflon,
  dropoff_latitude AS dropofflat,
  passenger_count AS passengers
FROM
  taxirides.historical_taxi_rides_raw
WHERE
  RAND() < 0.001
  AND trip_distance > 0
  AND fare_amount >= 2.5
  AND pickup_longitude > -78
  AND pickup_longitude < -70
  AND dropoff_longitude > -78
  AND dropoff_longitude < -70
  AND pickup_latitude > 37
  AND pickup_latitude < 45
  AND dropoff_latitude > 37
  AND dropoff_latitude < 45
  AND passenger_count > 0

----------------------------------------------------------------------
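RAND() < 0.001 keeps roughly 0.1% of the raw rides, so it is worth confirming the cleaned table is populated before training on it. A quick check from Cloud Shell:

# Verify the cleaned training table has rows
bq query --use_legacy_sql=false 'SELECT COUNT(*) AS row_count FROM taxirides.taxi_training_data'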

Integrate with Machine Learning APIs: Challenge Lab

GSP329 : Integrate with Machine Learning APIs: Challenge Lab :-

----------------------------------------------------------------------

// Run in Cloud Shell :-

export SANAME=challenge
gcloud iam service-accounts create $SANAME
gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=serviceAccount:$SANAME@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role=roles/bigquery.admin
gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=serviceAccount:$SANAME@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role=roles/storage.admin
gcloud iam service-accounts keys create sa-key.json --iam-account $SANAME@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com
export GOOGLE_APPLICATION_CREDENTIALS=${PWD}/sa-key.json
gsutil cp gs://$DEVSHELL_PROJECT_ID/analyze-images.py .

// Open Editor and replace the content of "analyze-images.py" file with :-

# Dataset: image_classification_dataset
# Table name:…
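Before running the script, you can confirm the new service account actually holds both roles granted above. A sketch, assuming SANAME is still exported in the same shell:

# List the roles bound to the challenge service account
gcloud projects get-iam-policy $DEVSHELL_PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:$SANAME@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" \
  --format="table(bindings.role)"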