HOW TO ACE the Google Professional Cloud Architect certification exam...
Updated: Dec 18, 2020
I see that many of you are starting out in your Google Cloud Platform (GCP) certification journey with "Google Professional Cloud Architect".
There must be some reasons behind this: some of you may deduce that it is easier to study for and pass this exam compared to "Associate Cloud Engineer"; others may prefer not to be tested on the gcloud, gsutil, and kubectl CLIs, as their job roles do not require them to deal with commands.
Anyway, congratulations to all of you who have obtained any (or multiple) of the Google Cloud Platform certifications. For those who plan to get started with either "Associate Cloud Engineer" or "Professional Cloud Architect", you can read about my methodology in the blog linked above.
So which exam is easier to pass? In my personal opinion, and based on my own journey, if you have spent time preparing for the Associate Cloud Engineer exam, you will find it easier to pass the Professional Cloud Architect exam.
But there is a methodology I would like to share with you, and I believe you will benefit from it. The preparation approach differs among individuals, but you know what? I found that the preparation method recommended below works for me.
- You can read all about it in my Google Cloud Platform Associate Cloud Engineer blog, where I went into some detail (hence that long blog). Basically, it revolves around using the very good documentation curated by Google, scoped by the Google Exam Guide. This is a departure from the usual options recommended by others: whizlabs.com, coursera.org, acloudguru.com, linuxacademy.com, udemy.com, etc.
1. First things first, visit the official Google certification page for Professional Cloud Architect.
2. Go through the Google Professional Cloud Architect exam guide. Paste the content into an Excel or Google Sheets spreadsheet (my recommended approach, as described in my Associate Cloud Engineer blog).
Tips:
For the Google Professional Cloud Architect exam, what Google wants is for you to visualize yourself as a real customer-facing cloud architect, helping customers of all sizes and shapes architect or design solutions using whatever Google Cloud Platform products, services and solutions suit their i) technical and ii) business requirements.
3. To attain this mastery as a Professional Cloud Architect, you have to know what Google Cloud Platform has to offer in terms of its plethora of products, services and portfolios. Couple that with Google's partner ecosystem, and Google Cloud Platform has every imaginable customer covered, hasn't it? Don't be daunted though: Google's list is still manageable compared to Amazon Web Services', though it is not the length of the list that counts most of the time.
4. After getting a good grasp and decent understanding of Google Cloud Platform products, services and solutions, your next focus is to understand and memorize the key features and unique propositions of each product, service and solution in the GCP context, and to match each one's characteristics with use cases.
Let me illustrate:
Example:
A fictitious customer tells you that they need a database which can scale globally, to replace their on-premises Sales Order processing backend database, which is currently struggling to scale beyond the single data center they operate from. Consolidating all regional data for reporting is another painstaking task, as it requires them to merge all regional data into another big database.
From the above business requirements, you can then sift out the key requirements. Intuitively, you should be able to pick up the following:
Scale globally - This calls for a database that is globally distributed and can scale horizontally.
Sales Order processing - This implies a Relational Database Management System (RDBMS), i.e. a transactional database with the mandated, tightly enforced ACID properties: Atomicity, Consistency, Isolation, Durability. An RDBMS is also SQL based, so you can rule out all of GCP's NoSQL databases.
Consolidate all regional data - This implies that the on-premises Sales Order processing backend database is regional in nature, i.e. one database per region.
At the back of your head, you should be able to pick out one database, from your knowledge of GCP databases, that satisfies all the requirements.
Let's use the following GCP database diagram as a guide (Source: Google)

Upon analysis, you should be able to conclude that the only GCP RDBMS which can satisfy all the business and technical requirements is Google's invincible Cloud Spanner.
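To make this concrete, here is a minimal sketch of querying Cloud Spanner with plain SQL using the google-cloud-spanner Python client. The instance, database, table and column names below are hypothetical, purely for illustration:

# Minimal sketch: query a globally replicated Cloud Spanner database
# with standard SQL (hypothetical instance/database/table names).
from google.cloud import spanner

client = spanner.Client()                      # uses your default GCP project
instance = client.instance("sales-instance")   # hypothetical instance ID
database = instance.database("sales-db")       # hypothetical database ID

# Reads run against a consistent snapshot, whichever region serves them.
with database.snapshot() as snapshot:
    rows = snapshot.execute_sql(
        "SELECT OrderId, Region, Total FROM Orders WHERE Region = @region",
        params={"region": "APAC"},
        param_types={"region": spanner.param_types.STRING},
    )
    for row in rows:
        print(row)

The point to internalize for the exam is that Spanner gives you horizontal, global scale while your application still speaks ordinary SQL with ACID transactions.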
5. Basically, you need to repeat Steps 3-4 above for most of the Google Cloud Platform products, services and solutions, using the Exam Guide as a cue. I would highly recommend you focus on the following too, though, as they always say, it is not an exhaustive list:
- GCP databases - Cloud Spanner, Bigtable, BigQuery, Cloud SQL (Microsoft SQL Server, PostgreSQL and MySQL), Cloud Memorystore
- Google Cloud Storage, based on high availability (multi-region, dual-region, region) and storage classes (STANDARD, NEARLINE, COLDLINE & ARCHIVE). Please understand Cloud Storage Object Lifecycle Management too.
The aim of Cloud Storage Object Lifecycle Management is to help you save on cloud storage costs. But you have to define and enable lifecycle management for the intended Cloud Storage bucket, either via the gsutil CLI, the Google Cloud Storage console, or code, especially Python or the REST API.
Tip: You will not be tested on coding, but you may be presented with a small, easy-to-understand code snippet in JSON format. What you have to understand is the eventual state of the Cloud Storage objects after execution of the code snippet, so that you can pick the right answer. You have to know the logical sequence of such an object lifecycle management policy, for example: when Cloud Storage objects are MORE THAN 30 days old, you may want to either delete them or move them to a cheaper storage class like COLDLINE. Either of these options saves cost.
Example:
The following lifecycle management configuration JSON document specifies that all objects in this bucket that are more than 365 days old are deleted automatically:
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
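The same policy can also be set in code, as the exam guide's Python option suggests. Here is a minimal sketch using the google-cloud-storage Python client (the bucket name is hypothetical); the first rule mirrors the JSON document above:

# Minimal sketch: set lifecycle rules on a bucket with the
# google-cloud-storage Python client (hypothetical bucket name).
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-sales-archive")  # hypothetical bucket name

# Delete objects older than 365 days, equivalent to the JSON above.
bucket.add_lifecycle_delete_rule(age=365)
# Optionally, demote objects older than 30 days to COLDLINE to save cost.
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=30)

bucket.patch()  # persist the updated lifecycle configuration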
- Know the different connectivity options to Google Cloud Platform and their bandwidth limitations, which justify the choice of one over the other. For example, each Cloud VPN tunnel supports up to 3 Gbps, so to attain 10 Gbps of bandwidth you would need to choose Cloud Interconnect instead.
https://cloud.google.com/network-connectivity/docs/how-to/choose-product#cloud-interconnect
- Be familiar with DevOps and Continuous Delivery concepts, such as using a CI/CD tool like Jenkins together with tags, canary testing, and Google Kubernetes Engine.

- App Engine (Standard vs Flexible, and their use cases based on their characteristics: for example, App Engine Standard runs in a Google sandbox environment, while App Engine Flexible is hosted on GCP Compute Engine VMs, which gives customers the option of connecting to their App Engine Flexible environment via their Virtual Private Cloud (VPC) network)
6. Pay special attention to the three Google-provided case studies mentioned in the Exam Guide. I left them toward the end of my preparation, as by then I had gotten myself a better overall picture.
Tip: You will get at least 10 questions based on a combination of these three case studies. My advice is to include them in your spreadsheet too, as three separate sheets. Go through each of them and note, in your own words, which Google Cloud Platform products, services and/or solutions you would propose based on each case study's technical and business requirements.
Let's use Points 1 and 2 of the TerramEarth case study for illustration:
1. Solution concept
There are 20 million TerramEarth vehicles in operation that collect 120 fields of data per second. Data is stored locally on the vehicle and can be accessed for analysis when a vehicle is serviced. The data is downloaded via a maintenance port. This same port can be used to adjust operational parameters, allowing the vehicles to be upgraded in the field with new computing modules.
Approximately 200,000 vehicles are connected to a cellular network, allowing TerramEarth to collect data directly. At a rate of 120 fields of data per second, with 22 hours of operation per day, TerramEarth collects a total of about 9 TB/day from these connected vehicles.
2. Existing technical environment
TerramEarth’s existing architecture is composed of Linux and Windows-based systems that reside in a single U.S. west-coast-based data center. These systems gzip CSV files from the field and upload via FTP and place the data in their data warehouse. Because this process takes time, aggregated reports are based on data that is three weeks old.
With this data, TerramEarth has been able to preemptively stock replacement parts and reduce unplanned downtime of their vehicles by 60%. However, because the data is stale, some customers are without their vehicles for up to four weeks while they wait for replacement parts.
Analysis and proposed Google Cloud Platform products, services and/or solutions.
200,000 out of 20,000,000 vehicles are connected to a cellular network, so only 1% of the fleet is connected.
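A quick back-of-the-envelope check of these numbers (my own arithmetic, not part of the case study) makes the scale easier to reason about under exam pressure:

# Back-of-the-envelope check of the TerramEarth numbers (my own arithmetic).
connected = 200_000                      # vehicles on the cellular network
fleet = 20_000_000                       # total vehicles in operation
print(connected / fleet)                 # 0.01 -> only 1% connected

fields_per_sec = 120
hours_per_day = 22
daily_fields = connected * fields_per_sec * hours_per_day * 3600
daily_bytes = 9e12                       # ~9 TB/day per the case study
print(daily_fields)                      # ~1.9e12 fields/day
print(daily_bytes / daily_fields)        # ~4.7 bytes per field, on average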
Analysis #1, based on the provided Solution concept:
Data is stored locally on the vehicle and can be accessed for analysis when a vehicle is serviced. The data is downloaded via a maintenance port.
This implies that the collection method is a batch process.
Proposed GCP Product/Service/Solution to address the pain point:
Google Cloud IoT Core and its supporting products, like Cloud Dataflow for massive real-time and batch data ingestion (Extract, Transform & Load), should spring to mind. Cloud Pub/Sub is a globally scalable message queueing system, making it an excellent choice to act as a buffer for the streams of vehicle data arriving over the slow cellular network, while at the same time decoupling the specifics of the backend processing implementation. For example, the backend system can pick up messages from the Cloud Pub/Sub queue as and when it needs to, thanks to Pub/Sub's durable storage.
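For illustration, here is a minimal sketch of the ingest side using the google-cloud-pubsub Python client; the project ID, topic name and record fields are hypothetical:

# Minimal sketch: publish a vehicle telemetry record to Cloud Pub/Sub,
# which buffers it durably until the backend is ready to process it.
# Project ID, topic name and record fields are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("terramearth-demo", "vehicle-telemetry")

record = {"vehicle_id": "TE-000123", "engine_temp_c": 92.4, "ts": 1608249600}
future = publisher.publish(topic_path, json.dumps(record).encode("utf-8"))
print(future.result())  # message ID once Pub/Sub has durably stored it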
What happens if ten times as many vehicles are connected to the cellular network, and which GCP product can accommodate such exponential growth in future, bearing in mind that 9 TB/day is collected from 200,000 vehicles? If 10% of the fleet is connected, that means 90 TB/day!
Proposed GCP Product/Service/Solution to address the pain point:
120 fields of data per second per vehicle have to be collected. This implies that we need a NoSQL database suited to time-series data. Google Bigtable should come to mind, as its low latency and high throughput make it a perfect match for writing 9 TB or more of IoT data per day.
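As a sketch of that time-series write path, here is a minimal example with the google-cloud-bigtable Python client. The project, instance, table and column-family names are hypothetical, and the vehicle-ID-plus-timestamp row key is a common design pattern, not something the case study prescribes:

# Minimal sketch: write one telemetry reading to Bigtable.
# Instance/table/column-family names and the row-key scheme are hypothetical.
import time
from google.cloud import bigtable

client = bigtable.Client(project="terramearth-demo")  # hypothetical project
table = client.instance("telemetry").table("vehicle-readings")

# The row key combines vehicle ID and timestamp so each vehicle's readings
# sort together while writes spread across many vehicles (avoids hotspots).
row_key = f"TE-000123#{int(time.time())}".encode("utf-8")
row = table.direct_row(row_key)
row.set_cell("metrics", "engine_temp_c", b"92.4")
row.commit()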
Since there is no mention of the actual bandwidth, assume an aggregate cellular network bandwidth of 1 Gbps: 9 TB/day is about 72 terabits, which at 1 Gbps takes roughly 72,000 seconds, or about 20 hours, nearly a full day just to transfer the dataset to the FTP servers.
Proposed GCP Product/Service/Solution to address the pain point:
Google Cloud Storage should stand out as the ideal storage product here, since it is ideally placed to store processed or historical data in the COLDLINE and/or ARCHIVE storage classes, for example.
Its tight, native integration with many of the Google Cloud Platform products mentioned here, and the ease with which data can be imported and exported, make it the natural choice.
https://cloud.google.com/storage/docs/google-integration
Analysis #2, based on the provided Existing technical environment:
TerramEarth’s existing architecture is composed of Linux and Windows-based systems that reside in a single U.S. west-coast-based data center.
What happens if there is an outage of the single data center? How can you improve availability? Vehicle data is uploaded via FTP, then stored in the data warehouse, and the aggregated reports are three weeks old.
Proposed GCP Product/Service/Solution to address the pain point:
GCP provides Google Compute Engine and Google Kubernetes Engine to host Linux and Windows-based systems.
Bigtable supports replication to address the single point of failure of the one data center: a replicated Bigtable instance spans at least two zones within a region, and can span multiple regions.
Regional and zonal persistent disks (block devices), which can be used with Google Compute Engine VMs, should also come to your mind. Regional persistent disks, as the name implies, offer higher availability, since they replicate data between two zones in the same region.
For the data warehouse, BigQuery should be the number one choice, as it is Google's MANAGED, SERVERLESS, enterprise-grade and highly scalable data warehouse product with a familiar SQL interface. Its ability to store data in date- or timestamp-partitioned tables, populated either from live streaming data or via conventional batch loads, makes it especially appealing as a data warehousing database.
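As a minimal sketch of that partitioning property, here is how a day-partitioned table might be created with the google-cloud-bigquery Python client; the project, dataset, table and schema are hypothetical:

# Minimal sketch: create a timestamp-partitioned BigQuery table.
# Project, dataset, table and schema are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table(
    "terramearth-demo.warehouse.vehicle_readings",  # hypothetical table ID
    schema=[
        bigquery.SchemaField("vehicle_id", "STRING"),
        bigquery.SchemaField("engine_temp_c", "FLOAT"),
        bigquery.SchemaField("ts", "TIMESTAMP"),
    ],
)
# Partition by day on the ts column; queries that filter on ts scan
# only the relevant partitions, keeping reports fast and cheap.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="ts"
)
client.create_table(table)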
For end users like data analysts or business analysts, BigQuery's speed, thanks to BI Engine, its in-memory business intelligence analytics engine, is another pull factor for reporting, and it addresses the issue of aggregated reports that are three weeks old!
7. Other resources I used to pass the Google Professional Cloud Architect exam include the following books:
Official Google Cloud Certified Professional Cloud Architect Study Guide

Google Cloud Platform for Architects

8. When you have passed either the Associate Cloud Engineer or the Professional Cloud Architect exam, you will receive an email from Google, sent to the email address you used to register for the exam at https://www.webassessor.com.
In that email, you have the option to claim the certificate and have it printed or downloaded, add it to your LinkedIn profile, or share it to any of your preferred social networks. In addition, you can opt into having your digital certificate shown in the Google Hall of Fame, the Google Cloud Certified directory.
As icing on the cake, to thank you for supporting the Google Cloud certification program and to congratulate you on your achievement, Google also offers you a free piece of Google Cloud certification merchandise. Simply search for the Google Cloud Certification Perks Webstore and enter the redemption code included in the same email mentioned above.
Note:
If you also hold AWS certifications like me, you will have similar benefits from Google. What AWS offers you in addition is:
AWS Certified Practice Exam Voucher
AWS Certified Exam Discount Benefit
AWS Certification SME Program - basically, you may be invited to participate in AWS Certification exam development events, like the creation and review of exam content
One disappointing event I encountered while trying to redeem my AWS certification merchandise for both AWS Certified Solutions Architect - ASSOCIATE & PROFESSIONAL from the AWS Gear Store was the following message:
We're Sorry, we no longer ship to this country. Please update with a valid Country address.
By the way, Google, on the other hand, was willing to ship to my Little Red Dot, Singapore, even merchandise as big as a backpack.
Lastly, I would like to wish you success in your Google Professional Cloud Architect certification exam.
Please watch this space for updates, or drop me a note with feedback and suggestions to make our blog better together.
Thank you so much for your time.