Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. Choosing an Endpoints option
2.1. Computing option constraints
2.1.1. App Engine generation 1 environment constraints
2.1.2. Cloud Functions and App Engine do not support gRPC APIs
3. Cloud Endpoints for OpenAPI
3.1. Supported compute platforms
4. Deploying a Cloud Endpoints-managed API
4.1. Prerequisites
4.2. Launching Cloud Shell
4.3. Obtaining the sample code
4.4. Deploying the Endpoints configuration
4.5. Enabling required services
4.6. Deploying the API backend
4.7. Requesting information from the API
4.8. Monitoring API activity
4.9. Adding a quota to the API
4.10. Cleaning Up
5. Cloud Endpoints for gRPC
5.1. API management
5.2. Supported compute platforms
6. Cloud Endpoints Frameworks
6.1. API management
6.2. Limitations
7. Frequently Asked Questions
7.1. What operating systems does Cloud Endpoints support?
7.2. Does Endpoint Standard support PCI DSS malware requirement 5?
7.3. Does Endpoint Standard use signature-based protection?
8. Conclusion
Last Updated: Mar 27, 2024

Cloud Endpoints

Author Prerna Tiwari

Introduction

Cloud Endpoints is an API management system that allows you to secure, monitor, analyse, and set quotas on your APIs by utilising the same infrastructure that Google uses for its own APIs. After deploying an API to Endpoints, you can use the Cloud Endpoints Portal to create a developer portal, which is a website where API users can view documentation and interact with your API.

Now, let's have a look at the choices we get for Cloud Endpoints.

Choosing an Endpoints option

You have three options for having your API managed by Cloud Endpoints, depending on where your API is hosted and the type of communication protocol your API uses:

  • Cloud Endpoints for gRPC
  • Cloud Endpoints for OpenAPI
  • Cloud Endpoints Framework for the App Engine standard environment

Endpoints supports various Google Cloud computing options for hosting your API's backend code. Endpoints provides API management by collaborating with the Extensible Service Proxy (ESP) or the Extensible Service Proxy V2 (ESPv2).

Computing option constraints

Endpoints for OpenAPI and Endpoints for gRPC may use either ESP or ESPv2 as a proxy. On non-serverless platforms, ESP/ESPv2 is deployed as a container in front of the application or as a sidecar alongside it. For serverless platforms such as Cloud Run, Cloud Functions, and App Engine, ESPv2 is deployed as a Cloud Run service that acts as a remote proxy managing your serverless applications.

After you deploy the backend code for your API, ESP or ESPv2 intercepts all requests and performs any required checks (such as authentication) before forwarding the request to the API backend. When the backend responds, ESP uses Service Infrastructure to collect and report telemetry.
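To make the sidecar pattern concrete, here is a hedged sketch of a GKE Deployment running ESP next to an API container. Only the ESP image path is the published container; every other name (the app, the backend image, SERVICE_NAME) is an illustrative assumption, not taken from this article.

```yaml
# Hedged sketch: ESP as a sidecar in a GKE Deployment.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: airport-api
spec:
  replicas: 1
  selector:
    matchLabels:
      app: airport-api
  template:
    metadata:
      labels:
        app: airport-api
    spec:
      containers:
        # ESP sidecar: receives traffic, performs checks, forwards requests.
        - name: esp
          image: gcr.io/endpoints-release/endpoints-runtime:1
          args:
            - "--http_port=8081"
            - "--backend=127.0.0.1:8080"   # the API container on localhost
            - "--service=SERVICE_NAME"     # host field from openapi.yaml
            - "--rollout_strategy=managed" # track the latest config rollout
          ports:
            - containerPort: 8081
        # The API backend itself, listening on port 8080.
        - name: api
          image: gcr.io/YOUR_PROJECT/airport-api
          ports:
            - containerPort: 8080
```

In this layout, external traffic reaches ESP on port 8081, and only checked requests are forwarded to the backend on localhost.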

App Engine generation 1 environment constraints

 

Endpoints for the App Engine generation 1 environment is implemented using Endpoints Frameworks, which supports only the Python 2.7 and Java 8 runtime environments.

Endpoints Frameworks does not use ESP because the App Engine standard environment did not support multi-container deployments when it was being developed. Endpoints Frameworks, on the other hand, includes a built-in API gateway that provides API management features comparable to those provided by ESP for Endpoints for OpenAPI and Endpoints for gRPC.

Cloud Functions and App Engine do not support gRPC APIs.

 

gRPC is a framework for remote procedure calls (RPC) that can run in any environment. A client application can use gRPC to call methods in a server application on another machine as if it were a local object. Bi-directional streaming with HTTP/2 transport is a key feature of gRPC.

App Engine and Cloud Functions do not support HTTP/2, so they cannot host gRPC APIs.


Cloud Endpoints for OpenAPI

Cloud Endpoints is an API management system that allows you to secure, monitor, analyse, and set quotas on your APIs while utilising the same infrastructure that Google uses for its own APIs. Endpoints provides API management in collaboration with the Extensible Service Proxy (ESP) and the Extensible Service Proxy V2 (ESPv2). More information about Endpoints, ESP, and ESPv2 can be found in About Endpoints.

Endpoints supports OpenAPI Specification version 2 (formerly known as the Swagger spec), the industry standard for defining REST APIs. If you're not familiar with the OpenAPI Specification, take a look at the OpenAPI Overview.

Supported compute platforms

For API management, Endpoints for OpenAPI relies on either ESP or ESPv2. Both ESP and ESPv2 are open-source projects, available as:

  • A container in Google Container Registry
  • Source code on GitHub

Deploying a Cloud Endpoints-managed API

This quickstart shows how to deploy a sample API managed by Endpoints. The sample code includes:

  • A REST API that is queried to determine the name of an airport based on its three-letter IATA code.
  • A script responsible for uploading the API configuration to Endpoints.
  • A script for setting up an App Engine flexible environment backend to host the API.

After sending requests to the sample API, you can view Google Cloud's operations suite logs and the Endpoints activity graphs in the Google Cloud console. These tools let you monitor your APIs and gain insight into how they are used.

This quickstart uses scripts to simplify the configuration steps so you can see the activity graphs and logs in action as quickly as possible.

Prerequisites

  • Create an account if you're new to Google Cloud to see how our products perform in real-world scenarios. In addition, new customers receive $300 in free credits to run, test, and deploy workloads.
  • Select or create a Google Cloud project in the Google Cloud console's project selector page.
  • Check that billing for your Cloud project is enabled.

Launching Cloud Shell

  • Make sure you're in the project you want to use for the sample API in the console.
  • Launch Cloud Shell.
  • A Cloud Shell session appears at the bottom of the console in a new frame and displays a command-line prompt. The session may take a few seconds to initialise.
  • If you're working with an existing project, make sure you have the most recent version of all gcloud components installed:
     
gcloud components update

Obtaining the sample code

  • To obtain the sample API and scripts, run the following command in Cloud Shell:
     
git clone https://github.com/GoogleCloudPlatform/endpoints-quickstart
  • Navigate to the directory with the sample code:
     
cd endpoints-quickstart

Deploying the Endpoints configuration

An OpenAPI configuration file describing the API is required to publish a REST API to Endpoints. The sample API includes an OpenAPI configuration file called openapi.yaml.

Endpoints creates and manages APIs and services using Service Management, a Google Cloud infrastructure service. Endpoints are used to manage APIs by deploying the OpenAPI configuration file for the API to Service Management.
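For reference, an Endpoints OpenAPI configuration has roughly the following shape. This is a minimal sketch with illustrative values; it is not the actual contents of the sample's openapi.yaml.

```yaml
# Minimal sketch of an Endpoints OpenAPI v2 config; values are illustrative.
swagger: "2.0"
info:
  title: Airport Codes
  description: Returns an airport's name given its IATA code.
  version: "1.0.0"
# The host field identifies the Endpoints service. In this quickstart,
# deploy_api.sh fills in your project ID here.
host: YOUR-PROJECT-ID.appspot.com
schemes:
  - https
paths:
  /airportName:
    get:
      summary: Look up an airport name by IATA code
      operationId: airportName
      parameters:
        - name: iataCode
          in: query
          required: true
          type: string
      responses:
        "200":
          description: The airport's name.
          schema:
            type: string
```

The host field is the part Endpoints uses to identify your service, which is why the deployment script rewrites it with your project ID.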

To deploy the Endpoints configuration, follow these steps:

  • In Cloud Shell, from the endpoints-quickstart directory, enter:
     
cd scripts
  • Run the deployment script:
     
./deploy_api.sh


Endpoints identifies the service using the host field in the OpenAPI configuration file. The deploy_api.sh script adds your Google Cloud project's ID to the name specified in the host field. When you create an OpenAPI configuration file for your own service, you must do this manually.

The script then runs gcloud endpoints services deploy openapi.yaml to deploy the OpenAPI configuration to Service Management.

Service Management outputs information to the console as it creates and configures the service. You can safely disregard the warnings in openapi.yaml about paths that do not require an API key. Upon successful completion, you will see a line displaying the service configuration ID and the service name.

Enabling required services

  • Endpoints requires at least these Google services to be enabled:

    Name                                Title
    servicemanagement.googleapis.com    Service Management API
    servicecontrol.googleapis.com       Service Control API
    endpoints.googleapis.com            Google Cloud Endpoints
  • To ensure that the required services are enabled, run the following command:    
     
gcloud services list

 

  • If the required services are not listed, enable them:
     
gcloud services enable servicecontrol.googleapis.com
gcloud services enable servicemanagement.googleapis.com
gcloud services enable endpoints.googleapis.com

 

  • Enable your Endpoints service as well:
     
gcloud services enable YOUR-PROJECT-ID.appspot.com

Deploying the API backend

So far you have deployed the OpenAPI configuration to Service Management, but not the code that serves the API backend. The sample's deploy_app.sh script creates an App Engine flexible environment to host the API backend, and then deploys the API to App Engine.

To set up the API backend, follow these steps:

  • Run the following script in Cloud Shell, in the endpoints-quickstart/scripts directory:
     
./deploy_app.sh

 

  • To create an App Engine environment in the us-central region, the script runs: gcloud app create --region="$REGION"
    Creating the App Engine flexible environment backend takes a few minutes. When it completes, the script outputs:
     
Success! The app is now created.

 

  • The script then executes the gcloud app deploy command, which deploys the sample API to App Engine.
    The result is as shown below: 
     
Deploying ../app/app_template.yaml...You are about to deploy the following services:

 

  • Deploying the API to App Engine takes a few minutes. After the API is successfully deployed, the following output is produced:
     
Deployed service [default] to [https://example-project.appspot.com]

 

Requesting information from the API

After deploying the API, you can send requests to it in Cloud Shell by running the following script:
 


./query_api.sh


The script echoes the curl command it uses to send a request to the API, then displays the result:
 

curl "https://example-project.appspot.com/airportName?iataCode=SFO"
San Francisco International Airport


The API expects a single query parameter, iataCode, set to a valid IATA airport code such as SEA or JFK. For example:
 

./query_api.sh JFK

You have just deployed and tested an API on Endpoints.

Monitoring API activity

With APIs deployed on Endpoints, you can monitor critical operations metrics in the console and use Cloud Logging to gain insight into your users and usage.

  • Run the traffic generation script in Cloud Shell to populate the graphs and logs:
     
./generate_traffic.sh
  • Examine the activity graphs for your API in the console.
    The requests may take a few moments to be reflected in the graphs. While you wait for the data to appear:

    • Click +Permissions if the Permissions side panel isn't already open. The Permissions panel lets you control who has access to the API and how much access they have.
    • Then, select Deployment history. This tab shows a history of your API deployments, including the time of deployment and who made the change.
    • Select Overview. You can see the incoming traffic. After the traffic generation script has run for about a minute, three lines appear on the Total latency graph (50th, 95th, and 98th percentiles); this data estimates response times.
       
  • Scroll down to the table and click View Logs for GET/airportName under Links. The Logs Explorer page displays the API request logs.
  • Return to Cloud Shell.
  • Press Control+C to stop the script.

Adding a quota to the API

Endpoints lets you set quotas that control how frequently applications can call your API. A quota can protect your API from overuse by a single client.

  • Deploy the Endpoints configuration with a quota in Cloud Shell.
     
./deploy_api.sh ../openapi_with_ratelimit.yaml

 

Within a minute of deploying an updated Endpoints configuration, it becomes operational.

  • Navigate to the Credentials page in the console.
  • Click Create credentials, then API key. A new API key is displayed on the screen.
  • Select Copy.
  • Enter the following into Cloud Shell, replacing YOUR_API_KEY with your newly created API key:
     
export API_KEY=YOUR_API_KEY

 

  • Send a request to your API using the API key you just created.
     
./query_api_with_key.sh $API_KEY

 

The result is something like this:

curl -H 'x-api-key: AIzeSyDbdQdaSdhPMdiAuddd_FALbY7JevoMzAB' "https://example-project.appspot.com/airportName?iataCode=SFO"
San Francisco International Airport

 

  • The API is now limited to 5 requests per minute. To send traffic to the API and trigger the quota limit, run:
     
./generate_traffic_with_key.sh $API_KEY

 

  • Press Control+C to stop the script after it has run for 5-10 seconds.
  • Send another authenticated API request.
     
./query_api_with_key.sh $API_KEY

You will get the following output:

{
 "code": 8,
 "message": "Insufficient tokens for quota 'airport_requests' and limit 'limit-on-airport-requests' of service 'example-project.appspot.com' for consumer     'api_key:AIzeSyDbdQdaSdhPMdiAuddd_FALbY7JevoMzAB'.",
 "details": [
  {
   "@type": "type.googleapis.com/google.rpc.DebugInfo",
   "stackEntries": [],
   "detail": "internal"
  }
 ]
}

 

If your result differs, rerun the generate_traffic_with_key.sh script and try again.

Your API has been rate-limited successfully. You can also set different limits on different API methods, create different types of quotas, and track which consumers use which APIs.
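The quota you just deployed can be expressed in the OpenAPI config roughly as sketched below. The metric and limit names mirror the error message shown above; the exact contents of openapi_with_ratelimit.yaml are an assumption, not copied from the sample.

```yaml
# Hedged sketch of a per-minute quota in an Endpoints OpenAPI config.
x-google-management:
  metrics:
    - name: airport_requests
      displayName: Airport requests
      valueType: INT64
      metricKind: DELTA
  quota:
    limits:
      - name: limit-on-airport-requests
        metric: airport_requests
        unit: "1/min/{project}"
        values:
          STANDARD: 5          # 5 requests per minute per consumer
paths:
  /airportName:
    get:
      # Each call to this method costs 1 unit of the airport_requests metric.
      x-google-quota:
        metricCosts:
          airport_requests: 1
```

The quota section defines a named metric and its per-minute limit, while the per-method metricCosts entry determines how many units each call consumes.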

Cleaning Up

To avoid incurring charges to your Google Cloud account for the resources used on this page, you can delete your Cloud project, which stops billing for all resources used in that project.

  1. Navigate to the Manage resources page in the console.
  2. Select the project to delete from the project list, then click Delete.
  3. Enter the project ID in the dialogue, then click Shut down to delete the project.

Cloud Endpoints for gRPC

Google's gRPC is a high-performance, open-source universal RPC framework. With gRPC, a client application can call methods on a server application on a different machine as if it were a local object, making it easier to create distributed applications and services.

One of the primary advantages of using gRPC is documentation; you can generate reference documentation for your API using your service configuration and API interface definition files. For more information, see Developer Portal Overview.

API management

Endpoints provides API management by collaborating with the Extensible Service Proxy (ESP) or the Extensible Service Proxy V2 (ESPv2).

You can use Endpoints for gRPC's API management capabilities to add an API console, monitoring, hosting, tracing, authentication, and more to your gRPC services. Furthermore, once special mapping rules are specified, ESP and ESPv2 translate RESTful JSON over HTTP into gRPC requests. This means you can set up a gRPC server managed by Endpoints and call its API with a gRPC or JSON/HTTP client, giving you a lot more flexibility and ease of integration with other systems.
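To make the JSON/HTTP-to-gRPC translation concrete, here is a hedged sketch of an HTTP rule in a gRPC service configuration. The Bookstore names follow Google's public gRPC samples and are assumptions, not part of this article.

```yaml
# Hypothetical gRPC service config (e.g. api_config.yaml) with an HTTP rule
# that lets ESP/ESPv2 translate a RESTful GET into a gRPC method call.
type: google.api.Service
config_version: 3
name: bookstore.endpoints.YOUR-PROJECT-ID.cloud.goog
apis:
  - name: endpoints.examples.bookstore.Bookstore
http:
  rules:
    # GET /v1/shelves is transcoded into Bookstore.ListShelves.
    - selector: endpoints.examples.bookstore.Bookstore.ListShelves
      get: /v1/shelves
```

With a rule like this deployed, a plain HTTP client calling GET /v1/shelves and a gRPC client calling ListShelves both reach the same backend method.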

 

Deployed Endpoints gRPC application

Supported compute platforms

For API management, Endpoints for gRPC relies on either ESP or ESPv2. Both ESP and ESPv2 are open-source projects, available as:

  • A container in Google Container Registry
  • Source code on GitHub

Cloud Endpoints Frameworks 

Cloud Endpoints Frameworks is a web framework for the standard Python 2.7 and Java 8 runtime environments on App Engine. Cloud Endpoints Frameworks includes tools and libraries for creating REST APIs and client libraries for your application.

Endpoints Frameworks, like other web frameworks, manages the low-level communication details of HTTP requests and responses for your application. Endpoints Frameworks routes the request's URL to the function or method in your code that processes the request when a client sends a request to your API. Endpoints Frameworks converts the response value to JSON and sends it. You add metadata to your source code (via annotations in Java and decorators in Python). The metadata defines the surface of your application's REST APIs.

API management

For Endpoints for OpenAPI and Endpoints for gRPC, the Extensible Service Proxy (ESP) provides API management features. ESP runs in a container alongside your backend instances.

Endpoints Frameworks does not use ESP because the App Engine standard environment did not support multi-container deployments when it was being developed. Endpoints Frameworks, on the other hand, includes a built-in API gateway that provides API management features comparable to those provided by ESP for Endpoints for OpenAPI and Endpoints for gRPC.

Endpoints Frameworks intercepts all requests, performs any necessary checks (such as authentication), and then forwards them to the API backend. When the backend responds, Endpoints Frameworks collects and reports telemetry. Metrics for your API can be found on the Endpoints Services page of the console.

Limitations

  • Endpoints Frameworks is available only in the App Engine standard Python 2.7 and Java 8 runtime environments.
  • Endpoints Frameworks does not support the Node.js, PHP, or Go runtime environments on the App Engine standard environment.
  • Endpoints Frameworks does not support the App Engine flexible environment.
  • Endpoints Frameworks does not support Compute Engine, Google Kubernetes Engine, or Cloud Functions.

Frequently Asked Questions

What operating systems does Cloud Endpoints support?

Endpoint Standard is supported on Windows, macOS, and Linux operating systems.

Does Endpoint Standard support PCI DSS malware requirement 5?

VMware Carbon Black Cloud Endpoint Standard is certified to replace antivirus in meeting PCI DSS requirement 5 for Windows, Mac, and Linux systems.

Does Endpoint Standard use signature-based protection?

Endpoint Standard can block known malware, suspect malware, and potentially unwanted programmes using traditional signature-based prevention. Traditional signature-based prevention is useful for stopping commodity malware, allowing Endpoint Standard to focus resources on new, unknown, and advanced threats.

Conclusion

In this blog, we have extensively discussed the concept of Cloud Endpoints. We started by introducing Cloud Endpoints and the options for choosing one, covered Cloud Endpoints for OpenAPI and gRPC, and concluded with Cloud Endpoints Frameworks.

You can also refer to our Guided Path on Coding Ninjas Studio to upskill yourself in Data Structures and Algorithms, Competitive Programming, System Design, and many more! You may also check out the mock test series and participate in the contests hosted on Coding Ninjas Studio! For placement preparations, you must look at the problems, interview experiences, and interview bundle.

Nevertheless, you may consider our paid courses to give your career an edge over others!

Happy Coding!
