Natural language processing (NLP) enables computers to communicate with people in their own language while also automating other language-related tasks. NLP enables computers to read text, interpret speech, analyze it, gauge sentiment, and identify which portions are significant.
Google Cloud Natural Language API provides developers with natural language understanding capabilities such as sentiment analysis, entity recognition, entity sentiment analysis, and other text annotations. The Natural Language API is a REST API that accepts and responds to JSON queries.
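For illustration, a minimal JSON request body for the documents:analyzeSentiment method might look like the following sketch (the sample text is a placeholder):

```json
{
  "document": {
    "type": "PLAIN_TEXT",
    "content": "The Natural Language API is easy to use."
  },
  "encodingType": "UTF8"
}
```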
👉 Basic Setup
To begin using Natural Language Processing in your journey, you can follow the following steps:
1️⃣ Start by creating a project. To use Google Cloud services, you must first create a project.
2️⃣ Enable billing in your account. A billing account is used to specify who pays for a specific set of resources and can be associated with one or more projects. The associated billing account is charged for project usage. Check if billing is enabled in your project.
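As a sketch, the billing status of a project can be checked with the gcloud CLI, assuming it is installed (PROJECT_ID is a placeholder for your project's ID):

```shell
gcloud billing projects describe PROJECT_ID
```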
3️⃣ You must activate the Cloud Natural Language API for your project.
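Assuming the gcloud CLI is installed, the API can also be enabled from the command line:

```shell
gcloud services enable language.googleapis.com
```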
4️⃣ Any client application that uses the API must be authenticated and provided access to the resources requested.
5️⃣ Make a service account and save the private key file.
To create a service account:
A. Go to the Create service account page, present in the console.
B. Select the specific project.
C. Enter a name in the Service account name field. The console fills in the Service account ID field based on this name. Similarly, enter a description in the Service account description field.
D. Click Create and continue.
E. Click Done to create the service account. The same window is used for the next steps.
To create a service account key:
A. Click on the email address in the console for the service account that you created.
B. Go to the Keys tab.
C. Click Add key, and then select Create new key.
D. Click Create, after which a JSON key file is downloaded to your system.
E. Click Close.
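The steps above can also be sketched with the gcloud CLI; the account name, display name, key file name, and PROJECT_ID below are placeholder assumptions:

```shell
# Create the service account
gcloud iam service-accounts create my-nl-account --display-name="My NL account"

# Create and download a JSON key for it
gcloud iam service-accounts keys create key.json \
    --iam-account=my-nl-account@PROJECT_ID.iam.gserviceaccount.com
```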
6️⃣ Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to provide authentication credentials to your application code. This variable applies only to the current shell session.
For PowerShell:
$env:GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"
For command prompt:
set GOOGLE_APPLICATION_CREDENTIALS=KEY_PATH
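For Linux or macOS (a Bash example, using the same KEY_PATH placeholder as above):

```shell
export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"
```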
7️⃣ Install and initialize the Google Cloud CLI. The gcloud CLI needs to be installed to use the Natural Language API from the command line.
8️⃣ Validate your setup and test the SDK. The gcloud CLI is used to test the authentication environment. Execute the following command to verify that no error occurs and that credentials are returned:
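Assuming the gcloud CLI is installed and you have authenticated, this check prints an access token if your credentials are set up correctly:

```shell
gcloud auth application-default print-access-token
```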
👉 Sentiment Analysis
Sentiment Analysis examines the provided text and identifies the dominant emotional opinion within it, in particular to establish whether the writer's attitude is positive, negative, or neutral. The analyzeSentiment method is used to analyze sentiment in the provided text.
The Natural Language API allows you to perform sentiment analysis directly on a file stored in Cloud Storage, eliminating the need to submit the file's contents in the body of your request.
Curl is used to make a POST request to the documents:analyzeSentiment method and provide the appropriate request body.
Consider the following example:
The example uses the gcloud auth application-default print-access-token command to obtain an access token for the service account that you created when you set up the project.
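A sketch of such a curl request, assuming a request.json file that contains the document to analyze (the endpoint shown is the v1 analyzeSentiment method):

```shell
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data @request.json \
  "https://language.googleapis.com/v1/documents:analyzeSentiment"
```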
Before using the gcloud CLI for sentiment analysis, follow this procedure:
1️⃣ Create an account on Google Cloud and create a new project. This lets you evaluate how the products perform in real-world scenarios.
2️⃣ Ensure that billing for the Cloud project is enabled.
3️⃣ Enable the Cloud Natural Language API.
4️⃣ Create a service account and a key for it.
To do so, follow the below steps:
⭐ In the console, go to the Create service account page.
⭐ Select the project.
⭐ In the Service account name field, enter a name. The Service account ID field is filled in based on this name. Also, enter a description in the Service account description field.
⭐ Click Create and continue.
⭐ Click Done to finish creating the service account. Use the same window for further steps.
5️⃣ Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the JSON file containing your service account key.
6️⃣ Install and launch the Google Cloud CLI.
Now you’re all set to perform sentiment analysis on Google Cloud. Just two actions can do it:
🔥 Sentiment Analysis Request
Use the gcloud command-line tool to invoke the analyze-sentiment command, and specify the text to be analyzed using the --content flag.
gcloud ml language analyze-sentiment --content="Pablo Picasso, Spanish painter, is known for 'Guernica'."
🔥 Clean Up
1️⃣ Navigate to the Manage resources page in the console.
2️⃣ Select the project you wish to delete from the project list, then click Delete.
3️⃣ Enter the project ID in the window, then click Shut down to delete the project.
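The project can also be deleted from the command line; a minimal sketch, assuming the gcloud CLI is installed and PROJECT_ID is a placeholder for your project's ID:

```shell
gcloud projects delete PROJECT_ID
```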
👉 Analyzing Entities
Entity Analysis searches the provided text for known entities, such as public figures, landmarks, and so on, and provides information about those entities. The analyzeEntities method is used to perform entity analysis.
Let us look at how to perform entity analysis on a text string via the Natural Language API:
// Imports the Google Cloud client library
const language = require('@google-cloud/language');
// Creates a client
const client = new language.LanguageServiceClient();
const text = 'Enter text';
// Prepares the document containing the provided text
const document = {
content: text,
type: 'PLAIN_TEXT',
};
// Detects entities in the document (note: await requires an async context)
const [result] = await client.analyzeEntities({document});
const entities = result.entities;
console.log('Entities:');
entities.forEach(entity => {
console.log(entity.name);
console.log(` - Type: ${entity.type}, Salience: ${entity.salience}`);
if (entity.metadata && entity.metadata.wikipedia_url) {
console.log(` - Wikipedia URL: ${entity.metadata.wikipedia_url}`);
}
});
👉 Classifying Content
Content Classification examines a document and produces a list of content categories corresponding to the document's text. Call the classifyText method to classify the content of a document. To classify content supplied as a string, make a POST request to the documents:classifyText REST method with the necessary request body.
Let us look at classifying content given as a string:
// Importing the Google Cloud client library
const language = require('@google-cloud/language');
// Creating a client
const client = new language.LanguageServiceClient();
const cont = 'Enter text';
// Preparing a document that has the provided text
const document = {
content: cont,
type: 'PLAIN_TEXT',
};
// Classifying the content in the document (note: await requires an async context)
const [classification] = await client.classifyText({document});
console.log('Categories:');
classification.categories.forEach(category => {
console.log(`Name: ${category.name}, Confidence: ${category.confidence}`);
});
👉 Syntactic Analysis
While most Natural Language methods examine the content of a given text, the analyzeSyntax method examines the structure of the language itself. Syntactic Analysis divides a given text into sentences and tokens (usually words) and offers linguistic information about those tokens.
Let us now look at performing syntactic analysis on a text string via Natural Language API:
// Importing the Google Cloud client library
const language = require('@google-cloud/language');
// Creating a client
const client = new language.LanguageServiceClient();
const cont = 'Enter the text to be analyzed';
// Preparing a document that contains the provided text
const document = {
content: cont,
type: 'PLAIN_TEXT',
};
// Specifying the encoding type for receiving the word offsets
const encodingType = 'UTF8';
// Detecting the syntax of the document (note: await requires an async context)
const [syntax] = await client.analyzeSyntax({document, encodingType});
console.log('Tokens:');
syntax.tokens.forEach(part => {
console.log(`${part.partOfSpeech.tag}: ${part.text.content}`);
console.log('Morphology:', part.partOfSpeech);
});
Syntactic analysis can also be performed directly on a file located in Cloud Storage by making the following alterations:
const bucketName = 'Enter bucket name';
const fileName = 'Enter file name';
// Preparing a document that references a text file in Cloud Storage
const document = {
gcsContentUri: `gs://${bucketName}/${fileName}`,
type: 'PLAIN_TEXT',
};
👉 Entity Sentiment Analysis
Entity Sentiment Analysis combines entity and sentiment analysis to identify the sentiment (positive or negative) conveyed about entities in the text. Entity sentiment is expressed numerically by a score and magnitude value for each mention of an entity. These scores are then averaged to provide an entity's overall sentiment score and magnitude.
Let us look at analyzing entity sentiment given a string:
// Importing the Google Cloud client library
const language = require('@google-cloud/language');
// Creating a client
const client = new language.LanguageServiceClient();
const text = 'Enter content to be analyzed';
// Preparing the document containing the provided text
const document = {
content: text,
type: 'PLAIN_TEXT',
};
// Detecting the sentiment of entities in the document (note: await requires an async context)
const [result] = await client.analyzeEntitySentiment({document});
const entities = result.entities;
console.log('Entities and sentiments:');
entities.forEach(entity => {
console.log(`Name: ${entity.name}`);
console.log(`Type: ${entity.type}`);
console.log(`Score: ${entity.sentiment.score}`);
console.log(`Magnitude: ${entity.sentiment.magnitude}`);
});
To use content from Cloud Storage, the following alterations have to be made to the above code:
const bucket_name='Enter bucket name';
const file_name='Enter file name';
// Preparing the document that represents the text file in Cloud Storage
const document = {
gcsContentUri: `gs://${bucket_name}/${file_name}`,
type: 'PLAIN_TEXT',
};
👉 Frequently Asked Questions
What is utility computing?
Utility computing enables users to pay just for what they use. It is a service model maintained by an organization that determines what sort of cloud services must be provided.
What is the distinction between elasticity and scalability in cloud computing?
Scalability is a feature of cloud computing that allows the rising workload to be handled by expanding resource capacity proportionally. Elasticity, on the other hand, is one of the features that emphasize the notion of commissioning and decommissioning a considerable amount of resource capacity.
What exactly is a hypervisor in cloud computing?
A hypervisor is a Virtual Machine Monitor (VMM) that manages virtual machine resources.
👉 Conclusion
This blog discussed the Google Cloud Natural Language API: setting it up for a project, sentiment analysis, entity analysis, content classification, syntactic analysis, and analyzing entity sentiment using Google Cloud tools.
Cheers, you have reached the end. Hope you liked the blog and it has added some knowledge to your life. Please look at these similar topics to learn more: Basics of GCP, Networking in GCP, and Google Cloud Platform.
Refer to our Coding Ninjas Studio Guided Path to learn Data Structures and Algorithms, Competitive Programming, JavaScript, System Design, and even more! You can also check out the mock test series and participate in the contests hosted by Coding Ninjas Studio! But say you're just starting and want to learn about questions posed by tech titans like Amazon, Microsoft, Uber, and so on. In such a case, for placement preparations, you can also look at the problems, interview experiences, and interview bundle.
You should also consider our premium courses to offer your career advantage over others!
Please upvote our blogs if you find them useful and exciting!