Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. What is Kafka?
3. Add Kafka API to the Project🔥
3.1. To Import API Definition
3.2. Create API Definition🎯
4. Simulating Producers
4.1. Use-case
4.2. Produce Events
4.3. Test Step Settings
5. Connection Settings
5.1. Message Editor🧾
6. Simulating Consumers
6.1. Consume events
6.2. Test Step Settings
7. Connection Settings
7.1. Property List
7.2. Received Data🧾
8. Authentication in Kafka
9. SASL/PLAIN with SSL Encryption
10. Other Authentication Methods
11. Frequently Asked Questions
11.1. What is Ready API?
11.2. What is an API, and why is it used?
11.3. What are the four types of API?
11.4. What kind of tool is Kafka synchronous or asynchronous?
11.5. What communication pattern does Kafka follow?
12. Conclusion
Last Updated: Mar 27, 2024

Kafka Requests in Ready API

Author Gunjan Batra

Introduction

In this blog, you will learn what Kafka is and how to work with Kafka requests in Ready API. You will learn how to create and import an API into your project, and you will see how to simulate producers and consumers in Kafka. Furthermore, we will explore authentication in Kafka and the different authentication methods.

Kafka Image

Ready API is a REST and SOAP API automation tool. It is an integrated suite of applications for API testing, and you will use it for functional testing, virtualization, and integration. It is an easy-to-use tool for development teams that makes the testing process simple and reliable.

What is Kafka?

Kafka is an asynchronous API. It deals with requests and responses differently from synchronous APIs.

What is Kafka Image
  • In an asynchronous API, you do not wait for a response. Instead, you subscribe to a topic and receive events as they arrive.
     
  • A Kafka request performs two operations, request and response, which correspond to publish and subscribe.
     
  • The API Connection test step configures these two operations.
                        
The Publish mode:

Publish Kafka

The Subscribe mode:

Subscribe kafka

The above images show the Publish and Subscribe views of the Kafka request.
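To make the publish/subscribe idea concrete, here is a minimal in-memory sketch (not ReadyAPI or Kafka code; the topic name and payload are illustrative) in which a subscriber registers a callback and receives events without ever requesting a response directly:

```python
from collections import defaultdict

# Minimal in-memory publish/subscribe sketch.
subscribers = defaultdict(list)

def subscribe(topic, callback):
    # A consumer registers interest in a topic instead of polling for responses.
    subscribers[topic].append(callback)

def publish(topic, event):
    # The publisher does not wait for any subscriber's response.
    for callback in subscribers[topic]:
        callback(event)

received = []
subscribe("Payment-confirmation", received.append)
publish("Payment-confirmation", {"status": "SUCCESS"})
print(received)
```

The key point of the sketch is the decoupling: the publisher only knows the topic, and the subscriber receives the event whenever it is published.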

Add Kafka API to the Project🔥

To test a Kafka API, you first need to add it to the project.

To add a Kafka API to the project, you can either import an AsyncAPI definition directly or create the API manually.

To Import API Definition

1️⃣ To import an API definition, click the Plus icon next to the APIs node.

Import API

Or 

You can also right-click the APIs node. You will get an option to import API definitions. Select it, and you can import the API definition.

Import API

2️⃣ Enter the file path on your computer, or enter a URL directly. Ready API will automatically detect the definition type. Click the Import API button, and the API will be added to your project.

import API

3️⃣ The next step is to use the API in test cases. Add an API Connection test step to a test case.

Create API Definition🎯

1️⃣To create an API definition in the project, you first need to add the Kafka API.

Create API

2️⃣ Next, specify the service endpoints. If required, you can also configure authorization.

Add endpoint

3️⃣Select the required topics in the API. 

4️⃣ The next step is to use the API in test cases. Add an API Connection test step to a test case.

API connection test

Simulating Producers

Producing events is one half of working with Kafka; consuming them is the other. You can use ReadyAPI to simulate an event producer and publish events to your Kafka broker.

Use-case

Let's understand this with an example. Suppose you have a payment system and want to check whether it works correctly. When a payment succeeds, the system should publish an event to the Payment-confirmation topic, so subscribers receive a message saying the payment was successful.

To check the system, we will use ReadyAPI to simulate the producer and publish the payment status.
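Before wiring up the test step, it helps to see what such an event could look like on the wire. The sketch below builds a hypothetical payment-confirmation payload (all field names and values are assumptions, not part of any ReadyAPI or Kafka specification) and encodes it the way a producer would before publishing:

```python
import json

# Hypothetical payment-confirmation event; field names are illustrative.
event = {
    "orderId": "ORD-1042",
    "status": "SUCCESS",
    "amount": 499.0,
    "currency": "INR",
}

# Kafka transports raw bytes, so the JSON message is UTF-8 encoded
# before it is published to the Payment-confirmation topic.
payload = json.dumps(event).encode("utf-8")
print(payload.decode("utf-8"))
```

In ReadyAPI, this JSON body is what you would paste into the Data field of the Publish test step.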

Produce Events

Events are messages that describe actions we expect our system to perform or report.

To start with producing events, we need to follow some steps.

1️⃣Add a Kafka API to the project

Import the API definition or create the API. We have already learned about adding the Kafka API to the project.

2️⃣ Create a test step

The next step is to create a Kafka test step. In ReadyAPI, to produce a Kafka event, you can use the API Connection test step with the Publish option.

To do this, follow the steps mentioned below:

  1. Open a test case and add an API Connection test step.
     
  2. In the dialog box that appears, select the Publish operation and click Add Test Step:
Produce Events

OR

  1. In the Navigator panel, right-click the publish operation under the API node and select Add to Test Case.
     
  2. Select the test case to add the test step to, and confirm when the dialog box appears.
Select test case

3️⃣ Send the Event

Enter the message in the Data field and click Connect to publish the event to the topic.

Send event

You will learn about Authentication in Kafka later in this blog.

Test Step Settings

Test step toolbar

As the image shows, the test step toolbar supports various actions. It lets you change the authorization settings and the connection settings for the test step.

Test step toolbar

Connection Settings

To change the connection settings, click Connection Settings in the test step toolbar.

Option: Confluent Settings

Description: Used to connect to the Confluent schema registry. This setting is only available when the JSON via Protobuf or JSON via Avro message format is selected.

  • Schema Registry URL: The schema registry URL.
  • Registry Authorization: The profile used to authenticate to the schema registry.

Option: Publish Settings

Description: Other parameters that Kafka providers might require.

Option: Reset Settings to Default

Description: Resets the connection settings to their defaults. The defaults depend on the selected environment.

Confluent Settings and Publish Settings are not available when there is no environment.

Property List

The API Connection test step properties panel allows you to change the test step behaviour using the step properties in the Navigator.

  • Name: The test step name.
  • Description: Information about the test step.

Message Editor🧾

The test step will send the message to the topic through the message editor. In the message editor, you pass the actual message that you want to send.

Message editor

Format of the message: The format of the message to send. The following formats are supported:

JSON: Regular JSONs are present in the messages.

JSON via Avro(Schema Registry):  Avro schema stored in the Confluent schema registry helps serialize the messages. Setting up a schema registry is possible through connection settings.

JSON via Protobuf: Protobuf schema stored in the Confluent schema registry helps serialize the messages. Setting up a schema registry is possible through connection settings.

Custom: In custom, you will use the Protobuf schema or Avro schema to serialize the messages. You can connect to the schema registry from the connection settings. 

Metadata: You can pass parameters with the message. Parameters are of three types:

Header: In the header parameter, messages are sent with Kafka headers. 

Path:  By the path parameter, you can update the values in the channel.

Path

Kafka: Parameters that are specific to Kafka. Key and Partition are the two Kafka parameters that are supported. 

Data: The messages are generally in JSON format. You can format the message with the Beautify button. To pull values from another test step or from properties, use Get Data.
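The metadata and data described above can be pictured as a single record. The sketch below (all field names and values are hypothetical, mirroring the Message Editor fields rather than any real API) assembles headers, a Kafka key and partition, and a beautified JSON body:

```python
import json

# Hypothetical record, mirroring the Message Editor fields described above.
record = {
    "headers": {"source": "payment-service"},   # Header parameter
    "key": "ORD-1042",                          # Kafka 'Key' parameter
    "partition": 0,                             # Kafka 'Partition' parameter
    "data": {"status": "SUCCESS", "amount": 499.0},
}

# The Beautify button does essentially this: pretty-print the JSON body.
beautified = json.dumps(record["data"], indent=2)
print(beautified)
```

Only the `data` part is the message body; the headers, key, and partition travel alongside it as metadata.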

Simulating Consumers

Consuming events is the other half of working with Kafka. You can use ReadyAPI to simulate an event consumer and read events from a topic on your Kafka broker.

Use case

Let's understand this with an example. Suppose you have a payment system and want to check whether it works correctly. When a payment succeeds, the system should publish an event to the Payment-confirmation topic, so subscribers receive a message saying the payment was successful.

To check the system, we will use ReadyAPI to simulate a consumer and verify the payment status message by subscribing to the Payment-confirmation topic.

Consume events

1️⃣Add a Kafka API to the project

Import the API definition or create the API, as described earlier in this blog.

2️⃣ Create a test step

The next step is to create a Kafka test step. In ReadyAPI, to consume a Kafka event, you can use the API Connection test step with the Subscribe option.

  1. Open a test case and add an API Connection test step.
  2. In the dialog box that appears, select the Subscribe operation and click Add Test Step.
Add test step

OR

  1. In the Navigator panel, right-click the subscribe operation under the API node and select Add to Test Case.
  2. Select the test case to add the test step to, and confirm when the dialog box appears.
Dialog box

3️⃣ Subscribe to the Topic

To consume events, you need to connect. ReadyAPI will receive the events that arrive on the topic, and the test step stays connected until the disconnect criteria are met.

Subscribe kafka

You will learn about Authentication in Kafka later in this blog.

4️⃣Add Assertions

To validate the messages, add assertions to the test step. The assertions are applied to every message the Subscribe test step consumes from the topic.

Add assertions

Assertions are useful in the API connection test step.
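In plain code, an assertion on a consumed message boils down to parsing the payload and checking a field. The sketch below uses a hypothetical message from the Payment-confirmation topic (the field names are assumptions) to show the kind of check a ReadyAPI assertion performs:

```python
import json

# A hypothetical message consumed from the Payment-confirmation topic.
consumed = b'{"orderId": "ORD-1042", "status": "SUCCESS"}'

# Parse the raw bytes and validate that the event reports a
# successful payment, just as an assertion would in the test step.
event = json.loads(consumed)
assert event["status"] == "SUCCESS"
print("payment confirmed for", event["orderId"])
```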

Test Step Settings

Test step toolbar

As the image shows, the test step toolbar supports various actions. It lets you change the authorization settings and the connection settings for the test step.

Test step toolbar

Connection Settings

To change the connection settings, click Connection Settings in the test step toolbar.

Option: Confluent Settings

Description: Used to connect to the Confluent schema registry. This setting is only available when the JSON via Protobuf or JSON via Avro message format is selected.

  • Schema Registry URL: The schema registry URL.
  • Registry Authorization: The profile used to authenticate to the schema registry.

Option: Close Subscription When

Description: Conditions for closing the subscription. The subscription closes when at least one of these conditions is satisfied.

  • Messages Received: The minimum number of messages the test step receives before closing the connection.
  • Run Time: The time passed since connecting to the topic.
  • Idle Time: The time during which no event has been published to the topic.

Option: Reset to Default

Description: Resets the connection settings to their defaults. If there is no environment, the defaults are Run Time = 60, Messages Received = 50, and Idle Time = 60. If an environment is selected, the values come from the environment settings.
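The disconnect criteria above can be sketched as a consume loop with three stop conditions. This is an offline stand-in (an in-memory queue rather than a real Kafka consumer; the threshold values mirror the stated defaults):

```python
import time
from collections import deque

# In-memory stand-in for a topic; a real test would poll a Kafka consumer.
topic = deque([b'{"status": "SUCCESS"}'] * 3)

# Disconnect criteria mirroring Close Subscription When defaults.
MESSAGES_RECEIVED = 50
RUN_TIME = 60.0   # seconds since connecting
IDLE_TIME = 60.0  # seconds since the last received event

received = []
start = last_event = time.monotonic()
while True:
    now = time.monotonic()
    if len(received) >= MESSAGES_RECEIVED:
        break  # Messages Received condition met
    if now - start >= RUN_TIME:
        break  # Run Time condition met
    if now - last_event >= IDLE_TIME:
        break  # Idle Time condition met
    if topic:
        received.append(topic.popleft())
        last_event = time.monotonic()
    else:
        # Nothing left in this offline sketch; a real consumer would keep polling.
        break
print(f"received {len(received)} messages")
```

Here the loop drains the three queued events and exits; with a live broker, whichever of the three conditions triggers first would close the subscription.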

Property List

The API Connection test step properties panel allows you to change the test step behaviour using the step properties in the Navigator.

  • Name: The test step name.
  • Description: Information about the test step.
  • Idle Time: Repeats the Idle Time setting in the Connection Settings.
  • Messages Received: Repeats the Messages Received setting in the Connection Settings.
  • Run Time: Repeats the Run Time setting in the Connection Settings.

Received Data🧾

The messages that the test step has consumed from the topic will appear in the received data window.

Received data

Format of the message: The format of the received messages. The following formats are supported:

JSON: Regular JSONs are present in the messages. ReadyAPI will retrieve the messages without processing them. 

JSON via Avro(Schema Registry):  Avro schema stored in the Confluent schema registry helps serialize the messages. Using this schema, ReadyAPI will deserialize messages. Through connection settings, schema registry setup can be done. 

JSON via Protobuf: Protobuf schema stored in the Confluent schema registry helps serialize the messages. Using this schema, ReadyAPI will deserialize messages. Through connection settings, schema registry setup can be done.

Custom: In custom, the messages will be serialized using either the Protobuf schema or Avro schema. You can connect to the schema registry from the connection settings.

Metadata: You can pass parameters with the message. Parameters are of two types: 

Header: In the header parameter, messages are sent with Kafka headers. 

Kafka: Parameters that are specific to Kafka. Key and Partition are the two Kafka parameters that are supported. 

Data: The message itself.

Authentication in Kafka

ReadyAPI supports authenticating to Kafka brokers and schema registries using the SASL/PLAIN method with SSL encryption, which is a very common authentication setup in Confluent. To use any other authentication method, you need to specify the authentication parameters manually.

SASL/PLAIN with SSL Encryption

A basic authorization profile is used for implementing this authentication.

It is done in the following steps:

1️⃣The first step is to create a basic authorization profile.

Sasl image

2️⃣The second is to give details of the credentials.

  • Username – Username or the API key.
  • Password – User password or the client secret.
Credentials

3️⃣ Choose the profile you want to use.

Choose profile

Other Authentication Methods

If you use another authentication method, you need to set Authorization Profile to No Authorization and specify the authentication parameters in the Connection Settings.

For example, for the SASL/PLAIN method without SSL encryption, you need to specify the following parameters:

sasl.mechanism: PLAIN

security.protocol: SASL_PLAINTEXT

sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="Username or API key" password="Password or Client secret";

Sasl image
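For comparison, the same SASL/PLAIN parameters map onto an ordinary Kafka client configuration. The sketch below uses librdkafka/confluent-kafka style property names; the broker address and credentials are placeholders, not values from this tutorial:

```python
# SASL/PLAIN without SSL, expressed as a client configuration dict.
# Broker address and credentials are placeholders.
conf = {
    "bootstrap.servers": "broker.example.com:9092",
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "Username or API key",
    "sasl.password": "Password or Client secret",
}

for key, value in sorted(conf.items()):
    print(f"{key}={value}")
```

For SASL/PLAIN with SSL encryption (the default ReadyAPI setup described above), `security.protocol` would be `SASL_SSL` instead.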

Frequently Asked Questions

What is Ready API?

Ready API is a REST and SOAP API automation tool. It is an integrated suite of applications for API testing.

What is an API, and why is it used?

API stands for application programming interface. It is used so that two applications can interact with each other.

What are the four types of API?

The four types of API commonly used by web-based applications are public, partner, composite, and private.

What kind of tool is Kafka synchronous or asynchronous?

Kafka is an asynchronous tool.

What communication pattern does Kafka follow?

Kafka follows the publish-subscribe communication pattern. Messages are published to the partitions of a topic, and within a consumer group each partition is consumed by a single consumer, which preserves message ordering per partition.
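The role of partitions can be sketched briefly: a producer maps each message key to a partition, so the same key always lands in the same partition. Kafka's default partitioner hashes the key with murmur2; the sketch below substitutes a simple stand-in hash to illustrate the idea:

```python
NUM_PARTITIONS = 3

def partition_for(key: str, num_partitions: int) -> int:
    # Stand-in hash; Kafka's default partitioner uses murmur2 on the key bytes.
    h = 0
    for byte in key.encode("utf-8"):
        h = (h * 31 + byte) % (2**31)
    return h % num_partitions

# The same key always maps to the same partition, which preserves
# per-key ordering for the single consumer reading that partition.
p1 = partition_for("ORD-1042", NUM_PARTITIONS)
p2 = partition_for("ORD-1042", NUM_PARTITIONS)
assert p1 == p2
print("partition:", p1)
```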

Conclusion

In this blog, we learned about Kafka requests in Ready API. We looked at how to create or import an API into a project, and we discussed simulating producers and simulating consumers in Kafka.

To learn more about Ready API, please refer to our blog:

gRPC Request in Ready API

Refer to our guided paths on Coding Ninjas Studio to learn more about DSA, Competitive Programming, JavaScript, System Design, etc. Enrol in our courses, refer to the mock test and problems; look at the interview experiences and interview bundle for placement preparations.

Happy Coding!
