Table of contents
1. Introduction
2. Importing data
3. Exporting data
4. Importing data files
5. Frequently Asked Questions
5.1. What are the different export statuses while exporting data dumps from Postman?
5.2. How to eliminate errors while reading data files?
5.3. What is the Collection Runner in Postman?
6. Conclusion
Last Updated: Mar 27, 2024

Importing Datafiles in Postman

Author: Yashesvinee V

Introduction

Data in Postman can take the form of collections, environments, global variables, or complete data dumps. Importing and exporting data is a useful part of the API development workflow: users can import API specifications, API schemas and data files. Let us see how to perform imports and exports in Postman.

Importing data 

We can import files and folders from the local system, raw text, or a code repository hosted on GitHub, Bitbucket, GitLab or Azure DevOps. Postman automatically recognises the files it can import from the source provided. After selecting the files to be imported, click on Import.

Import in Postman

Collection v1 format is deprecated and no longer supported. Importing a collection in this format will return an error, so the collection must first be converted from v1 to v2. To change the collection format from v1 to v2:

Step 1: Install the Postman Collection Transformer by running the command in any terminal.

sudo npm install -g postman-collection-transformer

Step 2: Run the following command to convert the format. It writes the converted collection to the target file path in v2 format.

postman-collection-transformer convert -i <path to the input Postman Collection file> -o <path for the converted output file> -j 1.0.0 -p 2.0.0 -P
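For instance, assuming a v1 collection saved as old-collection.json (a hypothetical file name), the conversion might look like this, where -j is the input version, -p is the output version and -P pretty-prints the output:

postman-collection-transformer convert -i old-collection.json -o new-collection.json -j 1.0.0 -p 2.0.0 -P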

Bulk data is imported from code repositories using the Import option discussed before. To import a GitHub repository, the user must confirm and authorise postmanlabs to access the repositories. In the Import window on Postman, select the organisation, repository and branch that contain the files. Bitbucket and GitLab code repositories are imported the same way.

To import an Azure DevOps repository, a user must enable third-party application access for Postman to be able to connect to the repo. Third-party application access can be enabled under Policies in Organisation settings. Users receive notifications once the import is complete and can view the files on Postman.

Exporting data

Collections on Postman are exported as JSON files. They can be imported back into Postman instances in the future. Users can export a collection by clicking on the three dots next to the collection name and selecting Export. This will generate a JSON file.

Export collection
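For reference, below is a trimmed sketch of what an exported collection JSON might look like under the Collection Format v2.1 schema; the collection name, request and ID shown are hypothetical.

{
  "info": {
    "_postman_id": "a1b2c3d4-0000-0000-0000-000000000000",
    "name": "Sample Collection",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "Get user",
      "request": {
        "method": "GET",
        "url": "https://api.example.com/users/1"
      }
    }
  ]
}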

Data can be exported in v1 or v2 formats, but v2 is recommended as v1 is deprecated. To export all data, including environments, globals and collections, click the settings icon at the top of the window, next to the Invite button, and select Settings. Go to the Data tab and click Export Data to request a data export. A notification is sent when the export is complete and ready for download.

Export as data dumps

Importing data files

Data files are used to pass values during a collection run. CSV and JSON files are used in the Collection Runner to test requests with different sets of values in the same run.
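As an illustration, a JSON data file is an array of objects, one object per iteration; the field names used below (value and username) are hypothetical and should match the variables your requests reference.

[
  { "value": "first-run-value", "username": "alice" },
  { "value": "second-run-value", "username": "bob" }
]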

Step 1: To select data files for a collection run, select the Runner option present at the bottom of the window.

Collection Runner

Step 2: Drag and drop the collection into the Run Order area.

Step 3: Choose the data file using the Select file option.

Step 4: Preview and inspect the data and click on Run using data files to begin the collection run.

Data file values can be accessed across requests and in scripts. Values from the data file can be used in a pre-request script or test script using iterationData, which provides access to the data file record currently in use.

// 'value' field from the data file
pm.iterationData.get("value")
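As a short sketch of how this might be combined with a test, assuming a hypothetical expectedStatus column in the data file:

// Read the expected status code for the current iteration from the data file
const expectedStatus = pm.iterationData.get("expectedStatus");

// Compare the actual response status code with the value from the data file
pm.test("Status code matches the data file", function () {
    pm.expect(pm.response.code).to.eql(Number(expectedStatus));
});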

Frequently Asked Questions

What are the different export statuses while exporting data dumps from Postman?

When the export request is placed, it is in the Scheduled state. It moves to the Transferring and Transferred states as the data transfer starts and completes, and then to the Zipping and Zipped states while the export file is compressed. Finally, the Download state indicates that the file is ready for download.

How to eliminate errors while reading data files?

Users must ensure that their data file is correctly formatted: a CSV file should have a header row whose column names match the variables used in the requests, and a JSON file should contain an array of objects. The data file should also use an appropriate encoding, such as UTF-8.
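For instance, a minimal, correctly formatted CSV data file might look like this, with hypothetical column names:

value,username
first-run-value,alice
second-run-value,bob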

What is the Collection Runner in Postman?

The Collection Runner runs the API requests of a collection in sequential order. All test results of a collection run are logged along with the data used. It can be configured to run collections in a specific environment according to the user's needs.

Conclusion

This blog discusses importing and exporting data in Postman. It also explains the use of data files in a collection run and how they can be imported. Check out our articles on Run in Postman, Scripts in Postman and Using Collection Runner in Postman. Explore our Library on Coding Ninjas Studio to gain knowledge on Data Structures and Algorithms, Machine Learning, Deep Learning, Cloud Computing and many more! Test your coding skills by solving our test series and participating in the contests hosted on Coding Ninjas Studio!

Looking for questions from tech giants like Amazon, Microsoft, Uber, etc.? Look at the problems, interview experiences, and interview bundle for placement preparations. Upvote our blogs if you find them insightful and engaging! Happy Coding!
