Introduction
Google Cloud Console provides a web-based graphical user interface that we can use to manage our Google Cloud projects and resources. In this article, we give an overview of the Google Cloud Console and how to use it with Cloud Storage.
Google Cloud Console
We can use the Google Cloud Console to perform simple Cloud Storage management tasks. The general uses of the console are:
- Enable the Cloud Storage API for our project.
- Create and delete buckets.
- Upload, download, and delete objects.
- Manage Identity and Access Management (IAM) policies.
This article provides an overview of the console, including the tasks we can perform in it to manage our data. For more advanced tasks, use the gsutil command-line tool or any of the client libraries that support Cloud Storage.
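As a point of comparison with the console, here is a minimal sketch using the Python client library. It assumes google-cloud-storage is installed (pip install google-cloud-storage) and that application default credentials are configured; it is an illustration, not the only way to use the library.

```python
# Minimal sketch using the Python client library (google-cloud-storage).
# Assumes application default credentials, e.g. via
# `gcloud auth application-default login`.
from google.cloud import storage

# The client picks up the project from the environment/credentials.
client = storage.Client()

# List the buckets in the project, analogous to the console's bucket list.
for bucket in client.list_buckets():
    print(bucket.name)
```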
Accessing the console
The console requires no installation or setup, and we can access it directly in a browser. We access the console in slightly different ways depending on our use case. If we are:
A user granted access to a project
Use: https://console.cloud.google.com/.
The current project owner can grant access to the entire project. This applies equally to all buckets and objects defined in the project. For more information, see Adding Principals to Project.
A user granted access to a bucket
Use: https://console.cloud.google.com/storage/browser/BUCKET_NAME.
In this case, the project owner grants access to a single bucket within a larger project and then shares the bucket name, which we substitute into the URL above. We can only work with objects in the specified bucket. This is useful for users who do not have access to the entire project but need access to a particular bucket. When we access the URL, we are prompted to authenticate with our Google account if we are not already signed in.
A variation of this use case is when the project owner grants all users permission to read the objects in the bucket, creating a bucket whose contents are publicly readable. For more information, see "Setting permissions and metadata."
A user granted access to an object
Use: https://console.cloud.google.com/storage/browser/_details/BUCKET_NAME/OBJECT_NAME
In this use case, the project owner grants access to individual objects within a bucket and shares a URL for accessing each object. When we access the URL, we are asked to authenticate with our Google account if we are not already signed in.
Note that the format of the URL above differs from the URL for publicly shared objects. If we share the link publicly, the URL has the form https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME. This public URL does not require the recipient to authenticate with Google and can be used for unauthenticated access to the object.
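The Python client library exposes this public URL directly. Below is a small sketch; the bucket and object names are hypothetical placeholders, and the URL only works for recipients if the object has actually been shared publicly.

```python
# Sketch: the public URL of an object, assuming a hypothetical
# bucket "example-bucket" and object "example-object".
from google.cloud import storage

client = storage.Client()
blob = client.bucket("example-bucket").blob("example-object")

# For a publicly readable object this resolves to
# https://storage.googleapis.com/example-bucket/example-object
print(blob.public_url)
```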
Tasks we can perform with the Google Cloud console
The console allows us to perform basic storage management tasks on our data from our browser. To use the console, we must authenticate with Google and have permission to perform specific tasks. If we're the account owner who created the project, we likely already have all the permissions necessary to perform the following tasks. Otherwise, we may need to be granted access to the project or permission to perform actions on the bucket.
Creating a bucket
Cloud Storage uses a flat namespace to store data, but we can use the console to create folders that mimic a folder hierarchy. The data is not physically stored in a hierarchical structure, but it appears that way in the console.
Because Cloud Storage has no real concept of folders, when we view the data using gsutil or other command-line tools that work with Cloud Storage, folders appear as prefixes within object names, with / acting as the separator. For instructions on creating a bucket using the console, see Creating Storage Buckets.
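For comparison, a bucket can also be created with the Python client library. This is a minimal sketch; the bucket name below is a hypothetical placeholder and must be globally unique in practice.

```python
# Sketch: creating a bucket with the Python client library.
from google.cloud import storage

client = storage.Client()

# Bucket names are global, so this hypothetical name must be unused.
bucket = client.create_bucket("example-unique-bucket-name", location="US")
print(f"Created bucket {bucket.name}")
```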
Uploading data to a bucket
We can upload data to a bucket by uploading one or more files or a folder of files. When we upload a folder, the console maintains the same folder hierarchy, including all files and folders contained within it. We can track the progress of uploads in the console using the Upload Progress window, and we can minimize the progress window and continue working in our bucket. For instructions on uploading an object to a bucket using the console, see Uploading Objects.
We can also upload objects through the console by dragging and dropping files and folders from our desktop or file manager onto a bucket or its subfolders in the console.
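Programmatically, the same upload is a one-liner with the Python client library. In this sketch the bucket name, object name, and local file are hypothetical.

```python
# Sketch: uploading a local file as an object.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket")

# Including "pets/" in the object name reproduces the folder
# structure the console shows for drag-and-dropped folders.
blob = bucket.blob("pets/photo.jpeg")
blob.upload_from_filename("photo.jpeg")
```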
Downloading data from a bucket
For instructions on downloading objects from our bucket using the console, see Downloading Objects.
We can also click an object to see its details. If the object can be previewed, a preview of it appears on the details page.
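The download counterpart in the Python client library mirrors the upload sketch above; names here are again hypothetical placeholders.

```python
# Sketch: downloading an object to a local file.
from google.cloud import storage

client = storage.Client()
blob = client.bucket("example-bucket").blob("pets/photo.jpeg")
blob.download_to_filename("photo.jpeg")
```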
Creating and using folders
Folders created in the console help organize the objects in our bucket, since Cloud Storage itself has no concept of folders. The console displays folders with a folder icon as a visual aid to help us distinguish between folders and objects. Items added to a folder appear inside that folder in the console. In reality, all objects exist at the bucket level, and the directory structure is encoded in their names. For example, if we create a folder named "pets" and add a file named "cat.jpeg" to that folder, the console makes the file appear to be in the folder. There is no separate folder entity: the file resides in the bucket and is named pets/cat.jpeg. Unlike bucket names, folder names do not have to be globally unique. A bucket name can be used only if no existing bucket has that name, whereas a folder name can be reused as long as the duplicates are not in the same bucket or subfolder.
When navigating folders in the console, we can move back up the hierarchy by clicking the desired folder or bucket name in the breadcrumb path above the file list.
When working with buckets and data using other tools, folders may look different than in the console. See Folders for more information on how various tools such as gsutil simulate folders in Cloud Storage.
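To make the folder simulation concrete, here is a sketch of how listing with a delimiter groups the flat namespace into "folders", the way gsutil and the console do. The bucket and object names are hypothetical.

```python
# Sketch: simulating folders over Cloud Storage's flat namespace.
from google.cloud import storage

client = storage.Client()

# With delimiter="/", the listing stops matching names at the next "/",
# so nested objects are reported as prefixes instead.
blobs = client.list_blobs("example-bucket", prefix="pets/", delimiter="/")

# Objects directly "inside" the pets/ folder:
for blob in blobs:
    print(blob.name)          # e.g., pets/cat.jpeg

# Simulated subfolders; prefixes are populated after iterating the blobs.
print(list(blobs.prefixes))   # e.g., ["pets/dogs/"]
```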
Filtering buckets or objects to view
In the console's list of buckets for our project, we can use the Filter Bucket text box to filter which buckets appear.
- We can always filter by bucket name prefix.
- For projects with fewer than 1,000 buckets, we can also filter by additional criteria, such as bucket location.
- For projects with more than 1,000 buckets, filtering by additional criteria must be enabled using the drop-down list next to the filter text box. Keep in mind that for projects with thousands of buckets, filtering performance degrades when filtering by additional criteria is enabled.
In the console's list of objects for a bucket, we can filter the displayed objects by typing a prefix in the Filter by Object or Folder Name Prefix ... text box above the object list. This filter displays objects whose names start with the specified prefix. The prefix matches only objects at the current level of the bucket view; it does not match objects contained within folders.
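The same prefix filters are available programmatically. This sketch uses hypothetical bucket names and prefixes.

```python
# Sketch: prefix filtering for buckets and objects.
from google.cloud import storage

client = storage.Client()

# Buckets whose names start with "logs-":
for bucket in client.list_buckets(prefix="logs-"):
    print(bucket.name)

# Objects at the current level starting with "2023-"; the delimiter
# keeps the listing from descending into simulated folders.
for blob in client.list_blobs("example-bucket", prefix="2023-", delimiter="/"):
    print(blob.name)
```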
Setting object metadata
We can configure object metadata in the console. Object metadata controls how requests are processed, such as the type of content the data represents and how the data is encoded. In the console, we can set metadata on only one object at a time; to set metadata on many objects at once, use gsutil setmeta.
For instructions, see Viewing and Editing Object Metadata.
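As a sketch of what the console's metadata editor does for one object, the Python client library can patch both fixed-key and custom metadata. The names below are hypothetical.

```python
# Sketch: setting metadata on a single object.
from google.cloud import storage

client = storage.Client()

# get_blob fetches current metadata (returns None if the object is missing).
blob = client.bucket("example-bucket").get_blob("pets/photo.jpeg")

blob.content_type = "image/jpeg"        # fixed-key metadata
blob.metadata = {"reviewed": "true"}    # custom metadata
blob.patch()                            # send the update to Cloud Storage
```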
Deleting objects, folders, and buckets
We can delete a bucket, folder, or object by selecting the checkbox next to it in the Google Cloud Console, clicking the Delete button, and confirming the action. Deleting a folder or bucket also deletes all objects inside it, including objects marked as public. For a step-by-step guide to deleting objects from a bucket using the console, see Deleting Objects.
For instructions on removing a bucket from our project using the console, see Deleting Buckets.
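For comparison, a sketch of the same deletions with the Python client library, using hypothetical names:

```python
# Sketch: deleting an object and then its bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket")

# Delete a single object.
bucket.blob("pets/photo.jpeg").delete()

# Delete the bucket; force=True also deletes any remaining objects
# (the client refuses if there are more than 256 of them).
bucket.delete(force=True)
```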
Setting bucket permissions
We can use IAM permissions to control access to our Cloud Storage buckets. For example, we can set bucket permissions that allow entities such as users and groups to view or create objects within the bucket. This is useful when adding users at the project level is not appropriate. The entities specified in the IAM policy must authenticate by signing in to Google when accessing the bucket. Share the URL https://console.cloud.google.com/storage/browser/BUCKET_NAME/ with the users who need access.
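The console is the documented path for this, but as an illustration, an equivalent IAM binding can be applied with the Python client library. The bucket name and email address below are hypothetical.

```python
# Sketch: granting a user read access to a bucket via IAM.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket")

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {"user:jane@example.com"},
})
bucket.set_iam_policy(policy)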
Setting object permissions
We can use IAM permissions in the console to easily and consistently control access to objects in our bucket. If we want to customize access to individual objects in our bucket, we can use Signed URLs or Access Control Lists (ACLs) instead.
For detailed instructions on viewing and editing IAM permissions, see Using IAM Permissions.
To view or change the permissions of individual objects, see Changing ACLs.
Note: We cannot set permissions on folders.
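As an illustration of the signed-URL alternative mentioned above, here is a minimal sketch with the Python client library. The bucket and object names are hypothetical, and V4 signing requires service-account credentials.

```python
# Sketch: a V4 signed URL granting temporary, unauthenticated read
# access to one object.
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket("example-bucket").blob("pets/photo.jpeg")

url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),  # link stops working after this
    method="GET",
)
print(url)
```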
Giving users project-level roles
When a principal is granted a project-level role, the project appears in the principal's list of projects in the console. If we're an existing project owner, we can grant principals access to our project. See Using IAM with projects for detailed instructions on adding and removing project-level access.
Note: In general, grant the fewest permissions possible while giving team members the access they need. For example, if team members should only read objects stored in a project, select the Storage Object Viewer role. Similarly, select the Storage Object Admin role if team members need full control over the objects (but not the buckets) in the project.
Working with Object Versioning
We can enable Object Versioning on a bucket to keep an older version of an object when it is accidentally deleted or replaced. However, enabling Object Versioning increases storage costs. We can reduce those costs by also adding Object Lifecycle Management conditions when we enable versioning; these conditions automatically delete or downgrade the storage class of older object versions based on settings we specify. A configuration example for deleting objects shows a set of possible conditions for this use case. Older versions of an object are listed and managed on the Version history tab.
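A sketch of the same setup with the Python client library follows; the bucket name and the choice of lifecycle condition (delete once three newer versions exist) are hypothetical examples.

```python
# Sketch: enabling Object Versioning plus a lifecycle rule that
# deletes noncurrent versions once 3 newer versions exist.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("example-bucket")

bucket.versioning_enabled = True
bucket.add_lifecycle_delete_rule(number_of_newer_versions=3)
bucket.patch()  # apply both changes to the bucket
```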
Scanning buckets with Cloud Data Loss Prevention
Cloud Data Loss Prevention (Cloud DLP) is a service that helps us identify and protect sensitive data in our buckets. Cloud DLP helps us meet compliance requirements by finding and redacting information such as:
- Credit card numbers
- IP addresses
- Other forms of personally identifiable information (PII)
We can initiate a Cloud DLP scan of a bucket by clicking the three-dot menu for the bucket and selecting Scan with Cloud Data Loss Prevention. For information on how to run a Cloud DLP scan on our bucket, see Inspecting a Cloud Storage location.
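For completeness, the same kind of inspection job can be started with the google-cloud-dlp client library. This is a hedged sketch: the project ID and bucket name are hypothetical, and the info types shown are just the examples listed above.

```python
# Sketch: starting a Cloud DLP inspection job over a bucket.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {
            # "**" scans objects in the bucket and its simulated folders.
            "file_set": {"url": "gs://example-bucket/**"}
        }
    },
    "inspect_config": {
        "info_types": [
            {"name": "CREDIT_CARD_NUMBER"},
            {"name": "IP_ADDRESS"},
        ]
    },
}

response = dlp.create_dlp_job(
    request={"parent": "projects/example-project", "inspect_job": inspect_job}
)
print(response.name)  # resource name of the running DLP job
```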