Integrate with Google Cloud Platform

Firebase Storage is tightly integrated with Google Cloud Platform. Under the hood, Firebase Storage creates a default bucket in the Google App Engine free tier. This allows you to quickly get up and running with Firebase Storage, without having to put in a credit card or enable a billing account. It also allows you to easily share data between Firebase and a Google Cloud Platform project.

Since Firebase Storage is backed by a Google Cloud Storage bucket, as your app grows, you can easily integrate other Cloud services.

Integrating with Google Cloud Platform requires a Firebase project on the Blaze plan. Learn more about the plans on our pricing page.

Google Cloud Storage

You can use the Google Cloud Storage (GCS) APIs to access files stored in Firebase Storage, especially to perform more complex operations, such as copying or moving a file, or listing all the files available at a reference.

It's important to note that these requests use Google Cloud Storage ACLs, rather than Firebase Authentication and Storage Security Rules.

APIs

In addition to the Firebase Storage APIs, there are a number of other ways to access data stored in your GCS bucket, depending on what you want to do. If you're accessing data on a server, we offer server-side libraries, as well as JSON and XML RESTful APIs. If you need to script changes or perform other administrative tasks, we've got a command line tool that will come in handy.

GCloud Server SDK

Google Cloud Platform offers high-quality server SDKs for a number of GCP products, including Google Cloud Storage. These libraries are available in Node.js, Java, Go, Python, PHP, and Ruby. For more information, including installation instructions, authentication, and troubleshooting, consult the platform-specific documentation.

Example usage for Google Cloud Storage is shown below:

Node.js

    // Require gcloud
    var gcloud = require('google-cloud');

    // Enable Storage
    var gcs = gcloud.storage({
      projectId: 'grape-spaceship-123',
      keyFilename: '/path/to/keyfile.json'
    });

    // Reference an existing bucket.
    var bucket = gcs.bucket('my-existing-bucket');

    // Upload a local file to a new file to be created in your bucket.
    bucket.upload('/photos/zoo/zebra.jpg', function(err, file) {
      if (!err) {
        // "zebra.jpg" is now in your bucket.
      }
    });

    // Download a file from your bucket.
    bucket.file('giraffe.jpg').download({
      destination: '/photos/zoo/giraffe.jpg'
    }, function(err) {});
    

Java

    // Enable Storage
    Storage storage = StorageOptions.builder()
      .authCredentials(AuthCredentials.createForJson(
          new FileInputStream("/path/to/my/key.json")))
      .build()
      .service();

    // Upload a local file to a new file to be created in your bucket.
    InputStream uploadContent = ...
    BlobId blobId = BlobId.of("my-existing-bucket", "zebra.jpg");
    BlobInfo blobInfo = BlobInfo.builder(blobId).contentType("image/jpeg").build();
    Blob zebraBlob = storage.create(blobInfo, uploadContent);

    // Download a file from your bucket.
    Blob giraffeBlob = storage.get("my-existing-bucket", "giraffe.jpg", null);
    InputStream downloadContent = giraffeBlob.getInputStream();
    

Go

    // Enable Storage
    client, err := storage.NewClient(ctx, option.WithServiceAccountFile("path/to/keyfile.json"))
    if err != nil {
        log.Fatal(err)
    }

    // Download a file from your bucket.
    rc, err := client.Bucket("my-existing-bucket").Object("giraffe.jpg").NewReader(ctx)
    if err != nil {
        log.Fatal(err)
    }
    defer rc.Close()
    body, err := ioutil.ReadAll(rc)
    if err != nil {
        log.Fatal(err)
    }
    

Python

    # Import gcloud
    from google.cloud import storage

    # Enable Storage
    client = storage.Client()

    # Reference an existing bucket.
    bucket = client.get_bucket('my-existing-bucket')

    # Upload a local file to a new file to be created in your bucket.
    zebraBlob = bucket.blob('zebra.jpg')
    zebraBlob.upload_from_filename(filename='/photos/zoo/zebra.jpg')

    # Download a file from your bucket.
    giraffeBlob = bucket.get_blob('giraffe.jpg')
    giraffeBlob.download_as_string()
    

PHP

    // Require gcloud
    require 'vendor/autoload.php';
    use Google\Cloud\Storage\StorageClient;

    // Enable Storage
    $storage = new StorageClient([
        'projectId' => 'grape-spaceship-123'
    ]);

    // Reference an existing bucket.
    $bucket = $storage->bucket('my-existing-bucket');

    // Upload a file to the bucket.
    $bucket->upload(
        fopen('/photos/zoo/zebra.jpg', 'r')
    );

    // Download a file from your bucket.
    $object = $bucket->object('giraffe.jpg');
    $object->downloadToFile('/photos/zoo/giraffe.jpg');
    

Ruby

    # Require gcloud
    require "google/cloud"

    # Enable Storage
    gcloud = Google::Cloud.new "grape-spaceship-123", "/path/to/keyfile.json"
    storage = gcloud.storage

    # Reference an existing bucket.
    bucket = storage.bucket "my-existing-bucket"

    # Upload a file to the bucket.
    bucket.create_file "/photos/zoo/zebra.jpg", "zebra.jpg"

    # Download a file from your bucket.
    file = bucket.file "giraffe.jpg"
    file.download "/photos/zoo/#{file.name}"
    

REST API

If you're using a language without a client library, want to do something that the client libraries don't do, or just have a favorite HTTP client that you'd prefer to use, GCS offers both JSON and XML APIs that you can use.
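As a sketch of what that looks like, the snippet below builds a JSON API media-download URL using only the Python standard library (the endpoint shape follows the GCS JSON API; bucket and object names are placeholders):

```python
# Sketch: building a JSON API media-download URL with the standard library.
# Bucket and object names are placeholders.
import urllib.parse

def media_url(bucket, object_name):
    # The object name is encoded as a single path segment, so any '/' in the
    # name becomes '%2F'. alt=media requests the object's contents rather
    # than its metadata.
    return ('https://storage.googleapis.com/storage/v1/b/%s/o/%s?alt=media'
            % (urllib.parse.quote(bucket, safe=''),
               urllib.parse.quote(object_name, safe='')))

print(media_url('my-existing-bucket', 'photos/zoo/giraffe.jpg'))
```

A real request would also carry an OAuth 2.0 bearer token in the `Authorization` header, since these endpoints are authorized by GCS ACLs rather than Storage Security Rules.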

gsutil

gsutil is a command line tool that gives you direct access to Google Cloud Storage. You can use gsutil to do a wide range of bucket and object management tasks, including:

  • Uploading, downloading, and deleting objects.
  • Listing buckets and objects.
  • Moving, copying, and renaming objects.
  • Editing object and bucket ACLs.

gsutil also allows for other advanced operations, such as moving files from one directory to another or deleting all the files below a certain location.

Moving all the files from one reference to another is as easy as:

gsutil mv gs://bucket/old/reference gs://bucket/new/reference

Batch deleting all the files below a reference is similarly intuitive:

# Delete all files under a path
gsutil rm -r gs://bucket/reference/to/delete

# Delete all the files in a bucket, but not the bucket itself
gsutil rm -r gs://bucket/**

# Delete all the files AND the bucket
# This will break Firebase Storage and is strongly discouraged
gsutil rm -r gs://bucket

Object Versioning

Have you ever deleted something by accident and not had a backup? GCS Object Versioning provides an automatic way to back up your data and restore from those backups. You can enable Object Versioning using the gsutil versioning set command:

gsutil versioning set on gs://<your-firebase-storage-bucket>

Firebase Storage always picks up the most recent version, so if you want to restore an object, you need to use one of the other APIs or tools above to set the desired object as the most recent.

Object Lifecycle Management

Having the ability to automatically archive or delete stale files is a useful feature for many applications. Luckily, GCS provides Object Lifecycle Management, which allows you to delete or archive objects after a certain amount of time.

Consider a photo sharing application in which you want all photos to be deleted within one day. You can set up an object lifecycle policy as follows:

// lifecycle.json
{
  "lifecycle": {
    "rule":
    [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 1}
      }
    ]
  }
}

And deploy it using the gsutil lifecycle set command:

gsutil lifecycle set lifecycle.json gs://<your-firebase-storage-bucket>

Note that this policy applies to all files in the bucket. If you're storing long-lived data, such as user backups, alongside photos that you want to delete daily, you might want to use two separate buckets, or perform deletions manually with gsutil or your own server.
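If the retention window comes from your app's configuration rather than being hard-coded, the policy file can be generated programmatically before running `gsutil lifecycle set`. A standard-library-only sketch:

```python
# Sketch: generating lifecycle.json from a configurable retention window,
# ready to deploy with `gsutil lifecycle set lifecycle.json gs://<bucket>`.
import json

def delete_after(age_days):
    return {'lifecycle': {'rule': [
        {'action': {'type': 'Delete'},
         'condition': {'age': age_days}}
    ]}}

with open('lifecycle.json', 'w') as f:
    json.dump(delete_after(1), f, indent=2)
```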

Google App Engine

Google App Engine is a Platform as a Service that automatically scales backend logic in response to the amount of traffic it receives. Just upload your backend code and Google will manage your app's availability; there are no servers for you to provision or maintain. GAE is a fast and easy way to add additional processing power or trusted execution to your Firebase application.

Firebase Storage uses the Google App Engine default bucket, which means that if you build an App Engine app, you can use the built in App Engine APIs to share data between Firebase and App Engine. This is useful for performing audio encoding, video transcoding, and image transformations, as well as other computation intensive background processing.

The Java, Python, and Go standard environments for Google App Engine include the GAE Images API (Java, Python, Go), which can resize, rotate, flip, and crop an image, as well as return an image serving URL that allows for client-side transformations, similar to Cloudinary and Imgix.
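Those client-side transformations work by appending directives to the URL returned by `images.get_serving_url()`. A small helper illustrating the common size/crop case (the `=sNNN` and `-c` directive syntax here follows the Images API docs; verify it against the current documentation before relying on it):

```python
# Helper for the Images API serving-URL transformation directives:
# '=s400' serves the image resized to 400px on its longest side, and
# adding '-c' crops it to a 400x400 square instead.
def transformed(serving_url, pixels, crop=False):
    return '%s=s%d%s' % (serving_url, pixels, '-c' if crop else '')

print(transformed('https://lh3.googleusercontent.com/abc123', 400, crop=True))
```

Because the transformation is encoded in the URL, clients can request different sizes of the same image without any server-side work per request.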

When importing an existing Google Cloud Platform project into Firebase, if you want to make any existing App Engine objects available in Firebase, you'll need to set the default access control on your objects to allow Firebase Storage to access them by running the following command using gsutil:

gsutil -m acl ch -r -u firebase-storage@system.gserviceaccount.com:O gs://<your-firebase-storage-bucket>

Known Issues

There are two known cases where you can't import your GAE app:

  1. The project contains a former App Engine Datastore Master/Slave app.
  2. The project has a domain prefixed project ID, for example: domain.com:project-1234.

In either of these cases, the project won't support Firebase Storage, and you should create a new Firebase project in order to use Firebase Storage. Please contact support so we can help you out.

Google Cloud Functions (Alpha)

Google Cloud Functions is a lightweight, event-based, asynchronous compute solution that allows you to create small, single-purpose functions that respond to events without the need to manage a server or a runtime environment. These functions can be used for transcoding video, classifying images using machine learning, or syncing metadata with the Firebase Realtime Database. With even less overhead than Google App Engine, Cloud Functions is the fastest way to react to changes in Firebase Storage.

Google Cloud Vision API

The Google Cloud Vision API enables developers to understand the content of an image by encapsulating powerful machine learning models in an easy to use API. It quickly classifies images into thousands of categories, detects individual objects and faces within images, finds and reads printed words contained within images, identifies offensive content, and even provides image sentiment analysis.
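Usefully for Firebase Storage, the Vision API can read images directly from a GCS bucket, so no download step is needed. A sketch of the JSON body for an `images:annotate` label-detection request (bucket and object names are placeholders; check the Vision API reference for the full schema):

```python
# Sketch: JSON request body for a Vision API images:annotate call that
# labels an image straight from a GCS bucket. Names are placeholders.
import json

def label_request(bucket, object_name, max_results=5):
    return {'requests': [{
        'image': {'source': {'gcsImageUri': 'gs://%s/%s' % (bucket, object_name)}},
        'features': [{'type': 'LABEL_DETECTION', 'maxResults': max_results}],
    }]}

print(json.dumps(label_request('my-existing-bucket', 'giraffe.jpg'), indent=2))
```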

Google Cloud Speech API

Similar to the Vision API, the Google Cloud Speech API enables developers to extract text from an audio file stored in GCS. The API recognizes over 80 languages and variants, to support your global user base. When combined with the Google Cloud Natural Language API, developers can both extract the raw text and infer meaning about that text. And if a global audience is required, couple this with the Google Translate API to translate the text into 90+ languages.
