
Setup scheduled export on Google App Engine

If you are using Google App Engine, you might have received this message from Google.

Hello Cloud Datastore Customer,

We're writing to let you know that the Datastore Admin backup feature is being phased out as of February 28, 2018, in favour of the generally available managed export and import for Cloud Datastore. Please migrate to the managed export and import functionality at your earliest convenience. To help you make the transition, Datastore Admin will continue to be available over the next 12 months prior to the shutdown date of February 28, 2019. 
I am writing this post to show how to set up a scheduled export.
[ I am following the steps described by Google at this link: Scheduled-export ]

Enable billing for your Google Cloud project

Ensure that you are using a billable account for your GCP project. Only GCP projects with billable accounts can use the export and import functionality.
You can set your billing details from the Billing section of the cloud console.

Make sure about billing details

Create a bucket for backup

Create a bucket inside Cloud Storage for your project
[ where you want to export your Datastore backup ]

Go to storage in the cloud console
You can find Storage in the sidebar of the console.

Create a new bucket where you want to export the Datastore backup. Make sure the bucket type is Regional or Multi-Regional, and that the location is the same one where your project is hosted; for me it's us-central1.

Scheduled export does not support the Coldline or Nearline storage classes.

Create a backup bucket

Assign the Cloud Datastore Import Export Admin role

Assign the role Cloud Datastore Import Export Admin to the App Engine default service account.
The role can be assigned from the IAM & Admin section of the cloud console.

Assign role of Cloud Datastore Import Export Admin in IAM & Admin in the Google Cloud console
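If you prefer the command line, the same role can be granted with gcloud. This is a sketch: the project ID demo-project is a placeholder, and the command is echoed rather than executed so you can inspect it first; drop the echo to actually apply the binding.

```shell
# Placeholder project ID; replace with your own.
PROJECT_ID="demo-project"
# The App Engine default service account always has this fixed form.
SERVICE_ACCOUNT="${PROJECT_ID}@appspot.gserviceaccount.com"
# Print the gcloud command that grants the import/export role.
echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:${SERVICE_ACCOUNT}" \
  --role roles/datastore.importExportAdmin
```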

Give write permission to the App Engine default service account

Give write permission to the App Engine default service account on your backup bucket. You can set permissions for a bucket from the Storage section in the cloud console.
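The bucket permission can also be granted with gsutil. A sketch, assuming the bucket name backup-demo-purpose from above and roles/storage.objectAdmin as a role that includes write access; the command is echoed rather than run, so remove the echo to apply it.

```shell
# Placeholder project ID; replace with your own.
PROJECT_ID="demo-project"
# The backup bucket created earlier.
BUCKET="backup-demo-purpose"
# The App Engine default service account.
SERVICE_ACCOUNT="${PROJECT_ID}@appspot.gserviceaccount.com"
# Print the gsutil command that grants write access on the bucket.
echo gsutil iam ch "serviceAccount:${SERVICE_ACCOUNT}:roles/storage.objectAdmin" "gs://${BUCKET}"
```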


Copy Application [ Service ] Files From the Google Cloud Doc

Copy these files from the Google Cloud doc as they are.

  • app.yaml
  • cron.yaml
Update only the url and schedule in cron.yaml according to your needs.
url: /cloud-datastore-export?namespace_id=&output_url_prefix=gs://BUCKET_NAME
In our case 
url: /cloud-datastore-export?namespace_id=&output_url_prefix=gs://backup-demo-purpose
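For reference, a minimal cron.yaml along these lines; the schedule of every 24 hours and the description are assumptions, so adjust them to your needs.

```yaml
cron:
- description: "Daily Cloud Datastore export"
  url: /cloud-datastore-export?namespace_id=&output_url_prefix=gs://backup-demo-purpose
  schedule: every 24 hours
```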

Deploy the service

Deploy the service through gcloud.
[ It was failing when I was using the App Engine console to deploy ]
cmd : gcloud app deploy app.yaml cron.yaml
In case you have multiple projects, you can use --project to deploy the service to a specific project.
cmd : gcloud app deploy app.yaml cron.yaml --project=demo-project

Verify Deployment

Make sure that the service is working. After deployment of the app, you will be able to see the job inside your project's App Engine > Task queues > Cron jobs.

Task Queues

Run the cron (using the Run button, for testing at this time) to verify it, and make sure the status is Success.

Cron-job inside task queue section

