
Copy Datastore from one Google project to another Google project


It often happens that we need to copy Datastore values from one Google Cloud Platform project to another.

Requirements :

  • Gcloud SDK on the local machine

This process goes through 4 steps.
  • Export Datastore from the origin project
  • Give the default service user of the target project the Legacy Storage Reader role over the bucket
  • Transfer bucket data to the target project's bucket
  • Import the Datastore export file

Step 1  Export Datastore 

From February 2019, you can no longer use the Datastore Admin backup feature, as Google is dropping that support. You have to use the managed Cloud Datastore export/import instead.

The managed export writes your data to the Cloud Storage bucket you specify. To see how to set up automated Datastore exports, refer to this post.
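As a minimal sketch, assuming an origin project named project-a and an export bucket gs://project-a-datastore-backup (both names are placeholders, substitute your own), the export command looks like this:

```shell
# Export all Datastore entities of the origin project to its bucket.
# "project-a" and the bucket name are placeholders.
gcloud datastore export gs://project-a-datastore-backup \
    --project=project-a
```

The command prints the path of the export folder it creates inside the bucket; note it down, as you will need it in step 4.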


Step 2 Give the default service user of the target project the Legacy Storage Reader role over the bucket


Here we are granting the target project read access to the origin project's bucket.
I.e., if you want to copy from project A to project B, give the Legacy Storage Reader role to the default service user of project B on the source bucket of project A.
Set the bucket permission to make it accessible by the target project's default service user.
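The same grant can be sketched from the command line with gsutil. The service account email and bucket name below are placeholders; the default App Engine service account of a project usually has the form PROJECT_ID@appspot.gserviceaccount.com:

```shell
# Grant the target project's default service account read access
# to the origin project's export bucket.
gsutil iam ch \
    serviceAccount:project-b@appspot.gserviceaccount.com:legacyBucketReader \
    gs://project-a-datastore-backup
```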

Step 3 Transfer bucket data to the target project's bucket

Actually, this step works more like a copy than a transfer: we have to specify where to copy the data from and where to copy it to.

So let's just start transferring data

Cloud Storage Transfer

You will see this page if you are transferring storage data for the first time.

Select source bucket
The source bucket has to be in the origin project. You can find the bucket name in the bucket information; please see the image below. (Note that bucket URLs start with gs://.)
Find origin bucket URL

Select Destination Bucket in the target project.



Select destination bucket in the target project

Create and run transfer job


Create transfer job
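If you prefer the command line over the console's Transfer job, the same copy can be sketched with gsutil (both bucket names below are placeholders):

```shell
# Copy the whole export from the origin bucket to the target
# project's bucket. -m parallelizes the copy, -r recurses into
# the export folder.
gsutil -m cp -r gs://project-a-datastore-backup/* \
    gs://project-b-datastore-backup/
```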

So, up till now, we have moved the Datastore backup from our origin project to the target project.
Now the last step remains: importing the data from the target project's Cloud Storage into the target project's Datastore.


Step 4 Import to Datastore


This step requires the gcloud SDK on your local machine. I recommend using the latest SDK available; I had to update my own version.
Run this command in a terminal:


Run the gcloud import command
gcloud datastore import gs://[path-to-the-over_all_export_file].overall_export_metadata
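If you are unsure of the exact path of the .overall_export_metadata file inside the target bucket, a gsutil listing helps. The bucket name and project ID below are placeholders:

```shell
# Locate the overall export metadata file in the target bucket.
gsutil ls -r 'gs://project-b-datastore-backup/**.overall_export_metadata'

# Import it into the target project's Datastore.
gcloud datastore import \
    gs://project-b-datastore-backup/[EXPORT_FOLDER]/[EXPORT_FILE].overall_export_metadata \
    --project=project-b
```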


Hope you find it useful.

Stay connected!😁
