Backup copy of Google Drive

Hi,
I wonder if there is an option to override the daily upload limit for Google Drive?
I would like to make a copy of one Google Drive (an EDU one) to a Google enterprise account, but it would take ages to do it.


Add the GSuite user to the EDU members

Right click on the folder
Move to new drive

200 TB needs about 3h over Google's internal network
Without re-downloading and re-uploading :wink:

Ohh, nice. Is there any chance of a tutorial on how to do it?
I just have a drive on an EDU domain, bought from somebody for a few $$.

Hmm, you actually want to copy the data first and then move it to a team drive?

This might take a bit more work. From what I recall, Google Drive does not make it easy to make a copy. One can make a copy of a file, but it is more difficult to copy entire folders while preserving the folder structure.

I have a potential method to facilitate the copying, but I do not recall if the daily upload limit applies.

Part 1

Step 1: Create a Google Colaboratory Notebook in the EDU drive

Step 2: In the notebook, mount your Google Drive. I think the command goes like this:

from google.colab import drive
drive.mount('/gdrive')

Step 3:
Copy the folder in question via shell (replace src and dest with what you are actually trying to copy):

!cp -r '/gdrive/My Drive/src/.' '/gdrive/My Drive/dest'

Not sure how long the process takes, nor whether it is a way to bypass the limits. But afterwards, as doobsi says, the movement is a relatively simple affair.
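If it helps, a rough sanity check after the copy is to compare apparent sizes (same placeholder paths as above; Drive's own size reporting may differ slightly from du):

!du -sh '/gdrive/My Drive/src' '/gdrive/My Drive/dest'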

Part 2

Step 1: Create a team drive in the new Google enterprise account.

Step 2: In the new team drive, add the EDU user by clicking the dropdown and clicking Manage members.

(Screenshot: Manage members dropdown)

Step 3: Go to your EDU drive. You should see the team drive there. Basically, drag and drop the copied folder into the shared team drive.

Although I think the Part 1 method should work, it has been some time since I tried it. Part 2 should definitely work.

P.S. I may not be around to answer questions, but hopefully this outlines what one can do and how to get started.

I am using rclone now, but it will take over a year to copy all the data.
If I am right, I have a team drive under an EDU domain that I bought cheap from somebody (I am not an admin of that EDU domain), and I want to copy it to an Enterprise account (I also bought that from somebody, so I am not the owner of the account, I just got an account on it) as a second backup.
I'll try your version.
I saw that there is an option to copy as in Part 2, but I can't copy whole folders.

If that is the case, I think you need to replace ‘/gdrive’ with the name of the team drive? Having it as an actual normal Google Drive would make things easier, as I am unsure of how Google names certain things.

Also, try a smaller test case first before attempting the entirety of a year's worth of transfer.
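As a rough sketch of a small test (the folder names are placeholders, and the exact path of a shared/team drive under the mount can vary, e.g. 'Shared drives' vs 'Shareddrives'):

# See what actually got mounted so the real paths are known
!ls '/gdrive'

# Then try a single small folder before committing to the full copy
!cp -r '/gdrive/My Drive/small-test-folder' '/gdrive/My Drive/dest-test'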

OK, but it seems that those two drives need to be under the same email?
What if they are under two different ones?

I am confused by the question.

But I surmise the answer is to open the Google Colab notebook with the email that has access to the drive in question (the drive with the data on it). This assumes the email is also a Google account with access to Google services.

Part 2 is a separate step. For now we just need to create a copy of the data first. That is the main issue.

I was wondering because those two drives are on different accounts (different emails).
So I was wondering if Part 1 will work, since it means mounting a drive from one account/email and copying it to another account/email, and they are not under the same domain.
If my thinking is right, I would have to mount two different drives into one session to copy them like that.

Part 1 is in regard to the account that has the data: copying the data within the same account (since it is unlimited, it should not be a problem to copy it and have double the data). Plus Google has data deduplication capabilities on their end.

Part 2 is where the second account comes into play and where we move the copied data. Thanks for clarifying the question.

It may be possible to use Part 1 to combine both parts, but I am unsure how Google handles it and whether there are permission issues due to the different access levels of the team drives. By all means, you can try it. If it works, then we have another way to copy data directly.

One thing I don't understand is why I should copy the data on the account where the data is, when I already have this data on that account.
Also, I am unable to create new drives or even add new members to that drive.
I just have limited access.

I wonder if there is an option, using Google Colab, to copy data from “Shared with me” to “My Drive”.

The only answer I can give is that I, and other people like me, do not wish for Google to throttle or stop transfers due to limits.

The answers I give are based on my history of Google stopping previous transfers. The main issue is that Google sometimes does not document these limitations, so I find them out firsthand through thrown errors. But sometimes these limitations disappear.

The copying in the same account is to keep it within the limits of that account.

On the movement side, I want Google to handle the move, which in my experience does not have a limitation. Worse, sometimes new limitations appear.

You imply that there is a reasonable logic to certain things. There are simply the limitations as they are presented, the same way that Google does not allow one to copy a folder of items in the Google Drive GUI, as you pointed out, but one can copy a file.

What you suggest is the shorter, more reasonable thing to do, but I do not know if it is possible. If I have more time, I will likely search for it.

Found a YouTube tutorial.

It seems to work, but when I try to run the ls command on my shared drive, nothing happens.

OK, I know what might be happening: I can't add those drives to the account so that they are visible to each other in the drives section, as I don't have permission. What I can do is add them to the “Shared with me” section.
So I think only rclone is left to copy between those accounts…

Just occurred to me that you could use the rclone option --drive-server-side-across-configs
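In its simplest form, the server-side copy would look something like this (the remote names are placeholders for your two configured drive remotes):

rclone copy EDU_REMOTE: ENTERPRISE_REMOTE: --drive-server-side-across-configs -P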

I admit I completely forgot about this option. I also forget whether it can bypass the Google limits; here are some links that may be of assistance:

But to be able to do a server-side copy, do I need to have that enabled in the API permissions?
I got an error while using this; also, reading the comments, it still has a daily upload limit of 70 GB.

Hmmmm, the option should work irrespective of the API permissions. To the best of my knowledge, the API permissions have more to do with creating Google service accounts and OAuth client IDs.

Using this command:
rclone copy GDRIVE1: GDRIVE2: -v -P --checkers 2 --tpslimit 2 --transfers 1 --drive-chunk-size 32M --drive-server-side-across-configs --drive-stop-on-upload-limit --log-file Documents/plex1ato1c.log
I got a “user rate limit exceeded” error, but nothing has been uploaded for over 24h.

While using the same command without --drive-stop-on-upload-limit, it just checks files without doing a copy…

I hope you can forgive me for not replying sooner. My ability to respond and assist was greatly diminished due to winter storms that cut power and internet for more than a week. I hope you did not think we abandoned your efforts.

Regardless, I do not know exactly why your command has not worked. I will at least share what I did to transfer 13 TB as a test case. Hopefully it will work for you (assuming you did not find an alternative).

  1. Be sure to have service accounts, a client_id, and a client_secret.

In my case, in a Google Cloud project, I created a new client ID and secret, and created 20 service accounts whose key files are named in increasing order like so:

rclone001.json
rclone002.json
rclone003.json
.
.
.
rclone020.json

Be sure to share the created service accounts with the drives you wish to copy from and send to (the service-account email, the one that ends in .iam.gserviceaccount.com). Put the JSON key files created along with the service accounts in a folder. I used the rclone config folder in my home directory:

/home/euler/.config/rclone/keys/rclone001.json
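For reference, and assuming the gcloud CLI is already authenticated against that project, the accounts and key files could be generated with something along these lines (the project ID is a placeholder):

# Create 20 service accounts and download a JSON key for each
for i in $(seq -w 1 20); do
  gcloud iam service-accounts create "rclone0$i" --display-name "rclone0$i"
  gcloud iam service-accounts keys create "/home/euler/.config/rclone/keys/rclone0$i.json" \
    --iam-account "rclone0$i@YOUR_PROJECT_ID.iam.gserviceaccount.com"
done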

  2. In rclone config, I created drive remotes for each of the two drives, one per service account, with these settings:

[gdrive-transfer-01]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
service_account_file = /home/euler/.config/rclone/keys/rclone001.json
team_drive = TEAM_DRIVE_USED
root_folder_id =
scope = drive
server_side_across_configs = true

.
.
.
[gdrive-transfer-20]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
service_account_file = /home/euler/.config/rclone/keys/rclone020.json
team_drive = TEAM_DRIVE_USED
root_folder_id =
scope = drive
server_side_across_configs = true

[gdrive-uni-01]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
scope = drive
service_account_file = /home/euler/.config/rclone/keys/rclone001.json
team_drive = TEAM_DRIVE_USED
root_folder_id =
server_side_across_configs = true
.
.
.
[gdrive-uni-20]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
scope = drive
service_account_file = /home/euler/.config/rclone/keys/rclone020.json
team_drive = TEAM_DRIVE_USED
root_folder_id =
server_side_across_configs = true
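Before kicking off the full loop, a quick sanity check that each remote is reachable (remote names as configured above):

rclone lsd gdrive-uni-01:
rclone lsd gdrive-transfer-01: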


  3. I created a screen session in the terminal (so that when I log out, the command keeps running), but you could also create a bash file and execute it.

The code that I used is this:

for COUNTER in {01..20} ; do
echo Using service account $COUNTER
rclone copy --transfers=8 --fast-list --tpslimit 7 -c --checkers=20 --drive-service-account-file=/home/euler/.config/rclone/keys/rclone0$COUNTER.json --drive-server-side-across-configs --drive-stop-on-upload-limit --log-file /home/euler/.config/rclone/rclone.log --log-level INFO gdrive-uni-$COUNTER:torrent-collection gdrive-transfer-$COUNTER:
done

Using this, I was able to get speeds faster than 1 GB (not Gb) per second (I am sure it could be faster, but I also have a lot of small files, which slow down the process).

The counter moves on to the next rclone command (and the next service account) as each 750 GB limit is reached. I also put the remotes at the end of the command instead of at the beginning like yours.

I do hope it works. My ability to troubleshoot is diminishing as this is reaching beyond my expertise as well.


Did this work?

What do you mean by service account? Not sure I follow. But I think your method looks like what I need to transfer 200 TB between accounts.

He’s talking about the API accounts for upload (PG Blitz).