This repository has been archived by the owner on Apr 19, 2023. It is now read-only.

Rate Limit Exceeded #306

Open
ghost opened this issue Jun 7, 2017 · 27 comments

Comments

@ghost

ghost commented Jun 7, 2017

Is there a way to fix this error?
Failed to upload file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded

@patbiber

patbiber commented Jun 8, 2017

You need to request higher limits for the Google Drive API, via the Google Cloud Platform console.

@ghost

ghost commented Jun 8, 2017

Can I do it? Can you tell me how?

@euklid

euklid commented Jun 9, 2017

See #132
I just opened a pull request because I encountered the same problem, and I learnt Go just for this purpose ;) @EverMineServer If you only need the download or download query commands, you should be able to use the code in my branch, but no promises until it's actually merged

@jucor

jucor commented Aug 21, 2017

I'm having the same problem when running "gdrive upload --recursive" on a directory containing 2000 small files totalling a measly 8 MB.

@euklid

euklid commented Aug 22, 2017

@jucor you could take a look at my PR #309 and try to apply the same changes to the upload. I'm not using this program any more (I only needed it once, for downloading...) and currently don't have time to apply similar changes to the other parts of the program. At least the download worked for me at the time, when I was downloading over 2000 files.

@singhravi1

singhravi1 commented Jan 29, 2019

Hi,
Any updated solution for this?

I'm getting the same error while uploading just a single .gz file

@NickMoutsios

I am having the exact same issue when trying to upload a single file. My upload rate over the past few days has been about 20 MB/day; hard to believe I have reached a limit.

@NickMoutsios

Interesting... It just works again; I touched nothing other than visiting the API dashboard to check what was going on. These computers nowadays....

@singhravi1

Yes, this is what is happening: it gives the error randomly.
This needs to be fixed.

@javaarchive

It happened to me and then I reran the program and everything was fixed.

@herbat73

herbat73 commented Feb 7, 2019

It occurs randomly for me when I upload one large file (60 MB)

gdrive: 2.1.0
Golang: go1.6
OS/Arch: linux/amd64

@cnrting

cnrting commented Feb 8, 2019

+1 rerun is ok

@andresmitre

I don't think it's a size limit: I uploaded around 1 GB yesterday, and now I cannot upload around 50 MB. P.S. I'm on a different Internet connection.

gdrive: 2.1.0
OS: linux (MINT)

@javaarchive

For me, this error occurred the first time I used gdrive. Rerunning it fixed everything.
Note: I'm on a G Suite Account

@andresmitre

I've tried multiple connections but without success. It sucks...

@javaarchive

Probably somebody could add an auto-restart feature

@311u1

311u1 commented Feb 25, 2019

I encountered the same issue yesterday when I used the --recursive option on all files within a folder. The empty folder was created on Google Drive, but none of the files were, and I received this error. The Google API had my rate limit at 1,000 calls per 100 seconds.

Does anyone know what is making it throw the error?

@AquariusPower

In my script's retry loop, I catch the output and grep for "Failed.*rateLimitExceeded"; after 10 s it tries again until it works. It usually doesn't take more than 3 retries :).
It happens with files of any size, apparently.
I think (hope) they may improve the servers to handle more satisfied users soon! xD
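The retry loop described above can be sketched in bash as follows. This is a minimal sketch, not the commenter's actual script: the function name, the `RETRY_DELAY` knob, and the gdrive invocation in the example are assumptions.

```shell
#!/usr/bin/env bash
# Retry wrapper: run a command, grep its output for the rate-limit error,
# wait, then try again. RETRY_DELAY (default 10 s) is an added knob, not
# part of the original script described in the thread.

retry_on_ratelimit() {
    local out
    while true; do
        out="$("$@" 2>&1)"
        if echo "$out" | grep -q 'Failed.*rateLimitExceeded'; then
            echo "rate limited, retrying in ${RETRY_DELAY:-10}s..." >&2
            sleep "${RETRY_DELAY:-10}"
        else
            echo "$out"
            return 0
        fi
    done
}

# Example (hypothetical invocation):
# retry_on_ratelimit ./gdrive upload myfile.tar.gz
```

Note the wrapper retries forever; a cap on the number of attempts would be a sensible addition for unattended use.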

@frankli0324

frankli0324 commented Feb 27, 2019

In my script's retry loop, I catch the output and grep for "Failed.*rateLimitExceeded"; after 10 s it tries again until it works. It usually doesn't take more than 3 retries :).
It happens with files of any size, apparently.
I think (hope) they may improve the servers to handle more satisfied users soon! xD

if you do

while true; do
    ./gdrive download {asdf} --recursive --skip
    # grep the output, sleep, etc.
done

you are still re-calling the API for every single file, including the files that you've already downloaded, so it's still quite possible to get another 403 error before you come to the next file to download

  • reason: gdrive only checks whether a file is already there in the "saveFile" function, AFTER downloading it

my dirty fix:
edit $GO_DIR/src/.../gdrive/drive/download.go at around line 245

for _, f := range files {
    // Copy args and update changed fields
    newArgs := args
    newArgs.Path = newPath
    newArgs.Id = f.Id
    newArgs.Stdout = false
retry:
    err = self.downloadRecursive(newArgs)
    if err != nil {
        fmt.Println("retry after 5 seconds")
        time.Sleep(5 * time.Second)
        goto retry // keep retrying instead of returning err
    }
}

dirty, but (kind of) works

still looking forward to a fix from repo owner
suggestion: use API keys provided by users instead of one app shared by everyone


you know what, inspired by wireless protocols, I came up with an idea:
sleep for a random number of seconds!
if everyone complies with this rule, we would be less likely to "collide" with each other, which is what causes the "overspeeding"
just kidding XD


upd: do this only when you are in a rush
check out @euklid's fork for a stable fix
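The "sleep for random seconds" idea above is, only half-jokingly, the real fix: randomized exponential backoff, the same collision-avoidance trick wireless protocols use. A minimal bash sketch; `backoff_delay` and the gdrive command in the usage comment are assumptions, not part of the tool.

```shell
#!/usr/bin/env bash
# backoff_delay N prints a random delay in [0, 2^N) seconds. Growing the
# window each attempt and picking a random point in it makes it unlikely
# that concurrent clients retry at the same moment.

backoff_delay() {
    local attempt=$1
    local ceiling=$((1 << attempt))   # 2^attempt seconds
    echo $((RANDOM % ceiling))
}

# Usage sketch (hypothetical):
# for attempt in 1 2 3 4 5; do
#     ./gdrive upload myfile.tar.gz && break
#     sleep "$(backoff_delay "$attempt")"
# done
```

This is also what Google's own API guidance recommends for 403 rate-limit errors.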

@AquariusPower

cool, I will try that script, thanks!
btw, I never use --recursive; I only work with single files, recreating each directory remotely as needed :)

@guylabbe

guylabbe commented Mar 9, 2019

It works when you run the command a second time.

@arjanna

arjanna commented Mar 14, 2019

Same problem here, but I don't use --recursive and I am not downloading files: I upload them with
~/gdrive sync upload --keep-largest

Any help is appreciated!

@petarov

petarov commented Mar 14, 2019

Same problem as @arjanna describes on my side. However, looking at my sync logs going back to 2017, I think this problem started occurring only recently for me. The first log entry is from May 2018, and it seems to come and go. I think this is somehow Google-related. I found no explicit info about rate limits for user-authorized tokens.

@arjanna

arjanna commented Mar 14, 2019

Same problem as @arjanna describes on my side. However, looking at my sync logs going back to 2017, I think this problem started occurring only recently for me. The first log entry is from May 2018, and it seems to come and go. I think this is somehow Google-related. I found no explicit info about rate limits for user-authorized tokens.

I was working around the issue and I found a solution here: #426 I hope it works for you, @petarov!
This fixes the problem from my laptop, but I have to upload files from a supercomputer, where I don't have sudo permissions, so I can't use this fix there.

@tiomno

tiomno commented Mar 19, 2019

Interesting... It just works again; I touched nothing other than visiting the API dashboard to check what was going on. These computers nowadays....

This worked for me!!! 🎉 🎈 🎂 Definitely an odd validation condition on the API. 🤔

@andresmitre

I gave up and switched to grive2. It's actually much the same.

@AquariusPower

AquariusPower commented Mar 19, 2019

@andresmitre I moved from grive2 to gdrive because, to upload anything, even 1 KB, I always had to download about 5 MB (the whole remote file list, with info to compare locally), which would use up a lot of my daily quota, and there was absolutely nothing I could do to make it one-way (force upload of only local changes)

but I had to create my own big script to deal with everything :>
