How to upload big files to S3 on a flaky connection?























I have a bunch of files (between 500MB and 7GB) that I need to upload to an S3 bucket. My connection is very flaky.



I tried uploading a 500MB file via s3cmd, but it timed out after it was 91% done (which had taken 16 hours).



Then I tried CyberDuck, but the same thing happened. It failed after 20% or so, and when I tried to retry the transfer, it started over from the beginning. CyberDuck is supposed to have multipart support, but apparently not...



I could split the files into smaller pieces, as in "How do I split a .zip file into multiple segments?", but I'd rather not unless it's my only option. What is a good program that will let me upload big files to S3 with resume support?
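For a rough sense of what multipart support would buy here: if a tool uploads in independent parts, a dropped connection should only cost the part in flight, not the whole file. A minimal back-of-the-envelope sketch (the 5MB figure is S3's minimum part size; S3 also caps an upload at 10,000 parts):

```shell
# part_count TOTAL_MB CHUNK_MB -> number of parts (ceiling division).
# With independent parts, a failure wastes at most CHUNK_MB of transfer
# instead of the whole file.
part_count() {
  echo $(( ($1 + $2 - 1) / $2 ))
}

part_count 500 5     # a 500MB file becomes 100 parts
part_count 7168 5    # a 7GB file becomes 1434 parts, well under the 10,000 cap
```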






































      Tags: internet-connection, upload, amazon-s3






      asked Jan 1 '13 at 21:57 by user2254 on Super User; edited Mar 20 '17 at 10:16 by Community






















          3 Answers




































          I just tried s3tools (s3cmd 1.5.0-rc1) and got this hint from their FAQ, http://s3tools.org/kb/item13.htm, quoted below.




          Does s3cmd support multipart uploads?



          Yes, the latest version of s3cmd
          supports Amazon S3 multipart uploads.



          Multipart uploads are automatically used when a file to upload is
          larger than 15MB. In that case the file is split into multiple parts,
          with each part of 15MB in size (the last part can be smaller). Each
          part is then uploaded separately and then reconstructed at destination
          when the transfer is completed.



          With this new feature, if an upload of a part fails, it can be
          restarted without affecting any of the other parts already uploaded.



          There are two options related to multipart uploads in s3cmd. They are:



          --disable-multipart



          Disable multipart uploads for all files



          and



          --multipart-chunk-size-mb=SIZE



          Size of each chunk of a multipart upload. Files bigger than SIZE are automatically uploaded as
          multithreaded-multipart, smaller files are uploaded using the
          traditional method. SIZE is in Mega-Bytes, default chunk size is 15MB,
          minimum allowed chunk size is 5MB, maximum is 5GB.




          So when I upload, I choose the smallest chunk size. In the output below you can see the upload being split into parts and then resumed after a failure.



          $ s3cmd put --multipart-chunk-size-mb=5 some_video.mp4 s3://some_bucket/

          some_video.mp4 -> s3://some_bucket/some_video.mp4 [part 1 of 52, 5MB]
          5242880 of 5242880 100% in 164s 31.08 kB/s done
          some_video.mp4 -> s3://some_bucket/some_video.mp4 [part 2 of 52, 5MB]
          5242880 of 5242880 100% in 193s 26.46 kB/s done
          some_video.mp4 -> s3://some_bucket/some_video.mp4 [part 3 of 52, 5MB]
          2023424 of 5242880 38% in 135s 14.59 kB/s^CERROR:
          'some_video.mp4' part 3 failed. Use
          /usr/local/bin/s3cmd abortmp s3://some_bucket/some_video.mp4 XXX_SOME_HASH_XXX
          to abort the upload, or
          /usr/local/bin/s3cmd --upload-id XXX_SOME_HASH_XXX put ...
          to continue the upload.
          See ya!


          Then I resume.



          /usr/local/bin/s3cmd --upload-id XXX_SOME_HASH_XXX put --multipart-chunk-size-mb=5 some_video.mp4 s3://some_bucket/














































            I believe that in Cyberduck, in the Transfers window, you can right-click and select Resume.

            If that doesn't work, CloudBerry supports resuming uploads.



























            • For some reason in Cyberduck my resume does not work for S3 multi-part. Any hints?
              – f01
              Dec 18 '14 at 8:38































            You can use FileZilla Pro to transfer files to and from an S3 bucket. FileZilla Pro supports multipart uploads, and in case of failure it will resume the transfer.

            For the record, it comes with a lot of other features: large-file support, bulk transfers, filters, directory comparison, remote file search, drag & drop, and configurable speed limits.

            I'm a member of the FileZilla Pro team.

            Learn more at https://filezillapro.com and at https://youtube.com/c/FileZillaPro





























            • Please note that if you are in any way affiliated with the product you have to disclose that in your answer.
              – confetti
              Nov 22 at 13:59










            • @confetti, sorry, I didn't know that. I couldn't find anything specific in the help or in the code of conduct; I actually assumed it was forbidden. Added the information.
              – josuegomes
              Nov 23 at 16:12












            • No problem, I can't find it in the help text either so maybe that's something the mods should add, but it's definitely allowed! It might even be better in general since people know they can ask something specific to the software in the comments and receive a sort-of "official" answer. :)
              – confetti
              Nov 23 at 16:43












s3cmd answer: answered Dec 20 '14 at 1:21 by f01
























Cyberduck/CloudBerry answer: answered Jan 1 '13 at 22:09 by ernie












FileZilla Pro answer: answered Nov 22 at 13:32 by josuegomes, edited Nov 23 at 16:14











