Last post Dec 22, 2016 08:11 AM by PatriceSc
Dec 21, 2016 10:59 AM|tariqadib|LINK
I'm trying to build a small application that transfers files between Windows and Linux over HTTP/HTTPS. I have been able to transfer files using RESTful WCF services as well as traditional ASMX services, but I also need the service to handle large
files, and over an unreliable connection at that.
How do I write a method that breaks the data into chunks and/or allows a transfer to be resumed?
Dec 21, 2016 11:12 AM|mgebhard|LINK
I would use FTP, since that's literally what it's for: File Transfer Protocol.
Dec 21, 2016 11:22 AM|tariqadib|LINK
Yeah, but I need to use HTTP. Any suggestions?
Dec 21, 2016 12:17 PM|PatriceSc|LINK
Which side are you on, or do you control both sides? AFAIK this is often done using byte serving
(https://en.wikipedia.org/wiki/Byte_serving), which lets a client ask the server for a particular part of a resource. Or maybe as part of your existing service?
By the way, I'm not sure, but I believe that Accept-Ranges is for downloads. I'm not sure you can upload in multiple steps and resume an upload. Try
http://stackoverflow.com/questions/1830130/resume-uploads-using-http (and of course you can always implement your own mechanism IF you control both sides).
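To give you an idea, a byte-range request in .NET could look something like this rough sketch (the URL, chunk size, and file name are just placeholders, and the server has to support range requests for this to work):

```csharp
using System;
using System.IO;
using System.Net;

class RangeDownloadDemo
{
    // The Range header value a byte-serving request sends, e.g. "bytes=0-1023".
    public static string BuildRangeHeader(long from, long to)
    {
        return "bytes=" + from + "-" + to;
    }

    // Downloads only bytes [from, to] of the resource and appends them to a local file.
    public static void DownloadRange(string url, long from, long to, string path)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.AddRange(from, to); // sends "Range: bytes=<from>-<to>"

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var body = response.GetResponseStream())
        using (var file = new FileStream(path, FileMode.Append))
        {
            // A range-aware server answers 206 PartialContent, not 200 OK.
            Console.WriteLine(response.StatusCode);
            body.CopyTo(file);
        }
    }

    static void Main(string[] args)
    {
        Console.WriteLine(BuildRangeHeader(0, 1023));
        if (args.Length == 1)
            DownloadRange(args[0], 0, 1023, "chunk.part");
    }
}
```

You can check whether the server supports this at all by looking for an Accept-Ranges: bytes header in its responses.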
Dec 22, 2016 04:49 AM|tariqadib|LINK
I do have control of both sides, but I need the solution to be in .NET. The link you posted uses JavaScript. Can you suggest something in .NET?
Dec 22, 2016 08:11 AM|PatriceSc|LINK
For now I can't really find an end-to-end scenario. Maybe something such as the "download manager" found in most browsers: the content would be downloaded to a temp file, so you could resume later from where you stopped by requesting just the rest of the file in multiple steps (byte ranges).
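A rough sketch of that resume logic in .NET (URL and file names are placeholders; it assumes the server honours open-ended Range requests):

```csharp
using System;
using System.IO;
using System.Net;

class ResumableDownload
{
    // Where to restart from: however many bytes we already have on disk.
    public static long ResumeOffset(string tempPath)
    {
        return File.Exists(tempPath) ? new FileInfo(tempPath).Length : 0;
    }

    // Resumes an interrupted download by requesting only the remaining bytes.
    public static void Resume(string url, string tempPath)
    {
        long offset = ResumeOffset(tempPath);
        var request = (HttpWebRequest)WebRequest.Create(url);
        if (offset > 0)
            request.AddRange(offset); // "Range: bytes=<offset>-" = everything still missing

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var body = response.GetResponseStream())
        using (var file = new FileStream(tempPath, FileMode.Append))
        {
            body.CopyTo(file);
        }
        // Once the full length has arrived, rename tempPath to the final file name.
    }

    static void Main(string[] args)
    {
        Console.WriteLine(ResumeOffset("missing.part"));
        if (args.Length == 2)
            Resume(args[0], args[1]);
    }
}
```

If the connection drops mid-transfer, you just call Resume again and it picks up from the current temp-file length.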
Not sure if something similar could be done for putting files on the server.
As you have control of both sides, another option could be to implement your own partial download/upload "protocol" at the service level.
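Such a "protocol" could be as simple as posting fixed-size chunks with a custom offset header that your service understands. A sketch of the client side (the endpoint and the X-Upload-Offset header are invented for this example, not any standard; the server would write each body at the given offset):

```csharp
using System;
using System.IO;
using System.Net;

class ChunkedUploadClient
{
    // How many fixed-size chunks a file of this length splits into.
    public static long ChunkCount(long fileLength, int chunkSize)
    {
        return (fileLength + chunkSize - 1) / chunkSize;
    }

    // Sends one chunk; the (hypothetical) service reads X-Upload-Offset
    // and writes the body at that position in the target file.
    public static void UploadChunk(string url, string filePath, long offset, int chunkSize)
    {
        var buffer = new byte[chunkSize];
        int read;
        using (var file = File.OpenRead(filePath))
        {
            file.Seek(offset, SeekOrigin.Begin);
            read = file.Read(buffer, 0, chunkSize);
        }

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.Headers["X-Upload-Offset"] = offset.ToString(); // custom header both sides agree on
        request.ContentLength = read;
        using (var body = request.GetRequestStream())
            body.Write(buffer, 0, read);
        using (var response = (HttpWebResponse)request.GetResponse()) { }
        // On failure, ask the server how many bytes it already has and retry from there.
    }

    static void Main()
    {
        Console.WriteLine(ChunkCount(10000000, 65536));
    }
}
```

The nice part of rolling your own is that resume is trivial: the client asks the service for the current size of the partial file and restarts the loop at that offset.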