Last post Feb 26, 2013 03:41 AM by Rion Williams
Feb 25, 2013 08:49 AM
I have a self-hosted SignalR hub that sends huge data sets in JSON format to the client. On average, I am sending close to 10 MB of data. Is there a way that I can compress the JSON objects before they are sent to the client?
Thanks in advance for your advice.
Feb 25, 2013 12:06 PM
This Stack Overflow discussion covers an attempt to use gzip compression along with SignalR. It suggests using a custom HttpClient to handle decompressing the compressed messages after they come across. It may be worth looking into, although there are some SignalR experts around here who may be able to help out a bit more.
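To illustrate the general idea (this is not SignalR's own client plumbing): a plain .NET HttpClient can transparently decompress gzip responses when AutomaticDecompression is enabled on its handler. A minimal sketch, with a placeholder URL:

    using System;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    class GzipClientSketch
    {
        static async Task Main()
        {
            // Ask the handler to decompress gzip/deflate responses automatically.
            var handler = new HttpClientHandler
            {
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };

            using (var client = new HttpClient(handler))
            {
                // Placeholder endpoint; a real setup would plug a similar handler
                // into whatever HTTP layer the SignalR client exposes.
                string json = await client.GetStringAsync("http://localhost:8080/data");
                Console.WriteLine(json.Length);
            }
        }
    }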
Feb 26, 2013 12:51 AM
Thanks Rion for the response.
Took a look at the Stack Overflow thread, and it seems like people are not too optimistic about compression and data streaming.
Perhaps I can compress the JSON myself before I send it over to SignalR ;-)
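For what it's worth, per-message compression can be done by hand: gzip the serialized JSON on the server, send it as a Base64 string, and reverse the steps on the client. A rough sketch, assuming a Microsoft.AspNet.SignalR hub; the hub name and the receiveCompressed client callback are made up for illustration:

    using System;
    using System.IO;
    using System.IO.Compression;
    using System.Text;
    using Microsoft.AspNet.SignalR;

    public class DataHub : Hub   // hypothetical hub
    {
        public void SendLargeDataset(string json)
        {
            // Gzip the JSON and ship it as Base64 text; the client reverses this.
            byte[] raw = Encoding.UTF8.GetBytes(json);
            using (var buffer = new MemoryStream())
            {
                using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
                {
                    gzip.Write(raw, 0, raw.Length);
                }
                string payload = Convert.ToBase64String(buffer.ToArray());
                Clients.All.receiveCompressed(payload); // hypothetical client callback
            }
        }

        // The receiving side undoes Base64 + gzip to get the original JSON back.
        public static string Decompress(string payload)
        {
            byte[] compressed = Convert.FromBase64String(payload);
            using (var input = new MemoryStream(compressed))
            using (var gzip = new GZipStream(input, CompressionMode.Decompress))
            using (var reader = new StreamReader(gzip, Encoding.UTF8))
            {
                return reader.ReadToEnd();
            }
        }
    }

Note that Base64 adds roughly a third back on top of the compressed size, so the net saving depends on how compressible the JSON is.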
Feb 26, 2013 03:25 AM
Streaming and compression are incompatible unless you do your own custom compression and decompression. You can gzip just fine if you're willing to use only long polling, since gzip requires the whole response to be read before compressing. The problem is there's no "whole stream" when you're streaming infinitely.
Feb 26, 2013 03:41 AM
I would trust David's response to this issue (as he was the lead developer of SignalR). Just another one of the great things about these forums :)