Jun 16, 2016 07:09 AM | psib3r
I have readerQuotas set like so:
<readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
Do those sizes have any detrimental effects? In theory, I suppose you should keep them as small as possible when transferring. Has anyone used reflection to change them programmatically, and if so, does that work on a per-call basis, or is it a global change?
In other words, if I make the quotas small programmatically, will it affect somebody else's call that is receiving large data, or will there be a conflict/race situation?
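For what it's worth, the quotas can also be set in code without reflection, since `ReaderQuotas` is a public property on the standard bindings. A sketch, assuming a `BasicHttpBinding` (the binding type and the specific numbers are just illustrative):

```csharp
// Sketch: setting readerQuotas in code rather than in config.
// The same pattern applies to e.g. WSHttpBinding and NetTcpBinding.
var binding = new BasicHttpBinding();
binding.ReaderQuotas.MaxDepth = 64;
binding.ReaderQuotas.MaxStringContentLength = 1024 * 1024; // 1 MB of text
binding.ReaderQuotas.MaxArrayLength = 1024 * 1024;
binding.ReaderQuotas.MaxBytesPerRead = 64 * 1024;
binding.ReaderQuotas.MaxNameTableCharCount = 16 * 1024;
```

Note that the quotas belong to the binding, and the binding is shared by every channel built from it, so this is effectively a per-endpoint setting rather than a per-call one: shrinking the quotas would affect every caller on that endpoint.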
Jun 20, 2016 05:13 PM | PatriceSc
IMO, changing this dynamically for each and every request would in practice still have to allow the largest values you'll ever need. So you'd end up doing extra work and research for no benefit at all, at the price of increased complexity (compared with simply defining, once and for all, the highest values you want to allow).
Now, instead of jumping directly to excessively high values (you likely won't ever use an array with 2,147,483,647 elements) and doing so for every setting, I would increase each individual value only when really needed (maybe with a x2 factor each time).
That should let you find realistic values that work fairly quickly.
Alternatively, start with high values, but then use the IIS logs or perhaps performance counters to see what traffic is realistic and set something a bit closer to that. Then, when a problem happens, you'll be able to investigate and maybe
discover that the actual issue is someone requesting far more data than they really need, etc.