Last post Sep 21, 2012 12:56 PM by cognitix360
Jan 05, 2012 08:56 PM
> I saw this solution and tried it on my application, it works fine. But if I increase the value of this key, does that mean that my application can be easily attacked?
Well, the "attack" is that someone can more easily run a denial of service against your site by forcing the submitted form elements to be inserted into the hashtable in the most sub-optimal way possible. So someone can make your site run slow and chew up 100% CPU just by throwing a few hundred KB of data at it.
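To make the mechanics concrete, here is a small self-contained sketch (in Python rather than .NET, purely for illustration, not how ASP.NET's table is actually implemented) of why forced collisions hurt: when every key hashes to the same bucket, each insert must compare against all previous keys, so total work grows quadratically with the number of form fields.

```python
import time

class CollidingKey:
    """Stand-in for an attacker-crafted form field name: every
    instance hashes to the same bucket."""
    def __init__(self, n):
        self.n = n
    def __hash__(self):
        return 0  # force every key into one hash bucket
    def __eq__(self, other):
        return isinstance(other, CollidingKey) and self.n == other.n

def fill_table(count):
    """Insert `count` colliding keys and return the elapsed time."""
    start = time.perf_counter()
    table = {}
    for i in range(count):
        table[CollidingKey(i)] = i  # each insert scans all prior colliding keys
    return time.perf_counter() - start

if __name__ == "__main__":
    # Doubling the key count roughly quadruples the time, i.e. O(n^2).
    print(fill_table(1000), fill_table(2000))
```

With well-distributed hashes the same insertions would be roughly linear; that gap is the whole attack.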
It is hard to say whether this is a real concern or not. Someone who wants to DoS your site can simply throw enough traffic at it, no matter what, or find any other page on your site that takes a long time to compute.
I would suggest turning up the value and not worrying about it. However, reconfiguring your web application to use fewer form fields would be the ideal solution. If having that many form fields at once is truly necessary, then some sort of AJAX partial submits
would be a good solution - not only for site performance, but for your users.
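For anyone landing on this thread later: on .NET 2.0 through 4.0 the limit being discussed can be raised with an appSettings entry in web.config. The key name `aspnet:MaxHttpCollectionKeys` is the one referenced throughout this thread; the value 2000 below is only an example, pick whatever your forms actually need.

```xml
<configuration>
  <appSettings>
    <!-- Raise the MS11-100 form-field limit; 2000 is an example value. -->
    <add key="aspnet:MaxHttpCollectionKeys" value="2000" />
  </appSettings>
</configuration>
```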
Jan 05, 2012 09:54 PM
This appears to me to be a case where the cure is worse than the disease.
The patch broke our applications as well and the
MaxHttpCollectionKeys fix worked. Thanks!
If this exploit were so "easy", why didn't the Internet just stop, or at least the parts of it using .NET?
Reading further into the release notes, I would recommend that everyone rename their local "administrator" account so that attackers don't know a valid local account ID, or at least not that one.
But that is a recommendation I have made for years just on general security principles, along with moving INETPUB off of C: (I know, call me old-fashioned).
If you really want to mess with them, rename "NoGuest" to "administrator" so that they end up with even less access than the account they started with.
Jan 06, 2012 03:31 PM
@tbannar: I'm currently doing some additional research. Unfortunately we no longer operate any Server 2003 machines in our network, so the version numbers may be different on your system.
You might try searching for files named "System.Web.dll" on your server and checking their timestamps and version numbers. The one contained in the update should have a version number greater than 4.0.30319.1 and a timestamp later than November 2011 (mine is 4.0.30319.272, dated December 26th, 2011 03:54:00).
[Screenshot: System.Web.dll 4 before update]
[Screenshot: System.Web.dll 4 after update]
To check in detail, open .NET Reflector, select "File > Open" from the menu, point it at each System.Web.dll you found, and navigate to "System > Web > Util > AppSettings" (as it says in the call stack of the exception you previously posted).
Then see if there is an entry called "MaxJsonDeserializerMembers". If not, one of these actions probably needs to be taken:
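If Reflector isn't handy, a cruder heuristic (my own improvisation, not an official check) is to search the assembly's raw bytes for that setting name, since the patched System.Web.dll embeds it as a metadata string and the unpatched one does not:

```python
def has_patch_marker(dll_path, marker=b"MaxJsonDeserializerMembers"):
    """Return True if the raw bytes of the assembly contain the
    setting name; a rough sign the MS11-100 update is applied."""
    with open(dll_path, "rb") as f:
        return marker in f.read()

# Example call (path is illustrative, adjust for your framework folder):
# has_patch_marker(r"C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Web.dll")
```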
Feb 04, 2012 03:12 AM
I spent a day in hell yesterday due to the POS known as MS11-100. I had to change 50 machine.configs on 15 servers and even reboot two of them in production, because they had a .NET 1.1 app on them and the only way to set the limit for 1.1 is to make a registry setting.
What a clusterf__k. I demand to know who was the knucklehead at MS who decided that 1000 data elements was enough! Further, the impact of this POS was not noted in any of the release notes on the KB articles, so I can't even blame it on our testers, who did regression testing before allowing the patches into production.
I have been in the IT field for 27 years and I would never treat my Customers the way Microsoft does by issuing patches that they had to know would break some people's production servers. If they didn't know, then they are more stupid than I thought.
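For reference, a sketch of what that .NET 1.1 registry setting looks like. The key path and value name below are the ones commonly cited for the MS11-100 override on 1.1; please verify them against the KB article for the update on your system before applying, as I have not confirmed this on a live server (dword 2710 hex is 10000 decimal, chosen arbitrarily here).

```
Windows Registry Editor Version 5.00

; Example only - confirm the exact path in the MS11-100 / KB release notes
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET\1.1.4322.0]
"MaxHttpCollectionKeys"=dword:00002710
```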
Feb 10, 2012 03:05 PM
I ran into the same problem after a Windows update on our servers. I had to make registry changes to get our site working again, as we are running .NET Framework 1.1.
Feb 10, 2012 03:19 PM
I encourage everyone on this thread to complain to their Microsoft rep about this impact, as we have. The rep stated that none of his other clients had reported the same impact, but I suspect people are suffering in silence. We need to send a message to Microsoft that there are many reasons people move to different platforms. And on the home front, I think I have seen my last PC.
Sep 21, 2012 12:56 PM
I fully agree with the criticisms of this "fix". Microsoft's making this arbitrary change is INCREDIBLY destructive. Web applications that worked fine suddenly stop working in a very aggressive fashion and it's been extremely time-consuming to find out why.
For any complex web application the value of 1000 is too small. Indeed, their count doesn't even seem correct - I've had it throw the error with only 140 <input> tags and the default value of 1000.
Very poor, Microsoft. Why could you not default it to not use this value, or provide an easy way to turn off this intrusive and unnecessary "security" feature, for instance by setting it to -1?
The main problem is that Microsoft don't appear to understand that there is a difference between public websites - which might suffer from DOS attacks - and private, application-delivering sites, which are typically either run on an Intranet or on a highly
secured site running over the Internet. As "flashfearless" says, everyone ought to complain to Microsoft about this ludicrous "correction".