Last post Jan 04, 2021 02:29 PM by Manannan
Jan 03, 2021 04:55 AM|Manannan
I am using ASP.NET SignalR to create a realtime game in the browser. It seems to run fine, apart from occasional stutters in the display output.
I first noticed this when I deployed my application to a virtual PC in the cloud. To rule out bandwidth issues, I then deployed it to another virtual PC running locally. Even when running the application from the local virtual PC, I still get the stutters.
This issue does not happen when I run the application on localhost using IIS express.
This makes me think that the issue has something to do with IIS, since both the cloud virtual PC and the local virtual PC have this in common, and localhost does not.
After Googling, I found lots of people advising checking certain settings in IIS. I checked the websocket config:
The receive buffer limit is set to 4194304 bytes, which seems more than enough.
Another suggestion was to check the queue length in the advanced settings of the App Pool. Currently it is 1000. I believe that this is enough, as, according to Perfmon, the number of requests received peaks at about 100 per second.
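For reference, both of these settings can also be checked directly in the IIS configuration files rather than through the UI. A sketch of where they live (the pool name below is a placeholder; your values may differ):

```xml
<!-- %windir%\System32\inetsrv\config\applicationHost.config -->

<!-- WebSocket receive buffer limit; 4194304 bytes is the IIS default -->
<system.webServer>
  <webSocket enabled="true" receiveBufferLimit="4194304" />
</system.webServer>

<!-- HTTP.sys request queue length, set per application pool -->
<system.applicationHost>
  <applicationPools>
    <add name="MyAppPool" queueLength="1000" />
  </applicationPools>
</system.applicationHost>
```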
Has anyone else got any ideas? Thanks
Just to be clear, I am using WebSockets as the transport (I think that is the right term), rather than long polling or anything else.
Jan 04, 2021 10:18 AM|XuDong Peng
While the program is running, is there any data being lost in the requests and responses?
You could open the browser developer tools, go to the Network tab, and filter to WebSocket traffic (choose 'WS').
Something like this (screenshot omitted):
If the program runs normally and the data is correct, but there is just a stutter, then I think the cause may be related to the hardware. Because the virtual machine needs to emulate the underlying hardware instructions, it takes up CPU on the host machine, thereby reducing the computer's performance.
Jan 04, 2021 02:29 PM|Manannan
Thank you for your detailed response.
Let me answer your questions one by one.
The cause is certainly not a lost request/response.
Let me provide a little more information: the game runs entirely on the server, and simply sends screen updates to the client. The only time data is sent from the client to the server is to tell the game that the user has pushed a button. The client front end is a 'dumb terminal', if you know what I mean.
So when the game is left to run on its own, without any user input, there are no requests being initiated from the client. I still see the stutter in this case, so we can rule out dropped requests from the client to the server.
That of course leaves open dropped requests from server to client. I opened up the developer tools as you suggested. It is hard for me to tell if frames are being dropped by looking at this data. There are many screen update requests from the server, which are not very intelligible to a human.
A faster way to check whether frames are being dropped is to look at the screen. I set the game character to walk from one side of the screen to the other on the fastest setting. What I see is that the character freezes in the middle of his walk, and then resumes walking.
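One way I could quantify this, rather than eyeballing it, would be to timestamp each frame as it arrives on the client and report any unusually long gaps. A rough sketch (the 100 ms threshold is arbitrary, and recordFrame() is a hypothetical hook I would call from the SignalR message handler):

```javascript
// Collect arrival times of server frame messages and report gaps
// longer than a threshold. In the real client, recordFrame() would
// be called from the SignalR 'screen update' message handler.
function makeStutterMonitor(thresholdMs) {
  const arrivals = [];
  return {
    recordFrame(now = Date.now()) {
      arrivals.push(now);
    },
    // Return [frameIndex, gapMs] pairs for every inter-frame gap
    // that exceeds the threshold.
    gaps() {
      const out = [];
      for (let i = 1; i < arrivals.length; i++) {
        const gap = arrivals[i] - arrivals[i - 1];
        if (gap > thresholdMs) out.push([i - 1, gap]);
      }
      return out;
    },
  };
}
```

Logging the output of gaps() during the walk test would show exactly when, and for how long, the stream stalls.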
This suggests to me that there are intermittent delays in data getting to the client. If frames were being dropped, we would expect to see screen corruption, because the screen update command that 'scrubs' the previous image of the character as he walks from one position to another would be dropped at some point.
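My reasoning about dropped frames versus delayed frames can be sketched as a toy simulation (the scrub/draw command format here is invented for illustration; it is not the actual protocol):

```javascript
// Toy model of a 'scrub then draw' update stream. Each frame first
// clears the character's old cell, then draws it in the new cell.
// Dropping a frame leaves a stale sprite behind (corruption);
// merely delaying frames never does.
function applyFrames(frames) {
  const screen = new Set(); // cells currently showing the character
  for (const f of frames) {
    screen.delete(f.scrub); // erase the previous position
    screen.add(f.draw);     // draw the new position
  }
  return [...screen];
}

const frames = [
  { scrub: 0, draw: 1 },
  { scrub: 1, draw: 2 },
  { scrub: 2, draw: 3 },
];

// All frames delivered (even if late): one character on screen.
applyFrames(frames); // -> [3]

// Middle frame dropped: the stale sprite at cell 1 is never scrubbed.
applyFrames([frames[0], frames[2]]); // -> [1, 3]
```

Since I see a clean freeze-and-resume with no leftover sprites, delay rather than loss fits what is on screen.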
I checked the CPU usage as you suggested, using the Perfmon 'Processor Time' counter. I can see that some of the cores are running at close to 100% at times, which suggests to me that it may be a CPU usage issue, and I will have to dig into the code to find the cause.
I am going to close this issue for now, and re-measure after fixing the high CPU usage.
Thank you for your help.