Hey,
Recently we tried to switch our game server project from Windows to Linux. This included changing our server framework from .NET Framework 4.6.1 to .NET (Core) 5.0, which allows us to run the game server natively on Linux or in a Linux Docker container (instead of using Mono on .NET Framework, which apparently has some performance issues).
The only code change required for Linux was replacing the exit routine: SetConsoleCtrlHandler (example: https://docs.microsoft.com/en-us/dotnet/api/system.gc.keepalive?view=net-5.0) with Mono.UnixSignal. This change was needed to catch exit events (Ctrl+C, killing the process, etc.) so we can do some final cleanup.
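For context, a minimal sketch of the Mono.Unix approach (the cleanup call is a placeholder, not our actual code):

```csharp
using System;
using Mono.Unix;
using Mono.Unix.Native;

class Program
{
    static void Main()
    {
        // Register the exit signals we want to handle.
        var signals = new UnixSignal[]
        {
            new UnixSignal(Signum.SIGINT),   // Ctrl+C
            new UnixSignal(Signum.SIGTERM)   // kill <pid>
        };

        // Block (on a dedicated thread in the real server) until one arrives.
        int index = UnixSignal.WaitAny(signals, -1);
        Console.WriteLine($"Received {signals[index].Signum}, shutting down");

        // Placeholder: final cleanup would run here before exiting.
    }
}
```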
Locally the server seems to run fine; there are no big differences besides CPU usage, which seems slightly increased on Linux even when the process is idle.
On the remote machine, however, the game seems to lag when running on Linux because the client gets out of sync with the server. On the same hardware on Windows, this doesn't seem to happen.
Example log from the server on Linux (explanation below):
...
741.437 (UTC:11:52:15.962): CPU: 2%, Memory: 441 MB, networkMessagesSent: 8800 (2 since last check)
741.521 (UTC:11:52:16.046): SyncTimeHeartbeat sent for 741500
743.232 (UTC:11:52:17.757): SyncTimeHeartbeat sent for 742500
743.437 (UTC:11:52:17.962): CPU: 1%, Memory: 441 MB, networkMessagesSent: 8802 (2 since last check)
...
In this example you can see two prints of the performance monitoring (lines 1 and 4) and two prints of a heartbeat that syncs the client and the server (lines 2 and 3).
The CPU monitoring runs at an exact 2-second interval and the heartbeat is supposed to sync every 1 second.
The second heartbeat is a bit late though (after 1.7 seconds): it should have synced at 742.500 but only fired at 743.232. The next heartbeat after that would then trigger after only ~0.3 seconds, so something seems to get stuck in the pipeline and we are not sure what exactly or why. For some reason the performance monitoring stays perfectly in sync every 2 seconds.
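To illustrate why a late tick is followed by a short one: this is the pattern you get when a loop schedules against absolute deadlines rather than "now + 1s". This is only a hypothetical sketch of such a loop (SendSyncTimeHeartbeat is a placeholder), not our actual heartbeat code:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class HeartbeatSketch
{
    // Hypothetical deadline-based heartbeat: if one tick fires 0.7 s late,
    // the next wait shrinks to ~0.3 s so the schedule catches up --
    // matching the short gap seen in the log above.
    static async Task HeartbeatLoop(CancellationToken token)
    {
        var interval = TimeSpan.FromSeconds(1);
        var next = DateTime.UtcNow + interval;
        while (!token.IsCancellationRequested)
        {
            var wait = next - DateTime.UtcNow;
            if (wait > TimeSpan.Zero)
                await Task.Delay(wait, token);
            SendSyncTimeHeartbeat();   // placeholder for the real send
            next += interval;          // absolute deadline, not "now + 1s"
        }
    }

    static void SendSyncTimeHeartbeat()
    {
        Console.WriteLine($"SyncTimeHeartbeat at {DateTime.UtcNow:HH:mm:ss.fff}");
    }
}
```

If the real loop works like this, the interesting question is what delays the tick itself (e.g. timer resolution, thread-pool starvation, or GC pauses on Linux).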
Besides the higher idle CPU usage (2-4% on Linux instead of 0% on Windows when the process shouldn't be doing anything), we didn't notice any other difference.
I've already tried disabling the Linux-specific code (UnixSignal), but the problem still exists.
I'm trying to create a minimal reproduction but haven't been able to pinpoint the origin of this issue yet.
Please let me know if you need more details; I'd appreciate any help.