
Server FPS question


neyoneit


Hi,

 

I have 7 ARK servers on my PC and every one except one is running at 30-ish FPS.
Is there a command or setting I can put into a config file that can force the server FPS higher?

The server CPU is at 50% and RAM usage is around 45%, so there is room to spare; that's why I don't understand why one of my servers is only at 9-11 FPS :/

 

Thanks a lot for an answer or solution to my problem.
Best regards,
neyoneit


My guess is that this is caused by your CPU. What processor is in your server? You may need to go into Task Manager and manually assign each server to use only 2 cores each, and not the same cores either.

For instance, my first server is on Cores 0 and 1, my second server is on Cores 2 and 3, etc.

Aside from that, there are no commands I am aware of that will boost your FPS.

 


59 minutes ago, Thorium said:

I don't even know what you mean by server FPS.

But 50% CPU means your CPU has a hard time running the servers.

He means that if you are in the server and use the showmyadminmanager command, it tells you what "FPS" the server is running at.

ARK servers cap at 30 FPS and should run there at all times. If it drops below that, players will lag and rubber-band like crazy.

 

And yes, 50% CPU is WAY high. I run 4 ARK servers and am usually around 10-15% CPU usage, but I set the affinity like I mentioned.

Also, neyoneit, can you list your server specs (CPU, RAM, hard drive type, and OS)? And can you tell me what your Disk % is in Windows Task Manager (assuming you are running Windows)?


On 14.9.2017 at 8:35 PM, neyoneit said:

Hi,

 

I have 7 ARK servers on my PC and every one except one is running at 30-ish FPS.
Is there a command or setting I can put into a config file that can force the server FPS higher?

The server CPU is at 50% and RAM usage is around 45%, so there is room to spare; that's why I don't understand why one of my servers is only at 9-11 FPS :/

 

Thanks a lot for an answer or solution to my problem.
Best regards,
neyoneit

So you're using Windows servers then :)

The ARK dedicated server is absolutely unoptimized and can't use 100% of the hardware. One ARK server never uses more than 25% of the whole CPU. Most of the load is on the first CPU core.

So the solution for you is to set affinities (i.e. which ARK server should use which CPU cores). Change the affinity of your low-FPS server to CPU cores that don't have much load. Move this server away from Core 0, where most of your load currently is.


4 hours ago, Toni said:

The ARK dedicated server is absolutely unoptimized and can't use 100% of the hardware. One ARK server never uses more than 25% of the whole CPU. Most of the load is on the first CPU core.

I don't think it's that unoptimized. I think it uses only one core by design. That makes sense, because typically you run multiple servers on one machine. By getting rid of the multi-threading overhead, it should actually run better this way if you run multiple servers at once.


I have 6 servers atm, with approximately 100 players on them in total.

If you run even 10 servers with 1-5 players on them, your CPU won't even notice anything, to be honest...

At this moment there are around 40-50 players on them.

[screenshot attached]

On 19. 9. 2017 at 6:07 PM, LockeCPM4 said:

And yes, 50% CPU is WAY high. I run 4 ARK servers and am usually around 10-15% CPU usage, but I set the affinity like I mentioned.

Also, neyoneit, can you list your server specs (CPU, RAM, hard drive type, and OS)? And can you tell me what your Disk % is in Windows Task Manager (assuming you are running Windows)?

CPU: i7-7800X
RAM: 3x 16 GB DDR4 2600 MHz
OS: Windows 10 Pro 64-bit
Disk: M.2 WD Black 256 GB, and one SSD

 

3 hours ago, Thorium said:

I don't think it's that unoptimized. I think it uses only one core by design. That makes sense, because typically you run multiple servers on one machine. By getting rid of the multi-threading overhead, it should actually run better this way if you run multiple servers at once.

Check the picture I've attached... it seems to be OK, in my opinion.

[screenshot of CPU usage attached]


13 hours ago, Thorium said:

I don't think it's that unoptimized. I think it uses only one core by design. That makes sense, because typically you run multiple servers on one machine. By getting rid of the multi-threading overhead, it should actually run better this way if you run multiple servers at once.

It is unoptimized... otherwise it would use Core 0 (single-core design) at 100%, not only at about 70-80%.

The server's FPS starts to drop under 30 FPS before Core 0's usage reaches 100%. That's simply bad programming. I understand the single-core design (because of dino AI and things like that), but it could at least be optimized to use 100% of the first core.

And btw... the standard recommended Unreal Engine FPS is 40, not 30...

If you run a single server on a quad-core CPU, then overall CPU usage never goes over 25% (70% on Core 0, 10-15% on the other cores). That's simply not optimized. There is no other game whose dedicated server uses the hardware so badly.

Edit: When you want to run multiple servers on one machine, you can split the cores by setting the affinity when you start each server, so this isn't something that needs to be handled by the program design. Still, a program should ALWAYS use the full hardware power. There are 13,000 private servers and only a few hundred official ones, so the game shouldn't be optimized for dual-core virtual machines... it should use as many cores as are available.


2 hours ago, Toni said:

Edit: When you want to run multiple servers on one machine, you can split the cores by setting the affinity when you start each server, so this isn't something that needs to be handled by the program design. Still, a program should ALWAYS use the full hardware power. There are 13,000 private servers and only a few hundred official ones, so the game shouldn't be optimized for dual-core virtual machines... it should use as many cores as are available.

I see where you are coming from on not utilizing 100% of one core. You are right, it should.

However, you are not correct about fixing multi-threading by adjusting affinity masks. The programmer needs to put a lot of work into the design, because it's always a trade-off: the more cores you utilize, the more overhead you have for thread management, and you will get stalls because one thread is waiting for another to finish. It's actually quite complicated to optimize multi-threading. You need to take care of race conditions and deadlocks. It's easy and straightforward for some tasks, but it can be incredibly complicated for tasks that have to be processed in sequence. Especially in games it's very complicated, and it often turns out you actually lose performance instead of gaining it.

The point is, if you optimize for a single core, it will run faster on a single core than if it were optimized for multiple cores. Especially for game servers this can be a significant difference in performance per core.


7 servers with a 6-core CPU?

I can see that you are using hyperthreading to your advantage to overcome the problem of needing 2 threads per server. That's an interesting idea. Personally I want 2 real cores per server, and I turn SMT off. Your setup is one where a server uses the real core to do the world simulation, and the second hardware thread then takes its turn at the CPU to calculate players, aka hyperthreading. But your CPU only has 6 cores, meaning 6 servers; you are running a 7th server, which is sharing a core with another server. That might be where your problem is.


On 21.9.2017 at 4:38 PM, neyoneit said:

I was hoping for some suggestions or tips on my last post...

I already suggested the solution for you: set affinity for your servers.

Server 1 and 2 --> Core 0 to Core 3

Server 3 and 4 --> Core 4 to Core 7

Server 5 and 6 --> Core 8 to Core 11

 

Have a look here for how to configure the affinities at startup of your servers:

https://stackoverflow.com/questions/19187241/change-affinity-of-process-with-windows-script
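For example (a minimal sketch, not taken from this thread): on Windows you can set the affinity mask in the batch file that launches each server, using the built-in start /affinity switch. The install paths, map, session names, ports and extra launch flags below are placeholders; substitute whatever you already use to start your servers. The hex masks follow the core split above:

@echo off
rem Affinity masks are hex bitmasks over logical cores:
rem F = cores 0-3, F0 = cores 4-7, F00 = cores 8-11.
cd /d "C:\ArkServers\Server1\ShooterGame\Binaries\Win64"
start "ArkServer1" /affinity F ShooterGameServer.exe TheIsland?listen?SessionName=Server1?Port=7777?QueryPort=27015 -server -log

cd /d "C:\ArkServers\Server3\ShooterGame\Binaries\Win64"
start "ArkServer3" /affinity F0 ShooterGameServer.exe TheIsland?listen?SessionName=Server3?Port=7779?QueryPort=27017 -server -log

You can then check in Task Manager (Details tab, right-click a ShooterGameServer.exe process, "Set affinity") that each server really ended up on the cores you expected.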

 


I believe I mentioned this before... I guess it wasn't here on the ARK forums, but your CPU provides minimal gains for your ARK server. I've had very old machines hosting ARK servers that ran fantastic. I've spent numerous hours hosting ARK servers since ARK came out; more than half of those hours were spent tweaking the engine, with several non-ARK developers working with me to increase server performance.

It took hours upon hours of tinkering to achieve 450+ FPS on an ARK server. However, it does put a much heavier load on your CPU and uses significantly more RAM.

My highest achieved FPS on an empty ARK server is 1248:

[screenshot]

My highest achieved working FPS with 47 people online is just over 400:

[screenshot]

All this talk about server affinity and server cores and the highest CPU GHz is mostly rubbish. Having a super epic CPU provides minimal gains. I would NOT recommend hosting many ARK servers on the same PC. The player experience will quickly decline.


On 4.10.2017 at 10:20 PM, Karpetbomb said:

I believe I mentioned this before... I guess it wasn't here on the ARK forums, but your CPU provides minimal gains for your ARK server. I've had very old machines hosting ARK servers that ran fantastic. I've spent numerous hours hosting ARK servers since ARK came out; more than half of those hours were spent tweaking the engine, with several non-ARK developers working with me to increase server performance.

It took hours upon hours of tinkering to achieve 450+ FPS on an ARK server. However, it does put a much heavier load on your CPU and uses significantly more RAM.

My highest achieved FPS on an empty ARK server is 1248:

[screenshot]

My highest achieved working FPS with 47 people online is just over 400:

[screenshot]

All this talk about server affinity and server cores and the highest CPU GHz is mostly rubbish. Having a super epic CPU provides minimal gains. I would NOT recommend hosting many ARK servers on the same PC. The player experience will quickly decline.

Lool... running an ARK server at 400 FPS when people only get about 30-80 FPS on the client side is just senseless. It doesn't help anything if the server can answer faster than the clients are asking.

Anything over 100 FPS is simply wasted power costs...

BTW: The default UE FPS is 40, which was lowered to 30 by Wildcard for some reason. If a higher FPS helped anything, Wildcard would already have changed it on their own servers and in the base game config.

BTW2: Not even BF1, which is a high-FPS shooter, runs at such an FPS. BF1 servers run at 60 FPS.

BTW3: A better CPU drastically increases the server's FPS when many players are connected. i7 3770K -> i7 6700K is about a 50% performance gain!


  • 10 months later...

Okay, I can maybe explain some of the confusion about what server FPS is and is not.

First off, calling it "server fps" is a decision made by the engine devs, i.e. the Unreal Engine. Fun fact: it is also called this in the Conan Exiles server, which runs on the same engine.

However, what is meant by "server fps" is usually described as "tick rate"; it determines how long the server takes to react.
 

Quote

 

Note:
Tick rate

Tick rate is the frequency with which the server updates the game state. This is measured in Hertz. When a server has a tick rate of 64, it means that it is capable of sending packets to clients at most 64 times per second. These packets contain updates to the game state, including things like player and object locations. The length of a tick is just its duration in milliseconds. For example, 64 tick would be 15.6ms, 20 tick would be 50ms, 10 tick 100ms, etc.

 


Now that we've got that straightened out: there has to be a limit on server FPS (tick rate), since game code "can be" dependent on the server running at a predefined max rate. The game could become wonky if it is not coded correctly. There is also the fact that if the server FPS were unlimited, it could stall the Windows or Linux machine running the game server.

When you change the server FPS in ARK, you change the amount of single-core performance that can be used, and since ARK's Unreal Engine build does not run as a truly multi-threaded engine (the threads run in serial), having a high CPU clock is really important for ARK servers. In practice, an ARK server running at 30 server FPS on a 3 GHz CPU core will use about 18 to 20% at most per core if you are running on a quad-core CPU. I would expect that raising the server FPS gives you better performance, but also uses more of your CPU power per core.

Just a note: server FPS (tick rate) has nothing to do with client FPS (frame rate), so get that out of your dirty mind!

Where stuff gets bad with ARK servers is that they hide some of the important engine commands, either because they think we are too stupid to use them or because the devs did not code some of the mechanics right. For example, in Conan Exiles the NPC turn rate gets broken when you tinker with the server FPS. I have not yet tested this in ARK, so I am not sure.

I would, however, also like to test this in ARK. I found this information that might help you:

Lines to add to ShooterGame\Saved\Config\WindowsServer\Engine.ini:

[/script/onlinesubsystemutils.ipnetdriver]
NetServerMaxTickRate=30

Please do test this change on a test server before using it! Setting it to 60 should set the new server FPS max to 60.

The full engine.ini command overview:

Quote

lines to add into ShooterGame\Saved\Config\WindowsServer\Engine.ini (LinuxServer or whatever)

[/script/onlinesubsystemutils.ipnetdriver]
NetServerMaxTickRate=XX
LanServerMaxTickRate=XX
MaxClientRate=1048576 (in bytes per second)
MaxInternetClientRate=1048576 (in bytes per second)
MinClientRate=1048576 (in bytes per second)
MinInternetClientRate=1048576 (in bytes per second)

[/script/engine.player]
ConfiguredInternetSpeed=1048576 (in bytes per second)
ConfiguredLanSpeed=1048576 (in bytes per second)

Settings in the game.ini:

Quote

lines to add into ShooterGame\Saved\Config\WindowsServer\Game.ini (LinuxServer or whatever)

[/script/engine.gamenetworkmanager]
TotalNetBandwidth=1048576
MaxDynamicBandwidth=149796
MinDynamicBandwidth=6990

Again test before use!

Gl hf guys.


5 hours ago, Sphere said:

" Rubs hands together "

I can't wait to try this. I wonder what would happen if I changed the tick rate.

Please post your findings Sphere.

I might be wrong, but I get the feeling these options are picked up by the Unreal engine rather than by Ark itself.  

 

There's some discussion here: https://allarsblog.com/2016/02/25/basicsteamintegration/ but note that this guy uses different section headings, so I'm sceptical as to whether editing the [/script/onlinesubsystemutils.ipnetdriver] section will have an effect on ARK.


4 hours ago, FightzGamer said:

This guy can now confirm that NetServerMaxTickRate=XX works on my ARK server, hehe.

I still need to test whether the network rates work, but it appears to be running quite well so far.

What value did you use?


Archived

This topic is now archived and is closed to further replies.
