
Shrinking the size of official saves (PC)


arkark

I haven't been on the forums for a while, but these past couple of days I've been working on a little utility to clean up the maps and preserve that little piece of history that is my own bases and those of my friends on the official servers.

Some examples of how it turned out:

Server        | Size on disk      | Memory usage                            | Server save time
              | before    after   | before   after   after wild dino wipe   | before   after
--------------+-------------------+-----------------------------------------+-----------------
Ragnarok 210  | 1.6 GB    400 MB  | 18 GB    10 GB   2.4 GB                 | 25 s     0.7 s
Gen2 1008     | 2.5 GB    187 MB  | 24 GB    12 GB   3.1 GB                 | 35 s     0.4 s

 

Mainly I have done this because having to handle so much data that we don't need isn't very environmentally friendly; it's a waste of energy and computing resources. (Wildcard, which from what I see has employees who do activism on Twitter on a consistent basis, should have been more careful about this and more responsible, facilitating it themselves.) It also matters for monthly expenses: if we deploy the servers at an external provider like Nitrado, they will, rightly, want to charge us for a giant server.

My utility extracts information from the save and removes everything that is not related to a list of tribes I want to preserve. It is almost fully automatic, relatively fast, and safe.
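To illustrate the idea (this is not arkark's actual code), here is a minimal sketch of an allow-list filter, assuming the save has already been exported to JSON as a list of world objects; the `TargetingTeam` key name and the tribe IDs below are assumptions about the schema:

```python
# Minimal sketch of an allow-list filter over an exported save (assumed schema).
KEEP_TRIBES = {1234567890, 1987654321}  # hypothetical tribe IDs to preserve

def filter_objects(objects, keep_tribes):
    """Keep objects owned by an allow-listed tribe, plus unowned
    objects (wild dinos, map actors) that carry no tribe field."""
    kept = []
    for obj in objects:
        tribe_id = obj.get("TargetingTeam")  # assumed name of the tribe-ownership key
        if tribe_id is None or tribe_id in keep_tribes:
            kept.append(obj)
    return kept
```

The real tool presumably does much more (cleaning up cross-references between objects, rewriting the binary save), but the core decision per object is this kind of allow-list test.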

I don't know if there are people interested, or if there are other similar methods to clean a server almost unattended while guaranteeing that everything gets cleaned. From what I researched before writing my own utility, the existing tools were only compatible with saves prior to version 11. One alternative is to let everything on the server decay, but that is very inefficient and slow (80+ in-game days, due to structures like Tek dedicated storage), and it can also cause problems if we want to do it quickly while maintaining several bases that have to be refreshed every few decay cycles.

If I see a lot of interest I might upload my utility for anyone who wants to use it, or create a discord server to support it.

PS: While doing this I have seen some absolutely crazy things. For example, on Ragnarok 210 there was one tribe with over 68,000 structures, and another with over 16,000 cryoed dinos. It explains a lot about why the servers were performing the way they were. It makes you wonder whether the developers shouldn't add some parameter on the official servers to limit this.


@arkark - Interesting, how does it clean things up? 

I have an app that will at least load the save data and show what is in there to help with manual clean-up, but I wasn't aware there was any reliable way to write the data back out without breaking more things. So it's all read-only, and it provides copy/paste commands with pre-populated tribe IDs etc. to clean things up manually in-game.

https://survivetheark.com/index.php?/forums/topic/491802-admin-trusted-player-tool-game-save-visualiser/

Edit: Guessing you use something similar but then RCON onto the server to issue the commands without being in-game?

Edited by MirageUK

Vault/storage box contents can also impact memory usage. We were always told cryopods have a lower memory impact than structures, vault contents, and of course tames out of pods, and I noticed this a bit while doing some test cleanups: save file size isn't really proportional to memory usage at runtime. Offline Raid Protection also helps a lot. Deleting a tribe with a "small base" but tons of cryofridges may be worthwhile for save size, but deleting a tribe with a gazillion structures and a relatively small number of pods is worth far more at runtime. BTW, Mirage's utility is really helping us take the right steps in deciding what to delete and what not, also in preparation of the final save.

Edited by darkradeon

2 hours ago, darkradeon said:

Vault/storage box contents can also impact memory usage. We were always told cryopods have a lower memory impact than structures, vault contents, and of course tames out of pods, and I noticed this a bit while doing some test cleanups: save file size isn't really proportional to memory usage at runtime. Offline Raid Protection also helps a lot. Deleting a tribe with a "small base" but tons of cryofridges may be worthwhile for save size, but deleting a tribe with a gazillion structures and a relatively small number of pods is worth far more at runtime. BTW, Mirage's utility is really helping us take the right steps in deciding what to delete and what not, also in preparation of the final save.

The hope is that structures will have much less impact with Nanite building pieces instead of mesh texturing. Cryopodding less-needed tames (non-breeders/egg layers for kibble) would further improve performance. Anything they can do to improve performance should be the target at launch, because they haven't yet seen true server loads: there was no beta test, only internal testing without the mix of connection speeds that affects server performance. Servers with players on different connection speeds play differently; 30 players at 100 ping each plays very differently from 30 players at 30 ping each.


The server doesn't care about textures, nor about triangles or vertices, aside from clipping/bounding meshes. What we know for sure is that UE5 uses double-precision floating point (64-bit) for the coordinate system versus single precision (32-bit), so I expect memory usage to get worse unless they make radical changes, like compressing other info into bitmasks and removing useless data (who the hell cares about the name of the server where a dino was born?).
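A back-of-the-envelope illustration of the coordinate-width point: doubling the float width doubles the raw size of every stored transform. A quick check with Python's struct module (the six-float transform layout is a simplifying assumption, not the actual save format):

```python
import struct

# Per-actor transform: position (x, y, z) plus rotation (pitch, yaw, roll).
ue4_transform = struct.calcsize("<6f")  # single precision (UE4-style)
ue5_transform = struct.calcsize("<6d")  # double precision (UE5 large-world coordinates)

print(ue4_transform, ue5_transform)  # 24 vs 48 bytes per transform
# Over a million stored actors, that is ~24 MB of extra data from
# coordinates alone, before any other per-object fields.
```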


5 hours ago, MirageUK said:

@arkark - Interesting, how does it clean things up? 

I have an app that will at least load the save data and show what is in there to help with manual clean-up, but I wasn't aware there was any reliable way to write the data back out without breaking more things. So it's all read-only, and it provides copy/paste commands with pre-populated tribe IDs etc. to clean things up manually in-game.

https://survivetheark.com/index.php?/forums/topic/491802-admin-trusted-player-tool-game-save-visualiser/

Edit: Guessing you use something similar but then RCON onto the server to issue the commands without being in-game?

Yes, I use the -parseservertojson option, then either RCON or the 'exec' command (depending on the tribe size) to destroy the stuff.


51 minutes ago, arkark said:

Yes, I've seen some JSONs up to 0.5GB but haven't really struggled when parsing them. I extract all the info from them in about 10-15 secs.

try to parse a big official server save :V

I tried that with a relatively small-to-medium legacy save, and it was a very slow process; the file produced was pretty big.

It's a nice debug feature for mods developers for sure.


26 minutes ago, darkradeon said:

try to parse a big official server save :V

I tried that with a relatively small-to-medium legacy save, and it was a very slow process; the file produced was pretty big.

It's a nice debug feature for mods developers for sure.

Which server number? I want to see how much time it takes.


9 hours ago, darkradeon said:

Thanks. The largest one I've completed is the one at position 44, with 2.4GB. There might be others slightly larger (up to 2.8 or 3.0GB) that are feasible given my computer's specs (32GB of RAM). I tried with Valguero 544 and my computer crashed after 3 hours, struggling with 79GB of virtual memory in use. So, with my specs, I would need a different approach or just more RAM. Before discovering the JSON server option I wrote another program to parse the save file and extract all the tribe IDs. However, unlike with the JSON, I didn't study the save file format in enough depth to determine how many structures each tribe has.

Anyway, out of almost 1,900 servers only 15 would be problematic for someone with 32GB of RAM. Most of that top 15 are PVP servers, so I assume the vast majority of structures and dinos belong to each server's alpha tribe.
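When the exported JSON is too big for json.load() and the available RAM, one workaround is a streamed scan that pulls out only the fields of interest instead of building the whole object tree. A rough sketch, assuming tribe ownership appears under a "TargetingTeam" key (an assumption about the export's schema):

```python
import re

def scan_tribe_ids(path, chunk_size=1 << 20):
    """Collect tribe IDs from a huge JSON export without loading it all."""
    pattern = re.compile(rb'"TargetingTeam"\s*:\s*(\d+)')
    tribe_ids = set()
    tail = b""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            buf = tail + chunk
            tribe_ids.update(int(m.group(1)) for m in pattern.finditer(buf))
            tail = buf[-64:]  # overlap so a match split across chunks isn't lost
    return tribe_ids
```

This trades generality for constant memory: it never holds more than one chunk at a time, so even a multi-gigabyte export stays within a few MB of RAM.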

Edited by arkark

  • 1 month later...
On 10/17/2023 at 9:40 PM, Danu09 said:

Can you share it please? So I can play my Gen2 server saves.

Thank you

sure,

https://github.com/alez-repos/shrink-ark-saves

It can be a bit complex if you are not familiar with the subject. If you get stuck at any point, let me know and I'll give you a hand and improve the documentation.


  • 2 weeks later...
15 hours ago, Danu09 said:

Is there any other way, besides starting the server or logging in to the game, to clean or reduce the save files? I only have 16GB of RAM. Running the save file manually in-game crashed, and using Ark Server Manager crashed as well.

As far as I know, a tool that can do this completely offline doesn't exist (yet).

