Guild icon
DDraceNetwork
Development / developer
Development discussion. Logged to https://ddnet.tw/irclogs/. Connected with DDNet's IRC channel, Matrix room and GitHub repositories — IRC: #ddnet on Quakenet | Matrix: #ddnet-developer:matrix.org | GitHub: https://github.com/ddnet
Between 2022-01-05 00:00:00Z and 2022-01-06 00:00:00Z
Avatar
3805921 Version 15.8.1 - def-
2345967 Merge #4542 - bors[bot]
Avatar
7ce5b24 Revert "Support 32bit color depth, default to it (fixes #4549)" - def-
3b3b2c7 Editor: Don't react to server settings shortcuts when dialog is open - def-
118ae1b Merge #4551 #4557 - bors[bot]
Avatar
@heinrich5991 im doing the tar zstd thing
13:10
sadly doing it streaming loses parallelization
13:10
but i could parallelize it per day instead of per file
13:10
but i dont think i have enough ram to hold all that
13:10
xd
13:11
actually maybe i can
13:12
or maybe not
13:13
Construct an iterator over the entries in this archive.
Note that care must be taken to consider each entry within an archive in sequence. If entries are processed out of sequence (from what the iterator returns), then the contents read for each entry may be corrupted.
13:16
Hi there. I'm in the process of performance improvement for rustup, and have settled on extracting tars in parallel, having a POC showing its effectiveness on Windows. Currently, the tar pa...
13:28
12 cores = too much memory i guess
13:31
damn my pc froze and had to hard reset
13:32
idk how to do this streaming properly, from my understanding you need to decompress everything first to get the tar files
Avatar
zstd can decompress in a stream as well, no?
Avatar
@Learath2 ye but how do i know when to stop to get the entire tar entry
13:33
or im misunderstanding this
Avatar
Hm, idk enough about tar to know this but isn't there some entry header telling you when to stop?
13:34
currently im using decode_all
13:34
let me check one thing
13:34
xd
13:35
ok
13:35
i think i figured it out
Avatar
I'd probably do something like decompress one header -> parse it to learn when to stop -> decompress entire file -> parse file -> repeat
13:36
this should work
13:36
so beautiful
13:36
yep it does
Avatar
Oh that's even nicer
Avatar
12 cores
Avatar
Good API design
Avatar
yes the io api from rust is beautiful
13:36
read write
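For reference, the Read chaining being praised here looks roughly like this (a minimal sketch using the zstd and tar crates; the file name and error handling are made up):
```rust
use std::fs::File;
use std::io::Read;

// Minimal sketch: stream a .tar.zst archive entry by entry without
// decompressing the whole thing into memory first. The file name is
// hypothetical; errors are collapsed into Box<dyn Error>.
fn main() -> Result<(), Box<dyn std::error::Error>> {
    let file = File::open("2022-01-05.tar.zst")?;
    // zstd::Decoder implements Read, so tar::Archive can consume it directly.
    let decoder = zstd::Decoder::new(file)?;
    let mut archive = tar::Archive::new(decoder);

    // entries() has to be consumed in order; each Entry is itself a Read
    // limited to that entry's contents, so the tar header tells the reader
    // where to stop.
    for entry in archive.entries()? {
        let mut entry = entry?;
        let mut json = String::new();
        entry.read_to_string(&mut json)?;
        // parse `json` here
    }
    Ok(())
}
```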
Avatar
Does it even parallelize that well? I'd expect the decompression to completely overwhelm the time it takes to parse
Avatar
im at 3gb ram (idk how much i had when the script wasnt running)
13:37
before i got to 16gb and swapped and crashed
13:37
xd
Avatar
Avatar
Learath2
Does it even parallelize that well? I'd expect the decompression to completely overwhelm the time it takes to parse
im parallelizing by day
13:37
not by file
13:37
each day has 17k json files
13:37
13:38
it still takes quite some time xd
Avatar
What is a par_bridge? 😄
Avatar
@Learath2 a rayon trait
13:39
it turns any iterator into a parallel iterator
13:39
using a thread pool
13:39
magic
13:39
its thread safe thanks to rust
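Roughly what that looks like (an illustrative sketch, not the actual script):
```rust
use rayon::iter::{ParallelBridge, ParallelIterator};

fn main() {
    // Any Iterator whose items are Send can be bridged onto rayon's
    // thread pool; items are handed to worker threads as they are produced.
    let total: u64 = (0..1_000_000u64)
        .filter(|n| n % 3 == 0) // still an ordinary, lazy iterator
        .par_bridge()           // from here on, work runs in parallel
        .map(|n| n * n)
        .sum();
    println!("{}", total);
}
```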
Avatar
Looks like a cool library
Avatar
hm maybe i should check my cpu thermal paste xd
13:43
100% for a long time gets to 86C
13:43
ryzen 5600x
14:14
hmm idk it takes a stupidly long time to parse 1 day
14:15
when i did it before, when all the files were decompressed in a folder, it processed everything in 3 seconds (concurrently)
14:15
maybe streaming is not that efficient
Avatar
Um, how did you even have the entire thing decompressed? It would be insanely large, no?
Avatar
The duration it takes to parse an entire day shouldn't have changed, so maybe your parallel iterator is doing sth wrong?
Avatar
ye im looking into it
14:20
im parallelizing the wrong thing
Avatar
Try on a 1-hour basis? And sum up the data you want after 1 day
14:22
Whats the data about?
14:22
Teehistorian data?
14:25
each tar file has a json file for every minute(?)
14:25
around 17k json files
Avatar
So my guess is you wanna summarize / visualize server stats?
Avatar
Smaller time intervals could help you utilize the cpu better
14:26
im making this plot
14:26
for every day
14:27
generating the plot isnt the issue here i think
14:27
its reading the data in an efficient way
Avatar
Mhmmm, do you need the data in such high detail?
Avatar
the thing is to get to the next minute
14:28
i have to iterate through every second
14:28
well 5 seconds
14:28
cuz there is a file every 5 seconds
14:28
and they are sequential
14:29
the thing is processing the data in each file is not the issue
14:29
its reading those
14:29
there are just too many
14:29
well ill try to skip
14:29
maybe it improves
Avatar
It should help a bit at least
Avatar
Get rid of json & go for msgpack ( if you wanna rely on key-value patterns )
14:30
here is the code
14:30
@Avolicious i dont have control over the data origin xd
14:30
i mean i dont decide whether its json
Avatar
Avatar
Ryozuki
@Avolicious i dont have control over the data origin xd
Yeah, parse the json data & write it in msgpack xD
14:31
So you can skip a lot of overhead
Avatar
but the overhead is not from it being json
14:31
i think
Avatar
JSON adds a bunch of overhead
Avatar
most cpu time is spent decompressing
Avatar
Avatar
Ryozuki
Click to see attachment 🖼️
look here
14:31
the flamegraph
Avatar
But CPU shouldnt be the issue, or is it time relevant?
14:31
It looks like the core problem is RAM
14:32
If you open too many files, too much data will be loaded & ram gets killed
Avatar
@Ryozuki This flamegraph is for the main thread, right?
Avatar
ram is a non-issue cuz its streaming
Avatar
How busy are your cores?
Avatar
Avatar
Ryozuki
ram is a non-issue cuz its streaming
But its not time relevant, just leave it open
14:32
Keep it running for 3 - 4 hours
14:33
xd
14:33
@Learath2 a lot
14:33
monkalaugh
Avatar
Okay, so maybe take a look at the threads flamegraphs?
14:34
It might be that the children are just not parsing these fast enough, in that case how fast the decompression goes wouldn't really matter
14:34
14:34
open it in the desktop app
14:34
it allows clicking
14:34
xd
14:35
idk how to check threads
14:35
i just run this CARGO_PROFILE_RELEASE_DEBUG=true cargo flamegraph
14:35
xd
14:38
im gonna try to filter to every min instead of every 5 secs
Avatar
@Ryozuki you could try to make the children do nothing and see how fast that goes
14:43
Hm, the goal is to basically always have another chunk of data ready when a child thread is free
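One common shape for that is a channel between the sequential reader and a small worker pool (a sketch with dummy buffers standing in for tar entry bytes; worker count and sizes are made up):
```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

fn main() {
    // Single producer (the sequential tar/zstd reader) hands raw entry
    // bytes to a few worker threads that do the parsing.
    let (tx, rx) = mpsc::channel::<Vec<u8>>();
    let rx = Arc::new(Mutex::new(rx));

    let workers: Vec<_> = (0..4)
        .map(|_| {
            let rx = Arc::clone(&rx);
            thread::spawn(move || loop {
                // Take the next chunk; stop when the channel is closed.
                let chunk = match rx.lock().unwrap().recv() {
                    Ok(c) => c,
                    Err(_) => break,
                };
                // parse_json(&chunk) would go here
                let _ = chunk.len();
            })
        })
        .collect();

    // Producer side: in the real code this loop would walk archive.entries().
    for i in 0..100u8 {
        tx.send(vec![i; 1024]).unwrap();
    }
    drop(tx); // closing the sender lets the workers exit

    for w in workers {
        w.join().unwrap();
    }
}
```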
Avatar
skipping 12 files seems to make it faster
14:44
if a file exists every 5 seconds 5 * 12 = 60 seconds
14:44
so its every 1 minute
14:44
why im so smart
14:45
14:45
this is way faster
14:45
wait i can skip 1 map xd
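The sampling described above can sit directly on the entries iterator (a sketch; the once-per-minute stride comes from the discussion, the path and return shape are made up):
```rust
use std::fs::File;
use std::io::Read;

// Sketch: keep roughly one snapshot per minute. With one JSON file every
// 5 seconds, taking every 12th entry gives 5 * 12 = 60 seconds per sample.
fn sample_per_minute(path: &str) -> Result<Vec<Vec<u8>>, Box<dyn std::error::Error>> {
    let decoder = zstd::Decoder::new(File::open(path)?)?;
    let mut archive = tar::Archive::new(decoder);
    let mut samples = Vec::new();

    for (i, entry) in archive.entries()?.enumerate() {
        let mut entry = entry?;
        if i % 12 != 0 {
            continue; // skipped entries are never parsed, only stepped over
        }
        let mut buf = Vec::new();
        entry.read_to_end(&mut buf)?;
        samples.push(buf); // parse later, or right here
    }
    Ok(samples)
}
```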
Avatar
Is this skipping in the child threads?
14:46
yeah
14:46
i cant parallelize this entries()
14:46
because its not Send
Avatar
If that is helping, the problem wasn't the speed of decompression
Avatar
decompressing in stream and that
14:47
maybe thats true
14:47
maybe its the json parsing
14:47
thats the slow thing
14:47
but
14:47
ye i guess
Avatar
I remember reading some performance issues with buffered readers and serde_json, maybe research that a bit?
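If it is the issue usually mentioned, the workaround is to read each entry fully into a buffer and parse with from_slice instead of from_reader (a sketch with a hypothetical MinuteSnapshot type):
```rust
use std::io::Read;

use serde::Deserialize;

// Hypothetical shape; only the fields actually used need to exist here.
#[derive(Deserialize)]
struct MinuteSnapshot {
    servers: Vec<serde_json::Value>,
}

fn parse_entry<R: Read>(mut entry: R) -> Result<MinuteSnapshot, Box<dyn std::error::Error>> {
    // Reading everything first and parsing from the slice tends to be
    // faster than serde_json::from_reader, which works byte by byte.
    let mut buf = Vec::new();
    entry.read_to_end(&mut buf)?;
    Ok(serde_json::from_slice(&buf)?)
}
```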
Avatar
i could hack it and do a regex match
14:48
xd
Avatar
Avatar
Ryozuki
i could hack it and do a regex match
This is a cheat that I sometimes resort to, it's not very elegant but it does work
Avatar
but that wont be future proof
14:49
cuz rn i only plot max players
14:49
but i may want to plot more stuff
14:49
and then it will look horrible
Avatar
Yep, it's really something you do when you want quick results one time
Avatar
@Learath2 why was json the chosen format to serve this info?
Avatar
It's extremely cross-compatible I guess. People can easily make webtools for it
Avatar
I personally would have gone for a completely custom format
Avatar
i wonder if protobuf fits well in this
14:51
xd
14:53
it takes about 10-20 secs now
14:53
to generate an image
14:54
14:54
this is using every minute
14:54
maybe i can do it every 5 minutes
Avatar
I wonder if an efficient json parser like jq exists. Parsing data you won't use isn't very efficient
14:55
e.g. you basically only use the players array, it could technically discard everything else possibly avoiding allocations
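serde already gives part of this: fields that aren't declared on the target struct are skipped rather than stored, so a struct with only the needed fields avoids most of the allocations (field names below are guesses, not the real master server schema):
```rust
use serde::Deserialize;

// Only the fields the plot needs; everything else in the JSON is skipped
// by serde without being stored. Names here are illustrative.
#[derive(Deserialize)]
struct Snapshot {
    servers: Vec<ServerEntry>,
}

#[derive(Deserialize)]
struct ServerEntry {
    clients: Vec<Client>,
}

#[derive(Deserialize)]
struct Client {
    name: String,
}

fn player_count(json: &[u8]) -> Result<usize, serde_json::Error> {
    let snapshot: Snapshot = serde_json::from_slice(json)?;
    Ok(snapshot.servers.iter().map(|s| s.clients.len()).sum())
}
```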
14:56
every 5 minutes
14:56
hardly any difference
14:56
and way faster
14:56
now it takes like 3 seconds
14:57
so its definitely the parsing
Avatar
it could also be your plotting lib
Avatar
oh maybe
Avatar
that's why I wanted a flamegraph on a child thread, I was curious about that 😄
Avatar
but i couldnt see anything on the flamegraph
Avatar
I guess you could parse the file and just not do anything with the result, see how much of a difference that makes
Avatar
don't skip too many so the difference is obv
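A crude way to separate the two costs is to time the parse and the plot independently (a sketch around hypothetical stand-in functions):
```rust
use std::time::Instant;

// Hypothetical helpers standing in for the real parsing and plotting code.
fn parse_all_entries() -> Vec<u64> { vec![0; 1000] }
fn draw_plot(_data: &[u64]) {}

fn main() {
    let t = Instant::now();
    let data = parse_all_entries();
    eprintln!("parse: {:?}", t.elapsed());

    let t = Instant::now();
    draw_plot(&data);
    eprintln!("plot:  {:?}", t.elapsed());
}
```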
Avatar
nah its not the plot
15:00
ill try not deserializing and just giving a number
15:00
yep
15:00
its the deserialization
15:01
that takes way too much
15:02
simd-json is a rust port of the simdjson c++ library. It follows most of the design closely with a few exceptions to make it better fit into the rust ecosystem.
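Swapping it in is close to a one-line change, apart from the mutable input buffer it wants (a sketch reusing a hypothetical snapshot struct; the target-cpu flag mentioned is the usual way to enable the native instruction set):
```rust
use serde::Deserialize;

// Same idea as the serde_json structs above; field names are still
// illustrative guesses rather than the real schema.
#[derive(Deserialize)]
struct Snapshot {
    servers: Vec<ServerEntry>,
}

#[derive(Deserialize)]
struct ServerEntry {
    clients: Vec<Client>,
}

#[derive(Deserialize)]
struct Client {
    name: String,
}

// simd-json parses the buffer in place, so it takes &mut [u8] instead of
// &[u8]. Building with RUSTFLAGS="-C target-cpu=native" lets it use the
// widest SIMD instructions the local CPU supports.
fn parse(buf: &mut [u8]) -> Result<Snapshot, simd_json::Error> {
    simd_json::from_slice(buf)
}
```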
Avatar
I'm now curious whether one could generate an optimized json parser
15:03
Like say you only wanted the location of all servers, it is much cheaper to just immediately break after you parse the location entry which is near the start of the json object
15:03
i guess i need to enable native arch
15:03
xd
15:05
lol
15:05
indeed it is fast af
15:05
simd_json wins
15:08
15:08
all the images
15:08
since http master existed
Avatar
simdjson is very fast indeed
Avatar
now i just gotta display it on a web page somehow
15:16
all images displayed here
15:16
xd
15:16
i wonder why the time looks fucked on the first images
15:19
15:19
that looks funny
Avatar
broken json generation for a few hours
Avatar
That's when running out of space broke the masterserver a couple days ago 😄
15:20
Train a neural network on the rest of the dataset then use that to reconstruct the missing part
Avatar
This one looks like a player flood attack
15:24
15:24
maybe this is ddos?
Avatar
or just network problems of the master server
Avatar
Plotters - A Rust drawing library focus on data plotting for both WASM and native applications 🦀📈🚀
15:55
@Learath2 this lib looks way better xd
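For reference, a bare-bones plotters chart looks roughly like this (made-up data and output path, just to show the shape of the API):
```rust
use plotters::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical output file and data: minute-of-day on x, player count on y.
    let root = BitMapBackend::new("players.png", (1280, 480)).into_drawing_area();
    root.fill(&WHITE)?;

    let mut chart = ChartBuilder::on(&root)
        .caption("Players per minute", ("sans-serif", 24).into_font())
        .margin(10)
        .x_label_area_size(40)
        .y_label_area_size(40)
        .build_cartesian_2d(0u32..1440u32, 0u32..1200u32)?;

    chart.configure_mesh().draw()?;

    // Stand-in series; the real code would feed the parsed counts here.
    let data = (0..1440u32).map(|m| (m, 600 + (m % 300)));
    chart.draw_series(LineSeries::new(data, &RED))?;

    root.present()?;
    Ok(())
}
```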
16:28
using the new library
16:29
16:29
xd
16:29
i just have to format the dates a bit
16:51
thats it for today greenthing
Avatar
I will try to ask one more time. Idk if it is a bug in the code or something, but whenever i press mouse1, which should fire instantly, it has a few milliseconds of delay. And it happens even on my own server and only with fire, so it should not be high ping. Could you please check if there is a mistake in the code for the mac version? It has been happening since the last update
Avatar
Cellegen | HU 2022-01-05 16:53:55Z
Local delays... hmm pretty sus
18:16
meta is here monkalaugh
18:16
Godot Engine receiving a new grant from Meta's Reality Labs
Avatar
Finally, maybe I can find happiness in VR
😂 1
Avatar
i rly rly hope godot becomes the blender of games
18:17
its on good track i think
18:17
they just need to release 4.0
18:17
which overhauls 3d
Avatar
@Learath2 did u know js has iterators
18:33
but its pointless since most useful functions in arrays like map() dont return iterators
18:33
and arent lazily evaluated like rust
18:33
xd
Avatar
meta exists so zucky can have anime catgirls
Avatar
tell him to share them
18:37
Avatar
i have them irl
18:38
its called schizophrenia
monkaS 5
Avatar
@Ryozuki have you tried looking for a streaming json parser?
Avatar
b3aa169 Add 50 € sponsoring by David*°Villa - def-
Avatar
@heinrich5991 what would that solve?
19:27
memory is not an issue anymore
Avatar
it would potentially solve not looking at data you don't need
19:28
you'd have to try whether it's faster though
Avatar
hmm if you know any
Avatar
@heinrich5991 do you know of a keyword I can google for to look for a json parser I can compile ahead of time?
19:33
Like a regex program of sorts
Avatar
like what jq maybe does?
Avatar
Yeah, I looked into whether libjq exposes that kind of functionality but didn't see it
Avatar
@deen is last year a moving window or fixed on 2021
Avatar
Fixed probably
Avatar
moving window
20:09
last 365 days, same as the last month and week
Avatar
Oh, cool
Avatar
I need to warn every beginner that they need to turn down the volume before joining Sunny Side Up. The default volume should probably be lower.
21:41
feb8633 Remove SQLite dependency on macOS, add DDNet-Server to client app - def-
21594a0 DDNet-Server-Launcher (for macOS): Support running without config - def-
b8fb704 Improve wording - def-
fec2067 Create custom look for DMG using dmgbuild - def-
ccaca2c Add dmg background image by Ravie - def-
0522250 Add tools to dmg so they are available for macOS users too - def-
af89b31 Update ddnet-libs - def-
8ed2147 Update ddnet-libs - def-
d150367 Update ddnet-libs - def-
a5783e4 Merge #4529 #4530 - bors[bot]
Avatar
the last year ryozuki was talking about is on the rank page https://ddnet.tw/ranks/
Avatar
@Learath2 so you want a specialized json info extractor?
22:13
perhaps json xpath or so?
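The nearest built-in thing in serde_json is JSON Pointer (RFC 6901), though it still parses the whole document into a Value first (toy example, not the real server list layout):
```rust
use serde_json::Value;

fn main() -> serde_json::Result<()> {
    // Toy document; the real server list JSON has a different shape.
    let doc: Value = serde_json::from_str(
        r#"{"servers":[{"location":"eu","clients":[{"name":"foo"}]}]}"#,
    )?;

    // A pointer pulls one field out without declaring any Rust structs,
    // but the full document is still parsed into a Value beforehand.
    if let Some(location) = doc.pointer("/servers/0/location") {
        println!("location: {}", location);
    }
    Ok(())
}
```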
Avatar

Checklist

  • [ ] Tested the change ingame
  • [ ] Provided screenshots if it is a visual change
  • [ ] Tested in combination with possibly related configuration options
  • [ ] Written a unit test if it works standalone, system.c especially
  • [ ] Considered possible null pointers and out of bounds array indexing
  • [ ] Changed no physics that affect existing maps
  • [ ] Tested the change with [ASan+UBSan or valgrind's memcheck](https://github.com/ddnet/ddnet/#using-addresssanitizer--u...
Exported 275 message(s)