Guild icon
DDraceNetwork
DDraceNetwork / off-topic
Any languages allowed
Between 2025-01-29 00:00 and 2025-01-30 00:00
Avatar
I'm losing my mind over chinese anime gambling game
04:05
Making plots of the worst case scenarios where I have awful luck and the gamba doesn't pay off
04:05
There's almost a 1% chance that I'll end up at least 30 pulls away from soft pity!
Avatar
It's fun how complex the curves end up. The starting point is just min(1.0, 0.006 + max(0.0, (x - 72) * 0.06)), a piecewise linear function with two pieces (technically three, but the third part is kinda irrelevant). But then you apply random chance a whole lot of times and you get these fun curvy graphs that go up and down all smooth like (edited)
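A minimal Python sketch of the per-pull rate formula quoted above (the 0.6% base rate and the +6%-per-pull ramp after pull 72 are taken from the message; the docstring wording is my interpretation):

```python
def pull_probability(x):
    """Per-pull 5-star chance: 0.6% base rate, plus a linear 'soft pity'
    ramp of +6% per pull past pull 72, capped at 100%."""
    return min(1.0, 0.006 + max(0.0, (x - 72) * 0.06))

# Base rate before soft pity; ramping afterwards; certain by pull 90.
assert pull_probability(1) == 0.006
assert pull_probability(73) == 0.006 + 0.06
assert pull_probability(90) == 1.0
```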
Avatar
lol what
04:54
justatest
Avatar
thoughtful
Avatar
it looks so silly how it's answering itself
Avatar
the future is now
Avatar
Avatar
Tsumugi
it looks so silly how it's answering itself
probably why o1 doesn't show you the CoT
04:58
deepseek doesn't care i guess
04:59
it doesnt know about netherite upgrade
05:00
pepeW
Avatar
can you tell it that it has that item which does this thing, then run the question again?
Avatar
have you tried letting it search the web
Avatar
I would like to see if it can remember your convo in a different chat
Avatar
it probably doesn't
Avatar
you never know
05:01
Also, was there any minecraft version where the netherite upgrades weren't a thing?
Avatar
Avatar
Cellegen
Also, was there any minecraft version where the netherite upgrades weren't a thing?
ye
Avatar
then maybe specify what version you play on
Avatar
alr
Avatar
for me, I had to specify that I'm coding on Godot 4.3, cuz other versions have other systems and logic
Avatar
when i mentioned it now it knows about 1.20
Avatar
I see
Avatar
It just needs more specifics from the looks of it, already way better than OpenAI
05:05
OpenAI tends to hallucinate things, but here the deep thinking model actually makes reasoning
05:05
for fucking free
05:05
and with open sourcing
Avatar
Avatar
Cellegen
for fucking free
it for sure wont be free forever
05:06
justatest
05:06
just as chatgpt
05:06
Avatar
it will be. remember, you can fork their open source project
05:06
and as long as a modified version exists, it'll be free
05:07
you can even run it locally with a decent amount of training size
05:07
heck, I could also implement an API for Godot to use its AI model
Avatar
Avatar
Tsumugi
Click to see attachment 🖼️
Ah yeah, that one is annoying af, servers are getting overloaded. Opening a new chat somewhat fixes it
Avatar
Avatar
Cellegen
I would like to see if it can remember your convo in a different chat
well now it knows about template in different chat but it somehow broke thinking
Avatar
It happens with OpenAI as well, you can regenerate it... but with the recent server stress, it's more annoying to deal with
Avatar
Avatar
Cellegen
you can even run it locally with a decent amount of training size
Yeah, only takes like a quarter million in hardware to run the full version at decent speed :) :)
Avatar
Well, normally yes
05:15
But that means we haven't reached a critical point in power consumption, it's only going to be better from here onwards
Avatar
Avatar
Cellegen
But that means we haven't reached a critical point in power consumption, it's only going to be better from here onwards
There's obviously going to be a very high lower bound for hardware capable of running very good models. You can't run something actually smart on a potato (edited)
Avatar
I don't talk about locally as in, in your home. I talk about hosting a server which handles it for you
05:17
That costs money yes, huge amounts of it in fact. But think about it from a company's perspective, that is a huge deal.
Avatar
And the very best models will always be ludicrously expensive because well funded AI labs have the big bucks and they will use all the resources they're given
Avatar
It's only expensive now, cuz all the big tech AI labs are spending ridiculous amounts of money on it
Avatar
So even if you've got your own company and you can afford a few million on something you're still going to get stomped by the big boys
Avatar
Avatar
Cellegen
It's only expensive now, cuz all the big tech AI labs are spending ridiculous amounts of money on it
And they will keep spending ridiculous amounts of money
Avatar
I'm not saying an ordinary person should do that. But what if Facebook decides to use their open source model, which is free, for their own benefit? Instead of spending even more money to maintain their own?
Avatar
because bigger is better
Avatar
Think about how the monopolies of today's tech can change if they now use a free alternative, which costs nothing for them to use
Avatar
wdym costs nothing to use
05:20
the compute costs heaps
Avatar
costs nothing to develop
05:21
can you tell how much money has been sunk into developing OpenAI?
05:21
not maintaining it, developing it
Avatar
Yeah, actually running finished models is peanuts compared to training
05:22
but even still, you need a lot of money to even run the models at significant scale
Avatar
exactly, consider this: OpenAI spends billions of dollars developing their own stuff. They sell their stuff to other companies for a fee. Then a new free competitor comes up and tells those companies that theirs is free. Suddenly, all the other companies want to use that free version instead of yours
05:23
Boom. they just lost that much money. Billions
05:24
Not in an instant, this will accumulate over time
05:25
They lost the game, completely. Follows up all the other competitors which do the same, or similar thing
05:25
What a wonderful day
Avatar
Avatar
Cellegen
What a wonderful day
I guess...
Avatar
Oh, did I mention that the open source models have no restrictions?
05:29
This actually blew my mind when I read it from an article
Avatar
Avatar
risu
There's obviously going to be a very high lower bound for hardware capable of running very good models. You can't run something actually smart on a potato (edited)
Avatar
I think llama 88 is feasible ye?
Avatar
Avatar
Cellegen
I think llama 88 is feasible ye?
8B
Avatar
me big blind
Avatar
It's nice to have better versions of models that people can actually run on their own hardware
05:58
quite practical
05:59
I'm just sad that this is the kind of area where you can never be at the cutting edge unless you have big corpo assets
Avatar
Avatar
Cellegen
Oh, did I mention that the open source models have no restrictions?
wdym
06:30
not all of them are unfiltered
Avatar
I meant the open sourced deepseek AI models
06:57
Since their largest model is hosted by them on their website and the company is based in the Eastern region, it tends to have restrictions, but their open source models don't have that.
06:58
Can't really verify it myself tho, but I didn't hear anyone saying the opposite, and they can check the open source project themselves (edited)
Avatar
miku
07:22
Avatar
Is it possible to run AI on AMD 12gb vram gpu?
Avatar
i think amd bad for ai
09:50
i tried stable diffusion on my amd once
09:50
it didnt work
09:50
or maybe i did something wrong
Avatar
I own amd so
09:51
It would be nice to get it working
Avatar
everything made for novidea i suppose
Avatar
Avatar
pilonpl
Is it possible to run AI on AMD 12gb vram gpu?
Jupstar ✪ 2025-01-29 10:01
Ofc
10:01
You just need a smaller model
10:02
DeepSeek's first-generation of reasoning models with comparable performance to OpenAI-o1, including six dense models distilled from DeepSeek-R1 based on Llama and Qwen.
10:02
Maybe this works
10:03
If you never used one, install ollama and run ollama run deepseek-r1:8b (edited)
10:04
But be warned, the smaller the model the dumber it is.
10:05
Even 32b will feel stupid compared to free chat gpt
Avatar
BREAKING: President Trump announces the U.S. will be placing tariffs on all semi-conductors and pharmaceuticals imported from 🇹🇼Taiwan in the very near future
10:27
this is why nvidia went down
10:27
insider trading
Avatar
Jupstar ✪ 2025-01-29 10:52
But some of the yes votes can't be trusted either lol
Avatar
Avatar
risu
I'm losing my mind over chinese anime gambling game
How have you been generating them? Monte carlo?
Avatar
Avatar
Ryozuki
this is why nvidia went down
They did it to stop me from accumulating wealth
Avatar
Avatar
Learath2
How have you been generating them? Monte carlo?
Keep a list of states and their probabilities, and at every step go through each tuple and produce every possible resulting state and its corresponding probability
11:58
I checked, the number of states tracked grows linearly each step until it reaches a maximum at a bit over 1k states, then starts linearly decreasing until in the end all paths lead to the final C6 state
12:02
So I can claim that it runs in O(n) even though for a couple hundred pulls it's quadratic at best greenthing
12:06
"At best" I say because my "list of tuples" is actually a Python dict nouis
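The dict-of-states approach described above can be sketched like this in Python (the pity rate is the one from earlier in the chat; the state encoding here is a simplified assumption, tracking only the pity counter and the pull on which the 5-star lands, not the full gacha state):

```python
def pull_probability(pity):
    # Per-pull success chance given the current pity counter (from the chat).
    return min(1.0, 0.006 + max(0.0, (pity - 72) * 0.06))

def step(dist):
    """Advance the exact distribution by one pull.
    `dist` maps state -> probability; states are ('pulling', pity)
    or ('done', total_pulls). Each live state branches into a
    success and a failure outcome, just like the tuple list above."""
    new = {}
    for state, prob in dist.items():
        if state[0] == 'done':                    # absorbed: carry it over
            new[state] = new.get(state, 0.0) + prob
            continue
        pity = state[1]
        p = pull_probability(pity + 1)
        win = ('done', pity + 1)
        new[win] = new.get(win, 0.0) + prob * p
        if p < 1.0:                               # only keep reachable states
            lose = ('pulling', pity + 1)
            new[lose] = new.get(lose, 0.0) + prob * (1.0 - p)
    return new

dist = {('pulling', 0): 1.0}
for _ in range(90):
    dist = step(dist)

# By pull 90 every path has hit the 5-star, and probability mass is conserved.
assert all(s[0] == 'done' for s in dist)
assert abs(sum(dist.values()) - 1.0) < 1e-9
```

The state count grows by at most one live state per step and shrinks once hard pity kicks in, matching the linear growth then decline described in the message.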
Avatar
anyone got any idea of why windows crashes after connecting second display ?
13:25
i use hdmi - vga converters (edited)
13:25
amd gpu
13:25
it just disables image and doesnt work
13:25
after connecting
Avatar
Driver issue? Gpu issue?
Avatar
perchance
Avatar
Making sure the drivers are up to date is the first thing I would check
Avatar
it is
13:26
at first i thought power issue but no
13:27
maybe bad converter
13:27
but on first boot i managed to get image but only on second display
13:27
[0129/162149.937:ERROR:registration_protocol_win.cc(84)] TransactNamedPipe: The pipe has been ended. (0x6D) [0129/162149.940:ERROR:settings.cc(319)] Settings magic is not 1129342067
13:27
oh
13:28
whats that
13:28
on my desktop
13:28
debug.log
Avatar
Doesn't sound very interesting
Avatar
well second display works if using only one
13:34
what could possibly be the problem
Avatar
Avatar
Tsumugi
well second display works if using only one
As in both displays work on their own?
Avatar
i guess?
13:36
they work if only one is connected
Avatar
I would also suspect the hdmi -> vga converters if they are cheap. VGA requires tight timings
Avatar
if i connect two at the same time my pc crashes
Avatar
Avatar
Learath2
I would also suspect the hdmi -> vga converters if they are cheap. VGA requires tight timings
but they work ?
Avatar
If you have two native hdmi monitors I would try those, that would eliminate any conversion weirdness
Avatar
i dont have
13:37
Avatar
cyberfighter 2 2025-01-29 13:37
ew
Avatar
Avatar
Tsumugi
but they work ?
Well there is some tolerance, but maybe when you have two the timing just gets too out of sync
Avatar
maybe i have to set the same resolution
13:38
and everything
Avatar
Also worth trying is going very low resolution
Avatar
Avatar
Tsumugi
maybe i have to set the same resolution
now i have plugged my first display in after using second one and second one stopped working
13:55
alright i reconnected second one and it crashed
13:55
cool
13:55
♿
Avatar
Avatar
Tsumugi
Click to see attachment 🖼️
why does it seem like everyone has that mouse
Avatar
maybe using dvi-vga converter for second display would work?
14:08
@Learath2 what do u think
Avatar
Avatar
Cammodude
why does it seem like everyone has that mouse
i bought it when i was like 12, there was no good options
Avatar
still works fine ?
Avatar
Avatar
Tsumugi
maybe using dvi-vga converter for second display would work?
You can give it a go, I doubt any modern gpu has an analog dvi output, so you'll still have the d->a issue
14:10
you also need to be careful to buy the correct kind of dvi adapter, the passive one wont work unless your gpu is analog dvi capable
Avatar
Avatar
Cammodude
still works fine ?
well no, cuz i spilled some coke and tea on it, and the mouse1 resource ran out
Avatar
damn
14:11
only my dpi switch died
14:11
14:11
glue couldnt hold the rubber anymore so they fell off
Avatar
Avatar
Learath2
you also need to be careful to buy the correct kind of dvi adapter, the passive one wont work unless your gpu is analog dvi capable
what do i have
14:13
i dont know a shit about dvi
14:14
14:16
alr i dont know, gotta go buy another converter then (edited)
14:17
that sucks
Avatar
didnt know it could be this hard to use 2 monitors
Avatar
Avatar
Learath2
I would also suspect the hdmi -> vga converters if they are cheap. VGA requires tight timings
now i see that its "Sync"Masters for a reason
Avatar
can someone send me a ttf (opentype font) for ddnet?
14:33
i need a good font for ddnet
Avatar
Avatar
emrecimke
can someone send me a ttf (opentype font) for ddnet?
u can use any font if u put this in %appdata%\DDNet\fonts\ and change the values to urs
553 bytes
14:37
Avatar
tyty
14:53
14:53
how can i fix this
Avatar
check the local console for details
Avatar
^ that's right
Avatar
Avatar
Overlord
^ that's right
♂S1mple♂ 2025-01-29 15:05
no
15:05
this is right >
Avatar
Avatar
Jupstar ✪
If you never used one, install ollama and run ollama run deepseek-r1:8b (edited)
so 8b is ok for 12gb vram?
Avatar
Jupstar ✪ 2025-01-29 15:16
yes
Avatar
so like what's the limit for 12gb vram
Avatar
Jupstar ✪ 2025-01-29 15:16
that one should be ok
Avatar
and 32 gb ram
15:17
is there some rule of thumb for this?
Avatar
Avatar
♂S1mple♂
this is right >
who are U, so wise in the way of directions
Avatar
Jupstar ✪ 2025-01-29 15:17
theoretically you can also run a 32b on that. but it will have to constantly switch vram with ram, or force the GPU to read from your ram. which makes the model much much slower
Avatar
okay but what model sizes can fit into 12gb?
Avatar
Jupstar ✪ 2025-01-29 15:18
a model that is less than 12gb of size
15:18
plus minus what your OS uses
Avatar
well
15:18
so 8b model has 4gb download size
15:18
does that mean it uses 4gb vram?
Avatar
Jupstar ✪ 2025-01-29 15:18
you can also try the 14b model
Avatar
Avatar
pilonpl
does that mean it uses 4gb vram?
Jupstar ✪ 2025-01-29 15:18
almost
15:19
but i dunno if it's 100% guaranteed
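A rough back-of-envelope for the download-size vs. VRAM question above (the numbers are assumptions: ollama's default tags are roughly 4-bit quantized, so weights take about half a byte per parameter, plus some flat allowance for the KV cache and runtime buffers):

```python
def approx_vram_gb(params_billion, bits_per_weight=4, overhead_gb=1.5):
    """Very rough VRAM estimate: quantized weight size plus a flat
    allowance for KV cache and runtime buffers. Illustrative only."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# An 8B model at 4-bit is ~4 GB of weights (matching the ~4 GB download),
# comfortably inside 12 GB; a 32B model needs ~16 GB for weights alone,
# which is why it spills into system RAM and slows to a crawl.
assert approx_vram_gb(8) < 12
assert approx_vram_gb(32) > 12
```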
Avatar
Jupstar, when will AMD rise to power?
Avatar
Jupstar ✪ 2025-01-29 15:20
As soon as they release a GPU with 2000 TB of VRAM
Avatar
:(
Avatar
Jupstar ✪ 2025-01-29 15:23
I mean is nvidia even impressive rn? they literally just increased the power consumption to get more perf
Avatar
seems like running an LLM locally is quite slow
Avatar
Avatar
pilonpl
seems like running an LLM locally is quite slow
Jupstar ✪ 2025-01-29 15:25
which one did you choose now?
Avatar
8b deepseek
15:26
is it even using the gpu?
Avatar
Avatar
risu
Keep a list of states and their probabilities, and at every step go through each tuple and produce every possible resulting state and its corresponding probability
Wait, there are only 1440 possible states (or 5760 with the new system that we don't have exact details on yet), so we could just store the probability of each state in a 1440-element vector, and calculating the next set of probabilities is just a matrix multiplication
15:28
Genshin gacha can be represented as a 1440x1440 matrix (5760x5760 actually)
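The matrix view can be illustrated on a reduced chain: here just 91 states (pity counters 0..89 plus one absorbing "done" state) instead of the full 1440, which also tracks rate-up and guarantee flags. Pure-Python matrix-vector products keep it dependency-free; the rate formula is the one from earlier in the chat:

```python
def pull_probability(pity):
    return min(1.0, 0.006 + max(0.0, (pity - 72) * 0.06))

N = 91          # states 0..89 are pity counters, state 90 is absorbing 'done'
DONE = 90

# Transition matrix: T[i][j] = P(next state is j | current state is i).
T = [[0.0] * N for _ in range(N)]
T[DONE][DONE] = 1.0
for pity in range(90):
    p = pull_probability(pity + 1)
    T[pity][DONE] = p                    # hit the 5-star on this pull
    if p < 1.0:
        T[pity][pity + 1] = 1.0 - p      # miss, pity counter increments

def advance(v):
    # One pull of the whole population = one vector-matrix multiplication.
    return [sum(v[i] * T[i][j] for i in range(N)) for j in range(N)]

v = [0.0] * N
v[0] = 1.0                               # everyone starts at pity 0
for _ in range(90):
    v = advance(v)
assert abs(v[DONE] - 1.0) < 1e-9         # a 5-star is guaranteed within 90 pulls
```

The full 1440x1440 (or 5760x5760) version is the same idea with a bigger state index; with a dense matrix you could also square T repeatedly to jump many pulls at once.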
Avatar
Avatar
pilonpl
is it even using the gpu?
Jupstar ✪ 2025-01-29 15:28
Hard to say, on rx 6900 XT on the 14B model i get this
15:28
It's not as fast as chatgpt, but for a single gpu it's quite ok ig
Avatar
Just buy 7 mac pros with 192G of unified memory and run the full model with distributed inference
Avatar
can ollama use the CPU?
15:29
but i suppose the CPU would be super slow
15:29
idk
Avatar
Jupstar ✪ 2025-01-29 15:29
it can yes
15:29
is your VRAM full?
15:29
then it usually also uses the GPU
Avatar
Time to throw Tensorflow at my funny plotting script? lol
Avatar
Jupstar ✪ 2025-01-29 15:29
also ofc if the gpu is at 100% usage
Avatar
seems like one 1gb vram is used so not much
15:31
i think it might be using the CPU
Avatar
Jupstar ✪ 2025-01-29 15:31
15:31
it should look almost like this xd
Avatar
what app is this?
Avatar
Jupstar ✪ 2025-01-29 15:32
amdgpu_top
Avatar
yeah no it's 1gb vram
15:34
and it uses CPU a lot
Avatar
Jupstar ✪ 2025-01-29 15:35
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models. - ollama/ollama
Avatar
but i don't think my AMD Radeon RX 6750 XT supports ROCm
Avatar
Jupstar ✪ 2025-01-29 15:38
really?
15:38
I mean I have same gen
Avatar
i mean i can try installing it
15:44
it might still work even without official support
15:44
and this is still a very new gpu
15:45
so i would expect things to work
15:51
deepseek only supports photos to extract text
15:51
sadge
Avatar
deepseek is died
Avatar
Wow my first speedcube just arrived
16:28
And it's awesome
16:28
New main
16:28
I suppose lol
16:29
But seriously it's so much better than i thought it would be (edited)
16:29
And fairly cheap too
Avatar
Avatar
pilonpl
But seriously it's so much better than i thought it would be (edited)
:) :)
16:36
Which one did you get?
Avatar
Avatar
TsPiggy
deepseek is died
Why
Avatar
Avatar
Jupstar ✪
I mean is nvidia even impressive rn? they litterally just increased the power consumption to get more perf
Avatar
Avatar
Nixus🔥
Click to see attachment 🖼️
Jupstar ✪ 2025-01-29 16:43
who are you and what do you want to tell me?
Avatar
Avatar
Overlord
Why
idk
Avatar
Avatar
TsPiggy
idk
Ach this explains everything
16:53
Isnt it 24-01 shouldnt u sleep vacation boy?
Avatar
Avatar
Overlord
Ach this explains everything
17:18
picture failed to upload before
Avatar
Avatar
risu
Which one did you get?
The same you have i think
Avatar
Purple cube yess
Avatar
Btw how do you carry your cube without damaging it?
Avatar
I mean how to carry it in a backpack
Avatar
In a square 📦
Avatar
I don't really do backpacks anymore but I generally just toss it in my bag
Avatar
Avatar
Overlord
In a square 📦
Avatar
they're fairly sturdy
Avatar
@Overlord the circle fits the square hole
Avatar
Avatar
Ryozuki
@Overlord the circle fits the square hole
Thats right, it goes into the square hole
Avatar
I guess you can get like a cube bag if you're scared of scratches or something but you won't crack it if you don't hit it too hard
Avatar
:3
Avatar
I guess i am gonna use the display box i got with the cube
Avatar
did anyone get a tungsten cube?
17:42
i want one
17:42
and have it on the desktop
17:42
Avatar
How many lightbulbs are required to create a tungsten cube from them?
Avatar
Avatar
Ryozuki
i want one
why would you want a tungsten cube? just as a meme?
17:46
it is a fairly expensive metal that doesn't really appreciate in value much
Avatar
Avatar
Learath2
why would you want a tungsten cube? just as a meme?
It's a meme item also having a really heavy really hard cube is satisfying
Avatar
Avatar
Learath2
why would you want a tungsten cube? just as a meme?
satisfying anti stress
Avatar
Get a platinum or gold cube, it at least doesn't depreciate 😄
Avatar
Is this cube bad for H perm or am i just bad lol
Avatar
Avatar
pilonpl
Is this cube bad for H perm or am i just bad lol
Lots of M moves in H perm. M moves are usually stable on most cubes I've used
Avatar
But i just end up doing half turns instead of quarter moves
20:42
Or the other way around
20:42
Idk
20:43
But it think i am just bad
20:45
I can't stop solving now lol
20:50
Also why do so many people here know how to solve a Rubik's cube?
Avatar
MilkeeyCat 2025-01-29 20:51
any megaminx enjoyers?
Avatar
Avatar
pilonpl
Also why do so many people here know how to solve a Rubik's cube?
♂S1mple♂ 2025-01-29 20:52
i mean it's like one of the most popular puzzles and regular 3x3 is quite easy to solve
Avatar
For most people that's just unnecessary mental gymnastics tho
Avatar
♂S1mple♂ 2025-01-29 20:53
means average ddnet player is smart
20:53
certified
Avatar
Yeah
Avatar
Avatar
pilonpl
But i just end up doing half turns instead of quarter moves
The middle layers on maglev cubes are heavier due to all the magnets and it makes M moves feel different
21:21
I like it but it can take a bit of getting used to
Avatar
Avatar
pilonpl
Also why do so many people here know how to solve a Rubik's cube?
It's simply the concentration of nerds
Avatar
Avatar
pilonpl
But i just end up doing half turns instead of quarter moves
Huh, the only quarter moves in H perm are the U layer moves. I guess you just flick too hard?
21:33
Never used a cube with magnets tho, so maybe quarters are harder somehow?
Avatar
the attached sound: RED SUN IN THE SKY (edited)
21:50
Avatar
Avatar
Learath2
Never used a cube with magnets tho, so maybe quarters are harder somehow?
The magnets add to the moment of inertia of the middle layers, so 180° flicks feel easier
21:51
At least imo
21:52
But your flicks should be controlled double flicks anyways so it shouldn't matter much
21:53
Just because you can flick M2 with one finger doesn't mean you should
21:54
Have you tuned the cube to your liking? @pilonpl
Avatar
My M2s are better on my slower cube with thicker lube
Avatar
Avatar
Learath2
My M2s are better on my slower cube with thicker lube
I meant more effortless, not necessarily easier to execute cleanly
Avatar
Avatar
♂S1mple♂
i mean it's like one of the most popular puzzles and regular 3x3 is quite easy to solve
Avatar
Avatar
Tsumugi
Click to see attachment 🖼️
Exported 317 message(s)
Timezone: UTC+0