I wish I were using an immediate mode UI. SwiftUI is a disaster. Luckily Damus Android is immediate mode. So much nicer to work with.
Discussion
Immediate mode GUIs are known for being hard to get perf right in though. Are you using egui?
It’s the opposite. Performance is very predictable.
is it to do with keeping the background worker tasks under control so they don't disrupt the framerate?
Yes, it’s all about the difference between responsiveness and determinism.
In my work, determinism is more important, for which an immediate mode can be good. But in many desktop apps, dealing with background processes *can* be a pain.
Will is a C guru though… little things like that probably aren’t a struggle for him. 😅
not sure that C is really the language of multithreading...
getting it right is complicated... really, there is the painter thread, which renders the state, and then there is the input thread, which reacts to user input and alters the painter's data set
coroutines are pretty good for doing this, but honestly, if I were designing a proper GUI system I'd make two dedicated threads, the render worker and the painter: the painter needs to be top priority, and the render worker needs to be keyed directly to input
it's one of the weaknesses of coroutine scheduling: it is by nature unpredictable, while frames have to land on a regular cadence
rendering interfaces is still a struggle, I guess... making them shiny is a real black art
you just don’t do any heavy work in the render thread? Not sure what you mean.
oh, I got confused. Yes, exactly: immediate mode is easier to get right; it only gets glitchy when you overload the CPU. When you're writing a multithreaded app and you have one thread doing the frames, you can simply make it exclude any other work using a mutex, and that at least keeps you from tampering with the frames mid-draw
back in the old days on the Amiga there was an interrupt: the interrupt called your render code, and any logic had to wait until that was done
I'm amazed how long it's taken for well-established conventions devised by Apple and Amiga devs to become well understood and used. I mean, ffs, literally 35 years ago I wrote a flicker-free sprite on a 700 Hz 8-bit CPU running interpreted BASIC... and I'm supposed to believe it's difficult??? More like most programmers never learned the old tricks
hold on, immediate mode GUI code means you create logic trees that decide how to paint the display, as code, with branches and switches and if/then/whatever stuff
they have to be multithreaded, or at least the render thread needs to interrupt everything else the app is doing, or you lose FPS
the tricky thing is sometimes needing to do secondary processing in between paints, like scrolling, physics, and animations; these sometimes need to be computed before you paint, or the variability in how long the compute takes can impact your frame rate
yes, they are hard to get right, because you need to manage data in a background thread and never pester the render thread
I've wanted to dig deeper into that kind of stuff. When I first started programming as a kid I wanted to make graphics... my first big achievement was a flicker-free mouse sprite, using bitmasking. I can't remember where I learned how to do it, but apparently Microsoft didn't figure it out for 20 years after that, and Windows got a flicker-free cursor only because of graphics cards, even though it's just a bunch of simple logic operations