Nice, thanks. Gotta love knowing a bit about a niche and then encountering someone who knows a great deal more. That's the beauty of HN.

Could you point to any literature or freely available resource that comes close to the SOTA for these kinds of operations? It would be a great help.

> This is a fundamentally unsolvable problem with floating point math

It's a fundamentally unsolvable problem with B-reps! The problem completely disappears with F-reps. (In exchange for some other difficult problems).
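To make the contrast concrete, here's a minimal F-rep sketch (my own illustration, not any particular kernel's API): a solid is a scalar field where a negative value means "inside," so point classification is just a sign check, and the B-rep problem of robustly intersecting trimmed surfaces never arises. Booleans become min/max of fields.

```python
import math

# Minimal F-rep (implicit/signed-distance) sketch. A solid is a field
# f(p): f(p) < 0 means "inside", f(p) > 0 means "outside". Point
# classification is a sign check on an exact function evaluation,
# with no surface-surface intersection to compute.

def sphere(cx, cy, cz, r):
    """Signed distance field of a sphere centered at (cx, cy, cz)."""
    return lambda p: math.dist(p, (cx, cy, cz)) - r

def union(f, g):
    """CSG union of two fields: pointwise min of the distances."""
    return lambda p: min(f(p), g(p))

def intersect(f, g):
    """CSG intersection: pointwise max of the distances."""
    return lambda p: max(f(p), g(p))

solid = union(sphere(0, 0, 0, 1.0), sphere(1.5, 0, 0, 1.0))

print(solid((0.75, 0, 0)) < 0)  # True: in the overlap region, inside
print(solid((5, 0, 0)) < 0)     # False: far from both spheres, outside
```

The "other difficult problems" show up elsewhere, e.g. extracting an accurate boundary mesh from the field, and the fact that min/max unions produce only a distance *bound*, not an exact distance.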

>(In exchange for some other difficult problems).

Ahhaha.

(I used to work at nTop, and boy is this an understatement when it comes to field-based solid modeling)

> They don’t just lean into epsilons, the session context tolerance is used for almost every single point classification operation in geometric kernels and many primitives carry their own accumulating error component for downstream math.

The GP wasn't wrong. To "lean in" means to fully commit to something, to go all in on it (or, equivalently, to go all out on it).
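Setting the idiom aside, the behavior the quote describes can be sketched roughly like this (hypothetical names, assuming a single session-wide tolerance, not any real kernel's API): every point classification consults the context tolerance rather than trusting the raw sign of a noisy signed distance.

```python
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    ON = "on"
    ABOVE = "above"   # positive side of the plane
    BELOW = "below"

@dataclass
class Session:
    # One context-wide tolerance, consulted by essentially every
    # point classification query. (Value is illustrative.)
    tolerance: float = 1e-7

def classify_point_vs_plane(session, point, plane_point, unit_normal):
    """Classify a point against a plane using the session tolerance.

    Any point within `session.tolerance` of the plane is reported as
    ON, instead of trusting the raw sign of the signed distance,
    which carries accumulated floating-point error.
    """
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, unit_normal))
    if abs(d) <= session.tolerance:
        return Classification.ON
    return Classification.ABOVE if d > 0 else Classification.BELOW

s = Session()
# A point 1e-9 above the plane is within tolerance, so it is ON:
print(classify_point_vs_plane(s, (0.0, 0.0, 1e-9), (0, 0, 0), (0, 0, 1)))
```

That is "committing to" tolerance as a modeling primitive rather than sprinkling ad-hoc epsilons, which is the reading both sides of this subthread seem to agree on.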

I think his point is that, rather than "leaning into" it in the sense of masking edge cases with epsilons, tolerance is fundamental to the problem space, not a way to resolve edge cases.
Right. And my point is that "leaning in" doesn't mean masking; it means committing to something, taking it seriously. Exactly the sort of thing he's describing.

I'm wondering if people have heard the expression "leaning in" from people who were insincere/lying, and assumed that that was what the phrase means?
