I'm also not a huge fan of leaking server-side information; I suspect UUIDv7 could still be used in statistical analysis of the keyspace (in a similar fashion to the German tank problem for integer IDs). Also, leaking data about user activity times (from your other comment) is a *really* good point that I hadn't considered.
I've read people suggest using a UUIDv7 as the primary key and a UUIDv4 as a user-visible one as a remedy.
My first thought when reading the suggestion was, "well but you'll still need an index on the v4 IDs, so what does this actually get you?" But the answer is that it makes joins less expensive; you only require the index once, when constructing the query from the user-supplied data, and everything else operates with the better-for-performance v7 IDs.
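To make that concrete, here's a rough sketch of the pattern using sqlite3 (the schema is invented for illustration, and since uuid.uuid7 isn't in most deployed Pythons yet, I'm hand-rolling a v7-ish stand-in that skips the proper version/variant bits):

    import os
    import sqlite3
    import time
    import uuid

    def uuid7ish() -> str:
        # 48-bit millisecond timestamp prefix + 80 random bits, so ids
        # sort roughly by creation time (a real v7 also sets version bits)
        ts = int(time.time() * 1000).to_bytes(6, "big")
        return str(uuid.UUID(bytes=ts + os.urandom(10)))

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE orders (id TEXT PRIMARY KEY, public_id TEXT UNIQUE);
        CREATE TABLE order_items (id TEXT PRIMARY KEY,
                                  order_id TEXT REFERENCES orders(id));
    """)
    order_id, public_id = uuid7ish(), str(uuid.uuid4())
    db.execute("INSERT INTO orders VALUES (?, ?)", (order_id, public_id))
    db.executemany("INSERT INTO order_items VALUES (?, ?)",
                   [(uuid7ish(), order_id) for _ in range(3)])

    # one index hit to translate the user-supplied v4 id...
    (internal,) = db.execute("SELECT id FROM orders WHERE public_id = ?",
                             (public_id,)).fetchone()
    # ...then the join runs entirely on the time-ordered v7 keys
    items = db.execute("SELECT id FROM order_items WHERE order_id = ?",
                       (internal,)).fetchall()
    assert len(items) == 3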
To be clear, in a practical sense, this is a bit of a micro-optimization; as far as I understand it, this really only helps you by improving the data locality of temporally-related items. So, for example, if you had an "order items" table, containing rows of a bunch of items in an order, it would speed up retrieval times because you wouldn't need to do as many index traversals to access all of the items in a particular order. But on, say, a users table (where you're unlikely to be querying for two different users who happen to have been created at approximately the same time), it's not going to help you much. Of course the exact same critique is applicable to integer IDs in those situations.
Although, come to think of it, another advantage of a user-visible v4 with a v7 PK is that you could use a different index type on the v4 ID. Specifically, I would think that a hash index for the user-visible v4 might be a halfway-decent way to go.
I'm still not sure either way if I like the idea, but it's certainly not the craziest thing I've ever heard.
I think a bigger benefit from doing that would be that inserts would be cheaper. Instead of an expensive insert into the middle of an index for every table that needs an index on that key, you can do a cheaper insert at the end of the index for all of them except for the one that uses uuid4.
But if you are doing that, why not just use an incrementing integer instead of a uuidv7?
Certainly for many applications, the autoint approach would be fine.
The benefit of uuid in this case is that it allows horizontally scalable app servers to construct PKs on their own without risk of collisions. In addition to just reducing database load by doing the ID generation on the app server (admittedly usually a minor benefit), this can be useful either to simplify insert queries that span multiple tables with FK relationships (potentially saving some round trips in the process) or in very niche situations where you have circular dependencies in non-nullable FKs (with the constraint deferred until the end of the transaction).
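As a sketch of the round-trip savings (schema invented for illustration): because the app server mints the IDs itself, it already knows the FK values up front and never needs a RETURNING trip between the parent and child inserts.

    import sqlite3
    import uuid

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE users  (id TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE emails (id TEXT PRIMARY KEY,
                             user_id TEXT REFERENCES users(id), addr TEXT);
    """)

    user_id = str(uuid.uuid4())  # minted app-side; no round trip to learn it
    db.execute("INSERT INTO users VALUES (?, ?)", (user_id, "alice"))
    db.execute("INSERT INTO emails VALUES (?, ?, ?)",
               (str(uuid.uuid4()), user_id, "alice@example.com"))
    db.commit()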
For those that can't get it to load (it takes a minute, and I noticed my desktop's fan kick it up a notch while things were getting initialized, so... YMMV): this is a portfolio site done via a cozy-gaming-style WASD game where you drive around in a jeep-like thingamabob. There are some cute easter eggs, including a sort of... shrine to each of the socials, which you can run into with your car and knock over (though the links remain clickable, of course!). It also looks like there's some degree of global state; for example, you can "sacrifice yourself to the gods of chaos" (ie drive into a portal) and a counter on the side of the portal goes up, presumably for everyone (since I certainly didn't drive into it 1700 times myself!). There's a strongly consistent art style, and just generally... it seems pretty polished. Or at least, that's what it felt like after 5 minutes of driving around.
All in all, I'd say I'm impressed and enjoyed it. Though I think the HN title ("handsdown one of the coolest 3D websites") is maybe a bit much. It's an extremely well-executed portfolio site; no more, no less.
If you count game demos on the web, then Epic Citadel, based on the port of Unreal Engine to various mobile and web platforms including HTML5/WebGL/asm.js, had a much more detailed and more fully 3D world - it used all three dimensions, unlike OP's, which appears to be a flat world that's quite restricted vertically. That demo first came out about 15 years ago, with the HTML5 version coming out a few years later.
Since then I’ve seen several other sites along similar lines, since Unity released a similar capability, but I haven’t kept track of them. The problem is they’re all essentially games that are more impressive for their look than their functionality, so they tend to have a spike of interest when people first see them and then you never hear of them again. And typically, the tech bitrots and the sites stop working after a while.
You say that as if I've got any control over the browsers on end users' devices, some of which will be configured not to apply these rules globally, for accessibility reasons...
I am irked that on desktop it does not work in Firefox, but only in Chrome (and presumably other Chromium based browsers).
I'm not a big fan of Chrome, for a variety of reasons, but principally because I don't trust it and can no longer use a good ad blocker, so I never really enjoy having to fire it up.
After watching the developer's "making of" video, I went and grabbed my USB gamepad (from twenty years ago) — whenever the gamepad is plugged in, Bruno's game site stops responding (until the controller is unplugged).
I would reckon that if this isn't playing on your end, it has more to do with an uncommon hardware configuration (not necessarily a lack of horsepower).
When it avoids a chrome (and thus google) monopoly: yes. And don’t talk to me about Firefox engine. Its market share is negligible and the whole thing only even still exists because google allows it.
The App Store is the walled garden that doesn't allow anyone else to ship a browser engine, except in certain markets where they have been forced by law to create a "Web Browser Engine Entitlement" that non-WebKit browsers can use with super special permission from Apple.
Because so many people use iPhones as their primary devices it's hard for businesses to accept making their websites only support browsers which are built off Chromium. It somewhat limits the power Google has to unilaterally dictate web standards.
From TFA, they're using it to print bioinks. Think scaffolding for cell cultures.
At these kinds of physical scales, biology is almost certainly a much larger market than mechanical applications. A 20 um line width (slightly less than one thou, for US folks) is certainly a tolerance you might encounter on a drawing for subtractive manufacturing, but for additive, feature sizes that small will be strength-limited.
Mechanical applications at that scale are not well developed, but that doesn't mean their potential is small.
Member sizes below the critical diameter for flaw-sensitivity are crucial to the hardness and durability of, for example, human teeth and limpet teeth, as well as the resilience of bone and jade. Nearly all metals, glasses, and ceramics are limited to a tiny percentage of their theoretical mechanical performance by flaw-sensitivity.
Laparoscopes that require smaller incisions are better laparoscopes. Ideally you could thread in a biopsy-needle instrument through a large vein to almost anywhere in the body.
Visible-light optical metamaterials such as negative-index lenses require submicron feature sizes.
I know a research group that is gluing battery-powered RFID transponders to honeybees.
Electrophoretic e-paper displays are orders of magnitude more power-hungry than hypothetical MEMS flip-dot displays. We just don't have an economical way to make those.
And of course MEMS gyroscopes, accelerometers, and DLP chips are already mass-market products.
There's still a lot of room at the bottom, even if EUV takes the purely computational opportunities off the table.
I'm not trying to say that there aren't plenty of applications for small scale mechanical devices, but rather that the applications where FDM-style 3d printing would be an appropriate manufacturing process are likely to be largely biological.
Biological applications (which would of course include tooth and bone) are extremely well-suited to additive manufacturing because they're frequently one-offs that cannot scale, and oftentimes highly insensitive to price. Mass-market products are a whole different ball game; even for applications where there isn't currently an economical manufacturing method, I'm very skeptical that there's a path by which AM could be scaled out to the volumes required to sell the end component at a commercially viable cost.
To be fair though, I didn't do a good job expressing that, because I just took it for granted that it would be clear that large ratios between feature size and nozzle size are rarely economical for FDM-style AM, which isn't necessarily an obvious observation.
I largely agree, but I'll take the opportunity to fill in some of the other gaps in the conversation.
I didn't mean that you could 3-D print tiny laparoscopes or even visible-light metamaterials; I meant that you could 3-D print machines for making tiny laparoscopes and visible-light metamaterials.
I agree that FDM-like 3-D printing is not currently attractive for feature sizes many times larger than the nozzle size. You'd need printers with thousands or millions of "hotends".
With respect to biological applications of 3-D printing, I think you're overlooking the part of the iceberg that's currently below the waterline of economic feasibility. Biological applications of 3-D printing are frequently highly-price-insensitive one-offs that cannot scale because people don't even consider the things that will become possible when prices drop by a factor of a billion or a trillion.
> I didn't mean that you could 3-D print tiny laparoscopes or even visible-light metamaterials; I meant that you could 3-D print machines for making tiny laparoscopes and visible-light metamaterials.
Huh, thanks for the clarification, that's an angle I hadn't considered.
> I think you're overlooking the part of the iceberg that's currently below the waterline of economic feasibility.
Hm. I think to a degree you probably have a point; I certainly agree that people tend to overlook the explosion of new development that drastic cost reductions make possible, though with the aside that having price-insensitive applications is often instrumental in developing the technology that enables those cost reductions in the first place, because it allows for profitability early in the technology's maturation, as opposed to "well, it won't be profitable until we hit X milestone in Y years".
That being said, it's not clear to me how many mass-market biological applications would be possible under reasonable regulatory regimes. Maybe I'm just showing my ignorance when it comes to small-scale biological applications, but can you name some examples? (Or is this more of a "you never know until somebody does it" kind of thing?)
Having used both extensively, I can say Geizhals doesn't hold a candle to McMaster. McMaster is, bar none, the single best e-commerce website I've ever used (if you already know what you're looking for, and definitely still top shelf if you don't).
But McMaster and e.g. Amazon are optimizing for different things. McMaster knows its clientele isn't going shopping; they're solving problems. As such, McMaster focuses on helping you solve your problem and get back to work. Amazon, on the other hand, is focused on selling you as much of anything as possible, and wants you to spend as much time there as possible in the hopes that you'll stumble on an impulse buy.
For context: one of the several projects I'm working on right now is an automated extraction system for literate-code-style documentation in python. This isn't the place nor time to talk about the why of it (especially compared to other existing similar solutions). The important thing is the how: it uses a temporary import hook to stub out all module imports, allowing the docs generator to process each module independently at runtime, track imports between them, etc. At the end of the process, it also cleans itself up nicely.
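To give a flavor of the mechanism (this is my own minimal sketch, not the project's actual code, and some_heavy_dependency is made up):

    import sys
    import types
    from importlib.abc import Loader, MetaPathFinder
    from importlib.machinery import ModuleSpec

    class StubLoader(Loader):
        def create_module(self, spec):
            # hand back an empty stand-in instead of executing the real module
            return types.ModuleType(spec.name)

        def exec_module(self, module):
            pass  # nothing to run; the stub stays empty

    class StubEverything(MetaPathFinder):
        def find_spec(self, fullname, path, target=None):
            return ModuleSpec(fullname, StubLoader())

    finder = StubEverything()
    sys.meta_path.insert(0, finder)   # install the temporary hook...
    try:
        import some_heavy_dependency  # resolves to an empty stub module
    finally:
        sys.meta_path.remove(finder)  # ...and clean up after ourselves
        sys.modules.pop("some_heavy_dependency", None)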
Point being, it's a lot of really complicated fiddling with the python import system. And a lesson I have learned is that messing around with import internals in python is extremely tricky to get right. Furthermore, trying to coordinate correctly between modules that do and don't get modified by the hook is very finicky. Not to mention that supply-chain attacks on the import system itself could be a terrifying attack vector that would be absurdly difficult to detect.
All this to say, I'm not a big fan of monkeypatching, but I know exactly how it behaves, its edge cases, and what to expect if I do it. It is, after all, pretty standard practice to patch things during python unit tests. And even with all its warts, I would prefer patching to import fiddling any day of the week and twice on Sunday.
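For reference, this is the garden-variety patching I mean; unittest.mock does the save-and-restore bookkeeping for you:

    from unittest import mock

    with mock.patch("json.loads", return_value={"stubbed": True}):
        import json
        assert json.loads("ignored") == {"stubbed": True}

    import json  # outside the with-block, the real implementation is back
    assert json.loads("{}") == {}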
Feedback for the author: you need to explain the "why" of your project more thoroughly. I'm sure you had a good reason to strike out in this direction, and maybe this is a super elegant solution. But you've failed to explain to me under what circumstances I might also encounter the same problems with patching that you've encountered, in order to explain to me why the risk of an import hook is justified.
I didn't really get why I'd want to actually use it (vs. just a cool demo) either, until:
> means if you want to make changes to a third-party package, you don't have to take on the maintenance burden of forking, you can package and distribute just your changes.
That's a big win. I've seen and done my share of `# this file from github.com/blah with minor change X to L123` etc.
If the goal is to actually package and distribute the changes via import hook, that makes the supply chain attack question particularly relevant. And it still doesn't explain why you couldn't just package and distribute the monkeypatch itself, instead of creating a whole new import ecosystem surrounding hooks.
I've done my fair share of that too, but I'm still not seeing the benefit vs patching.
Let me explain what inspired me to create modshim:
I've written a Jupyter client for the terminal (euporie), for which I've had to employ monkey-patching of various third-party packages to achieve my goals and avoid forking those packages. For example, I've added terminal graphics support & HTML/CSS rendering to prompt-toolkit (a Python TUI library), and I've changed aiohttp to not raise errors on non-200 http responses. These are things the upstream package maintainers do not want to maintain or will not implement, and likewise I do not want to maintain forks of these packages.
So far I've got away with monkey-patching, but recently I implemented a kernel for euporie which runs on the local interpreter (the same interpreter as the application itself). This means that my patches are exposed to the end user in a REPL, resulting in potentially unexpected behaviour for users when using certain 3rd party packages in Python through euporie. Modshim will allow me to keep my patched versions isolated from the end user.
Additionally, I would like to publish some of my patches to prompt_toolkit as a new package extending prompt_toolkit, as I think they would be useful to others building TUI applications. However, the changes required need to be deeply integrated to work, which would mean forking prompt_toolkit (something I'd like to avoid). modshim will make it possible for me to publish just my modifications.
Perhaps it's a somewhat niche use-case, and modshim is not something most Python users would ever need to use. I just thought it was something novel enough to be of interest to other HN users.
> messing around with import internals in python is extremely tricky to get right
This is true! modshim has been the most complicated thing I've written by some way!
Monkey-patching an object attribute, such as a method or a function of a module, may affect third-party library code that uses said object.

This solution is interesting, as it provides the patched code as if it were a new package, independent of the existing one you have installed - like vendoring, but without the burden of it.

In case you want to be the only one seeing your patch, this is great. It also makes maintenance easier, as you don't have to wonder if you patched it at the right time or in the right way. Monkey-patching can fail in many subtle edge cases.

Inheritance, particularly, is a great monkey-patching pitfall I expect this method to work with transparently.
If you only want your own code to see the patch, then why not just wrap it?
I mean if you really need super strong isolation, you can always create a copy of the library object; metaprogramming, dynamic classes, etc, all make it really easy to even, say, create a duplicate class object with references to the original method implementations. Or decorated ones. Or countless other approaches.
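For example, a quick sketch of the duplicate-class approach (Original stands in for whatever third-party class you actually want to isolate):

    class Original:
        def greet(self) -> str:
            return "hello"

    # type() builds a brand-new class object sharing the original's method
    # implementations (skip the per-class __dict__/__weakref__ descriptors)
    ns = {k: v for k, v in vars(Original).items()
          if k not in ("__dict__", "__weakref__")}
    Patched = type("PatchedOriginal", Original.__bases__, ns)

    # patching the copy leaves Original -- and everyone importing it -- alone
    Patched.greet = lambda self: "patched: " + Original.greet(self)

    assert Original().greet() == "hello"
    assert Patched().greet() == "patched: hello"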
My point isn't that I don't see problems that could be solved by this; my point is that I can't think of any problems that this solves, that wouldn't be better solved by things that don't do any innards-fiddling in what is arguably the most sharply-edged part of python: packaging and imports.
And speaking from experience... if you think patching can fail in subtle edge cases, then I've got some bad news for you re: import hooks.
At the end of the day, people who might use this library are looking for a solution to a particular problem. When documenting things, it's really important to be explicit about the pros and cons of your solution, from the perspective of someone with a particular problem, and not from the perspective of someone who's built a particular solution. If I need to drive a nail, and you're selling wrenches, I don't want to hear about all of the features of your wrenches; I want to know if your wrench can drive my nail, and why I would ever want to choose it instead of a hammer.
I can think of a lot of differently-shaped metaphorical nails that fall under the broad umbrella of "I need to change some upstream code but don't want to maintain a fork". And I can think of a whole lot of python-specific specialty hammers that can accomplish that task. But I still can't think of a single situation where using import hooks to solve the problem is doing anything other than throwing a wrench into a very delicate gearbox. That is the explanation I would need, if I were in the market for such a solution, to evaluate modshim as a potential approach.
> I mean if you really need super strong isolation, you can always create a copy of the library object; metaprogramming, dynamic classes, etc, all make it really easy to even, say, create a duplicate class object with references to the original method implementations. Or decorated ones. Or countless other approaches.
> My point isn't that I don't see problems that could be solved by this; my point is that I can't think of any problems that this solves, that wouldn't be better solved by things that don't do any innards-fiddling in what is arguably the most sharply-edged part of python: packaging and imports.
All these examples have the dependency order wrong, and you're right on those - it's simpler to wrap them somehow. But this is doing something different, that is either much harder or outright impossible with those methods: Tweaking something internal to the module while leaving its interface alone. This is shown in both their examples where they modify the TextWrapper object but then use it through the library's wrap() function, and modify the Session object but then just use the standard get() interface to requests.
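To make the dependency-order point concrete with the textwrap case: wrap() constructs its TextWrapper internally, so subclassing from outside changes nothing, and the classic fix - patching the module's global - leaks to every other caller:

    import textwrap

    class NoBreakWrapper(textwrap.TextWrapper):
        def __init__(self, *args, **kwargs):
            kwargs.setdefault("break_long_words", False)
            super().__init__(*args, **kwargs)

    # our subclass exists, but textwrap.wrap() still builds the original
    print(textwrap.wrap("abcdefghij", width=4))  # ['abcd', 'efgh', 'ij']

    # the classic monkey-patch rebinds the module's own global, which
    # every other user of textwrap now sees too
    textwrap.TextWrapper = NoBreakWrapper
    print(textwrap.wrap("abcdefghij", width=4))  # ['abcdefghij']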
In that case I'd opt for dynamic module creation using metaprogramming instead of an import hook. And I personally would argue that grabbing the code objects from module members and re-execing them into a new module object to re-bind their globals is simpler than an AST transformation.
But regardless of the transformation methodology: the import hook itself is just a delivery mechanism for the modified code. There's nothing stopping the library from using the same transformation mechanism but accessing it with metaprogramming techniques instead of an import hook. And there's nothing you can't do that way.
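Roughly what I mean, sticking with the textwrap example (my sketch of the technique, not modshim's mechanism):

    import textwrap as _orig
    import types

    clone = types.ModuleType("textwrap_patched")
    clone.__dict__.update(_orig.__dict__)  # start from the original namespace

    class PatchedTextWrapper(_orig.TextWrapper):
        def __init__(self, *args, **kwargs):
            kwargs.setdefault("break_long_words", False)  # the "internal" tweak
            super().__init__(*args, **kwargs)

    clone.TextWrapper = PatchedTextWrapper

    # re-exec the public functions against the clone's globals, so their
    # internal references to TextWrapper resolve to the patched class
    for name in ("wrap", "fill", "shorten"):
        fn = getattr(_orig, name)
        rebound = types.FunctionType(fn.__code__, clone.__dict__, fn.__name__,
                                     fn.__defaults__, fn.__closure__)
        rebound.__kwdefaults__ = fn.__kwdefaults__
        setattr(clone, name, rebound)

    print(clone.wrap("abcdefghij", width=4))  # ['abcdefghij'] -- patched
    print(_orig.wrap("abcdefghij", width=4))  # ['abcd', 'efgh', 'ij'] -- untouched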
IMO, the trick to really enjoying python typing is to understand it on its own terms and really get comfortable with generics and protocols.
That being said, especially for library developers, the not-yet-existent intersection type [1] can prove particularly frustrating. For example, a very frequent pattern for me is writing a decorator that adds an attribute to a function or class, and then returns the original function or class. This is impossible to type hint correctly, and as a result, anywhere I need to access the attribute I end up writing a separate "intersectable" class and either writing a typeguard or calling cast to temporarily transform the decorated object to the intersectable type.
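A stripped-down version of the pattern (names are illustrative):

    from typing import Callable, Protocol, TypeVar, cast

    F = TypeVar("F", bound=Callable[..., object])

    class HasRegistryName(Protocol):
        registry_name: str

    def register(name: str) -> Callable[[F], F]:
        # ideally this would return Intersection[F, HasRegistryName],
        # which doesn't exist, so the attribute is invisible to checkers
        def deco(fn: F) -> F:
            fn.registry_name = name  # type: ignore[attr-defined]
            return fn
        return deco

    @register("greeter")
    def greet() -> str:
        return "hi"

    # at use sites: cast to the "intersectable" type to reach the attribute
    print(cast(HasRegistryName, greet).registry_name)  # greeter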
Also, the second you start to try and implement a library that uses runtime types, you've come to the part of the map where someone should have written HERE BE DRAGONS in big scary letters. So there's that too.
So it's not without its rough edges, and protocols and overloads can be a bit verbose, but by and large, once you really learn it and get used to it, I personally find that even just the value of the annotations as documentation is enough to justify the work of adding them.
Additionally, a lot of people who would otherwise be in the used car market for price reasons in the US cannot afford to purchase even a used car outright, and getting financing for cars sold on the private market is, to my knowledge, not really possible.
> Additionally, a lot of people who would otherwise be in the used car market for price reasons in the US cannot afford to purchase even a used car outright, and getting financing for cars sold on the private market is, to my knowledge, not really possible.
To me, that's the biggest reason for used-car prices to be low - the market is restricted to those who have cash.
IME, the prices of cars that cannot be financed (older than 5 years) are severely constrained. I've seen a 5 year old model advertised for exactly twice the price of an almost identical 6 year old model, with similar mileage and similar condition[1].
I find that hard to reconcile with the other posters in this thread who observed that used cars are still too expensive. Maybe they are looking at used cars that can still be financed (under X years old, for example)?
=======================
[1] Condition of both was reported by the same independent assessor - see my posting upthread
Ah, so this is really just another way poverty is punished. I can afford to save up $8000 for a second hand car, but people who can't, have to buy more expensive cars on a (predatory?) car loan.
I'm not saying it's intentional, but there are a lot of mechanisms like these that make it more expensive to be poor, creating a vicious cycle of poverty.
> I'm not saying it's intentional, but there are a lot of mechanisms like these that make it more expensive to be poor, creating a vicious cycle of poverty.
You are quite correct, but this is one of those rare cases where "learning things" is enough to lift one out of some of that poverty.
I did not start out rich enough to afford a 6 year old car cash, I started out by saving for a year (sacrificing a lot in other spheres of my life at that age, early 20s) to buy a barely running beater that I then maintained cheaply for the next 5 years.
Cost over five years, including repairs and maintenance was about a tenth of a new car price.
Many who cannot afford the cash $8k for a decent 2nd-hand car can afford the cash $2k for a dodgy barely running car.
The problem they face is being clueless about cars, repairs, etc. in general.
IOW, the problem they have is not "not enough cash to own a car", it's "not enough knowledge to fix a car". The only person who can remedy that is themselves, not the market.
Looked at through this lens, this is not a punishment on poverty, it's a punishment on lack of knowledge.
That's not true. There's a whole industry of lenders for used cars. Interest rates are typically slightly higher than new cars but as long as you have a decent credit rating and the car isn't too old you can get financing.
Note that those lenders are separate from "buy here, pay here" used-car dealers, which offer their own in-house financing at much higher interest rates, targeting consumers with bad credit. Some of those operations are close to being a scam.
Another thing that makes no sense to me: I couldn't get my bank to finance the purchase of an older vehicle at any rate.
Last year, my air conditioner condenser went out and had to be replaced for a total of $4.5k. I could have paid that out of pocket, but didn't want to deplete my emergency fund and always check my options. The same bank was happy to give me a "personal loan" for $5k at 9% APR. I asked if putting up collateral could lower that rate - they offered 6% if I'd use a vehicle title to secure the loan.
So, I can get a loan for $10k on a 2020 vehicle at 9%, or I could get a loan for $5k at 6%... but only if I already had the vehicle's title in my name at the time.
It makes no sense to me at all, but that's how it's structured. It's far easier to get credit when you "want" it than when you "need" it.
Yes, I could have gotten a home equity loan. I didn’t want to do that because I want to be able to sell it in the near future depending on how things work out, and didn’t want the additional title encumbrance.
It's much easier to get financing than you might think if you've not tried - but the rates aren't great, and you're limited to fairly new vehicles. My bank only finances back to 2018 model years, and only 80% of what they consider to be "market value". In practice this means two things.
The first is that the cars that can be financed tend to be almost as expensive as new - but at a higher interest rate and shorter loan term. That means from a monthly budget perspective, you can end up paying as much for a used car as a new one while needing to keep funds available for repairs and maintenance.
The other is that older vehicles are typically financed without traditional "auto loans". If I wanted to finance a $5k truck, I could do so - but it would be an unsecured "personal loan" at significantly higher interest. You have to have fairly good credit to do that, too.
I'm not sure if the American credit score system is at all familiar to those outside the US, but the minimum score for a personal loan is around 600. While granted, I haven't been good with money most of my life, it took me from ages 30 to 35 to get my credit score to the point where I could buy a home in 2018 (605). As soon as we closed on our home, my score went back down to ~570, and it has been a struggle ever since. In the seven years since then I've had exactly two "late" payments reported. Both were issues with autopay failing and my not noticing until I got a letter in the mail - at which point I immediately resolved the issue and brought the account current.

My credit score right now is ~615. On paper, I carry a high credit card utilization that keeps it from being higher. In reality I pay all of my cards but one off every month. I could get my score up to ~680 in two months by paying off those cards and not using them - but unless I need the score for something, why would I? I have ~$6k in expenses that I run through those cards every month. I get 3% "cash back" for every purchase I make on them, so I end up earning an additional ~$180 / month for doing so; it'd be silly not to take advantage of that.
The result is that while I can get an auto loan on a used vehicle, it makes no sense for me to do so. Instead, I've not borrowed for a vehicle of my own since 2012. When I totaled my Jeep, I took the insurance payout and bought my '91 GMC.
We have two other vehicles: my wife has a 2019 F-150 Platinum, and my daughter has a 2024 Subaru Crosstrek Wilderness. Both are financed.
For the Ford, my sister wanted to trade it in on a Ram 2500. It wasn't drivable at the time due to a failed timing cover gasket, which the dealership estimated at $5k to repair. Another dealership offered her $12k for it. She owed $23k on a 0% note. With the towing package and other upgrades, it's worth ~$35k on the private market in running order - so I offered to pay off her loan on it ($23k). She agreed, then asked if I could just take over the loan she already had on it to continue to build her credit. That's what we did. I've got another two years to pay on it at ~$700 / month.
For the Subaru... well, we knowingly overspent there because my daughter's previous vehicle was destroyed when someone t-boned her on her way to school last winter. She was very hesitant to get back on the road, and my wife and I weren't exactly comfortable with the idea either. We compensated by buying her the absolute safest vehicle we could find that fit her use case. It came down to Subaru and Volvo; Subaru offered the best price and she preferred the Crosstrek or Outback anyhow. We could have gone as low as ~$25k or as high as ~$45k for a Crosstrek, depending on trim. We went for the Wilderness because it's geared slightly lower and therefore has more low-end acceleration - which is important to us, so she has the power necessary to get out of bad situations on the road if warranted. Total financed there: ~$37k, for six years, with a ~$750 / month payment.
All of this is to say: I make good money in tech. Though I am the sole income earner for the family, we're well over 2x the mean household income in my area. We have three vehicles as a family and ~$1.5k / month in vehicle payments. I don't understand how families making half of what I make are driving three vehicles <5 years old - but they absolutely are.
My entire debt load -- including our mortgage and some lingering student loans -- is less than my annual salary. I'm extremely uncomfortable with that much debt. Meanwhile, the people I see around me are living in homes that cost twice as much as ours, have more and newer vehicles, and make half as much. It makes no sense to me.
This is a really clear picture of things, thanks! I am actually originally from the US, but I've been living in the EU for 5ish years and haven't had a car since I gave the one I had in college back to my parents after I moved to the Bay Area, where public transit was good enough that I couldn't really justify the ongoing expense for something I was just using for grocery runs. Otherwise, I've just been purchasing motorcycles in cash.
I guess I'd never really seriously considered the availability of loans for used cars (beyond the unsecured personal loans); it makes sense that they would exist, but in the ~30 years I lived in the states, I never once heard mention of them, or talked to anyone who had gotten one. Which is pretty crazy if you think about it. I think that, combined with the complete and total lack of regulation surrounding private sales (ie, no way for banks to de-risk the loan), made me assume they just weren't a thing. So I stand corrected.
But I think your anecdote might offer an even more compelling reason why: to satisfy the conditions to actually get one, you find yourself in territory where, from a short-term month-to-month financial perspective, you might as well just be buying new. And if you're struggling to make ends meet... welp, that's that.
The credit score is another good point; it might also be the case that the people who have a good enough credit score to land a decently-priced financing deal on a used car are almost by definition financially capable of buying new. Where I grew up, there was definitely a stigma against buying used, so that seems like it might also have some explanatory power.
Certificate transparency logs are likely the only realistic way, but you could make the same argument against your DNS provider. Trust has to start somewhere.
Whether or not something like this makes sense to you is probably a question of your personal threat model.
Seeing how people are worried about third parties issuing certificates, I encourage using a tool to monitor CT Logs. It really makes the fog of war disappear around your certificates.
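If you'd rather script it than sign up for a service, crt.sh exposes a JSON endpoint that makes a poor man's monitor a few lines of Python (the query parameters and field names are as I remember them - verify before depending on this):

    import requests

    DOMAIN = "example.com"  # the domain you want to watch

    resp = requests.get("https://crt.sh/",
                        params={"q": DOMAIN, "output": "json"}, timeout=30)
    resp.raise_for_status()

    for entry in resp.json():
        # each entry is one logged certificate; alert on unexpected issuers
        print(entry["not_before"], entry["issuer_name"], entry["name_value"])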
That's not what the exit tax is, though. The German exit tax is effectively just a way to give the existing capital gains tax a way to tax unrealized gains when you leave the country, to prevent you from dodging taxes on capital gains by simply leaving the country.
In other words, it's not an additional claim. It's simply an enforcement mechanism for the money you already hypothetically owe.
Yes, that's true, but the implementation is... not very elegant.
In theory, the exit tax ensures that Germany gets the taxes on the sale of your company: if you ever sold your company once you were no longer in Germany, Germany wouldn't get those taxes, so it charges you immediately when you leave, via a sort of "virtual" sale.
This, of course, sucks tremendously because you actually haven't sold your company, and "normal" people don't have this sort of cash on hand.
Other countries have "smarter" exit tax implementations and only charge you when you actually sell your company in the future. I think that's pretty fair. It also doesn't hinder people from leaving the country.
Another reasonable implementation would be for the government to accept payment in the form of shares of your company. Personally I think this is how all taxation of illiquid assets should be done, but I suppose it could get complicated.
As an immigrant to Germany, I've often made the observation that Germany frequently has a really severe implementation problem. So I'm generally very sympathetic to that idea.
That being said, I'm not entirely sure that's the case here, and this is often also brought up in the context of strengthening the inheritance tax in Germany. In both the inheritance tax and the exit tax, the inherent applicability conditions are such that the end result is that there simply aren't that many people in a situation where it actually has a measurable impact. For the exit tax, you'd need to find people who 1. want to leave Germany, 2. already started a company here, 3. that company grew large enough that the Wegzugssteuer would really be a burden, and 4. that don't have enough liquidity, or cannot raise enough liquidity by selling some of their ownership, to cover the tax. That ends up being a really small number of people, which always eases questions about the reasonability (Angemessenheit) of the law. And in the context of inheritance tax, there's the added point that there's a floor to its application.
As another commenter mentioned, even for those situations where the exit tax actually is burdensome, just as with inheritance tax, there are two really simple solutions: first, create a floor for the minimum valuation by which the exit tax is actually assessed, and second, allow you to "sell" shares to the German government as a means of paying the tax, turning the Finanzamt into a silent shareholder in the company. I think both of these would be substantial improvements to both the German exit tax and inheritance tax.