matxip's comments | Hacker News


Ok fine, unique except for maybe that one time.


This seems like it's implicitly painting a false dichotomy between an autonomous Spez who is causing damage and a subservient Spez who is just a name attached to board action. How about this: if Spez is being asked to do stupid things by the board, he's failing extraordinarily at even cushioning them with PR, which is the actual least he can do.


I built this and the other ROCm packages from the AUR (some removed now), along with PyTorch build-flagged for ROCm. It did eventually work, but it was a real pain in the ass and cost a whole lot of time. After that, though, something broke (who knows what) and I ended up rebuilding everything again just to get back to a working state. I'm currently too afraid to update to these community packages, partially because I don't know if the build flags I used are the duct tape keeping my builds functional (I have an RX 590 8GB, which is maybe at the edge of supported GPUs). Instead, I just added every ROCm package to my IgnorePkg list, which isn't a great sign.


I really did loathe the removal of SMS. It went like this for me: I would set up non-techie family members with Signal, and that would be both SMS and secure messaging for them. Then Signal removed SMS, so I had to explain to them that I had actually set them up with something "unstable" and needed to change their apps around. As a consequence, they're more hesitant to try things I suggest, and I can't blame them. As for the actual messaging, inevitably they'll forget to juggle apps and just default to SMS, and there's only so much I can do to go against the stream here.

This is cool, but I think being the default in the mainline app is critical. Also, IIRC Signal doesn't like modified apps, so this might be on shaky ground.


The key here is that this isn't really a "third party app," per se. It's all their code. All I did was replace this:

  return getBoolean(CLIENT_DEPRECATED, false);
with this:

  return false;
I also updated a dependency on libsignal.


A related video on the topic that I found pretty interesting:

https://www.youtube.com/watch?v=iWlqxGQXZx8

It goes over an older ketchup recipe and gives some related history. If you like historical food recipes and/or culinary history, you might want to check the channel out.


His channel is one of my favorite food-related ones. So much interesting history.


At first, I was a little put off by how "batteries not included" the standard library was in rust, but I ended up preferring it. The quality of the rust community has generally made library selection pretty easy. There's usually a tacit first choice and often one or more optimized alternatives for a particular situation or preference. When there is community disagreement and library churn, I think it ends up evolving better, more liked solutions; I truly believe this wouldn't happen as easily if it were enshrined in the std. Case in point, I've seen standard library implementations in C++ and Python that just rot away.

The case-sensitive visibility/privacy sounds terrible to me. I dislike having to type pub as much as I do, but that seems worse for readability and clarity, on top of throwing a wrench into Rust's established naming conventions.

About "Memory management", they already made some progress with non-lexical lifetimes in the past which did indeed improve the end user experience. I think "Polonius" aims to reduce the friction further, but I haven't looked into the specific improvements it might bring in user experience.


> When there is community disagreement and library churn, I think it ends up evolving better, more liked solutions;

That's an optimistic way of thinking about it. I feel hesitant to pull in libraries because you don't know when or why the community is going to decide to change. Then you're that fool using that old code, stuck with sudden tech debt.

It also makes stuff like Stack Overflow weird to use, where you get different answers depending on the year. I think the biggest problem is with libraries that try to make things more ergonomic. Learning about Pin and how to use it is weird; everyone keeps recommending some higher-level library.


Unfortunately, a standard library doesn't really reduce churn. It just means chunks of the standard library get deprecated, which can be confusing. For example, what's the common way to download a file from the web in Python?

Instead, I think every language should have a list of batteries-suggested libraries, with a roughly two-year turnover period. This way you get a community-suggested group of the common functionality needed to be productive. It won't change so frequently that you're constantly looking up function names, but it also won't get sclerotic, and it can move from old, grungy libraries to new libraries when necessary. Of course, you can always just stick with an old "edition" of the community libs if you've got a legacy app that you don't want to migrate.


The author mentions "Serialization / Deserialization: JSON" as something the standard library needs. Would Rust have gotten `serde` in this case? I guess we'll never know, but serde is seriously impressive on many levels.


Go included json in the standard library, so it can't ever change it.

It's wildly inefficient (mostly due to reflection), not type safe, and somewhat surprising. Guess the output: https://go.dev/play/p/lSyiaN3IYOR
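To give a flavor of the kind of surprise I mean (a minimal sketch of two well-known encoding/json behaviors, not necessarily what's in that playground link): unexported fields are silently dropped on marshal, and numbers decoded into interface{} always come back as float64.

  package main

  import (
      "encoding/json"
      "fmt"
  )

  type user struct {
      Name  string `json:"name"`
      email string // unexported: encoding/json silently ignores it
  }

  func main() {
      // Errors are ignored here only for brevity.
      out, _ := json.Marshal(user{Name: "gopher", email: "g@example.com"})
      fmt.Println(string(out)) // {"name":"gopher"} -- the email field is just gone

      // Numbers decoded into interface{} are always float64,
      // so large integers can silently lose precision.
      var v map[string]interface{}
      _ = json.Unmarshal([]byte(`{"id": 9007199254740993}`), &v)
      fmt.Printf("%T %v\n", v["id"], v["id"]) // float64 9.007199254740992e+15
  }

No compile error, no runtime error, just quietly different data than you put in.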

The author also mentioned "Images manipulation" as a thing that should exist in the standard library.

We actually know how that one went for Go, too: there's the stdlib image/draw (https://pkg.go.dev/image/draw), but these days it's generally recommended to use the non-stdlib golang.org/x/image/draw package instead (https://pkg.go.dev/golang.org/x/image/draw).

For backwards compatibility reasons, the stdlib couldn't change, and it's actually far more confusing as a result.
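To make the split concrete, here's a minimal sketch (as I understand the current APIs): the stdlib package only composites pixels 1:1, while the x package is a drop-in superset that adds scaling with selectable interpolation kernels.

  package main

  import (
      "image"

      xdraw "golang.org/x/image/draw" // drop-in superset of the stdlib image/draw
  )

  func main() {
      src := image.NewRGBA(image.Rect(0, 0, 400, 300))
      dst := image.NewRGBA(image.Rect(0, 0, 200, 150))

      // The stdlib package only offers Draw/DrawMask (1:1 compositing);
      // the x package adds Scale with kernels like NearestNeighbor,
      // BiLinear and CatmullRom.
      xdraw.CatmullRom.Scale(dst, dst.Bounds(), src, src.Bounds(), xdraw.Over, nil)
  }

So the "real" image library lives outside the stdlib, and the stdlib one mostly exists because it can't be removed.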

I take a much different lesson from Go's "large stdlib" experiment than the author of this post does.


> I take a much different lesson from Go's "large stdlib" experiment than the author of this post does.

Personally, I think that this is simply a problem of not being able to throw the entire thing away, learn from the mistakes, and make a better iteration.

I'm all for "batteries included" anything - be it a standard library for a programming language, an IDE with its own component libraries like Lazarus, a web development framework like Angular that's less fragmented than React or maybe even an operating system that has a lot of the software that you'll want to use preinstalled (or available on a DVD or a similar medium directly).

It's just that making sure that those batteries are good is impossible without iteration. Having to do backwards compatibility essentially makes that sort of iteration impossible. So what's the solution here?

Have a clearly defined end of life for the current version and a new major version in the future. Python 2 might have had a lot of warts, but Python 3 would clearly be better! One could say the same for Python 4, Python 5 and so on. The same goes for Go, Rust and any other high level language - just look at Java 8 and the newer iterations. You just need to learn to throw old things away instead of keeping them around in some half-alive state.

I say that as someone who's currently maintaining a Java 8 monolith that cannot be updated to anything more recent. If your software gets so large that it cannot be migrated/rewritten in a manageable way, you've made a mistake. If you cannot feasibly migrate it over, then consider whether you even need it in the first place (e.g. build systems that you do less).

There are sub-optimal cases where that simply isn't possible: the Linux kernel comes to mind, as does a lot of other large legacy software that's slowly trudging along on the backs of tens of thousands of developer years. But somehow you don't see many new/existing systems using FORTRAN or COBOL and that's all for the better.


The lesson that I take is that all non-essential packages should instead be in an official, separately versioned, stdlib.

Go moved "image/draw" into "golang.org/x/image/draw". This was a good solution because it meant you could update the compiler without having to worry about also updating your image drawing stuff and that possibly breaking.

I wish they did that with http, tls, and all the other stuff that isn't really part of the language itself.

As it is, I usually have to wait months to get compiler benefits (like being able to use generics, or smaller binary sizes, or fixes to bugs) because the compiler update also comes with breaking changes to the http package (like the http2 change), or tls, or whatever.

My lesson is that there should be three tiers of libraries:

1. Compiler-versioned libraries. Things like io, syscalls, reflection, and anything whose implementation depends heavily on compiler internals. Breaking changes are never made without extreme circumstances.

2. Official, separately versioned, standard libraries. Things like golang.org/x/ or the idea behind the rust-lang-nursery. This is for stuff like http, logging, tls, crypto algorithms, image manipulation, high-level filesystem APIs, and other protocols and abstractions. Using an old version of these libraries with a new compiler is supported forever, and the compiler will never change the tier 1 libraries in such a way that they could break any old version of tier 2. You should update these whenever you can, but they may make breaking changes, and may be versioned as such.

3. The rest of the packages, the stuff the community writes. These should prefer to use semver, and should prefer to use libraries from tiers 1 and 2, but no promises are made.

I think having these three tiers is ideal, especially because it makes it easier to update the compiler (it will never break you) and ensures perpetual support for older versions of official non-stdlib-but-still-core libraries (they may make major releases that break, but the old one builds and works forever).
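FWIW, the golang.org/x/ repos already behave roughly like tier 2 today: they're pinned in go.mod independently of the toolchain, which is exactly the property I want for http and tls as well. A rough sketch (module path and version are placeholders, not a real project):

  // Sketch go.mod; module path and version are hypothetical.
  module example.com/app

  go 1.18

  // The x/ library is pinned here, independent of which Go toolchain builds it.
  require golang.org/x/image v0.5.0 // placeholder pin; use whatever revision you've vetted

Updating the toolchain leaves that require line alone; updating the library is a separate, deliberate go get.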

This tiered setup is already pretty close to how C works, actually, and it's great. Tier 1 is the actual language spec itself, linking libraries, etc. Tier 2 is libc. Tier 3 is the rest of the world. I really don't have to worry about my pthread_create call into glibc breaking when I update gcc, because that separation is well maintained and backwards compatibility is assured.

Python and Go are both a clear step back, where I update the compiler to get some feature and then my http server stops working, even though I had no intention of updating it.


> The lesson that I take is that all non-essential packages should instead be in an official, separately versioned, stdlib.

Hmm, honestly, this is a really great take. A thin OS base, with the "batteries" being optional. Or at least being able to use specific versions, bits and pieces from various points in time as needed.

Of course, this could turn into an Eldritch mess, but then again, it might also lead to working software over not being able to update the stuff you want to update because the rest would break otherwise.

Overall, there's probably a lot of consideration to be made about the details, but that seems workable. Stability might still sometimes be an issue (e.g. should the language internals change, a la Python 2 to 3), though I'm not sure whether that can even be addressed all that well, at least in the higher-abstraction-level languages (with separate runtimes like the CLR, JVM, etc.).


> At first, I was a little put off by how "batteries not included" the standard library was in rust, but I ended up preferring it. The quality of the rust community has generally made library selection pretty easy.

I still think it would be good to have a Rust installer where you get a set of "almost standard" libraries pre-installed, so that in companies where you have to get everything approved, you could get this one package approved.


I think it has plenty of batteries for building things that can be used to build things. Very good abstractions for the most part, with mostly good implementations (i.e. cases like the parking_lot mutex show where it fails, and the `async` stuff is a bit sketchy).

I don't understand and don't support initiatives that want to stuff it full of actual applications and such (like an http server or an async runtime). Just use, I don't know, something else at that point.


There is no right answer. Too much core functionality defined by the community can be confusing and lead to mixed quality. As a newbie, how do I even know what the "rust SDK" looks like and which parts are better than others?

A large official language SDK has its own downsides too, though. For my money, the .NET base class library is the best "batteries included" platform out there. It has certainly had its missteps over the years, though. How many projects that went all-in on the original ASP.NET (now called Web Forms) are now totally stranded on a technology island that the broad .NET community is racing away from? How about the graveyard of UI frameworks for .NET? WCF? Using the "official" solutions did not help guarantee any longevity for these technologies (though they are mostly still "supported").

It is really hard to say that one approach is superior. The Rust toolchain at least makes the distributed solution easier. For some stuff, at least. The lack of a baked-in library might make things like use in the Linux kernel harder in some ways.


Wow, WGSL looks different from the last time I checked it out (not too long ago, IIRC). I've been playing around with wgpu and Vulkan, but I disliked the WGSL syntax so much that I didn't bother pursuing it any further at the time. I'm actually going to give it another look now, since I don't really like GLSL syntax much at all either.


Just be prepared for the shaders to break with each Chrome release, e.g. the new attribute style broke all existing samples.


Yeah, wgpu apparently hasn't been updated for the new attributes yet. I looked it up, and they were just changed in the 2022-01-19 revision. Definitely a "working draft" as stated.


Unlike most others, they updated this one with care. I guess it's a somewhat notorious dev joke, so it makes sense.

https://knowyourmeme.com/photos/2246626-grand-theft-auto-the...


That's the front of the sign. The back of the sign is the real secret :) I'm trying to find out if they kept the back of the sign. See the Reddit link I posted.


The Reddit link isn’t particularly illuminating. Would you mind sharing what TFT means?


Let's pretend there was a secret video game cabal and they left their mark in many different places over the years. Of course, there is no cabal.


The reddit post you linked doesn't answer anything. What is TFT?


Let's pretend there was a secret video game cabal and they left their mark in many different places over the years. Of course, there is no cabal.


Trying to go down the rabbit hole; any suggested phrases to search for?


For those interested, this is a previously streamed speaking event in the CHAZ: https://www.youtube.com/watch?v=t7nDkd3V2V0


Comcast, not AT&T, but my mom recently discovered that her Turner Classic Movie channel was not longer activated. After talking to someone on the phone (who suggested a service technician visit), she independently discovered that TCM was being moved to a sports package along with a military history channel and a country music channel.

It was a watershed moment for her. Apparently, there are online services that provide TCM, and now she's considering downgrading her cable service rather than upgrading it, and getting an online service to make up the difference.

