
I’m working on something similar for Linux. Would love to chat if this is interesting to you.

The idea is to bring the UX of OSX Snow Leopard back, adjusted for today's possibilities (better developer experience, AI, etc.). I'm developing a DE, a SwiftUI/AppKit equivalent, and a bunch of reference apps whose quality I personally miss on Linux (e.g. Raycast/Spotlight, Mail).


Definitely let's talk. You'll find us on GitHub Discussions and Libera Chat.


> adjusted for today’s possibilities

You would want to adjust it for today's display and input technologies. A high-resolution OLED display deserves a different UI design than a 6-bit low-contrast TN LCD did.


I agree that displays and input have changed. But if you think in fundamentals, like clarity, readability, and affordances, you tend to arrive at the right answers anyway.

Those principles survived CRTs, TN panels, Retina, touch, trackpads. They’re not tied to a specific technology.

Can you give me an example of a change in today's UI that was motivated by a change in display quality?


> Can you give me an example of a change in today's UI that was motivated by a change in display quality?

There are a lot of places where I now see a miniature thumbnail preview of a file's contents, where in the 1990s you would only have seen an icon corresponding to the file type. Those previews are enabled partly by faster IO and processors making the preview rendering cheap, but also by higher resolution displays making the previews a lot more useful than they could have been at 32 pixels or smaller.

While it's not exactly a quality change as the driving force, the proliferation of dark mode UIs is a result of OLED displays that draw meaningfully less power with darker content, so pushing users toward darker UIs helps battery life. And it looks much better on a display with decent black levels than it would on a crappy LCD that washes out all the dark colors.


> Those previews are enabled partly by faster IO and processors making the preview rendering cheap

That's not really it. You could cache the thumbnails so it was never expensive to do.

If anything, standards for performance were lower then (no 120 Hz displays), so people felt free to do more expensive things at the time.


> Can you give me an example of a change in today's UI that was motivated by a change in display quality?

The extremely heavy "pinstripe" Aqua UI existed because displays were so low contrast at the time that it didn't look nearly as heavy. On a much higher contrast display that actually renders blacks properly, it'd look more like visual noise.


Is there a link for this project? It sounds exciting!


Thank you. Not all parts are open-sourced yet. I published the first repo yesterday: https://github.com/cihantas/applib

Going to launch a few apps powered by AppLib in the next few weeks and then continue with the DE.



Can you share a bit more on the small LLMs you've trained? I'm interested in the applicability of current consumer hardware for local training and finetuning.


I'm not the AI expert in the company, but one of my colleagues creates image segmentation models for our specific use case. I've been able to run the PyTorch training code on my computer without any issues. These are smaller models that are destined to run on Jetson boards, so they're limited compared to larger LLMs.

edit: just to be clear, I can't train anything competitive with even the smallest LLMs.
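
For a rough sense of scale, here's a minimal sketch of the kind of small segmentation training loop a single consumer GPU handles comfortably. The model, data, and hyperparameters are made up for illustration; it's not our actual code.

    import torch
    import torch.nn as nn

    # Toy fully-convolutional segmentation net -- a handful of conv layers,
    # nowhere near LLM scale, which is why consumer hardware copes easily.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 2, 1),            # 2 classes: background / object
    )

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Random tensors stand in for a real image/mask dataset.
    images = torch.rand(8, 3, 256, 256, device=device)
    masks = torch.randint(0, 2, (8, 256, 256), device=device)

    for step in range(100):
        opt.zero_grad()
        logits = model(images)          # (N, 2, H, W)
        loss = loss_fn(logits, masks)
        loss.backward()
        opt.step()

Deployment to the Jetson (e.g. via an ONNX or TensorRT export) is a separate step and not shown here.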


Are consumer cards also benefiting from the improvements, or only datacenter ones?


For me personally, Blender uses ROCm and ROCm HIPRT for Cycles rendering and ray-tracing acceleration. I am using a Radeon RX 6800 and an AMD Ryzen AI 7 350, and it works on Linux. ROCm HIPRT really does speed up Cycles rendering (at the cost of more VRAM usage). Of course, using the AMD driver on Linux, you get access to the system's RAM via GTT.
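
In case it helps anyone, device selection for Cycles is scriptable from Blender's Python console. This is only a sketch from memory, so double-check the property names against your Blender version; the HIP RT toggle itself is a separate checkbox in Preferences > System whose Python name I haven't verified.

    import bpy

    # The Cycles add-on preferences hold the compute backend setting.
    cprefs = bpy.context.preferences.addons["cycles"].preferences
    cprefs.compute_device_type = "HIP"   # AMD GPUs via ROCm
    cprefs.get_devices()                 # refresh the detected device list

    # Enable every detected HIP device (the RX 6800 shows up here).
    for dev in cprefs.devices:
        dev.use = (dev.type == "HIP")

    # Render on the GPU instead of the CPU.
    bpy.context.scene.cycles.device = "GPU"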


I've shared this example in another thread, but it fits here too. A few weeks ago, I talked to a small business owner who found out that Google's AI is telling users his company is a scam, based on totally unrelated information where a different, similarly named brand is mentioned.

We actually win customers whose primary goal is getting AI to stop badmouthing them.


A desktop environment for Linux, visually inspired by OSX Snow Leopard with a contemporary touch. It comes with a compositor, apps like a dock, finder, and status bar, and a UI framework like AppKit. Scratching my own itch and would love to see if it can gain traction. Still in the early innings, though.


What framework/tech stack are you using? I'd love to see something built on top of GNUstep, which is close to what OSX was originally based on. (I don't know how much of that is still found in Snow Leopard.)


The goal is not to use a similar tech stack (e.g. GNUstep). Instead, I'm focusing more on the outcome: a desktop environment with a similar degree of polish and functionality, without the need for third-party tools.

To stay competitive and iterate fast, I'm adding a high-level JS/CSS API on top of Wayland; think AppKit + SwiftUI. If you look over my shoulder it might look like I'm making a webapp, but on a custom browser.


We (Geostar.ai) work with many brands and companies that have experienced near-death situations caused by Google's AI Overviews. The negative impact this feature has had on people's livelihoods is heartbreaking to witness.

Just today, I met with a small business owner who showed me that AIO is warning users that his business is a scam, based on bogus evidence (some unrelated brands). It's a new level of bullshit. There's not much these businesses can do other than playing the new GEO game if they want to get traffic from Google.

Who knows if Google will even present any search results other than AIO a few years from now.


A desktop environment for Linux, inspired by macOS. It comes with a compositor, apps like a dock, finder, and status bar, and a UI framework like AppKit.


I'd be interested to learn more about how and why this was coded in Common Lisp, in particular what value it provides for this specific problem compared to other languages.


The ability to write at a very high level (and macros), incredibly fast code when compiled, the ability to use the REPL on a live server for diagnosing and patching functions, and language design choices that work extremely well for parallel recursive hierarchical inference.


Please take into account that Bluesky is still young and run by a small team which is moving fast. Ignoring a few reports that they consider non-critical is not a strong negative signal, especially not during a time of rapid growth and while they're in beta and invite-only. They just cracked a million users and are probably navigating through a lot of chaos on a daily basis. Those of us who've been in similar situations know that this is normal.


> Slack uses PHP for most of its server-side application logic […].

Slack started migrating to Hacklang in 2016 [1].

[1] https://slack.engineering/hakana-taking-hack-seriously/ "We started migrating to a different language called Hack in 2016."


Isn't Hack basically compiled PHP so that it is faster for FB's use case? I understand it's not technically PHP, but I imagine it is effectively still PHP, or is it more like what C++ is to C?

I understand PHP has been fast at adding any actual language features that Hack has over it?


At one point it was, but since then it's become its own language with a completely new (JIT-based) implementation.

