jaywee's comments | Hacker News

> "we have to cater to what 96% of users know" Indeed. Just recently saw a talk "Are we stuck with the same Desktop UX forever? | Ubuntu Summit 25.10" (https://www.youtube.com/watch?v=1fZTOjd_bOQ) where Scott Jenson (Apple/Google UX) laments we stopped innovating on Desktop.

Such an appeal to conformity is indeed quite sad coming from GNOME.


Jonathan McDowell keeps track of all the Starlink satellite orbits, including failures:

https://planet4589.org/space/con/star/stats.html


Don't confuse launch _price_ with launch _cost_. It's been estimated that the internal F9 launch cost is around $15M-$20M.

The $5M figure is a marginal-cost target for a fully reusable Starship.


Quite a helpful infographic from ULA: https://blog.ulalaunch.com/hubfs/orbital%20debris.jpeg


Well, divide et impera. Fairly straightforward for AI inference (not training). The existing Starlink constellation:

3,491 V1 sats × 22.68 m² = 79,176 m²

5,856 V2-mini sats × 104.96 m² = 614,646 m²

Total: ~0.7 km² of PERC mono cells with 23% efficiency.

At around 313 W/m² (23% of the ~1361 W/m² solar constant) that's 217 MW. But the panels are in shade for half of each orbit, so only ~100 MW on average.

The planned Starship-launched V2 constellation (40,000 V3 sats at 256.94 m² each) comes out at ~10 km², or ~1.5 GW.

So it's not like these ideas are "out there".
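
For anyone who wants to reproduce the arithmetic, here is a quick sketch in Python (the panel areas and the 23% efficiency are the figures quoted above; 313 W/m² is simply 23% of the ~1361 W/m² solar constant; treat it as back-of-envelope, not a power budget):

    # Back-of-envelope check of the numbers above.
    W_PER_M2 = 0.23 * 1361                      # ~313 W electrical per m² in full sun

    existing = 3491 * 22.68 + 5856 * 104.96     # m² of panels, V1 + V2-mini
    planned  = 40_000 * 256.94                  # m² of panels, Starship-launched V3

    for label, area in [("existing", existing), ("planned", planned)]:
        avg_w = area * W_PER_M2 / 2             # ~half of each orbit is in eclipse
        print(f"{label}: {area / 1e6:.1f} km², ~{avg_w / 1e6:.0f} MW average")

This lands on ~0.7 km² / ~110 MW for the existing fleet and ~10.3 km² / ~1.6 GW for the planned one, i.e. the same ballpark as above.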


Take those 40,000 satellites, and combine their solar panels, and combine the cooling panels, and centralize all the compute.

Distance is not our friend in orbit. Efficiency hyperscales down for many things as distances and areas scale up.

Things that need to hyperscale when you scale distance and area:

• Structural strength.

• Power and means to maneuver, especially for any rotation.

• Risk variance, with components housed together, instead of independently.

• Active heat distribution. Distance is COMPOUNDING insulation. Long shallow heat gradients move heat very slowly. What good does scaling up radiative surface do, if you don't hyperscale heat redistribution?

And you can't hyperscale heat distribution in 2D. It requires 3D mass and volume.
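
A toy Fourier's-law sketch of that point, with all material numbers assumed purely to show the scaling: for a flat sheet, the in-plane conduction cross-section is side × thickness while the conduction path grows with the side, so the heat you can push out to the edges stays roughly constant no matter how large you make the radiator.

    # Toy model: square radiator of side s and thickness t, heat injected along the
    # centre line and conducted in-plane to the edges (Fourier: Q = k * A_cross * dT / L).
    k_al = 205.0    # W/(m·K), aluminium (assumed material)
    t = 0.005       # m, panel thickness (assumed)
    dT = 50.0       # K, allowed centre-to-edge temperature drop (assumed)

    for s in (1.0, 10.0, 100.0):                 # panel side length in metres
        q_max = k_al * (s * t) * dT / (s / 2)    # W the sheet can carry outward
        print(f"s = {s:>5.0f} m: ~{q_max:.0f} W transportable, {s * s:.0f} m² of radiator to feed")

The transportable wattage stays flat while the area to feed grows with s², which is exactly the 2D-vs-3D point above.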

You can't just concatenate satellites and get bigger satellites with comparable characteristics.

Alternatives, such as distributing compute across the radiative surface, suffer from intra-compute latency and bandwidth constraints relative to regular data centers.

We have a huge, near-infinite-capacity cold sink in orbit, with structural support and position and orientation stabilization for free. Let's use that.


Ideally, in an organization this should be a centrally pushed group policy defining CIDRs.

Like, at home, I have 10/8 and public IPv6 addresses.
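
Just to illustrate the client side of such a policy (the policy format and the CIDR values here are hypothetical, picked only for the example):

    import ipaddress

    # Hypothetical centrally-pushed policy: CIDRs the org treats as internal.
    POLICY_CIDRS = [
        ipaddress.ip_network("10.0.0.0/8"),
        ipaddress.ip_network("2001:db8::/32"),   # placeholder IPv6 prefix
    ]

    def is_internal(addr: str) -> bool:
        ip = ipaddress.ip_address(addr)
        return any(ip in net for net in POLICY_CIDRS)

    print(is_internal("10.1.2.3"))       # True
    print(is_internal("203.0.113.7"))    # False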


Yeah, I had to create a user on many language versions just to keep the old skin. I fail to see the point in wasting 2/3 of my screen real estate on whitespace.


The problem with unbounded page width is that the content often didn't work well that wide. I use a 1440p monitor, and on the old theme paragraphs were often a bit too wide to read comfortably. As a Wikipedia editor, I also very often found it a challenge when writing articles – it's difficult to make the placement of images, tables, and so forth look decent on viewports anywhere from ~400px to >2000px wide.

All that said, the new theme does allow you to return to wide mode: on any page on the site, there's an Appearance options panel in the rightmost column where you can switch from Standard to Wide.


I'm in the same boat. I think it's unfortunate that MediaWiki doesn't accept a theme header or cookie, which would be very easy to set with extensions. User sessions don't persist for very long and don't work in private browsing mode, while the querystring useskin= parameter is annoying to work with and isn't applied to hyperlinks.


People creating an account just to browse isn't great for the WMF either. Logged-in users don't get pages served from the Varnish cache and hit the MediaWiki servers directly. There's still the parser cache, but it's more load while logged in.


I mean, a header/cookie just for the skin would probably have the same issue: the cache would have to be split, or the (Varnish) cache would have to be skipped. Something people aren't going to be eager for if it's a large number of users.

Depending on where you are connecting from, this is also a latency increase, since you are no longer hitting geo-located servers.


We could go the Reddit route and have monobook.en.wikipedia.org


In theory user sessions are supposed to persist for a year, although it seems there have been issues recently.


Having a 60-day consultation with the Fish and Wildlife Service on whether a falling hot-stage ring (essentially a dumb piece of steel) poses a danger to fish is just silly.


I wonder if fishing boats need to have a 60-day consultation with the Fish and Wildlife Service before each fishing expedition.

It seems to me that fishing expeditions pose a significant threat to the welfare of fish.


Have multiple explicit ones. One liberal, one conservative, one progressive...


Not the first case: it was Kodak that built the first digital camera, after all.

