
IBM ostensibly failing with Watson (before Krishna was CEO for what it's worth) doesn't inherently invalidate his assessment here


It makes it suspect when combined with the obvious incentive to make the fact that IBM is basically non-existent in the AI space look like an intentional, sagacious choice to investors. It may very well be, but CEOs are fantastically unreliable narrators.


You expect somebody to be heavily invested currently and also completely openly pessimistic about it?


No, I don’t trust a word Sundar or Satya say about AI either. CEOs should be hyping anything they’re invested in; it’s literally their job. But convincing investors that everything they don’t invest in heavily is worthless garbage is effectively part of their job too.

What is more convincing is when someone invests heavily (and is involved heavily) and then decides to stop sending good money after bad (in their estimation). Not that they’re automatically right, but it’s at least worth paying attention to their rationales. You learn very little about the real world by listening to the most motivated reasoner’s nearly fact-free bloviation.


> What is more convincing is when someone invests heavily (and is involved heavily) and then decides to stop sending good money after bad (in their estimation).

But Watson doesn't count?


Commercially, Watson was a joke from beginning to end. If their argument is that Watson’s failure indicates that a machine that can at the very least convincingly lie to you will definitely fail to make money, that’s an insipid argument.


Yeah I was going to say the same thing ha. I get what they’re (the commenter) saying, but one could also argue IBM is putting their money where their mouth is by not investing.


I suspect the reality is that they missed the boat, as they have missed tens of other boats since the mainframe market dried up. I guess you could argue they came to the boat too early with their pants on backwards (i.e. Watson), and then left before it showed up. But it’s hard to tell from the outside.

Maybe that will turn out to be a good decision and Microsoft/Google/etc. will be crushed under the weight of hundreds of billions of dollars in write-offs in a few years. But that doesn’t mean they did it intentionally, or for the right reasons.


Totally. Just sort of mulling over possibilities. But I think you’re right; it’s likely they just got it wrong


IBM is probably involved somewhere in the majority of things you interact with day to day


Yep. When a brand has tarnished itself enough, it makes sense for the brand to step back. Nowadays, we interact with their more popular properties, such as Red Hat.


GrapheneOS is moving their servers out of France if you weren't aware


They're terrorists for not having uBlock Origin instead


OBS smashing XSplit comes to mind


I do still write assembly sometimes, and it's a valued skill because it'll always be important and not everyone can do it. Compilers haven't obsoleted writing assembly by hand for some use cases, and LLMs will never obsolete actually writing code either. I would be incredibly cautious about putting all your eggs in the AI basket before you atrophy a skill that fewer and fewer people have


I'm pretty sure people will tell you that Brexit was bad for a laundry list of other reasons too, ones that very much do directly affect the average person


As a casual observer living in the UK, what Brexit has done is stopped the influx of highly educated and economically contributing people from the EU, and instead replaced them with people who are claiming "asylum" from Asian and African countries

downvotes ahoy


As a long-term Brit I kind of get that impression too, although there has been a lot of regular immigration as well. I bet the Brexit voters, who tended not to be keen on immigration, have been pleased with that.

Also, a lot of regular Brits have moved abroad. Dyson, who famously advocated for Brexit to help Britain, moved to Singapore; my friends have moved to France, Portugal, Spain and Dubai.


Downvotes because while you're right that it has reduced immigration from the EU, the vast majority of post-Brexit migration to the UK has not been asylum seekers, and most asylum seekers have not been Asian or African.


Had to look it up, but I found India and Nigeria specifically as countries of origin for work-related migration


Economic migration is very different from asylum seekers, as the person above claimed.

The vast majority of people arriving from Nigeria and India do so on visas, and would have near zero chance of getting asylum claims approved.


Maybe this is a new low for the more regular consumer facing stuff, but this is hardly new for Apple. The $1000 wheels come to mind.



Apple were also widely ridiculed for the iPad (just a big iPhone), AirPods (everybody who wears them looks goofy), and Apple Watch (ugly square).


Screenshot this when the iPhone Pocket is the hot new product everyone must buy, but somehow I don't think these are even remotely in the same category. I don't think Ballmer laughing at the iPhone's price is in the same category as this or the wheels, somehow. Maybe I'm just not enough of a thought leader.


I’m not saying it’s going to be a hot new product that everyone must buy, I’m pointing out that “Apple product ridiculed online” is a completely meaningless non-event that it makes no sense to report on. It’s going to happen for excellent, incredibly successful products; it’s going to happen for bad products; and it’s going to happen to all the products in-between.


The Apple Watch also got the criticism that no one who isn't a boomer wears watches any longer.


Yeah this is out of touch. They're definitely still popular, and besides, older generations exist and their product preferences are valid


Except for all the young, fit people who want to track their workouts and health. Maybe the time of watches that just tell time has passed - I would argue against even that, given the continuing existence of Swiss luxury brands - but the watch as a small health monitor is still in full force.


To be honest I usually wear a cheap Timex even though I have an Apple Watch, because charging is a task. I wear the Apple Watch for hiking; I care less about regular fitness tracking.


People who care about battery life probably use something more sporty like the new Suuntos (I get about 2+ weeks of battery life). However, the smart watch health features are nowhere near as good.


You're kidding, right? It's ubiquitous. I see it everywhere. It's almost unusual to see someone wearing a watch that isn't one.


The anticheats themselves typically do support Linux; it's the devs that choose not to use them


Well, EAC for example is user-space only on Linux because it has to be, which some games decide is not an acceptable level of security.


Those are generally not the same anticheats with the same levels of functionality. As an analogy, it's like saying Excel supports iPad. Or a gaming example that used to be way more common: Tony Hawk Pro Skater 2 is supported on Game Boy Advance.

It's a game and it is Tony Hawk, but it's not really comparable to Tony Hawk on PS1.


The Year of the Linux Desktop is kind of happening. Not at the scale that the meme implies, but I've never seen anywhere near as much adoption of the Linux desktop as this year. Between Valve's efforts, more usage of Linux gaming handhelds, distributions like Bazzite that have strong selling points for Windows gamers, and Microsoft pissing everyone off with everything that is Windows 11, the Linux desktop has some legitimate momentum for once


Especially considering how much software on Windows these days is Electron/web-based. So it's not as hard a switch as it once was.

I switched from Windows to Linux two years ago. One of the few things I missed from Windows was the native WhatsApp app, as WhatsApp Web is horrible. Then a few months ago Meta killed the native app and made it into a webview app :)


It only takes one application to force you back to using Windows.

e.g. HellDivers 2 didn't work well on Linux until recently. If you are playing against certain factions it is a very fast-paced game, and I would frequently experience slowdowns on Linux.

So if I wanted to play HellDivers 2, I would have to reboot into Windows. Since running kernel 6.16 and updates to Proton, it now runs better.


And I can just take about any Linux distro, install it on about any computer, and have an extremely nice device to work, play games, and handle almost any daily task with. I call that a huge success.


Yet still, about a quarter of the time that my ThinkPad with Linux wakes with a Thunderbolt display connected, it dies with a kernel panic deep in the code that handles DDC (no matter what kernel version).

And the latest-gen fingerprint scanner only works between 10-50% of the time depending on the day, humidity, etc., no matter how often you re-enroll a fingerprint, enroll a fingerprint multiple times, etc.

And the battery drains in 3-4 hours. Unless you let powertop enable all USB/Bluetooth autosuspend, etc. But then you have to write your own udev rules to disable autosuspend when connected to power, because otherwise there is a large wakeup latency when you use your Bluetooth trackball again after not touching it for one or two seconds.
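(For anyone hitting the same thing, the rule pair looks roughly like this - a sketch only, not gospel: the rules filename and the device path 1-4 are made up for illustration, find your receiver's actual path under /sys/bus/usb/devices:)

    # /etc/udev/rules.d/99-ac-autosuspend.rules (illustrative name and device path)
    # AC plugged in: keep the Bluetooth controller's USB port fully awake
    SUBSYSTEM=="power_supply", ACTION=="change", ATTR{online}=="1", RUN+="/bin/sh -c 'echo on > /sys/bus/usb/devices/1-4/power/control'"
    # Back on battery: allow autosuspend again to save power
    SUBSYSTEM=="power_supply", ACTION=="change", ATTR{online}=="0", RUN+="/bin/sh -c 'echo auto > /sys/bus/usb/devices/1-4/power/control'"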

And if you use GNOME (yes, I know, use KDE or whatever), you have to use extensions to get system tray icons back. But since the last few releases some icons (e.g. Dropbox) randomly don't work when you click on them.

And there are connectivity issues with Bluetooth headphones all the time, plus no effortless switching between devices. (In any larger video/audio meeting, you can always spot the Linux user, because they will need five minutes to get working audio.)

As long as desktop/laptop Linux is still death by a thousand paper cuts, Linux on the desktop is not going to happen.


I have had worse experiences on each and every count with various Windows installs on various laptops, and yet it is the "de facto" desktop OS.


That is simply not true. I have tried to get so many people on Linux, just for it to fail when they try to do something simple, enough times in a row for them to want to go back to Windows.

I really wish it was seamless and good, but it just isn't (and frankly it's a bit embarrassing it isn't, given that desktop environments for GNU/Linux have been in development for 20+ years).


I'm not saying it's seamless and good. I'm saying that I have had windows fail in similar or worse ways.

For example, the laptop I had from my previous employer (a pretty beefy Dell) was failing to go to sleep; I had to unplug the charger and the HDMI cable on my desk each night, otherwise every second night it kept my monitor lit on the lock screen. When low on battery it clocked the CPU down so much that the whole system froze to a grinding halt - not even the mouse pointer was moving - and even after putting it back on the charger it remained similarly unusable for a good 10 mins.

Like I have been using Linux since the Xorg config days when you could easily get a black screen if you misconfigured something, but at least those issues are deterministic and once you get to a working state, it usually stays there. Also, Linux has made very good progress in the last decade and it has hands down the best hardware support nowadays (makes sense given that the vast vast majority of servers run Linux, so hardware companies employ a bunch of kernel devs to make their hardware decently supported).


> Yet still, about a quarter of the time that my ThinkPad with Linux wakes with a Thunderbolt display connected, it dies with a kernel panic deep in the code that handles DDC (no matter what kernel version).

This doesn't happen on my ThinkPad but does on my MacBook. If anyone else faces these kernel panics on their Mac, you have to set your monitor to a hard 120Hz rather than a variable rate in the macOS display settings. KDE handles the variable rate just fine on the ThinkPad for me.


I had so many more issues running Windows over the years than Linux. BSODs were a common occurrence, and yearly fresh installs were a thing to keep my computer usable.

I moved to Mint almost 4 years ago at this point, running it on a now fairly old Dell G5 from 2019. Runs as smoothly as ever.

I had one problem during this 4-year run (a botched update, and the OS wouldn't start). Logging into a terminal and getting Timeshift to go back to before the update did the trick. Quick and painless. I could even run all the updates (just had to be careful to apply one of them after a reboot).
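For reference, the whole rollback from the terminal was about this much typing (assuming the Timeshift CLI is available; the snapshot name below is just an illustrative example):

    # list available snapshots, then restore the one taken before the update
    sudo timeshift --list
    sudo timeshift --restore --snapshot '2022-03-01_10-00-01'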

I have no idea what you are talking about. Maybe I am just very lucky with Linux.


It's the same in every discussion about OS vs OS. People who like one OS will claim that the other OS is full of problems, and vice versa. In some cases I guess people are just lucky/unlucky. Personally, I've been using both in parallel for about 15 years, and while I've never had any issues with Windows (no BSODs), Linux constantly gives me problems. But I'm a developer and much prefer to develop on Linux, so I stick with it.


Though I think that is not warranted with respect to my original comment. I have used Linux in some shape or form for 31 years now (yikes), I would love Linux to win, and I have used Linux on a wide variety of hardware (the last few laptops have been ThinkPads).

I think desktop Linux will not improve until people start acknowledging the issues and work on it. It's the same as the claim that Linux is very secure (which Linux fans will often repeat), while it has virtually no layered security, and a fairly large part of the community is actively hostile towards such improvements (e.g. fully verified boot).


I think people tend to have double standards when it comes to Linux. People who run Linux generally choose to run Linux intentionally and are for that reason more willing to accept/overlook issues.

I have both Linux machines and Macs, and Linux has always been objectively worse when it comes to driver and software issues. It just has a large number of paper cuts.


I think people tend to have double standards when it comes to macOS. People who run macOS generally choose to run macOS intentionally and are for that reason more willing to accept/overlook issues.

I use both Linux machines and Macs (at work), and Macs have always been objectively worse when it comes to usability and development. They just have a large number of paper cuts.


The odds of having just about any Linux distro work "out of the box" without manual tweaking on just about any computer are still pretty low I'm afraid (by "work" I mean "support all of the functionality"). For instance, the laptop I'm writing this on connects without problems to a Bluetooth mouse, but won't for the life of me work with my Bluetooth headphones.


> The odds of having just about any Linux distro work "out of the box" without manual tweaking on just about any computer

Well, show me that magic OS that works on "just about any computer", because I am sure Windows ain't that. OSX only works on their select devices, and Windows has its own ways of sucking. Let's be honest, there is shitty hardware out there and nothing will work decently on top of it. People just try to save these machines by putting Linux on top, and then the software gets the blame.


As long as it isn't a gamer laptop.


Not really, because Proton is Win32, kind of.


Half of the applications people use on Windows are just browsers in a native frame, at this point Win32 is just one of the many "stacks" that you can run on Linux.


It really isn't. This is a temporary sugar rush that comes pretty much every time Microsoft does something awful. After a while the buzz will fizzle out and the majority of those PC gamers who looked at switching will go back to Windows.

IME a lot of developers don't even use Linux on their desktop machine. I've met three developers who use Linux professionally IRL. A lot of devs have a hard time even using git bash on Windows.

I am always called up by people at work because I am "the Linux guy" when they have a problem with Linux or Bash.

Sure, there are a lot of people that use Linux indirectly e.g. deploy to a Linux box, use Docker or a VM. But if someone isn't running Windows, 9 times out of 10 they are running a Mac.

More generally, the thing that has paid the bills for me is always these huge proprietary tech stacks I've had to deal with. Whether it be Microsoft's old ASP.NET tech stack with SQL Server, AWS, Azure, or GCP, what pays the bills is proprietary shite. I hate working with this stuff, but that's what you gotta do to pay the bills.


I mean, this strongly has to depend on what kind of software you are developing. I don't know a single developer who primarily uses Windows. Literally everyone around me uses Linux for development work (and a large portion of them also use Linux for their personal machines).


Of course. However, if a developer isn't using Windows, typically they are using a Mac.

In corpo-world, everyone is using Windows. If they are using Linux it would be through a VM or WSL. I guarantee none of those people are using Linux at home.

So for every developer you know that is using Linux, there are many more people using Windows supplied by their IT department.


> In corpo-world, everyone is using Windows. If they are using Linux it would be through a VM or WSL. I guarantee none of those people are using Linux at home.

And I guarantee that you're wrong, because I work a corporate job where I have to put up with Windows and am 99% Linux at home. (The other 1% is *BSD and illumos.)


You are in the minority, but you can believe whatever you like.

The vast majority of developers I have worked with (and I've contracted at a lot of places) know next to nothing about Linux. They can barely use a terminal (PowerShell, CMD, Bash/Zsh) and often can't do anything outside of the IDE.

If they do use Linux, it'll be on a Raspberry Pi that gets stuck in a drawer after a few months.

To those that keep voting me down on this: the teams and environments you work in are the outliers. I've had to accept that I am in the minority as a Linux user, even amongst software professionals.


Yeah, I'm probably in the minority. That doesn't mean that nobody uses Linux, just that it's less common.


I never said that nobody uses Linux. I said that it was extremely uncommon even amongst developers.


> I guarantee none of those people are using Linux at home.

[...]

> I never said that nobody uses Linux.

I'm willing to believe that this is just a misunderstanding resulting from nonliteral exaggerated language for effect, but ... yes, you did.


>Sure, there are a lot of people that use Linux indirectly e.g. deploy to a Linux box, use Docker or a VM. But if someone isn't running Windows, 9 times out of 10 they are running a Mac.

That was my original comment. Following that statement, it is pretty easy to assume that when someone says "none" in a subsequent comment they mean "almost none".


> This is a temporary sugar rush that comes after pretty much every time Microsoft does something awful.

I think what it fundamentally comes down to is that for consumer-oriented Linux to see widespread adoption, it needs to succeed on its own merits. Right now, and since forever, Linux occupies a space where the majority of consumers who consider it think "I might use it, because at least it's not the other guy". A real contender would instead make the general public think "I'll use this because it's genuinely great and a pleasure to experience in its own right". And that's why I have absolutely zero faith in Linux becoming a viable smartphone ecosystem. If it were truly viable, it would have been built out already regardless of what Android was doing. "Sheltering Android refugees" is not a sustainable path to growth any more than "sheltering Windows refugees" is.


I agree, with a caveat. The vast majority of consumers don't even know Linux/BSD or any of the alternatives exist.

I have zero faith in a Linux smartphone. What will happen is that there will be some GNU/FSF thing with specs that are 15 years out of date, you will have to install Linux via a serial console using Trisquel, and the only application available will be Mahjong (yes, I am being hyperbolic).


Clearly hyperbole! We'll also have TuxPaint, SuperTuxKart (CPU rendering only, because the toolchain doesn't support Android's HAL), and a couple of (long-abandoned) LibreOffice forks that crudely adapt different subsets of the interface for a touch device.


Unfortunately in the past people have taken obvious hyperbole literally.

I realised a few years ago, when one of my friends didn't know what the browser on her phone was, that any notion of people caring about the OS outside of branding is pretty much non-existent.

