Yes, this is apparently the moment international law finally collapsed. After decades of regime change, covert action, targeted killings, proxy wars, and executive unilateralism carried out by every administration of both parties. What makes this case uniquely intolerable is not the conduct itself, but the failure to dress it up in the usual layers of euphemism and multilateral paperwork. The real violation, it seems, is skipping the rituals.
Right, because this time humans will heroically reject cheaper, faster, better substitutes out of pure reverence for "the human condition." Any day now. We’ve famously done that with factory goods, photography, recorded music, CGI, calculators, spellcheck, and literally every other technology that "could never replace" human craft.
The claim boils down to: AI won’t replace humans because humans like feeling special. A rigorous argument. History definitely supports the idea that sentiment reliably beats convenience, especially once the imitation is indistinguishable and half the price. But yes, this disruption will surely pause out of respect.
2011: Microsoft buys Skype for $8.5 billion and later rebrands Lync to Skype for Business, an incredible branding manoeuvre.
2012: Skype's peer-to-peer architecture and end-to-end encryption are removed.
2017: Microsoft launches Teams, competing with its own product after six years of driving Skype into the ground with encroaching advertising, feature removals, and neglect.
2025: Skype is shut down.
What was the point? When MS bought Skype, they already held a majority share of the IM market with MSN Messenger, which they also shut down. Between 2011 and 2025 they lost almost all of their consumer market share to WhatsApp and Discord.
This series of events baffles me to no end.
If I remember correctly, they had to buy Skype twice because they didn't get everything in the first transaction. Also, the purchase was supposedly backed by the CIA through one of their companies (I don't remember if it was Palantir, though) to remove the end-to-end encryption.
I think that was eBay. They bought Skype but not the P2P tech backing it; fucked the founders on earnouts; the founders refused to re-up the contract; eBay was incapable of replacing the tech; and the founders got a bunch of the business back. Presided over by Meg Whitman, who appears to be profoundly incompetent, understanding neither tech nor business.
Not quite: this predates .NET. They acquired Hotmail in 1997, while it was running on Solaris mail servers with Apache on FreeBSD for the web frontend. In a highly publicized move, Microsoft ventured to port it to Exchange and IIS on Windows NT. This dragged on for years, with MS claiming several times to have finished the transition, only to get egg on their face. Eventually, they got it running on Windows 2000 with a combination of their flagship products and Windows Services for UNIX (the WSL of those times).
It has since been rebranded as MSN Hotmail, Windows Live Hotmail, Hotmail, and Outlook, likely with some 365 thrown in.
Meanwhile, they have mismanaged their once great mail user agent Outlook Express, as well as their quite useful personal information manager Microsoft Outlook, to the point where their newest offering is absolutely unusable.
What's the point? Did someone in marketing just decide that "Active Directory" is too 90s and must be tossed out based on vibes? Entra ID sounds like something completely unrelated.
IIRC one reason was that Azure Active Directory bore little technical relation to Active Directory, and it was endlessly confusing to customers, especially as AAD evolved into an identity system and away from a directory.
A trivial example: AAD doesn't do LDAP, unlike regular AD, which was built on it. It's not surprising that some PM would keep "AD" in the name of AAD to make the transition to cloud seem less scary, but after a few years it's actively unhelpful, as the majority of customers have made the switch to cloud-based auth and identity.
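For a concrete feel of the difference, here's a rough sketch (all hostnames, the tenant ID, the client ID, and the credentials below are made-up placeholders): classic AD speaks LDAP, so you bind and search; Entra ID/AAD has no native LDAP endpoint, so you acquire an OAuth2 token and call Microsoft Graph over HTTPS instead.

    # Rough sketch only; server names, tenant ID, and credentials are placeholders.
    from ldap3 import Server, Connection

    # Classic AD: an LDAP bind and search against a domain controller.
    server = Server("ldap://dc01.corp.example.com")
    conn = Connection(server, user="svc_reader@corp.example.com",
                      password="...", auto_bind=True)
    conn.search("dc=corp,dc=example,dc=com", "(sAMAccountName=jdoe)",
                attributes=["displayName", "mail"])
    print(conn.entries)

    # Entra ID / AAD: no LDAP; get an OAuth2 token, then call Microsoft Graph.
    import msal
    import requests

    app = msal.ConfidentialClientApplication(
        client_id="00000000-0000-0000-0000-000000000000",  # placeholder app registration
        authority="https://login.microsoftonline.com/<tenant-id>",
        client_credential="...",
    )
    token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    resp = requests.get(
        "https://graph.microsoft.com/v1.0/users?$filter=startswith(displayName,'J')",
        headers={"Authorization": "Bearer " + token["access_token"]},
    )
    print(resp.json())

(Entra Domain Services can front AAD with an LDAP-compatible shim for legacy apps, but the native surface is token-based HTTP.)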
Actually "Azure Active Directory" was confusing in the first place, since Active Directory already existed for decades, and Azure Active Directory was not-really it's cloud-equivalent
Though some of the rest of Clippy's "friends" (Lynx the cat and Rover) came from Bob and were Bob's "friends" first. Windows 2000 added Microsoft Agent which shared some characters in 2D with Office 2000 and invented a few new 3D characters (which Office never used). Windows XP added "Search Assistant" using some of the Office characters (Rover being the default search character and Lynx still hanging around as an alternative).
The legacy of Bob lived on for a while. Also, Microsoft Agent was a lot of fun to play with as a kid in high school. I built some wild PowerPoint presentations scripting Agents from the Notes field.
[edit] Maduro remained under US federal indictment on narco-terrorism and related cocaine trafficking conspiracy charges throughout the Biden administration.
Venezuela has always been a minor player in the drug trade compared to other countries. The whole narco-terrorism thing has always been code for "he took back the oil and we don't like him".
Reuters calling the switch a "font" change instead of a typeface change is troubling, though consistent with a society that now casually refers to all pasta as "spaghetti". A typeface is the design; a font is its specific instance. This is basic knowledge, taught to children, houseplants, and most domesticated goats.
A simple correction would stop this spiral, but Reuters appears committed to forging a bold new era in which terminology is chosen at random, like drawing Scrabble tiles from a bag and declaring them journalism.
I'm a professional graphic designer; people in the industry use "font", "type", and "typeface" interchangeably. No one goes "Umm, actually…"
You should also tell that to whoever wrote CSS, because font-weight doesn't make sense if a font is already a specific weight. Words mean something specific until they don't; the meaning changes over time, and that's okay.
Originally, a font (also spelled fount, at least formerly) was a physical thing: a collection of metal slugs, each bearing the reversed shape of a letter or other symbol (a glyph, in typographical parlance). You would arrange these slugs in a wooden frame, apply a layer of ink to them, and press them against a sheet of paper.
The typeface dictated the shapes of those glyphs. So you could own a font of Caslon's English Roman typeface, for example. If you wanted to print text in different sizes, you would need multiple fonts. If you wanted to print in italic as well as roman (upright), you would need another font for that, too.
As there was a finite number of slugs available, what text you could print on a single sheet was also constrained to an extent by your font(s). Modern Welsh, for example, has no letter "k", yet mediaeval Welsh used it liberally. The change came when the Bible was first printed in Welsh: the only fonts available were made for English, and didn't have enough k's. So the publisher made the decision to use c for k, and an orthographical rule was born.
Digital typography, of course, has none of those constraints: digital text can be made larger or smaller, or heavier or lighter, or slanted or not, by directly manipulating the glyph shapes; and you're not going to run out of a particular letter.
So that raises the question: what is a font in digital terms?
There appear to be two schools of thought:
1. A font is a typeface at a particular size and in a particular weight etc. So Times New Roman is a typeface, but 12pt bold italic Times New Roman is a font. This attempts to draw parallels with the physical constraints of a moveable-type font.
2. A font is, as it always was, the instantiation of a typeface. In digital terms, this means a font file: a .ttf or .otf or whatever. This may seem like a meaningless distinction, but consider: you can get different qualities of font files for the same typeface. A professional, paid-for font will (or should, at least) offer better kerning and spacing rules, better glyph coverage, etc. And if you want your text italic or bold, or particularly small or particularly large (display text), your software can almost certainly just digitally transform the shapes in your free/cheap, all-purpose font. But you will get better results with a font that has been specifically designed to be small or italic or whatever: text used for small captions, for example, is more legible with a larger x-height and less variation in stroke width than that used for body text. Adobe offers 65 separate fonts for its Minion typeface, in different combinations of italic/roman, weight (regular/medium/semibold/bold), width (regular/condensed) and size (caption/body/subhead/display).
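To make sense (2) concrete, here's a minimal sketch using Pillow; the font file names are hypothetical stand-ins for whatever optical-size cuts a typeface actually ships. The .otf files are the fonts, and the caption and display cuts are different fonts of the same typeface.

    # Minimal sketch, assuming Pillow is installed and these (hypothetical)
    # optical-size cuts of a typeface exist as separate files on disk.
    from PIL import Image, ImageDraw, ImageFont

    caption = ImageFont.truetype("MinionPro-Capt.otf", 8)   # cut designed for small sizes
    display = ImageFont.truetype("MinionPro-Disp.otf", 72)  # cut designed for headlines

    img = Image.new("RGB", (900, 160), "white")
    draw = ImageDraw.Draw(img)
    draw.text((10, 10), "Set large from the display cut", font=display, fill="black")
    draw.text((10, 110), "Set small from the caption cut", font=caption, fill="black")
    img.save("optical-sizes.png")

You could load the display file at 8pt and it would render, but the caption cut's larger x-height and sturdier strokes are exactly what you pay for at that size, which is the whole reason Adobe can sell 65 files for one typeface.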
In my experience, "font" is the colloquial term referring to either. Programmers get to demand precision; for journalists it's a bit tougher. The de facto meaning of terms does, unfortunately, evolve in sometimes arbitrary ways. And it's tough to fight.
If all DoS documents are prepared with the same software or software suite (e.g. MS Office), isn't that a distinction without much of a difference? They've gone back to using TNR.ttf instead of Calibri.ttf (or whatever the files are actually called).
Yanis wasn't an aristocrat. However, these were all from upper-class or elite backgrounds too:
Che Guevara, Fidel Castro, Ho Chi Minh, Pol Pot, Mao Zedong, Josip Broz Tito.
There are tonnes of people like that in Marxism. The irony is that they claim to want to liberate the working class when they were never part of it. Guevara especially came from a wealthy background.
YV was born into a monied family and married into one. He also went to private school as a child. As far as I know he has spent most of his life in an ivory tower.
irony: the expression of one's meaning by using language that normally signifies the opposite
> they claim to want to liberate the working class when they were never part of it
that is not irony.
Do you believe only working class people can improve the situation for working class people? That seems counter-intuitive to me, as people outside the working class usually have more time and education to think about changes and advocate for them.
Oh it's irony alright. I've encountered many such folk. They claim to mean well but barely know how to deal with a real live proletarian when they encounter one.
What was it the Who once sang, "meet the new boss, same as the old boss"? I don't think he has ever been much out of privileged circles.
The way ordinary people's lives can improve is self-advocacy and self-determination. You're not going to find that at Cambridge University, where he teaches (one of the snobbiest and most class-ridden institutions in the UK, which often resembles Hogwarts more than a modern university), the World Economic Forum (which prefers closed meetings to public ones and is furtive about its aims), or anything like that.
In this article he is right to voice concern about Big Tech oligarchy. But his analysis is off and he is not aware of what it really means to millions of people.
He is a World Economic Forum stooge, mate. There are only two ways into that. You either pay hundreds of thousands to attend each day, or you get invited from the inside as he did.
The WEF includes the top 100 companies in the world, along with the leaders and opposition leaders of every country. He's part of it.
I'm with saubeidl—while I, too, am deeply suspicious of anyone who comes from that level of privilege, and I've heard some shady stuff about Varoufakis in the past, you are refusing to present a) any meaningful criticism of his actual arguments, or b) any specific causal link between aspects of his background and reasons to doubt his work.
You just keep pounding on the fact that he is from privilege, but that alone is not enough to be suspicious.
At the very best, you are arguing guilt by association.
Guilt by association? Yes, he's associated with the WEF, a group of elite politicians and wealthy tycoons who meet away from the prying eyes of the public to discuss policies, and then he tells us he's on our side.
They say you can know someone by the company they keep. The company he keeps are the Cambridge University students (who come disproportionately from private schools), mainstream media pundits and the billionaires of the WEF.
He doesn't just come from privilege... He never really left it.
But one can come from privilege, and choose to use that privilege to, effectively, infiltrate the ranks of the moneyed elite, seeking to make things better for the average person.
Some people have claimed that this is what he is doing. You have provided zero evidence that it is not. You merely keep repeating the same statements about his origins and his affiliation with the WEF.
I am not saying you are wrong. I am saying you have not supported your position in any meaningful way.
We have heard plenty from this particular messenger. At one stage he was never off the BBC. I always ask why someone is pushed on the public so hard, whether it is him or even someone like Jordan Peterson. I have the same basic issue with him as I do with the Guardian... He is completely out of touch with what is happening at ground level because of his current life and his origins.
How would we design a rigorous study that measures total cost of ownership when teams integrate AI assistance, including later maintenance and defect rates, rather than just initial output speed?
Your experience illustrates something that often gets lost in the autism-vs-not-autism debate: many people don’t fall into clean diagnostic categories. You’re describing a profile that mixes autism traits, trauma adaptations, ADHD features, and developmental history, and instead of neatly labeling you, the system failed you outright with a bipolar misdiagnosis. That alone shows how fragile clinical certainty really is.
I think the most important part of what you wrote is that you changed over time. Whether that improvement came from meditation, therapy, maturity, trauma processing, or simply growing into yourself, it challenges the idea that autism is a static essence. Development, coping skills, neurology, and environment interact in ways the current diagnostic boundaries don’t fully capture.
Where I push back slightly is on the conclusion that self-diagnosis can automatically fill the gaps. For some people it’s deeply accurate and validating, for others it may explain one part of their experience but obscure another. As you said, many people carry a “grab-bag” of traits, and a single label can illuminate or compress that complexity depending on how it’s used.
You’re right that the field has a painful history and uneven present. Misdiagnosis is real. Forced treatment is real. Diagnostic tools are blunt instruments for a very diverse human reality. Supporting research while staying critical of the system makes sense, not because autism isn’t real, but because the categories we have are still evolving. Your story is a perfect example of why humility in diagnosis matters, whether it’s done by a psychiatrist or by oneself.