I love weird jargon. There's plenty of it in the terms we normally use: like how, despite ROM being a type of RAM, everyone knows that when you say RAM you are excluding ROM.

One of my favorites is BLDC, which stands for BrushLess Direct Current, and is used to describe motors that are actually driven by an alternating current at a variable frequency. Not to be confused with VFD, which in this case stands for Variable Frequency Drive and describes how a BLDC actually functions. The terms are on rare occasions used interchangeably, but VFD is most often used for analog or open-loop controllers, while BLDC is used for VFDs that are digitally controlled. BLDCs are Brushless, not Brush Less, and AC, not DC. I like to think the acronym stands for Bitwise Logic Driven Current, which follows the acronym, correctly describes how it works, and differentiates it from non-BLDC VFDs. Also, while VFD can mean Variable Frequency Drive, it is also used to mean Vacuum Fluorescent Display, and for Lemony Snicket fans it can mean a whole lot more. (https://snicket.fandom.com/wiki/V.F.D._%28disambiguation%29)

Uncommon jargon is even better, especially when it comes out of marketing departments and describes something ordinary. I have a rice cooker that has terms like "micom" and "fuzzy logic" on the labels. They describe microcontrollers and variables, respectively, which are found in all but the simplest electromechanical appliances.

It's normal for companies to trademark their names for common technology, like Nvidia and AMD trademarking their implementations of variable refresh rate as G-Sync and FreeSync, respectively. Some companies are more aggressive with their trademarks than others, such as Intel trademarking their symmetric multithreading as Hyper-Threading, while AMD just calls it symmetric multithreading.

Cessna used to have a bunch of random trademarked jargon in their ads in the 70's, from trademarking flaps to back seats.



Your comment reminds me of a couple things. On the topic of RAM, I was recently researching the vintage IBM 650 computer (1953). You could get it with Random Access Memory (RAM), but this RAM was actually a hard disk. I guess IBM considered the disk to be more random access than the rotating drum that provided the 650's main storage. One strange thing is that the disk had three independent read/write arms. Unlike modern disks, the arm could only access one platter at a time and had to physically retract to move to a different platter. Having three independent arms let you overlap reads and seeks.

As far as "fuzzy logic", there's a whole lot of history behind how that ended up in your rice cooker. In the 1990s, fuzzy logic was a big academic research area that was supposed to revolutionize control systems. The basic idea was to extend Boolean logic to support in-between values, and there was a lot of mathematics behind this, with books, journals, conferences, and everything. Fuzzy logic didn't live up to the hype, and mostly disappeared. But fuzzy logic did end up being used in rice cookers, adjusting the temperature based on how cooked the rice was, instead of just being on or off.
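The "in-between values instead of just on or off" idea is easy to sketch. This is a toy fuzzy controller in the spirit of a rice cooker's temperature logic; the membership functions, rules, and temperatures are all invented for illustration, not from any actual firmware:

```python
# Toy fuzzy-logic heater controller. Fuzzify a temperature into degrees of
# "cold", "warm", and "hot" (values in [0, 1] rather than just true/false),
# then defuzzify to a continuous heater power. All numbers are illustrative.

def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_power(temp_c):
    # Fuzzify: how cold/warm/hot the pot is, as degrees of truth.
    cold = tri(temp_c, 20, 20, 70)    # fully "cold" at <= 20 C, fades by 70 C
    warm = tri(temp_c, 50, 75, 100)
    hot  = tri(temp_c, 90, 120, 120)
    # Rules: cold -> full power (1.0), warm -> half (0.5), hot -> off (0.0).
    # Defuzzify with a weighted average of the rule outputs.
    num = cold * 1.0 + warm * 0.5 + hot * 0.0
    den = cold + warm + hot
    return num / den if den else 0.0
```

Unlike a thermostat's hard threshold, the output power ramps smoothly as the rice heats up, which is the whole marketing pitch.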

[1] https://bitsavers.org/pdf/ibm/650/22-6270-1_RAM.pdf

[2] https://en.wikipedia.org/wiki/Fuzzy_logic


On the topic of RAM, I was recently researching the vintage IBM 650 computer (1953) ... but this RAM was actually a hard disk.

Actually it was a drum. No moving arms at all, but some tracks had multiple read/write heads to reduce access time.

Knuth's high school science fair project was to write an optimizing assembler for the IBM 650, called SOAP III.


Actually, no :-) The 650 had a drum for main storage. The "355 Random Access Memory" was something different, an optional hard drive peripheral, which IBM literally called "RAM" (see the link above). This became the RAMAC system.


Right, you could add a RAMAC disk, but that was a very expensive option. Many universities had IBM 650s, but usually without the RAMAC.


The 650 main memory was a drum; but what IBM called Random Access Memory (and RAM) for this machine was a hard drive. As described in the Manual of Operation linked above. Here are a few quotes:

"Records in the IBM Random Access Memory Unit are stored on the faces of magnetic disks."

"The stored data in the Random Access Memory Unit are read and written by access arms."

"The IBM 355 RAM units provide extremely large storage capacity for data... Up to four RAM units can be attached to the 650 to provide 24,000,000 digits of RAM storage."

The main memory on the other hand: "The 20,000 digits of storage, arranged as 2000 words of memory on the magnetic drum..."


The rice cooker audibly turns the heating element completely on or off with a relay or thermostat or similar. It does have some kind of PWM, but so does a purely electromechanical rice cooker. There are different settings for brown and white rice, as well as different keep-warm levels, so I presume it can electronically modify the set point for the temperature.
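That audible relay behavior can be sketched as plain bang-bang control with hysteresis and a mode-dependent set point. The temperatures, mode names, and hysteresis band below are made up for illustration:

```python
# Bang-bang (on/off) heater control with hysteresis -- the kind of behavior
# you can hear from a rice cooker's relay. Set points per mode and the
# hysteresis band are illustrative values, not from any real appliance.

SET_POINTS_C = {"white": 100, "brown": 104, "keep_warm": 72}
HYSTERESIS_C = 2  # dead band so the relay doesn't chatter near the set point

def relay_state(temp_c, mode, currently_on):
    """Return True if the heater relay should be on."""
    target = SET_POINTS_C[mode]
    if temp_c < target - HYSTERESIS_C:
        return True           # well below set point: heater on
    if temp_c > target + HYSTERESIS_C:
        return False          # well above set point: heater off
    return currently_on       # inside the band: hold current state
```

Electronically changing `SET_POINTS_C` per mode is all it takes to get separate white/brown/keep-warm behavior out of the same relay.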

I had looked up the history of fuzzy logic when I originally saw the term. Everything I came across was from the mid-60's during the minicomputer era (e.g. the DEC PDP and IBM System/3). I just did a Google ngram search for it, and it does look like it peaked highest in the mid-90's. (https://books.google.com/ngrams/graph?content=fuzzy+logic&ye...) It was especially redundant then, because by that point 32-bit floating-point processors were already extremely common. It made sense during the 60's to have two-state values in ladder logic control systems, because minicomputers were prohibitively expensive, but by the 90's control systems were already using personal computer components to emulate old relay logic, while newer control systems were being written in languages that are still common today.

You just sent me down a rabbit hole of mid-90's buzzwords and jargon, and I found something especially great: neuro-fuzzy logic (https://link.springer.com/article/10.1007/s44196-024-00709-z)

There's also quantum logic (https://en.wikipedia.org/wiki/Quantum_logic) and someone wrote a paper relating it with fuzzy logic: (https://link.springer.com/chapter/10.1007/978-3-540-93802-6_...)

Once academia gets involved, you get fractal (https://xkcd.com/1095/) sub-niches of sub-niches, and it goes further off the rails than marketing jargon.

I swear there's a computing buzzword cycle of embedded systems (e.g. fuzzy logic, smart, IoT, etc.), artificial intelligence (e.g. genetic algorithms, machine vision, neural networks, etc.), and quantum (just quantum, it's never been practical enough to have sub-niches). Right now we're on the trailing end of AI and working our way to quantum. There's a tech company called C3 that has changed its name from the original C3 Energy to C3 IoT and now C3 AI. When they change it to C3 Quantum, we'll know we're in the next phase of the buzzword cycle.


Right. For a few years, 3-phase motors used as controlled servos were called "brushless DC" below about 1 HP, and "variable frequency drive" above 1 HP. Now everybody admits they're really all 3-phase synchronous motors.


Can of worms. My understanding is that VFD refers to any control system capable of varying speed and torque, usually through varying the supply frequency and voltage to the coils of an asynchronous AC induction motor. However, it is important to note that a VFD can also control synchronous motors such as BLDC and Permanent Magnet Synchronous Motors (PMSM), although in practice the term is usually applied to control systems for high power industrial AC asynchronous induction motors. It would therefore be incorrect to state "they're really all 3-phase synchronous motors", although some VFD control systems could be seen to emulate synchronous motors with asynchronous motors.
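The "varying the supply frequency and voltage" part is, in its simplest form, constant volts-per-hertz scalar control: scale voltage with frequency to keep the stator flux roughly constant. A minimal sketch, assuming a hypothetical 400 V / 50 Hz motor rating:

```python
# Constant V/f (volts-per-hertz) scalar control, the simplest thing a VFD
# does for an induction motor. The 400 V / 50 Hz ratings are illustrative,
# not taken from any particular drive or motor.

RATED_V = 400.0
RATED_HZ = 50.0

def vf_setpoint(cmd_hz):
    """Return (voltage, frequency) for a commanded output frequency."""
    cmd_hz = max(0.0, min(cmd_hz, RATED_HZ))   # clamp to the rated range
    volts = RATED_V * (cmd_hz / RATED_HZ)      # hold V/f ratio constant
    return volts, cmd_hz
```

Real drives add refinements (low-speed voltage boost, field weakening above rated frequency, or full vector control), but the V-proportional-to-f relationship is the core of the "variable frequency" in VFD.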


I think it really boils down to being jargon from two different groups, so which term gets used isn't a matter of how the device works as much as it is what the person talking about it does for a living.


Pretty much.

At the windings, all motors are driven by some alternating waveform. A classic "DC motor" has a mechanical commutator which turns DC into square wave AC, with the phase leading the motor so the motor turns. Classic AC motors are driven from sinusoidal waveforms. There's a whole theory of DC motors, and an elegant theory of AC motors that goes back to Tesla. Here's the motor family tree.[1]

Then came power MOSFETs. Today you can make pretty much whatever waveforms you want. It took a while for motor designers to learn how to exploit that properly, and for MOSFETs to get small, cheap, and heat-tolerant. Then drone and electric vehicle motors got really good, at the cost of needing a CPU to manage the motor.
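The "whatever waveforms you want" trick is usually sinusoidal PWM: the controller computes a duty cycle for each switching period so that the averaged output of the MOSFET bridge traces a sine wave. A rough sketch (the period count and modulation depth are arbitrary choices, not from any real controller):

```python
# Sinusoidal PWM sketch: duty cycles for one electrical cycle, so that the
# filtered output of a switching bridge approximates a sine wave. The 48
# switching periods per cycle and 80% modulation depth are arbitrary.

import math

def spwm_duties(periods=48, modulation=0.8):
    """Duty cycles (0..1) tracing one sinusoidal electrical cycle."""
    return [0.5 + 0.5 * modulation * math.sin(2 * math.pi * k / periods)
            for k in range(periods)]
```

Sweep the cycle rate and you have variable frequency; scale `modulation` and you have variable voltage, which is why one MOSFET bridge plus a CPU can impersonate essentially any motor drive.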

[1] https://www.allaboutcircuits.com/textbook/alternating-curren...


I have a rice cooker that has terms like "micom"

I'm going to guess it's a Japanese rice cooker, because "micom" is a Japanese shortening of "microcomputer", which is what they've been calling microcontrollers. Using it for marketing most certainly dates from the time when computerised control was considered a desirable novel feature, much like when "solid state" was used in English.


When a language gets a word from another language, linguists call it either a loanword or a borrowed word. They're odd terms, because loaning and borrowing imply it'll be given back, which isn't expected when a language adopts a word from a different language.

Japan loves to give words back to English though, with cosplay, anime, emoticon, and now micom all originally being from English, then used in Japanese, and brought back to English from their Japanese form.


It isn't expected until one then learns enough comparative linguistics and etymology to discover that loanwords coming back has happened throughout history. And not just with European languages, although for obvious historical reasons the borrowings and weird routes that some words have travelled across multiple languages is far better documented for European languages.

It's a thing that happens with linguistic contact, and the deeper learning is that it's odd to expect it not to happen to the same word twice between languages with long-standing contact on occasion, given how often it happens and for how many millennia the process has been active.


Nit: ‘Emoji’ is actually a native Japanese word, not a loanword based on ‘emoticon’. The resemblance was originally pure coincidence, though it probably does help explain ‘emoji’’s rapid adoption in English.


> I'm going to guess it's a Japanese rice cooker, because "micom" is a Japanese shortening of "microcomputer"

How do you produce a final "m" in Japanese without a following M, B, or P?


It looks like "マイコン" is transliterated to "micom" in English, even though the Japanese word ends in "n", not "m". For example, Micom BASIC, Micom Car Rally, Micom Games.


Since the jargon we've invented in technology derives from natural language, it often repurposes common terms as terms of art. In my opinion this leads to ambiguity, and I sometimes pine for the abstruse but more precise jargon from classical languages that you can use in medicine (for example).

For example, how many things does "link" mean? "Process"? "Type"? "Local"? It makes people (e.g., non-technical people) think that they understand what I mean when I talk about these things but sometimes they do and sometimes they don't. Sometimes we use it in a colloquial sense, but sometimes we'd like to use it in a strict technical sense. Sometimes we can invent a new, precise term like "hyperlink" or "codec" but as often as not it fails to gain traction ("hyperlink" is outdated).

That's one reason we get a lot of acronyms, too. They're unconversational, but they can at least signal we're talking about something specific and rigorous rather than loose.


Medical jargon (or at least biology jargon) can still conflict with common language. For example: thorn, spine, and prickle all have different meanings in biology, and the term thorn doesn't cover anything native to England, where the word derives from and was used in Shakespeare's plays.


Although SMP is an abbreviation for Symmetric Multi-Processing (multiple processors or processor cores with shared memory), SMT is not symmetric but Simultaneous Multi-Threading. To get back on topic, SMT is often confused with barrel processors that switch threads between clock cycles, like the Honeywell 800 with special register groups. The "simultaneous" in SMT means that a single processor core runs instructions from multiple threads on the same clock cycle.
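The distinction can be sketched with toy issue schedules: a barrel processor issues one instruction per cycle from threads in strict rotation, while SMT issues instructions from several threads within the same cycle. The 2-wide issue width and thread names below are arbitrary, and this ignores everything real (dependencies, stalls, fetch policy):

```python
# Toy issue schedules contrasting a barrel processor (one thread per cycle,
# strict rotation) with SMT (instructions from multiple threads issued in
# the same cycle). Issue width and thread labels are arbitrary.

def barrel_schedule(threads, cycles):
    """One instruction per cycle; threads take turns in rotation."""
    return [[threads[c % len(threads)]] for c in range(cycles)]

def smt_schedule(threads, cycles, width=2):
    """Each cycle issues `width` instructions drawn round-robin from threads."""
    return [[threads[(c * width + s) % len(threads)] for s in range(width)]
            for c in range(cycles)]
```

In the barrel schedule, no cycle ever mixes threads; in the SMT schedule, every cycle can, which is the "simultaneous" part.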



