giraffe_lady's comments | Hacker News

How you get from Occupy to the Tea Party is the link this hypothesis is missing.

The Tea Party was the clear predecessor to the MAGA movement, with its nucleation point being simple racist backlash against Obama, and Trump personally & directly involved in stoking that racism. In retrospect it obviously laid the groundwork for Trump's movement, and I can't see any direct link from Occupy to the Tea Party other than perhaps some individuals like Tunney.


I don't think it did; she was openly pro-Musk during his DOGE purges, so very recently.


Well, and now you use it for this so what was all that for?

> In character of what, that Thiel is a mustache twirling villain?

Well, yes, exactly. Sorry it's that simple & unsatisfying but it absolutely is.


Rural and semi-exurban people consider themselves a nation¹ that the urban majority are not members of. And now they want that nationalism socialized. If you see what I mean.

¹: by the formal denotation in sociology, which they would agree with but would not describe it that way if asked.


Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.

I think this is pretty unlikely here but it's within the realm of possibility.


Seems like it would be hard to fake. The way she tells it, she put her finger on the pad and the OS unlocked the account. Sounds very difficult to do.

I think they mean if they already have her fingerprint from somewhere else, and a secret backdoor into the laptop. Then they could log in, set up biometrics, and pretend that first access happened when she unlocked it. All without revealing their backdoor.

It's not a literal allusion to satan and never was, it's akin to the idiom "the devil is in the details."

It was nearly or actually impossible to harmonize or progress tritones while following the medieval "rules" for church music composition. So it is someone complaining about their work, not a statement about musical morality or whatever.


Multiplying by 3^m/2^n is a human construction, and it breaks down after enough m and n. To expect it "not to break" is, well, an expectation. Nature always leaves a comma; it's not clockwork.
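
To make the comma concrete, here's a quick Python sketch (my own illustration, using the standard numbers rather than anything from the comment above): stacking twelve pure fifths of 3/2 overshoots seven octaves by the Pythagorean comma, about 23.5 cents.

    # Stacking pure fifths (3/2) never lands exactly on a stack of octaves,
    # because 3^m can never equal 2^n for positive integers m, n.
    import math

    def cents(ratio):
        # Interval size in cents; 1200 cents = one octave.
        return 1200 * math.log2(ratio)

    twelve_fifths = (3 / 2) ** 12   # about 129.746
    seven_octaves = 2 ** 7          # 128
    comma = twelve_fifths / seven_octaves

    print(f"Pythagorean comma: ratio {comma:.6f}, {cents(comma):.2f} cents")
    # -> ratio 1.013643, 23.46 cents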

Very cool explanation. Something I've come across a few times on here is wanting to explain how 12tet "includes" or handles approximations of intervals from other scales, and how that affects the musical choices of musicians, or especially the notation choices when transcribing improvised music.

But it's impossible to explain without getting into like, what is even the problem solved by tuning systems. Without the intuition that comes from making music, programmers and engineers see the fractions & obvious series and get too fixated on finding the "perfect" system. But these are much more physical tools, created over time to make certain processes easier. Tuning systems are more like a woodworker's knives than like the unit circle: being perfect does not make them better tools for creation if they are already fit enough.
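
For anyone who does want the arithmetic, here's a rough Python sketch (the interval names and just ratios are the usual textbook choices, picked by me for illustration): it shows how close the nearest 12tet step comes to a few just-intonation intervals, in cents.

    # Distance from some just-intonation ratios to the nearest 12tet step.
    # 12tet divides the octave into 12 equal steps of 100 cents each.
    import math

    def cents(ratio):
        return 1200 * math.log2(ratio)

    just_intervals = {
        "major third (5/4)":      5 / 4,
        "perfect fourth (4/3)":   4 / 3,
        "perfect fifth (3/2)":    3 / 2,
        "harmonic seventh (7/4)": 7 / 4,
    }

    for name, ratio in just_intervals.items():
        just = cents(ratio)
        step = round(just / 100)        # nearest 12tet step count
        error = just - 100 * step       # deviation in cents
        print(f"{name:24s} {just:7.2f}c ~ {step} semitones, off by {error:+.2f}c")
    # fifth and fourth are off by ~2c, major third by ~14c, harmonic seventh by ~31c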


12tet is a practical compromise between overtone approximation and playability.

Any fretted or keyed acoustic instrument with more than 12 notes/octave is extremely difficult and expensive to build and hard to tune and play.

The music is also hard to notate, because there are so many more possible pitch positions.

So 12 notes became a practical default for instrument builders. And from there, ET became a practical default for tuning to allow smooth-ish modulation through different keys. The errors in the tuning are the same for every key, so the tuning relationships stay the same while the key root changes.

That doesn't quite happen because overtone perception and beating effects aren't completely linear. But it avoids the obvious out-of-tune notes you get when playing in distant keys with non-ET tunings.
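
A small Python sketch of that last point (the just scale below uses the standard 5-limit textbook ratios, chosen only for illustration): in ET a fifth is exactly 700 cents no matter where you start, while in a fixed just-intoned scale the "same" fifth changes size depending on which degree you build it on.

    # In 12tet every fifth is exactly 700 cents regardless of key.
    # In a fixed just-intoned C major scale, fifths built on different
    # degrees come out different sizes.
    import math

    def cents(ratio):
        return 1200 * math.log2(ratio)

    just_major = {"C": 1/1, "D": 9/8, "E": 5/4, "F": 4/3, "G": 3/2, "A": 5/3, "B": 15/8}

    for lo, hi in [("C", "G"), ("D", "A")]:
        ratio = just_major[hi] / just_major[lo]
        print(f"{lo}-{hi} fifth: {cents(ratio):.2f} cents (12tet fifth: 700.00)")
    # C-G -> 701.96 cents (pure), D-A -> 680.45 cents (noticeably narrow)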


An interesting option for fretted instruments is to build them with 12 regular frets per octave, and then add temporary extra frets if you want to make specific notes more in tune for the key you are in.

This was common for lute players in the Renaissance and Baroque periods. Lute frets were usually made of the same material as strings and were tied on around the neck, so adding an extra full width fret was not hard.

Of course, being tied on, lute frets were also movable, so players could also reposition the regular 12 frets instead of adding extras.

Instead of tying on an extra full-width fret, another common option was partial frets: they would glue a piece of string or wood under a specific string, to give the option of playing one note with either the regular fret or the more in-tune little fret (called a "tastino", plural "tastini").

In this video [1] Brandon Acker uses a guitar his luthier friend was building, which had not yet had its frets installed, to demonstrate some different 12-tone tunings with lute-style tied-on frets. He then demonstrates using tastini to improve specific notes, such as a little fret on the G string just before the first regular fret so that G♯ and A♭ can be different notes.

The tastini he uses in the video are simply pieces of string held on with a piece of tape, so quick and easy to add and remove.

[1] https://www.youtube.com/watch?v=tiKCORN-6m8
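
To put a number on how far apart G♯ and A♭ can sit in these tunings, here's a quick Python sketch (assuming pure 5/4 major thirds, as in quarter-comma meantone; that assumption is mine, not something stated in the video): three pure major thirds fall short of an octave by the lesser diesis, about 41 cents, which is why the two notes want separate fret positions.

    # Three pure major thirds (5/4 each) stacked up fall short of an octave
    # by the "lesser diesis" (128/125), roughly 41 cents.
    import math

    def cents(ratio):
        return 1200 * math.log2(ratio)

    three_thirds = (5 / 4) ** 3       # 125/64, about 1158.94 cents
    diesis = 2 / three_thirds         # 128/125

    print(f"Gap between G# and Ab: {cents(diesis):.2f} cents")
    # -> about 41.06 cents, nearly half a 12tet semitone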


A simpler framework with just as much explanatory power is just depression. If you spend a lot of time depressed you will "predict nine of the last five recessions" but with everything. You will have seen every bad event coming, along with a lot of bad events that didn't end up coming.

Anxiety too. When things get bad for me I always know exactly what's going to happen and it's all bad. The voice that says "well maybe not" just goes away.

From the perspective of an archive, library, or museum, preservation isn't really the goal in itself, just a strictly mandatory prerequisite. The pieces have to be made available to researchers (and, depending on the institution, the public) for the archive to be able to consider itself fulfilling its mission.

There is kind of a cost/preservation/accessibility triangle with curatorial preservation, and museums already normally choose storage that is somewhere other than the most expensive/best preservation corner of that triangle. Oxygen-depleted facilities significantly extend that corner, but if we're already not using what we have there then it may not be a useful addition.

Low-oxygen environments also have their own preservation issues. I'm not actually a museum curator so I don't know the specifics. But it is a very complex and old discipline and they've tried just about everything. The problem is usually funding, which unfortunately boils this whole thing down to another boring "you can't solve social problems with technical solutions."


What is the social problem you refer to?

Presumably society doesn't deem preservation to be worth any cost.

There are other considerations as well. We could probably preserve works for longer if we kept them sealed away in darkness, but we value these works in part because of what we get by experiencing them. What we get out of them as artistic works makes them worth taking such good care of as opposed to just being something that's really really old.

Society wants to see these things, and learn from them, even though every moment they spend out in the open exposes them to more harms.

We're fortunate that digitizing has come such a long way. We can preserve and even recreate a lot of things long after the physical objects themselves are gone. It's not the same as having the originals, but at a certain point the reproductions are all we'll have left.


That's what I was wondering. We can't redirect the entire output of society towards museum conservation, so some tradeoffs will have to be made. That isn't a problem, just reality.

When a large book turns into an epub/zip that is under 100kb, what makes the paper so important?

When you add up all the books that were required for our careers, would they be a megabyte?

The little that we understand is uncomfortably summarized this way.


It's hard to measure the information content of anything, because information is fundamentally about differences which matter, and we don't always know what matters. The text content can be preserved dutifully through centuries through copying, then in our time, we find out that what we really would have wanted was the handwriting style of the original, or the environmental DNA from pollen attached to the original vellum...

But even so, there's so much archive material which hasn't even been digitized. I run into it in genealogy all the time. It's in some box in a museum, if you're lucky they made microfiche images of it fifty years ago.

