
I hate to be terse, but this subject has been covered many times before. Let me phrase the contrapositive: why blog only for people to read?

I see a lot of people (I'm looking at you, pg) that go on at length about how important it is to be precise, plain-spoken, come to a point clearly and then move on.

Sure, once you've reached the point where you have nothing new to say, regurgitate it, chew it up good again, think it over, and make an insightful, pithy, extremely useful and precise essay.

Congrats. You've now reached AI/LLM status of intelligence. Nothing wrong with that, of course; many times society needs the same point made over and over again in different words until it finally takes effect. But the real meat of essay writing is thrashing about semantically until you finally reach a conclusion that you probably knew to some degree all along but whose implications you never really understood. That means your essays should be thoughtful, researched, messy, creative, self-contradictory, etc.

Guess what? Nobody wants to read that stuff. They all want 2-minute videos on how to lead a meaningful life. I get it: life's short. But you can consume pithy, terse, useful fact-bombs all your life and not know a damned thing aside from how to parrot others back. Writing for others is fine. You might make a difference. Good luck. But if you're not creating, refining, and recasting content that nobody reads? You're not getting any personal value from it. Don't expect to write for others and actually make yourself a better person. Most of the time you end up becoming a popular grifter (and this time I'm most definitely talking about pg).

Decide whether you want an audience or not. Unless you're truly that one-in-a-million person with tremendously important insight, I recommend against it. Still, for your own good, please write.


I lost all of my photos (along with everything else) when growing up, so taking pictures and videos was important to me as I became an adult.

I'm 59 now. In the 1990s I started taking VHS videos of family events. Sometimes I would walk around "interviewing", sometimes I would walk around and try to talk to people normally while holding that huge recorder. (That didn't work.) I even set it up on a tripod and just let the recorder run while my parents and others visited.

This past year I've ripped a couple of dozen DVDs out of all of those tapes. In the past two weeks I've ffmpeg'ed them to mp4s, loaded them onto an SD card, and put that in an e-picture frame.
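
If anyone wants to do the same, here's a rough sketch of that conversion step (not exactly what I ran; it assumes the rips are sitting around as VOB files, and the folder names and encoder settings are just placeholders):

    # Batch-convert ripped DVD video (VOB) to mp4 with ffmpeg.
    # Assumes ffmpeg is on the PATH; folders and settings are placeholders.
    import subprocess
    from pathlib import Path

    SRC = Path("dvd_rips")   # wherever the ripped .vob files ended up
    DST = Path("mp4s")       # what gets copied onto the picture frame's card
    DST.mkdir(exist_ok=True)

    for vob in sorted(SRC.glob("*.vob")):
        out = DST / (vob.stem + ".mp4")
        if out.exists():
            continue  # skip anything already converted
        subprocess.run(
            ["ffmpeg", "-i", str(vob),
             "-c:v", "libx264", "-crf", "23",  # decent quality/size tradeoff
             "-c:a", "aac",
             str(out)],
            check=True,
        )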

Now we have 30-40 hours of "family memory TV" playing constantly in our living room. It is one of the most amazing things I've done with technology. I can't describe the feeling of looking back 30+ years to see folks who are long gone -- or now adults with their own kids!

God I'm glad I didn't record all of this on a cell phone or use social media. It would have been impossible to have the patience and time to scale all of those walled gardens for this project.

Best videos? The "family interview show", where I ask questions and everybody performs some kind of art. Wish I'd done one of those every year. Second best? Just setting the cam up and letting it run. Third best? Videos of family members doing things that'll never happen again, like watching a sonogram of a new baby on the way.

Worst videos? As I know (and knew at the time!), a bunch of videos and pictures of things that were interesting to us at the time but that you could find online in a couple of seconds. Unless there's audio commentary, that was a pointless exercise.


I'm fascinated by the "family memory TV" idea. Losing cherished memories—photos, videos, or writings—is a recurring fear of mine and a big reason I’ve embraced digital hoarding. Having a place to share this with yourself and others is really powerful.

Could you share more details about your setup? Do videos play continuously in your living room, or are they triggered by presence? Is sound muted or do you just have it at a lower level?


> God I'm glad I didn't record all of this on a cell phone or use social media. It would have been impossible to have the patience and time to scale all of those walled gardens for this project.

Why? I've done data takeout from multiple social media sites. It's much less tedious than babysitting VHS->digital transfer (which I've also done).


The generalized answer is layered curation versus tagged curation. In my VHS scenario, there are several layers involved. First, I'm taking the video because I think there's something important going on. Many times I'll be in the background explaining why I'm recording this for the future. Then, when the tapes and DVDs were made, I added a general summary of the topic, i.e., "Children's Christmas plays and grandma's house, 1993." Finally, when I'm scanning/ripping, I'm also adding information and filtering as desired.

My next project is exactly what you mention: using takeout. That project involves going through hundreds if not thousands of videos with names like "45437905_521345565468507_6881371949495794958_n_10156232738427354.mp4", all stuck in one big folder. 10-30% of these are probably memes or other throwaway stuff, mostly because there has been little to no curation going on (until now). Even if I sort keep from delete, there's still the issue of generalized topic. (FB Takeout, for instance, gives all the files the same date.) Assuming I could go through them all in some automated fashion, I'd still only end up with categories an automated system could provide. That's far too reductionist to actually work, e.g., who wants pictures of the dinners you ate? But that one time they made the 17-layer cake -- oh yeah, don't want to forget that. These edge cases are part of what makes the curation so personalized and special.
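
The first pass will probably be a dumb manual triage loop, something like this sketch (assuming a Linux desktop where xdg-open launches the default video player; the folder names are placeholders, not my real layout):

    # A rough first triage pass over a takeout dump: open each video,
    # then file it under a topic folder or into "discard".
    # Assumes xdg-open (Linux); swap in "open" on macOS or os.startfile on Windows.
    import shutil
    import subprocess
    from pathlib import Path

    DUMP = Path("takeout/videos")   # the big undifferentiated folder (placeholder)
    CURATED = Path("curated")

    for clip in sorted(DUMP.glob("*.mp4")):
        subprocess.run(["xdg-open", str(clip)])   # eyeball it
        topic = input(f"{clip.name} -- topic (blank = discard): ").strip()
        dest = CURATED / (topic or "discard")
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(clip), str(dest / clip.name))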

Feed-based sharing is not the same activity as recording for the future. There are different goals, audiences, situations, etc. More specifically, I'm probably going to end up with 4-7 huge hunks of hundreds of impenetrable filenames from different services, each with their own nuances -- and that's after going through takeout.

Starting in on that next project today, yikes, the last thing I want is thousands of small randomized videos from my life. You could conceivably do that by wearing a GoPro around and coding a bit. That would be a terrible (and egocentric) thing to inflict on house guests.


My wife used to give me shit for recording her and the children when we did things like the Grand Canyon and the aquarium and whatnot. She used to say I should be taking pictures of or filming the event or activity instead.

Now that they're older, she gets it. I can Google a dolphin, but I can't Google my oldest's reaction to seeing one.


Thanks for sharing, this is wholesome.


Beautiful and inspiring. Can you share the specific e-picture frame you mentioned?


Not the OP, but I've had a great experience with Aura frames... Costco sells them each year in the run-up to Christmas.


We're primates; we see and handle physical objects.

Use that strength. Put different things in different objects. Now you can reason about them rationally.


Related: People wonder why English has so many weird spellings. It's a complicated answer. The Vikings seem to show up way too often (grin). One of the reasons, though, is that several hundred years ago we all thought that Latin was the bee's knees. The Greeks and Romans were the model. So we took words that were perfectly well spelled phonetically and "fixed" them, returning them to some kind of bastardized form that was "better".

For some words it didn't work -- people went back to the old ways. But for some it did.

This chaotic priggish churning in society is not new, as pg points out. I love how language, manners, idioms, and cultures interact. It can be a force for good. It can also be extremely destructive, usually in tiny ways and over centuries.

While I love these intricacies, I also always fall back on the definition of manners I was taught early on: good manners is how you act around people with poor manners. Add complexity as desired on top of that. The form of communication and behavior can never replace the actual meaning and effects of it. (There's a wonderful scene in "The Wire" where they only use the f-word. Would have worked just as well for their job to have used the n-word. 100 years ago, the n-word would have been fine and the f-word beyond the pale. Draw your lessons from that.)

ADD: I always try to be polite and abide by whatever traditions are in place in any social group. One thing I've noticed, though: the more people express their politics, their priggishness, their wokeness, etc. -- the crappier they seem to be at their jobs. I don't know why. Perhaps it's because this is such an easy social crutch to lean on and gain social advantage from that it becomes kind of a "communications drug". Scratch a loud prude or moralizer and you find a dullard or slacker. Conversely, people who produce usable advances for mankind tend to be jerks. I suspect this relationship has held up over centuries. cf. Socrates and the Sophists, etc. (A good book among many along these lines is "Galileo's Middle Finger")


Related: From ~5 years ago.

https://danielbmarkham.com/for-the-love-of-all-thats-holy-us...

Understanding CCL is critically important, and it will also sink you into deep professional despair once you realize that the coding community has deep-set standards and social mores that prevent serious adoption.


I feel both strong agreement and strong disagreement with your comment.

Epistemology is probably the only topic that I would recommend waiting until you're 30+ to read about. Before that, in my opinion, most folks aren't ready for it. You need to both accept ultimate uncertainty and also deliberately create your own certainty in your life. That's a tough ask even for many older people.

I've come to believe that an important part of any society is creating a series of positive narrative myths that are increasingly-detailed and nuanced. Why positive? Because introducing negativity in any form early in the education process turns the kids off to receiving anything more on that topic or from that viewpoint. We need optimistic learners, not pessimistic curmudgeons.

So yeah, we're going to lie to you about the number line. We're going to lie to you about history. We're going to lie to you about damned near everything, and a simple search online will prove the lie. But we lie in order to encourage you to rebel, not to indoctrinate. Find the problems and fix them. It's not our business to tell you what they are. Hell, we don't know ourselves. We're in the same boat you are.

This is not a declarative, literal topic. Already comments here decry the big words. So while I agree with you, epistemology is just like any other intellectual super-power: you gotta be able to deal with the repercussions or you shouldn't dive in. The water's deep.

You lose all of that googling around for Wikipedia articles. Long-form books are the only way forward, along with the confidence and intellectual curiosity needed to eventually make a difference.


> Epistemology is probably the only topic that I would recommend waiting until you're 30+ to read about.

I think you may be on to something, but I would also add that the years prior to age 13 may also be a viable range. I think the window from 13 to (it depends) is when the problem (roughly, the mind/ego "coming into its own", or something like that) manifests.


Disagree about the age threshold on epistemology. Being introduced to Hume in my teenage years is what got me to repeatedly revisit and reconsider old ideas, and look for new ones in much the way you describe. Human mental life & development isn't so simple as arbitrary age boundaries and specific, fixed learning environments.

Rather than selective pragmatic bias, maybe what's better is the ability to consider multiple viewpoints with multiple degrees of skepticism, and to evaluate the strong and weak points, benefits and negative consequences of a thing in tandem.

This is sort of the grander point & motivation behind essays like The Order Of Things -- being able to acknowledge how much of what we know & can know is determined for us, to see the uncertainty of many parts of our existence head-on, and to see that as something that sets you free and puts the onus on you, without defensive denialism or diving straight into flimsy pseudo-certainty.

It's no surprise that when philosophers started publishing books like that, they were accused by more conservative contemporaries of trying to undermine all of civilisation and dive into nihilism. It must have been unnerving to be confronted with the possibility that one's deeply-held convictions might not be eternally robust or independent of any culture or time period, and seeing the only alternative as nihilism would come naturally to such people, rather than seeing it as something to celebrate and explore.


We presume that it is those of us who have digested the thinness of the veil of reality who should be deciding epistemological questions, but it is the younger generations, who have grown up in this environment of 'Hacking the Matrix', who have the moral right to do it.


I'm all for that. Sounds great.

I very well might be wrong. I hope I am, since I can't think of any other way to make things work.


Just to clarify, I'm asking where to host a static page that would be a modified introduction, addendum, errata, "if you like this try these", "The author died by wooden badger attack", etc

Not the entire book. I wouldn't have to copyleft the entire thing, right?

My current static page is just an "I like this book, please give me this free thing and put me on a mailing list" kind of thing. But that's not really the point. The point is "how can I use static pages, presumably the thing the internet was built for, to put a few pages of extra material on the book and perhaps give links for folks to follow if they're interested in the topic?"

Traditionally, you'd issue a new edition with such information, but that seems like hella work and expenditure, and small stuff like this should be the kind of thing you'd be able to stick somewhere online, if nothing else as a parking spot until you eventually publish a new edition.

I wonder if the Internet Archive would be interested in this kind of archival.


Traditionally, the answer would be "Use PURLs". The Internet Archive happens to operate the purl.org resolver.

Unfortunately, with the recent attack, the purl.org resolver went offline and remains so. All the purl data is still there and is accessible to the public; you just don't get resolvable URLs for it.

Nowadays, it seems like ARKs are the better way to go, anyway, and the attack on archive.org proved it perfectly. Look into <https://arks.org/>

(The Internet Archive mints ARKs for every item uploaded to its collections. You may have noticed them in the infobox below the fold when looking at a given item.)

As for your latter remarks, the Internet Archive will take just about anything. Create an account and start uploading. Ted Nelson has a huge chunk of his papers on there.


Anna Karenina. Nothing mind-blowing. I didn't see a light in the sky.

I read a lot, fiction and non-fiction. When I read Tolstoy, I remember thinking "What sort of dark magic is this?" He drew characters in a way I haven't seen since. I _knew_ these people.

I remember this book, decades later. I remember a lot of what I've read, but Tolstoy was the man. I have no idea how or why his magic worked.


Anna Karenina is a masterpiece of a novel. I highly recommend it as well.

I often jokingly mention that this is the first book that ever made me want to get married. While many know that the novel describes the tragic life and relationships of Anna, only those who have read it will know that there is another, equally important and positive story: the love between Kitty and Levin.


That novel helped me understand love.


I second this... to this day, I am in awe! Such a profound and readable book at the same time. Each character lives within me, and in many situations, one of them emerges :)

As Nabokov said, it is the supreme masterpiece of 19th-century prose.


I feel like Tolstoy and Dostoyevski have produced all the great insights we could possibly have on the human soul.


Actually, it is mind-blowing. For example, it has an exact description of a flow state in a few paragraphs.


"Most drama today sucks" --- "This is called write-to-market"

Yes. I doubt you disagree, but I needed to point that out. These two, along with writing-by-formula, are intricately related. There's nothing wrong with writing being a racket.

If you want to feel depressed and lack a sense of optimism in humanity, spend some time learning about the publishing industry. Woof. It's nothing but formulaic conservative market-driven darkness all the way down. AND all the players are well-established, they've driven out inefficiencies, and they've got plenty of tricks to keep the riff-raff out. They've been doing that for centuries.

I don't like joking on HN, but this is a personal joke I've used for some time and seems appropriate: Want to start a new streaming series? Throw together a bunch of marketing-driven adjectives and end with "and they solve crimes."

"She's a gay little person goth time-traveling alien, he's an autistic left-handed incel Quaker. They live in Portland, and together they solve crimes."

My opinion is that people are going to eventually get sick of this stuff, much the same as they got tired of the B Monster Movies in the 50s, but who knows. Detective and True-Crime novels are perennials. Not my circus, not my monkeys. Working like this sounds to me like having a job I hate.


> "She's a gay little person goth time-traveling alien, he's an autistic left-handed incel Quaker. They live in Portland, and together they solve crimes."

I'm pretty sure a Netflix executive would have signed off on this concept based on this sentence alone half a decade ago. Now it seems the entire world is waking up to these formulaic, data-driven products that media conglomerates have been pushing for a decade. This is especially apparent in the gaming world, where big productions seemingly flop out of nowhere (cough Concord cough) while indie studios keep innovating.

It's so easy to blame this phenomenon on rose-tinted glasses and older people like us thinking all old things are better than modern, but when the world wasn't decided by "data scientists" and corporate committees, there was more variety, more volatility. Lower lows but higher highs as well.

I was thinking about this a lot the other day while watching snippets of the movie "O Brother, Where Art Thou?", wondering where the pure, plain fun movies that are not pushing an agenda or pandering to some audience have gone.

And now, in the current era of the remake, which has yet to fully reach the film industry, it is gonna get worse. When they find they are unable to invent the next multi-billion franchise, why not go and remake and "modernise" an older one? It's basically free money.


I would say that a reasonable person could have foreseen Concord failing. Perhaps not necessarily as hard as it did, but there were a number of red flags before it released. The character designs were bland and bad, which is worse than just being bad. They drummed up a whole bunch of controversy, and the marketing outside of that controversy was basically non-existent. Almost nobody had heard of the product until it already had failed. Even when it was in beta for free, they only garnered about a couple of thousand players at any given time.

Then you add on that they already missed the train for hero shooters by about eight years and their modern competition is all free to play and has already sucked up the entire market for the most part. I saw an analysis by a former game producer who thought that perhaps they had a work environment that stifled criticism and commentary from developers, and I think that might have been an accurate assessment; the entire product was released in a way that I, and a number of other people I've seen online, simply can't believe nobody was throwing their hands up and saying it was a bad idea before release.


> I would say that a reasonable person could have foreseen Concord failing.

> Then you add on that they already missed the train for hero shooters by about eight years

There's an analysis video I saw on YouTube which touches upon this fact: data-driven productions suffer from two major problems. First, when they register a signal (e.g., people like hero shooters), it's already too late. It is impossible to catch a growing trend, only one that has already reached its peak.

The second problem is that the data is misleading in the first place. Using social media sentiment, for example, is pure nonsense because social media personas are not real people. They don't buy toilet paper, they don't talk (or Google) about 99% of their boring existence, and the signal is manipulated by bots and malicious agents. It is absolutely crazy that companies still haven't learned that you cannot hire a bunch of data scientists and predict the next major hit. They have tried for the past 20 years, but human ingenuity (and the chaotic human psychology) is totally elusive to their silly models.


> "And now, in the current era of the remake, which still has to reach the film industry"

You haven't heard of the Disney live-action remakes of the old classics? It's already starting to happen.


I am not at all surprised. It's pure exploitation of a bias in human psychology: we do not like to leave our comfort zone; that's a fact. Why should they risk making something original when remaking stuff that already exists guarantees you an audience and packed theatres? We had the decade of superhero flicks; we're entering the decade of remakes and re-imaginings of beloved franchises from the past 100 years.


What if the 20th century was the unusual time? How much "novel IP" was produced earlier, across the majority of human history? My impression is that inventing unique "3D" characters only became common in the modern era. In deep time, people told stories with the cast of characters from their mythology, legends, and sagas. Folk tales are often built around stock, archetypal characters that are flat and predictable by modern standards.

The expectation that new stories must be set in a freshly created "universe" with fully new characters is quite new. We may be returning to that old mode of storytelling, where we repeat more. Just as people still stage Shakespeare with new actors each time, why can't we retell a 20th-century film with today's actors and technology? And why can't we make sequels about already beloved and known characters? It seems quite natural, in fact.

We now have a new canon for this era, and it consists of Superman, Darth Vader, etc.

The one (big) difference of course is that in the old times there was no concept of copyright and trademarks so people were free to recombine characters as they wished.


> It's so easy to blame this phenomenon on rose-tinted glasses and older people like us thinking all old things are better than modern, but when the world wasn't decided by "data scientists" and corporate committees, there was more variety, more volatility. Lower lows but higher highs as well.

There's a trend of thinking that our age is the most accepting and inclusive and that the past was rigid and conformist, but in many regards it's the opposite. Through all the metrics and quantification and SEO-like, data-analysis-based incentives and judgments, we are being "snapped to grid". Risk aversion is growing. There's only a narrow path, and people must tick many boxes or get disqualified. Just as movie producers make the nth superhero movie and everything is a sequel to old IP, science is similarly being turned into a formulaic churn.

Just think about how Peter Higgs said he couldn't fit the mold of today's academia and the pressure to produce a stream of consistent (and hence typically consistently mediocre, like the consistent taste of a Big Mac) output.

The other day I watched an interview with Warren McCulloch, a pioneer of artificial neural networks (https://www.youtube.com/watch?v=wawMjJUCMVw). I wonder if today we're letting such personalities thrive in academia. He was originally headed for the Christian ministry and learned a lot of theology, then got drawn instead to math, and his pondering of abstractions, understanding, humans, and theology moved him to study neurology and to use math to model logic expressed as neural networks. Nowadays, you must specialize early and grind for tests, with no "forgiveness" if you go off-track; you must single-mindedly focus on optimizing your path towards tenure for it to be realistic. You can't even get a PhD position without having published several original research papers beforehand. While hiring committees say they reward well-roundedness and try to avoid a monoculture, what that becomes in practice is checking whether your parents sent you to one of the trendy types of extracurriculars or foreign "volunteer" programs, and whether you later engaged with some shortlist of trendy buzzword issues.

This breeds conformity, uniformity and bland predictability.


Check out how Snow White has been going. The film industry is absolutely there, but the good news is it's going exactly as horribly as you predicted.


I completely agree with the author and yet I have taken up the profession of writing books.

There are, as we know, two different kinds of books: fiction and non-fiction. There are also really only two flavors of books: polemic and exploratory. With a polemic, you're expected to have your own internal theme song, and the produced work will conform to it. With exploratory books, you're on an adventure with the reader, and the only bullshitting you add is the minimum amount necessary to make the exploration fun for both of you.

Polemics have become quite popular lately, and I suspect anybody taking a set of blogs or essays and "sticking them together into a book" would only have this avenue available. I took a lot of essays and material I'd collected over the years and wrote some non-fiction books on programming. They were exploratory: the theme emerged as I condensed the work together. (My guess is that exploratory fiction and non-fiction are very tough to do and sell; people much prefer a book whose gist they already know, by an author they're already familiar with, over spending many hours on a lark.)

After I finished a few non-fiction books on coding and performative teams, I was done. It was fun, I learned stuff, I think it's critically important stuff, nobody cares, and I'm okay with that. Because it wasn't a polemic, I felt no need to rant or self-promote. There was no movement I wanted to lead. Time to move on to something else.

Books suck. Books change the way you think. All writing does that. Books especially change you because they draw you into some sort of self-created cosmology that you've now adopted.

I write because I'm a writer. I've always been a writer. I simply got honest about it as I grew older. I don't think you choose whether to write or not. I think you choose the fashion in which your writing interacts with your subconscious. A book is just a deployment bundle. It's not the point.

