Hacker News: ub's comments

Good thread here that provides a counter-balance. https://twitter.com/nireyal/status/1443882540868063241?s=21


> Good thread here that provides a counter-balance. https://twitter.com/nireyal/status/1443882540868063241?s=21

By the author of Hooked: How to Build Habit-Forming Products:

https://en.wikipedia.org/wiki/Nir_Eyal:

> In 2014 Eyal published his first book, Hooked: How to Build Habit-Forming Products, which became a Wall Street Journal best seller.[12][13] The title reflects Eyal's idea of the "hooked model", which aims to "build products that create habit-forming behavior in users via a looping cycle that consists of trigger, an action, a variable reward, and continued investment."[14]...

> Eyal has spoken out against proposals to regulate habit-forming technologies, arguing that it is an individual user's responsibility to control their own use of such products.[2]

Somehow I'm not surprised someone like that would be writing desperate yet underwhelming apologia for Facebook.


The person writes a book about hijacking human psychology for more clicks, then goes on to state that it's our responsibility to control our use.

What does he want us to do, re-wire our brains on demand?


Any given platform in Big Social has millions it can throw at tuning its GUI and algorithms to increase engagement—and they do, seeing as that directly translates into advertising revenue. Then each user has to face all of this machinery alone, often exhausted and/or sleepy, at home, defences lowered. I think it's wishful thinking to expect the majority of users to stay vigilant at all times.

Neither do I think we should “regulate habit-forming technologies”, though. That really smells of addressing a symptom and of government micro-management. Sure, we can try regulating those behemoths into acting altruistically, pass laws, fund departments dedicated to monitoring habit-forming technology usage—but inevitably lawmakers will be lobbied by big tech, departments will be understaffed, regulations won't keep up with new tech, loopholes will be found and exploited, corruption will occur, and it will all drag on… while the business model finds its way.

Which I think is the root cause: the business model. It should never have been possible at such scale. Any platform this size that survives off advertising money either exploits its users, loses to the next competitor that does, or—in what's possibly the worst outcome—becomes a strongly government-regulated (and by implication government-approved) near-monopoly. The stark misalignment of interests, unique to this industry, effectively makes the purported service provider (can it even be called that if the service is free, though?) an adversary from the end user's frame of reference.


> What does he want us to do, re-wire our brains on demand?

He wants us to get hooked on his deliberately addictive products, but blame ourselves for it instead of him.


In this case it'd be "blame the parents", according to his tweets -- he said the parents "only give kids access to tech they're ready for".

But how are parents going to know how Instagram affects their kids?

I know from experience that even for highly educated and well-meaning parents, it's not easy to know that much about how their kids feel and precisely what causes those feelings (Instagram or something else?).


This is like saying a person cannot quit eating sugar because it "hijacks human psychology". Or anything similar. So I suppose we should ban sugar. A "re-wired brain" can still stop using it; it just takes more willpower.


Someone who actually includes links to the raw information (transcripts, slides, etc.) is, to me, the more trustworthy source. I've been misled a hundred times by mainstream media orgs that don't include the raw data (links to studies, transcripts, links to a place where the person they are writing about could respond) even when they easily could; I've been misled maybe twice by individuals who actually cite their sources in a way I can look up.

Edit: and indeed it seems that a media organisation has misled me yet again, declining to mention things like the 11 other metrics on which Instagram was said to improve.


Good context, but it would be better if you could refute his take instead of attack his past.


There is no apologia; there are many other habit-forming products we can use, such as television and sugar. Some are used by children in many places, and they self-regulate.


Is there something specific about Eyal's take that you disagree with?


Yep: "my advice to parents is only give kids access to tech they're ready for". There is no metric to check whether kids are ready, and I doubt that most adults are ready for the psychological effects of social media. This is too easy a way to offload the entire responsibility onto the parents, while the companies that make billions from it are off the hook. It's like blaming smokers when tobacco companies use addictive additives.


He also wrote Indistractable, a book about how to create healthy habits and not get Hooked on all the stuff he talked about in his first book.

Ultimately you’re responsible for your own life. The government isn’t going to save you.

Of course, government wants you to think they’re going to save you, and they’ll happily take all the power over your life that you care to give them.


How is this not the first comment?



Slack took over the world? They only have 12M DAU. Sure, they're successful in the enterprise market in the US, but by no means have they taken over the world.


Isn't this what Sacha Baron Cohen asked for in his speech? That there be state-mandated rules on what's fake vs not. I don't think this issue has any good solutions: damned if you do, damned if you don't. The only 'okay' way to solve this is to crowd-source a variety of opinions on the issue and show what else people are saying about it - not as the ultimate truth, but to broaden people's perspective that there are different ways to look at the same issue.


I've always wondered how Wikipedia has solved this... Isn't it essentially the same problem?


Wikipedia built a culture in which editors who lean towards truthful, informative content are encouraged, and a userbase which values that. This is only possible with a non-profit that has strong voluntary and academic participation. Nobody will dedicate quality time to a closed, for-profit network like Facebook.

But I feel that the bigger problem is that people want fake news. Most people have a hard time living their everyday lives; fake news which supports their choices and viewpoints is a release, and it makes them feel good. And networks do a good job of surfacing similar content once they gauge one's interests.


The sheer volume of user-generated traffic on Facebook suggests that plenty of people are willing to dedicate quality time to it.



It's not really solved on Wikipedia. Wikipedia fights happen all the time in controversial topics, and it's essentially solved by techno-theocracy; moderators show up and lock a contentious topic against vandalism when it's too heated, and whatever the current state of the edit war is gets frozen.


Wikipedia just has far fewer active users than Facebook. As in, the number of users actively creating content. The amount of content on Facebook is also orders of magnitude higher than on Wikipedia. And the friction in creating content is also much lower on Facebook.


Wikipedia publishes a lot of incorrect information.

The fact that you can source a piece of information doesn't make it true.


They haven't. Wikipedia is essentially a battleground when it comes to recent topics that are politicized. Sometimes an admin will essentially lock editing of the page and whatever version was up at the time is essentially staying there.


They haven’t solved it! A group has wrangled the power, and Wikipedia represents their vision of the truth. This is fine when talking about objective things, like Trump being at the White House for Thanksgiving in 2019. This is not fine for subjective things, which make up WAY more of the world than people want to admit.

We eat food for breakfast.

Is this statement true or false? You could mark it either way, and both could be supported by rational thought! If you don't first come up with a framework for answering and then evaluate the question within the context of that agreed-upon framework, the truthiness of the statement will remain subjective.


But then it bursts people’s political and opinion bubbles, which is something Facebook might not want.

Keeping people in bubbles is how Facebook keeps people browsing and wasting their time on Facebook. Because people like their bubbles.


I don't think you can regulate opinions.

But what you can do quite easily is disallow statements that are factually wrong.

That would solve at least a part of the problem.

If I say things which were already proven wrong, then I can't post them in ads.


> things which were already proven wrong

Proven by whom?


Apple makes a big deal about on-device security but they heavily promote backing up data to iCloud. They hold the keys to iCloud encrypted backups, and your backed up messages are happily handed over to LE when asked. That's why "going dark" isn't a problem on iMessage.


This. Apple cooperates with law enforcement a lot more than they let on. While they make a big deal about on-device security, iCloud, which most users turn on, is the loophole. Apple turns in backup content much more willingly when asked.


You are making a bunch of assertions with no basis. WhatsApp's encryption hasn't been weakened. FB can't look at encrypted message content, neither can WhatsApp. A lot of people use WhatsApp because of the encryption guarantees, please don't spread unverified claims.


Facebook owns the communication end points and can do whatever they want with the message contents before they're sent encrypted.

Given their history, I would be surprised if they don't do some on-device metadata analysis.
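The endpoint point can be made concrete with a minimal sketch (all names here are hypothetical, and the "encryption" is a toy stand-in, not the Signal protocol WhatsApp actually uses): end-to-end encryption means the server only ever sees ciphertext, but the vendor-controlled client necessarily handles the plaintext first, so it could run any analysis at that stage.

```python
import hashlib

def analyze_on_device(plaintext: str) -> dict:
    # Hypothetical client-side step: the app sees the plaintext
    # before any encryption happens, so it could extract metadata here.
    return {"length": len(plaintext), "mentions_link": "http" in plaintext}

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Stand-in for real end-to-end encryption; a simple XOR keystream,
    # NOT secure, used only to illustrate where encryption happens.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(plaintext))

def send_message(plaintext: str, key: bytes):
    # The vendor-controlled client code runs BEFORE encryption:
    metadata = analyze_on_device(plaintext)            # vendor could see this
    ciphertext = toy_encrypt(plaintext.encode(), key)  # server sees only this
    return metadata, ciphertext
```

The crypto never has to be weakened for this: whatever `analyze_on_device` collects is computed while the message is still plaintext on the phone.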


Yup, it just seems like some random unsubstantiated theory being touted by VCs who want to show they're smart.


It's impossible that everyone out there blogging actually knows what they're talking about, and figuring out which ones do is getting to be a lot of work.


My thoughts exactly


+1 It seems like someone just created a spreadsheet with no description of what methodology was used to test the assertions.


If you click the cells you will see the rationale for columns or what "claimed" means etc.


I find all these articles that try to compare and measure sympathy and emotion as pointless. People are comparing the outpouring of grief for Cecil to apathy towards human killings and inhuman treatment of animals in the meat industry. A person can be severely depressed if they lose a dog, and another can show no emotion when they lose a parent. You can't compare human emotion because it's not always rational.

In this specific case though, I would also argue given the declining numbers of lions, the anger is justified.


I tend to think that human emotion is not rational by definition (and by "not rational" I mean uncorrelated, not negatively correlated). Of course, "not rational" is not synonymous with bad or superfluous.

