> In 2014 Eyal published his first book, Hooked: How to Build Habit-Forming Products, which became a Wall Street Journal best seller.[12][13] The title reflects Eyal's idea of the "hooked model", which aims to "build products that create habit-forming behavior in users via a looping cycle that consists of trigger, an action, a variable reward, and continued investment."[14]...
> Eyal has spoken out against proposals to regulate habit-forming technologies, arguing that it is an individual user's responsibility to control their own use of such products.[2]
Somehow I'm not surprised someone like that would be writing desperate yet underwhelming apologia for Facebook.
Any given platform in Big Social has millions it can throw at tuning its GUI and algorithms to increase engagement—and they do, seeing as that directly translates into advertising revenue. Then, each user has to face all of this machinery alone, often exhausted and/or sleepy, at home, defences lowered. I think it’s wishful thinking to expect the majority of users to be able to stay vigilant at all times.
Neither do I think we should “regulate habit-forming technologies”, though. That really smells of addressing a symptom and of government micro-management. Sure, we can try regulating those behemoths into acting altruistically, pass laws, fund departments dedicated to monitoring habit-forming technology usage—but inevitably lawmakers will be lobbied by big tech, departments will be understaffed, regulations won’t keep up with the new tech, loopholes will be found and exploited, corruption will occur, and it will all drag on… while the business model finds its way.
Which, I think, is the root cause: the business model. It should never have been possible at such scale. Any platform this size that survives off advertising money either exploits its users, loses to the next competitor that does, or—in what’s possibly the worst outcome—becomes a heavily government-regulated (and by implication government-approved) near-monopoly. The stark misalignment of interests, unique to this industry, effectively makes the purported service provider (can it even be called that if the service is free, though?) an adversary from the end user’s frame of reference.
In this case it'd be "blame the parents", according to his tweets -- he said parents should "only give kids access to tech they're ready for".
But how are parents going to know how Instagram affects their kids?
I know from experience that it's not that easy, even for highly educated and well-meaning parents, to know that much about how their kids feel and precisely what causes those feelings (Instagram or something else?).
This is like saying a person cannot quit eating sugar because it "hijacks human psychology", or anything similar. So I suppose we should ban sugar. A "rewired brain" can still stop using it; it just takes more willpower.
Someone who actually includes links to the raw information (transcripts, slides, etc.) is, to me, the more trustworthy source. I've been misled a hundred times by mainstream media orgs when they don't include the raw data (links to studies, transcripts, links to a place where the person they are writing about could respond) even though they definitely could. I've been misled maybe twice by individuals who actually cite their sources in a manner I can look up.
Edit: and indeed it seems that a media organisation has misled me yet again, declining to mention things like the 11 other metrics which Instagram was said to improve.
There is no apologia. There are many other habit-forming products we use: television, sugar, and so on. Some are used by children in many places, and they self-regulate.
Yep "my advice to parents is only give kids access to tech they're ready for"
There is no metric to check whether kids are ready for it, and I doubt that most adults are ready for the psychological effects of social media. This is too easy a way to offload the entire responsibility onto the parents, while the companies that make billions from it are off the hook. It's like blaming smokers while tobacco companies use addictive additives.
Slack took over the world? They only have 12M DAU. Sure, they’re successful in the enterprise market in the US, but by no means have they taken over the world.
Isn't this what Sacha Baron Cohen asked for in his speech? That there be state-mandated rules on what's fake vs. not. I don't think this issue has any good solutions - damned if you do, damned if you don't. The only 'okay' way to solve this is to crowd-source a variety of opinions on the issue and show what else people are saying about it - not as the ultimate truth, but to broaden people's perspective and show that there are different ways to look at the same issue.
Wikipedia built a culture in which editors who lean towards being truthful and informative are encouraged, and a userbase which values that. This is only possible with a non-profit that has strong voluntary and academic participation. Nobody will dedicate quality time to a closed, for-profit network like Facebook.
But I feel that the bigger problem is that people want fake news. Most people have a hard time living their everyday lives; fake news which supports their choices and viewpoints is a release, and it makes them feel good. And networks do a good job of surfacing similar content once they gauge one's interests.
It's not really solved on Wikipedia. Wikipedia fights happen all the time over controversial topics, and it's essentially solved by techno-theocracy: moderators show up and lock a contentious topic against vandalism when it's too heated, and whatever the current state of the edit war is gets frozen.
Wikipedia just has far fewer active users than Facebook. As in, the number of users actively creating content. The amount of content on Facebook is also orders of magnitude higher than on Wikipedia. And the friction in creating content is also much lower on Facebook.
They haven't. Wikipedia is essentially a battleground when it comes to recent topics that are politicized. Sometimes an admin will lock editing of the page, and whatever version was up at the time essentially stays there.
They haven’t solved it! A group has wrangled the power, and Wikipedia represents their vision of the truth. This is fine when talking about objective things, like whether Trump was at the White House for Thanksgiving in 2019. It is not fine for subjective things, which are WAY more of the world than people want to admit.
We eat food for breakfast.
Is this statement true or false? You could mark it either way, and both could be supported by rational thought! If you don’t first come up with a framework for answering and then evaluate the question within the context of that agreed-upon framework, the truthiness of the statement will remain subjective.
Apple makes a big deal about on-device security but they heavily promote backing up data to iCloud. They hold the keys to iCloud encrypted backups, and your backed up messages are happily handed over to LE when asked. That's why "going dark" isn't a problem on iMessage.
This. Apple cooperates with law enforcement a lot more than they let on. While they make a big deal about on-device security, iCloud, which most users turn on, is the loophole. Apple turns over backup content much more willingly when asked.
You are making a bunch of assertions with no basis. WhatsApp's encryption hasn't been weakened. FB can't look at encrypted message content, neither can WhatsApp. A lot of people use WhatsApp because of the encryption guarantees, please don't spread unverified claims.
It's impossible that everyone out there blogging actually knows what they're talking about, and figuring out which ones do is getting to be a lot of work.
I find all these articles that try to compare and measure sympathy and emotion pointless. People are comparing the outpouring of grief for Cecil to apathy towards human killings and the inhumane treatment of animals in the meat industry. A person can be severely depressed if they lose a dog, and another can show no emotion when they lose a parent. You can't compare human emotion because it's not always rational.
In this specific case though, I would also argue given the declining numbers of lions, the anger is justified.
I tend to think that human emotion is not rational by definition (and by "not rational" I mean uncorrelated, not negatively correlated). Of course, "not rational" is not synonymous with bad or superfluous.