Dairy is completely unnecessary, for one. Its prominence on the plate makes everything else immediately suspect. There are probably some axes along which a glass of milk or cup of (unsweetened) yogurt is one reasonable option but that's not what is being promoted here.
I see it as more of a limit than a requirement. After all, you can technically satisfy your dietary needs with vegetables and eliminate fruits. But we aren't talking about technicalities and edge cases, rather what a balanced diet might consist of. For many people that does include dairy and fruits, even if neither are completely necessary.
I would gently suggest that you may be blinded by a cultural bias here, which has partly been formed by the dairy lobby over the course of every living American’s lifetime. While it is true that we are not the only culture that drinks cow’s milk, it is predominantly a Northern European and later American phenomenon, and the number of people who are intolerant to dairy on some level is very high. I’m not saying a balanced diet cannot consist of dairy, but implying it should, as the plate diagram does, is highly misleading and outright paid for.
"I’m not saying a balanced diet cannot consist of dairy, but implying it should, as the plate diagram does, is highly misleading and outright paid for."
And to exclude it would imply that it cannot be part of a balanced diet. That would be misleading based on the predominant culture.
"I would gently suggest that you may be blinded by a cultural bias here,"
I would suggest that you are not aware of the cultural background. The US was colonized by Europeans. Many cultures who immigrated also used milk, cheese, or other dairy products. It makes sense that the guidelines be based on the cultural background of the foods eaten in that country.
Also many Asian countries have nutrition guidelines that include dairy products, not to mention historical cultural foods that do include dairy.
I write what I want the LLM to do. Generating a satisfactory prompt is sometimes as much work as writing the code myself - it just separates the ideation from the implementation. LLMs are the realization of the decades-long search for natural language programming, dating at least as far back as COBOL. I personally think they are great - not 100% of the time, just as a tool.
My daughter hears about gun threats at her high school weekly. I don't know how many are actual threats, but they have implemented a transparent bag policy; it's a real problem.
> they have implemented a transparent bag policy; it's a real problem.
This makes the assumption that all policies have a reasonable justification, so that the existence of a real problem can be inferred by the implementation of a policy which would only make sense if (1) there was a real problem, and (2) the policy was an effective mitigation.
I would suggest that this assumption is both false and dangerous, in that it makes one trivially manipulable by anyone in a position to set policy.
You are correct in that I did not specify what the actual problem is. There is a problem of perception, which the transparent bag policy will at least partly address, at relatively little cost. The problem of perception is almost certainly more troublesome than the reality in the majority of cases - the exceptions being notable - and while transparent bags may not be an effective deterrent, that doesn't mean they don't serve as one at all. There is also, in this case, a very real and well-known problem in American schools, including multiple guns confiscated and at least one credible threat in the past semester at this particular school.
I too kind of roll my eyes at the bag policy but it's at least an acknowledgement that something needs to be done about the problem - more than we've gotten from our politicians in the past two decades.
I'm sorry, I hope I don't come off like I'm minimizing a real problem here, but from the outside looking in, it just feels like an entirely alien line of reasoning that could only describe a solution to an imagined problem. However, I'm also missing the lived experience of what being in the US is like right now, and especially missing the context of being a child with peers that make threats like that weekly. I'm empathetic to that situation, but not to the framing that surveillance is somehow stopping those weekly rumours from being weekly atrocities. That's a huge leap.
> However, I'm also missing the lived experience of what being in the US is like right now,
I’m in the US and this story feels extremely foreign to me. Even hearing a rumor about a gun threat at my kids’ school or any of my friends’ kids’ schools would be a topic of discussion for the next year with parent-teacher meetings, the school communicating with parents to shed light on what happened, action plans, and so on. Fortunately nothing like that has happened, but this is the level of communication that happens for even rumored threats.
The US is a huge place, though. Sometimes I don't think outsiders understand how big and diverse this country is.
When I was in school, the administration would work itself up into fits about "gangs infiltrating the schools" because an 11 year old wore a red or blue hat to class, clearly gang colors and a sign of the times.
This was in a wealthy suburb where people like that have to make up imaginary threats in order to feel something, and what better population to fret about than the kids.
The cops' reluctance to investigate probably had something to do with the fact that some of the gang members were white student athletes with very wealthy families.
That's a real thing that actually happened, my school administration was worried about "ethnic" gangs and rappers they saw on MTV turning 11 year old white kids into, using their words, "gangstas". Same people were running around like headless chickens about "rainbow parties" and FPS games a couple of years before.
I also thought the rest of the interview was really worthwhile - they talked a lot about real problems in the medical industry from different perspectives. What a great and critical discussion from Dr. Mike. If Amen had conceded the point they could have moved on. There could be real findings to be had there, and some may even match his conclusions, but many likely will not, and the whole thing could also be pure fiction. We should want better answers to these questions. It's unfortunate to watch someone as seemingly intelligent and well-informed as Amen come across as shilling snake oil, and/or just being hung up on his ego, at the end of it all. Scientific literacy is so critical, because it's easy to cloak pseudoscience behind high-tech smokescreens.
It kind of depends on how you define "history". Before STEM dominated the hiring landscape, Universities were less career focused. No employers in these fields, as far as I know, have ever offered apprenticeships to teach new hires chemical engineering or applied mathematics from the ground up. University will not prepare you for a corporate job, exactly, but it gives you a background that lets you step into that, or go into research, etc. Lots of employers expect new hires to have research skills as well.
I think there are a number of ways in which financial incentives and University culture are misaligned with this reality.
I do think there can be element of snobbishness around it, but that's not really the point. The overculture of corporate America has finally overtaken the hackerish (relative) meritocracy of early tech, of Getting Things Done and Building Cool Stuff. Rewards are increasingly tied to metrics decoupled from useful outcomes. If you want to get paid a big tech salary you need to go through the leetcode grind, and do things like project sufficient "masculine energy" (lol). Management performance is measured by hiring and expansion more than product delivery and success. The ethics of what you are doing are completely secondary to shareholder value. You still need technical skills, but they are somewhat less important, there are many more competing incentives than there used to be, and the stakes are higher. This has been happening since the early days - cf. Microserfs, written all the way back in 1995 - it's just that tech has worked its way so thoroughly into the fabric of corporate existence that the two have more or less completely merged.
I also enjoy coding! It’s fun. It’s also only about 10% of my job as a software developer, and I can and do use an LLM for it whenever I can find an opportunity. The author is a professor. Not to disparage that perspective, but she paints a picture of the joys of programming that are overshadowed in environments where you are actually building real world robust systems with clueless users, vague requirements, shifting budgets and priorities, etc.
As to why not use C, or assembly, it’s not just about the code, but the toolchains. These require way more knowledge and experience to get something working than, say, Python - although that has its own rather horrible complexities with packaging and portability on the back end of the code authoring process.
I have done a rigorous job of self diagnosis. I am autistic. I’ve also had the privilege of being able to pursue meditation, therapy, and other self development practices: I’m not as severely autistic as I was as a young man. I also have childhood trauma that I know contributes to many of my autistic presentations — see the last section on comorbidity. I also have some distinct ADHD symptoms but have never pursued that path because my hyperfocus tends to win out often enough that it’s not a hindrance to productivity. But it still causes problems elsewhere in my life.
For some people these diagnoses will be a very good fit with clear predictive outcomes. But many of us have a grab-bag of traits from several categories and still mostly get along in life, maybe with some assistance particular to one of these diagnoses but no more help overall than anyone else needs otherwise.
The diagnostic models suck. They are too broad here, too narrow there, misunderstood by professionals. I had a psychiatrist (mis)diagnose me as bipolar based on a 45 minute appointment when I was in some sort of crisis in my early 30s and that ended up haunting me years later when applying for a job with a security clearance. I didn’t even know about it at the time. This was one of the top rated doctors in a major metro area. What a sham.
The field is a mess. It has a terrible history of horrific abuse. Some autistic children still receive involuntary-to-them ECT. I think we should be supportive of research into these topics while also being critical of the very obvious problems with them.
Your experience illustrates something that often gets lost in the autism-vs-not-autism debate: many people don’t fall into clean diagnostic categories. You’re describing a profile that mixes autism traits, trauma adaptations, ADHD features, and developmental history, and instead of neatly labeling you, the system failed you outright with a bipolar misdiagnosis. That alone shows how fragile clinical certainty really is.
I think the most important part of what you wrote is that you changed over time. Whether that improvement came from meditation, therapy, maturity, trauma processing, or simply growing into yourself, it challenges the idea that autism is a static essence. Development, coping skills, neurology, and environment interact in ways the current diagnostic boundaries don’t fully capture.
Where I push back slightly is on the conclusion that self-diagnosis can automatically fill the gaps. For some people it’s deeply accurate and validating, for others it may explain one part of their experience but obscure another. As you said, many people carry a “grab-bag” of traits, and a single label can illuminate or compress that complexity depending on how it’s used.
You’re right that the field has a painful history and uneven present. Misdiagnosis is real. Forced treatment is real. Diagnostic tools are blunt instruments for a very diverse human reality. Supporting research while staying critical of the system makes sense, not because autism isn’t real, but because the categories we have are still evolving. Your story is a perfect example of why humility in diagnosis matters, whether it’s done by a psychiatrist or by oneself.
Mental healthcare in general tends to suck. I went for years to a boutique psych practice that had suspect people working for it and that would just increase doses until they were prescribing the max allowed of various meds.
What I’ve noticed is that if a doctor’s or dentist office looks stylish, consider moving to a different one. It’s not worth ruining your life, health, teeth, etc.
I was also misdiagnosed as bipolar due to a crisis years ago, which destroyed my career path in the military and post service. Since then I’ve been diagnosed as autistic, but much like you I’m just capable enough to kind of run the rat race but not quite capable enough to thrive.
Not the OP, but after a couple of decades of people pointedly talking about eye contact, small talk, and body language, you learn “coping mechanisms” to deal with neurotypicals and make them more comfortable.
Did your sporting team have success on the weekend? Wonderful, direct eye contact, smile, mirror. Ok, now, to business:
I masked for years but recently (possibly linked to some bereavements in the family, who knows what the actual trigger was if there even was one single trigger) the constant effort required just burned me out. Anxiety spiked, depression symptoms loomed, and I just felt exhausted all of the time.
I've spoken to many people in the past 10 years or so who were in a crisis/burnout/depression or however they personally labelled their situation with varying degrees of bad mood, depressed affect, and reduced energy. Every single one of them had a mask they had been wearing for a very long time, and which was hugely mentally draining on them. Most of them wore the mask especially when interacting with themselves, interestingly.
Some of them self-identified as neuro-atypical (with or without professional diagnosis), others didn't. Some of them identified their situation as a co-morbidity of being atypical, others as a result of it, or as a pure coincidence. It's not clear to me whether the masks themselves and/or the current inability to wear them were a reason, a symptom, or just a coincidence of said situations and/or the subjective or objective atypicalliness. But whenever I hear that masking has such a huge drain on people with ADHD/autism I wonder about the questions of cause and effect, the question of correlation and causation, and the question of (self-)selection bias. It's really a mess and it's very difficult to make sense of any of that. But mostly, I feel that discussing ways how society could reduce the pressure to mask might be more beneficial to everyone than finding the perfect definitions for groups of people who have an accepted reason to be drained by their masking, while others must still endure because their masking is not socially or medically recognized as unnecessary suffering.
Masking is defined as a maladaptive strategy, so I don't think its being generally mentally draining is disputed. The issue is that there is sometimes a tendency to call any coping strategy masking. There are coping strategies that can be successful, and there are reasons to adopt them other than to hide not being neurotypical or to make neurotypicals happy.
> Not the OP, but after a couple of decades of people pointedly talking about eye contact, small talk, and body language, you learn “coping mechanisms” to deal with neurotypicals and make them more comfortable.
It sounds to me like the article author calls that social awkwardness not autism, no?
>The key distinctions are that socially awkward individuals understand what they should do socially but find it difficult or uninteresting (versus genuinely not understanding unwritten rules), show significant improvement with practice and maturity, are more comfortable in specific contexts, lack the sensory sensitivities and restricted/repetitive behaviors required for autism diagnosis, and generally achieve life goals despite awkwardness rather than experiencing clinically significant impairment.
It seems to me that this sort of definition would preclude any person having general intelligence such that they are able to learn to mask (or feel like they have to mask less in certain safe areas).
Yeah, really good point, and I question my own diagnoses sometimes. However: I did not understand for many years why I needed to mask. I was not being contrary or looking for attention - I really did not get it.
Once you understand that neurotypicals have special needs and you must play-act to smooth things over, then you play the game.
I think your comment is very insightful. It made me think and reflect. I am not socially awkward; I am, however, autistic. I really think so. My ability to appear less so over time is my own achievement.
I think I have some kind of autism because I have been doing this social skill game for decades and it still has not "clicked" with me.
It does not mean I am bad at it; it means I don't understand the rules. I can copy other people's tactics and sometimes it works, but I still don't know why.
One important detail is missing from it: masking is far from free. I have met at least several people who are capable of masking but grow so resentful of it that they outright refuse to do it for the barking-mad neurotypical society any more. The reason for the harsh references to mental health? Because "most common" doesn't equate to not being deeply irrational and dysfunctional. Keeping up with the Joneses to the point of keeping oneself in debt? "Normal" behavior, but frankly it belongs on a diagnostic list for something.
That’s what I’m curious to hear from you and OP…does that make the autistic person less autistic? Or is it a mask?
I, as a non-autistic person, have lots of default tendencies which were socially discouraged as a child and which are now no longer part of my self concept. I'm not "repressing" a desire to be awkward, I've simply learned to be less awkward.
But my understanding of autism, which is I think backed by the article itself, is that autism exists as a fundamental cognitive process and tends to be pretty stable.
Btw the reason I ask is to learn…as a software dev and manager, several of the people I interact with could probably be diagnosed autistic and I’m always curious to try to understand what that’s like better.
As part of my job, I have to interview and hire people.
When I first started interviewing people, I would have crippling anxiety. On days I had an interview scheduled with a candidate, I would obsess and have anxiety to the point where I wasn't able to focus on anything until the interview was over. It was bad. I'd spend hours rehearsing every line I was going to say. I was an incredibly awkward interviewer.
Fast forward 10 years and hundreds of interviews later, the anxiety is completely gone and an interview doesn't even spike my heart rate anymore.
I absolutely met multiple DSM criteria for anxiety 10 years ago, but not anymore.
I suppose I was cured through "exposure therapy" (or whatever you call doing something repeatedly that gives you massive anxiety).
Interviewing still doesn't come naturally to me. But it's easy now because every interview is basically scripted. I repeat lines that I memorized over the years. I always start interviews with the same ice breaker. I use multiple tactics to put myself and the candidate at ease throughout the call.
Do I still have anxiety even though I've learned how to cope with it? I don't know.
Is someone still autistic if they were able to learn coping tactics that make the symptoms invisible to themselves and others? I don't know.
I think the difference is that if an autistic person learns to mask, that's probably useful as a coping mechanism but doesn't remove the autism in the sense of making the fundamental neurological difference go away. Anxiety (even in anxiety disorders) can be fundamentally reduced by exposure therapy, not only in the sense of finding more effective coping mechanisms but in the sense of the anxiety itself diminishing or ceasing to exist.
For what it's worth, exposure therapy is a real term and it's an actual part of cognitive behavioural therapy.
Interviews are pretty weird in that you meet a new person and then very soon need to have a really serious conversation about important things. It's also a social context which is inherently stressful for the candidate, and where being good at putting them at their ease gives you better results.
I even notice that during the times of my life when I hire a lot, I have a much easier time being extroverted and social at parties and in everyday interactions.
During times I don't have to meet new people as often, I get worse at it.
Very good questions. Some part of the answer might come down to your identity. Do you identify as an anxious person or a person with anxiety? The question would become even more interesting if you were taking a drug: is a person treating their depression successfully with an antidepressant still a depressed person? Or a person with depression? Do you have high blood pressure if you don't have high blood pressure due to meds?
I don't know the neurological mechanisms behind autism, but I know that ADHD is, briefly, defined by a reduction of dopamine receptors across your brain.
The brain is neuroplastic, especially when young, but I doubt you can influence the growth of significantly more dopamine receptors out of pure willpower and habit-forming, especially given that ADHD disrupts those two faculties.
This is in part why dopaminergic drugs such as Adderall work so well, and why dopamine/reward-center disruption due to childhood trauma can have such a negative impact on one's ADHD symptoms.
Again, I don't know how much this applies back to autism, but it has definitely been a bane of my existence constantly explaining to people why I can't just meditate, habit-form or diet or exercise away my symptoms.
These things help, as does directed research and experimentation with what does and doesn't work for me, and because of my ADHD these things are integral to my ability to function as an adult in this insanely complex and stressful world. And it's definitely made a difference in how I manage my symptoms, especially when I look at how my siblings don't manage theirs and lack basic coping mechanisms.
But I frequently run into people who arrogantly assume I've never even heard of meditation, or that I have a bad diet, etc. and offer them up as panaceas. These people often get defensive and more arrogant whenever I try to explain to them that ADHD is not just some "mental block" or collection of bad habits that can be "fixed".
So yea... I also think we need to do way more clinical studies about the effects of teaching coping mechanisms at a young age, but I don't think autism is something that you can grow out of, there are likely specific underlying genetic and neurological factors that affect how much a specific individual can control or cope with their symptoms.
Society is moving in the right direction at least. At one point, the bell curve had 3 sections: normal, genius, retarded. Now we have more gradients and some of them trigger help or maybe longer exam times.
This causes over-diagnosis and resentment. Coping mechanisms grow over time. It’s definitely better if you can appear neurotypical.
You don't grow out of it as much as learn to manage it, this requires that you develop some form of executive function though. In my case I was forcefully required to be responsible for my younger brother (when I was 7) and so learned out of necessity -- but this led to a lifetime of resentment and so I don't recommend it as a solution.
I was homeless by 16 and had no safety net, had to graduate high school on my own while living out of someone's garage, and generally take care of myself most of my life due to absentee, drug-addict parents, and I can tell you that this trauma only worsened my executive function by the time I had the privilege of being able to sit back and reflect from a place of security and comfort.
I'm sorry you have resentment issues... definitely get that.
I think so. If I had had somebody in my youth who taught me how to interact with people I am pretty sure I would have done much better. The worst for me was to notice that I don’t fit in and had nobody to help. It was extremely lonely and depressing. But I am also a pretty mild case and performed well in school and work. I am not sure how it would have worked with severe autism cases, for example non verbal people. That’s a different ballgame.
A common issue is that autistic children tend to have autistic parents, and many autistic parents are sadly bad at helping their kids understand social interactions.
Commonly called masking, i.e. learning the "rules of the road" for peopling. The hardest lesson for young folks with autism or ADHD is that you must learn how to do this, because the world will not (often or always) change to accommodate you. But once you do, you can appear more or less normal most of the time.
When I was starting in retail, I had several notebooks I made of conversational flowcharts. They helped a lot back when I was in my late teens, and, along with some improv training, I can put on a pretty convincing performance now (my customer service skills are top notch). The only problem I have is when people veer too far off of my training.
The problem is that at times the world is fucking insane, and its members are the ones who belong involuntarily committed as a danger to self and others. Instead they are put into positions of power.
There's nothing in the diagnostic models for nearly any mental health concern that presumes a patient would carry that diagnosis forever, nor (certainly) that its presentation would be identical throughout their life even if the diagnosis stood.
There are some clinicians and unfortunately now many patients and caregivers that nonetheless take an essentialist view of diagnosis and come to identify their patient/self/child/peer with what's really just meant to be a guideline for support with ongoing dysfunctions.
In reality, most people face some fluctuating bag of dysfunctions over the course of their life, with fluctuating intensity, with contributing causes too diffuse and numerous to identify. They might be diagnosed squarely by one clinician with one thing at one time, then see some other clinician the same day who thinks the diagnosis was overstated or preposterous. Or they might find that a qualifying symptom that seemed very salient at one time of their life hasn't been an issue for them for a long time because of some new learned behavior, some change of circumstance, etc. Likewise, they may even find themselves facing new or greater dysfunctions compared to what they'd experienced or noticed before, precipitated by known or unknown causes.
For people most intensely disabled by mental health dysfunction, they often can't escape that dysfunction entirely without the discovery and resolution of some kind of radical physiological or environmental issue.
But for the majority of people who just found that they had a hard time with their daily life, but were otherwise independent, and received a diagnosis that helped them see some constellation of related factors and opportunities for accommodation or treatment, things are hardly so static.
For most of early psychology, this marked the distinction between "psychotic" and "neurotic" presentations. The former represented a disruption so severe that escaping disability and achieving independence were largely out of reach, whereas the latter were understood to be real but fluctuating or even ephemeral disturbances.
It's not really until very recently, when so many people started to obsess with "identifying" themselves with this thing or that thing in some kind of permanent way, that this distinction began to fall out of mind.
In the case of those diagnosed with autism as part of generally independent and functional lives, it's not hard to find people who have experienced changes to the symptoms that originally qualified them for the diagnosis -- sometimes positively, sometimes negatively; sometimes during certain times, sometimes permanently. It's also not hard to find people who received such a diagnosis at one time and either felt comfortable fully rejecting that diagnosis at some later time or had a clinician who strongly questioned it or refused to confirm it. None of this stuff is static and much of it is subjective.
Let me ask you this: what is even the purpose of ECT? What does it cause in the brain, and how come people figured out that this may be a positive thing?
My “anti-ECT” stance is more that even modern ECT still has permanent side effects, “voluntary” does not have the standard meaning in in-patient psychiatry, and it is not impossible for a patient to have more or less every treatment thrown at their brain rather than contacting that patient's regular psychiatrist to get relevant context.
It would be of more interest to me personally to discuss the topic at hand itself rather than involve our personal opinions, so if you can't make your point another way, this is where our ways part.
I strongly believe I was misdiagnosed with autism when in reality the traits were caused by traumatic backlash from those I was supposed to trust towards ADHD traits that would have calmed down after adolescence. The diagnosis was largely a red herring for me and led me down treatment paths that did not address the root of my issues, and I believe I suffered unnecessarily as a result. It is insane to me that people are sooner to blame vaccines and diet than childhood upbringing/environment for causing symptoms construed as autism or ADHD. It makes sense though - no parent wants to be blamed for their child's lifelong disorder, just as mine still don't to this day. Cancer might just be curable, but a parent who refuses to change their mind will never be.
I am doing better these days but I sometimes wonder how I would have turned out if I got help sooner, instead of spending years and years searching for the wrong kind of help. It doesn't help that society is talking more about this and inadvertently leading people to believe that these problems are just the way things are, without considering upbringing and environmental factors.
At the same time, blaming the wrong problem is different from spending all one's time blaming the right problem, which is different from letting go of the past and doing the best one can with one's life. It is nearly insurmountable for me, but I try to put forth an effort each day.
I think your point could be better made with less vitriolic language, and I also think you get a few things wrong: a bunch of my peers were over-medicated to the point of being senseless during the late 80s and early 90s. These drugs were pushed on kids by many well meaning but exasperated parents whose children - mostly boys - could not sit still and behave in the way demanded of them by school and society. So it's a mixed bag with regard to the intent behind medication, and the effectiveness with which it was applied. Nowadays, if anything it's harder than ever to get amphetamines because of US drug scheduling policies and our patchwork, piecemeal healthcare system.