The digital media that people consume can influence not only their thoughts but also their behavior and their mental and physical health.
That’s according to Dr. Shirley Wang, a researcher and assistant professor at Yale University who received her Ph.D. from Harvard University in 2024. Wang said social media isn't a monolith when it comes to mental health: it can have both helpful and harmful effects, depending on who uses it and how they use it. She said the effects can even differ for the same person over time.
“Work has also demonstrated that social media can have a lot of positive effects for youth from building and connecting with friends, pursuing creative hobbies, finding supportive communities, especially for youth with marginalized identities,” Wang said.
However, young people’s use of social media has become a public health concern. According to a 2025 Pew Research study, 45% of teens said they spend too much time on social media. The survey showed that 48% of teens said sites have a mostly negative effect on people their age, but just 14% believed they personally experienced a negative effect.
“There's been some research showing that a lot of youth say that they're online almost constantly,” Wang said, citing “documented associations between social media use and many mental health outcomes like depression, anxiety, eating disorders, even suicidal thoughts and behaviors.”
Wang’s research program focuses on suicide, nonsuicidal self-injury, and eating disorders. Her team integrates methods from across the clinical and computational sciences, including machine learning, mathematical modeling, and ambulatory assessment, in which participants are studied through smartphones and wearables such as health smartwatches. Wang said there's nuance in how social media affects mental health, but there are also examples of content that is clearly harmful.
“So websites that might promote disordered eating, so-called ‘pro-ana’ websites that promote restrictive eating and really extreme pursuit of the thin ideal, I think we can all agree that that is harmful to youth mental health, especially to adolescent girls, who are already facing a lot of societal pressure to achieve a certain body type,” Wang said.
Wang said people who spend a lot of time online might also withdraw from in-person relationships. Beyond social isolation, sleep is another area that can suffer: users may find it difficult to put the phone down when scrolling before bedtime. Wang said sleep is vital for both mental and physical health. She said adolescents might be more sensitive to reward-driven algorithms designed to keep users engaged; the content activates reward centers in the brain, and users often get stuck scrolling for hours. Adults struggle as well.
Social media use can sometimes become “compulsive,” but Wang hesitates to call it addictive, as people often do, in part because psychology treats addiction differently within its diagnostic system. “Doomscrolling” is a term often associated with excessive time spent consuming short-form content, sometimes news, sometimes social media, that leaves people drained or anxious.
Wang said fighting the doomscrolling urge is less about individual willpower and more about how algorithms are designed. One way to combat endless scrolling is to be more intentional about the time spent online, such as by setting time limits, turning off auto-scroll features, or not engaging with the content in the first place.
“Oftentimes, people can feel a lot of guilt and shame over this behavior, right? They say, ‘Oh, I should just be able to stop, or I can't get myself to stop,’ and what I'd say is, it's not about willpower. It's sort of like a predictable response to the way that these algorithms are designed to keep people engaged, and they really want to keep hitting on your reward response to getting content, right, and just keep you endlessly scrolling,” Wang said.
AI Chatbots
Sometimes people rely on chatbots for medical advice, whether because of a lack of access to certain types of care or because of financial limitations. Wang said it is not a good idea to rely on a chatbot for mental health advice. Some bots are sycophantic, designed to prioritize user validation, which means they can prioritize keeping the user happy and engaged over providing accurate information.
“AI chatbots sort of take this to another level, in a way that can get people really stuck and really engaged in talking to the AI chatbot because of this positive validation and this positive feedback that they're receiving. That, coupled with the constant and immediate availability, I think, can really become a challenging space for people who are vulnerable and feeling otherwise unheard or unsupported in daily life to then seek support from an AI chatbot,” Wang said.
Instead of relying on chatbots, Wang suggests reaching out to professionals for support with suicidal thoughts or behaviors or with self-harm. The 988 Suicide & Crisis Lifeline connects people with a real person by phone, text, or online chat. She said that although AI chatbots can be a fun way to pass the time, people should be cautious if they have serious mental health concerns.
“These are all resources where people receive training and guidelines on how to support individuals in a crisis, so I would definitely recommend those resources, while at the same time, you know, I understand why people do want to turn to AI chatbots,” Wang said.
There are ways to be intentional about time spent online, Wang said, such as following specific news outlets or reporters instead of scrolling for information, and seeking out positive news. Another way to curb scrolling is to set up an accountability partner: Wang said you and a friend can agree on time limits and help keep each other accountable. She also emphasized the importance of in-person activities.
“Spend time on hobbies and offline activities to sort of balance out the news cycle, and know that it's okay to take a step back for your mental health if you need to. I think there's always a pull to want to stay informed, and I think that is wonderful, but there also comes a time when you might need to take a step back and turn off the news so that you can take care of your own well-being,” Wang said.
More info:
Featured Guest: Dr. Shirley Wang is an assistant professor at Yale University. She received her Ph.D. from Harvard University in 2024. Wang’s research program focuses on suicide, nonsuicidal self-injury, and eating disorders, and aims to develop and harness novel methods that can capture and model the complexity of these conditions by integrating methods from across the clinical and computational sciences, including machine learning, mathematical modeling, and ambulatory assessment (e.g., via smartphones and wearables).
Resources:
- The Computational Clinical Science Lab, directed by Dr. Shirley Wang, is based in the Department of Psychology at Yale University. Read research on
- Visit the 988 Suicide & Crisis Lifeline if you're facing mental health struggles, emotional distress, alcohol or drug use concerns, or just need someone to talk to. You can speak with trained counselors via phone call, text, chat or other services for the deaf and hard of hearing.
- Get help with doomscrolling by texting CONNECT to 741741 if you want help figuring out what steps feel doable for you to cut back on scrolling.
Roman: What are some ways that social media can affect mental health?
Wang: Yeah, it's a great question, especially because these days we know that social media is really ubiquitous, especially in young people. There's been some research showing that a lot of youth say that they're online almost constantly, and that has rapidly become a public health concern, given documented associations between social media use and many mental health outcomes like depression, anxiety, eating disorders, even suicidal thoughts and behaviors. So of course, there can be a negative impact of social media on mental health, but I also want to emphasize that's not the full story. So work has also demonstrated that social media can have a lot of positive effects for youth from like, building and connecting with friends, pursuing creative hobbies, finding supportive communities, especially for youth with marginalized identities. So I think the effects are not necessarily the same across different people or even within the same person over time. So social media isn't really a monolith in terms of mental health. It can have both helpful and harmful effects depending on who is using it and how they're using it.
Roman: When someone uses social media, are there changes that happen behaviorally, and how so?
Wang: I think when people use social media or they're online, one change that might happen for some people is that they can start to withdraw from connections in sort of in person relationships, right in the real world, so to speak. So that's one reason that some people are concerned about social media use, is that there's this fear that folks might be spending too much time online and at the expense of building in person relationships. Again, it's not the case that this happens for everyone, or that it would even be consistent within the same person over time, but I think social isolation is probably one factor that's relevant, and another is sleep. So I think, you know, we all know sleep is so important for mental and physical health and well being, and so sometimes social media use can become a little compulsive, or a little obsessive for people insofar as it even gets in the way of them getting enough sleep.
Roman: Something that has been brought up is doomscrolling. Can you explain how doomscrolling works, and how can we stop it?
Wang: Yeah, oh, such a good question. Yeah, I do think doomscrolling is a thing, though I would hesitate to use the term addiction and apply it to social media. I know that is a really common way that people talk about it, and there certainly are qualities that can feel obsessive or compulsive for people, but I would be hesitant to use the term addiction for a variety of reasons, including how it's typically discussed in psychology and in our diagnostic system. But yeah, in terms of doomscrolling, the first thing I would say is, oftentimes people can feel a lot of guilt and shame over this behavior, right? They say like, oh, I should just be able to stop, or I can't get myself to stop, and what I'd say is, it's not about willpower. It's sort of like a predictable response to the way that these algorithms are designed to keep people engaged, and they really want to keep hitting on your reward response to getting content, right, and just keep you endlessly scrolling. I would say there are some strategies that people could take in order to reduce how much they're doomscrolling, like trying to be more intentional with when they go online, setting time limits for themselves, turning off auto-scroll features and things like that. But yeah, I think doomscrolling is less about individual willpower, and probably more about the way that these algorithms are designed.
Roman: Are there differences in the ways that social media affects adults' brains compared to young people in sort of their developmental stages?
Wang: Yeah, I think this is one reason that people are especially concerned about social media in adolescence, because we know that the teenage years are a period of really rapid change in terms of social, emotional functioning and brain development. So in general, adolescents might be more sensitive to reward, right? So part of what I just said about these algorithms being designed to keep us engaged and to keep us scrolling on the apps, there is a risk that adolescents might be more susceptible to that and find themselves getting stuck in scrolling for hours, to the point where we see interference with things like sleep and in person social behavior. That said, I think there's also a lot of evidence that adults struggle with this as well.
Roman: Your research discusses suicidal ideation, and we can see similar topics of self-harm, body image issues, even eating disorders all mentioned within the conversation about social media. How does exposure to the digital world influence perception of the world?
Wang: Yeah, it's such a great question. I mean, especially when we think about eating disorders, self-harm, even suicide risk, unfortunately, there can be pretty harmful content online. I think we've been talking so far about there being nuance in how social media impacts mental health, and that's certainly the case. But there are also examples of content that is really clearly harmful. So websites that might promote disordered eating, so-called "pro-ana" websites that promote restrictive eating and really extreme pursuit of the thin ideal, I think we can all agree that that is harmful to youth mental health, especially to adolescent girls, who are already facing a lot of societal pressure to achieve a certain body type.
Roman: We're seeing more reports of people turning to things like AI chatbots for companionship, sometimes for medical advice even. Why do you think that is?
Wang: Oh, it's such a good question, and this is what my lab is really interested in right now. So we're doing a number of studies in this area, and I hope that in the next six months to a year, I'll be able to say more about that. But I think there are a lot of reasons. The first, probably most obvious reason is that it can be really hard to access care in real life, right, whether it be for a physical or a mental health concern. Oftentimes, people's doctor's visits aren't totally covered by insurance. Sometimes people can't afford their insurance, even when insurance covers it. Sometimes it can be really hard to find a provider, to find a therapist with availability, who they feel connected to, and who they feel comfortable with. And then, even if someone does find a therapist, that person isn't available 24/7, right? Typically, you're going to see your therapist once a week, or once every other week, for about 45 minutes. And so the question is, what happens during the rest of the time when you're not in the therapy room, and yet you still might be struggling and need support? And so I think all of those reasons are why people might turn to AI chatbots for support with emotional distress or with mental health, or, like you said, even for medical advice: because of a lack of access to other types of care. And then there are also ways in which these AI chatbots, again, are designed to keep people engaged, right?
Many chatbots are known to be sycophantic, right, meaning that they might be overly sort of positive in their reactions to someone, really lavish in their praise, really, really validating. And all of those are strategies that a therapist might use as well, or friends might also be supportive and encouraging and validating, but AI chatbots sort of take this to another level, in a way that can get people really stuck and really engaged in talking to the AI chatbot because of this positive validation and this positive feedback that they're receiving. That, coupled with the constant and immediate availability, I think, can really become a challenging space for people who are vulnerable and feeling otherwise unheard or unsupported in daily life to then seek support from an AI chatbot.
Roman: What are some ways that these chatbots can be beneficial, and then, what are some things that people should maybe be careful about?
Wang: Yeah, I think, well, clearly, AI chatbots are here to stay, right? They're not going anywhere. They're going to be a bigger and bigger part of our life for the years to come. So I think it's really important for us to understand when AI chatbots can be useful and when they might be harmful or even potentially life-threatening in the case of suicide, and I, unfortunately, just don't think we have a solid understanding about the impacts of AI chatbots on mental health yet. This is an extremely new area of research. There is not a lot of data on it, and yet it's being used constantly. So first, I would just say there needs to be a lot more work being done on this topic, but based off what we know right now, I would say, you know, for dealing with emotional support, in terms of figuring out how to navigate challenging social situations, AI chatbots might be able to help you identify different strategies that you could try, right? So helping you to think outside of your own perspective and find new ideas for potentially engaging with people or with navigating social conflicts. But I am wary of using or relying on AI chatbots as a therapist. I have some concerns about when people are, you know, using AI chatbots as a form of therapy, rather than as maybe a sounding board or maybe a way to help them understand other people's perspectives or to generate ideas about how to approach situations. But I just think that when it crosses over to sort of relying on it as a therapist is where we just don't know what the impacts of that could be.
Roman: For those listening who maybe feel like they do need that resource, where can they look for resources that are different than just going to some chatbot?
Wang: Yeah, so people are right now building AI chatbots specifically for mental health. So these are algorithms that are being trained with real feedback from therapists in order to understand how evidence-based guidelines might interact with AI chatbot output, figure out what guardrails to put up, and how to provide responses that are a little bit more in line with our evidence-based practices. So some of that research has been published recently, and as those apps are developed a little bit more, I think those could be amazing resources for people to use. As it is, there are a number of digital mental health applications available that are evidence-based, that people could try. And if people are listening and specifically thinking about, you know, reaching out for support with suicidal thoughts or behaviors or with self-harm, there's the 988 Suicide & Crisis Lifeline and the Crisis Text Line. These are all resources where people receive training and guidelines on how to support individuals in a crisis, so I would definitely recommend those resources, while at the same time, you know, I understand why people do want to turn to AI chatbots. And to be clear, I don't think it's necessarily all harmful. I just think we need to approach it with a cautious and critical lens.
Roman: Oftentimes, we're just inundated with so much that is going on with the world. How can we stay informed but not become overwhelmed with all of the news, especially the negative news and the information that we see online?
Wang: I think there are a few strategies people can use. One is being really intentional with the accounts that you follow in order to stay up to date with news, maybe following specific news outlets or certain reporters, right, but limiting the extent to which you're scrolling on a For You page, where you might get a little bit more bombarded with information that's outside of your control. So taking a little bit more agency and autonomy back into selecting specifically who you want to be receiving news from and who you're following, and setting up accountability with people in your own life, right? So maybe you and a friend together can decide to only spend 20 minutes a day on a social media app and help keep each other accountable. And then I would also just say, making sure to plan time to intentionally engage with positive news as well. So whether that be following creators or artists who produce things that bring you joy, setting up in-person events, making sure to, you know, spend time on hobbies and offline activities to sort of balance out that news cycle, and knowing that it's okay to take a step back for your mental health if you need to, right? I think there's always a pull to want to stay informed, and I think that is wonderful, but there also comes a time when you might need to take a step back and turn off the news so that you can take care of your own well-being.
Roman: And let's say that somebody wants to do that. You know, they've just gotten out their phone, maybe they've just seen way too much that's going on with the world and they want to take a step back. What can they do to sort of decompress and really get back to themselves?
Wang: Yeah, that's a great question. I think there are a lot of strategies that could be useful. Self-soothing skills are one that I would recommend. So thinking about how to do activities that can really calm you down and help you find a little bit of peace amongst a really chaotic world. For some people, that's going outside to take a walk. For other people, it's reading a book. It could be, you know, physically getting out of your house and going to a different location, or even just standing up from your desk and taking some time to stretch or practice some deep breathing. But I would be really intentional about how to make sure you're using these skills in your daily life. So often we can sort of be on autopilot when we're feeling tired or stressed and just open our phone and sort of auto-navigate to a social media app even without realizing that we're doing it, right? So putting in some training to sort of train yourself to do other strategies as well. So maybe that's, like, setting a rule that before you open Instagram, you're going to get up and stretch for five minutes, or go walk down the hall to make yourself a cup of coffee and then come back, right? Putting in those little breaks and little barriers can just help you be more aware of what you're doing and then a little bit more intentional about how you do it.
Roman: What's one thing that listeners can do right now to build a healthier relationship with technology and safeguard their mental health?
Wang: Yeah, so I would say research indicates that there are many different ways that people use technology and social media, and some tend to be more positive and some tend to be more negative. So a clear distinction is between sort of active versus passive use of social media, right? Passive use, meaning, like what you mentioned earlier with the doomscrolling, just sort of being a more passive observer of what other people are putting out into the world, tends to be associated with more negative outcomes, versus more active use, which could involve using social media to chat with friends or creating your own content or using it to explore different hobbies, which tends to be associated with more positive outcomes. So I think one thing that people could do is just bring awareness to when they find themselves sort of stuck in a more passive use pattern, where they are doomscrolling or using social media in a way that's almost mindless, to just bring their attention to it, sort of notice what they're doing, and then make an intentional choice to try something a little bit different, whether that be DMing someone to start a conversation or turning off their phone and going to pursue another hobby. So I think just bringing awareness to when people are getting a little bit more stuck in passive doomscrolling, and then trying to take back some of the autonomy and intentional decision-making.