Ethics in digital mental health
As daily life becomes more influenced by digital technology, it is increasingly common for therapy and psychological care to be provided through mobile or web-based applications. According to Fortune Business Insights (2026), the global mental health apps market was valued at approximately USD 7.48 billion in 2025 and is projected to grow significantly over the next decade. This shift from the traditional therapy room to online and web-based formats offers real opportunities. For clients, digital tools facilitate access to psychological support and allow it to be more easily integrated into everyday life. For therapists, this shift allows progress to be monitored between sessions while promoting continuity and structure within the treatment process (Koh et al., 2022).
At the same time, digital mental health is not simply a technological extension of existing care. Instead, it brings together two professional worlds with contrasting priorities. Clinical practice centers on individualized care and professional responsibility, while technology development is primarily driven by rapid innovation and user autonomy. Although these approaches can complement each other, tensions arise when therapeutic functions are translated into digital products. In our work at the intersection of psychology and digital health, where we develop mental health–oriented products, we observe these tensions in practice. They surface in product design discussions, feature decisions, and the way interventions are presented to users. The tension is particularly visible when individualized professional oversight is replaced by reliance on user autonomy. Clinical care is grounded in one-to-one responsibility and clear accountability, with professional gatekeeping as an ethical safeguard: before offering specific interventions, clinicians carefully evaluate a patient’s suitability and potential risks. Digital systems, by contrast, rely on automated and distributed processes designed for broad access and self-directed use, bypassing individualized clinical evaluation.
Marketing trauma therapy apps and professional ethics
This gap is clearly reflected in marketing practices. In the EMDR app market, for example, one trauma (PTSD) therapy app proclaims that users can independently “heal trauma” or “reduce PTSD symptoms” through self-guided sessions, tracking progress with streaks and milestones. It also claims endorsement by the WHO, the APA, and government departments, although these endorsements apply to clinician-provided EMDR therapy rather than to the app itself. Using such an app not only risks delaying appropriate professional care; it can put users directly at risk, because processing intense emotional memories requires professional guidance and monitoring.
Therefore, it should not come as a surprise that leading professional organizations are unambiguously opposed to such practices. The EMDR International Association (2020) explicitly forbids self-administration, while guidelines from the American Psychological Association (APA) and the World Health Organization (WHO) recognize trauma treatments only when delivered or facilitated by qualified professionals (APA, 2025; WHO, 2013). Similarly, EMDR France refuses to recognize any EMDR therapy app. Francine Shapiro, the developer of EMDR, also warned against self-directed use, stating that “attempting self-directed therapy […] can result in retraumatisation since the memory may merely be dissociated once more rather than reprocessed” (Shapiro, 2018).
Unsupervised interventions: risks, responsibility, and accountability
The concern is not limited to PTSD, but extends to any mental health intervention that involves distressing symptoms or experiences. Without adequate professional oversight and safeguards, users may become overwhelmed or delay seeking appropriate care because they believe they are already receiving effective treatment. Consequently, marketing claims that extend beyond established professional standards are not merely promotional exaggerations; they carry real risks for vulnerable individuals across a range of mental health conditions.
The question that follows is not whether digital mental health should exist, but under which ethical conditions it can responsibly operate. In digital environments, therapeutic functions may be embedded in platforms where responsibility is distributed across developers, companies, and users. When adverse effects occur, it may seem less clear who carries ethical accountability. From a clinical perspective, however, it is clear that certain responsibilities cannot ethically be transferred to untrained users. Clinicians continuously monitor safety, judge how difficult material should be approached, and recognize when a person becomes overwhelmed so they can intervene to restore stability. In a therapeutic setting, such safeguards are integral to the intervention itself. When structured oversight is lacking, users follow a treatment-like process without the supervision necessary to manage potential risks.
Ethical design of digital tools
Digital tools can also be developed in ways that respect clinical safeguards rather than attempting to replace them. We designed the EMDR app as a tool within treatment, not as treatment itself. It provides bilateral stimulation that therapists and clients use within a therapeutic context. It doesn’t assess, diagnose, or guide; it doesn’t encourage independent trauma processing; and it doesn’t use engagement strategies such as streaks, daily goals, or achievement systems. We’ve also refrained from implementing features when there is insufficient scientific or practical evidence supporting their safety or effectiveness.
We do not consider these to be missing features. In the context of mental health, the absence of engagement-driven features is an important feature in its own right: it respects the ethically necessary boundary between tool and intervention. The therapist provides the guidance, and the app provides the stimulation. We consistently prioritize ethical responsibility and user safety over engagement-driven product design and the false promise of substituting for professional care.
The education gap
Despite the ethical need for an approach like ours, many current market practices still reflect the structural tension between clinical care and technology, exposing a consequential educational and professional gap. Psychologists and clinicians are trained within ethical frameworks that emphasize competence and accountability, whereas digital entrepreneurs typically prioritize innovation and scale, often shifting responsibility onto end users. As a result, the two professional cultures shape the same intervention space without consistently shared ethical competencies. This divide cannot be addressed through compliance formalities or advisory roles that remain peripheral to core decision-making.
We believe that ethical reflection needs to be systematically integrated into the earliest stages of product design and strategic planning. Ethical considerations should not follow innovation as a corrective measure, but accompany it as a guiding framework. We see an opportunity to embed interdisciplinary collaboration within both digital health and psychology training, enabling future professionals to develop shared ethical principles and a deeper understanding of the risks inherent in psychologically oriented technologies. This requires developers to engage more directly with clinical ethical standards and evidence-based practice, while clinicians must develop greater familiarity with digital technologies and participate actively in the design and evaluation of digital mental health tools. Without such deliberate integration, tensions between the rapid innovation of digital mental health technologies and the ethical responsibility to safeguard users will continue to influence decisions, to the detriment of users. Developing sustainable and trustworthy digital mental health systems therefore requires not only technological expertise, but also strong ethical leadership supported by education and collaboration.
Building responsible digital mental health
We’ve seen that the rapid growth of digital mental health technologies creates new opportunities to expand access to psychological support, while also raising ethical and professional challenges. As clinical care and technology development increasingly interact, differences in training, priorities, and accountability can create tensions with real consequences for users, particularly when therapeutic interventions are offered without professional oversight. Addressing these challenges requires integrating ethical reflection into product design and expanding interdisciplinary education. As psychologists working in the field of digital products, we see it as our responsibility to ensure that ethical standards and appropriate safety measures remain central to how digital mental health technologies are developed and implemented. Moving forward, we remain committed to ensuring that innovation in this field continues to be guided by a strong ethical focus on protecting the people these technologies are meant to support.
References
American Psychological Association. (2017). Ethical principles of psychologists and code of conduct (2002, amended effective June 1, 2010, and January 1, 2017). https://www.apa.org/ethics/code/
American Psychological Association. (2024). Guidelines for the practice of telepsychology. https://www.apa.org/practice/guidelines/telepsychology-revision.pdf
American Psychological Association. (2025). Clinical practice guideline for the treatment of posttraumatic stress disorder (PTSD) in adults. https://www.apa.org/ptsd-guideline/ptsd.pdf
EMDR France. (n.d.). Avertissement | Pratiques non validées EMDR. https://www.emdr-france.org/fr/article/alerte-et-avertissement
EMDR International Association. (2020). Guidelines for virtual delivery of EMDR therapy. https://www.emdria.org/wp-content/uploads/2020/04/virtual_tg_report_for_member.pdf
Fortune Business Insights. (2026). Mental health apps market snapshot & highlights. https://www.fortunebusinessinsights.com/mental-health-apps-market-109012
Koh, J., Tng, G. Y. Q., & Hartanto, A. (2022). Potential and pitfalls of mobile mental health apps in traditional treatment: An umbrella review. Journal of Personalized Medicine, 12(9), 1376. https://doi.org/10.3390/jpm12091376
Shapiro, F. (2018). Eye movement desensitization and reprocessing (EMDR) therapy: Basic principles, protocols, and procedures (3rd ed.). The Guilford Press.
World Health Organization. (2013). Guidelines for the management of conditions specifically related to stress. https://www.who.int/publications/i/item/9789241505406
World Health Organization. (2019). WHO guideline: Recommendations on digital interventions for health system strengthening. https://www.who.int/publications/i/item/9789241550505
World Health Organization. (2021). Global strategy on digital health 2020–2025. https://www.who.int/publications/i/item/9789240020924