
Can IT Innovation Change, and Improve, Mental Healthcare Delivery?


It may come as a surprise, but when it comes to evolving mental healthcare, clinicians and researchers may benefit from taking a thoughtful look at software development.

Here's why.

The human brain has often been called the most complex organ in the universe. Wouldn't it make sense, then, to believe the mind occupies a similar ballpark of complexity? Psychiatry, or mental healthcare, exists largely to treat problems of the mind. How, then, do we settle on educational best practices for this immeasurably complex and invaluable system? More specifically: how do we equip our front-line providers with the best treatment framework?

At the risk of oversimplification, a comparison with Information Technology (IT) software development models may be helpful. IT development has long been forced to respond to increasingly complex development environments. Two models within this history, with their contrasting styles, serve as useful analogies. They go by the names Waterfall and Agile, and their descriptions are strikingly parallel to issues that exist today in educating providers. To be clear, by mental healthcare I mean both the psychopharmacologic and psychotherapeutic aspects of care.

The Waterfall Model has a long legacy and is comparable to the process by which mental healthcare delivery is currently taught. In waterfall design, the IT software product is developed, more or less, in isolation from its target audience. It is then released, “explained to”, or “talked at” its consumer audience. The consumer must live with the finished product regardless of how well the product, as designed, fits his or her real-life needs. As complexity increases in IT environments, product scope and goals shift constantly among the competing needs of invested parties. It follows that the greater the isolation during product development, the greater the risk of an inferior product that does not match the customer's needs. In an effort to adapt, IT has developed other models. One alternative goes by the apt title Agile, and it is defined by a higher level of interaction with the target consumer across multiple short development cycles. By definition, it stays in closer communication and attunement with the consumer's needs. I would also suggest it fosters an attitude of mutual respect and communication.
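Since I am leaning on an IT analogy, here is a toy sketch in Python of the contrast just described. Everything in it is hypothetical and invented purely for illustration; it is not any real methodology framework, only the two shapes of process: one long isolated build versus many short cycles steered by consumer feedback.

```python
# Toy sketch of the two development styles described above. All names and data
# here are hypothetical, invented only to show the shape of each process.

def waterfall(requirements: list[str]) -> list[str]:
    """Single long pass: the product is built in isolation, then delivered as-is."""
    product = [f"feature built for: {r}" for r in requirements]  # long isolated build
    return product  # released and "talked at" the consumer; no further adjustment


def agile(requirements: list[str], get_feedback, cycles: int = 3) -> list[str]:
    """Many short cycles: build a slice, show it, and let feedback reshape the rest."""
    product: list[str] = []
    backlog = list(requirements)
    for _ in range(cycles):
        if backlog:
            product.append(f"feature built for: {backlog.pop(0)}")  # small increment
        backlog = get_feedback(product, backlog)  # consumer reshapes what comes next
    return product


# Example: the consumer keeps promoting whatever need feels most pressing right now.
def consumer_feedback(product, backlog):
    return sorted(backlog, key=len)  # stand-in for "tell us what matters most next"


print(waterfall(["mood tracking", "sleep diary", "crisis plan"]))
print(agile(["mood tracking", "sleep diary", "crisis plan"], consumer_feedback))
```

The point is not the code itself but the shape of the loop: in the second function, the consumer's feedback keeps redirecting what gets built next, which is exactly what the waterfall pass never allows.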

How does this difference in viewing the development process compare to our current model of education for mental healthcare delivery? Historically, the most common approach has entailed conducting “pure” or controlled studies and then assuming the positive findings can be distributed as a “whole package” that will be useful for all patients. This misunderstanding and inappropriate generalization of the scientific method has serious negative consequences. It leaves the well-intentioned front-line providers who follow these packages handcuffed, tapping only a fraction of their potential skill set. It also creates a sense of uncertainty and guilt for those who depart from these “treatment packages” and integrate interventions thoughtfully and responsibly. This educational process threatens to limit patient care, or frankly already has. The risk has been addressed within the realm of psychotherapy research in an excellent article by Drew Westen, PhD (The Empirical Status of Empirically Supported Psychotherapies: Assumptions, Findings, and Reporting in Controlled Clinical Trials, Psychological Bulletin, 2004).

At the heart of this problem lies the double edge of the Randomized Controlled Trial, or RCT, and the attempt to use it to define the “whole” care of a patient. By definition, RCTs are a pursuit of purity. When one begins the study design process for an RCT, a primary goal is validity: every variable that could affect the outcome must be controlled for in some way. For instance, consider the following question in a study whose primary focus is depression: “Could alcohol or drugs be the reason for changes in mood, rather than depression?” The answer is yes, and so anyone enrolling in the study with substance issues would be excluded. Many other confounding variables raise similar questions, so this process of excluding people from a study is typically repeated many times. What you are left with is a “pure” group that has little to do with the real world and the people whom I and other providers actually see in our offices.

Randomized controlled trials are the current go-to for educational resources, but in the interest of achieving statistical validity, researchers exclude variables the clinician confronts daily.
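To make that winnowing concrete, here is a minimal, hypothetical sketch of how stacked exclusion criteria shrink a study sample. The participant records and rules below are invented for illustration only; they are not drawn from any actual trial.

```python
# Hypothetical illustration: each exclusion rule answers one "could something
# else explain the mood change?" question, and each pass shrinks the sample.

participants = [
    {"id": 1, "substance_use": False, "bipolar": False, "medical_illness": False},
    {"id": 2, "substance_use": True,  "bipolar": False, "medical_illness": False},
    {"id": 3, "substance_use": False, "bipolar": True,  "medical_illness": False},
    {"id": 4, "substance_use": False, "bipolar": False, "medical_illness": True},
    {"id": 5, "substance_use": True,  "bipolar": False, "medical_illness": True},
]

exclusion_rules = [
    ("substance use",     lambda p: p["substance_use"]),
    ("bipolar diagnosis", lambda p: p["bipolar"]),
    ("medical illness",   lambda p: p["medical_illness"]),
]

sample = list(participants)
print(f"enrolled: {len(sample)}")
for label, rule in exclusion_rules:
    sample = [p for p in sample if not rule(p)]  # drop anyone the rule flags
    print(f"after excluding {label}: {len(sample)} remain")

# The surviving "pure" sample bears little resemblance to a real clinic population,
# where these overlapping problems are the norm rather than the exception.
```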

To illustrate this “purity” problem, I'll venture into a sports analogy: shot selection in basketball. The analogy shows the difference between a pure technique and the creative, overarching strategy needed within a game to use that technique. One might study a player's shooting from one specific spot on the court. Certain interventions in shot mechanics might be made, observed, and statistically studied. Perhaps the player really has improved in shooting from the studied spot, so those interventions are validated as effective. This conclusion, that identifying and working on sub-components of the player's shot leads to better shooting, is reasonable. Yet it is folly to believe that a game with ten players on the court could be won by having that player shoot from this particular spot over and over again.

Similar reasonable-sounding conclusions are routinely drawn in mental healthcare, e.g., “since this approach is validated by RCT studies, I can and should use it, in the same way, for all situations and patients.” Yet in the basketball analogy the same conclusion becomes comical, to the point where a “Saturday Night Live” skit comes to mind, with five defenders lining up to take their turn blocking the hapless player's shot! The player, like the provider, needs more than the proven “technique”; he or she needs a creative mind to integrate the complexity of the entire game and find the right moment to choose that specific shot.

In summary, the current waterfall process of creating RCT-based educational packages for delivery to front-line providers is not the best fit. The process has problems within psychopharmacology training and even more so within psychotherapy. The RCT works best when it tests questions of narrow scope and gives good answers to them. It is not well suited to taking modestly reliable disorder diagnoses, designing experiments around them, and then, once those experiments succeed in a purified research environment, generalizing the results and “talking at” your mental healthcare providers as though they were the singular solution to the modestly reliable diagnosis you have been given. Good care requires the innate abilities of a caring human mind touching your own and setting in motion a complex development process. That is the art and craft of good mental healthcare. If the current tools of research were allowed to operate in a framework more akin to the Agile model, the education of front-line providers would improve. Given a more flexible framework, providers would have greater confidence to integrate biological, psychological, and social interventions and create the “best shot” at finding a solution for you.
