[MEDSCAPE]: The Digital Doctor: Revolutionary Technologies Reshaping Healthcare
August 7, 2017 | News Express, Progress Exchange

The Digital Doctor: Revolutionary Technologies Reshaping Healthcare

John Watson | July 12, 2017

Image from Dreamstime

Medicine's Technological Leap Forward

Editing out disease from our genes. Using artificial intelligence to assist patient care. Treating psychiatric conditions with virtual reality. By-products of an unprecedented period of technological development, 21st-century medicine's most promising advances may resemble science fiction, but they are very much fact. This slideshow highlights some of the more interesting cutting-edge technologies, including those already in practice, others on the verge of a breakthrough, and some that may ultimately prove noble failures.


Smartphone-Aided Clinical Studies

For many, the smartphone is an appendage that never leaves the body. Researchers are taking advantage of our techno-dependence to obtain valuable data that in years past could have been collected only in a clinic.

In 2015, Apple announced the creation of ResearchKit®, an open-source platform that gives researchers access to data from mobile apps. Among those to quickly adopt this technology were three trials at large institutions, which separately measured children's reactions to videos with "emotion detection" algorithms to test for autism, gauged the efficacy of wearable sensors in predicting epileptic seizures, and charted mole growth over time to screen for melanoma. This past spring, ResearchKit was reported to have successfully aided in a mobile study of nearly 8000 patients with asthma.
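The mole-charting study above amounts to fitting a trend to periodic measurements over time. As an illustrative sketch only (this is not code from any of the studies mentioned, and the image-to-diameter measurement pipeline is assumed to exist), a growth rate could be estimated with an ordinary least-squares slope:

```python
def growth_rate(days, diameters_mm):
    """Estimate mole growth in mm/day via the ordinary least-squares
    slope of diameter against time, the kind of longitudinal trend a
    research app could chart from periodic user-submitted photos."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(diameters_mm) / n
    # Slope = covariance(x, y) / variance(x)
    numerator = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(days, diameters_mm))
    denominator = sum((x - mean_x) ** 2 for x in days)
    return numerator / denominator

# Hypothetical monthly measurements: a steadily enlarging mole
rate = growth_rate([0, 30, 60, 90], [4.0, 4.3, 4.6, 4.9])  # 0.01 mm/day
```

A real screening app would of course need validated image-based measurement and clinical thresholds; the point here is only that longitudinal phone data reduces to simple, quantifiable trends.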


A Doctor in the Pocket

Companies are also working to harness the computing power of the average smartphone to conduct intricate diagnostic testing. Thanks to a new smartphone-based app, men can now test their sperm's fertility with 98% accuracy, using just their phone's camera and an inexpensive microfluidic device that costs less than $5. Similar apps have been developed that assess micronutrients in patients' blood and identify infectious diseases such as schistosomiasis, and slightly more ambitious 3D-printed smartphone attachments designed to detect DNA mutations are planned for release by the end of this year. [1] Silicon Valley is setting its disruptive sights on such technology as well, with the boutique eyewear company Warby Parker recently announcing plans to cut the optometrist out of vision testing altogether, instead outsourcing this work to home-based software that would deliver a new pair of prescription glasses within 24 hours. [2]

On the other hand, smartphone overuse has recently been linked to everything from reduced sleep quality to so-called "text neck," leaving open the question of whether, on balance, our phones will ultimately heal or harm us.


Unlocking Big Data in the Brain

Currently, diagnosing and treating mental health disorders relies to a large extent on subjective, bias-prone patient reporting and psychiatric evaluations. According to a recent editorial in The Lancet, [3] however, digital technologies may revolutionize the field by providing a "longitudinal picture [that] can give a more refined, quantified view of a person's mood in their real-world environment." The authors cited smartphone-aided sensors that can track breathing patterns and skin response, analytics that measure telling aspects of the human voice (tone, inflection), and records of activity and communication trends as new means of diagnosing and monitoring a patient's emotional state. Even patients' social media profiles and use patterns provide quantifiable data, acting as a virtual diary of their symptoms.
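To make the idea of quantifying "activity and communication trends" concrete, here is a minimal, hypothetical sketch (not taken from the editorial, and far simpler than any clinical tool) that flags days whose outgoing-message count deviates sharply from a trailing baseline, the sort of behavioral-change signal digital phenotyping looks for:

```python
from statistics import mean, stdev

def communication_trend(daily_counts, window=7, threshold=2.0):
    """Flag day indices whose outgoing-message count deviates from
    the trailing `window`-day baseline by more than `threshold`
    standard deviations; a crude proxy for a behavioral change."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(daily_counts[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# A week of stable messaging, then a sudden drop-off on day 7
flagged_days = communication_trend([20, 22, 19, 21, 20, 23, 21, 4])
```

In practice, such signals would be combined across many sensors and validated against clinical outcomes; a single threshold rule is only a toy illustration of the underlying quantification.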


Digital Interventions

Because it can be accessed at one's fingertips from almost anywhere, it is hoped that digital technology can expand therapeutic interventions to underserved corners of the world. [3,4] And research is starting to build a persuasive case that digital interventions can have a tangible impact on mental health outcomes. A 2016 systematic review of Internet-based cognitive-behavioral therapy found that it was superior to control treatments at addressing depression in the short term. [5] Online therapy has similarly improved sleep outcomes in patients with insomnia, with effects lasting up to 1 year later. [6] And as our daily activities increasingly migrate online, there are signs that some may prefer this forum for discussing often painful emotional details, with evidence showing that patients may be more likely to respond honestly to virtual humans. [7]


Virtual Reality Produces Real-World Gains

An extension of this effect is also seen with virtual reality (VR), which, thanks to its increasing quality and decreasing cost, has emerged as a potential game changer in psychiatric care. A recent randomized analysis showed that by providing a safe, relatively low-stakes means for engaging in simulated social exposures, VR improved outcomes over individual cognitive-behavioral therapy at up to 6 months in patients with social anxiety disorder. [8] The technology has also shown promise in relieving the symptoms of phantom limb pain, [9] lessening weight regain in obese persons, [10] and improving driving skills in those with autism spectrum disorder. [11] And VR's applications extend to practitioners as well, providing heightened training simulations that may translate to refined surgical skills in such areas as interventional cardiology and orthopedics.


Increasingly Intelligent Diagnostics

Artificial intelligence (AI)—the term given to machines that simulate advanced cognitive skills, such as problem-solving—has long been promised to revolutionize healthcare. When it comes to medical diagnostics, that day may soon arrive. The journal Nature recently reported that so-called "deep learning algorithms" can identify skin cancer lesions with a level of competence similar to that of trained dermatologists, and this technology showed similarly remarkable specificity for detecting diabetic retinopathy in eye scans in a 2016 JAMA study. [12] Results such as these led to a recent editorial that asked—humorously, but not without justification—whether AI means the days of the professional pathologist are numbered. [13] Rest assured, the authors found this outcome highly unlikely, instead seeing this technology's role as more complementary than conflicting.
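Algorithms like these are benchmarked against clinicians chiefly by sensitivity (how many true cases they catch) and specificity (how many healthy cases they correctly clear), the two metrics highlighted in the retinopathy study. A minimal sketch of that calculation (illustrative only; not code from the studies cited):

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute (sensitivity, specificity) for binary screening labels,
    where 1 = disease present and 0 = disease absent."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    # Sensitivity: true positives / all actual positives
    # Specificity: true negatives / all actual negatives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening results for 8 scans (3 diseased, 5 healthy)
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0, 0, 0],
                                     [1, 1, 0, 0, 0, 0, 0, 1])
```

A diagnostic algorithm and a dermatologist can then be compared directly on the same held-out cases, which is essentially how the Nature and JAMA studies framed their results.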


The AI Gamble

Industry is betting heavily on AI's future in healthcare. Established tech titans, such as IBM, are developing software meant to aid radiologists in identifying abnormalities, [14] whereas upstarts such as the data mining company Kaggle recently awarded a $1 million prize to researchers designing AI that can detect lung cancer on CT scans. [15] Far more ambitious applications for AI exist as well, as exemplified by Silicon Valley mogul Elon Musk recently founding Neuralink, a company tasked with creating injectable brain-implant interfaces to correct for such conditions as amyotrophic lateral sclerosis. [16]

But bold visions for AI's future have run into the harsh present-day realities of its cost. Recently, it was announced that MD Anderson Cancer Center had shut down its collaboration with IBM's Watson AI system after expenditures grew to more than $60 million.


CRISPR: Out of the Lab, Into Patients

The race to turn CRISPR (clustered regularly interspaced short palindromic repeats) from a laboratory darling into a real-world solution for patients has been called the new Sputnik, [17] with Chinese and American researchers battling to be the first to start human testing. Further refinements to this gene-editing technology have now set the stage for that to become a reality.

In 2016, Chinese researchers became the first to inject a person with cells edited using CRISPR-Cas9 technology. [17] The patient, who has metastatic non-small cell lung cancer, received cells carrying an edited gene meant to increase his immune response to the cancer. As of this spring, upward of 20 human trials were planned for such conditions as human papillomavirus infection and sickle cell disease. [18] Yet considerable debate surrounds CRISPR, with many worrying about its misuse in creating so-called "designer babies" and other unethical applications.


Regenerating Heart Muscle

Research continues into the use of cell-based therapies to repair damaged heart muscle, despite some setbacks. Earlier this year, the phase 2 PreSERVE-AMI trial, the largest US clinical trial of bone marrow cells for patients with myocardial infarction, failed to result in a significant improvement in its primary endpoint (resting myocardial perfusion over 6 months) compared with a control arm. [19] A subsequent editorial in Nature Biotechnology questioned the rationale for continuing adult stem cell therapy in this indication at all. [20]

Although enthusiasm for bone marrow-based interventions has undoubtedly dampened, there is still considerable interest in other cell-based technologies, such as human embryonic stem cells (hESCs) and induced pluripotent stem cells, which researchers believe have vastly more potential. [21,22] And when promising clinical interventions emerge for cardiac disease—the leading cause of death in the United States [23]—money invariably follows. The regenerative-medicine startup BlueRock Therapeutics launched in 2016 with $225 million [24]—the latest evidence that the markets, at least, are not ready to move on from cell-based heart interventions.

Image from Ron Edmonds/AP

Embryonic Stem Cells’ Early Stumbles

It has been almost 20 years since the development of the first hESC line. From the beginning, hESC research was burdened by unrealistic expectations. By harnessing a resource that could proliferate endlessly and become any tissue cell type, it was thought by many that miraculous cures for paralysis, Parkinson disease, and other conditions would soon appear. When they didn't, frustration mounted.

In 2001, controversy surrounding the use of hESCs led President George W. Bush to restrict their federal funding in the United States. Although several states opted to fund research on their own, momentum was notably sapped. [25] The decision was reversed in 2009, by which point the field was not looking particularly viable and many researchers had moved beyond hESCs to adult somatic cells manipulated into having qualities more like those of their embryonic counterparts.

Image from Science Source

New Evidence a Game Changer?

In the early years of this decade, the hype around hESC began to feel more justified. In 2014, US researchers reported that patients with age-related macular degeneration who received hESCs experienced improved vision—the first time this treatment showed long-term safety and disease-specific biological activity. That same year, hESCs were used to regenerate heart muscle in monkeys [26] and were shown to be safe in five patients with spinal injuries. [27] Recent innovations have also paved the way for the production of clinical-grade engraftable hESCs with enough neurogenic potential to aid central nervous system repair, finally allowing increased testing in a wide range of neurologic disorders. [28] Chinese researchers recently announced that country's first clinical trial of hESCs, and the first global study of this treatment in Parkinson disease, in which patients will receive injections of 4 million immature neurons. [29]

With several such studies recruiting patients worldwide, we should soon know whether those viable treatments envisioned 20 years ago are closer to becoming a reality.


Babies From Skin Cells

The controversy surrounding embryonic stem cell use will probably seem downright quaint in comparison to the uproar awaiting in vitro gametogenesis (IVG) if it proves successful. IVG is the process in which adult somatic (or body) cells can be used to create gametes for fertilization. [30] In practice, this means that something as simple as a skin cell from one person can be used to create sperm and eggs in the laboratory, and eventually a baby. IVG was successfully applied in mice, [31] and some speculate the technology could be available for humans in anywhere from 5 to 25 years. [32]

IVG could provide relief for people struggling through the battery of treatments required for in vitro fertilization, and allow same-sex couples to directly produce their own offspring. However, other applications of IVG have gained more attention. The technology would open up the possibility of "multiplex" parenting, in which several individuals are involved in genetic parenting, [33] and there are also concerns regarding the further "commodification" of human reproduction. [34] As a technology that would radically reshape our notions of reproduction, with resultant implications for everything from religion to politics, you can expect to hear much more about IVG in the years to come.


A New Understanding of Fetal Immune Systems

In other baby-related breakthroughs, this spring, researchers publishing in Nature [35] reported that fetuses, contrary to earlier thinking, have immature immune systems with unique qualities not seen in newborns. Their results showed that dendritic cells crucial for immunity and tolerance function differently in fetuses, making them less likely to attack foreign cells—an important feature when you're trying to coexist in someone else's body. Although these findings are preliminary, there is hope that the discovery will improve our ability to treat fetuses in utero, address some underlying causes of miscarriage, and potentially enhance the chances of successful organ transplantation by mimicking fetal immune system conditions. [36]

This study reminds us that even in an era of rapid innovation, the latest gadgets are still no match for the ultimate cutting-edge technology: the human body itself.

