Book

mHealth-Anwendungen für chronisch Kranke

Abstract

Digitalization in the healthcare market is changing and expanding the possibilities for helping people with chronic diseases, which is why it is gaining ever greater importance for diagnostics and care. This edited volume provides an overview of current and innovative mHealth solutions that are used as daily companions in prevention, diagnostics and therapy. Practical examples illustrate developments in, and the uptake of, mHealth applications by patients and clinical staff, and numerous questions relevant to research are addressed. Physicians, patients, developers and other practitioners who use mobile health applications and contribute to their further development, as well as researchers and lecturers specializing in health management, information technology and medicine, will gain valuable insights into this topical industry subject.

Contents: • mHealth systems in medicine • Opportunities of the digitalization of healthcare • Digital services, changing business models and target groups • Usability, acceptance and user experience of mHealth applications • Improving holistic health through mHealth and coaching • Quality assessment, data protection and information security of mHealth applications

The editors: Prof. Dr. Mario A. Pfannstiel is Professor of Business Administration in Healthcare, in particular innovative services, and a member of the Institut für Vernetzte Gesundheit at the Hochschule Neu-Ulm. Felix Holl, doctoral candidate, is a research associate for digitalization in healthcare in the Faculty of Health Management at the Hochschule Neu-Ulm. Prof. Dr. Walter Swoboda is a research professor in the Faculty of Health Management and head of the DigiHealth working group at the Hochschule Neu-Ulm (HNU).
Article
ABSTRACT Background: Digital applications continuously deliver health information that can support both patients and healthcare providers. Digital health applications (Digitale Gesundheitsanwendungen, DiGA) are intended to harness this potential in Germany. Objective: The aim of this work is to present the DiGA available in the fields of neurology and psychiatry and to contrast them with the potential of digital applications. Material and methods: Information on available DiGA from the application directory of the Bundesinstitut für Arzneimittel und Medizinprodukte (BfArM, the Federal Institute for Drugs and Medical Devices) is presented and critically discussed. Results: DiGA in neurology and psychiatry provide access to psychotherapeutic measures and cognitive behavioural therapy. Wearable sensors and artificial intelligence are hardly integrated. Common neurological indications are not covered, and healthcare providers are insufficiently involved. Discussion: DiGA are a first step towards digitally supported neurology and psychiatry. Further efforts are required to meet the need for innovative digital solutions and to ensure their efficient integration into care.
Book
Full-text available
-> There is a broad variety of health apps, which is constantly being expanded. The offer ranges from fitness and nutrition apps to apps providing support in case of illness. -> With constantly new functionalities, these apps open up innovation potential by accompanying people's everyday health behaviour in an individually adapted way. -> Numerous health apps are developed for people with chronic diseases to support them in coping independently and actively with problems of everyday life and in making competent decisions. -> However, despite the large number of health apps, there is only little evidence regarding their benefits as well as their health-promoting and preventive effects. -> The market for health apps is only lightly regulated. Moreover, there is a lack of reliable quality control, particularly with regard to data protection.
Article
Full-text available
Background: Mobile health has provided new and exciting ways for patients to partake in their healthcare. Wearable devices are designed to collect the user's health data, which can be analysed to provide information about the user's health status. However, little research has been conducted that addresses privacy and information security issues of these devices. Objective: To investigate the privacy and information security issues to which users are exposed when using wearable health devices. Method: The study used a cross-sectional survey approach to collect data from a convenience sample of 106 respondents. Results: Half of the respondents did not understand the need to protect health information. There also appeared to be a general lack of awareness among respondents about the information security issues surrounding their data collected by wearable devices. Conclusion: Users were not knowledgeable about the privacy risks that their data are exposed to or how these data are protected once collected. Implications: Users of wearable devices that collect personal information about health need to be educated about privacy and information security issues to which they are exposed when using these devices.
Article
Full-text available
Background Recent studies have highlighted that people diagnosed with head and neck cancer (HNC) have complex information needs. They are subject to multiple clinical appointments with numerous healthcare professionals in preparation for their treatment. Speech and language therapists (SLTs) are core members of the HNC multidisciplinary team, providing assessment, prehabilitation and counselling regarding potential treatment effects on the critical functions, including swallowing and communication. We believed the purpose of the pre-treatment speech-language therapy (SLT) consultation within this pathway was not well understood by patients. Whilst the benefits of prophylactic swallowing exercise prescriptions continue to be explored, adherence is a frequently cited challenge in clinical trials. We sought to enhance pre-treatment dysphagia services for patients with head and neck cancer (HNC) undergoing chemoradiation. Methods A participatory action research approach called experience-based co-design (EBCD) was undertaken at a tertiary cancer hospital in the UK. People who had previous radical radiation treatment for head and neck cancer and staff members within the head and neck unit were recruited to take part in in-depth, one-to-one interviews about their experiences of the pre-treatment SLT head and neck radiation clinic. Patient interviews were video-recorded, analysed and edited down to a 30 min ‘trigger’ film. At a subsequent patient feedback event, the film was shown and an ‘emotional mapping’ exercise was undertaken. Through facilitated discussion, patient priorities for change were agreed and recorded. At a staff feedback event, key themes from the staff interviews were discussed and priority areas for change identified. The project culminated in a joint patient and staff event where the film was viewed, experiences shared and joint priorities for change agreed. Task and finish groups were developed to implement these changes. Results Seven patients and seven staff members participated. All seven patients had undergone radical (chemo-) radiation for HNC. At least 2 months had elapsed since their final treatment date and all participants were within 9 months of their definitive treatment. Staff members comprised a radiation oncologist, two clinical nurse specialists, two head and neck dietitians and two speech-language therapists. Patients reported that overall, their experience of the pre-treatment clinic was positive. Patients valued experienced staff, consistency of staff and the messages they provided, and a team approach. Patients highlighted the need for different information methods, including online/digital information resources, and further information regarding the longer-term effects of treatment. Patients valued the purpose of prophylactic exercises and again advocated for supporting resources to be available in a range of online/digital media. Staff members raised the need for flexibility in appointment times and clearer messaging as to the rationale for a pre-treatment SLT appointment, including a rebranding of the SLT service. Seven key areas for improvement were identified jointly by patients and staff members, including revision of patient and carer information, development of a patient experience video, information on timelines for recovery, a buddy system for patients before, during and after treatment, flexibility of appointment scheduling, seamless transfer of care between settings and SLT department rebranding.
Joint patient and staff task and finish groups were initiated to work on these seven priority areas. Conclusions We have worked in partnership with patients to co-design pre-treatment dysphagia services which are accessible and meet the individuals’ needs. Task and finish groups are ongoing with staff and patients are working together to address priority areas for change. This work provides a good example for other centres who may wish to engage in similar activities.
Article
Full-text available
Disposable sensors are low-cost and easy-to-use sensing devices intended for short-term or rapid single-point measurements. The growing demand for fast, accessible, and reliable information in a vastly connected world makes disposable sensors increasingly important. The areas of application for such devices are numerous, ranging from pharmaceutical, agricultural, environmental, forensic, and food sciences to wearables and clinical diagnostics, especially in resource-limited settings. The capabilities of disposable sensors can extend beyond measuring traditional physical quantities (for example, temperature or pressure); they can provide critical chemical and biological information (chemo- and biosensors) that can be digitized and made available to users and centralized/decentralized facilities for data storage, remotely. These features could pave the way for new classes of low-cost systems for health, food, and environmental monitoring that can democratize sensing across the globe. Here, a brief insight into the materials and basics of sensors (methods of transduction, molecular recognition, and amplification) is provided followed by a comprehensive and critical overview of the disposable sensors currently used for medical diagnostics, food, and environmental analysis. Finally, views on how the field of disposable sensing devices will continue its evolution are discussed, including the future trends, challenges, and opportunities.
Article
Full-text available
Like virtually all age-related chronic diseases, late-onset Alzheimer's disease (AD) develops over an extended preclinical period and is associated with modifiable lifestyle and environmental factors. We hypothesize that multimodal interventions that address many risk factors simultaneously and are individually tailored to patients may help reduce AD risk. We describe a novel clinical methodology used to evaluate and treat patients at two Alzheimer's Prevention Clinics. The framework applies evidence-based principles of clinical precision medicine to tailor individualized recommendations, follow patients longitudinally to continually refine the interventions, and evaluate N-of-1 effectiveness (trial registered at ClinicalTrials.gov NCT03687710). Prior preliminary results suggest that the clinical practice of AD risk reduction is feasible, with measurable improvements in cognition and biomarkers of AD risk. We propose using these early findings as a foundation to evaluate the comparative effectiveness of personalized risk management within an international network of clinician researchers in a cohort study possibly leading to a randomized controlled trial.
Article
Full-text available
Current trends in health care delivery and management such as predictive and personalized health care incorporating information and communication technologies, home-based care, health prevention and promotion through patients’ empowerment, care coordination, community health networks and governance represent exciting possibilities to dramatically improve health care. However, as a whole, current health care trends involve a fragmented and scattered array of practices and uncoordinated pilot projects. The present paper describes an innovative and integrated model incorporating and “assembling” best practices and projects of new innovations into an overarching health care system that can effectively address the multidimensional health care challenges related to aging patients, especially those with chronic health issues. The main goal of the proposed model is to address the emerging health care challenges of an aging population and stimulate improved cost-efficiency, effectiveness, and patients’ well-being. The proposed home-based and community-centered Integrated Healthcare Management System may facilitate reaching the persons in their natural context, improving early detection, and preventing illnesses. The system allows simplifying the health care institutional structures through interorganizational coordination, increasing inclusiveness and extensiveness of health care delivery. As a consequence of such coordination and integration, future merging efforts of current health care approaches may provide feasible solutions that result in improved cost-efficiency of health care services and simultaneously increase the quality of life, in particular, by switching the center of gravity of health delivery to a close relationship of individuals in their communities, making best use of their personal and social resources, especially effective in health delivery for aging persons with complex chronic illnesses.
Article
Full-text available
Background: Alzheimer’s disease has become a very significant problem in the current era that is very likely to worsen due to increasing life expectancy. The enormous potential of smart mobile devices has attracted researchers and medical practitioners alike to develop and provide M-health solutions for Alzheimer’s. In this paper, we explore, analyze and investigate the works that have attempted to develop M-health applications for Alzheimer’s over the last 10 years in order to highlight current and future directions, which it is hoped will assist medical practitioners and researchers to provide more solutions for fighting Alzheimer’s. Methods: A systematic review approach was used in this study. Google Scholar is considered for searching the published papers for academic mobile applications and Google PlayStore is considered for non-academic mobile applications. More than 900 titles of various papers were reviewed. The selection of the papers was dictated by a set of inclusion and exclusion criteria. Results: Current directions are identified by presenting the services and functions that are provided by M-health applications for Alzheimer’s patients and the technologies that are used to do so. Current services are classified into four groups and the future directions for each of these groups are presented. Conclusions: Numerous M-health applications that fight against Alzheimer’s have been surveyed and analyzed by using scientific methodology in which inclusion and exclusion criteria have been used. Based on this analysis, we were able to present the current research foci and define potential future research directions.
Article
Full-text available
Automatic computer-based seizure detection and warning devices are important for objective seizure documentation, for SUDEP prevention, to avoid seizure related injuries and social embarrassments as a consequence of seizures, and to develop on demand epilepsy therapies. Automatic seizure detection systems can be based on direct analysis of epileptiform discharges on scalp-EEG or intracranial EEG, on the detection of motor manifestations of epileptic seizures using surface electromyography (sEMG), accelerometry (ACM), video detection systems and mattress sensors and finally on the assessment of changes of physiologic parameters accompanying epileptic seizures measured by electrocardiography (ECG), respiratory monitors, pulse oximetry, surface temperature sensors, and electrodermal activity. Here we review automatic seizure detection based on scalp-EEG, ECG, and sEMG. Different seizure types affect preferentially different measurement parameters. While EEG changes accompany all types of seizures, sEMG and ACM are suitable mainly for detection of seizures with major motor manifestations. Therefore, seizure detection can be optimized by multimodal systems combining several measurement parameters. While most systems provide sensitivities over 70%, specificity expressed as false alarm rates still needs to be improved. Patients' acceptance and comfort of a specific device are of critical importance for its long-term application in a meaningful clinical way.
Article
Full-text available
Wearable Health Devices (WHDs) are increasingly helping people to better monitor their health status both at an activity/fitness level for self-health tracking and at a medical level providing more data to clinicians with a potential for earlier diagnostic and guidance of treatment. The technology revolution in the miniaturization of electronic devices is enabling the design of more reliable and adaptable wearables, contributing to a world-wide change in the health monitoring approach. In this paper we review important aspects in the WHDs area, listing the state-of-the-art of wearable vital signs sensing technologies plus their system architectures and specifications. A focus on vital signs acquired by WHDs is made: first a discussion about the most important vital signs for health assessment using WHDs is presented and then for each vital sign a description is made concerning its origin and effect on health, monitoring needs, acquisition methods and WHDs, and recent scientific developments in the area (electrocardiogram, heart rate, blood pressure, respiration rate, blood oxygen saturation, blood glucose, skin perspiration, capnography, body temperature, motion evaluation, cardiac implantable devices and ambient parameters). A general WHDs system architecture is presented based on the state-of-the-art. After a global review of WHDs, we zoom in on cardiovascular WHDs, analysing commercial devices and their applicability versus quality, extending this subject to smart t-shirts for medical purposes. Furthermore, we present a summarized evolution of these devices based on the prototypes developed over the years. Finally, we discuss likely market trends and future challenges for the emerging WHDs area.
Article
Full-text available
Introduction: Neoplasms are the second most common diseases in western countries. Many patients with malignant diseases repeatedly present themselves in the emergency department (ED). Due to limited capacities, appropriate risk stratification strategies for cancer patients have to be developed. This study assesses if deceleration capacity (DC) of heart rate as a parameter of heart rate variability predicts mortality in emergency patients with malignant diseases. Methods: Prospectively, 140 adults with different entities of malignant diseases who presented in the medical ED were included. Primary and secondary endpoints were intrahospital mortality and mortality within 180 days, respectively. We calculated DC from short-term ECG readings of the surveillance monitors. Additionally, the Modified Early Warning Score (MEWS) and laboratory parameters such as white blood cells (WBC), lactate dehydrogenase, serum hemoglobin, and serum creatinine were determined. Results: The median age of the patients was 65 ± 14 years. 19.3% of the patients died within the hospital stay and 57.9% died within 180 days. DC and WBC were independent predictors of intrahospital death reaching a hazard ratio (HR) of 0.79 (95% confidence interval (CI) 0.63-0.993, p = 0.043) and of 1.00 (95% CI 1.00-1.00, p = 0.003), respectively. DC and serum creatinine independently predicted death within 180 days (HR 0.90, 95% CI 0.82-0.98, p = 0.023 and HR 1.41, 95% CI 1.05-1.90, p = 0.018, respectively). Conclusion: Deceleration capacity of heart rate is suitable for rapid risk assessment of emergency patients with malignant diseases.
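The abstract does not spell out how DC is derived from the ECG; the standard approach in the literature is phase-rectified signal averaging (PRSA) of the RR-interval series (Bauer et al., 2006). A minimal Python sketch of that computation, assuming clean RR intervals and the conventional anchor criterion, might look like this:

```python
import numpy as np

def deceleration_capacity(rr_ms, window=2):
    """Deceleration capacity (DC) of heart rate via phase-rectified
    signal averaging (PRSA), following Bauer et al. (2006).

    rr_ms : sequence of RR intervals in milliseconds.
    Returns DC in ms, or NaN if no anchor points are found.

    Illustrative sketch only; the study's exact preprocessing
    (artifact filtering, anchor criteria) is not given in the abstract.
    """
    rr = np.asarray(rr_ms, dtype=float)
    segments = []
    for i in range(window, len(rr) - window):
        # Anchor: beat where the RR interval lengthens (deceleration),
        # conventionally limited to < 5 % prolongation to reject artifacts.
        if rr[i] > rr[i - 1] and rr[i] <= 1.05 * rr[i - 1]:
            segments.append(rr[i - window:i + window])
    if not segments:
        return float("nan")
    x = np.mean(segments, axis=0)  # PRSA curve around the anchors
    # DC = (X(0) + X(1) - X(-1) - X(-2)) / 4, indices relative to the anchor
    return (x[window] + x[window + 1] - x[window - 1] - x[window - 2]) / 4.0
```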
Article
Full-text available
In 2011, the National Institute on Aging and Alzheimer's Association created separate diagnostic recommendations for the preclinical, mild cognitive impairment, and dementia stages of Alzheimer's disease. Scientific progress in the interim led to an initiative by the National Institute on Aging and Alzheimer's Association to update and unify the 2011 guidelines. This unifying update is labeled a “research framework” because its intended use is for observational and interventional research, not routine clinical care. In the National Institute on Aging and Alzheimer's Association Research Framework, Alzheimer's disease (AD) is defined by its underlying pathologic processes that can be documented by postmortem examination or in vivo by biomarkers. The diagnosis is not based on the clinical consequences of the disease (i.e., symptoms/signs) in this research framework, which shifts the definition of AD in living people from a syndromal to a biological construct. The research framework focuses on the diagnosis of AD with biomarkers in living persons. Biomarkers are grouped into those of β amyloid deposition, pathologic tau, and neurodegeneration [AT(N)]. This ATN classification system groups different biomarkers (imaging and biofluids) by the pathologic process each measures. The AT(N) system is flexible in that new biomarkers can be added to the three existing AT(N) groups, and new biomarker groups beyond AT(N) can be added when they become available. We focus on AD as a continuum, and cognitive staging may be accomplished using continuous measures. However, we also outline two different categorical cognitive schemes for staging the severity of cognitive impairment: a scheme using three traditional syndromal categories and a six-stage numeric scheme. It is important to stress that this framework seeks to create a common language with which investigators can generate and test hypotheses about the interactions among different pathologic processes (denoted by biomarkers) and cognitive symptoms. We appreciate the concern that this biomarker-based research framework has the potential to be misused. Therefore, we emphasize, first, it is premature and inappropriate to use this research framework in general medical practice. Second, this research framework should not be used to restrict alternative approaches to hypothesis testing that do not use biomarkers. There will be situations where biomarkers are not available or requiring them would be counterproductive to the specific research goals (discussed in more detail later in the document). Thus, biomarker-based research should not be considered a template for all research into age-related cognitive impairment and dementia; rather, it should be applied when it is fit for the purpose of the specific research goals of a study. Importantly, this framework should be examined in diverse populations. Although it is possible that β-amyloid plaques and neurofibrillary tau deposits are not causal in AD pathogenesis, it is these abnormal protein deposits that define AD as a unique neurodegenerative disease among different disorders that can lead to dementia. We envision that defining AD as a biological construct will enable a more accurate characterization and understanding of the sequence of events that lead to cognitive impairment that is associated with AD, as well as the multifactorial etiology of dementia. 
This approach also will enable a more precise approach to interventional trials where specific pathways can be targeted in the disease process and in the appropriate people.
Article
Full-text available
Background: Randomized clinical trials (RCTs) conducted among heart failure (HF) patients have reported that mobile technologies can improve HF-related outcomes. Our aim was to conduct a meta-analysis to evaluate m-Health's impact on healthcare services utilization, mortality, and cost. Methods: We searched MEDLINE, Cochrane, CINAHL, and EMBASE for studies published between 1966 and May-2017. We included studies that compared the use of m-Health in HF patients to usual care. m-Health is defined as the use of mobile computing and communication technologies to record and transmit data. The outcomes were HF-related and all-cause hospital days, cost, admissions, and mortality. Results: Our search strategy resulted in 1,494 articles. We included 10 RCTs and 1 quasi-experimental study, which represented 3,109 patients in North America and Europe. Patient average age range was 53-80 years, New York Heart Association (NYHA) class III, and Left Ventricular Ejection Fraction <50%. Patients were mostly monitored daily and followed for an average of 6 months. A reduction was seen in HF-related hospital days. Nonsignificant reductions were seen in HF-related cost, admissions, and mortality and total mortality. We found no significant differences for all-cause hospital days and admissions, and an increase in total cost. Conclusions: m-Health reduced HF-related hospital days, showed reduction trends in total mortality and HF-related admissions, mortality and cost, and increased total costs related to more clinic visits and implementation of new technologies. More studies reporting consistent quality outcomes are warranted to give conclusive information about the effectiveness and cost-effectiveness of m-Health interventions for HF.
Article
Full-text available
To identify digital biomarkers associated with cognitive function, we analyzed human–computer interaction from 7 days of smartphone use in 27 subjects (ages 18–34) who received a gold standard neuropsychological assessment. For several neuropsychological constructs (working memory, memory, executive function, language, and intelligence), we found a family of digital biomarkers that predicted test scores with high correlations (p < 10⁻⁴). These preliminary results suggest that passive measures from smartphone use could be a continuous ecological surrogate for laboratory-based neuropsychological assessment.
Article
Full-text available
We present a novel and practical deep fully convolutional neural network architecture for semantic pixel-wise segmentation termed SegNet. This core trainable segmentation engine consists of an encoder network and a corresponding decoder network, followed by a pixel-wise classification layer. The architecture of the encoder network is topologically identical to the 13 convolutional layers in the VGG16 network [1]. The role of the decoder network is to map the low resolution encoder feature maps to full input resolution feature maps for pixel-wise classification. The novelty of SegNet lies in the manner in which the decoder upsamples its lower resolution input feature map(s). Specifically, the decoder uses pooling indices computed in the max-pooling step of the corresponding encoder to perform non-linear upsampling. This eliminates the need for learning to upsample. The upsampled maps are sparse and are then convolved with trainable filters to produce dense feature maps. We compare our proposed architecture with the widely adopted FCN [2] and also with the well known DeepLab-LargeFOV [3] and DeconvNet [4] architectures. This comparison reveals the memory versus accuracy trade-off involved in achieving good segmentation performance. SegNet was primarily motivated by scene understanding applications. Hence, it is designed to be efficient both in terms of memory and computational time during inference. It is also significantly smaller in the number of trainable parameters than other competing architectures and can be trained end-to-end using stochastic gradient descent. We also performed a controlled benchmark of SegNet and other architectures on both road scenes and SUN RGB-D indoor scene segmentation tasks. These quantitative assessments show that SegNet provides good performance with competitive inference time and most efficient inference memory-wise as compared to other architectures. We also provide a Caffe implementation of SegNet and a web demo at http://mi.eng.cam.ac.uk/projects/segnet/.
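The decoder's index-based unpooling is easy to see in code. The following toy two-stage model (a hypothetical MiniSegNet, written in PyTorch rather than the paper's Caffe, and far shallower than the VGG16-based original) is illustrative only; it shows the core mechanism: each encoder stage saves its max-pooling indices, and the matching decoder stage uses them for parameter-free upsampling before convolutions densify the result.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniSegNet(nn.Module):
    """Toy encoder-decoder illustrating SegNet's key idea: the decoder
    upsamples with the max-pooling indices saved by the corresponding
    encoder stage, so no upsampling weights are learned."""

    def __init__(self, in_ch=3, num_classes=11):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 64, 3, padding=1),
                                  nn.BatchNorm2d(64), nn.ReLU(inplace=True))
        self.enc2 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1),
                                  nn.BatchNorm2d(128), nn.ReLU(inplace=True))
        self.dec2 = nn.Sequential(nn.Conv2d(128, 64, 3, padding=1),
                                  nn.BatchNorm2d(64), nn.ReLU(inplace=True))
        self.dec1 = nn.Conv2d(64, num_classes, 3, padding=1)  # pixel-wise classifier
        self.pool = nn.MaxPool2d(2, stride=2, return_indices=True)

    def forward(self, x):
        x = self.enc1(x)
        x, idx1 = self.pool(x)          # remember where the maxima were
        x = self.enc2(x)
        x, idx2 = self.pool(x)
        # Sparse, non-linear upsampling via the stored pooling indices,
        # then convolutions produce dense feature maps again.
        x = F.max_unpool2d(x, idx2, kernel_size=2, stride=2)
        x = self.dec2(x)
        x = F.max_unpool2d(x, idx1, kernel_size=2, stride=2)
        return self.dec1(x)             # per-pixel class scores

logits = MiniSegNet()(torch.randn(1, 3, 64, 64))  # -> shape (1, 11, 64, 64)
```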
Article
Full-text available
The concept of epilepsy syndromes, introduced in 1989, was defined as “clusters of signs and symptoms customarily occurring together”. Definition of epilepsy syndromes based on electro‐clinical features facilitated clinical practice and, whenever possible, clinical research in homogeneous groups of patients with epilepsies. Progress in the fields of neuroimaging and genetics made it rapidly clear that, although crucial, the electro‐clinical description of epilepsy syndromes was not sufficient to allow much needed development of targeted therapies and a better understanding of the underlying pathophysiological mechanisms of seizures. The 2017 ILAE position paper on Classification of the Epilepsies recognized that “as a critical tool for the practicing clinician, epilepsy classification must be relevant and dynamic to changes in thinking”. The concept of “epilepsy syndromes” evolved, incorporating issues related to aetiologies and comorbidities. A comprehensive update (and revision where necessary) of the EEG diagnostic criteria in the light of the 2017 revised terminology and concepts was deemed necessary. Part 2 covers the neonatal and paediatric syndromes in accordance with the age of onset. [Published with educational EEG plates at www.epilepticdisorders.com].
Article
Full-text available
Background: Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. Objective: The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients' acceptance of smartphone health technology for chronic disease management. Methods: Multiple theories and factors that may influence patients' acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients' acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients' actual use of a smartphone health app. The partial least square method was used to test the theoretical model. Results: The model accounted for .412 of the variance in patients' intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients' smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients' intentions to use the technology. Age and gender had no significant influence on patients' acceptance of smartphone technology. The study also confirmed the positive relationship between intention to use and actual use of smartphone health apps for chronic disease management. Conclusions: This study developed a theoretical model to predict patients' acceptance of smartphone health technology for chronic disease management. Although resistance to change is a significant barrier to technology acceptance, careful management of doctor-patient relationship, and raising patients' awareness of the negative effect of chronic disease can negate the effect of resistance and encourage acceptance and use of smartphone health technology to support chronic disease management for patients in the community.
Article
Full-text available
Globally, in 2016, one out of eleven adults suffered from Diabetes Mellitus. Diabetic Foot Ulcers (DFU) are a major complication of this disease, which if not managed properly can lead to amputation. Current clinical approaches to DFU treatment rely on patient and clinician vigilance, which has significant limitations such as the high cost involved in the diagnosis, treatment and lengthy care of the DFU. We collected an extensive dataset of foot images, which contain DFU from different patients. In this paper, we have proposed the use of traditional computer vision features for detecting foot ulcers among diabetic patients, which represent a cost-effective, remote and convenient healthcare solution. Furthermore, we used Convolutional Neural Networks (CNNs) for the first time in DFU classification. We have proposed a novel convolutional neural network architecture, DFUNet, with better feature extraction to identify the feature differences between healthy skin and the DFU. Using 10-fold cross-validation, DFUNet achieved an AUC score of 0.962. This outperformed both the machine learning and deep learning classifiers we have tested. Here we present the development of a novel and highly sensitive DFUNet for objectively detecting the presence of DFUs. This novel approach has the potential to deliver a paradigm shift in diabetic foot care.
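The abstract does not detail DFUNet's layers, so the sketch below is only a generic stand-in for the pipeline the paper describes: a convolutional classifier producing a per-patch ulcer score that is then evaluated with AUC (the paper reports 0.962 for DFUNet). The model name SkinPatchCNN and all inputs are hypothetical placeholders, not the published architecture or data.

```python
import torch
import torch.nn as nn
from sklearn.metrics import roc_auc_score

class SkinPatchCNN(nn.Module):
    """Minimal CNN for healthy-skin vs. ulcer patch classification.
    Illustrative only -- not the DFUNet architecture from the paper."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),      # global pooling to one vector
        )
        self.classifier = nn.Linear(32, 1)  # single logit: evidence of DFU

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1)).squeeze(1)

# AUC-style evaluation, as in the paper; stand-in patches and labels only.
model = SkinPatchCNN().eval()
with torch.no_grad():
    patches = torch.randn(8, 3, 64, 64)        # placeholder image patches
    scores = torch.sigmoid(model(patches)).numpy()
labels = [0, 1, 0, 1, 1, 0, 0, 1]              # placeholder ground truth
print("AUC:", roc_auc_score(labels, scores))
```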
Conference Paper
Full-text available
Apart from technical reliability, usability is one of the major criteria for safe and efficient usage of interactive information technology in disaster and emergency management. However, in this setting, usability evaluation is difficult due to the heterogeneity and unpredictability of operation conditions, as well as the difficult, usually mobile, context. Nevertheless, there are ways to conduct usability evaluations in disaster and emergency settings. Thus, in this paper, advantages and disadvantages of empirical and analytical usability evaluation methods for interactive systems in disaster and emergency management are discussed. The importance of formative evaluation measures within an iterative human-centered design process is emphasized. It is illustrated by two case studies dealing with paramedics’ and emergency physicians’ usage of mobile and wearable devices in mass casualty incidents.
Article
Full-text available
Background Multimorbidity is one of the most important and challenging aspects in public health. Multimorbid people are associated with more hospital admissions, a large number of drug prescriptions and higher risks of mortality. As there is evidence that multimorbidity varies with age and socioeconomic disparity, the main objective was to determine age-specific prevalence rates as well as to explore educational differences relating to multimorbidity in Germany. Methods This cross-sectional analysis is based on the national telephone health interview survey “German Health Update” (GEDA2012) conducted between March 2012 and March 2013 with nearly 20,000 adults. GEDA2012 provides information on 17 self-reported health conditions along with sociodemographic characteristics. Multimorbidity was defined as the occurrence of two or more chronic conditions in one individual at the same time. Descriptive statistical analysis was used to examine multimorbidity according to age and education, which was defined by the International Standard Classification of Education (ISCED 1997). Results Overall, 39.6% (95% confidence interval (CI) 38.7%–40.6%) of the 19,294 participants were multimorbid and the proportion of adults with multimorbidity increased substantially with age: nearly half (49.2%, 95% CI 46.9%–51.5%) of the adults aged 50–59 years already had two or more chronic health conditions. Prevalence rates of multimorbidity differed considerably between the levels of education. Low-level educated adults aged 40–49 years were more likely to be multimorbid, with a prevalence rate of 47.4% (95% CI 44.2%–50.5%) matching those of highly educated men and women aged about ten years older. Conclusions Our findings demonstrate that both age and education are associated with a higher risk of being multimorbid in Germany. Hence, special emphasis in the development of new approaches in national public health and prevention programs on multimorbidity should be given to low-level educated people aged <65 years.
Article
Full-text available
In today’s aging society, more people are living with lifestyle-related noncommunicable diseases (NCDs) such as cardiovascular disease, type 2 diabetes, obesity, and cancer. Numerous opinion-leader organizations recommend lifestyle medicine as the first-line approach in NCD prevention and treatment. However, there is a strong need for a personalized approach as “one-size-fits-all” public health recommendations have been insufficient in addressing the interindividual differences in the diverse populations. Advancement in systems biology and the “omics” technologies has allowed comprehensive analysis of how complex biological systems are impacted upon external perturbations (e.g., nutrition and exercise), and therefore is gradually pushing personalized lifestyle medicine toward reality. Clinicians and healthcare practitioners have a unique opportunity in advocating lifestyle medicine because patients see them as a reliable source of advice. However, there are still numerous technical and logistic challenges to overcome before personal “big data” can be translated into actionable and clinically relevant solutions. Clinicians are also facing various issues prior to bringing personalized lifestyle medicine to their practice. Nevertheless, emerging ground-breaking research projects have given us a glimpse of how systems thinking and computational methods may lead to personalized health advice. It is important that all stakeholders work together to create the needed paradigm shift in healthcare before the rising epidemic of NCDs overwhelm the society, the economy, and the dated health system.
Article
Full-text available
Current measures of neurodegenerative diseases are highly subjective and based on episodic visits. Consequently, drug development decisions rely on sparse, subjective data, which have led to the conduct of large-scale phase 3 trials of drugs that are likely not effective. Such failures are costly, deter future investment, and hinder the development of treatments. Given the lack of reliable physiological biomarkers, digital biomarkers may help to address current shortcomings. Objective, high-frequency data can guide critical decision-making in therapeutic development and allow for a more efficient evaluation of therapies of increasingly common disorders.
Conference Paper
Introduction: In the context of routine outpatient care for head and neck cancer patients, communication between patients and treating physicians is a central challenge. In the regular follow-up consultations, usually held quarterly, the patient's state of health must be correctly recorded and assessed within a few minutes so that suitable therapeutic measures can be initiated. For a wide variety of reasons, patients are often unable to communicate their state of health accurately, completely or in a suitable form. In recent years, patient-reported outcomes have emerged as a potentially suitable means of improving this situation; they can support patient-physician communication and thus the diagnosis of individual problems and the subsequent decision-making [1]. On this basis, a higher survival probability of patients has also been demonstrated [2]. Besides these advantages, implementing patient-reported outcomes in a clinical routine characterized by time pressure poses organizational challenges in particular and requires integration into existing processes and workflows that is as seamless as possible. The study “TaBeL”, which uses a waiting-group design, is being conducted within this thematic framework. It investigates whether there are differences in patients' perceived functional impairments and quality of life when physicians have prioritizing overview displays of the collected patient data available during the patient-physician consultation. Methods: In the course of the study, a tablet-based system is being developed to support patients in communicating their perceived quality of life and functional impairments. The treating physicians have a tablet-based checklist available for documenting the state of health. In addition, the collected data are to be appropriately processed and fed back in order to make communication between patients and physicians more effective and to enable informed and prioritized therapy decisions. The system is being iteratively designed and implemented based on the principles of Contextual Design [3]. Through the participation of the relevant user classes, the patients and the clinical staff, a usable solution is to be developed. The approach thus aims not only at realizing the interactive system but centres on a holistic problem solution focused on effective interactions between the various parties involved in the clinical care context. Results: In the project so far, components for collecting the patient-reported outcomes as well as the physician checklist have been designed, implemented and improved in several stages with regard to their usability on the basis of user feedback. During the design phase, the individual steps of the patient-physician consultations were modelled in order to appropriately integrate the individual interactions associated with the tablet-based system into the context and existing workflows. For the physician checklist in particular, the time available during the consultation turned out to be the decisive factor.
As a consequence, structural and design optimizations were made to the checklist, and the number and complexity of the interaction steps required with the tablet-based system were reduced in order to give the physicians more time for the conversation with the patients. For patients, the sociodemographic characteristics of the target group [4] as well as their technology use and experience were also taken into account. Discussion: As the visualization component, which will provide physicians with the data from the patient-reported outcomes and the checklist in aggregated form, has yet to be implemented, no final statements can currently be made about how well the final system will integrate into, and modify, existing clinical work processes in everyday use. Likewise, it is not yet possible to assess the later benefit with regard to the patient-physician interaction, or the effects on therapy decision-making and on patients' perceived quality of life.
Article
Self-care is vital for the successful management of heart failure. Mobile health can enable patients with heart failure to perform effective self-care. This article describes the theory-guided development and beta testing of a mobile application intervention to support self-care and increase symptom awareness in community-dwelling patients with heart failure. Ten participants entered physiologic data, answered qualitative questions about symptoms, and reviewed heart failure education within the HF App daily. Two validated instruments, the Self-care of Heart Failure Index and Heart Failure Somatic Awareness Scale, were administered both before and after the intervention, and results were compared using t tests. Results indicated that there were clinically significant changes from preintervention to postintervention in self-care scores in each subscale, with a statistically significant difference in the confidence subscale scores (P = .037). However, there were no statistically significant differences between preintervention and postintervention symptom awareness scores. These results indicate that incorporating mobile applications that comprise symptom monitoring, reminders, education, and the ability to track trends in physiologic data is most useful to assist individuals with heart failure to perform effective self-care.
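The study's pre/post comparison of instrument scores with t tests can be sketched in a few lines. The following uses SciPy's paired t test, appropriate because each participant is measured before and after the intervention; the scores below are placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical pre/post confidence-subscale scores (Self-care of Heart
# Failure Index) for ten participants -- placeholder values only.
pre  = [55, 60, 48, 62, 50, 58, 45, 63, 52, 57]
post = [62, 66, 55, 70, 58, 61, 52, 71, 60, 64]

# Paired t test: same participants measured at both time points.
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < .05 -> significant change
```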
Article
Hydrogel-based smart wound dressings that combine the traditional favourable properties of hydrogels as skin care materials with functions for sensing biological parameters relevant to the remote monitoring of wound healing are under development. In particular, lightweight ultra-high-frequency radiofrequency identification (UHF RFID) sensors are attached to xyloglucan-poly(vinyl alcohol) hydrogel films to monitor, without batteries, the moisture level of the bandage in contact with the skin, and to wirelessly transmit the measured data to an off-body reader. This study investigates the swelling behavior of the hydrogels in contact with simulated biological fluids, and the modification of their morphology, mechanical properties, and dielectric properties in a wide range of frequencies (10⁰–10⁶ Hz and 10⁸–10¹¹ Hz). The films absorb simulated body fluids up to approximately four times their initial weight, without losing their integrity but undergoing significant microstructural changes. We observed relevant linear increases of electric conductivity and permittivity with the swelling degree, with an abrupt change of slope that is related to the network rearrangements occurring upon swelling.
Article
Background: Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Objectives: Implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. Methods: We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. Results: We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Conclusion: Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
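The design the paper reports — business logic in a BPMN engine (Activiti), medical knowledge in an Arden-Syntax rule engine (ARDENSUITE) — can be illustrated independently of those tools. The Python sketch below uses hypothetical names (run_delivery_workflow, hbv_prophylaxis_rule) and placeholder rule logic that does not reproduce the actual guideline; only the separation of concerns mirrors the paper: the workflow sequence stays fixed while the decision rule is swappable per institution.

```python
from typing import Callable

def hbv_prophylaxis_rule(mother: dict) -> list[str]:
    """Medical knowledge module (the paper uses an Arden-Syntax rule
    engine here). Placeholder logic, not the clinical guideline."""
    actions = []
    if mother.get("hbsag_positive"):
        actions += ["administer HBIG within 12 h", "start HBV vaccination"]
    elif mother.get("hbsag_unknown"):
        actions += ["test mother for HBsAg stat", "start HBV vaccination"]
    return actions

def run_delivery_workflow(mother: dict,
                          decide: Callable[[dict], list[str]]) -> None:
    """Business logic: a fixed sequence of workflow steps. Institutions
    can swap in their own `decide` rule without touching this function."""
    print("step 1: register newborn")
    for action in decide(mother):          # delegated medical decision
        print("step 2: order ->", action)
    print("step 3: document in patient record")

run_delivery_workflow({"hbsag_positive": True}, hbv_prophylaxis_rule)
```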
Article
Scalp electroencephalography (EEG)–based seizure‐detection algorithms applied in a clinical setting should detect a broad range of different seizures with high sensitivity and selectivity and should be easy to use with identical parameter settings for all patients. Available algorithms provide sensitivities between 75% and 90%. EEG seizure patterns with short duration, low amplitude, circumscribed focal activity, high frequency, and unusual morphology as well as EEG seizure patterns obscured by artifacts are generally difficult to detect. Therefore, detection algorithms generally perform worse on seizures of extratemporal origin as compared to those of temporal lobe origin. Specificity (false‐positive alarms) varies between 0.1 and 5 per hour. Low false‐positive alarm rates are of critical importance for acceptance of algorithms in a clinical setting. Reasons for false‐positive alarms include physiological and pathological interictal EEG activities as well as various artifacts. To achieve a stable, reproducible performance (especially concerning specificity), algorithms need to be tested and validated on a large amount of EEG data comprising a complete temporal assessment of all interictal EEG. Patient‐specific algorithms can further improve sensitivity and specificity but need parameter adjustments and training for individual patients. Seizure alarm systems need to provide on‐line calculation with short detection delays in the order of few seconds. Scalp‐EEG–based seizure detection systems can be helpful in an everyday clinical setting in the epilepsy monitoring unit, but at the current stage cannot replace continuous supervision of patients and complete visual review of the acquired data by specially trained personnel. In an outpatient setting, application of scalp‐EEG–based seizure‐detection systems is limited because patients won't tolerate wearing widespread EEG electrode arrays for long periods in everyday life. Recently developed subcutaneous EEG electrodes may offer a solution in this respect.
Article
OBJECTIVE: To describe the average primary care physician consultation length in economically developed and low-income/middle-income countries, and to examine the relationship between consultation length and organisational-level economic, and health outcomes. DESIGN AND OUTCOME MEASURES: This is a systematic review of published and grey literature in English, Chinese, Japanese, Spanish, Portuguese and Russian languages from 1946 to 2016, for articles reporting on primary care physician consultation lengths. Data were extracted and analysed for quality, and linear regression models were constructed to examine the relationship between consultation length and health service outcomes. RESULTS: One hundred and seventy nine studies were identified from 111 publications covering 28 570 712 consultations in 67 countries. Average consultation length differed across the world, ranging from 48 s in Bangladesh to 22.5 min in Sweden. We found that 18 countries representing about 50% of the global population spend 5 min or less with their primary care physicians. We also found significant associations between consultation length and healthcare spending per capita, admissions to hospital with ambulatory sensitive conditions such as diabetes, primary care physician density, physician efficiency and physician satisfaction. CONCLUSION: There are international variations in consultation length, and it is concerning that a large proportion of the global population have only a few minutes with their primary care physicians. Such a short consultation length is likely to adversely affect patient healthcare and physician workload and stress.
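As a rough illustration of the review's analytic step — regressing country-level outcomes on consultation length — here is a minimal statsmodels sketch. The numbers are placeholders, not the review's extracted data, and the single-predictor model is a simplification of the published analysis.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder country-level data: mean consultation length (minutes)
# versus a physician-satisfaction score (illustrative values only).
length = np.array([2.0, 5.0, 7.5, 10.0, 15.0, 22.5])
satisfaction = np.array([3.1, 3.4, 3.9, 4.2, 4.6, 4.8])

X = sm.add_constant(length)        # intercept-plus-slope linear model
fit = sm.OLS(satisfaction, X).fit()
print(fit.params, fit.pvalues)     # positive slope: longer visits, higher satisfaction
```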
Article
The pursuit of digital biomarkers wherein signal outputs from biosensors are collated to inform health-care decisions continues to evolve at a rapid pace. In the field of neurodegenerative disorders, a goal is to augment subjective patient-reported inputs with patient-independent verifiable data that can be used to recommend interventive measures. For example, in the case of Alzheimer's disease, such tools might preselect patients in the presymptomatic and prodromal phases for definitive positron emission tomographic analysis, allowing accurate staging of disease and providing a reference point for subsequent therapeutic and other measures. Selection of appropriate and meaningful digital biomarkers to pursue, however, requires deep understanding of the disease state and its ecological relationship to the instrumental activities of daily living scale. Similar opportunities and challenges exist in a number of other chronic disease states including Parkinson's, Huntington's, and Duchenne's disease, multiple sclerosis, and cardiovascular disease. This review will highlight progress in device technology, the need for holistic approaches for data inputs, and regulatory pathways for adoption. The review focuses on published work from the period 2012-2017 derived from online searches of the most widely used abstracting portals.
Article
Despite controversy, major health systems across the globe are obtaining and making use of genome sequence data in patients they care for, hoping this approach will prove beneficial.¹ Genome sequencing technology, a key driver of precision medicine, has improved substantially in accuracy, speed, and cost. As a consequence, clinicians, health systems, and governments acknowledge that individuals can have their genome sequenced and interpreted for about the cost of commonly used advanced diagnostic imaging tests. This makes obtaining genome sequence data for large numbers of individuals with and without known health issues possible.
Article
Nearly all aspects of modern life are in some way being changed by big data and machine learning. Netflix knows what movies people like to watch and Google knows what people want to know based on their search histories. Indeed, Google has recently begun to replace much of its existing non–machine learning technology with machine learning algorithms, and there is great optimism that these techniques can provide similar improvements across many sectors.
Article
Summary The EEG Commission of the DGKN has formulated guidelines for performing and interpreting EEG examinations, with the aim of ensuring the quality and comparability of findings [1]. By definition, these minimum requirements do not encompass all the knowledge required for qualified practice. A comprehensive treatment of electroencephalography can be found in the relevant textbooks [2].
Article
Gait is emerging as a potential diagnostic tool for cognitive decline. The ‘Deep and Frequent Phenotyping for Experimental Medicine in Dementia Study’ (D&FP) is a multicenter feasibility study embedded in the United Kingdom Dementia Platform designed to determine participant acceptability and feasibility of extensive and repeated phenotyping to determine the optimal combination of biomarkers to detect disease progression and identify early risk of Alzheimer’s disease (AD). Gait is included as a clinical biomarker. The tools to quantify gait in the clinic and home, and suitability for multi-center application have not been examined. Six centers from the National Institute for Health Research Translational Research Collaboration in Dementia initiative recruited 20 individuals with early onset AD. Participants wore a single wearable (tri-axial accelerometer) and completed both clinic-based and free-living gait assessment. A series of macro (behavioral) and micro (spatiotemporal) characteristics were derived from the resultant data using previously validated algorithms. Results indicate good participant acceptability, and potential for use of body-worn sensors in both the clinic and the home. Recommendations for future studies have been provided. Gait has been demonstrated to be a feasible and suitable measure, and future research should examine its suitability as a biomarker in AD.
Article
For various reasons, reimbursement eligibility has so far been comparatively difficult to achieve for digital health applications. Demonstrating benefit is one hurdle. The established and generally accepted procedures for (clinical) benefit assessment, above all parallel-group comparison in randomized controlled trials (RCTs), are primarily designed for questions of medical (added) benefit. Unlike with pharmaceuticals or classic medical devices, however, users in the digital domain benefit more strongly from gains in autonomy, greater awareness and mindfulness, higher transparency of care processes, and improvements in therapy convenience. Digital solutions of an interventional character that serve clinical goals are also available (e.g. for the indications anorexia and depression), but they are less common overall. The usual benefit assessments primarily consider medical outcomes such as morbidity and mortality, without adequately acknowledging the additional beneficial aspects inherent to digital health. The challenge is therefore to develop an evaluation approach that takes the particularities of digital health into account without, however, reducing validity (especially with regard to safety). Concepts are increasingly needed that both maintain continuous feedback loops for adapting and improving an application and at the same time generate sufficient evidence for a complex benefit assessment. This would allow the balance of opportunities and risks in the introduction of digital innovations in healthcare to be assessed more accurately.
Article
A detailed elaboration of the basic criteria for testing would be a subsequent step. Given the variability across medical specialties, there is sufficient potential for diversification here to accommodate the particular requirements of the individual disciplines. Without themselves assuming responsibility and liability for the testing, the professional societies can additionally agree on criteria for evaluation processes that could ultimately enable valid testing. Irrespective of this, the professional societies should carry out content-related assessments within their area of competence and thus make a further contribution to quality assurance.
Chapter
For more than 30 years, usability has been the central criterion for assessing interactive systems with respect to the design of the user interface and the division of tasks between human and machine. In the leisure and entertainment domain, however, the concept of user experience (UX) is increasingly coming into focus. This chapter therefore first distinguishes between usability and user experience. Based on literature reviews and workshops with domain experts, it then concludes that in safety-critical domains such as aviation, medical technology, or transportation, safe and efficient interaction sequences must always take precedence over the user experience, but that certain facets of the UX concept should nevertheless be given greater consideration in the development of safety-critical interactive systems. These include hedonic and aesthetic properties of application systems as well as human emotions such as pride or joy. The chapter then addresses the specific requirements for systematically ensuring usability and user experience in engineering and design processes in safety-critical contexts.
Chapter
From an economic and ethical perspective, increasing efficiency is an essential aspect of securing a functioning health care system. In the context of nursing and care processes, it makes sense to automate data collection and to reuse the data in electronic patient charts serving as information platforms. Standardized electronic health records, standardized classification systems, and increasingly powerful wearables are identified as the main drivers of innovation. This can give rise to an expanded data basis to which methods of artificial intelligence (decision support systems and the like) may also be applied with regard to quality management, research, and teaching.
Article
Rationale. The concept of epilepsy syndromes, introduced in 1989, was defined as "clusters of signs and symptoms customarily occurring together". Definition of epilepsy syndromes based on electro-clinical features facilitated clinical practice and, whenever possible, clinical research in homogeneous groups of patients with epilepsies. Progress in the fields of neuroimaging and genetics made it rapidly clear that, although crucial, the electro-clinical description of epilepsy syndromes was not sufficient to allow the much-needed development of targeted therapies and a better understanding of the underlying pathophysiological mechanisms of seizures. The 2017 ILAE position paper on Classification of the Epilepsies recognized that "as a critical tool for the practicing clinician, epilepsy classification must be relevant and dynamic to changes in thinking". The concept of "epilepsy syndromes" evolved, incorporating issues related to aetiologies and comorbidities. A comprehensive update (and revision where necessary) of the EEG diagnostic criteria in light of the 2017 revised terminology and concepts was deemed necessary. Methods. The work was commissioned by the Neurophysiology Task Force of the ILAE Committee on Diagnostic Methods. Diagnostic criteria and recording procedures were developed by group consensus, reached through an "informal", internal decision-making process. Each working group member was allocated a number of syndromes, and a standard structured template was used. The international literature was extensively reviewed. Results. We developed a simple diagnostic system that is applicable to all epilepsy syndromes and allows the physician (i) to rate the strength of the EEG diagnosis (degree of diagnostic certainty) by weighting EEG findings in relation to the available clinical information or the specific clinical question, and (ii) to suggest further EEG diagnostics where conclusive diagnostic evidence is lacking. We also propose a system of syndrome-specific recording protocols that, used with the relevant clinical presentation or specific clinical question, may maximize activation of epileptic discharges and ultimately help with standardization of EEG recording across departments worldwide. Because recording methodology also depends on available resources, a two-tier system was developed to embrace clinical EEG services in resource-limited and industrialized countries. A clinical practice statement for each of the epilepsy syndromes discussed underscores the crucial role of the clinical information with regard to both the optimization of the EEG recording and, above all, its meaningful interpretation. Part I covers genetic (idiopathic) generalized epilepsies and syndromes, reflex epilepsies, structural and genetic focal (lobar) syndromes, and progressive myoclonus epilepsies [published with educational EEG plates on www.epilepticdisorders.com].
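The Task Force's syndrome-specific weighting rules are not reproduced in the abstract. Purely as an illustration of the general idea of grading diagnostic certainty by combining EEG findings with clinical information, the sketch below maps two (hypothetical, invented) inputs to three certainty levels; it is not the paper's actual decision scheme.

```python
from enum import Enum

class Certainty(Enum):
    CONFIRMATORY = "EEG confirmatory of the syndrome"
    SUPPORTIVE = "EEG supportive; further EEG diagnostics suggested"
    INCONCLUSIVE = "EEG inconclusive; repeat or extended recording suggested"

def rate_certainty(mandatory_eeg_findings_present: bool,
                   clinical_presentation_consistent: bool) -> Certainty:
    # Hypothetical decision rule: the ILAE criteria are
    # syndrome-specific and considerably more detailed.
    if mandatory_eeg_findings_present and clinical_presentation_consistent:
        return Certainty.CONFIRMATORY
    if mandatory_eeg_findings_present or clinical_presentation_consistent:
        return Certainty.SUPPORTIVE
    return Certainty.INCONCLUSIVE
```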
Technical Report
In this work we investigate the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting. Our main contribution is a thorough evaluation of networks of increasing depth, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers. These findings were the basis of our ImageNet Challenge 2014 submission, where our team secured the first and the second places in the localisation and classification tracks respectively.
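As a rough illustration of the paper's central idea of increasing depth with stacks of small 3x3 filters, here is a minimal PyTorch sketch of a 16-weight-layer, VGG-style network. The channel widths follow the commonly cited VGG-16 layout, but this is a simplified reconstruction (dropout and weight initialization omitted), not the authors' code.

```python
import torch
import torch.nn as nn

def vgg_block(in_ch, out_ch, n_convs):
    """n_convs stacked 3x3 convolutions followed by 2x2 max-pooling."""
    layers = []
    for _ in range(n_convs):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
        in_ch = out_ch
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return layers

# 13 convolutional layers + 3 fully connected layers = 16 weight layers.
vgg16 = nn.Sequential(
    *vgg_block(3, 64, 2),
    *vgg_block(64, 128, 2),
    *vgg_block(128, 256, 3),
    *vgg_block(256, 512, 3),
    *vgg_block(512, 512, 3),
    nn.Flatten(),
    nn.Linear(512 * 7 * 7, 4096), nn.ReLU(inplace=True),
    nn.Linear(4096, 4096), nn.ReLU(inplace=True),
    nn.Linear(4096, 1000),  # ImageNet: 1000 classes
)

# A 224x224 RGB input is reduced to 7x7 by five poolings,
# then mapped to 1000 class scores.
scores = vgg16(torch.randn(1, 3, 224, 224))
```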
Book
Software ergonomics is the science of designing the human-computer interface so that it suits both users and applications. Considerations and recommendations in software ergonomics must draw on a variety of scientific disciplines, in particular physiology, psychology, work science, and computer science. Despite the high importance of computer-based tools, their developers rarely possess the necessary ergonomic background knowledge. This book is a scientific yet easily readable introduction to software ergonomics. It is not a "cookbook for programmers" but a methodological introduction to the essential aspects of human-computer communication.
Article
Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE is a useful clinical tool, with potential impact on clinical care, quality assurance, data-sharing, research and education.
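The SCORE terminology itself is published separately and is not reproduced here. The sketch below illustrates, with invented example terms, only the reporting mechanism described in the abstract: initial choices determine the subsequently presented choice sets, and each selection both generates standardized report text and populates a structured database record.

```python
# Hypothetical fragment of a context-sensitive term tree: the options
# offered at each level depend on the choice made at the level above.
TERMS = {
    "Background activity": {
        "Posterior dominant rhythm": ["Normal", "Slow", "Absent"],
    },
    "Epileptiform abnormality": {
        "Spike-and-wave": ["Focal", "Generalized"],
        "Sharp wave": ["Focal", "Multifocal"],
    },
}

def select(feature, subtype, qualifier, record):
    """Validate a selection against the term tree, append it to the
    structured record, and return a standardized report sentence."""
    allowed = TERMS[feature][subtype]
    if qualifier not in allowed:
        raise ValueError(f"{qualifier!r} not offered after {subtype!r}")
    record.append({"feature": feature, "subtype": subtype,
                   "qualifier": qualifier})
    return f"{feature}: {qualifier.lower()} {subtype.lower()}."

record = []   # in SCORE, these features feed into an EEG database
print(select("Epileptiform abnormality", "Spike-and-wave",
             "Generalized", record))
```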
Book
Latest edition of the seminal publication on interaction design and user experience principles, processes, practices, and patterns for desktop, mobile, and web platforms.
Article
Standardized EEG electrode positions are essential for both clinical applications and research. The aim of this guideline is to update and expand the unifying nomenclature and standardized positioning for EEG scalp electrodes. Electrode positions are based on 20% and 10% of standardized measurements from anatomical landmarks on the skull. However, standard recordings do not cover the anterior and basal temporal lobes, which are the most frequent source of epileptogenic activity. Here, we propose a basic array of 25 electrodes including the inferior temporal chain, which should be used for all standard clinical recordings. The nomenclature in the basic array is consistent with the 10-10 system. High-density scalp EEG arrays (64-256 electrodes) allow source imaging with even sub-lobar precision. This supplementary exam should be requested whenever necessary, e.g. when searching for epileptogenic activity after a negative standard EEG or for presurgical evaluation. In the near future, nomenclature for high-density electrode arrays beyond the 10-10 system needs to be defined to allow comparison and standardized recordings across centers. Contrary to the established belief that smaller heads need fewer electrodes, in young children at least as many electrodes as in adults should be applied, owing to smaller skull thickness and the risk of spatial aliasing.
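As a worked illustration of the 20%/10% placement principle the guideline builds on, the sketch below computes midline electrode positions as fractions of the nasion-inion arc. The electrode names are the standard 10-20 midline labels, but the flat-arc arithmetic is a deliberate simplification of measurements taken on a curved skull.

```python
# Midline (sagittal) 10-20 placement: starting 10% of the
# nasion-inion arc above the nasion, electrodes follow at 20% steps.
MIDLINE = ["Fpz", "Fz", "Cz", "Pz", "Oz"]

def midline_positions(nasion_inion_cm):
    """Distance of each midline electrode from the nasion, in cm."""
    fractions = [0.10 + 0.20 * i for i in range(len(MIDLINE))]
    return {name: round(f * nasion_inion_cm, 1)
            for name, f in zip(MIDLINE, fractions)}

# Example: a 36 cm nasion-inion arc puts Cz exactly at the halfway point.
print(midline_positions(36.0))
# {'Fpz': 3.6, 'Fz': 10.8, 'Cz': 18.0, 'Pz': 25.2, 'Oz': 32.4}
```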
Article
Background In North America, heart failure (HF) is the leading cause of hospital readmission. Supportive technology, such as computers and tablets, could potentially assist patients with self-care in managing their condition after hospital discharge; however, older individuals have difficulty adopting technology to manage their condition. Method This study used a mixed-methods design to identify barriers to technology use in HF self-care. In the qualitative phase, semi-structured interviews were conducted with 18 HF patients and 10 informal caregivers or care partners (CP). In the quantitative phase, five questionnaires were administered to 15 patients and 8 CP: the Montreal Cognitive Assessment; the Short Literacy Survey and Subjective Numeracy Scale; the Self-Care of HF Index; the Knowledge Assessment Questionnaire; and the Patient Activation Measure. Results In the qualitative phase, five themes emerged regarding engagement in self-care and technology use: knowledge level of HF; willingness to ask questions related to HF; confidence in making health-related decisions individually; level of technology usage in daily activities; and self-recording of health measurements. Quantitative analysis found that most HF patients had mild cognitive impairment (MCI), adequate health numeracy to understand and manage their health condition, high confidence in managing their condition, and willingness to engage in self-care. Willingness to adopt technology varied. Conclusion Patients were willing to engage in HF self-care; however, they relied on CPs, who were more willing to ask questions about HF. Technology tools may assist in HF self-care, but they must be tailored for use by older individuals.