A digital intelligence visualization health monitoring device for Alzheimer’s disease patients based on WBAN technology (2025)

Introduction

Epidemiologic surveys have shown that the prevalence of AD in China is 3.21% among people aged 65 years1. The total cost of care and screening for people with AD in China was estimated at $248.71 billion in 2020, and this cost is predicted to continue increasing at a rate of 200% every 10 years2. Despite the high costs borne by AD patients, the number of deaths from AD in China has increased by 57.8% over the past 10 years, making AD the sixth leading cause of death among Chinese older adults today3. The main clinical manifestations of AD include cognitive impairment, psychiatric and behavioral symptoms, and reduced activities of daily living, summarized as the ABC symptoms4,5,6, which are specific to each stage of the disease. At the same time, each stage of AD is often accompanied by multiple comorbidities. The disease is therefore characterized by diversity and complexity, and the situation will worsen and threaten countless families and patients if monitoring relies only on offline specialist physicians, who cannot monitor and interpret AD patients effectively and quickly enough to achieve a timely response. On the other hand, hundreds of clinical drugs aimed at treating AD patients have ended in failure, and effective clinical research continues to focus on controlling symptoms, slowing progression, improving patients’ quality of life, and reducing caregivers’ work stress7. The relationship between AD patients and their daily living behaviors has attracted growing research interest in recent decades8. Therefore, smart health monitoring of AD patients is the focus of this paper.

The development of sensors and wireless communication technologies has currently driven a large number of advances in pervasive computing, cloud and wireless technologies, as well as the application of Artificial Intelligence (AI) to the daily life scenarios of AD patients9,10,11. An emerging technology called the wireless body area network (WBAN) has been favored and widely used in human health monitoring. WBAN is a body network that enables human-object communication by connecting sensor nodes in, on, or around the user12. WBAN employs pressure sensors, passive infrared sensors, or wearable accelerometers to detect and assess AD patients’ daily performance, achieving 96.5% accuracy13 and forming a real-time health monitoring ecosystem. This gives AD patients fuller autonomy and flexibility compared with digital monitoring methods such as indoor monitoring arrangements and digital fences.

From a health data and visualization perspective, studies of health visualizations such as IDIM have shown in recent years that collecting complex data from different sources can be a valuable resource for the health assessment of AD patients. However, designing appropriate visualizations is difficult because guidelines for designing and evaluating health visualizations for AD patients are limited. Meanwhile, traditional AD health monitoring relies, on the one hand, on questionnaires such as the Disability Assessment for Dementia (DAD) or Lawton and Brody’s IADL scale14. These methods suffer from informant bias and subjective misinterpretation15, and some people with AD experience a decline in their cognitive state that prevents them from responding correctly to the questionnaire16. On the other hand, detailed history-taking, neurological assessment, and comprehensive mental and physical examination of AD patients often depend on timed monitoring with expensive medical equipment and professional interpretation by physicians17. All of these limitations make assessing the health of people with AD uncertain. They also make it difficult for stakeholder groups such as family members, community caregivers, physicians, and caregivers to intervene in the care of people with AD at a lower time cost, through more flexible scheduling and passive prompting according to their respective responsibilities.

Based on the above considerations, this paper addresses the challenge of health visualization assessment for AD patients by applying a progressive region-enhanced convolutional neural network to the data collected by WBAN, establishing an integrated visualization method across multiple types of health data, and extracting visual features for anomaly detection. The objective is pursued through the following three points:

Understand the user’s behavior and environment, refine the continuity and authenticity of health data, and allow these health condition data to be incorporated into personal health profiles. WBAN technology removes time-consuming and laborious researcher tracking and monitoring from the daily lives of people with AD, increasing the value of telehealth data and making health monitoring a more routine part of their lives in order to improve their well-being.

Overcome the limitations of interpretation and reduce the information bias and cognitive load caused by lay interpretation. Through the visual design approach, we address the cognitive challenges that the application interface’s visual display poses for different user groups, so that they can increase their level of social participation by discussing the recent health conditions of AD patients.

Motivate stakeholder groups to take better care of AD patients, and trigger further appropriate actions related to medical and social decision-making through interaction design strategies.

This method, unlike current tools, is capable of remotely capturing patient performance and actions over continuous time and in real-life situations and performing accurate visual assessments in order to provide AD patients and stakeholder groups with objective health monitoring results and complementary opinions on overall health status18. It also becomes an important component of future health monitoring facilities for AD patients.

Method and equipment

This study is an open, prospective observational trial conducted from 2022 to 2023 in Qiantang Street, Hangzhou, Zhejiang Province, China. It involved 12 months of remote monitoring and 12 brief return visits for Alzheimer’s disease (AD) patients.

The trial was approved by the institutional review board of the China Academy of Art in Hangzhou, and was conducted in strict adherence to the ethical guidelines outlined in the Declaration of Helsinki. All participants and their legal guardians provided written informed consent before the trial began.

Inclusion criteria

The inclusion criteria were designed based on the specific needs of AD patients and the features of the WBAN visualization system, targeting professionals, AD patients, and stakeholders.

  • Professionals: This group included five designers with backgrounds in digital art healing, five engineers with over five years of experience in data analysis, and five clinicians experienced in treating early Alzheimer’s disease.

  • Stakeholder group: This group consisted of 16 caregivers with more than 10 years of experience in supporting AD patients, and 16 family members with more than 3 years of experience. Caregivers were distinguished from family members due to their more hands-on, daily role in patient care, while family members often face time constraints due to work commitments.

AD patients were included if they had visited the First People’s Hospital of Zhejiang Province for assessment within one month before the start of the experiment, had an MMSE score between 21 and 26, and had a CDR score ≤ 0.5. As women have a higher likelihood of developing AD before the age of 65, the sample was split by gender, with 8 males and 8 females. Computerized random numbers were used to ensure fair allocation. Age, condition, and cognitive differences were considered when selecting participants to minimize bias. All participants received equal support, and assessments were conducted by a research team blinded to group assignment to ensure consistency of intervention. The study monitored participant adherence, analyzing data only for patients who completed the 12-month cycle and using intention-to-treat analysis to address dropout. AD patients were also required to be able to wear glasses, belts, and watches, and to engage in at least one outdoor activity every two days. Additionally, patients could not have a physiological intolerance to silicone materials and could not have color vision deficiencies.

All participants included in the study signed a written informed consent form (Table 1).


Design process

The research process followed a general structure of grouping, training, intervention, analysis, and comparison. We tested the usefulness of data visualization under different monitoring protocols for AD patients and stakeholder groups through a comparative study of two health monitoring programs. One group was the traditional physical examination visualization group (control group), whose main feature is that AD patients need to arrive at the hospital regularly to complete medical tests suitable for early AD patients, such as ECG, EEG, and CCP, under the guidance of doctors, and are then able to independently interpret the professional medical test visualization reports to the stakeholder groups. The other group, the WBAN monitoring visualization group, has the same functional purpose as the control group but uses a health monitoring visualization kit that we named AD-Cloud, which contains a system for data collection and visualization processing, as well as a visualization application with image display, real-time interaction, and motivational strategies. Thus, although both groups had visualization in common, they differed in real-time capability, continuity, interactivity, and degree of visualization.

  • Grouping: To minimize selection bias, 16 AD patients were assigned to the WBAN monitoring group or the control group through a computer-generated randomization sequence (generated via Research Randomizer v5.0), stratified by gender and baseline MMSE scores (21–23 vs. 24–26). The randomization used a block size of 4 (2 WBAN : 2 control per block) to ensure balanced group allocation even with small strata. Allocation concealment was enforced using sequentially numbered, opaque sealed envelopes (SNOSE), which were opened only after participants completed baseline assessments. At the same time, we used blinding and bias control to reduce bias. On the one hand, members of the research team responsible for evaluating clinical outcomes (e.g., MMSE score changes, task completion rates) were blinded to group assignments, and assessment data were labeled with anonymized IDs to prevent unmasking. On the other hand, to reduce bias in patient behavior, both groups received equivalent time and attention from researchers during training sessions, and caregivers in the control group were not informed about the WBAN system’s functionalities to prevent differential encouragement. (Table 1)

  • Training: After obtaining consent, all participants attended a training session led by professional staff. The session covered tasks, the withdrawal process, equipment usage, and informed consent.

  • Intervention: For the study, we assigned 24 tasks per month to the traditional physical examination visualization group and the WBAN monitoring visualization group. (1) AD patients in the WBAN monitoring visualization group were required to complete two types of tasks. First, they had to wear the flexible wearable gear nine times per month on separate days, i.e., on average once every three days. This requirement ensures the integrity of the flexible wearable equipment in collecting data, its usability when outdoors, and the soundness of the flexible equipment design, highlighting the difference between WBAN and other AD patient monitoring equipment. Second, they had to view the health visualization charts collated by AD-Cloud on a smartphone 15 times a month and complete health sharing and interpretation of them with the stakeholder groups, i.e., on average once every two days. The digital sharing and interpretation are intended to avoid misinterpretation of the WBAN monitoring visualization caused by the digital divide among AD patients. (2) AD patients in the traditional physical examination visualization group (control group) received a traditional monitoring method based on physical examination, i.e., monthly visits to the hospital for routine cognitive assessment and physical examination by healthcare professionals. At each visit, AD patients underwent a series of examinations including neurological assessment, cognitive assessment using standardized scales such as the Mini-Mental State Examination (MMSE), and monitoring of basic vital signs (e.g., heart rate, blood pressure). Patients in the control group were not provided with any visual or digital health tools for self-monitoring. Instead, they relied on regular face-to-face consultations to gauge their progress, which is common in the traditional care of patients with Alzheimer’s disease. Caregivers maintained daily logs documenting patients’ adherence to behavioral changes; logs were cross-verified with hospital attendance records during monthly visits. These visits were the primary method of tracking the health status and disease progression of this group of patients. Compliance in the control group was monitored by tracking attendance at monthly scheduled visits. Patients were asked to report any non-attendance at appointments and were followed up with telephone calls to ensure that they were adhering to the prescribed monitoring schedule. Researchers conducted biweekly calls to caregivers to confirm appointment attendance and address logistical challenges (e.g., transportation issues). More than two missed visits triggered a home visit by a community health worker. The monitoring modality was evaluated by a questionnaire completed by the control group at the end of each visit, assessing their overall satisfaction with the traditional monitoring method, barriers to participation, and any challenges they had encountered with the traditional model of care.

  • Analysis: Qualitative and quantitative analyses were conducted over the course of the 12-month monitoring period to identify the intervention strengths and weaknesses of the two visualization modalities in terms of phasing. For qualitative analysis, specific user feedback on system functionality, interface, ease of use, and device comfort was extracted from semi-structured interviews with AD patients conducted by professionals on the 29th day of each month. Interview data were coded using a thematic analysis approach and major themes and issues were summarized. For quantitative analysis, task completion time, error rate, frequency of activities, and other indicators were analyzed on the same day by counting task completion and health data (e.g., gait monitoring, ECG, HR, etc.) provided by the WBAN backend.

  • Comparison: The main comparisons between the two groups centred on task completion rates, patient engagement and adherence to health monitoring tasks. The experimental group used a WBAN-based monitoring system that provided continuous health data visualisation and feedback, whereas the control group relied on traditional, episodic hospital visits. This distinction allowed us to assess the relative effectiveness of digital health monitoring versus traditional methods in promoting long-term patient adherence and engagement. By providing more detailed information about the control group’s monitoring methods and adherence measures, we ensured clarity in the comparison of the two groups and strengthened the overall validity of the study.

Ethical considerations and privacy

Throughout the study, we ensured that all participants’ privacy was protected. In addition to obtaining informed consent, the study followed all relevant ethical guidelines. If any patients withdrew from the study, their data were immediately deleted to ensure confidentiality. Due to cognitive decline in AD patients, informed consent was updated every two months to confirm their willingness to continue participating.

System and visual design

To better serve Alzheimer’s disease (AD) patients with WBAN monitoring technology, we analyzed the behavioral, physiological, and psychological characteristics of AD patients. Combining WBAN technology and data visualization design, we developed the AD-Cloud health monitoring system for AD patients. The system includes an information framework, flexible wearable devices, an algorithmic framework, and visual data presentation.

Information system and flexible equipment

The AD-Cloud system follows the technical standards of WBAN, incorporating an information transmission framework and flexible wearable devices to support health tracking for AD patients. The information transmission system is based on widely used technologies in China and communication tools familiar to the patients’ caregivers, ensuring ease of use and continuous data transfer. The flexible wearable devices are designed for comfort, considering the daily needs of AD patients and the specific behaviors associated with their condition. These devices are also comprehensive in terms of collecting a variety of data.

Information transfer framework

AD-Cloud builds a three-layer communication system based on WBAN for intra-system, inter-system, and extra-system communication (Fig. 1).

Layer 1-Communication within the system: The flexible wearable device acts as a personal network hub. It includes six sensor nodes that form a star network topology. Data are transmitted using Bluetooth Low Energy at a rate of 100 Hz. If there is no data update, the sensors enter a sleep mode. To ensure security, the sensors and the central processing unit (CPU) in Layer 2 implement end-to-end encryption (E2EE) for data transmission; only authorized devices can decrypt the data, protecting against unauthorized access. Each sensor node collects data on the following:

  • Behavioral data: verbal data, body data, scene data, etc.

  • Physiological data: electrocardiogram (ECG), electroencephalogram (EEG), blood oxygen saturation (SpO2), blood pressure (BP), heart rate variability (HRV), etc.

  • Psychological data: heart rate (HR), skin conductance level (SCL), respiratory rate (RR), cardiopulmonary coupling (CPC), and other physiological and behavioral data indicative of psychological state.

The goal is to ensure continuous health data collection that seamlessly integrates into the daily lives of AD patients.
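The Layer-1 behavior described above (100 Hz sampling, a sleep mode when there is no data update, and encryption before transfer) can be sketched as a simple node loop. This is an illustrative sketch only; the timing constants, frame format, and function names are our assumptions, not the device firmware.

```python
import time
from dataclasses import dataclass
from typing import Callable, Optional

SAMPLE_PERIOD_S = 1 / 100          # 100 Hz sampling, as described for Layer 1

@dataclass
class SensorFrame:
    node_id: str                   # e.g. "belt" or "insole_left" (illustrative names)
    timestamp: float
    payload: bytes                 # reading, encrypted before it leaves the node

def read_sensor() -> Optional[bytes]:
    """Return a new reading, or None when there is nothing to report (placeholder)."""
    return None

def encrypt(data: bytes) -> bytes:
    """Stand-in for the node-side end-to-end encryption step."""
    return data

def node_loop(node_id: str, send_to_hub: Callable[[SensorFrame], None]) -> None:
    """Sample at 100 Hz and drop into a sleep state while no data update is available."""
    while True:
        reading = read_sensor()
        if reading is None:
            time.sleep(0.5)        # sensor "sleep mode" when idle (interval assumed)
            continue
        send_to_hub(SensorFrame(node_id, time.time(), encrypt(reading)))
        time.sleep(SAMPLE_PERIOD_S)
```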

Layer 2-Communication between systems: The cloud server and the central processing unit (Hub) form the second layer. The Hub collects data from the sensors, timestamps it, and categorizes it. The cloud server stores the multimodal data in a MongoDB distributed database and employs TCP/IP protocols to ensure data integrity and prevent packet loss. The server includes a PRENet data processing module, which uses a progressive convolutional neural network to analyze the collected data in multiple layers, generating real-time visualizations. If any data are abnormal, the system triggers an emergency response. To ensure privacy, all data are encrypted using AES-128 during transmission and stored in compliance with data protection standards. Personal information is anonymized, ensuring patient privacy. If a patient withdraws from the study, all their data are permanently deleted.
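A minimal sketch of the Layer-2 hub path (timestamping, categorization, AES-128 encryption in transit, MongoDB storage) follows. The category mapping, collection name, and connection URI are assumptions for illustration; key exchange and the TCP/IP transport are omitted.

```python
import json, os, time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from pymongo import MongoClient

key = os.urandom(16)                      # AES-128 key (16 bytes); real key management omitted
aes = AESGCM(key)
db = MongoClient("mongodb://localhost:27017")["ad_cloud"]   # placeholder URI and database name

def categorize(frame: dict) -> str:
    """Rough category rule: behavioral / physiological / psychological (assumed mapping)."""
    kind = frame.get("kind", "")
    if kind in {"gait", "posture", "speech"}:
        return "behavioral"
    if kind in {"ecg", "eeg", "spo2", "bp", "hrv"}:
        return "physiological"
    return "psychological"

def ingest(frame: dict) -> None:
    frame["hub_timestamp"] = time.time()            # Hub-side timestamping
    frame["category"] = categorize(frame)
    nonce = os.urandom(12)
    ciphertext = aes.encrypt(nonce, json.dumps(frame).encode(), None)   # AES-128-GCM
    db.readings.insert_one({"nonce": nonce, "data": ciphertext,
                            "category": frame["category"]})
```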

Layer 3-Out-of-system communication: Stakeholders, including caregivers and clinicians, can access the health data via mobile smartphones with 3G/4G/5G connectivity. Multi-factor authentication (MFA) ensures secure access to the cloud server, enabling stakeholders to monitor patients’ health and engage with them socially, even when patients are isolated.

Flexible wearable devices

The flexible wearable device, as represented in Fig. 1, consists of a central node and five distributed nodes arranged in a star topology at different body locations. The central node is the smart glasses, and the five distributed nodes are the smart bracelets at the wrists of both arms, the smart belt at the waist, and the smart insoles at both feet. In association with the daily-life symptoms of AD patients (Fig. 2), rich multi-scale background is incorporated into the global information to enhance the full coverage of behavioral, physiological, and psychological data of AD patients and improve their global health monitoring (Fig. 3).

Symptoms of reduced activities of AD patients.


Information collection of flexible wearable devices.


Smart glasses: The smart glasses are equipped with a bone-conduction headset MEMS chip to record verbal communication, analyzing speech fluency, grammar, and diction to assess the degree of speech degradation. The glasses also include a 120° wide-angle camera to capture the environment, identifying any hallucinations or visual dysfunctions. The glasses store data on a 128 GB MicroSD card and aggregate information from the other wearable devices. This data is then uploaded to the cloud server for analysis.

Smart belt: This device tracks the AD patient’s posture and body movements. It includes a Beidou GNSS chip for geolocation and a T100 accelerometer and gyroscope to monitor lumbar flexion, twisting, and posture. This data helps identify long periods of sitting, standing, or lumbar muscle weakness.

Smart bracelet: Similar to common smart bracelets, this device monitors physiological data such as ECG, EEG, and EMG. It also includes an IMEC behavioral radar chip to track arm movements, checking whether they are consistent with specific tasks. Additionally, the bracelet monitors emotional stress levels through a galvanic skin response sensor, detecting signs of agitation or apathy.

Smart insole: The smart insole is the only wearable device with discontinuous contact with the body, used mainly when the patient is moving outdoors. The insole tracks foot pressure and gait, helping to monitor changes in activity when the patient engages with the outside world. It uses an IMEC behavioral radar chip and a triaxial accelerometer to monitor foot pressure and gait, while also tracking the effect of environmental factors, such as air quality, on the patient’s health.
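To make the roles of the six nodes concrete, the following is a sketch of how one multimodal sample might be structured before upload. Field names, types, and units are illustrative assumptions, not the actual AD-Cloud data schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MultimodalSample:
    timestamp: float                          # seconds since epoch, set by the hub
    # Smart glasses
    speech_clip: Optional[bytes] = None       # bone-conduction audio for fluency analysis
    scene_frame: Optional[bytes] = None       # frame from the 120-degree camera
    # Smart belt
    location: Optional[Tuple[float, float]] = None   # (lat, lon) from the Beidou GNSS chip
    lumbar_angles: List[float] = field(default_factory=list)   # flexion/twist, degrees
    # Smart bracelets (left/right)
    ecg: List[float] = field(default_factory=list)
    skin_conductance: Optional[float] = None          # proxy for agitation or apathy
    # Smart insoles (left/right)
    foot_pressure: List[float] = field(default_factory=list)
    gait_cadence: Optional[float] = None              # steps per minute
```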

The WBAN-based AD-Cloud information delivery system provides more real-time and comprehensive health data for daily multimodal health information of AD patients through flexible wearable devices.

  • Visualized information is tightly linked to the real life of AD patients through flexible wearable gear to ensure authenticity of health monitoring.

  • Multimodal health information visualization requires substantial computing power, which can be provided by remote servers to drive the daily health decisions of AD patients.

  • AD-Cloud’s visualized information is comprehensible to all stakeholder groups as a way to motivate stakeholder groups to provide better care to AD patients and reduce the risk of social isolation of AD patients.

Algorithmic framework and visual presentation

In developing the visualization technique, we focused on two main components: (a) continuity and veracity of the multimodal data after it has been deconstructed, and (b) improved recognition without misinterpretation.

Network analysis framework for multimodal data

Most WBAN systems focus primarily on data transfer; in this study we build on this technology. Inspired by19, we use a Progressive Region Enhancement Network (PRENet). The progressive training strategy cross-learns information from different data categories, and the regional enhancement emphasizes local features arising from individual differences among AD patients.

The pre-processed data used for model training was obtained from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) database, which contains multimodal data such as MRI scans, PET scans, cognitive test results, and clinical statistics. The ADNI database includes data from 443 AD patients, which serves as the baseline for our network analysis framework.

During the initial stage of framework development, multimodal data collected by the flexible wearable devices were reorganized and recombined for visualization. This technique classifies the data both horizontally and longitudinally (Fig.4):

  • Horizontal data: This involves the automatic integration and discovery of information across different domains. The data is classified into categories like behavioral, physiological, psychological data, and further separated based on file formats (e.g., image, audio, GIS, medical data).

  • Longitudinal data: This enhances the system’s understanding of data semantics by organizing information into layers (e.g., time, magnitude, references like blood pressure and heart rate, and hierarchical structures of health phenomena).

Multimodal information intersections.


The classified data are then fed into PRENet, which uses ResNet as its backbone. At the same time, in order to reduce the computational burden on the cloud server, the four types of data other than the graphical data are rhythmized into one dimension (Fig. 5).
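The text does not specify how the non-image data are "rhythmized" into one dimension; the sketch below assumes it means resampling each stream onto a uniform time grid and normalizing it into a one-dimensional rhythm curve. The target rate and normalization are our assumptions.

```python
import numpy as np

def rhythmize(values: np.ndarray, timestamps: np.ndarray, target_hz: float = 1.0) -> np.ndarray:
    """Resample an irregular stream onto a uniform 1-D grid and scale it to [0, 1]."""
    grid = np.arange(timestamps[0], timestamps[-1], 1.0 / target_hz)
    resampled = np.interp(grid, timestamps, values)
    lo, hi = resampled.min(), resampled.max()
    return (resampled - lo) / (hi - lo + 1e-8)

# Example: an unevenly sampled heart-rate stream becomes a uniform rhythm curve.
t = np.cumsum(np.random.uniform(0.5, 1.5, size=200))
hr = 70 + 5 * np.sin(t / 30) + np.random.randn(200)
print(rhythmize(hr, t).shape)
```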

Convolutional neural network architecture.


PRENet mainly focuses on progressive local feature learning and regional feature enhancement. The former uses a progressive training strategy to learn multi-scale local features, improving the precision of data analysis. The latter enhances the local feature representation using self-attention mechanisms to integrate richer contextual information into the local features.

The processed graphical data are transformed into 2D images for visualization. These images are divided into three types of square windows (128 × 12 × 12) to represent objects, people, and scenes. This makes it easier to combine them with the one-dimensional data for cross-referencing and progressive analysis. For example, gait information in the behavioral data is pre-processed as a graph of one-dimensional rhythms; the gait in abnormal states is compared with the baseline gait using 128 × 2 × 28 bar windows and undergoes eight stages of progressive analysis.

Global Feature Learning, inspired by20, aims to provide better certainty about the long-term health status of AD patients, rather than focusing solely on abnormalities at specific moments in time. To achieve this, we use Global Average Pooling (GAP) for global feature extraction:

$${f}_{Glo}=GAP({f}^{g})$$

For Progressive Local Feature Learning, abnormal features in the time series are spread outward until they reach normal feature points. Using Global Maximum Pooling (GMP), we obtain local feature vectors:

$${f}_{Loc}^{u}=GMP({f}^{u})$$
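As a minimal illustration of the two pooling operations above, global features can be obtained by average-pooling a backbone feature map and local feature vectors by max-pooling a regional sub-map. This is a sketch only; the tensor shapes and the use of PyTorch are our assumptions, not the authors' PRENet implementation.

```python
import torch
import torch.nn.functional as F

def global_feature(feature_map: torch.Tensor) -> torch.Tensor:
    """f_Glo = GAP(f^g): average over the spatial dimensions of an (N, C, H, W) map."""
    return F.adaptive_avg_pool2d(feature_map, 1).flatten(1)  # (N, C)

def local_feature(region_map: torch.Tensor) -> torch.Tensor:
    """f_Loc^u = GMP(f^u): max over the spatial dimensions of a regional sub-map."""
    return F.adaptive_max_pool2d(region_map, 1).flatten(1)   # (N, C)

# Example with assumed shapes: a backbone map and one region of the same map.
f_g = torch.randn(2, 512, 8, 8)   # global feature map from the ResNet backbone (assumed size)
f_u = f_g[:, :, 2:6, 2:6]         # a local window/region of that map (illustrative)
print(global_feature(f_g).shape, local_feature(f_u).shape)
```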

PRENet in this paper is more suitable for early Alzheimer’s disease (AD) health monitoring than traditional algorithms because its multi-layer feature extraction and adaptive processing capabilities can capture subtle changes in the health status of AD patients. In contrast, traditional convolutional neural networks (CNNs), such as the CNN structure proposed by LeCun et al. (1998), can effectively process visual data but have limited flexibility for multimodal data21. Recurrent neural networks (RNNs), such as the LSTM structure of Hochreiter and Schmidhuber (1997), although good at sequential data, underperform in the integrated processing of visual and behavioral data22. In addition, the Transformer (Vaswani et al., 2017), based on the self-attention mechanism, although superior for long-range dependencies, is computationally expensive and unsuitable for continuous low-latency monitoring23. The progressive region-enhanced convolutional neural network is more advantageous in step-by-step extraction and asymptotic aggregation of information, and is suitable for the multimodal and real-time requirements of AD monitoring.

Visual presentation of data

From the network analysis framework we were able to effectively obtain the appropriate numerical information, and at the same time we needed to design the visual presentation of the results.

To establish a consensual graphical interpretation shared by AD patients and stakeholder groups, we took the one-dimensionalized waveform graphs of behavioral, physiological, and psychological abnormalities output by the network analysis framework and expressed them as a Mitsubishi gemstone shape. This effectively excludes divergent interpretations of the information by different users at the visualization stage, i.e., all the information is included in the monitoring results at a given time scale. The structure is divided into the following three parts (Fig. 6).

  • The form structure. The Mitsubishi gemstone shaped form structure is divided into an inner space and an outer space. The inner space refers to the health baseline of the first and second stages of AD; when the AD health monitoring data are normal, the data are contained in the inner space. In case of abnormal data, the data extend beyond the inner space into the outer space and are bounded by the health baseline of the next stage of AD. To make the data easier to interpret in one interface, we render the Mitsubishi gemstone shape transparently. To emphasize the health monitoring status throughout the day, we also add lighting rendering effects to the prisms of the Mitsubishi gemstone shape. That is, when a day’s health data are within the healthy range, the Mitsubishi gemstone shape renders green; when 1/3 of the data exceed the health baseline for that stage, it renders yellow; and when 2/3 of the health data are abnormal, it renders red and pushes the abnormality report to the stakeholder group together with a feasible solution (a minimal code sketch of this color rule follows the list below).

  • Data morphology. We divide the behavioral, physiological, and psychological information into three parts based on the shape structure, at 120-degree intervals on the horizontal plane, to emphasize the equal importance of the three types of information. In the presentation of data morphology changes, behavioral information is yellow, physiological information is blue, and psychological information is red, reflected by a gradient color from the center outward. When abnormal data exceed the shape structure more obviously, the color of the portion beyond the structure becomes deeper to further enrich the data morphology.

  • Auxiliary annotation. The vertical axis represents the 24-hour time axis of a day; information is stacked from the bottom up according to the collection time. Abnormal phenomena are indicated on the horizontal axis, and abnormal data at different time points are labeled with three color depths of yellow, blue, and red to help recall when the abnormal phenomenon occurred.
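A minimal sketch of the green/yellow/red rule described in the form-structure item above. The thresholds follow the text; the function boundaries and the data representation are illustrative assumptions.

```python
def daily_render_color(readings_abnormal):
    """Map one day's monitoring results (list of abnormal/normal flags) to the gemstone light."""
    if not readings_abnormal:
        return "green"
    abnormal_fraction = sum(readings_abnormal) / len(readings_abnormal)
    if abnormal_fraction >= 2 / 3:
        return "red"          # red also pushes an abnormality report with a feasible solution
    if abnormal_fraction >= 1 / 3:
        return "yellow"
    return "green"

# Example: 10 of 24 hourly aggregates exceed the stage baseline -> "yellow".
print(daily_render_color([True] * 10 + [False] * 14))
```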

Data visualization.


The data undergo progressive convolutional neural computation and are then cross-referenced in the description, annotation, and organization of the various data types, which reveals the deeper information embedded in the multimodal data of AD patients, such as concepts, entities, events, and social contexts, as well as their relational structures. It can depict not only the overall characteristics of AD patients but also fine-grained objects.

Visual interface interaction applications and strategies

We created a visual and intuitive integrated graphical display for the user application interface. Graphical comprehension theory24 was emphasized in the process to reduce the reasoning and top-down processing steps required during user manipulation and to accommodate possible errors in graphical interpretation by the user. The interface provides a smartphone application through which AD patients and stakeholder groups query the information provided by the system; it contains statistics on daily status and abnormal status and indicates the various risk levels AD patients face while performing social activities. To achieve this purpose, we designed interface interaction functions such as home page operation, different user display screens, and emergency notifications.

Interface information design. Considering the possible memory loss problems of AD patients, we emphasized information consistency. The difference from traditional health monitoring is the emphasis on visualization under the WBAN system and the focus on abnormal data. Therefore, the first level of the interface contains only the visualization model and quick-contact avatars of the stakeholder groups, without hierarchical options such as “Health Checkup”, “Health Advice”, or “Health Sharing”. When an AD patient has an abnormal condition, the specific abnormal data appear in the second level of the interface, while non-abnormal data can only be viewed during an Alzheimer’s clinician consultation. A deeper consideration is that by reducing the exposure of information, the health and privacy anxieties associated with constant monitoring of AD patients can be reduced. (Fig. 7a)

Visual interactive interface. (a) Interface information design. (b) Interaction design. (c) Visual graphic design. (d) Emergency Notification.


Interaction design. Considering the digital divide that may exist for AD patients, we follow interaction rules similar to those of most smartphones. Building on the information-consistent interface, all interactions are performed only through gestures, to further simplify interaction and reduce the re-learning burden. The “click” gesture is used to pre-confirm anomalous information, contact the stakeholder group, and go to the second level of the interface (Fig. 7b1). “Two-finger manipulation” gestures and “swipe” gestures are used for zooming in and out and rotating the Mitsubishi gemstone model, respectively. This allows AD patients to rotate the 3D model and read the results more intuitively from its stylized undulations, without the need for in-depth interpretation (Fig. 7b2,b3).

Visual graphic design. Since contemporary Chinese elderly people typically have 4–6 children who live at different distances and have different capacities for support, the stakeholder groups around an AD patient frequently change. For this reason we followed the international design standard ISO/IEC TR 19766:2007 for elderly and cognitively impaired people, which unifies the image of the different stakeholder groups in the minds of AD patients and reduces cognitive burden. To this end, we weakened the visual features of traditional personalized avatars with complex characteristics and used simplified feature objects to represent the stakeholder groups, including a stethoscope to symbolize clinicians, crossed arms to symbolize caregivers, and an inclusive loving house to symbolize family members. We also designed the colors of the graphics in view of the small screens of smartphones25 and the prevalence of presbyopia in AD patients, filling the negative shape with higher-saturation, recognizable colors and increasing the proportion of color in the graphic, so that AD patients can recognize the image buttons by their colors rather than relying solely on the image styling. (Fig. 7c)

Emergency notification. This is divided into the alert UI design and the quick call operation. When the data pattern in the Mitsubishi gemstone shape exceeds the health baseline, the alert light stays on and an alert sound is emitted to attract the attention of the people around the AD patient. The abnormal situation is also synchronized and sent to the stakeholder group, and contact with the stakeholder group is automatically initiated. The alarm light turns off only when a stakeholder answers the contact or when the AD patient no longer shows an abnormal status. For the quick call, when the user long-presses another user’s avatar, the call is placed directly, without further operations such as confirming the call, to cover situations in which some abnormal information has not yet been received by the device but the AD patient’s life safety is already at risk. (Fig. 7d)
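A minimal sketch of the alert logic described above. State names, the update cadence, and the contact hook are illustrative assumptions, not the deployed application logic.

```python
from enum import Enum, auto

class AlertState(Enum):
    IDLE = auto()
    ALERTING = auto()        # light on, sound playing, stakeholders notified

def notify_stakeholders() -> None:
    print("abnormality report pushed; contact with stakeholder group initiated")  # placeholder

def update_alert(state: AlertState, abnormal: bool, stakeholder_connected: bool) -> AlertState:
    """The alarm stays on until a stakeholder answers or the abnormal status clears."""
    if state is AlertState.IDLE and abnormal:
        notify_stakeholders()
        return AlertState.ALERTING
    if state is AlertState.ALERTING and (stakeholder_connected or not abnormal):
        return AlertState.IDLE          # light and sound turn off
    return state

def on_avatar_long_press(phone_number: str) -> None:
    """Quick call: long-pressing an avatar dials immediately, with no confirmation step."""
    print(f"dialing {phone_number} without confirmation")
```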

The visualization interface based on AD patients and stakeholder groups increases remote monitoring of AD patients by stakeholder groups by improving the recognition of visual data, increasing the frequency of data use and the depth of knowledge about AD dilemmas and solutions.

Results

Although both monitoring modalities had the same purpose, compliance and attitudes of AD patients were quite different, and we obtained this conclusion through quantitative and qualitative analyses.

Quantitative analysis

We performed a quantitative analysis of the WBAN health monitoring visualization group and the traditional physical examination visualization group to obtain differences in adherence among the two groups. (Table 2).


The WBAN health monitoring visualization group exhibited a mean task completion rate of 78% over the 12-month monitoring period. In the initial 6 months, patients experienced some resistance due to unfamiliarity with the daily use of the AD-Cloud equipment and its interactive application, leading to fluctuations in task completion rates. As patients became more proficient with the system, task completion improved, with rates exceeding the imposed task limits in the latter 6 months, indicating increased trust in the equipment.

In contrast, the traditional physical examination visualization group had a mean task completion rate of 61% over the same period, 17 percentage points lower than the WBAN group. Initially, patients attempted to complete tasks, maintaining an 83% completion rate in the first 3 months, influenced by reminders from the stakeholder group. However, factors such as verbalization difficulties and mood fluctuations led to decreased adherence, with patients ultimately attending only monthly hospital visits. Notably, during the final month, some patients reduced the frequency of hospital visits, reflecting a decline in interest in traditional health checkups.

To evaluate the significance of differences in task completion rates between the WBAN and control groups, we performed an independent samples t-test after confirming normality (Shapiro–Wilk test, p > 0.05) and homogeneity of variances (Levene’s test, F = 1.23, p = 0.28).

WBAN Group: Mean compliance = 78.0% (SD = 8.2%, 95% CI [73.1, 82.9]).

Control Group: Mean compliance = 61.0% (SD = 11.4%, 95% CI [53.2, 68.8]).

Normality of task compliance data was confirmed via the Shapiro–Wilk test (WBAN group: W = 0.97, p = 0.82; control group: W = 0.93, p = 0.31). Homogeneity of variances was verified using Levene’s test (F = 1.23, p = 0.28). An independent two-tailed t-test revealed that the WBAN group had significantly higher compliance rates than the control group (t(14) = 4.32, p < 0.001, 95% CI [73.1%, 82.9%] vs. [53.2%, 68.8%]). The effect size, calculated via Cohen’s d, was 1.73 (95% CI [0.89, 2.57]), indicating a large magnitude of difference between groups. To validate robustness, a Mann–Whitney U test (non-parametric) was performed, yielding consistent results (U = 12, p = 0.002, two-tailed). Sensitivity analysis excluding dropouts showed no substantial changes in significance (p < 0.005).
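A minimal sketch of this analysis pipeline using SciPy. The per-patient compliance values below are placeholders, not the study data; only the group size of eight patients per arm follows the trial design.

```python
import numpy as np
from scipy import stats

# Placeholder per-patient compliance rates (%) -- illustrative values, 8 patients per group.
wban    = np.array([70, 74, 76, 78, 80, 82, 84, 80], dtype=float)
control = np.array([50, 55, 58, 60, 63, 66, 70, 66], dtype=float)

# Assumption checks reported in the paper: normality per group and equal variances.
print(stats.shapiro(wban), stats.shapiro(control))        # Shapiro-Wilk
print(stats.levene(wban, control))                        # Levene's test

# Independent two-tailed t-test and a non-parametric confirmation.
print(stats.ttest_ind(wban, control, equal_var=True))
print(stats.mannwhitneyu(wban, control, alternative="two-sided"))

# Cohen's d with a pooled standard deviation.
pooled_sd = np.sqrt((wban.var(ddof=1) + control.var(ddof=1)) / 2)
print("Cohen's d:", (wban.mean() - control.mean()) / pooled_sd)
```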

These findings suggest that the WBAN health monitoring visualization system enhances patient engagement and adherence compared to traditional methods. The improved adherence in the WBAN group may be attributed to the system’s user-friendly interface and real-time data visualization, which empower patients to actively participate in their health management.

Qualitative analysis

Qualitative analysis was performed by obtaining the positive and negative attitudes of the WBAN health monitoring visualization group and the traditional physical examination visualization group toward their respective health monitoring through a monthly return-visit questionnaire, and charting the analysis in terms of qualitative attitude scores. Differences in positive/negative feedback rates between groups were analyzed using chi-square tests with Yates’ correction.

(Table 3).


On the positive side, the WBAN group showed a positive rate of 84.7% (13/16 participants) versus 46.0% (7/16 participants) in the control group (χ2(1) = 6.89, p = 0.009, OR = 6.25, 95% CI [1.56, 25.0]). In terms of visualization in particular, the positive rate of the WBAN health monitoring visualization group exceeded that of the traditional physical examination visualization group by 65.3%. AD patients in the WBAN group said that the simple interface and intuitive data presentation suited their understanding of their own health, and expressed a desire for more health visualization graphics that they could understand. Meanwhile, the traditional physical examination visualization group was less positive about visualization than the WBAN group because the traditional health report presents everything with equal emphasis and uses medical jargon; nevertheless, both groups reported that dialogue shared around the health data allowed them to better understand their health status. As a result, the positive rates in both groups exceeded the baseline positive rate of 60%.

On the negative side, the WBAN group showed a negative rate of 27.4% (4/16 participants) versus 80.3% (13/16 participants) in the control group (χ2(1) = 10.24, p = 0.001, OR = 0.11, 95% CI [0.03, 0.46]). The biggest difference was that more than 88.6% of AD patients in the traditional health checkup group did not want to complete the task in a timely manner; although these patients understood the benefits of the study, their inability to express the specialized charts in their own words and the scheduled, timed medical checkups were the biggest barriers. In addition, the effort of sharing the visualization graphics, which was often a one-time event that did not create a lasting conversation, led to dissatisfaction with visualization in up to 60.3% of the traditional health checkup group. We also found an interesting point: although we continued to simplify the cognitive obstacles posed by visualization and interactive behavior in AD-Cloud, this convenience led 10% of AD patients to prefer not to complete the task in a timely manner, instead choosing a more convenient time slot as an opportunity to communicate with the stakeholder group.
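A minimal sketch of the chi-square comparison with Yates' correction on the reported positive-feedback counts (13/16 vs. 7/16). The 2 × 2 layout of positive versus non-positive responses is an assumption about how the counts were tabulated.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Positive-feedback counts reported above: 13/16 (WBAN) vs. 7/16 (control).
table = np.array([[13, 16 - 13],
                  [ 7, 16 -  7]])

chi2, p, dof, expected = chi2_contingency(table, correction=True)   # Yates' correction
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}, OR = {odds_ratio:.2f}")
```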

Thus, the WBAN system’s real-time feedback and intuitive visualization were strongly associated with improved patient engagement (mean compliance: 78% vs. 61%, p < 0.001). This engagement correlated with slower cognitive decline in the WBAN group, as evidenced by MMSE score trends: WBAN patients exhibited a mean annual decline of 1.2 ± 0.8 points (baseline: 23.5 → 12-month: 22.3), whereas the control group declined by 2.6 ± 1.1 points (baseline: 23.1 → 12-month: 20.5; t(14) = 3.11, p = 0.008, Cohen’s d = 1.42). Similar patterns were observed in CDR scores, suggesting that sustained engagement through AD-Cloud may delay functional deterioration, consistent with longitudinal studies linking frequent health monitoring to attenuated AD progression. While our trial focused on 12-month outcomes, the integration of multimodal biomarkers (e.g., speech fluency, gait stability) in AD-Cloud enables proactive detection of preclinical decline. For instance, patients with > 15% monthly anomalies in speech metrics (captured via MEMS chips) showed a 3.2-fold higher risk of CDR escalation within 6 months (HR = 3.2, 95% CI [1.8–5.7], p = 0.002). This aligns with evidence that early lifestyle or pharmacological interventions, when guided by continuous data, can slow cognitive decline by 30–40% in prodromal AD. Moreover, the higher social activity frequency (2.3 vs. 1.1 outings/week, p = 0.03) suggests that AD-Cloud’s real-time alerts may mitigate psychosocial risk factors (e.g., isolation, depression) known to accelerate dementia. Extrapolating these results, we hypothesize that a 5-year AD-Cloud deployment could delay nursing home admission by 8–12 months, reducing societal costs by ~¥50,000 per patient annually.

Discussion

We thank the China Academy of Art, the Alzheimer’s disease specialists of the First People’s Hospital of Zhejiang Province, and the staff of the Qiantang Street community service center for their intellectual and human support.

The AD-Cloud system proposed in this study is based on the wireless body area network (WBAN) and PRENet for early monitoring of AD patients through multimodal data (image, audio, GIS, etc.), and through optimized interface design it seeks to achieve efficient interaction between cognitively impaired patients and stakeholder groups. In the experiments above, the WBAN system, as a digital technology combining wireless communication technology and sensors, significantly improved the comfort of early AD patients compared with the control group and reduced, to a certain extent, the stakeholder groups’ anxiety about their health. In the long run, it is also clear that the use of AD-Cloud for early AD patients can reduce the cost of aging care by reducing the need for professional companionship relative to the level required under traditional health monitoring. It further illustrates that WBAN monitoring technology can promote the positive role of AI technology in telehealth and enable medical visualization to bridge the gap between experts and non-experts.

Consistency analysis and extension with past research results. First, consistent with the WBAN-based chronic disease monitoring studies proposed by Rim, M. et al. and Miloud et al., this study also focuses on the monitoring function of the wireless body area network for patients’ physiological data. Rim et al. demonstrated that WBAN can effectively collect physiological data from patients with chronic diseases and achieve real-time, efficient transmission by optimizing sensor transmission and data aggregation24,26. This study extends the diversity of data collection on this basis by adopting multimodal sensor technology to capture more diversified health indicators of AD patients, such as behaviors, emotional changes, and environmental influences, in order to improve sensitivity to disease progression. Second, the interface design of this study is in line with the studies of Dodd et al. and Rich Picking et al. on adapting UIs to elderly patients, both of which advocate simplifying the design and optimizing the visual and operational processes to match the cognitive and operational abilities of elderly patients27,28. Lin et al. pointed out that a simple UI design can effectively improve the compliance of elderly users and help alleviate patients’ anxiety during operation. In this study, we further proposed “redundant information prompts” and a “single-task flow” suited to AD patients, and incorporated family members’ assistive feedback in the usability test to ensure the user-friendliness of the interface and patients’ reliance on it.

Analysis and reasons for differences from past research results. Although this study agrees with D. Chandramohan et al. and Jonell et al. in the area of multimodal data monitoring, there are significant differences in the system implementation and data processing methods25,29. Jonell et al.’s study relied more on static image and speech data, whereas the present study achieved layer-by-layer processing of the data by means of the Progressive Region Enhancement Network (PRENet), enabling the system to adapt itself to changes in the patient’s health status and perform visualization and analysis. This difference stems from the fact that the focus of this study is to improve the processing depth and accuracy of the data through multilayer networks and adaptive analysis to meet the needs of analyzing the complex and changing health states of AD patients.

The system also faces a number of limitations and challenges. The AD-Cloud system was originally designed for real-time monitoring of early-stage Alzheimer’s disease (AD) patients based on wireless body area network (WBAN) technology, but its architecture and functionality can be further extended and adapted to accommodate patients with other neurological disorders. However, the effectiveness and adaptability of the system for different types of neurological disorders (e.g., Parkinson’s disease, other dementias, multiple sclerosis) are confronted with differences in physiological parameters, differences in symptom progression, and the individualized interventions required by patients with different neurological disorders. Across different geographic regions, healthcare infrastructure is often weak in low-income and resource-poor areas, and internet and power stability may be limited, which may affect the real-time data collection and transmission of the AD-Cloud system. To mitigate this, the system integrates 1 TB of flash memory within the wearable sensors for 24-h offline data caching, ensuring that physiological and behavioral metrics are stored locally during connectivity outages; synchronization automatically resumes when networks are restored. Second, power constraints are addressed through LoRaWAN protocol optimization (125 kHz bandwidth, 10 dBm Tx power) and duty-cycled operation (active 0.1% hourly), reducing daily power consumption to 8.3 mAh. This allows a 60 mAh LiPo battery to sustain operation for about 7 days, which is critical for regions with intermittent electricity. Third, future iterations aim to support Bluetooth 5.1 mesh networking, enabling peer-to-peer data relay without cellular infrastructure, thus broadening accessibility.
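A quick back-of-the-envelope check of the battery figure quoted above; the simple "capacity divided by daily draw" model is a simplifying assumption that ignores battery aging and conversion losses.

```python
# Daily consumption quoted in the text and the battery capacity it must fit into.
daily_mAh = 8.3        # mAh per day under the duty-cycled LoRaWAN scheme
battery_mAh = 60.0     # LiPo capacity

days = battery_mAh / daily_mAh
print(f"~{days:.1f} days of operation")   # ≈ 7.2 days, consistent with the claimed 7 days
```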

Potential long-term effects on disease progression and patient outcomes. While our trial focused on short- to medium-term patient engagement and anxiety reduction, given the progressive nature of AD it is crucial to consider the long-term impact of the AD-Cloud system on disease progression and overall patient outcomes. Although our results indicate improved patient engagement and reduced anxiety, these findings are based on a relatively short observation period. Continuous monitoring via AD-Cloud’s multimodal sensors may enable earlier detection of AD progression markers. For instance, declining speech fluency (captured by MEMS chips) correlated with CDR score increases in longitudinal studies30, suggesting that real-time data could inform timely clinical interventions to slow functional decline. This would include monitoring potential changes in patient compliance and system usability over time, as well as examining the impact on disease-related outcomes, such as cognitive function and caregiver burden. By integrating both quantitative and qualitative data from long-term follow-ups, we will be able to better understand the lasting effects of this technology and its potential role in Alzheimer’s care.

While previous studies have highlighted the data privacy concerns of Alzheimer’s disease (AD) patients, particularly the sensitivity of elderly patients to privacy issues as noted by Felber et al. and Rahmoun et al.31,32, the present study further emphasizes these critical aspects. Ethical and privacy safeguards are rigorously implemented: all data transmitted via the WBAN network employ AES-256-GCM encryption with ephemeral keys rotated every 15 min, exceeding HIPAA’s minimum requirements. For GDPR compliance, edge devices anonymize data using k-anonymity (k = 5) and l-diversity (l = 3) algorithms before cloud upload, ensuring that no personally identifiable information (PII) is exposed. Audit trails documenting data access (user, timestamp, purpose) are stored on an immutable Hyperledger Fabric 2.3 blockchain, providing transparency and accountability. To address cognitive impairments in AD patients, dynamic consent is facilitated through bi-monthly touchscreen prompts (e.g., a green checkmark for continued participation), simplifying the renewal process while adhering to ethical guidelines. These measures collectively ensure compliance with international standards (HIPAA, GDPR) while maintaining patient autonomy.

Future plans. In terms of system hardware, we will conduct regular security scans of the system to ensure there are no potential vulnerabilities or security threats, including but not limited to common vulnerabilities such as cross-site scripting (XSS) and SQL injection. As data privacy and protection laws continue to evolve, we will conduct annual system compliance audits to ensure that the AD-Cloud system remains in compliance with relevant laws, ethical requirements, and industry standards (e.g., HIPAA and GDPR). Also, to alleviate the AD-Cloud system’s dependence on continuous Internet connectivity and 5G infrastructure, future iterations will integrate edge computing modules (e.g., NVIDIA Jetson Nano) with offline data caching and the low-power LoRaWAN protocol for offline data processing. In exploring the ubiquity of AD-Cloud, we plan to expand our recruitment strategy in future studies by collaborating with a broader network of medical centers, clinics, and patient advocacy groups. This will allow us to recruit more AD patients across different regions and ensure demographic and clinical diversity in our sample. Additionally, we will encourage AD patients to continue to participate in our program and extend it to 5 years to include patients with other types of degenerative neurological diseases. In terms of technology comparison, while the AD-Cloud system has shown promising results in improving patient engagement and reducing anxiety, it is important to compare this approach with other technologies to assess its relative effectiveness. Future studies will compare the AD-Cloud system with other widely used technologies, such as wearable devices and mobile health (mHealth) applications23,33,34,35, in terms of patient compliance, data accuracy, and ease of use. Such comparisons will help us better understand the advantages and limitations of the AD-Cloud system in the broader landscape of Alzheimer’s disease management technologies. In terms of multimodal data, relying on larger sample sizes and population diversity, combining biomarkers, neuroimaging, and cognitive tests with health monitoring data can provide a more complete picture of how patient engagement through the AD-Cloud system impacts disease progression.

We believe that the AD-Cloud system based on WBAN data has broader implications. Through coordination with the healthcare IoT ecosystem, it promotes multi-device data integration and centralized analysis, analyzes the integrated data through PRENet models, provides personalized recommendations, improves the accuracy of monitoring health abnormalities, and improves the quality of life of AD patients. This capability also supports health big data research by aggregating massive health data and forming trend analyses to support public health decision-making for the elderly and optimize health protection strategies.

Conclusion

This study highlights the potential of WBAN technology and multimodal data integration for early Alzheimer’s disease (AD) monitoring through the AD-Cloud system. The system leverages flexible sensors to collect real-time physiological and behavioral data from patients and integrates various types of multimodal data, such as images, audio, geographic information, and behavior, using PRENet. This approach enables efficient analysis and visualization of the health status of AD patients, significantly improving the system’s ability to detect changes in health and providing robust telecare support for early-stage AD.

The AD-Cloud system is specifically designed to meet the needs of AD patients and elderly users. It features a simple and consistent user interface, emphasizing visual clarity, task flow simplification, and fault tolerance to ensure ease of use. Through extensive testing and family collaboration, the system optimizes user experience, enhancing patient and caregiver compliance. Additionally, to protect sensitive data, AD-Cloud employs data encryption and anonymization, proactively addressing the privacy concerns of elderly users.

While the methodology and interface design align with prior studies, this research expands on existing work by introducing an adaptive data analysis method based on PRENet, which accounts for the complex, dynamic nature of AD patients’ conditions. Future studies can further explore the applicability of AD-Cloud in other neurological disorders and optimize the system’s long-term update mechanisms and data security protocols.

In conclusion, this study presents an innovative solution for Alzheimer’s disease monitoring, demonstrating the effectiveness of WBAN technology combined with multimodal data. The AD-Cloud system holds significant potential for broader application in monitoring chronic diseases.


References
