Mobile 3D Depth Sensing technologies for improved home-based healthcare assessments
The home environment falls-risk assessment process (HEFAP) is a widely used falls prevention intervention in which a clinician uses paper-based measurement guidance to ensure that appropriate information and measurements are taken and recorded accurately. Despite this guidance, over 30% of all assistive devices installed within the home are abandoned by patients, in part due to poor fit between the device, the patient, and the environment in which it is installed. HEFAP is currently a clinician-led process; however, under the personalisation agenda, older adult patients are increasingly expected to collect HEFAP measurements themselves. Without appropriate patient-centred guidance, levels of device abandonment are likely to rise further. This study presents guidetomeasure-3D, a mobile 3D measurement guidance application designed to support patients in carrying out HEFAP self-assessments.
In recent years, 3D depth sensing technologies have matured to the point that they can be integrated into lightweight mobile devices such as mobile phones and tablet computers. Depth sensing enables a device to scan an environment and record accurate measurements of that environment and the objects within it.
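As a minimal sketch of how a depth sensor enables such measurement: each pixel in a depth image carries a distance from the camera, so two user-selected pixels can be unprojected into 3D space using the pinhole camera model and the distance between them computed directly. The intrinsics and pixel coordinates below are illustrative assumptions, not values from the project.

```python
import math

def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a pixel (u, v) with sensed depth (metres) into a 3D
    camera-space point using the pinhole camera model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def measure(p1, p2):
    """Euclidean distance between two 3D points, in metres."""
    return math.dist(p1, p2)

# Hypothetical intrinsics for an illustrative 640x480 depth sensor.
fx = fy = 525.0
cx, cy = 319.5, 239.5

# Two taps on the depth image, e.g. either side of a doorway,
# both sensed at 2.0 m from the camera.
a = unproject(100, 240, 2.0, fx, fy, cx, cy)
b = unproject(540, 240, 2.0, fx, fy, cx, cy)

print(f"measured width: {measure(a, b):.3f} m")  # → measured width: 1.676 m
```

In practice, mobile frameworks such as ARKit and ARCore supply the per-frame depth data and camera intrinsics needed for this kind of unprojection; the sketch above only illustrates the underlying geometry.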
This project focuses on developing a 3D depth sensing enabled mobile application that allows patients and practitioners to 'automatically' scan an environment and measure objects within it, with a view to overcoming the challenges currently faced in HEFAP as a result of requiring practitioners and patients to take measurements by hand.
Related Research Group(s)
Interactive Multimedia Systems - Building sensor and media-rich, cross-layer, inclusive e-systems, with an interest in human-machine interaction, sensorial-based interfaces, data visualisation and multimedia.
Project last modified 21/06/2021