
Your selfie can be used to detect heart disease

Study shows it's possible to use a deep learning computer algorithm to detect coronary artery disease by analysing portraits

Washington: Sending a 'selfie' to the doctor could be a cheap and simple way of detecting heart disease, according to the authors of a new study. The study is the first to show that it's possible to use a deep learning computer algorithm to detect coronary artery disease (CAD) by analysing four photographs of a person's face.

The new study was published in the European Heart Journal.

Although the algorithm needs to be developed further and tested in larger groups of people from different ethnic backgrounds, the researchers say it has the potential to be used as a screening tool that could identify possible heart disease in people in the general population or in high-risk groups, who could be referred for further clinical investigations.

"To our knowledge, this is the first work demonstrating that artificial intelligence can be used to analyse faces to detect heart disease. It is a step towards the development of a deep learning-based tool that could be used to assess the risk of heart disease, either in outpatient clinics or by means of patients taking 'selfies' to perform their own screening. This could guide further diagnostic testing or a clinical visit," said Professor Zhe Zheng, who led the research and is vice director of the National Center for Cardiovascular Diseases and vice president of Fuwai Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, People's Republic of China.

He continued: "Our ultimate goal is to develop a self-reported application for high-risk communities to assess heart disease risk in advance of visiting a clinic. This could be a cheap, simple and effective of identifying patients who need further investigation. However, the algorithm requires further refinement and external validation in other populations and ethnicities."

It is known already that certain facial features are associated with an increased risk of heart disease. These include thinning or grey hair, wrinkles, ear lobe crease, xanthelasmata (small, yellow deposits of cholesterol underneath the skin, usually around the eyelids) and arcus corneae (fat and cholesterol deposits that appear as a hazy white, grey or blue opaque ring in the outer edges of the cornea). However, they are difficult for humans to use successfully to predict and quantify heart disease risk.

Professor Zheng, Professor Xiang-Yang Ji, who is director of the Brain and Cognition Institute in the Department of Automation at Tsinghua University, Beijing, and other colleagues enrolled 5,796 patients from eight hospitals in China in the study between July 2017 and March 2019. The patients were undergoing imaging procedures to investigate their blood vessels, such as coronary angiography or coronary computed tomography angiography (CCTA). They were divided randomly into training (5,216 patients, 90 per cent) and validation (580 patients, 10 per cent) groups.

Trained research nurses took four facial photos with digital cameras: one frontal, two profiles and one view of the top of the head. They also interviewed the patients to collect data on socioeconomic status, lifestyle and medical history. Radiologists reviewed the patients' angiograms and assessed the degree of heart disease based on how many blood vessels were narrowed by 50 per cent or more (>= 50 per cent stenosis) and where they were located. This information was used to create, train and validate the deep learning algorithm.
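The paper's code is not reproduced here, but as a rough illustration of the data preparation described in the two paragraphs above, the following Python sketch derives a binary CAD label from per-vessel stenosis readings (positive if any vessel shows >= 50 per cent narrowing) and performs a random 90/10 split into training and validation sets. All record fields, file names and values are hypothetical, not taken from the study.

```python
import random

# Hypothetical patient records: four facial photo paths plus the radiologists'
# per-vessel stenosis percentages from angiography or CCTA.
patients = [
    {"photos": ["front.jpg", "left.jpg", "right.jpg", "top.jpg"],
     "stenosis_pct": [60, 20, 10]},   # one vessel >= 50% -> CAD-positive
    {"photos": ["front.jpg", "left.jpg", "right.jpg", "top.jpg"],
     "stenosis_pct": [30, 15, 0]},    # no vessel >= 50% -> CAD-negative
]

def cad_label(record, threshold=50):
    """Binary label: 1 if any coronary vessel is narrowed by >= threshold per cent."""
    return int(any(pct >= threshold for pct in record["stenosis_pct"]))

labelled = [(p["photos"], cad_label(p)) for p in patients]

# Random 90/10 split into training and validation groups, mirroring the
# 5,216 / 580 patient split reported in the study.
random.seed(42)
random.shuffle(labelled)
cut = int(0.9 * len(labelled))
train_set, val_set = labelled[:cut], labelled[cut:]
```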

The researchers then tested the algorithm on a further 1,013 patients from nine hospitals in China, enrolled between April 2019 and July 2019. The majority of patients in all the groups were of Han Chinese ethnicity.

They found that the algorithm outperformed existing methods of predicting heart disease risk (the Diamond-Forrester model and the CAD consortium clinical score). In the validation group of patients, the algorithm correctly detected heart disease in 80 per cent of cases (the true positive rate or 'sensitivity') and correctly detected that heart disease was not present in 61 per cent of cases (the true negative rate or 'specificity'). In the test group, the sensitivity was 80 per cent and the specificity was 54 per cent.
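For readers unfamiliar with these metrics, sensitivity is the proportion of diseased patients the algorithm correctly flags, and specificity is the proportion of disease-free patients it correctly clears; a specificity of 54 per cent therefore corresponds to the 46 per cent false positive rate mentioned below. The following Python sketch shows the standard calculation on invented labels and predictions, which are not the study's data.

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute sensitivity (true positive rate) and specificity (true negative rate)
    for binary labels, where 1 = CAD present and 0 = CAD absent."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sensitivity = tp / (tp + fn)   # fraction of CAD cases detected
    specificity = tn / (tn + fp)   # fraction of healthy cases correctly cleared
    return sensitivity, specificity

# Illustrative only: five CAD-positive and five CAD-negative patients.
y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 0, 0, 0, 1, 1, 0]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
# A specificity of 54% implies a false positive rate of 100% - 54% = 46%.
```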

Professor Ji said: "The algorithm had moderate performance, and additional clinical information did not improve its performance, which means it could be used easily to predict potential heart disease based on facial photos alone. The cheek, forehead and nose contributed more information to the algorithm than other facial areas. However, we need to improve the specificity as a false positive rate of as much as 46 per cent may cause anxiety and inconvenience to patients, as well as potentially overloading clinics with patients requiring unnecessary tests."

As well as requiring testing in other ethnic groups, the study's limitations include the fact that only one centre in the test group was different from the centres that provided patients for developing the algorithm, which may further limit the algorithm's generalisability to other populations.

In an accompanying editorial, Charalambos Antoniades, Professor of Cardiovascular Medicine at the University of Oxford, UK, and Dr Christos Kotanidis, a DPhil student working under Professor Antoniades at Oxford, write: "Overall, the study by Lin et al. highlights a new potential in medical diagnostics... The robustness of the approach of Lin et al. lies in the fact that their deep learning algorithm requires simply a facial image as the sole data input, rendering it highly and easily applicable at large scale."

They continue: "Using selfies as a screening method can enable a simple yet efficient way to filter the general population towards more comprehensive clinical evaluation. Such an approach can also be highly relevant to regions of the globe that are underfunded and have weak screening programmes for cardiovascular disease. A selection process that can be done as easily as taking a selfie will allow for a stratified flow of people that are fed into healthcare systems for first-line diagnostic testing with CCTA. Indeed, the 'high risk' individuals could have a CCTA, which would allow reliable risk stratification with the use of the new, AI-powered methodologies for CCTA image analysis."

They highlight some of the limitations that Professor Zheng and Professor Ji also include in their paper. These include the low specificity of the test, that the test needs to be improved and validated in larger populations, and that it raises ethical questions about "misuse of information for discriminatory purposes. Unwanted dissemination of sensitive health record data, that can easily be extracted from a facial photo, renders technologies such as that discussed here a significant threat to personal data protection, potentially affecting insurance options. Such fears have already been expressed over misuse of genetic data, and should be extensively revisited regarding the use of AI in medicine."

The authors of the research paper agree on this point. Professor Zheng said: "Ethical issues in developing and applying these novel technologies are of key importance. We believe that future research on clinical tools should pay attention to the privacy, insurance and other social implications to ensure that the tool is used only for medical purposes."

Professor Antoniades and Dr Kotanidis also write in their editorial that defining CAD as >= 50 per cent stenosis in one major coronary artery "may be a simplistic and rather crude classification as it pools in the non-CAD group individuals that are truly healthy, but also people who have already developed the disease but are still at early stages (which might explain the low specificity observed)."
