The Ahus project is developing an algorithm to predict the risk of heart failure. The tool is based on, among other things, ECG measurements, and is intended to serve as decision support in the clinic.

Eilin Wermundsen Mork, advisor for information security at Akershus University Hospital, has answered three questions about the project:

What is the essence of the project?

- The project, EKG AI, aims to develop an algorithm that can predict heart failure. Based on, among other things, ECG measurements, we can develop a clinical decision support tool that predicts the probability that a patient will be diagnosed with heart failure. This will improve treatment: patients with a high probability of heart failure will receive faster assessment and treatment. In addition, it can shorten hospital stays and reduce readmissions and mortality.

What is it like to be one of the selected projects?

- It is exciting to get the opportunity to be part of the sandbox! The project applied last year but was not selected, so it is extra fun that we got in this round. The use of artificial intelligence in the health sector is increasing, and we want to expand our expertise in artificial intelligence so that we are better equipped to meet this development in a responsible way. We look forward to exploring unresolved issues together with the Data Protection Authority and finding solid, responsible solutions that will benefit the end users, our patients.

What do you hope to get help with in the sandbox?

- For the decision support tool to be safe to use in the clinic, we need to know that it predicts correctly. What we want to find out in the sandbox is what it takes to avoid algorithmic bias in the development and continued training of the algorithm. What legal assessments and technical measures need to be in place to prevent the decision support tool from discriminating? And how should we balance the principle of fairness and non-discrimination against the principle of data minimisation?