
BRIEF RESEARCH REPORT article

Front. Digit. Health, 24 November 2022
Sec. Health Informatics
This article is part of the Research Topic Women in Digital Health 2021.

Which one? A suggested approach for evaluating digital health maturity models

Leanna Woods1,2*, Rebekah Eden3, Rhona Duncan3, Zack Kodiyattu1, Sophie Macklin1, Clair Sullivan1,2,4
  • 1Centre for Health Services Research, The University of Queensland, Herston, QLD, Australia
  • 2Queensland Digital Health Centre, The University of Queensland, Herston, QLD, Australia
  • 3School of Information Systems, Queensland University of Technology, Brisbane, QLD, Australia
  • 4Digital Metro North, Metro North Hospital and Health Service, Herston, QLD, Australia

Background: Digital health maturity models allow healthcare organizations to evaluate digital health capability and to develop roadmaps for improving patient care through technology. Many models are available commercially for healthcare providers to assess their digital health maturity. Currently, however, there are few evidence-based methods for assessing the quality, utility, and efficacy of maturity models and selecting the most appropriate model for a given context.

Objective: To develop a framework to assess digital maturity models and facilitate recommendations for digital maturity model selection.

Methods: A systematic, consultative, and iterative process was used. Literature analyses and a stakeholder needs analysis (n = 23) were conducted to develop content and design considerations. These considerations were incorporated into the initial version of the framework, developed by the researchers in a design workshop. External stakeholder review (n = 20) and subsequent improvements strengthened and finalized the framework.

Results: The criteria of the framework cover assessment of healthcare context, feasibility, integrity, completeness and actionability. Users can compare model performance to select the most appropriate model for their context.

Conclusion: The framework provides healthcare stakeholders with a consistent and objective methodology to compare digital health maturity models, informing approaches to choosing a suitable model. This is a critical step as healthcare evolves towards a digital health system focused on improving the quality of care, reducing costs and improving the provider and consumer experience.

1. Introduction

Digital health provides unprecedented opportunities to transform healthcare (1). Like all healthcare interventions, digital health technologies need to be rigorously evaluated to ensure they improve health and care (2). Digital maturity is the extent to which digital systems are leveraged for high-quality healthcare (3). Digital health maturity models are structured evaluations that allow healthcare organizations to document their current digital state and develop roadmaps for improving patient care, health outcomes and health equity (4). A series of “dimensions” is often used to capture aspects of digital health capability such as business processes, organizational characteristics, information and people (5). Health service leaders can use models to track the evolution of the digital transformation process, to motivate or coordinate transformation activities (6), and to improve health service efficiency, effectiveness, performance and productivity (5).

National strategies outline the need to support health services in measuring and improving their level of digital maturity (7). An increasing number of maturity models are available to healthcare providers; however, it is unclear how to choose the right model for a given context. Importantly, existing maturity models often fail to focus on outcomes of value, focusing instead on the depth and success of technology implementation alone, regardless of system outcomes (2, 6, 8, 9). The importance of maturity models that can be applied effectively to the local context is increasingly recognized (6).

There is an unmet need to determine the scope and characteristics of available models (5). Our research question is: How can healthcare providers evaluate the quality and utility of digital health maturity models? The objective of this work was to create a framework to enable critical evaluation and selection of digital health maturity models by healthcare providers.

2. Methods

2.1. Setting

This work was commissioned by a federal government body, the Australian Digital Health Agency. Australia has emerging digital maturity with significant investments underway to implement digital solutions across the healthcare system.

2.2. Research design

Design Science Research cycles (10) were followed to ensure the framework was developed using foundational design principles (11), consistent with the evidence base, and relevant to intended users. Three closely related cycles of activities were undertaken (Figure 1):

• The relevance cycle relates to the environment, comprising activities with relevant stakeholders and organizational systems or structures.

• The rigor cycle relates to the knowledge base, comprising activities which acknowledge the contribution of the existing literature on digital health maturity and evaluation frameworks.

• The central design cycle is where the framework was developed, evaluated, updated and finalized.

Figure 1. Methodological approach to the development of the framework, modified from Hevner (10).

Activities were conducted across the following development process:

1. Content development—preliminary research activities to define the content and design considerations

2. Framework development—how the initial design of the framework was confirmed

3. Evaluation and update—review and improvement cycles to develop the final framework

Ethical approval for this research was provided by The University of Queensland [project number 2021/HE001314] prior to commencement. Participant consent was received prior to data collection.

2.3. Content development

Three separate research activities were undertaken to define important elements to be incorporated into the design of the framework (Table 1). Key elements for the framework development were identified through analyses in each research activity, then collated and sub-categorized into content considerations (principles to influence the content) and design considerations (principles to use for visual representation of the framework).

Table 1. Research activities to develop content and design for the framework.

2.4. Framework development

Content to be included in the framework was collated from the outputs of the research activities. Researchers translated each content consideration into a question that could be answered, and a scoring system was selected and applied. Questions were categorized into different sections for clarity and ease-of-use.

A design workshop with the research team addressed inaccuracies and gaps in the content. Design considerations informed the co-design of a visual representation of the framework. Following the workshop, the framework was updated accordingly and prepared for external consultation.

2.5. Evaluation and update

The framework was reviewed and improved in an iterative manner. Stakeholders who participated in the needs analysis were invited to review and submit feedback on the framework. The purpose of this review was to identify inaccuracies in the content, opportunities to improve the design, and the framework's perceived utility. Stakeholders submitted feedback either through a semi-structured interview with two researchers or as written feedback via email. Stakeholders were asked to reflect on and answer the following questions: What is good about the framework? What can be improved? Is there anything that is missing? How do you see it being used? What other ideas do you have? Is the structure correct?

Individual feedback was collated in a data table and thematically analysed by two researchers. Updates were incorporated into the final framework when a clear trend emerged, or when the researchers agreed.

3. Results

3.1. Content development

3.1.1. Synthesis of prior work on maturity models

Elements of 21 existing digital health maturity models from the grey and academic literature reported in the government white paper (12) were extracted across three categories:

• Dimensions of digital maturity

○ leadership and governance

○ workforce capability: digital literacy, clinical skills

○ compliance with data exchange standards

○ technical: infrastructure, architecture, security

○ patient or consumer participation

○ interoperability

○ health sector coverage

○ benchmarking

  • New and emerging dimensions of digital maturity that have gained prominence in recent years

○ user experience

○ innovation

○ organizational capability

○ clinical safety

○ adherence to government policy on data, design, infrastructure, governance and standards

○ efficiency

• Contextual considerations in which the models are being applied

○ the importance of culture in the maturation journey

○ the need for validation in the local context

○ utility by small and large health provider organizations

○ creating drivers for furthering organizational digital maturity

Seven dimensions of digital maturity were uncovered through a systematic literature review of 27 unique maturity models, published elsewhere (9). Figure 2 summarizes the dimensions of digital maturity and the corresponding indicators used to assess them:

1. Strategy: The extent to which the organization has developed and implemented a strategic plan to achieve its goals and objectives (14)

2. IT capability: The extent to which the organization has adopted and implemented IT infrastructure, digital systems, technologies, and services (15) which are usable and effective (16)

3. Interoperability: The extent to which data and information can be exchanged between systems within the organization, as well as across care settings, and with patients, caregivers, and families (17)

4. Governance and management: The extent to which the organization embraces leadership, policies and procedures, structures, risk management (quality and safety), integrated workflows, relationship building, and capacity building (18)

5. Patient-centered care: The extent to which patients, caregivers and families can actively participate in their health decisions, have access to information and health data, and co-create services and service delivery (19)

6. People, skills and behaviors: The extent to which stakeholders (internal and external) are digitally literate and motivated to leverage technology (10, 17)

7. Data analytics: The extent to which the organization uses data for effective decision making for the organization, patients, and population health (4)

Figure 2. Digital maturity dimensions and corresponding indicators elicited from the literature.

3.1.2. Stakeholder needs analysis

The research team interviewed 23 individuals across 16 interview sessions. Half of the stakeholders were Chief Information Officers; the remaining roles encompassed executives, clinicians, researchers, and state and federal government employees. Affiliations spanned government, non-government organizations, and public and private health services. The following sections report the stakeholders' and their health systems' current use of maturity models, and perceptions of the ideal maturity model including design preferences, dimensions, implementation and outputs.

3.1.2.1. Use and perceptions of maturity models

Most health district stakeholders reported having completed some form of maturity assessment. Two models [the Healthcare Information and Management Systems Society maturity model (20) and the Victorian Digital Health Maturity Model (21)] were in use, had previously been used, or were known to nearly half of these stakeholders. A small number of stakeholders reflected on their digital maturity independently, without using a formal maturity model.

Purposes of applying existing maturity models included benchmarking, driving the digital agenda, and understanding the current state, including gaps in maturity and priorities for change. Stakeholders also emphasized providing evidence for financial decisions (demonstrating return on investment or informing future investment), governance (funding, regulation, legislation), and planning next steps (informing strategy).

Few stakeholders described how maturity assessments were resourced; those who did cited support from state government, university partners or internal resourcing. Using existing models had the benefit of minimizing procurement costs and the challenges associated with implementing a new process. Lack of funding was the most common reason stakeholders had not applied a model, followed by a lack of time and effort, and re-prioritization due to the COVID-19 pandemic.

Primarily, the intended purpose for applying a maturity model in the future was to inform investment decisions or apply for targeted funding. The second most common purpose was to understand the digital maturity status on the transformation journey and inform (or continue to inform) organizational strategy.

Design preferences of maturity models elicited from interview participants are summarized in descending order of importance: Simple but useful; Evidence-based; Relevant; Patient-centered; Demonstrate business value; Sustainable.

Six dimensions of digital maturity were considered by participants as most important to measure in a model, reported in descending order of importance by stakeholders: Interoperability; Level of digitalization; Data management; Infrastructure; Workforce capability; and Governance.

3.1.2.2. Implementation considerations of maturity models

Most stakeholders agreed maturity models should be applied to multiple health service levels, most importantly at the jurisdiction and regional/networked levels. Stakeholders suggested different healthcare providers should use models developed for those specific settings. Overwhelmingly, stakeholders believed governments should have overall responsibility for models, and healthcare organizations themselves should conduct the digital maturity assessment.

While half the stakeholders desired an annual maturity assessment to monitor improvements, the other half reflected that the most logical frequency is every 2–3 years due to the time it takes to enact organizational change.

3.1.2.3. Desirable outputs of maturity models

Stakeholders consistently desired a highly visual summary of digital maturity that was easily interpreted and facilitated comparison. Stakeholders wanted the report to outline strengths, weaknesses and opportunities against best practice, as evidence to support their improvement or impact domains. Ideally, the output would include a guideline indicating what the organization can act on to advance digital maturity.

3.1.3. Analysis of existing frameworks

Key components of sampled frameworks to be considered in the design of the framework were elicited through a lightning demo activity. Elements of the frameworks (10, 22–29) with the potential to be incorporated included:

• Structure

○ reach, effectiveness, adoption, implementation, maintenance (RE-AIM)

○ inputs, activities, outputs

○ appropriateness, effectiveness, sustainability

○ descriptive, prescriptive, comparative

• Criteria

○ explanation of scores

○ questions asked

○ indicator statement

○ maturity levels for domains

○ table form with scale

○ criteria related to process, development, outcomes

• Context

○ culture

○ target group

○ health service type

○ impact area focus

○ elements of value

• Output

○ scores (weighted)

○ tally of achievement based on questions

○ short-term and long-term outcomes

• Design

○ visual

○ identify strength and weaknesses

○ spider diagram

○ pie diagram

• Validation

○ different settings

○ internal/external validation

3.1.4. Content and design considerations

The key findings from the research activities culminated in a list of content and design considerations, which was used as input to the framework (Table 2).

Table 2. Content and design considerations which informed the framework.

3.2. Framework development

A design workshop and internal review of Table 2 (summarizing content and design considerations) facilitated the development of the initial version of the framework. The document was prepared for external consultation.

3.3. Evaluation and update

Twenty stakeholders provided feedback. Stakeholders reported that the version was well structured, responding positively to the individual sections and the length. Stakeholders reported the framework was comprehensive yet generalizable to varied health services, simple and easy to use. Most responded positively both to the presence of a scoring system and to its 0, 1, 2 structure. However, most stakeholders were uncertain how to interpret the final score, largely because the total achievable scores varied across the different sections.

Several comments suggested incorporating new and emerging concepts in healthcare and digital health technology (e.g., prescriptive analytics, wearables, artificial intelligence, virtual care). Multiple individuals commented that the focus should be on patient and population outcomes. Feedback included broad suggestions to update wording or include additional content. Wording and content were updated when a clear trend emerged in the data, or when all researchers agreed on the update. Broadly, stakeholders also suggested the following general principles for updating the framework document:

• Ensure any instructions presented are clear and concise

• Update any technical language to be more easily understood

• Consolidate phrases as needed. Separate questions as needed

• Update the language to be more generic, such that it is not hospital focused

• Update the language to be more active and aspirational

• Update the language to be more patient and population focused

These principles were incorporated, and a user guide was added for additional clarity.

3.4. Framework to evaluate digital health maturity models

The purpose of the framework is to provide healthcare stakeholders with a consistent and objective methodology to compare maturity models offered by different vendors.

The framework contains five sections (see Figure 3 and Supplementary Data Sheet 1 for the full framework):

1. Assessment of healthcare context: to understand to which healthcare contexts the model could be applied. If the model cannot be applied to the appropriate healthcare context, the user may decide against proceeding with the assessment.

2. Feasibility assessment: to understand the model’s resourcing requirements and the organization’s ability to secure those requirements. It also considers the implementation requirements, accessibility of collected data, and vendor commitments to improving the model over time.

3. Integrity assessment: to evaluate the extent to which the results can be trusted and considered as accurate assessments of an organization’s digital maturity.

4. Completeness assessment: to understand the extent to which the model considers critical elements of digital maturity. Seven dimensions and 24 indicators of digital maturity were identified through a systematic literature review and refined to seven dimensions and 27 indicators through stakeholder consultation. This section assesses whether, and to what extent, the model addresses the dimensions and their respective indicators.

5. Actionability assessment: to understand if the results from the model can be used to improve healthcare outcomes, and capacity for internal and external benchmarking.

Figure 3. Framework to evaluate digital health maturity models.

The remaining sections contain criteria in the form of questions. Each question is answered using the following scale: 0 = No; 1 = Somewhat/Maybe; 2 = Yes; U = Unknown. Calculating subtotals across the sections enables identification of the strengths and weaknesses of the model. After completing the assessment for several maturity models, users can compare model performance to inform recommendations for maturity model use.
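To make the scoring mechanics concrete, the sketch below (Python) tallies section subtotals for two candidate models using the 0/1/2/U scale. It is an illustrative sketch only, not the authors' published tooling: the section names follow the paper, but the number of questions per section and the example answers are hypothetical stand-ins for the criteria in Supplementary Data Sheet 1, and excluding "Unknown" responses from subtotals is an assumption rather than a rule stated in the framework.

```python
# Minimal sketch of the scoring described above, assuming hypothetical questions
# per section and treating "Unknown" (U) as excluded from the subtotal.
SCALE = {"No": 0, "Somewhat/Maybe": 1, "Yes": 2, "Unknown": None}

def subtotal(answers):
    """Return (score, maximum achievable) for one section, ignoring 'Unknown'."""
    known = [SCALE[a] for a in answers if SCALE[a] is not None]
    return sum(known), len(answers) * 2

# Hypothetical answers for two candidate maturity models across the five sections.
models = {
    "Model A": {
        "Healthcare context": ["Yes", "Yes", "Somewhat/Maybe"],
        "Feasibility":        ["Yes", "No", "Unknown", "Somewhat/Maybe"],
        "Integrity":          ["Somewhat/Maybe", "Yes"],
        "Completeness":       ["Yes", "Yes", "Yes", "No"],
        "Actionability":      ["Yes", "Somewhat/Maybe"],
    },
    "Model B": {
        "Healthcare context": ["Yes", "Somewhat/Maybe", "No"],
        "Feasibility":        ["Yes", "Yes", "Yes", "Yes"],
        "Integrity":          ["No", "Unknown"],
        "Completeness":       ["Somewhat/Maybe", "Yes", "No", "No"],
        "Actionability":      ["Yes", "Yes"],
    },
}

for name, sections in models.items():
    print(name)
    for section, answers in sections.items():
        score, maximum = subtotal(answers)
        print(f"  {section:<20} {score}/{maximum}")
```

Comparing per-section subtotals in this way (or expressing them against each section's achievable maximum) mirrors the intended use: strengths and weaknesses become visible section by section before an overall recommendation is made.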

4. Discussion

Healthcare providers need assistance with their digital transformation strategies. We heard clearly from healthcare stakeholders that they require the results from applying a model to identify gaps in digital health, identify future directions for growth and help inform the business case for digital health investments. The framework to evaluate digital health maturity models can help healthcare providers select an appropriate model based on feasibility, integrity, completeness and actionability.

Applications of digital health maturity models are scarcely described (30). The process of applying the framework prompts users to seek responses to criteria, facilitating a transparent and fair evaluation of quality. As consulting firms and non-government organizations market maturity models, a method for healthcare providers to choose an appropriate model is important. Vendors will increasingly need to be transparent with the content of their models to enable evaluation.

Evidence of a correlation between digital maturity and healthcare outcomes is limited (3) and needs investigation. The current focus on the depth of technology implementation, rather than on health outcomes, needs to evolve. Further work is needed to correlate technology implementation with quality improvement measures such as healthcare outcomes mapped to the quadruple aim of healthcare: better patient experience, better clinician experience, improved health of the population and reduced cost of care (8).

A major benefit of utilizing this framework is to guide and enable the implementation of appropriate maturity models. In doing so, health services at all levels can benchmark, both internally and externally, improving their overall digital maturity and facilitating comparison against their peers. It is still unknown whether a single maturity model can be applied to multiple healthcare contexts, so further work is needed before specific maturity models can be recommended for use. A recent review found that the most common scale at which models are applied is “multiple hospitals” (5). Applying the framework to multiple models will uncover the model(s) most relevant to the targeted context; this remains an important next step.

4.1. Limitations

Consistent with the consultative design methodology, stakeholders involved in the design and development of the framework were purposively sampled and deliberately influenced its content. The results are therefore potentially biased and not necessarily generalizable to different healthcare settings (13). Systematic review methods were required to identify and evaluate existing digital health maturity models reported in the academic literature, complementing the government white paper. We performed several steps to improve the credibility of the results, including transparency of data collection and analysis, stakeholder review with 20 individuals and acceptance by the commissioning government agency. Further review by additional healthcare providers would help validate the framework.

4.2. Conclusion

Digital transformation is now an essential component of the strategy of every healthcare organization. A digital health maturity assessment can assist healthcare providers with their digital health strategy and with monitoring progress towards organizational goals through digital change. Currently, it is unclear how to choose a digital health maturity model. We have developed an evidence-based framework to enable assessment and comparison of digital health maturity models. This is a critical step as digital health systems evolve to focus on improving the quality of care, reducing costs and improving the provider and consumer experience.

Data availability statement

The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author/s.

Ethics statement

The studies involving human participants were reviewed and approved by The University of Queensland Human Research Ethics Committee [2021/HE001314]. Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.

Author contributions

LW and CS devised the project. RE and RD informed the literature review and assisted with the strategic direction of the research. ZK and SM contributed to data collection, analysis and reporting of results under the supervision of LW and CS. LW and CS wrote the manuscript with assistance from RE, RD, ZK and SM. All authors contributed to the article and approved the submitted version.

Funding

This research was supported by the Australian Digital Health Agency [project number DH3406].

Acknowledgments

We extend our appreciation to the many healthcare stakeholders who contributed to the development of the framework.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fdgth.2022.1045685/full#supplementary-material.

References

1. Halminen O, Chen A, Tenhunen H, Lillrank P. Demonstrating the value of digital health: guidance on contextual evidence gathering for companies in different stages of maturity. Health Serv Manage Res. (2021) 34:13–20. doi: 10.1177/0951484820971447

2. Flott K, Callahan R, Darzi A, Mayer E. A patient-centered framework for evaluating digital maturity of health services: a systematic review. J Med Internet Res. (2016) 18:e75. doi: 10.2196/jmir.5047

3. Martin G, Clarke J, Liew F, Arora S, King D, Aylin P, et al. Evaluating the impact of organisational digital maturity on clinical outcomes in secondary care in England. NPJ Digit Med. (2019) 2:41. doi: 10.1038/s41746-019-0118-9

4. Johnston DS. Digital maturity: are we ready to use technology in the NHS? Future Healthcare J. (2017) 4:189. doi: 10.7861/futurehosp.4-3-189

5. Kolukısa Tarhan A, Garousi V, Turetken O, Söylemez M, Garossi S, et al. Maturity assessment and maturity models in health care: a multivocal literature review. Digital Health. (2020) 6:1–20. doi: 10.1177/2055207620914772

6. Cresswell K, Sheikh A, Krasuska M, Heeney C, Franklin BD, Lane W, et al. Reconceptualising the digital maturity of health systems. Lancet Digital Health. (2019) 1:e200–1. doi: 10.1016/s2589-7500(19)30083-4

7. Australian Digital Health Agency. Australia's national digital health strategy: safe, seamless and secure: evolving health and care to meet the needs of modern Australia (2017). Australia.

8. Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Fam Med. (2014) 12:573–6. doi: 10.1370/afm.1713

9. Duncan R, Eden R, Woods L, Wong I, Sullivan C. Synthesizing dimensions of digital maturity in hospitals: systematic review. J Med Internet Res. (2022) 24:e32994. doi: 10.2196/32994

10. Hevner AR. A three cycle view of design science research. Scand J Inf Syst. (2007) 19:4.

11. Hevner AR, March ST, Park J, Ram S. Design science in information systems research. MIS Q. (2004) 28(1):75–105. doi: 10.2307/25148625

12. Australian Digital Health Agency. Maturity model, benefits and evaluation - update to the national interoperability steering committee (government white paper) Sydney: Australian Digital Health Agency (2021), p. 29.

13. Knapp J, Zeratsky J, Kowitz B. Sprint: How to solve big problems and test new ideas in just five days. London: Transworld Publishers (2016).

14. Carvalho JV, Rocha Á, Abreu A. Maturity assessment methodology for HISMM-hospital information system maturity model. J Med Syst. (2019) 43:35. doi: 10.1007/s10916-018-1143-y

15. Vidal Carvalho J, Rocha Á, Abreu A. Maturity of hospital information systems: most important influencing factors. Health Informatics J. (2019) 25:617–31. doi: 10.1177/1460458217720054

16. Martin G, Arora S, Shah N, King D, Darzi A. A regulatory perspective on the influence of health information technology on organisational quality and safety in England. Health Informatics J. (2020) 26:897–910. doi: 10.1177/1460458219854602

17. Krasuska M, Williams R, Sheikh A, Franklin BD, Heeney C, Lane W. Technological capabilities to assess digital excellence in hospitals in high performing health care systems: international eDelphi exercise. J Med Internet Res. (2020) 22:e17022. doi: 10.2196/17022

18. Potter I, Petersen T, D'Agostino M, Doane D, Ruiz P, Marti M. The virgin Islands national information systems for health: vision, actions, and lessons learned for advancing the national public health agenda. Rev Panam Salud Publica. (2018) 42:e156. doi: 10.26633/RPSP.2018.156

19. Grooten L, Borgermans L, Vrijhoef HJ. An instrument to measure maturity of integrated care: a first validation study. Int J Integr Care. (2018) 18:1–20. doi: 10.5334/ijic.3063

20. HIMSS Analytics. EMRAM: a strategic roadmap for effective EMR adoption and maturity. Available at: https://www.himssanalytics.org/emram (Accessed March 31, 2021).

21. Department of Health. Victoria’s digital health maturity model. VIC: Victoria State Government (2022). p. 1–5. Available at: https://www.health.vic.gov.au/quality-safety-service/victorian-digital-health-maturity-model

22. Pöppelbuß J, Röglinger M. What makes a useful maturity model? A framework of general design principles for maturity models and its demonstration in business process management (2011).

23. Deloitte. Evaluation of latrobe health innovation zone, latrobe health assembly and health advocate: draft evaluation framework executive summary. Deloitte Touche Tohmatsu.

24. Ratana K, Herangi N, Rickard D. Maniapoto freshwater cultural assessment framework. Hamilton, New Zealand: Maniapoto Māori Trust Board (2020).

25. Department of Health and Ageing. Evaluation toolkit for breastfeeding programs and projects. Canberra: Commonwealth of Australia (2012).

26. Ahmad T, Thaheem MJ. Economic sustainability assessment of residential buildings: a dedicated assessment framework and implications for BIM. Sustain Cities Soc. (2018) 38:476–91. doi: 10.1016/j.scs.2018.01.035

27. Meng X. Assessment framework for construction supply chain relationships: development and evaluation. Int J Proj Manag. (2010) 28:695–707. doi: 10.1016/j.ijproman.2009.12.006

28. Drivas G, Chatzopoulou A, Maglaras L, Lambrinoudakis C, Cook A, Janicke H. A NIS Directive compliant cybersecurity maturity assessment framework. 2020 IEEE 44th annual computers, software, and applications conference (COMPSAC). IEEE (2020). p. 1641–6

29. Ernst & Young Global Limited. Maturity assessment: global business services. Available at: https://www.ey.com/en_gl/maturity-assessment (Accessed August 9, 2021).

30. Burmann A, Meister S. Practical application of maturity models in healthcare: findings from multiple digitalization case studies. HEALTHINF (2021). p. 100–10

Keywords: digital maturity, maturity model, digital capability, capability model, digital transformation, evaluation, digital health

Citation: Woods L, Eden R, Duncan R, Kodiyattu Z, Macklin S and Sullivan C (2022) Which one? A suggested approach for evaluating digital health maturity models. Front. Digit. Health 4:1045685. doi: 10.3389/fdgth.2022.1045685

Received: 15 September 2022; Accepted: 3 November 2022;
Published: 24 November 2022.

Edited by:

Saskia M. Kelders, University of Twente, Netherlands

Reviewed by:

Jorge Calvillo-Arbizu, Sevilla University, Spain
Samar Betmouni, Sheffield Teaching Hospitals NHS Foundation Trust, United Kingdom
Felix Holl, Neu-Ulm University, Germany

© 2022 Woods, Eden, Duncan, Kodiyattu, Macklin and Sullivan. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Leanna Woods lee.woods@uq.edu.au

Specialty Section: This article was submitted to Health Informatics, a section of the journal Frontiers in Digital Health
