- 1 Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- 2 Children’s Hospital of Eastern Ontario, Ottawa, ON, Canada
- 3 Hamilton Health Sciences, Hamilton, ON, Canada
Introduction: Measurement-based care (MBC) is a structured approach to collecting patient-reported outcome measures (PROMs) to inform clinical care. While MBC is routinely used in mental health settings, its application in neurodevelopmental populations—particularly those with co-occurring physical and mental health conditions—remains limited.
Methods: MBC was implemented across three Ontario hospital Extensive Needs Service (ENS) sites in Toronto, Hamilton, and Ottawa. Selection of PROMs was informed by a focused literature review and consultation with clinicians, researchers, and family advisors. A logic model was developed to guide evaluation and link measures to anticipated outcomes. Clinicians received training and support to integrate PROMs into clinical workflows. Data were collected at baseline and at regular intervals.
Results: Between April 2023 and April 2025, 381 participants entered the ENS program, and 36 were discharged. Their duration of participation ranged from 3 to 20 months. Each site engaged clinical staff in PROM completion. Completion rates for measures were higher at baseline and declined over time. Shorter PROMs showed higher completion rates compared to longer ones.
Discussion: This novel implementation of MBC provides important insights for MBC in pediatric populations with high complexity. Early and ongoing engagement of both clinicians and families is important to success, which is also closely tied to the degree to which MBC is integrated into care processes. MBC remains necessary to guide clinical care and treatment planning for children with complex, intersecting needs. It is also helpful when evaluating new programs and generating foundational evidence on the effectiveness of therapies for this population.
1 Introduction
Measurement-based care (MBC) is a systematic approach to collecting outcome measures either before or during clinical visits to inform patient care (1), and to facilitate improvement and evaluation within clinical programs (2). Tracking growth parameters, vital signs, and laboratory markers is routine practice in many medical clinics; however, judging baseline needs, therapeutic progress, and intervention options in neurodevelopmental, behavioral, and mental health clinics more often relies on reports from children/youth and their family caregivers (1). As such, MBC in neurodevelopmental clinical settings is predominantly based on patient-/parent-reported outcome measures (PROMs). Routine implementation of PROMs using MBC strategies has been shown to improve individual patient outcomes (2), including for those who have not responded to first-line treatments (3). Beyond this impact on individual outcomes, the harmonization of outcome measures can support the development of a learning health system (4, 5), whereby analysis of outcomes across patients can be used for timely evaluation of care models and the generation of new knowledge using traditional research approaches.
Children/youth with neurodevelopmental conditions (such as autism, intellectual disability, and attention-deficit/hyperactivity disorder [ADHD]), particularly those with co-occurring mental health and/or physical health conditions, sometimes experience challenges in achieving meaningful therapeutic responses with available services (6–8). Often, these children/youth have difficulty accessing adequate community-based care (9) and can have disproportionately high use of acute care, such as emergency departments and inpatient admissions, compared to neurotypical populations (10–12). There is an urgent need to understand which treatment modalities work for which children/youth and the resulting impact in mitigating negative outcomes, such as potentially avoidable acute care visits. Given the unique needs of this population and the necessity of tailored approaches, MBC allows clinicians to evaluate and modify care in real time using data reported by the people who know them best. Coordinated implementation of MBC offers one approach to evaluate care models and their resulting impacts across a variety of domains, including service use and improved outcomes for patients.
To our knowledge, there has been little published research exploring MBC in neurodevelopmental populations to date, with comparatively more work coming from mental health clinics (1, 13, 14). One exception is the Autism Care Network (previously named the Autism Learning Health Network) (15). The aims of this North American network were to reduce variations in care across member sites and emphasize best practices, with a focus on inclusion of previously underrepresented groups. Unfortunately, this network focused on only one condition (autism), and no further or longitudinal results have been published from it. The implementation of longitudinal MBC is key to generating much-needed data to inform treatment modalities for children/youth with combined neurodevelopmental, behavioral, and mental health conditions—particularly those who have not responded to first-line therapies—as well as to inform supports for caregivers/families. The objective of this work was to describe the development and initial implementation of MBC in the Extensive Needs Service (ENS), a novel program funded by the Ontario government to provide wraparound care to children/youth with intersecting developmental, physical, mental health, and social complexities.
2 Methods
While the outcomes presented here do not include any personal health information, each site still obtained research ethics approval, including opt-out/waiver of consent for data collection.
2.1 Clinical setting
The ENS is a dedicated pathway offering specialized services for children and youth with complex, unmet needs in Ontario to promote optimal outcomes (16). The service is a client-centered program structured around interdisciplinary, trauma-informed, accessible care tailored to each client’s needs. Specialized, wraparound services in ENS are jointly funded through Ontario Health via the Ministry of Health (MOH) and the Ministry of Children, Communities and Social Services (MCCSS), and are delivered through specialized centers able to provide continuity of care throughout childhood and adolescence. Three Ontario hospitals were funded as part of a 3-year proof of concept of the ENS program: Holland Bloorview Kids Rehabilitation Hospital (HB), McMaster Children’s Hospital (MCH), and the Children’s Hospital of Eastern Ontario (CHEO).
Clients access ENS through various referral pathways, including external community providers (e.g., physicians, education professionals, child welfare agencies), internal providers (i.e., cross-departmental referrals within the hospital), and, in some regions, family self-referral. After a referral is received, the referral source and/or family caregivers and, when relevant, the child/youth are contacted to gather more information to help inform next steps. During this session, ENS team members collect preliminary information to inform initial service provision, as well as any further information required to determine eligibility. Referred children/youth are subsequently discussed at ENS team meetings for preliminary treatment planning. The exact complement of service providers differs across ENS sites; however, services generally include specialized behavior services, mental health assessment and treatment, social work, care coordination, occupational therapy, speech-language pathology, medication management, respite (or assistance accessing community respite), and transition support to community service providers. ENS is not strictly time-limited, and discharge is based on several factors, including the child/youth’s progress, goal attainment, and overall readiness to access community supports/programs. Outcome measures are completed at baseline and throughout the program and will be described in further detail below.
2.2 Participants
Inclusion criteria for the program include: i) age up to 18 years; ii) co-occurring neurodevelopmental conditions or an acquired brain injury and mental health condition(s), as well as possible chronic physical health condition(s); iii) existing needs that are not currently met with respect to: a) challenging/interfering behaviors (lasting at least 12 months or escalating over the past 6 months); b) having already accessed several healthcare/service providers across sectors and/or being unable to engage in services due to extraordinary circumstances within the family system; and c) safety concerns that are a barrier to participation in home, school, or community settings. Caregiver/family complexity is also considered, informing unmet needs of the family unit.
2.3 Implementation of MBC in ENS
Here we describe the initial phases of implementation of MBC within ENS, including the identification of measures, consultation with the clinical team, development of the logic model, and pilot implementation.
2.3.1 Focused literature review
Given the time-sensitive nature of establishing the Extensive Needs Service across three sites in Ontario, the research team undertook focused literature reviews in two areas:
i. identifying similar programs to determine their care models and evaluation strategies; and
ii. identifying measures that would capture each key program aim.
Program aims were co-developed with leadership from both funding ministries (MOH/MCCSS). One team member conducted the literature search and filtered results. The search was conducted using MEDLINE, PubMed, CINAHL, and PsycINFO databases from 3–13 February 2023, with articles from 2008 onward. Search terms included the following: fragility; fragility of child; fragility of family; vulnerable populations; complexity; complexness; co-occurring conditions; developmental complexities; social vulnerabilities; evidence-based treatment; trauma-informed treatment; measurement-based care; unmet needs; behavior/symptom improvement; high-intensity user. English-language peer-reviewed studies were included.
Five programs akin to the ENS vision were identified in the United States, United Kingdom, and Australia. These interventions used a combination of quantitative and qualitative methods to evaluate their programs, including pre- and post-intervention measurement tools (17) and interviews with families and clients (18–20). When reviewing program models, the Access to Tailored Autism Integrated Care protocol, implemented in California, United States, highlighted the success of having an identified clinical champion who connected with the research team on a biweekly basis to evaluate implementation of the program (19). The Children and Young People’s Health Partnership Evelina London model emphasized the importance of incorporating family perspectives early on to determine the key components considered essential for effective integrated care services for children and families (21). This was exemplified when families described “integrated service as providing holistic care, within a family,” highlighting the patient’s nurse and pediatrician discussing the care plan together.
Through a combination of literature review, consultation with adjacent programs, and discussion with ENS clinicians, we identified multiple candidate measures related to clinical symptom improvement across mental health and behavioral presentations: the Child Behavior Checklist (22), the Behavioral Assessment System for Children (3rd edition) (23), and the Aberrant Behavior Checklist (24). Quality of life measures (25, 26) were added as an additional way to assess clinical symptoms and unmet needs. Measures of adaptive skills (i.e., daily functioning) were included to further understand client needs. To measure family distress levels, the Brief Family Distress Scale (BFDS) (27) was identified to capture fragility at the family level. The Resource Use Questionnaire (RUQ) was identified to measure resource use in autistic children (28); this was modified with permission from the authors to capture a transdiagnostic clinical group. Once possible measures for each domain were identified, we reviewed the list and began narrowing it down based on relevance, sensitivity to change, inclusive language, and completion burden (see Appendix 1).
2.3.2 Consultation with ENS clinicians
Findings from the reviews were presented by the research team to the initial members of the ENS Research and Evaluation Working Group, which consisted of a project manager from CHEO; a program director, clinical manager, clinical/research lead, behavior therapists, and social workers from HB; and a program director, clinical/local research lead, researcher, data research coordinator, and various clinicians (e.g., psychologists, social workers, psychiatrist, behavior therapist) from MCH. CHEO also consulted with its Neurodevelopmental Health Patient and Family Advisory Committee (NDH-PFAC) at this stage. At this time, ENS clinicians suggested that a standardized way of setting treatment goals and measuring progress would be beneficial. Findings were subsequently presented to the ENS clinical working group, whose membership included clinicians and operational leads across the three sites. Working group members reviewed the included tools at the item level and flagged language perceived as overly pathologizing, particularly in relation to behavior. Candidate measures were also reviewed for relevance and anticipated completion burden. Certain aspects of the RUQ were deemed important to measure on a quarterly basis, including acute care usage (emergency department visits and inpatient admissions), interventions from law enforcement or paramedics, missed school days, and parent/caregiver missed workdays. Together, the ENS clinical and evaluation working groups determined that the measurement battery would be applied to ENS clients receiving the interdisciplinary wraparound component of ENS (i.e., multiple types of services through the program) with an expected treatment duration of at least 6 months.
2.3.3 Evaluation approach
Based on the results of the focused literature review and initial identification of potential clinically relevant PROMs, we developed an evaluation framework and logic model (Figure 1) to organize and contextualize the broader program evaluation. Logic models have been used successfully in program planning, implementation, and primary care, and are useful in “providing a common approach to integrating planning, implementation, and evaluation” (54). The logic model allowed our team to link the measures with their corresponding clinical domains (high-intensity users, complexity, fragility, unmet needs, and clinical symptom improvement) and create a visual representation of anticipated short-, medium-, and long-term outcomes (55). This helped inform the shared mission, vision, and objectives of the evaluation (56).
Figure 1. Logic model. CEO, Chief executive officer; MPOC-20, Measures of Processes of Care; QoL, Quality of life; HRQoL, Health-related quality of life; PGH-7, Pediatric Global Health Measure; ED, Emergency department; BASC-3, Behavior Assessment System for Children; OAP URS, Ontario Autism Program Urgent Response Service; CAS, Children’s Aid Services; CSN, Complex Special Needs.
2.3.4 Pilot phase of outcome measures
During the pilot phase, which lasted 1 month, the evaluation team first introduced the PROMs to the ENS clinical team, leading training sessions and providing other opportunities to review the measures prior to using them with clients and families. These training sessions helped clinicians feel more prepared if caregivers had questions about any of the measures or if families requested support while completing them. Where available, standardized tools were accessed through their respective online platforms. The Pediatric Global Health Measure (PGH-7) (25) and BFDS were built into REDCap so that these PROMs could also be completed by families online. Families were sent PROMs monthly via REDCap to complete independently or, if required, with real-time support from a clinician or a member of the evaluation team, provided in person, virtually, or by phone. When required, interpreters were available so that non-English-speaking families could complete the PROMs. Evaluation team members at HB and CHEO built two of the measures (BFDS and RUQ) into the hospital electronic medical record (EMR), making it easier for clinicians to enter these data directly into the EMR during clinical intake interviews.
Using REDCap to collect PROMs made it easier for the evaluation team to create graphics and visuals to share with clinicians during weekly rounds. Pairing PROMs with clinical data helped the team generate a more complete report on each client’s progress and determine next steps and recommendations (57).
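For teams setting up a similar workflow, the sketch below illustrates one way that data collected in REDCap could be exported and summarized for sharing at rounds. It is a minimal example only: the URL, API token, and field names are placeholders and do not reflect the actual ENS project configuration or reporting pipeline.

```python
import requests
import pandas as pd

# Placeholder project details; the URL, token, and field names below are
# illustrative only and do not reflect the actual ENS REDCap project.
REDCAP_URL = "https://redcap.example-hospital.ca/api/"
API_TOKEN = "YOUR_API_TOKEN"

# Export a flat set of records through the standard REDCap API.
payload = {
    "token": API_TOKEN,
    "content": "record",
    "format": "json",
    "type": "flat",
    "fields[0]": "record_id",
    "fields[1]": "bfds_total",   # hypothetical BFDS score field
    "fields[2]": "bfds_date",    # hypothetical completion-date field
}
records = requests.post(REDCAP_URL, data=payload, timeout=30).json()

# Tabulate how many BFDS forms were completed each month, a simple summary
# that could feed the graphics shared with clinicians at weekly rounds.
df = pd.DataFrame(records)
df["bfds_date"] = pd.to_datetime(df["bfds_date"], errors="coerce")
df = df.dropna(subset=["bfds_date"])
monthly_counts = df.groupby(df["bfds_date"].dt.to_period("M")).size()
print(monthly_counts)
```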
2.3.5 Refinement of outcome measure battery and logic model
Pilot experiences were brought back to the ENS Evaluation Working Group for review and discussion. Completion of the longer tools (the Adaptive Behavior Assessment System, Third Edition [ABAS-3] and the BASC-3) was consistently identified as a challenge. The Working Group noted that the BASC-3 included an Adaptive Skills subscale, which could be redundant with the ABAS-3. For this reason, the ABAS-3 was removed from the battery. The program logic model was updated accordingly. Completion rates of measures were collected at the site level and compiled across tools and sites for the BFDS, RUQ, BASC-3, and PGH-7. The Goal Attainment Scale (GAS) was implemented later than the other tools and will not be presented here. Details of the final battery of outcome measures are available in Table 1. The data presented below (Figures 2, 3) represent PROM completion rates from April 2023 to April 2025.
Figure 2. Total number of ENS clients. This figure represents the total number of clients at each monthly time point in the Extensive Needs Service and illustrates how client numbers fluctuate across varying durations of service.
Figure 3. BFDS, PGH-7, RUQ, and BASC-3 completion rates (quarterly). This figure shows the response rate for all measures included in the battery across the Tri-Organization sites (Holland Bloorview, McMaster Children’s Hospital, and Children’s Hospital of Eastern Ontario). Response rates are calculated every 3 months, apart from the BASC-3, which is administered every 6 months. The PGH-7 Child is completed only when the child is able to answer the questions independently.
Completion rates were calculated by dividing the number of participants who completed each measure at a specific time point (numerator) by the total number of participants expected to be at that time point (denominator). The denominator was determined based on each participant’s service start date (i.e., when they completed their initial evaluation assessments). Participants were then tracked over time and included in the denominator for each time point based on their service start date. If a participant was discharged, they were included in the denominator only for the period during which they were enrolled in the program.
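To make this calculation concrete, the sketch below computes quarterly completion rates from a simplified participant-level table. It is an illustrative example only: the file layout, column names, and snapshot date are hypothetical and do not represent the actual ENS data structure.

```python
import pandas as pd

# Hypothetical flat file: one row per participant, with a service start date,
# an optional discharge date, and a completion flag for each measure at each
# time point (e.g., "bfds_m3_done"). Column names are illustrative only.
df = pd.read_csv("ens_participants.csv", parse_dates=["start_date", "discharge_date"])

SNAPSHOT = pd.Timestamp("2025-04-30")   # end of the reporting window
QUARTERS = range(0, 21, 3)              # 0, 3, 6, ..., 18-month time points

def completion_rate(data, measure, months):
    """Proportion of participants due at a time point who completed the measure.

    Denominator: participants whose start date puts them at least `months` into
    the service by the snapshot date and who were not discharged before reaching
    that time point. Numerator: those of them flagged as having completed it.
    """
    due_date = data["start_date"] + pd.DateOffset(months=months)
    still_enrolled = data["discharge_date"].isna() | (data["discharge_date"] >= due_date)
    denominator = (due_date <= SNAPSHOT) & still_enrolled
    numerator = denominator & data[f"{measure}_m{months}_done"].fillna(False)
    return numerator.sum() / denominator.sum() if denominator.any() else float("nan")

rates = {m: {f"month_{q}": completion_rate(df, m, q) for q in QUARTERS}
         for m in ["bfds", "ruq", "pgh7_parent"]}
print(pd.DataFrame(rates).round(2))
```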
3 Results
The ENS MBC logic model is presented in Figure 1, and the streamlined outcome measure battery and measurement time points are presented in Figure 4. Implementation of outcome measures in ENS began on 27 April 2023 and is ongoing.
Figure 4. Outcome measures timeline. BFDS, Brief Family Distress Scale; PGH-7, Pediatric Global Health Measure; GAS, Goal Attainment Scale; RUQ, Resource Use Questionnaire; BASC-3, Behavior Assessment System for Children; MPOC-20, Measures of Processes of Care; REDCap (Research Electronic Data Capture), a secure web application for building and managing online surveys and databases.
3.1 HB site implementation
At HB, measure completion was built around the primary clinician model adopted as part of ENS. In this model, the main clinician working with the client is identified as their “primary clinician” and acts as the main point of contact for the client’s care (59). This clinician is then responsible for ensuring that the client’s PROMs are completed. The evaluation team requested feedback from clinicians and received comments highlighting both benefits and drawbacks of their experiences. Families reported that it was helpful to have ENS clinicians or program evaluation staff support them in completing the measures, with many expressing concerns about managing PROM completion on their own.
In the initial program delivery at HB, the ENS team would decide on the complement of services that the child would receive. The first available clinician from this team would see the child and their family, triggering the initiation of measure completion, and other services would join as available. In practice, some services took longer to start, and families provided feedback that they still felt as though they were waiting for aspects of ENS. Based on this feedback, the model recently changed so that ENS begins only once all services are ready to start with the child and their family.
3.2 CHEO site implementation
CHEO initially followed a process similar to HB but has transitioned to a streamlined start model, in which families begin once a “pod” clinical team—consisting of a behavioral analyst, service planning coordinator, and social worker—is available to support the interdisciplinary assessment with families entering ENS. The ENS outcome measures are fully clinically integrated, with measures completed live with families and data inputted directly into CHEO’s EMR (EPIC). Once this assessment is completed, the clinical team begins to enlist additional members based on family needs and goals. These additions may include a respite worker, child and youth counselor, service planning coordinator, behavior therapists, and members of the medical team (i.e., occupational therapist, speech-language pathologist, registered nurse, registered dietitian, or psychologist). Team members are responsible for conducting additional clinically relevant assessments that support development of ENS goals, the treatment plan, and benchmarks for discharge criteria. Once the treatment plan has been established and agreed upon by the clinical team and family, the treatment phase begins, and subsequent outcome measures are completed with the client and their family. The program evaluation team remains actively involved in supporting clinicians with the use of outcome measures, including providing training on administration, interpretation, and use of tracking sheets. They also attend clinical rounds to offer real-time support, answer questions, and ensure clinicians feel confident and equipped throughout the data collection process.
3.3 MCH site implementation
At MCH, the family participates in a welcome meeting with the social worker and integrated service consultant, followed by initiation of a family-based assessment. More specifically, the Family Check-Up® (FCU) (60, 61)—a brief, tailored, ecological intervention developed to decrease childhood emotional and behavioral challenges and improve caregiver well-being—is delivered to families when feasible and appropriate. Following the initial assessment, each client is presented to an interdisciplinary pod team consisting of the assigned social worker and integrated service consultant, as well as a behavior analyst, behavior therapist, additional social workers, speech-language pathologist, occupational therapist, dietitian, psychologist, and pediatrician, to support further assessment and treatment planning.
The integrated service consultant serves as the primary point of contact for the family in the program and is responsible for managing and ensuring the completion of outcome measures using REDCap. They use the platform to schedule follow-up assessments, share secure links with families to complete the outcome measures independently, and provide secure access to other care team members who may assist in the completion process. In most cases, the integrated service consultant directly supports the family in completing the outcome measures during appointments. In addition, they actively monitor the status of each measure within REDCap to track progress and identify whether any follow-up actions are required to ensure timely and complete data collection.
As shown in Figure 3, CHEO and MCH have higher completion rates for measures compared to HB. This may be related to the earlier clinical integration of PROMs at CHEO and MCH (i.e., a clinician is responsible for ensuring measure completion). Having measures delivered by a clinician and recorded directly into the EMR or REDCap increases accessibility for clinicians and families and decreases logistical complexity by avoiding paper-and-pencil administration or manual data entry (57). These are all factors that can influence completion rates for MBC. While HB had the measures built into REDCap, the lower completion rates suggest that families may be more successful when completing these measures with their primary clinician.
Clinicians across the Tri-Organization sites requested additional support and training on how to co-create GAS goals and facilitate goal-setting conversations with clients and families. As a result, rollout of the GAS lagged behind that of the other PROMs.
3.4 Completion of measures
Three hundred and eighty-one clients across sites (CHEO: 108, Holland Bloorview: 124, McMaster: 149) have entered the interdisciplinary intensive wraparound stream of ENS as of April 2025 (Figure 2). Based on the date range of April 2023 to April 2025, clients have been in ENS for varying durations, ranging from 3 months to 20 months. We see a decline in numbers over time because some clients have not yet reached later time points or were discharged prior to 20 months. This is noticeable at the 12-month mark, when the number of clients drops below 100 and continues to decrease over time. Thirty-six clients have been discharged from the program.
Completion rates have varied across sites, measures, and clinical time points (Figure 3). Baseline response rates were generally high across all measures, with sites reaching up to 97% for the BFDS, while rates were lower for the PGH-7 Child and the BASC-3. By Month 18, only the BFDS and PGH-7 Parent show notable retention (up to 83% at McMaster), while completion rates for other measures fall to very low levels (often below 20%).
Completion rates for the BFDS are provided in Figure 3. Other tools—including the PGH-7 (completed by parents for all clients and by clients themselves when able), the RUQ (full version at baseline and abbreviated version quarterly), and the BASC-3—are completed either every 3 or 6 months; completion of these tools is also reported in Figure 3.
4 Discussion
In this paper, we have detailed the design and implementation of MBC in an innovative new multi-site program, ENS, designed to meet the needs of children and families with intersecting developmental, behavioral, mental health, physical, and social complexities. Implementing MBC in the context of this program has involved careful consideration of its aims and target population, repeatedly seeking input from clinicians, and measuring progress during implementation. Because ENS represents a new, cross-sectoral program in the service landscape, MBC is essential for tracking individual client progress, as well as site-level and program-level achievements and challenges. Indeed, our initial results show that further improvements are needed to optimize completion of measures and integration of those measures with clinical activities. Our implementation has provided many instructive lessons for similar programs seeking to use MBC.
Our initial literature search identified few comparable programs using MBC, particularly when considering the transdiagnostic nature of ENS. Our group of measures aimed to capture wide-ranging program aims, including clinical symptom improvement, better quality of life, reduced reliance on acute care, and improved family well-being. Compared to other programs, one notable weakness of our approach was the lack of robust engagement of families and youth in the early planning stages (21). Now that the program is established and operating at full capacity, our team has a clearer understanding of the types of clients and families in ENS. Accordingly, we have recruited a lived experience advisory group specifically for the ENS evaluation, composed of families who have participated in the program. We have also regularly solicited feedback from clinicians about the measures and adapted the measure battery accordingly.
Looking across ENS sites, measurement completion rates are linked with the degree of family and clinician engagement. For example, lower completion rates helped us identify necessary changes to clinical service delivery at one of the sites, such as waiting until all indicated services were available so that the child and family experience true wraparound care and greater engagement with the program. Responsibility for ensuring completion of outcome measures has shifted from evaluation staff to program clinicians. Martin-Cook et al. (62) also identified the importance of clinician engagement in implementing MBC. They described key aspects to improve this engagement, including focusing on the impact of MBC on client care, working with clinicians to develop a list of barriers, and regularly meeting to address these. To emphasize the institutional benefits of MBC, Cooper et al. (63) included project status updates at monthly clinical division meetings as a way to keep staff engaged with the project and maintain momentum.
Training was another variable that played a significant role in clinician engagement, helping to ensure that the team felt prepared and comfortable implementing MBC during their appointments. The evaluation team at HB scheduled several training sessions during the inception of ENS to ensure that this was part of onboarding for all program staff. As described by Childs and Connors (64), establishing training in MBC processes and mandating it for newly onboarded clinicians is essential. To make these trainings accessible, the evaluation team sent polls to determine preferred meeting times and delivery methods (in person or videoconference). Multiple sessions were held to accommodate vacation and sick days. During these trainings, the evaluation team provided an overview of the outcome measurement tools, including how long each tool takes to complete, validation information, and common methods for using the data to inform clinical care.
Working alongside the clinical team in MBC helped the evaluation team recognize when different families required different supports to complete PROMs and identify the best ways to support them. Support options included meeting with parents/caregivers during visits while the client received care or completing measures together remotely (phone/videoconference), sometimes with support from an interpreter. Although this could add time to the visit, it was vital that families felt supported when completing the PROMs. Donelan et al. describe a similar situation in which a caregiver became disengaged from the program due to the amount of contact and information from the team; by collaborating with this parent, they were able to identify strategies that made engagement easier, such as limiting the number of treatment suggestions shared at one time (65).
Families entering the ENS program are facing significant unmet needs, which contribute to considerable stress and feelings of being overwhelmed. In this context, the task of completing measures can feel burdensome, potentially contributing to lower completion rates. Furthermore, some caregivers may face challenges related to cognitive capacity, literacy, or language barriers, highlighting the need for appropriate accommodations such as simplified materials or translation services to support equitable participation. Barron (66) identified appropriate selection of PROMs as a key strategy to improve completion rates. Our original battery included two longer measures, the ABAS-3 and the BASC-3. The ABAS-3 was subsequently removed based on feedback from clinicians and families about the burden of completing this measure. BASC-3 completion rates remain lower than those for other measures; for this reason, we are investigating whether a shorter measure can provide similar information and demonstrate sensitivity to change. Future evaluations of this population can include biomarkers, such as neuroimaging and EEG data, alongside PROMs, and may also apply advanced analytic approaches.
5 Limitations
Our implementation is not without shortcomings. Due to the need for timely identification of measures, we did not conduct a systematic review or a review of the gray literature. As such, we may have missed relevant programs or measures. As noted above, consultation with family caregivers and youth was deferred until we had a clearer understanding of the types of children and families who would be served by the program. This paper describes our implementation; program-wide clinical outcomes will be described in a forthcoming paper.
6 Conclusions
We implemented MBC in a novel, needs-based program by considering program aims, examining existing literature, and soliciting repeated input from clinicians. MBC will enable data-driven decisions for individual clients, program sites, and the multi-site program. Consideration of measurement burden and integration into clinical processes is key to improving PROM completion. Future evaluation will focus on sustainability, clinician buy-in, feedback utilization, and the relationship between MBC uptake and clinical outcomes.
Data availability statement
The original contributions presented in the study are included in the article/supplementary material. Further inquiries can be directed to the corresponding authors.
Author contributions
VR: Writing – original draft, Writing – review & editing. ZS: Writing – original draft, Writing – review & editing. GF: Writing – original draft, Writing – review & editing. RB: Writing – review & editing. KD: Writing – review & editing. ND: Writing – review & editing. ID: Writing – review & editing. JE: Writing – review & editing. LH: Writing – review & editing. ThJ: Writing – review & editing. TaJ: Writing – review & editing. TL: Writing – review & editing. KM: Writing – review & editing. TM: Writing – review & editing. NM: Writing – review & editing. SS: Writing – review & editing. AD: Writing – review & editing. RW: Writing – review & editing. MP: Writing – original draft, Writing – review & editing.
Funding
The author(s) declared financial support was received for this work and/or its publication. The Extensive Needs Service is supported by joint funding through Ontario Health, via the Ministry of Health and the Ministry of Children, Communities and Social Services.
Acknowledgments
Thank you to all Extensive Needs Service clinicians, family members and clients across the three sites for contributing to this project. Without you all, none of this knowledge would exist.
Conflict of interest
MP has done paid consulting for the following (unrelated to this publication): Addis & Associates, the Province of Nova Scotia, and MedCounsel.
The remaining author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The authors MP and AD declared that they were editorial board members of Frontiers at the time of submission. This had no impact on the peer review process or the final decision.
Generative AI statement
The author(s) declare that Generative AI was not used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
1. Lewis CC, Boyd M, Puspitasari A, Navarro E, Howard J, and Kassab H. Implementing measurement-based care in behavioral health: A review. JAMA Psychiatry. (2019) 76:324–35. doi: 10.1001/jamapsychiatry.2018.3329
2. Fortney JC, Unützer J, Wrenn G, Pyne JM, Smith GR, Schoenbaum M, et al. A tipping point for measurement-based care. Psychiatr Serv. (2017) 68:179–88. doi: 10.1176/appi.ps.201500439
3. Lambert MJ, Whipple JL, Hawkins EJ, Vermeersch DA, Nielsen SL, and Smart DW. Is it time for clinicians to routinely track patient outcome? A meta-analysis. Clin Psychol Sci Pract. (2003) 10:288. doi: 10.1093/clipsy.bpg025
4. Greene SM, Reid RJ, and Larson EB. Implementing the learning health system: from concept to action. Ann Intern Med. (2012) 157:207–10. doi: 10.7326/0003-4819-157-3-201208070-00012
5. McDonald PL, Foley TJ, Verheij R, Braithwaite J, Rubin J, and Harwood K. Data to knowledge to improvement: creating the learning health system. BMJ. (2024) 384:e076175. doi: 10.1136/bmj-2023-076175
6. Donovan M. (2020). Available online at: https://www.health.mil (Accessed September 1, 2025).
7. Baribeau D, Vorstman J, and Anagnostou E. Novel treatments in autism spectrum disorder. Curr Opin Psychiatry. (2022) 35:101–10. doi: 10.1097/YCO.0000000000000775
8. Correll CU, Cortese S, Croatto G, Monaco F, Krinitski D, and Arrondo G. Efficacy and acceptability of pharmacological, psychosocial, and brain stimulation interventions in children and adolescents with mental disorders: an umbrella review. World Psychiatry. (2021) 20:244–75. doi: 10.1002/wps.20881
9. Weiss JA, Isaacs B, Diepstra H, Wilton AS, Brown HK, and McGarry C. Health concerns and health service utilization in a population cohort of young adults with autism spectrum disorder. J Autism Dev Disord. (2018) 48:36–44. doi: 10.1007/s10803-017-3292-0
10. Bebbington A, Glasson E, Bourke J, De Klerk N, and Leonard H. Hospitalisation rates for children with intellectual disability or autism born in Western Australia 1983–1999: a population-based cohort study. BMJ Open. (2013) 3:e002356. doi: 10.1136/bmjopen-2012-002356
11. Lunsky Y, Lin E, Balogh R, Klein-Geltink J, Wilton AS, and Kurdyak P. Emergency department visits and use of outpatient physician services by adults with developmental disability and psychiatric disorder. Can J Psychiatry. (2012) 57:601–7. doi: 10.1177/070674371205701004
12. Tint A, Robinson S, and Lunsky Y. Brief Report: Emergency department assessment and outcomes in individuals with Autism Spectrum Disorders. J Dev Disabil. (2011) 17:56–9.
13. Zhu M, Hong RH, Yang T, Yang X, Wang X, and Liu J. The efficacy of measurement-based care for depressive disorders: systematic review and meta-analysis of randomized controlled trials. J Clin Psychiatry. (2021) 82:37090. doi: 10.4088/JCP.21r14034
14. Scott K and Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. (2015) 22:49–59. doi: 10.1016/j.cbpra.2014.01.010
15. Murray DS, Anixt JS, Coury DL, Kuhlthau KA, Seide J, and Kelly A. Transforming an autism pediatric research network into a learning health system: lessons learned. Pediatr Qual Saf. (2019) 4:152. doi: 10.1097/pq9.0000000000000152
16. Holland Bloorview Kids Rehabilitation Hospital. Extensive Needs Services (2025). Available online at: https://hollandbloorview.ca/services/programs-services/extensive-needs-service (Accessed September 1, 2025).
17. Bennett SD, Cross JH, Coughtrey AE, Heyman I, Ford T, and Chorpita B. MICE—Mental Health Intervention for Children with Epilepsy: a randomised controlled, multi-centre clinical trial evaluating the clinical and cost-effectiveness of MATCH-ADTC in addition to usual care compared to usual care alone for children and young people with common mental health disorders and epilepsy—study protocol. Trials. (2021) 22:1–16. doi: 10.1186/s13063-020-05003-9
18. Frakking TT, Waugh J, Teoh HJ, Shelton D, Moloney S, and Ward D. Integrated children’s clinic care (ICCC) versus a self-directed care pathway for children with a chronic health condition: a multi-centre randomised controlled trial study protocol. BMC Pediatr. (2018) 18:1–9. doi: 10.1186/s12887-018-1034-x
19. Stadnick NA, Aarons GA, Martinez K, Sklar M, Coleman KJ, and Gizzo DP. Implementation outcomes from a pilot of “Access to Tailored Autism Integrated Care” for children with autism and mental health needs. Autism. (2022) 26:1821–32. doi: 10.1177/13623613211065801
20. Tucker PW, Bull R, Hall A, Moran TP, Jain S, and Sathian U. Application of the RE-AIM framework for the pediatric mild traumatic brain injury evaluation and management intervention: a study protocol for program evaluation. Front Public Health. (2022) 9:740238. doi: 10.3389/fpubh.2021.740238
21. Satherley R-M, Lingam R, Green J, and Wolfe I. Integrated health Services for Children: a qualitative study of family perspectives. BMC Health Serv Res. (2021) 21:1–13. doi: 10.1186/s12913-021-06141-9
22. Achenbach TM and Rescorla LA. Manual for the ASEBA school-age forms and profiles. University of Vermont Research Centre for Children, Youth & Families (Burlington, VT: ASEBA) (2001).
23. Reynolds CR and Kamphaus RW. Behavior Assessment System for Children, Third Edition (BASC-3). Pearson Assessments. Bloomington, MN: NCS Pearson, Inc. (2015).
24. Aman MG, Singh NN, Stewart AW, and Field CJ. The Aberrant Behavior Checklist: a behavior rating scale for the assessment of treatment effects. Am J Ment Deficiency. (1985) 89:485–91.
25. Forrest CB, Bevans KB, Pratiwadi R, Moon JH, Teneralli RE, and Minton JM. Development of the PROMIS® pediatric global health (PGH-7) measure. Qual Life Res. (2014) 23:1221–31. doi: 10.1007/s11136-013-0581-8
26. Reynolds CR and Kamphaus RW. Behavior Assessment System for Children, Third Edition (BASC-3). Pearson Assessments. Bloomington, MN: NCS Pearson, Inc. (2015).
27. Weiss JA and Lunsky Y. The brief family distress scale: A measure of crisis in caregivers of individuals with autism spectrum disorders. J Child Family Stud. (2011) 20:521–8. doi: 10.1007/s10826-010-9419-y
28. Ungar WJ, Tsiplova K, Millar N, and Smith IM. Development of the resource use questionnaire (RUQ–P) for families with preschool children with neurodevelopmental disorders: validation in children with autism spectrum disorder. Clin Pract Pediatr Psychol. (2018) 6:164–78. doi: 10.1037/cpp0000226
29. Rojahn J. The Behavior Problems Inventory: an instrument for the assessment of self-injury, stereotyped behavior, and aggression/destruction in individuals with developmental disabilities. J Autism Dev Disord. (2001) 31:577–88.
30. Bourke-Taylor H, Law M, Howie L, and Pallant J. Development of the Child's Challenging Behaviour Scale (CCBS) for mothers of school-aged children with disabilities. Child Care Health Dev. (2010) 36:491–8.
31. Bodfish JW, Symons FJ, Parker DE, and Lewis MH. Repetitive Behavior Scale–Revised (RBS-R) [Database record]. PsycTESTS. (2000). doi: 10.1037/t17338-000
32. Bergman RL, Keller ML, Piacentini J, and Bergman AJ. The development and psychometric properties of the Selective Mutism Questionnaire. J Clin Child Adolesc Psychol. (2008) 37:456–64. doi: 10.1080/15374410801955805
33. Swanson JM, Schuck S, Porter MM, Carlson C, Hartman CA, Sergeant JA, et al. Categorical and dimensional definitions and evaluations of symptoms of ADHD: History of the SNAP and the SWAN rating scales. Int J Educ Psychol Assess. (2012) 10:51–70.
34. Feudtner C, Feinstein JA, Zhong W, Hall M, and Dai D. Pediatric complex chronic conditions classification system version 2: updated for ICD-10 and complex medical technology dependence and transplantation. BMC Pediatr. (2014) 14:199. doi: 10.1186/1471-2431-14-199
35. Larson T, Kerekes N, Selinus EN, Lichtenstein P, Gumpert CH, Anckarsäter H, et al. Reliability of Autism-Tics, AD/HD, and other Comorbidities (A-TAC) inventory in a test-retest design. Psychol Rep. (2014) 114:93–103. doi: 10.2466/03.15.PR0.114k10w1
36. Sheehan DV, Sheehan KH, Shytle RD, Bannon Y, Janavs J, Rogers JE, et al. Reliability and validity of the Mini International Neuropsychiatric Interview for Children and Adolescents (MINI-KID). J Clin Psychiatry. (2010) 71:313–26. doi: 10.4088/JCP.09m05305whi
37. Angold A, Costello EJ, Messer SC, Pickles A, Winder F, and Silver D. The development of a short questionnaire for use in epidemiological studies of depression in children and adolescents. Int J Methods Psychiatr Res. (1995) 5:237–49.
38. Birmaher B, Khetarpal S, Brent D, Cully M, Balach L, Kaufman J, et al. The Screen for Child Anxiety Related Emotional Disorders (SCARED): Scale construction and psychometric characteristics. J Am Acad Child Adolesc Psychiatry. (1997) 36:545–53. doi: 10.1097/00004583-199704000-00018
39. Spence SH. Spence Children's Anxiety Scale (SCAS) [Database record]. PsycTESTS. (1997). doi: 10.1037/t10518-000
40. Harrison PL and Oakland T. Adaptive Behavior Assessment System (3rd ed.). Chicago, IL: Western Psychological Services (2015).
41. Sparrow SS, Cicchetti DV, and Saulnier CA. Vineland Adaptive Behavior Scales, Third Edition (Vineland-3). San Antonio, TX: Pearson (2016).
42. Skinner HA, Steinhauer PD, and Santa-Barbara J. Family Assessment Measure III (FAM-III) [Database record]. PsycTESTS. (1995). doi: 10.1037/t02209-000
43. Koren PE, DeChillo N, and Friesen BJ. Family Empowerment Scale (FES) [Database record]. APA PsycTests. (1992). doi: 10.1037/t07132-000
44. McCubbin MA, McCubbin HI, and Thompson AI. Family Hardiness Index (FHI). Family assessment: Resiliency, coping and adaptation: Inventories for research and practice. Madison: University of Wisconsin System (1986) 239–305.
45. Gardner DL, Huber CH, Steiner R, Vazquez LA, and Savage TA. The development and validation of the Inventory of Family Protective Factors: a brief assessment for family counseling. The Family Journal. (2008) 16:107–117. doi: 10.1177/1066480708314259
46. United States Department of Health and Human Services and National Institute of Mental Health. Epidemiologic Catchment Area (ECA) Survey of Mental Disorders, Wave I (Household), 1980-1985 [Data set]. Inter-university Consortium for Political and Social Research (ICPSR) [distributor]. (1994). doi: 10.3886/ICPSR08993.v1
47. Zuvekas SH and Olin GL. Validating household reports of health care use in the medical expenditure panel survey. Health Serv Res. (2009) 44:1679–700. doi: 10.1111/j.1475-6773.2009.00995
48. Ravens-Sieberer U and Bullinger M. Assessing health-related quality of life in chronically ill children with the German KINDL. Qual Life Res. (1998) 7:399–407.
49. Law M, Baptiste S, McColl M, Opzoomer A, Polatajko H, and Pollock N. The Canadian occupational performance measure: an outcome measure for occupational therapy. Can J Occup Ther. (1990) 57:82–7. doi: 10.1177/000841749005700207
50. Zubler J and Whitaker T. CDC's Revised Developmental Milestone Checklists. Am Fam Physician. (2022) 106:370–371.
51. Kiresuk TJ and Sherman RE. Goal attainment scaling: A general method for evaluating comprehensive community mental health programs. Community Mental Health Journal. (1968) 4:443–453. doi: 10.1007/BF01530764
52. Ghandour RM, Jones JR, Lebrun-Harris LA, et al. The design and implementation of the 2016 National Survey of Children's Health. Matern Child Health J. (2018) 22:1093–1102. doi: 10.1007/s10995-018-2526-x
53. Turcotte P and Shea L. Physical health needs and self-reported health status among adults with autism. Autism. (2020) 25:695–704. doi: 10.1177/1362361320971099
54. Hayes H, Parchman ML, and Howard R. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN). J Am Board Family Med. (2011) 24:576–82. doi: 10.3122/jabfm.2011.05.110043
55. WK Kellogg Foundation. WK Kellogg Foundation Logic Model Development Guide. Battle Creek, MI: WK Kellogg Foundation (2004).
56. Payton C, Kumar GS, Kimball S, Clarke SK, AlMasri I, and Karaki FM. A logic model framework for planning an international refugee health research, evaluation, and ethics committee. Health Promotion Pract. (2022) 23:852–60. doi: 10.1177/15248399211035703
57. Childs A and Connors E. A roadmap for measurement-based care implementation in intensive outpatient treatment settings for children and adolescents. Evidence-Based Pract Child Adolesc Ment Health. (2021) 7:419–438. doi: 10.1080/23794925.2021.1975518
58. Kiresuk TJ and Sherman RE. Goal attainment scaling: A general method for evaluating comprehensive community mental health programs. Community Mental Health Journal. (1968) 4:443–453. doi: 10.1007/BF01530764
59. Brandenburg C, Ward EC, Schwarz M, Palmer M, Hartley C, and Byrnes J. ‘The big value of it is getting the patient seen by the right person at the right time’: clinician perceptions of the value of allied health primary contact models of care. Int J Qual Health Care. (2024) 36:mzae021. doi: 10.1093/intqhc/mzae021
60. Dishion TJ and Stormshak EA. Intervening in children’s lives: An ecological, family-centered approach to mental health care. Washington, DC: American Psychological Association (2007).
61. Gill AM, Hyde LW, Shaw DS, Dishion TJ, and Wilson MN. The family check-up in early childhood: A case study of intervention process and change. J Clin Child Adolesc Psychol. (2008) 37:893–904. doi: 10.1080/15374410802359858
62. Martin-Cook K, Palmer L, Thornton L, Rush AJ, Tamminga CA, and Ibrahim HM. Setting measurement-based care in motion: practical lessons in the implementation and integration of measurement-based care in psychiatry clinical practice. Neuropsychiatr Dis Treat. (2021) 17:1621–31. doi: 10.2147/NDT.S308615
63. Cooper AM, Horwitz M, and Becker ML. Improving the safety of teratogen prescribing practices in a pediatric rheumatology clinic. Pediatrics. (2019) 143. doi: 10.1542/peds.2018-0803
64. Childs AW and Connors EH. A roadmap for measurement-based care implementation in intensive outpatient treatment settings for children and adolescents. Evid Based Pract Child Adolesc Ment Health. (2021) 7:419–38. doi: 10.1080/23794925.2021.1975518
65. Donelan J, Douglas S, Willson A, Lester T, and Daly S. Implementing measurement-Based care in a youth partial hospital setting: leveraging feedback for sustainability. Adm Policy Ment Health. (2025) 52:128–45. doi: 10.1007/s10488-024-01358-2
66. Barron J. Embracing measurement-based care within integrated primary care settings. Fam Syst Health. (2023). doi: 10.1037/fsh0000789
Appendix 1
Keywords: neurodevelopmental disorders, mental disorders, behavioral symptoms, mental health services, measurement-based care (MBC)
Citation: Rombos V, Salami Z, Ferguson G, Baysarowich R, Decker K, Denomey N, Drmic I, Edwards J, Hayawi L, Jeyabalan T, Johansen T, Lui T, Margallo K, Milicevic T, Mitsakakis N, Sutherland S, D’Angiulli A, Webster R and Penner M (2026) Development and implementation of measurement-based care for children and youth with complex mental and neurodevelopmental needs: early experiences from Ontario’s Extensive Needs Service. Front. Psychiatry 16:1704325. doi: 10.3389/fpsyt.2025.1704325
Received: 12 September 2025; Revised: 21 November 2025; Accepted: 25 November 2025;
Published: 05 January 2026.
Edited by:
Annarita Vignapiano, Department of Mental Health, Italy
Reviewed by:
Francesca Felicia Operto, University of Salerno, Italy
Jill Donelan, Mirah, United States
Shangwen Chang, Shin Kong Wu Ho-Su Memorial Hospital, Taiwan
Copyright © 2026 Rombos, Salami, Ferguson, Baysarowich, Decker, Denomey, Drmic, Edwards, Hayawi, Jeyabalan, Johansen, Lui, Margallo, Milicevic, Mitsakakis, Sutherland, D’Angiulli, Webster and Penner. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Melanie Penner, mpenner@hollandbloorview.ca; Jordan Edwards