

Front. Sustain. Food Syst., 15 December 2022
Sec. Social Movements, Institutions and Governance
Volume 6 - 2022

Assessing the sustainability impacts of food sharing initiatives: User testing The Toolshed SIA

  • Discipline of Geography, School of Natural Sciences, Trinity College Dublin, The University of Dublin, Dublin, Ireland

The food system is unsustainable and requires reconfiguration; however, more data are required to assess the impacts of actions which might contribute to a more sustainable food future. Responding to this, extensive research with food sharing initiatives (activities which have been flagged for their potential sustainability credentials) led to the co-design of an online sustainability impact assessment (SIA) tool to support food sharing initiatives to assess and evidence their sustainability impacts. This paper reports on the initial user testing of the resulting online tool: The Toolshed, which forms the indicator-based SIA element of the SHARE IT platform. Feedback gathered from the initiatives testing the tool is analyzed and summaries of their reported impacts detailed. This analysis confirms the need for the tool, the relevance of the indicators included and the value of SIA reports for internal reflection and external communication. Nonetheless, challenges remain in relation to resourcing the practice of SIA reporting. We conclude with a plan for expanding engagement with The Toolshed and the wider SHARE IT platform.

1. Introduction

It is widely accepted that the food system is unsustainable (Canales Holzeis et al., 2019; Science Advice for Policy by European Academies, 2020) and that large scale systematic changes are required to address the unsustainability of urban food systems (Morgan and Sonnino, 2010; Cohen and Ilieva, 2015). This has stimulated a plethora of initiatives seeking to do things differently around food to help reorient the system toward sustainability. Food sharing, a term used to describe collective practices around growing, cooking, eating and redistributing surplus food, has been demonstrated to be an international phenomenon with many initiatives articulating sustainability goals (Davies et al., 2017, 2018; Davies, 2019). As detailed in the expanding body of research in this area, food sharing activities include community seed sharing, community gardens, community kitchens, social dining experiences, surplus food redistribution, gleaning and community composting (Davies and Evans, 2019). However, despite sustainability assessments being seen as a driver for sustainability transitions (Bohunovsky et al., 2011) the sustainability reporting of these initiatives is currently limited and their impacts largely unknown outside those who directly participate. This means that impacts are not taken into account in decision making about investment allocations or in reporting on national commitments, for example in relation to the achievement of the sustainable development goals.

Although not without their limitations, particularly in terms of their complexity and methodological heterogeneity (e.g., Potschin and Haines-Young, 2008), sustainability assessments of activities are increasingly promoted as a powerful means to assist the decisions of policy makers and investors, as part of a wider drive toward thinking beyond economic outcomes (Singh et al., 2012). As a result, sustainability impact assessment (SIA) reporting has become common practice amongst many large, for-profit organizations (Milne and Gray, 2013). However, small-scale, grassroots or community non-profit organizations, a category into which the majority of food sharing initiatives fall, can struggle to engage in SIA reporting for myriad reasons (Jones and Mucha, 2014). A lack of appropriate and accessible tools and methods for the smaller, not-for-profit organizations which typify food sharing initiatives is a key barrier, as are time shortages and skills gaps. In essence, and as outlined in Davies (2019), food sharing initiatives find it hard to establish and maintain practices of sustainability reporting due to the combination of rules, tools, skills and understandings required to do so.

In response, a bespoke online sustainability impact assessment (SIA) tool called The Toolshed has been co-designed with and for food sharing initiatives as part of the SHARE IT platform to better capture the impacts they create. This platform also supports food sharing initiatives in their sustainability efforts via The Talent Garden, where initiatives can share reports, videos, pictures, and stories about their work (including brief summary SIA reports created in The Toolshed), and The Greenhouse, which provides a portal for food sharing initiatives to connect and exchange knowledge and experiences to create a supportive community of practice for sustainable food sharing (Mackenzie and Davies, 2019).

This paper examines the outcomes of preliminary testing of The Toolshed by active food sharing initiatives and summarizes how they engaged with the tool and the reports they generated. This testing process was carried out to investigate whether The Toolshed met its goal of being an accessible and useful SIA tool for food sharing initiatives to engage in reporting and to communicate their impacts.

2. Sustainability impact assessment as a practice for food sharing initiatives

SIA is an umbrella term encompassing a range of impact assessment practices (Hacking and Guthrie, 2008), which all seek to understand the social, economic and environmental impacts of particular activities, organizations, places or policies. The core goal of SIA is to generate data on impacts in order to assist with reshaping practices toward more sustainable development (Bond and Morrison-Saunders, 2011). SIA began as a technical managerialist tool following on from the practice of environmental impact assessment (EIA) and then strategic environmental assessment (Bond et al., 2012). However, SIA is increasingly recognized as a social practice replete with rules, tools, skills and understandings (Davies, 2019): rules about how impacts should be captured, tools for capturing and managing data, skills needed to undertake an impact assessment and understandings of what it means to report impacts (Mackenzie and Davies, 2019).

A spectrum of impact assessment models has been developed by Costanzo and Sánchez (2019) to indicate where innovations are occurring in the field. At one end is the technicist model from which SIA derives its origins. Examples of this model include the adoption by large companies of quantitative Life Cycle Assessment (LCA) approaches to improve the efficiency and reduce the environmental impact of their industrial processes (Curran, 2012). Within the context of food systems, LCA and other similar approaches have been seen as providing useful evidence for managers, investors and policy makers to include in decision making (Gava et al., 2018). However, many of these approaches are complex, expert-generated, preoccupied with quantitative data and highly technical, requiring specific kinds of data, resources and skills to complete.

Additionally, technicist SIA approaches which incorporate economic, environmental and social pillars of impact can generate extensive datasets that are difficult to manage even for expert practitioners and organizations with significant resources (Gava et al., 2018). This can make more focused impact assessments, e.g., which only address environmental impacts, seem more attractive. It is notable that much of the academic literature investigating the impact of the emerging phenomena in the sharing economy, both in food systems and beyond, has taken such an approach (Rabbitt and Ghosh, 2016; Nijland and van Meerkerk, 2017; Sanyé-Mengual et al., 2018).

The opposite end of the spectrum is the deliberative model, where impact assessment is considered a civic science and methods are designed to encourage social engagement through technical facilitation, thereby altering the considerations and structures of decision making (Costanzo and Sánchez, 2019). To be deliberative means relating to or intended for consideration or discussion. In the context of the research presented in this paper it refers to the process through which the SIA was developed and tested, i.e., through discussion with relevant actors to ensure the process and tool are robust, relevant to food sharing, and meet the needs of food sharing initiatives. This is important because, as detailed by O'Faircheallaigh (2010), encouraging greater social engagement with impact assessment, including obtaining public input into decision-making processes, can lead to greater engagement, trust and transparency.

Research has demonstrated that food sharing initiatives, and particularly not-for-profit initiatives, while a diverse category (Davies et al., 2017), often run on tight budgets and face a wide array of funding and policy challenges (Davies, 2019). Specific contexts will affect their activities in particular ways, but common challenges include a lack of financial resources, a lack of technical understanding, and challenges around identifying relevant evaluation systems and outcome indicators (Mackenzie and Davies, 2019); challenges that are familiar to many non-profit organizations (Bach-Mortensen and Montgomery, 2018).

To provide significant value as a tool to support sustainability transitions within food systems, the practice of SIA needs to be accessible, defensible and transparent. This means the people running food sharing initiatives must be able to access the relevant tools, have the skills to use them, and gather relevant data to populate the tools and calculate sustainability impacts. In this context SIA is recognized as both technique and practice, i.e., there is a technical method, but the application of the method takes place in particular contexts by people with particular motivations, skills and understandings, operating within particular rules (both social and regulatory).

In response, we co-designed an online, interactive SIA tool called The Toolshed with food sharing initiatives. The need to ensure that food sharing initiatives were able to use the tool and purposefully engage with SIA reporting means that the approach adopted fell toward the deliberative end of Costanzo and Sánchez's (2019) theoretical scale.

The Toolshed offers familiar features of mainstream SIA reporting tools around food with modifications to ensure it appropriately captures the impacts of food sharing. As extensively documented in Mackenzie and Davies (2019), The Toolshed contains a theme-based framework of practice and performance based sustainability indicators designed for the activities of food sharing initiatives. The Toolshed is flexible to support the wide range of food sharing initiatives that exist. It allows users to select relevant options from an extensive offering of 110 indicator questions falling under 34 sustainability impact areas (see Table 1), providing qualitative and quantitative options for initiatives to report on their impacts. Those leading food sharing initiatives answer the indicator questions in the online Toolshed portal for which they have data. This automatically generates a full text-based report, an accompanying Excel spreadsheet and a three-page summary report to communicate impacts to internal and external stakeholders, which can be automatically shared publicly on The Talent Garden section of the SHARE IT platform.


Table 1. The Toolshed sustainability impact areas.

Previous analysis has found that a majority of food sharing initiatives have explicit mission statements and goals for social impacts (Davies et al., 2018). The relative importance of social impacts for the target users of The Toolshed was reflected in the final co-designed SIA framework, with 15 out of 34 impact areas designed to capture these social impacts related to food sharing including: Accessibility: Contribution to improving the accessibility of fresh nutritious food for all; Citizen health and wellbeing: Contribution to improving health and wellbeing outcomes; Community integration: Contribution to increasing community integration through food sharing; and Education and food choices (see Mackenzie and Davies, 2019 for further explanation of indicator development). If successful in engaging users in deliberatively evaluating their social impacts as part of a wider sustainability impact assessment, this mechanism will overcome an important challenge identified by Esteves et al. (2012), that social impact assessment often does not meet the expectations that it will be a deliberative process. As the conceptual framing and co-creation process has already been verified (Mackenzie and Davies, 2019), this paper focuses on the process and results of user testing The Toolshed. The objectives of this testing process were:

1) Usability—To establish whether The Toolshed is sufficiently accessible to be of use as an SIA tool for food sharing initiatives.

2) Value—To understand whether creating an SIA report in The Toolshed changed the food sharing initiatives' perspective of SIA.

3. Methods

When testing the usability of online platforms there are three main phases that can be conducted: exploratory, summative and validation (Rubin and Chisnell, 2008). In brief, exploratory testing is conducted during the early phases of platform development and its main purpose is to ascertain whether the preliminary concept and designs for the platform are likely to meet the users' expectations. Summative testing is conducted later in the platform development cycle and is used to test the "low level" functionality of the platform. Summative testing involves users performing tasks directly with the platform or service in question, with less interaction from the moderator of the testing process. In summative testing some quantitative measures are usually employed to explore outcomes from the user experience. Validation testing is performed during the final stages of development and compares the platform in question to established benchmarks for performance and usability when compared with existing online platforms (Rubin and Chisnell, 2008). Under this rubric, the conceptual foundations and initial co-design process set out in Mackenzie and Davies (2019) can be categorized as exploratory testing, conducted to ensure The Toolshed met the objectives of potential users. Here, we outline and critically evaluate the summative testing undertaken to establish whether The Toolshed online was able to meet its objectives. First, we set out the testing protocol, which describes how the summative testing was executed.

3.1. The testing protocol

The eight initiatives that took part in the testing process are described in Table 2. These initiatives were selected to represent a range of food sharing activities (spanning the collective growing, cooking, eating and redistribution of food), and organization sizes based on the large sample of food sharing initiatives identified in the SHARECITY 100 database (Davies et al., 2017). The summative phase of testing involved participating initiatives using The Toolshed to create an SIA report of their activities for the most recent 12 month period for which they had available data. Initiatives were asked to provide feedback on their experience of using The Toolshed through online feedback forms or through informal feedback provided to researchers either verbally during a focus group or through written communication afterwards.

The testing protocol had three stages (see Figure 1).


Figure 1. Flow chart showing the organization of testing activities.

All initiatives participated in each stage of the testing process, albeit in variable ways. At Stage 2, for example, multiple meetings were conducted with some initiatives before they were able to complete the SIA report. Initiatives 4 and 8 (see Table 2) participated in all stages and engaged with the research team, but did not ultimately produce a final impact assessment report. Initiatives 1, 3 and 8 did not complete the feedback questionnaire but did provide feedback on the toolkit through other methods. Time restrictions and significant changes to the circumstances of initiatives were the main barriers to full completion of the process. This was expected and indicative of a wider problem for incorporating SIA reporting into the everyday practices of precarious, overstretched and small-scale food sharing initiatives (Davies, 2012; Massa et al., 2015).


Table 2. Summary characteristics of food sharing initiatives involved in summative testing.

The feedback questionnaire was structured into three sections:

1) The functionality and user experience of The Toolshed:

a. List up to 3 aspects of The Toolshed that you liked.

b. Describe any technical problems you had using The Toolshed to create an impact assessment report.

c. List any technical changes you would suggest for The Toolshed.

d. Add any further comments on whether you feel The Toolshed is easy to use as a tool to make a sustainability impact assessment and how its functions could be improved.

2) The accessibility and aesthetics of The Toolshed:

a. Describe any general changes you would recommend for the content of The Toolshed.

b. Describe any specific problems you found with the content of The Toolshed.

c. Add any further comments you have on the accessibility of the material and visual experience of using The Toolshed.

3) The perceived relevance of The Toolshed to the needs of the user organization:

a. After testing The Toolshed do you feel it could be a useful tool for your organization?

b. If not, please describe why and how we could change it to make it more relevant.

c. Describe any benefits The Toolshed could have for your organization.

d. Would you have any concerns about your organization using The Toolshed?

e. Did testing The Toolshed make you consider different forms of impact to those which you are tracking now? If yes then please give details.

f. Did testing The Toolshed give you any ideas for additional data you could collect to demonstrate the impact of your activities? If yes then please give details.

g. Do you have any recommendations for how we can best disseminate The Toolshed?

h. Would you use The Toolshed based on what you have seen from testing?

i. Please add any further comments about The Toolshed and how we can improve it.

Responses to these questions were collected using Google Forms and are discussed in the results section of this paper.

The testing process created two key data types necessary for usability evaluations: interaction data from the observations of the initiatives using The Toolshed; and design feedback data from the online survey and informal feedback provided during the test meetings. While the value of interaction data is relatively unchallenged (Følstad, 2017), the value of design feedback is contested by researchers (Whitefield et al., 1991; Nielsen, 2001). However, the Følstad (2017) review of usability testing demonstrated that this form of feedback can greatly enhance any testing compared to protocols based on interaction data alone. It found that design feedback was particularly valuable when considering interactive systems in specialized contexts, which would apply to the case of food sharing and The Toolshed.

The outputs of the full testing process were threefold:

1. A completed SIA report detailing the impacts of each initiative over a 12 month period using both quantitative and qualitative evidence as available and appropriate;

2. Feedback from each initiative on using The Toolshed, providing reflection on its accessibility, usability and relevance to their objectives and activities;

3. Researcher observations regarding the behaviors of users interacting with The Toolshed collated from the meetings held with initiatives and the reports generated by the users.

3.2. Analysis

3.2.1. Usability and preference analysis based on interaction data

The first step in the analysis was to test whether the tool was sufficiently accessible for initiatives to create an SIA report which effectively and appropriately communicates their impacts. When making an SIA report, users of The Toolshed selected from a suite of available indicators those that were relevant to their activities. For each indicator there were sub-indicators which provided different options for demonstrating impact depending on the activity of the initiative and the evidence they had available to submit.

Two questions were asked to better understand whether the indicators were appropriate and comprehensible as follows:

1) Did the evidence provided for each sub-indicator demonstrate impact for the indicator in question?

This question was asked to see whether initiatives provided data which was not appropriate to the indicator, made an error in the data provided, or left the data incomplete.

2) Was quantitative or qualitative evidence submitted as evidence of impact?

It is possible to submit either qualitative or quantitative evidence for almost all indicators in The Toolshed. We analyzed the type of evidence submitted to indicators and sub-indicators to see if any clear patterns emerged. For example, if smaller initiatives tended to submit more qualitative evidence.

3.2.2. Usability analysis based on design feedback data

The data used for this analysis were:

• The answers to questions 1–7 of the user feedback questionnaire;

• Relevant informal feedback recorded during the user testing (this included feedback supplied verbally in person during meetings as well as remotely via email or phone).

We grouped and prioritized design feedback provided by users into topical clusters using a directed content analysis approach (Hsieh and Shannon, 2005; Assarroudi et al., 2018) where specific key words are not pre-defined prior to conducting the analysis. The range of possible answers that could be submitted for feedback on using the toolkit was too large to restrict data inclusion on the basis of specific keywords.

Analysis of the qualitative design feedback data enabled us to build a matrix to identify how frequently each topical cluster was identified by users. These clusters were then placed into two categories:

1) Problems in using the toolkit.

2) Suggestions for improvements to the user experience.

Within these sub-categories each cluster of user feedback was given a rating in order to produce a prioritized list of the most important barriers to usability.

For (1) Problems—the following categories of seriousness were established:

1) Minor: Issue is likely to cause low-level irritation and/or confusion for users but is unlikely to prevent users creating a report with The Toolshed independently or reduce the value of doing so from their perspective;

2) Medium: Issue is likely to cause significant irritation and/or confusion for users, but could be overcome with support from the research team in order for users to complete their SIA report and value the output from the process;

3) Major: Issue is likely to make using The Toolshed inaccessible to the user and/or would make the exercise of creating an SIA report not worthwhile in their opinion.

For (2) Improvements to the user experience of the toolkit—the categories were:

1) Minor: easily resolvable through minor technical modifications or additional guidance through tutorials or help text that can be created in-house by the research team;

2) Medium: resolvable without major technical or other restructuring but requiring additional technical assistance and financial cost;

3) Major: only resolvable with significant redesign of the toolkit requiring specialist technical expertise, and incurring high costs.

By combining the number of users that raised a topical cluster with the rating of the relative seriousness of the issue described, a ranking of the issues raised during the user feedback was produced. A weighted score was used to rank the clusters using the formula below:

WS = N × W (1)

WS = weighted score; N = the number of users who raised the issue; W = the weighting for relative seriousness.

For "Minor" W = 1, for "Medium" W = 1.5, for "Major" W = 2.
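The ranking procedure described above reduces to a small computation, sketched below in Python. The cluster names and user counts here are hypothetical illustrations for demonstration only, not the study's actual feedback data.

```python
# Weighted-score ranking of feedback clusters: WS = N * W,
# where N is the number of users who raised the issue and W
# reflects its relative seriousness (Minor = 1, Medium = 1.5, Major = 2).

WEIGHTS = {"minor": 1.0, "medium": 1.5, "major": 2.0}

def weighted_score(n_users: int, seriousness: str) -> float:
    """Return WS = N * W for one topical cluster."""
    return n_users * WEIGHTS[seriousness]

# Hypothetical clusters: (cluster name, number of users raising it, seriousness)
clusters = [
    ("data requirements feel onerous", 5, "major"),
    ("unclear SDG mapping", 4, "medium"),
    ("cannot reopen completed reports", 3, "medium"),
    ("minor layout glitches", 6, "minor"),
]

# Score each cluster and sort from most to least important.
ranking = sorted(
    ((name, weighted_score(n, s)) for name, n, s in clusters),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, ws in ranking:
    print(f"{ws:4.1f}  {name}")
```

Note that under this scheme a low-seriousness issue raised by many users can outrank a higher-seriousness issue raised by few, which matches the paper's intent of combining frequency with severity rather than relying on severity alone.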

3.2.3. Perceived value of The Toolshed and SIA by users

Finally, we analyzed responses to questions 8–16 of the feedback forms filled out by initiatives to establish whether:

1) Using The Toolshed had met users' objectives and whether they would be interested in using the tool again;

2) Users felt The Toolshed has value they had not considered before or raised any concerns about SIA reporting that they had not previously considered;

3) Using The Toolshed had altered users' views on where their activities created impacts.

Again feedback submitted by users was grouped into topical clusters using a directed content analysis (Hsieh and Shannon, 2005; Assarroudi et al., 2018). This stage of analysis went beyond conventional usability analysis protocols to test whether The Toolshed had perceived value as a tool from the sample of users that participated in the testing process. The following section details the results of the three stages of user data analysis.

4. Results

4.1. Interaction data

Of the eight initiatives that took part in the testing process, six were able to complete sustainability impact reports using The Toolshed. Table 3 details which impact areas were answered by these initiatives during their interaction with the tool. Each of the 34 impact areas received at least one response from one piloting initiative, indicating that the impact areas are relevant to a variety of food sharing initiatives. The Table also indicates the breadth of impact creation by the initiatives across social, economic, environmental and governance categories. As the sample of piloting initiatives is not representative of wider food sharing populations, it is impossible to derive any generalizable statements from these submissions in terms of the content of their impact reports at this stage. Nor should the impacts reported be read as the total range of sustainability impacts by the participating initiatives, as data collection in many of the initiatives was limited at the time of engaging with The Toolshed. New systems of data collection need to be developed by these initiatives to fully document the impacts being created. Indeed, the experience of engaging with The Toolshed pilot highlighted to these initiatives what kinds of data they should be collating to evidence the sustainability impacts they seek to achieve.


Table 3. Summary of responses submitted to impact areas by pilot study initiatives.

Table 4 details the impact areas and indicator questions where data was submitted by Initiative 6. This illustrates the composition of a full Toolshed report which is collated by the tool into a text document and an excel spreadsheet for the initiative. It also indicates where the impact areas reported map on to the sustainable development goals (United Nations, 2015).


Table 4. Example Toolshed SIA report by Initiative 6.

Figure 2 provides an example of a summary report which is generated by The Toolshed from the full report. Initiatives can select data from their full report to highlight in this summary. The summary report is intended to be a communication device; quick to read, visually appealing and focused on the key impacts of the initiative, whilst also providing an overview of sectoral impacts and sharing benefits (referred to as sharing stars in the summary report).


Figure 2. Example Toolshed summary report for Initiative 6.

The results from the interaction data suggested that The Toolshed was partially successful in achieving its objective of creating an accessible tool for food sharing initiatives to engage in SIA and communicate their impacts. While the largest initiatives with the greatest experience of impact reporting were able to use the toolkit to create reports of their activities without additional guidance, smaller initiatives with less experience were more likely to struggle with identifying appropriate evidence and designing new ways to collect appropriate data, requiring additional support from the research team.

4.2. Design feedback data

Feedback data from users during the testing process was provided in two formats: answers to the feedback survey questions and informal verbal or written communications with researchers. These data were combined to identify the most important clusters of design feedback data for The Toolshed. Table 5 sets out a ranking of topical clusters identified through the combined analysis of all the design feedback data. The ranking of relative importance for the issues was based on the number of users that raised each topic and a qualitative judgement on the relative seriousness of the issue (see Section Usability analysis based on design feedback data). Given the small sample size, the specific ranking of the issues (i.e., whether they came second or third in the ranking) was not the purpose of the analysis. Rather, it provides a broad picture of the challenges The Toolshed will need to address and the resource implications of making these modifications.


Table 5. Ranking of the importance of topical clusters in the design feedback data.

Positive feedback comments were not included in the ranking shown in Table 5 as they were not given a rating of relative seriousness. However, when asked what they liked about the tool, all users who completed impact reports commented positively on the three-page summary report produced by the toolkit. They valued its potential to communicate their impacts in a succinct manner. Two users also commented positively on the graphics the tool produced, with two users also mentioning the potential for the reports to be used as evidence of impact for wider stakeholders such as funders and regulators. While it was clear that users valued the outputs from The Toolshed, only two users made positive comments about the process of making the reports, with one commenting that they appreciated the flexibility of the tool in terms of not being required to answer all indicators and the other highlighting the value of help text provided to support users.

While the flexibility of The Toolshed was valued and utilized, the wide range of available indicators and options for sub-indicators that facilitate this flexibility also had unintended consequences for user perceptions of the tool. They created a perception that the data requirements to complete an SIA report were onerous and therefore potentially prohibitive for initiatives short on time and capacity. This was ranked as one of the most important feedback clusters. While the introductory video and help text provided with The Toolshed clearly state that not all indicators will be relevant or need to be responded to, this message was not picked up by some initiatives. Moreover, there was another important feedback cluster of general concerns about whether specific input data was suitable. There was also feedback from multiple users that they were not completely clear how the toolkit was relating their input data to the sustainable development goals in the sustainability reports created. All three of these clusters pointed to a general challenge for SIA toolkits in providing flexibility without appearing intimidating, and communicating complexity clearly without providing extensive, and therefore potentially off-putting, explanations.

Another important and related cluster from the design feedback data that came high in the ranking was that several users suggested the tool might be improved if they could see only the indicators and sub-indicators relevant to their specific activities, for example, based on whether they grow or redistribute food. However, technical upgrades such as bespoke indicator pathways for different user properties, for example based on the type of food sharing they engage in, would require significant additional technical input and costs.

A recurrent technical issue raised was the inability to reopen and edit reports that users had created once they were completed. While adding this function was not possible during the first iteration of The Toolshed due to technical and budgetary constraints, it will be an important addition to future iterations. Given the precarity of many food sharing initiatives (Davies, 2019), it was unsurprising that two initiatives experienced disruption to their operations that either significantly delayed or prevented them from completing their SIA reports despite wishing to do so. This issue cannot be mitigated by changing the design of The Toolshed or similar toolkits, but it highlights the importance of considering additional ways to support food sharing initiatives to expand their sustainability reporting.

Collectively, the design feedback data reveals that changes to The Toolshed are required to fully achieve the objective of providing accessible SIA reporting for food sharing initiatives. However, many of the issues raised in the design feedback data were not insurmountable and identify a roadmap for increasing accessibility, as discussed in the Discussion section.

4.3. Perceived value of The Toolshed and perception of SIA

The final section of the feedback survey contained questions about the perceived value of The Toolshed and the process of SIA reporting. All initiatives who completed the feedback survey felt that The Toolshed was potentially a useful resource for their organizations and, based on their testing experience, said they would use it in the future. However, Initiative 4 provided an important caveat regarding the perceived burden of data requirements, despite the assurances to the contrary already provided in the toolkit guidance. One initiative had important concerns about potentially negative issues that might emerge from engaging with The Toolshed, such as the information being used against them (and other initiatives) if they indicate only small scale impacts in certain fields, or engagement leading to increasing (and unrealistic) demands for measurement by funders without additional supports to fund such activities.

Initiatives 2, 4 and 6 felt that using The Toolshed had made them aware that they may be having impact in areas and ways that they had not previously considered. This finding was not only expressed by those new to sustainability reporting (e.g., Initiative 4) but was similarly articulated by those who already engaged in reporting (e.g., Initiative 2). These three initiatives also engaged in different types of food sharing activities from growing (Initiative 4) and shared cooking and eating (Initiative 6) to the redistribution of food (Initiative 2). Furthermore, four initiatives responded that using The Toolshed had stimulated ideas for further data collection, so that they can capture their impacts in ways they were not doing previously. Indeed, two users also responded that The Toolshed could help to improve their data collection practices overall.

The feedback from testers provided evidence that using The Toolshed had changed perspectives on (a) the value of SIA itself as a practice; (b) the potential relevance and usefulness of SIA for their initiative; and (c) their understanding of the impacts that their activities create. While drawn from a small sample of testers, this was an important outcome of the testing process and demonstrated the potential value of rolling out the tool further to engage more food sharing initiatives in the practice of SIA.

5. Discussion

The Toolshed marks a step change in the way that SIA has been developed and provided to food sharing initiatives. No online SIA tools have previously been developed specifically for use by this type of organization within food systems. Its social practice underpinning and co-design roots, alongside the ongoing collaborative testing, provide a novel, if preliminary and provisional, picture of how and why food sharing initiatives engage (or not) with the enhanced sustainability reporting opportunities the tool offers. While the focus in this paper has been on user testing of The Toolshed component of SHARE IT because of the complexity of that aspect of the toolkit, SHARE IT also offers alternative mechanisms in The Talent Garden and The Greenhouse (both described in the introduction) for initiatives to share, engage with and communicate sustainability impacts. These additional dissemination and communication channels will also require further testing as engagement with them increases and content expands. It will be particularly interesting to examine how all three of the SHARE IT toolkit's components are received by those who fund and regulate food sharing activities.

Reflecting on the outcomes of the user testing of The Toolshed, it is important to note that only a minority of testers were able to complete SIA reports to their satisfaction without additional support from the research team. This is despite their centrality in the co-design and development of the SIA tool and their stated desire to engage in the practice of SIA reporting. Replicating the difficulties faced by SMEs across all sectors engaged in SIA and Corporate Sustainability Reporting (CSR) (Jenkins, 2004; Murillo and Lozano, 2006), the initiatives best able to navigate and complete SIA reports were those that were already engaged in some form of reporting and were the largest and most well-resourced initiatives in the testing group. Below we discuss the underlying reasons for these patterns of patchy engagement and propose a suite of supports that could accelerate wider engagement with the tool.

5.1. Perpetuating patterns of participation in SIA

Analysis of the feedback data identified three main factors which prevented initiatives from engaging extensively with The Toolshed without additional external assistance. These were:

- External factors—largely outside the control of the initiative and the research team, these include major disruptions to initiatives' activities due to sickness and ill-health or changes in personnel in core roles; COVID-19, for example, has severely tested many food sharing initiatives since 2020;

- Technical, resource and financial factors—initiatives have extremely limited capacity to undertake additional activities, even ones that they want to undertake, such as SIA reporting. Initiating data collection about activities was hard for initiatives to do without assistance from the research team. In addition, there was limited time and budget to design and build the tool within the grant which supported this research. The length of time it took for development, and the costs and complexity of building the tool, meant that it was not possible to incorporate some desired functionality (e.g., bespoke pathways for indicators) without considerable additional support;

- Socio-cultural and behavioral factors—in many cases, as discussed below, challenges for engagement persisted despite the existence of text or video supports which dealt directly with users' concerns. For example, the large number of indicators was identified as intimidating by some testers, but this breadth is required to ensure that all food sharing initiatives are able to select indicators relevant to their activities. This was explained in the help guides, but perceptions remained unchanged and users wanted more bespoke pathways through the tool. Such pathways would require significant additional knowledge about different food sharing initiatives' activities and characteristics, as well as additional technical and financial resources.

Some of the issues flagged can be easily addressed with stronger and clearer statements about the tool's role, purpose and scope, for example on the landing webpage. However, bespoke versions of The Toolshed tailored to particular focus areas of food sharing (e.g., growing, cooking or eating, and redistributing food), the familiarity of the initiative with SIA reporting (e.g., introductory, intermediate, advanced), and the resource capacity of the user (e.g., micro, small, medium sized initiative) would be an attractive upgrade. This could help manage expectations and reassure resource-strapped initiatives with little experience of SIA reporting, but would be expensive to develop. In any case, while such technical issues were flagged as significant, it was often a combination of all three categories that prohibited initiatives from progressing to SIA reporting. As research elsewhere has illustrated (Davies, 2019; Weymes and Davies, 2019), technical fixes alone will be unlikely to resolve all issues and widening participation will require significantly more hands-on support, particularly for early stage initiatives and those operating under precarious conditions. As indicated by research on impact reporting in small organizations and not-for-profit initiatives (see for example, Nigri et al., 2017; Nigri and Michelini, 2019), such challenges are commonly experienced and further cross-fertilization of responses to these challenges across different sectors of activity would be useful. Accepting that open access, online tools alone are unlikely to be sufficient to activate wide engagement in SIA reporting, the following section outlines a phased strategy for providing more relational support.

5.2. Accelerating engagement with The Toolshed

The potential pool of users for The Toolshed is large and international, but suitable pathways to activating engagement need to be developed because of the diverse challenges which different kinds of food sharing initiatives face when trying to incorporate reporting into their activities. The Toolshed has been co-designed with food sharing initiatives to be free at the point of use and to support initiatives to conduct their own reporting independently. However, initial testing indicated that few initiatives were in a position to engage actively with The Toolshed without assistance from the research team. Some initiatives responded well to a bespoke training session led by a researcher and were able to develop a sustainability report following this. The remainder required a much more pro-active, sustained and targeted interaction before they engaged, including the development of a preliminary report by researchers based on publicly accessible data to illustrate its capabilities.

This suggests that a system of additional supports and a means of funding those supports will be needed to optimize engagement with The Toolshed. A phased plan for activating engagement with sustainability reporting is required:

Phase 1—Raising wider awareness: Following revisions of the tool that are possible without further financial or technical input, all 3,763 food sharing initiatives identified and mapped on a global database (Davies et al., 2017) can be contacted directly and analysis of the subsequent interactions examined. Based on testing, it is expected that this could generate around 300 additional engagements amongst the largest, most experienced food sharing initiatives in the database;

Phase 2—Targeted contact with online support provision: A sample of those initiatives contacted in Phase 1 and based in English speaking countries (such as the UK, Ireland, the USA and Australia) which have not engaged with the tool within 2 weeks of information being disseminated will be contacted again. The aim here is to establish their reasons for non-engagement and to explore their interest in availing of a free trial support service to help them engage. Any reasons given for not taking up this opportunity will be logged.

Phase 3—Bespoke service for new SIA reporters: A small number of initiatives contacted in Phase 2 who are interested in sustainability reporting but feel unable to complete an SIA report themselves even with online support will be offered the opportunity to trial a bespoke SIA service. A draft report will be developed for the initiatives based on publicly accessible data for their food sharing initiative and a virtual or face-to-face workshop will support the initiative to move toward sustainability reporting.

In terms of the results of user testing, it is clear that the tool can help initiatives establish whether they are meeting their stated goals, albeit with more assistance than was originally envisaged. It is also clear that The Toolshed does provide the means to demonstrate the impacts of their activities, at least in certain ways and forms. Whether The Toolshed will stimulate the formation of additional food sharing initiatives will depend on users posting their summary impact assessment reports on The Talent Garden and on potential food sharing entrepreneurs seeing and being inspired by the contents of these reports and other impact evidence (e.g., photographs, narratives, videos etc.). Establishing a community of practice (Reed et al., 2014) around sustainable food sharing through The Greenhouse could assist in this regard, and means to generate engagement with these other elements of SHARE IT need to be identified and evaluated. Nonetheless, the in-depth co-construction of The Toolshed as an online SIA toolkit was extremely valuable in providing greater understanding of the everyday practices and impacts of food sharing. It usefully extended previous ethnographic research conducted in multiple cities and with many food sharing initiatives (Davies, 2019; Davies and Evans, 2019) in important ways. It made it possible to forensically examine how the initiatives function, providing a fuller appreciation of the demands on time and resources that simply maintaining such activities entails. It also revealed just how challenging it is for initiatives to increase their sustainability reporting and impact analysis, even when they are committed to demonstrating their sustainability worth publicly.

Sustainability assessments can be complex (Sala et al., 2015), and there is little consensus on which metrics should be used as standard to assess the sustainability of food systems (Johnston et al., 2014; Prosperi et al., 2015). The Toolshed was established to increase understanding of the impact food sharing initiatives have by overcoming some of the well-known challenges that social enterprises face in engaging with SIA (Grieco, 2015). While the tool emerged from participating initiatives' desire to engage with SIA, there are wider concerns about the consequences of such engagement for many small scale food sharing initiatives. For example, initiatives may be unfairly penalized and their benefits discounted by those who value quantitative indicators within SIA frameworks. Their impacts for individual indicators may look small scale to funders and policy makers when compared with those of incumbent actors, such as multinational retailers.

6. Conclusion

The Toolshed was developed to: (a) establish whether food sharing initiatives were meeting their stated goals; (b) demonstrate the impacts of food sharing initiatives to potential investors, customers and funders; and (c) illustrate the possibilities for expanding the number and sustainability worth of food sharing activities. This paper has specifically focused on the extent to which The Toolshed arm of SHARE IT helps to achieve these goals.

The Toolshed marks a step change in support for increased sustainability reporting of food sharing initiatives. This is important, as the impacts of these activities are missing from current reports on contributions to the Sustainable Development Goals, for example, and therefore not taken into consideration by local authorities or nation states in terms of planning their sustainability transitions. Following revision of The Toolshed based on the pilot study feedback set out in this paper, it will be possible to recruit and support more food sharing initiatives to engage with sustainability reporting and document their sustainability impacts for supporters and regulators alike. It will be possible to start comparing and aggregating sustainability impacts by food sharing sector (e.g., growing, cooking and eating, surplus food redistribution) at different scales (from the individual initiatives to local authorities and beyond) and across space (e.g., between localities in one country and between localities in different countries).

Indeed, particular attention should also be given to those who may engage with the outputs of the sustainability reporting that food sharing initiatives produce, such as funders and regulators. Moreover, public authorities who support food sharing initiatives through access to land, space, and finance, as well as food retailers who donate to food sharing initiatives, should be engaged in discussions about supporting efforts to aggregate the collective sustainability impacts of distributed food sharing initiatives. Without addressing their perspectives, the value of The Toolshed will be primarily as a reflective mechanism for food sharing initiatives themselves.

Data availability statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Ethics statement

The studies involving human participants were reviewed and approved by School of Natural Sciences (SNS) Research Ethics Committee, Trinity College Dublin. The participants provided their written informed consent to participate in this study.

Author contributions

SM: conceptualization, methodology, investigation, formal analysis, writing—original draft preparation, visualization, and project coordination. AD: conceptualization, methodology, writing—review and editing, revised draft writing, project coordination, resources, project administration, and funding acquisition. All authors contributed to the article and approved the submitted version.

Funding

This paper is based on research that has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (Grant Agreement No. 646883).

Acknowledgments

Our thanks and immense gratitude to all the initiatives that took part in the co-design and testing of the SHARE IT toolkit to date, many of whom have since been running vital responses to various issues that have arisen from the COVID-19 pandemic. Many thanks also to Vivien Franck for her assistance, persistence and patience throughout the process of setting up and testing the SHARE IT toolkit. Finally, we thank Trevor Clowry and the team at Symmetry Solutions for all their hard work, dedication and expertise in carrying out web-development of SHARE IT.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Abbreviations

CSR, Corporate Sustainability Reporting; EIA, Environmental Impact Assessment; LCA, Life Cycle Assessment; SIA, Sustainability Impact Assessment.

Footnotes

1. ^The SHARE IT toolkit can be accessed at:

References

Assarroudi, A., Heshmati Nabavi, F., Armat, M. R., Ebadi, A., and Vaismoradi, M. (2018). Directed qualitative content analysis: the description and elaboration of its underpinning methods and data analysis process. J. Res. Nurs. 23, 42–55. doi: 10.1177/1744987117741667

Bach-Mortensen, A. M., and Montgomery, P. (2018). What are the barriers and facilitators for third sector organisations (non-profits) to evaluate their services? A systematic review. Syst. Rev. 7, 13. doi: 10.1186/s13643-018-0681-1

Bohunovsky, L., Jäger, J., and Omann, I. (2011). Participatory scenario development for integrated sustainability assessment. Reg. Environ. Change 11, 271–284. doi: 10.1007/s10113-010-0143-3

Bond, A., Morrison-Saunders, A., and Pope, J. (2012). Sustainability assessment: the state of the art. Impact Assess. Project Appraisal 30, 53–62. doi: 10.1080/14615517.2012.661974

Bond, A. J., and Morrison-Saunders, A. (2011). Re-evaluating sustainability assessment: aligning the vision and the practice. Environ. Impact Assess. Rev. 31, 1–7. doi: 10.1016/j.eiar.2010.01.007

Canales Holzeis, C., Fears, R., Moughan, P. J., Benton, T. G., Hendriks, S. L., Clegg, M., et al. (2019). Food systems for delivering nutritious and sustainable diets: perspectives from the global network of science academies. Glob. Food Security 21, 72–76. doi: 10.1016/j.gfs.2019.05.002

Cohen, N., and Ilieva, R. T. (2015). Transitioning the food system: a strategic practice management approach for cities. Environ. Innov. Soc. Trans. 17, 199–217. doi: 10.1016/j.eist.2015.01.003

Costanzo, B. P., and Sánchez, L. E. (2019). Innovation in impact assessment theory and practice: how is it captured in the literature? Environ. Impact Assess. Rev. 79, 106289. doi: 10.1016/j.eiar.2019.106289

Curran, M. A. (2012). Life Cycle Assessment Handbook: A Guide for Environmentally Sustainable Products. Beverley, MA: Scrivener Publishing LLC. doi: 10.1002/9781118528372

Davies, A. R. (2012). “Enterprising communities: grassroots sustainability innovations,” in Advances in Ecopolitics, Vol. 9 (Bingley: Emerald Group Publishing Limited). doi: 10.1108/S2041-806X(2012)9

Davies, A. R. (2019). Urban Food Sharing: Rules, Tools and Networks. Bristol: Policy Press. doi: 10.46692/9781447349839

Davies, A. R., Edwards, F., Marovelli, B., Morrow, O., Rut, M., and Weymes, M. (2017). Making visible: interrogating the performance of food sharing across 100 urban areas. Geoforum 86, 136–149. doi: 10.1016/j.geoforum.2017.09.007

Davies, A. R., and Evans, D. (2019). Urban food sharing: emerging geographies of production, consumption and exchange. Geoforum 99, 154–159. doi: 10.1016/j.geoforum.2018.11.015

Davies, A. R., Weymes, M., and Mackenzie, S. G. (2018). Communicating goals and impacts of urban food sharing. Urban Agric. Magazine 34, 38–40.

Esteves, A. M., Franks, D., and Vanclay, F. (2012). Social impact assessment: the state of the art. Impact Assess. Project Appraisal 30, 34–42. doi: 10.1080/14615517.2012.660356

Følstad, A. (2017). Users' design feedback in usability evaluation: a literature review. Hum. Centric Comput. Inform. Sci. 7, 19. doi: 10.1186/s13673-017-0100-y

Gava, O., Bartolini, F., Venturi, F., Brunori, G., Zinnai, A., and Pardossi, A. (2018). A reflection of the use of the life cycle assessment tool for agri-food sustainability. Sustainability 11, 71. doi: 10.3390/su11010071

Grieco, C. (2015). Assessing Social Impact of Social Enterprises: Does One Size Really Fit All? London: SpringerBriefs in Business. doi: 10.1007/978-3-319-15314-8

Hacking, T., and Guthrie, P. (2008). A framework for clarifying the meaning of Triple Bottom-Line, Integrated, and Sustainability Assessment. Environ. Impact Assess. Rev. 28, 73–89. doi: 10.1016/j.eiar.2007.03.002

Hsieh, H.-F., and Shannon, S. E. (2005). Three Approaches to qualitative content analysis. Qual. Health Res. 15, 1277–1288. doi: 10.1177/1049732305276687

Jenkins, H. (2004). A critique of conventional CSR theory: an SME perspective. J. Gen. Manag. 29. doi: 10.1177/030630700402900403

Johnston, J. L., Fanzo, J. C., and Cogill, B. (2014). Understanding sustainable diets: a descriptive analysis of the determinants and processes that influence diets and their impact on health, food security, and environmental sustainability. Adv. Nutr. 5, 418–429. doi: 10.3945/an.113.005553

Jones, K. R., and Mucha, L. (2014). Sustainability assessment and reporting for nonprofit organizations: accountability “for the public good”. VOLUNTAS 25, 1465–1482. doi: 10.1007/s11266-013-9399-9

Mackenzie, S. G., and Davies, A. R. (2019). SHARE IT: co-designing a sustainability impact assessment framework for urban food sharing initiatives. Environ. Impact Assess. Rev. 79, 106300. doi: 10.1016/j.eiar.2019.106300

Massa, L., Farneti, F., and Scappini, B. (2015). Developing a sustainability report in a small to medium enterprise: process and consequences. Meditari Account. Res. 23, 62–91. doi: 10.1108/MEDAR-02-2014-0030

Milne, M., and Gray, R. (2013). W(h)ither ecology? The triple bottom line, the global reporting initiative, and corporate sustainability reporting. J. Business Ethics 118, 13–29. doi: 10.1007/s10551-012-1543-8

Morgan, K., and Sonnino, R. (2010). The urban foodscape: world cities and the new food equation. Cambridge J. Reg. Econ. Soc. 3, 209–224. doi: 10.1093/cjres/rsq007

Murillo, D., and Lozano, J. M. (2006). SMEs and CSR: an approach to CSR in their own words. J. Business Ethics 67, 227–240. doi: 10.1007/s10551-006-9181-7

Nielsen, J. (2001). First Rule of Usability? Don't Listen to Users. Jakob Nielsen's Alertbox. Available online at: (accessed September 12, 2019).

Nigri, G., and Michelini, L. (2019). A Systematic Literature Review on Social Impact Assessment: Outlining Main Dimensions and Future Research Lines. Cham: Springer, 53–67. doi: 10.1007/978-3-030-04819-8_4

Nigri, G., Michelini, L., and Grieco, C. (2017). Social impact and online communication in B-corps. Glob. J. Business Res. 11, 87–104.

Nijland, H., and van Meerkerk, J. (2017). Mobility and environmental impacts of car sharing in the Netherlands. Environ. Innov. Soc. Trans. 23, 84–91. doi: 10.1016/j.eist.2017.02.001

O'Faircheallaigh, C. (2010). Public participation and environmental impact assessment: purposes, implications, and lessons for public policy making. Environ. Impact Assess. Rev. 30, 19–27. doi: 10.1016/j.eiar.2009.05.001

Potschin, M., and Haines-Young, R. (2008). “Sustainability impact assessments: limits, thresholds and the sustainability choice space,” in Sustainability Impact Assessment of Land Use Changes. Berlin, Heidelberg: Springer Berlin Heidelberg, 425–450. doi: 10.1007/978-3-540-78648-1_21

Prosperi, P., Moragues-Faus, A., Sonnino, R., and Devereux, C. (2015). Enhancing the Impact of Sustainable Urban Food Strategies. Available online at: (accessed May 12, 2022).

Rabbitt, N., and Ghosh, B. (2016). Economic and environmental impacts of organised Car Sharing Services: a case study of Ireland. Res. Transp. Econ. 57, 3–12. doi: 10.1016/j.retrec.2016.10.001

Reed, M. G., Godmaire, H., Abernethy, P., and Guertin, M. A. (2014). Building a community of practice for sustainability: strengthening learning and collective action of Canadian biosphere reserves through a national partnership. J. Environ. Manag. 145, 230–239. doi: 10.1016/j.jenvman.2014.06.030

Rubin, J., and Chisnell, D. (2008). Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. Indiana: Wiley.

Sala, S., Ciuffo, B., and Nijkamp, P. (2015). A systemic framework for sustainability assessment. Ecol. Econ. 119, 314–325. doi: 10.1016/j.ecolecon.2015.09.015

Sanyé-Mengual, E., Orsini, F., Gianquinto, G., Sanyé-Mengual, E., Orsini, F., and Gianquinto, G. (2018). Revisiting the sustainability concept of urban food production from a stakeholders' perspective. Sustainability 10, 2175. doi: 10.3390/su10072175

Science Advice for Policy by European Academies. (2020). A Sustainable Food System For The European Union. Brussels: Science Advice for Policy by European Academies.

Singh, R. K., Murty, H. R., Gupta, S. K., and Dikshit, A. K. (2012). An overview of sustainability assessment methodologies. Ecol. Indic. 15, 281–299. doi: 10.1016/j.ecolind.2011.01.007

United Nations. (2015). Transforming Our World: The 2030 Agenda for Sustainable Development: A/RES/70/1. New York, NY: United Nations.

Ward, M., and Rhodes, C. (2014). Small Businesses and the UK Economy. House of Commons Library Briefing Note SN/EP/6078, London.

Weymes, M., and Davies, A. R. (2019). [Re]Valuing surplus: transitions, technologies and tensions in redistributing prepared food in San Francisco. Geoforum 99, 160–169. doi: 10.1016/j.geoforum.2018.11.005

Whitefield, A., Wilson, F., and Dowell, J. (1991). A framework for human factors evaluation. Behav. Inform. Technol. 10, 65–79. doi: 10.1080/01449299108924272

Keywords: sustainability impact assessment, food sharing, online toolkit, user testing, food system, sustainability reporting

Citation: Mackenzie SG and Davies AR (2022) Assessing the sustainability impacts of food sharing initiatives: User testing The Toolshed SIA. Front. Sustain. Food Syst. 6:807690. doi: 10.3389/fsufs.2022.807690

Received: 02 November 2021; Accepted: 28 November 2022;
Published: 15 December 2022.

Edited by:

Aida Turrini, Independent Researcher, Rome, Italy

Reviewed by:

Sumita Ghosh, University of Technology Sydney, Australia
Silvia Baralla, Research Center Politics and Bioeconomy, Italy

Copyright © 2022 Mackenzie and Davies. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Anna R. Davies,