Open Research Week 2026, held from 16 to 20 March 2026, was a week-long, cross-institutional programme championing the practices, skills and culture that make research more transparent, collaborative and impactful. Delivered in partnership between Midlands Innovation, The Open University and Nottingham Trent University, the week brought together researchers, technicians, librarians and professional services teams who are driving forward open knowledge.
Throughout the week, participants explored groundbreaking research, engaged in thought‑provoking discussions, and uncovered new opportunities for collaboration.
Sessions featuring OU academics included:
A panel discussion on providing innovative firms, including SMEs in high-tech sectors such as the new life sciences and medical technologies, with open research data.
Watch the video recording of the session below:
[0:00] Good morning, everyone, and a warm welcome. My name is Theo Papaioannou. I'm Professor of Politics,
[0:09] Innovation and Development and academic lead for open research at The Open University.
[0:15] It's with great pleasure that I participate in this year's Open Research Week. For those who are joining us today
[0:23] and for whom this is their first event, Open Research Week 2026 is a week-long, cross-institutional celebration of the practices, skills and
[0:34] culture that make research more transparent, collaborative and impactful.
[0:40] Delivered jointly by Midlands Innovation, Nottingham Trent University and The Open University, the programme
[0:48] brings together colleagues who are helping shape the future of open knowledge. And this year our theme is
[0:56] enabling engagement, innovation and impact. So across five days participants are exploring practical approaches,
[1:06] hearing from leading voices and connecting with a vibrant local network.
[1:13] Throughout the week, we are highlighting groundbreaking research, sparking thought-provoking discussions and opening new opportunities for
[1:22] collaboration. So, thank you very much for joining us for this session. Do pop any questions in the Q&A in
[1:30] case we have time to discuss them afterwards. Now, our panel's focus today
[1:37] is very timely. Within the next 60 minutes or so, our excellent panellists
[1:44] will try to answer the question: what is the impact of open research, or open science, on innovation? And I'm really delighted to welcome our panel: Dr
[1:57] Samantha McLean, who is Associate Professor of Bioscience at Nottingham Trent University;
[2:04] Dr Despina, who is a senior lecturer in strategy at The Open University;
[2:12] Dr Adam Varney, who is Senior Research Fellow in Medical Technologies Innovation at Nottingham Trent
[2:19] University; and Professor Matthew Cook, who is Professor of Innovation, also at The Open University. So welcome, all.
[2:29] Now, open research, or open science, refers to the ways in which research ideas, methods, data
[2:38] and knowledge are produced and shared following principles of transparency,
[2:44] openness, verification and reproducibility.
[2:49] And so my first question, just to kick off the discussion, is: what are the ways in which open research facilitates a
[2:58] culture of knowledge sharing? And I would like perhaps to put this question
[3:05] to Despina first, because I'm aware that Despina has worked on knowledge sourcing and exploitation.
[3:12] So, Despina. Hi, good afternoon, everybody. It's a pleasure to be here and to talk about
[3:21] these issues. So, starting from this question: I think having research which
[3:31] abides by the principles of open research does encourage a culture of openness and
[3:38] cooperation. First of all, research is conducted following the principles of transparency, of making data
[3:47] available. Data are curated as part of projects, and of course
[3:53] openness becomes part of the principles that this research follows. So there is care for
[4:02] future impact, but there is also care for identifying collaborators through sharing
[4:10] the data and enabling its further use
[4:17] in other projects. But it's not only about data or sharing
[4:24] publications and outputs; it's also about sharing the tools for doing research, sharing facilities,
[4:31] learning from these processes, and also engaging the broader public
[4:40] in the domain of research. So, to give you an example of some work that has been done at The
[4:47] Open University: I work for the Business School, and colleagues there have worked on
[4:55] the ENCASED project, for example, which is funded by Horizon Europe, and
[5:04] colleagues, such as Francesca Caro, worked closely with
[5:12] a laboratory for the advancement of science in Scotland to create co-creation projects
[5:20] around carbon capture and storage. The specific project
[5:27] was led by the engineering laboratory in Scotland, but it also involved students from a primary school, and
[5:35] the process of co-creation enabled the students to understand the problem and to create solutions
[5:42] which were applicable to their own context. One outcome of the co-creation process
[5:51] is a book, which is available together with some other co-creation projects that were part
[5:57] of CAS. But the co-creation process is not only about engaging
[6:04] partners to understand the problem and what they are facing; it also empowers them through this
[6:13] process to continue pursuing the social
[6:21] goal even after the completion of the project.
[6:25] So, yes, I'll pass on to the other members.
[6:35] Thanks, Despina. I would like to bring in Samantha and Adam, because in the biosciences, but also in the technology innovation sector, there is
[6:43] lots of collaboration, and I wonder whether notions of co-creation
[6:49] apply there as well. So, Samantha? Yeah, I think a lot of our experience is around working with industrial partners,
[6:58] and we're from an academic setting, and I think this openness and open research is really important. As
[7:06] Despina was saying, it is around this idea of that early discussion, that open and honest discussion. Industry often has
[7:15] different drivers and motivations compared to academia, and having those open and honest discussions early really strengthens the
[7:23] collaboration, as I think we'll touch upon later, but it also allows for that trust to be built from
[7:30] an early stage. There are early discussions around IP positions, the ability to disseminate information
[7:37] academically, whether that's in teaching or conferences or publication, and what those positions are. And I think when you have that really open
[7:46] conversation with these other-sector partners, to understand them as well as to show them that they
[7:52] understand your position, you develop what I would call a universal language, because one phrase may mean something very different in
[8:01] academia to what it may mean in industry or in the healthcare sector. So I think initially laying that groundwork to establish that
[8:09] common language, the common vision and goals, is really important to facilitate that knowledge sharing as collaborations continue.
[8:20] Thanks, Samantha. Adam, is it the same in your area, would you say?
[8:26] Absolutely. And I think Samantha hit it on the head. You want those discussions before you start the work. You don't want to be getting down the line, thinking you're
[8:34] going to publish a piece of work, and the company saying absolutely not, this is behind closed doors. They might not necessarily see the value of open
[8:42] research at the start. They might not see the value of publishing until you explain a bit more, especially if
[8:49] someone has gone straight into an industrial setting rather than staying in the academic setting. So it's about
[8:58] getting those agreements in place, it being okay if it's not always open research, but pushing it when it can be,
[9:06] explaining the value, and having that trade-off between IP and dissemination and things like that.
[9:16] Yes, thanks. Well, I'll come to the IP, but before that, I wanted to ask Matt as well, because I know Matt has been working with regional clusters and
[9:25] notions of innovation systems, and I was wondering what the added value of this notion of open research is when it comes to regional interactions,
[9:35] systemic interactions.
[9:40] I'd say that the value is very high. But the issue for me goes
[9:49] to actually doing open research. You hear about it and think that sounds great, the kind of thing I can buy into, but the
[9:56] difficulty is in actually doing it, and it's not necessarily straightforward.
[10:00] I don't think open research is just about making everything available or being entirely transparent, because then people don't understand what you're
[10:08] doing. In my world, which is about transitions to more progressive,
[10:15] hopefully, urban and regional futures, that transition process is about relations,
[10:22] relations between universities and firms as they manifest in space, and you can be open in those relations. That's the
[10:29] messaging which I'm getting here: it is about talking about IP up front. But then also, if you're going to engage a
[10:37] university, as opposed to some faster consultancy that is quicker to deliver results,
[10:46] you're probably going to get involved in the dirty business of problematisation.
[10:52] And then how do you work through that relation between different partners, and
[11:00] how do you identify problems which you both consider worthy of resolving? How do you make the research available that you've collectively
[11:09] developed, perhaps through a process such as co-creation?
[11:12] And then, how do you not just make it available, but translate it into a form that could be used by a variety
[11:20] of actors? That's actually quite hard. It's a difficult sort of manifesto, isn't it? I can see Samantha nodding, because you've
[11:28] actually done it; you've tried to do it, and it's very hard. So I think, yes, it's a really valuable concept and great practice, but tough.
[11:39] Yes. So we'll come to the practice, and there are a few questions about what kind of evidence we
[11:48] actually have that open research is working in practice. But because you all mentioned co-creation,
[11:57] you all mentioned co-production, this is a process for maximising
[12:04] impact, of course, and in the UK open research has been used to promote university-industry collaboration.
[12:14] And so it seems now that it's a very important part of the government's
[12:21] mission, the idea of the new industrial strategy, the
[12:28] objective to deliver economic, social and environmental benefits. And so the question I had
[12:36] was: how does open research facilitate the development of these new linkages, new
[12:43] networks, and the learning processes which run through those networks?
[12:50] I mean, we are talking about innovation systems perhaps not at the national level
[12:57] but more at the regional, at the local level, where all these interactions take place. So I
[13:06] would like first to ask Adam about that: how systemic are all these interactions,
[13:14] in terms of being facilitated by open research, or are they just random?
[13:21] I think that's quite a tricky question to answer. I suppose the linkages, without open research, don't
[13:31] happen, and it's very rare that innovation comes out of one organisation. So I suppose the more
[13:38] we open it up, the better; but how systemic it is, or how random it is, I'm not too sure. I suppose it depends on
[13:46] the individuals looking for those connections and looking to be open about their research.
[13:53] So I don't know if that answers it. Is it the case that you need champions of open research when it
[14:02] comes to collaborations for advancing a new
[14:09] technology, for example a new medical device or something like that? Potentially, yes, because I think not everyone
[14:18] wants to be open in their research, maybe because of a lack of understanding. And if you have someone who really understands what you can be
[14:26] open about and what you can push, that can obviously help develop collaborations earlier and help you learn from
[14:33] different kinds of institutions more quickly. A
[14:41] technology isn't always going to come straight out of a university;
[14:44] you might need an industry partner or a healthcare partner. And if someone is not too sure about how they can be open, having someone who
[14:52] really does understand that, who understands what to protect, what not to, and what you can push, can help progress how quickly you can innovate.
[15:04] Right. Thanks, Adam. Matt,
[15:08] would you say that industrial partners within clusters or regional innovation systems
[15:17] are potentially more reluctant to think about open research, or do you think that they're actually supportive of it?
[15:30] I think it depends on the cluster. I think it depends on what kinds of things the cluster is working on, what
[15:37] kinds of relationships they have. I mean, presumably you cluster in space because you actually want proximity, and you want to be able to talk to other
[15:46] people, and that may be about accessing shared specialist services, such as financial services, but you may do
[15:53] entirely different things. Or you may cluster around a university because you want to make a connection
[16:01] around the research which it can produce and make openly available, or make available through other channels, or you may just want
[16:10] the address of being near to a university, because that makes your company somehow feel more high-tech. So
[16:16] it's very complex; think about what firms actually do. An instructive example may be in the area
[16:24] of smart cities, because smart cities put up sensors to monitor traffic and
[16:31] what's going on in a city. You then collect those data, you analyse them, you put them in a hub, and then you make
[16:40] those data available. So, in essence, research on the city becomes open. But one of the problems there is that you just
[16:47] make the data available; your knowledge, your actionable knowledge, and then your wisdom on how you might manage the system, well, that's
[16:54] something else. And then you're starting to think about those relations again, and actually the practices
[17:02] of an intermediary. I think good intermediaries, and I'm sure we have many in the room here,
[17:11] know how to do that translation process, but it's almost tacit: you get a feeling for how to build that relationship, how to
[17:19] build trust. And I think that's what really matters in something like the smart city, but also in industrial clusters. I don't think you can just get
[17:27] a load of buildings, make your research openly available, and you've got a cluster. It needs to be catalysed in some way.
[17:34] Those relations need to be mapped, and there's a skill in catalysing them through intermediaries.
[17:41] Yes, that's a very important point, actually. I was just reflecting on research we did some time ago in the Cambridge biocluster, and
[17:50] it was clear that there was lots of sharing of knowledge, but it was happening informally. So I was actually wondering whether open
[17:58] research makes things better in terms of making some of the data formally
[18:06] available, in addition to the interactions which happen anyway within clusters like bioclusters in the biosciences.
[18:18] Samantha, do you have any such experience? I mean, that open
[18:26] research makes these interactions more formal, for example, in the engagement, apart from the
[18:35] informal sharing of knowledge that's happening?
[18:38] Yeah, I think there are different spheres you can work in. Like you say, there's that one-to-one, where a research group and
[18:47] an industry-specific business partner begin to work together, and I think what we see in terms of businesses is that once they have a successful
[18:56] academia-industry collaboration, they are almost certainly going to reach out to other academic
[19:04] groups with different expertise, and they can maintain those relationships because they see the value of them. I think the difficulty is
[19:10] in relating that value to the industry partners. Some are very open to it; as you say, different sectors,
[19:18] different areas. Some are very open to university-business cooperation,
[19:22] and that's all facilitated by open research. Others are more resistant to it, perhaps having had bad
[19:30] experiences in the past. I know that many universities, probably all universities, are pushing these connections, but it's important to make meaningful connections, as you say.
[19:41] So it's in developing those skill sets. This is a different skill set for academics than academic
[19:48] research, that foundational and fundamental research within a single discipline. It requires that openness in communication, the
[19:56] ability to talk with these different-sector partners, and I think the more formal structure
[20:05] is around developing those academics in their ability to
[20:12] then go out and work with these different-sector partners, to understand how these different sectors operate,
[20:19] and their drivers, their motivations, as well. So I think there's structural work that has to be done to formally
[20:26] begin to support more routine university-business cooperation, and I think it does happen more successfully in
[20:34] certain clusters than others, and on a one-to-one basis. But one of the major things that we come back to a lot, working with industry
[20:41] partners, is that idea of "as open as possible but as closed as necessary". I think there's that risk that an industry partner might think, oh,
[20:50] you're just going to give everything away, or you're working on different timescales, and that's not true. And that's that early conversation, that fundamental
[20:58] building of relationships, which in many cases has to be interpersonal, because you have to know you can work together as a cohesive team.
[21:09] Yes, thank you. Well, I'll come to the question about the trade-offs. But I was actually
[21:17] wondering whether open research is happening, let's say, at the stage where it's more blue-sky,
[21:27] so to speak, before moving to translating certain findings into industrial products. Is that the case?
[21:41] Yes, I think that's, like you say, the blue-sky research, that fundamental research. I think it's incredibly important that there's that level of openness there. You
[21:51] know, we're all working to move beyond paywalls and to freely promote our research. And that really
[21:58] gives trust and confidence, I think, to other partners, other potential collaborators; that openness they see, and in many cases are
[22:06] attracted to, says this is something I can work with. If your research is not open, then how do people approach you? How do they know that they can work
[22:13] successfully with you? And I think, as you say, that comes from the blue skies,
[22:18] the fundamental research, and the proper dissemination of that in a structured format that is open, all
[22:27] the way through to the translational aspects. Yeah.
[22:30] Yeah, thanks. Despina, any views on that? Since you've done substantial work on
[22:38] strategies, I was actually wondering whether some firms strategise on this:
[22:48] that, at the beginning, when they collaborate with academia, they are much more willing to think about
[22:56] data in open terms, whereas, of course, in later stages
[23:03] perhaps they become more rational and think more in terms of "as closed as necessary", as Samantha said.
[23:14] Yeah, I think this is very true, in the sense that when there's a new technology which everybody
[23:22] wants to advance quite quickly, before it reaches any commercialisation stage, everybody has the incentive to
[23:29] pool resources together and advance the technology. We've seen that with biotech and pharma
[23:36] firms: pharma firms who wanted to replenish their pipelines with new drugs collaborating with biotech firms, and collaborating
[23:46] with the universities where that science base comes from. So this is very true in
[23:52] science-based sectors. Of course, then issues regarding IP
[24:00] come to create problems, and, as Samantha said, these discussions have to come early, because you never know
[24:09] when the research will lead to something which could be commercially viable; there's a lot of serendipity in
[24:18] these cases. So you cannot anticipate that the project will remain, so to speak, blue-sky. So what happens if,
[24:27] let's say, some of the results do indicate that there is a commercial opportunity? I think,
[24:34] in practice, some results that we've produced with secondary data show
[24:41] that firms are, in a sense, quite secretive, and this is how they manage their IP: they prefer to
[24:49] manage it in a secretive way. You see around you technology firms and platforms that release technology partially;
[24:58] we're talking about partial openness rather than full openness, so that there is control over the standards but also
[25:06] control over quality, because openness sometimes might not allow the orchestrator of a technology, of an
[25:15] ecosystem, to really control the quality of the contributions. So it can be faster, and it can go in many
[25:23] directions, but where is the control? And also, the issue of preparedness was
[25:31] raised earlier, and I wanted to add to that, in terms of what we have seen as well:
[25:38] in firms that open up their processes, there's a lot of resistance from the staff, from
[25:46] research staff, due to identity issues. If you're a researcher and you work for a big firm, your job,
[25:53] the way you conceptualise your contribution, is that you find the problem and you provide an answer to the problem. What if that
[26:01] problem is solved externally? That creates an identity problem. What do you do? What is your
[26:09] contribution if the solutions are found externally? Is your contribution just to scan the landscape and
[26:16] find who can provide this contribution? So these issues come with
[26:22] crowdsourcing projects, where indeed the crowd can help define the problem and
[26:30] define the solution, but then, would these new ideas be accepted? What if these new
[26:38] ideas come from non-professional researchers? So there are a lot of barriers in this process.
[26:47] Yeah, that's very interesting. You raised the issue of identity, also in relation to IP, and
[26:54] also this other issue that we talked about:
[27:02] I was wondering whether open research really does anything that
[27:11] promotes better commercialisation of scientific breakthroughs. Right? Is
[27:20] there any data? Do we have any data suggesting that, by introducing principles of open research,
[27:29] this enables the process to move faster towards commercialisation of scientific
[27:37] breakthroughs? I don't know, Samantha, you're shaking your head. Anything from the bioscience perspective? Yeah, I
[27:46] mean, I can speak from my experience having worked with SMEs, and one of the ways that open
[27:53] research has really worked for us is that, once the technology, whatever it is, is protected, so we can speak about it
[28:02] more openly, we can actually use the data we generate from the academic standpoint and environment, and
[28:09] our expertise, to strengthen the evidence for regulatory bodies and potential
[28:18] investors. One of the SMEs that we collaborate with has to crowdfund for part of their
[28:26] resources, and the evidence that we can provide to them about the product is really effective in the crowdfunding.
[28:34] I think that strengthens that area, but also, in terms of regulation, when they
[28:42] see that rigour applied, perhaps expertise that you couldn't get in a small or medium enterprise
[28:49] environment, the expertise and the know-how that we can bring to a project to generate that data, I
[28:56] think it really does push the commercialisation forward, and certainly that's the feedback we've had from many of the smaller companies that we
[29:03] work with. I don't know how that expands into the larger companies that have more resource, but certainly, from our experience, I think
[29:12] getting that exposure for the companies, being able to showcase their results and their data openly, has really helped them.
[29:21] Thanks, Samantha. Adam, is it the same case with medical technologies innovation?
[29:30] I think, yeah, Samantha hit most of the nails on the head there. And sometimes, as well, there's just that aspect that, if we're working with an
[29:38] academic interest and publishing as well, the company gets something that's peer-reviewed and not just done in-house with three members of staff that they're then
[29:46] trying to push to the next level. So the data can be trusted a bit more when
[29:53] pushing it to the next step. Obviously, there are good cases of the opposite of that, where you need to be fully closed as well. It's not
[30:01] that open research is fully the right thing at all times, as I think people have
[30:08] started to touch on. But I think, yeah, it can really help build that trust and provide different areas of
[30:16] expertise. Like Sam said, an SME can't have an expert in every single field. So if you
[30:25] can practise open research and bring people in, and then go and show an investor, or go and show a regulatory body, that you've thought about aspects,
[30:34] even if it's just added value: making a medical device and then looking at a potential risk of infection when
[30:41] you're introducing the device into the healthcare setting. So you can really add a lot of value with open research, and with the kind of
[30:50] transdisciplinary expertise that you can't have within one group, unless one person truly is an expert in
[30:58] everything, which is quite difficult. The open research can really help there, I think.
[31:05] Thank you. And coming to you, Matt, again, because of your work on
[31:11] clusters, I was wondering whether there are
[31:19] clusters which practise much more open research and data sharing than other clusters, and whether these clusters
[31:27] are more likely to translate scientific findings into
[31:36] products for commercialisation. Would you say that this is the case? Obviously, trust is quite
[31:45] important, but would you say that clusters which tend to be much more open than others are more likely
[31:53] to commercialise products from the research that is taking place there?
[32:01] Definitely, because they have built up a capacity to collaborate, to be able to
[32:09] take knowledge and convert it into technologies or breakthroughs which are of commercial value. So the actual
[32:16] clusters and sets of relations work, and I think that often happens when you have a cluster around a university. And, to Adam's point
[32:25] there, what really matters is legitimacy: the legitimacy of the knowledge which they may then go and move on with.
[32:33] For an SME, a small company, that may be a massive investment, and if it doesn't pay off it may end up breaking the
[32:40] company. So they have to be very confident that the knowledge they move on with is legitimate, and as it's being produced not just by the
[32:48] university but in peer review through the academic community, that gives them a basis to move on. They'll also be trying to attract investment
[32:55] from different sources, and they won't be able to get that investment unless they're as sure as they can be, because innovation, you know, is about
[33:04] placing a bet on the future, and the odds have to be in your favour.
[33:09] Yeah, absolutely. I mean you mentioned and um obviously for for for somememes I mean knowledge
[33:18] can be really expensive. So from that point of view open research and accessing open data should um make
[33:27] things better for for them at least within within clusters. I mean there is a colleague of mine who is arguing that
[33:35] actually uh openness can be uh something internal and or something external. So
[33:43] um he's he's arguing about knowledge club. So a cluster as a knowledge club within which you know open research um
[33:50] is is thriving but outside of course the cluster it's it's not happening. So but for SMEs inside the cluster
[33:59] surely open research must be something very positive
[34:11] and I was also thinking because you know I mean in in our world which is a world of
[34:20] of AI now and and and and machine learning um open research in in some areas
[34:29] including new life sciences often faces
[34:36] uh the objection of of security risk. So there are some tradeoffs you have already uh
[34:44] touched upon tradeoffs. So I was actually wondering about that. What are what are
[34:52] the challenges or indeed the tradeoffs between open research,
[34:58] data security and commercial interest especially in those areas that industry needs to be involved
[35:07] doing collaborative work with universities. So if university researchers are committed to open
[35:15] research, what are the tradeoffs when it comes to to security? So for example in in life science and biosciences Samantha
[35:23] have you have you come across issues of of of of um of commercial interest or or
[35:32] um security issues uh in your research. I mean I think it's certainly possible.
[35:39] I think uh we um with a sort of the big data we don't I don't do so much around the artificial
[35:46] intelligence side of things, but with the big data that we work with, genomics, transcriptomics, things like that, um again
[35:54] the the the what we would build in there is um that openness so the registered
[36:00] repositories with unique identifiers. So if that work is uh potentially used, it is already identifiable as the source
[36:10] material is yours. Um I think there's obviously again a trade-off with it may be that the industry partner does not
[36:18] want that very raw source material to be used openly because um it sort of protects some of their interests. So uh
[36:28] then that's where the the agreements have to come into place as as to and the expectations have to be managed as to what we can do what we can't do and
[36:35] that's one of the situations where you might get a little bit of conflict between sort of academic priorities and
[36:41] uh for example industry priorities where protection is key for perhaps the industry partner until they secure
[36:49] something along a certain timeline. Um however the academics have their own metrics they need to um deliver
[36:56] against and so may be pushing for more openness. So I think that's a really important conversation to have and I
[37:04] think with this very fast-moving world that we're in uh around artificial intelligence and and machine learning I think it's really important to
[37:12] understand those aspects as you move into them and to again have those early conversations and set those expectations. Um, I think that's
[37:21] probably key. I think it's probably a little bit more difficult now um to sort of protect the IP than it has been
[37:28] traditionally and I think we're all in a responsive mode where we need to get ahead as the new technology as it develops. As I say, I'm no expert in in that area. Maybe others on the call are,
[37:39] but um I certainly recognize that as a as a challenge and something we need to be aware of.
[37:45] Yeah. Thanks. And Despoina, for sure, I mean companies do strategize about those tradeoffs and they try to
[37:53] anticipate but the question is how some of these tradeoffs get resolved. Um is
[38:01] it through negotiations or is it through avoiding some some types of of contracts when it comes to
[38:10] uh open open uh research and uh conflicts between open research and and IP. For sure some companies place very
[38:19] very high um you know value on IP because they depend on IP.
[38:27] Yeah, absolutely. I think these are things that you know kind of like discussed in these collaborations in advance but of course you know not
[38:35] everything can be anticipated. We have seen over the years a rise of co-patenting. So you know patents
[38:44] belonging to more than one you know kind of like you know institutions or you know proper acknowledgement on you know on the patent that you know the the
[38:53] research was conducted elsewhere and uh you know the IP is appropriated by you know another organization.
[39:01] And this is you know this is important for kind of like uh declaring where you know the knowledge you know has come
[39:08] from. Um so this is what when it comes to IP. But of course uh you know on a on a on a on another level you know
[39:16] confidentiality is you know very very important and um you know universities when um you know the kind
[39:24] of like you know public uh you know research uh that comes from these uh kind of a you know collaborative
[39:32] projects. Um there is uh you know always a care there for you know for anonymity uh for ensuring that you know reputation
[39:41] is not you know is not damaged or you know case organizations are not are not are not mentioned. Uh so to give an
[39:50] example let's say of a colleague that um has developed a tool for verification and validation in complex manufacturing
[39:59] this is uh Kadija Tahara in the business school. So he opened the tool freely to some SMEs
[40:06] uh before the tool uh is commercialized uh fully, and of course SMEs uh you
[40:13] know wanted to use it there were some constraints there in terms of like not finding time to experiment with the tool uh but the idea of this innovate UK
[40:23] funded project was to uh help us understand the efficiency that can stem you know from the use of that of our
[40:30] tool and of course you know uh Kadija gathered data about you know kind of like how you know the tool can can be
[40:38] applied and and the results that come from that. There was a lot of like you know issues around uh companies being
[40:46] quite aware of like you know not wanting to release some results which might you know damage their reputation uh in
[40:55] terms of like you know their own you know processes for validating you know uh their you know products for you know
[41:02] for customer uh use. Um but this is essential. So this is another step you know of openness where you kind of like
[41:09] you have a process innovation and you want to kind of like you know test it and explore it before you actually you know commercialize it. Um but you know
[41:18] care needs to be given about uh about you know how how data is used. Um I have more to talk about like you know the
[41:26] governance of data as well but we can leave this for you know for later.
[41:32] Thanks Despoina. And I was actually wondering as well whether some of these tradeoffs get um better
[41:40] mitigated uh in um in in clusters in in in systems which basically uh companies
[41:48] are you know in in better proximity. I mean I don't know. I'm just putting this as a as a as an open question. I don't
know, Matt, would you say that some of these tradeoffs um get better managed or or mediated
[42:04] within a um a technology cluster environment?
[42:11] I was just considering some of the comments made in the chat and the most recent one actually and I'm going to have to give you that answer. It depends.
[42:19] It depends on the nature of the sets of relations and the companies involved and the sources of knowledge. It's very very
[42:27] kind of difficult to say, and I was sort of thinking through, as we were talking about open research, and I
[42:36] was being provocative with myself, I'm sure I'm not the only person that does that, am I, thinking: we make our research available and then kind of what
[42:44] is the work of that research in the innovation process, what does it do? And so the innovation research is littered
[42:53] with case studies where something has been invented then there's been an innovation process and it's the innovator that has become known. So
[43:00] Diesel and MAN Trucks. MAN Trucks really made the diesel engine, if you like; Diesel invented the idea. So are we
[43:08] talking about the very early stage of the concepts which may emerge or which may stimulate the concepts and what's
[43:14] the what's the point at which you decide not to be open and how do you protect
[43:22] and how do you kind of make that decision at that point? Is it about control, is it about revenue, is it about gaining investment, and you have to
[43:30] protect your IP at that point for those purposes so it's quite complex actually and I I feel in some respects that we haven't
[43:38] really sort of got to the bottom of these things in research to actually understand open research. What is the work of open research in the innovation
[43:45] process in clusters or otherwise? I continue to be challenged as to why there are clusters, because proximity is
[43:53] achieved in digital space. Like today we've got proximity, we're interested collectively, we have a set of topological relationships, but why do we
[44:02] actually cluster in space topographically? It's still quite unclear actually, yeah.
[44:09] uh thanks thanks Matt. I mean I'm I'm just reading uh Muriel's question. Um
[44:18] and um she raises this this question whether you know open research can make it harder actually to license or um uh
[44:27] patent or commercialize um in in some areas. I mean I I don't know uh what
[44:34] about the area of of of medical um technologies innovation? I mean, is it harder to license when you come from a, you know,
[44:46] uh, open research perspective and and you defend open research? Would you say that, uh, Adam?
[44:55] Uh, yeah, absolutely. So, you have to be very careful that you protect your IP first, for example, and publish later.
[45:00] Um, you don't want to be in a position where you're publishing your technology and then it blocks your patent, for example. um that's a very bad position
[45:08] to be in. Um so, it's almost a bit controversial, but it's not the case that open research is always
[45:16] positive. Um it's about how we use it and using it appropriately. Um and there being, which has been brought up quite a few times now already, but there
[45:25] being that balance between um what we need to protect and and what what we can what we can disseminate um and making
[45:34] sure really you protect the the the company's um IP and interests to a level where it's going to generate revenue
[45:43] back into the economy. You don't want to be really open in your research and it devastate a company and then it not drive additional innovation or
[45:51] additional income into the UK or elsewhere later on I suppose. Um so it's always a trade-off; it is
[45:59] difficult, as Matthew was saying.
[46:02] So I think I think the key word here is um balance as as you said. Uh bear in
[46:09] mind that the the pros and the cons of of open research and perhaps uh thinking about open research in different stages
[46:19] um in the blue sky stage but then afterwards um making sure that we we do
[46:25] achieve this uh balance when it comes to uh you know uh taking on board certain
[46:32] uh interests. Uh absolutely and of course um as I was saying before I mean
[46:38] um we do have AI now we do have um uh machine learning uh technologies they're
[46:46] all using data they're all uh get trained in um you know with with with
[46:53] data and I was wondering to what extent the values of open research are actually
[47:02] compromised by AI uh and the fact that certain models can
[47:09] perhaps train on data that is out there available and but but you know
[47:18] um uh potentially for for other purposes and I I just wanted um your views on on
[47:24] that. I mean, is it the case that open research can be potentially compromised
by AI, and perhaps does the combination of open research and AI raise even more
[47:39] tradeoffs and security issues? Um any any views on that? Um Samantha, I'll go first again to you.
[47:51] I I think I think it is um it's a learning curve. I think there is opportunity and also danger uh there. I
[47:59] think uh we've seen instances uh sort of in the press where you know machine well generative AI has utilized something
[48:07] that that was you know hidden or but found somewhere by the generative AI. So I think it's it's about very careful
[48:14] management of of of your research and and your assets and making sure that what needs to be protected does
[48:21] genuinely stay protected. Um and then uh what can be opened up is then opened up.
[48:27] I mean I guess the I guess there's that caveat now that you need to be uh sort of aware that if you make something open
[48:34] and and uh that that it could be used without any sort of uh accreditation to your work. Um because generative AI
[48:44] commonly just takes that information um and will synthesize new ideas based on what was being asked by the user. So
[48:51] I think it's um again it feels like the very early stages and I think everyone probably across sectors is trying to uh
[48:59] sort of keep pace with the advances as they're coming in. Um and certainly um in the sort of higher education sector
[49:06] we're no different. We are rapidly developing and redeveloping and advancing policies and and uh practices as we go, and I would imagine it's similar uh in other areas.
[49:18] Thank you Samantha. Um, Adam,
[49:23] yeah, I think I think the AI question is really hard. It's so it's so fresh as well, isn't it? And and I I think I
[49:32] think it's just going to require a bit more education between the AI usage and the open research usage and making sure
[49:42] when AI is used, it's acknowledged and it's not a replacement for the expert.
[49:47] It's the expert using it as a tool. And yeah, I I don't know if I I have much
[49:55] more to add without waffling. I I find it a really controversial topic to talk about cuz it's so useful um but can be used so
[50:04] wrongly um also rightly um and I think we're just at the start of that kind of process, aren't we?
[50:11] Yeah. Absolutely. Um Despoina, any any views? Yes, I I would kind of
[50:18] like um reverse the argument and like not uh discuss about uh how AI can be
[50:26] used in research but what open research can do for AI AI models not necessarily
[50:33] uh you know generative AI. So the you know there is an argument there that you know the availability of you know open
[50:41] data of data that can be you know machine readable can help uh and support
[50:48] the development of AI models that would be you know more accurate more specific.
[50:54] Um so, as a member of the uh you know panel community at the Open University, I was you
[51:02] know given the opportunity to attend the European open science cloud uh meeting which you know they are trying to create
[51:09] you know a governance system for um for data to be uh from you know European funded projects but also beyond to be
[51:18] made available on a cloud and be used by by by by researchers. And the idea what was discussed there was that you know
[51:25] such data if it is like you know machine readable can actually form the basis for
[51:33] you know competence uh in AI technologies uh you know for you know the European continent
[51:40] um and to kind of like you know this can be linked of course you know in you know economic competitiveness and so on but
[51:48] of course you know the issue of how this data is used is kind of like you know very central to these argument and that's why there need to be like you know governance you know arrangements.
[51:59] Uh so a governance mechanism which you know would you know allow or disallow the use of you know data or ensure that
[52:07] data is not misused or data is used for you know kind of like the public benefit.
[52:14] uh I mean there's all you know we misinformation and whatever is right around us we don't want let's say such
[52:21] kind of like you know open data to be used to support arguments uh which are not you know which are not valid. So
[52:28] that's why governance mechanisms are very very important, and I think you know universities in Denmark that engage a lot in
[52:37] kind of like you know industry-university relationships uh this is kind of like you know the thing that they are putting forward uh you
know a mechanism where someone that would like to get access to this industry-university you know data applies uh
[52:52] to gain access, and there is a you know a governance committee which they kind of like would
[52:59] see uh you know the aims of of that research and depending on these aims you know they would allow access or not. So
[53:07] in some ways you know this governance mechanism ensures that there is fair you know use uh but also creates you know
[53:15] kind of like you know a database of projects that are based on the same data. They do not duplicate you know each other. They build on each other and
[53:23] also creates uh creates a kind of like a a forum for you know for industry to find more collaborators you know kind of
[53:30] like attracts people. Um so it is it is very complex and we are at the start of it but you know it's exciting.
[53:38] Thank you Despoina. Matt, the same question to you.
[53:43] Well I would say that uh there's no technology which is good or bad actually. Uh it's us, we make them good
[53:50] or bad, and so it depends on the set of sociotechnical relations in which they are embedded, and Despoina and uh colleagues have pointed out that
[53:58] actually it's their governance and uh steering them towards the public good which again is a slippery subject. What is progress?
[54:08] Yeah.
[54:08] And trying to make us more progressive, that's that's really important, but one of the myths of AI is that there are no humans involved and that AI is going to
[54:17] replace humans and that really is a myth and ultimately it's still a technology.
[54:21] is still a tool that we use and we're going to have to learn to live with and supervise AI and this is going to be one of the processes which we have to
[54:29] supervise AI prevent it from going over the cliff perhaps and uh see how we can use it usefully to wrangle lots of information.
[54:38] Thanks Matt. Um yes I I did ask this question because I was actually um thinking about the possibility of
[54:47] misuse: academic data that is out there can train a model and then it can be used for all sorts of
[54:55] different purposes than the initial intention of the academic um research or or the objective. And I'm just
[55:02] looking at the at the chat and Muriel um uh put a comment saying AI is quite a problem for our national trusted
[55:11] research and innovation agenda. There are certain things we're legally probably not allowed to make open. Um
[55:18] and and and that's probably true. Um now we have um about 5 minutes left and I
[55:26] would like to give the opportunity to our audience to ask a question if there was a burning question for our panel.
[55:34] So, uh if if there is a question, uh just uh uh put it in the chat or um in
[55:43] in the Q&A and we'll just try to address it or I don't know if if the mics now
[55:52] are are working for uh our audience to ask directly.
[56:04] So, Muriel,
I have enabled your mic, so you should be able to turn your mic on. Thank you. Yes, it's now working. It was disabled.
[56:26] um really interesting to listen into your conversation and I think it's one of those questions and one of those
[56:33] examples where I think it's really useful if research management professionals and others come together because I think what we've established
[56:41] is that there are lots of things here we don't necessarily have answers to suggesting that hive minds are good
[56:48] collaboration is key and that there is an opportunity for us to take this topic a bit wider than what it has been discussed at not just a local but
[56:57] perhaps also a national level. So I think your your panel here is a really really interesting one and it's flagged a number of issues where I think I will
certainly take it back to my knowledge exchange community and propose it as a topic within the professional research and knowledge exchange management sphere. Um but perhaps
[57:15] something we could look into and developing further I would suggest because it's really quite important we think it through um more carefully than
[57:22] what we have done in the past. So thank you for organizing Thank you, Muriel. Any any response to Muriel?
[57:30] I think I think that's absolutely spot on. And and if you don't know, go and ask if if you're not sure about your IP whether to be protective of it or open,
[57:40] go and ask your IP team or go talk to your open research representatives and and yeah, collaboration is key.
[57:48] Yeah. And we're very nice. I promise you. We we're not there to stop you from doing things. We're always trying to be helpful.
[57:58] Thank you, Muriel. Um and um just before we conclude, here comes a
[58:05] challenge that was um put by uh Katy Good, who says, "I wonder if the panel might
[58:13] attempt to give one sentence on what might be a guiding principle for open research and innovation." Oh, that's a
[58:22] difficult uh task for a conclusion, but anyone who wants to respond?
[58:32] I think I I can maybe hazard a a very simple one. I think for me, openness,
[58:39] respect, honesty, shared values and priorities from the outset is uh what is key to me for for open research.
[58:48] Thank you, Sam. Thank you. Okay. Well,
[58:51] um time is up. Uh I think our discussion leads to the conclusion that um yes,
[58:58] open research is is crucial for innovation, but it's an ideal that has to be balanced. Um and in in reality,
[59:08] there are trade-offs which need to be addressed uh and even institutionally need to be
[59:16] addressed. So um once again I would like to thank uh all of you our panelists uh
[59:25] excellent panelists uh for the um insights and the thought-provoking discussion I must say today uh but also
[59:33] our audience uh thank you very much for attending. Please don't forget that um there are various other events uh taking
[59:42] place this week and tomorrow we have our second keynote. So uh thank you thank
[59:47] you very much for attending. Um thanks
This session highlighted how academic careers can be jeopardised by the inappropriate use of AI in academic publishing, often driven by publishers' growing appetite for papers.
Watch the video recording of the session below:
0:14
Welcome, everyone, and thank you for joining us today.
0:17
Before we get started, could I just check that everyone here can see and hear me OK?
0:22
If we get one or two thumbs up, that would be perfect.
0:28
Thank you.
0:29
Take that, everyone can hear me.
0:30
So my name is Jamie Wells.
0:32
I'm the Marketing and Communications Manager at Midlands Innovation and I'll be helping to facilitate the session today.
0:39
I'll kick us off with a short introduction before handing over, I guess, in just a moment.
0:44
So Open Research Week is a week long.
0:50
Excuse me one second. It is a week long cross institutional celebration of the practices, skills and culture that make research more transparent, collaborative and impactful.
1:00
Delivered jointly by the 8 Midlands Innovation Universities, along with Nottingham Trent University and the Open University, the programme brings together colleagues who are helping to shape the future of open Knowledge.
1:10
So across five days this week, we'll be exploring practical approaches, hearing from leading voices and connecting with a vibrant network.
1:18
Throughout the week, we're highlighting groundbreaking research, sparking thought provoking discussion and opening new opportunities for collaboration.
1:27
So thank you for joining, for joining us for this session, which explores the unintended consequences of using AI in academic publishing.
1:35
The the session will be delivered by Angelo Saltino from the Open University.
1:39
and Paul Moolacaba from the University of Birmingham.
1:42
Please note that today's session is being recorded and the recording will be made available along with the presentation to delegates after the event.
1:52
Finally, for me, do pop any questions in the Q&A throughout the session.
1:56
We'll try to answer those at the end.
1:59
Do note that you're muted and cameras are set to off to help the quality of the session, but we can unmute if needed.
2:07
So without further ado, let's get started and I'll hand that over to my colleagues.
2:15
Angelo Saltino.
2:16
Yeah, I'll show my screen.
2:24
I trust you can see my slides, right?
2:27
Yes.
2:28
OK, perfect.
2:30
So just in case, let me know if something happens, because I don't see what's happening in the chat.
2:35
I'm just seeing on my screen.
2:37
So thank you everybody, and thank you for joining us in this session, which is named The Unintended Consequences of Using AI in Academic Publishing.
2:48
So what's the promise today?
2:52
So we're going to walk through some of the things that you may already know, but some others you may not know.
2:58
There are a lot of promises, and I hope that we live up to those expectations.
3:04
So we're going to work through competition and misconduct, some of the predatory entities within the scientific ecosystem, as well as the emergence of Gen AI and what Gen AI can do to further exacerbate those kinds of issues.
3:21
And what has been the community response and what we can learn from it from the apologies, from the meta science of research.
3:30
So I would like to start with a bit of a preamble here.
3:34
So what this talk is not about, we are not here to demonise AI.
3:38
Actually I am a researcher in AI and it will be sort of idiotic for me to say that AI is is the evil here.
3:50
But actually AI is here to stay.
3:52
We should leverage it as much as possible, but our kind of warning here is to make sure that we use it in a sensible way.
4:02
So this talk wants to inform all of us about the uncritical use of AI and what the consequences are, so that we can learn from them.
4:10
Now I will just pass the words to Paul.
4:14
Yeah, Thank you.
4:16
Yeah.
4:16
So to, to start this off, we want to root some of these, these issues that we're discussing here today
4:20
in the general shift of the publishing model that started kind of like at the beginning of the 2000s and really took a lot of pace about 10 years later, with a movement towards Open Access that radically changed the way that the scientific publishing industry works.
4:37
So there's been a shift from the traditional model towards an Open Access model.
4:42
So the traditional model was primarily based on subscriptions.
4:46
So the science that was published in a journal was only available to academics at an institution that would pay for a subscription to that journal, and it was not available to kind of like anyone else, anyone who's not working in those institutions.
5:02
One aspect of this is that the scientists, the authors, would not pay a fee for publishing their work.
5:09
It was all kind of like covered under the general subscription that the university would pay for.
5:14
There was a a general peer review system with high rejection rates and ultimately the journals would be delivered in a in a real kind of like printed format that that people could read as an actual paper.
5:25
But now most of the publishing is done Open Access.
5:28
So that means these physical journals practically don't exist anymore.
5:33
They might not exist, but articles are mainly read digitally, and by now most journals have moved away from the subscription format and moved towards an Open Access format where the money is paid not for a subscription, but rather for each article that is being published.
5:52
And yeah, so some journals have also kind of like changed or even removed the peer review aspect. But this is just something that's really important to bear in mind, this change in the publishing landscape.
6:07
I think what we want to emphasise here is that this has very positive intentions behind it, that people want to make science available not just to academics but to the general public.
6:18
But unfortunately it can create very false incentives for the publishing industry.
6:22
So if we can go to the next slide.
6:26
So it's important to bear in mind that publishers, the, the, the companies that that publish scientific articles, they are companies looking for profit.
6:34
So their their ultimate aim is not just to promote academic research, but they want to make money with it ultimately.
6:42
And this change from a subscription-based policy towards publication-based fees unfortunately creates a really bad incentive for publishers, because for them it now becomes profitable.
6:55
The more papers they publish, the more money they will make, rather than the old model, where the more interest exists in the journal because they publish high quality research, the more subscriptions they will sell.
7:07
So this is something that can have a lot of unintended consequences and has led to, yeah, a lot of things that we want to discuss here.
7:15
But like this quote on the bottom just really encapsulates this, this shift that kind of like this Open Access movement has a lot of positive intentions and also a lot of positive outcomes like making science more accessible to to non academics, to the general public.
7:31
But it has led to, yeah, a lot of things that are kind of like negative consequences.
7:39
If we can go to the next slide, one of those things is what we refer to as predatory journals.
7:44
So those are journals that prioritise profit over academic integrity.
7:50
So those are journals that essentially just want to publish as many papers as possible.
7:54
And this is something I really encountered the first time when I was a PhD student and kind of like dealing with journals from like the, yeah, I don't want to name the journals, but kind of like particular journals that, yeah, have kind of like one special issue after another, where they keep emailing authors and inviting any kind of submissions, kind of like on particular research topics.
8:20
And for me, the eye opener was when I entered this from the other perspective, reviewing for such special issues, when I was reviewing a paper that I thought was very poor quality.
8:32
There were really major flaws with it.
8:34
And the editor ultimately just urged me, like, can you please finalise your decision?
8:40
And I said, like I, I don't think I can support accepting this paper in, in, in the state like without collecting any additional data and so on.
8:47
And like, ultimately the decision that they made was to replace me with another reviewer who would be fine with accepting the paper.
8:54
So it really seems like in some of these journals, this is not an open-ended review process.
8:59
Any paper that will be submitted and where the authors are willing to to pay the publication fee will ultimately be published.
9:05
So this is obviously a very negative process that kind of like really is a big departure from the original publishing process.
9:15
If we can go to the next slide, please.
9:18
And yeah, so this is just an example, an e-mail that Angelo received here.
9:23
This is what these emails might look like: you get a cold e-mail, often very unspecific, inviting you to submit papers to journals that might be just tangentially in your area of expertise, sometimes even completely outside of it.
9:38
So I'm a neuroscientist.
9:40
Sometimes I get invitations to submit to astronomy journals or whatever; that is as far as this gets.
9:46
And there are new journals coming up constantly.
9:50
The number of journals is increasing.
9:52
The number of papers that are being published in these journals is increasing constantly.
9:56
And this is just because of this new area of profit for the publishing industry.
10:02
Next slide, please.
10:04
And there are not just predatory journals; there are also predatory conferences.
10:08
So this is another example of an e-mail. Any academic, as soon as you publish your first papers, will get lots of these emails in their inbox: invitations to conferences that are just completely outside of your area of expertise.
10:26
And this is just a similar side effect: this is a money-making industry that benefits from as many submissions as possible.
10:35
And it is really something that undermines academic quality in the end. For these kinds of events and journals, the only thing that matters is quantity; there is nothing really about the scientific content or its quality, which used to be different in the original model.
10:56
Next slide, please.
10:57
I think we might be.
11:00
Yeah.
11:01
And yeah, here are just a few summaries, and a few links that you can follow.
11:10
As the subscription costs have disappeared, or in some cases gone down, the publishing fees at the same time have been constantly rising, to really extreme levels: to have an article published Open Access, you would typically have to pay fees in the thousands of pounds.
11:31
And this is a huge amount of, ultimately, taxpayers' money that goes straight from the taxpayer to the publishing industry, just to make research available to the public.
11:47
So this is a huge problem at the moment: there is a huge money flow that is meant to support academic research, but it is ultimately just supporting for-profit publishing industries. If you think about it, the same money could be used in a much more constructive way to support academic research.
12:13
OK, I think this is, yeah, all on me now.
12:18
So, OK, so far you have seen one facet, which is the thirst for money, the thirst for profit, by certain entities that do not necessarily have science at heart.
12:31
But there is of course another side of this coin, another facet, which is the competition, or the way we as researchers are assessed, right?
12:40
So the academic world is very competitive.
12:44
We compete for jobs, funding, prizes and awards to advance our careers.
12:50
And even if you want to go for an upgrade or an advancement in your career, you are assessed, and typically this is done through certain metrics, like, for example, number of publications, number of citations, h-index.
13:08
And we are often reminded of the publish-or-perish mantra.
13:13
And what happens is that this kind of behaviour goes very much in line with what the journals want, right?
13:24
So it loops back and creates this vicious cycle in which, you know, we are exposed to this competition.
13:36
And those kinds of journals, those predatory entities, are there to fulfil this competition.
13:42
So we are feeding each other on this.
13:45
However, one of the things that we need to remark here is that those metrics are nowhere near a quality assessment.
13:53
They are just measuring quantity.
13:55
And also, these metrics are highly dependent on what kind of database we are using.
14:02
For example, certain research assessment frameworks in other countries use Scopus; in some others, they use more open databases, and so on.
14:15
Or many of you have been using Google Scholar.
14:17
And many of you may have found that the h-index doesn't match across all these platforms, because it highly depends on what kind of content they index.
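The dependence of the h-index on the underlying database can be made concrete with a short sketch. The function below implements the standard h-index definition; the two citation lists are made-up illustrations of how the same author can score differently depending on what each platform indexes.

```python
# The h-index is the largest h such that the author has at least h papers
# with at least h citations each. Databases index different content, so the
# citation counts fed into this function (and hence the h-index) differ
# across platforms such as Scopus, Web of Science and Google Scholar.
def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical citation counts for the same author as seen by two platforms:
# one also indexes preprints and book chapters, the other does not.
broad_index = [25, 18, 12, 9, 7, 5, 3, 1]   # h-index 5
narrow_index = [22, 15, 10, 6, 4]           # h-index 4
```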
14:28
And of course, as I mentioned to you earlier, this competition goes back and feeds into predatory content and so on.
14:38
And of course, it leads to potentially what is called misconduct.
14:41
So people try to circumvent the rules of publishing, and, you know, as Goodhart's law says, when a measure becomes a target, it ceases to be a good measure.
14:52
So, you know, if at some point citations have become the metric for us to get, I don't know, a particular thing, then OK, we can game it.
15:02
As you can see in this profile, what this person has done is simply uploading preprints on ResearchGate, and then Google Scholar will fetch this information and boost up the citation score.
15:17
So those are the kinds of things. Other things you can do: in the past, you could buy followers on social media.
15:28
Now you can buy citations.
15:30
So there are citation mills that allow you to do that, and also other types of misconduct, such as classic plagiarism, for example.
15:41
And I'm showing this example here just to show what the possible consequences are.
15:49
Because, you know, one of the things that can happen is that you can easily lose your job as well as your dignity.
15:59
And what I'm trying to say is that misconduct can always happen, but of course we need to also look at the consequences.
16:09
Other players in this spectrum are paper mills.
16:14
I don't know if you're aware, but paper mills are third parties outside the author-publisher relationship.
16:25
So there are other parties, paper mills, that are basically trying to produce as many papers as possible, typically automatically generated nowadays.
16:35
And these papers are fraudulent.
16:37
They are fabricated and sold to researchers and academics in order to boost their publication record.
16:45
And of course, because they're fraudulent, they're not true.
16:49
They do not represent the reality of science.
16:53
Basically, they are threatening research integrity.
16:56
So what they're doing effectively is that they are polluting the literature.
17:00
They slow down scientific progress because, for example, people may assume that whatever these papers are saying could be the cure for a particular disease, whereas, being false, it does not cure anything at all.
17:13
And what it does is slow down therapies.
17:17
And in this particular context, journals and authors are both victims of the system.
17:24
Journals are victims in the sense that
17:28
they receive a lot of fake papers, although eventually they profit from them anyway.
17:34
And authors are victims because eventually, at some point, they will be found out, and then their career is at stake.
17:43
So how do paper mills work?
17:45
So there are certain fields that are particularly susceptible to this, mostly because there are a lot of things that need to be discovered and there are fewer resources.
17:59
For example, there are a lot of open-ended questions.
18:02
So because there is not enough labour, not enough scientists, to reproduce or investigate those areas, paper mills target these areas very much, because their papers will go uninvestigated.
18:22
There are also guest-edited special issues.
18:24
So, as Paul mentioned earlier, there are publishers, or journals, that have one special issue after another.
18:34
So they try to target those special issues, because those are the places where there is less control from the journal, and they are more likely to succeed.
18:45
There are also other ways paper mills can succeed, for example by bribing editors.
18:55
So paper mills can offer money to editors in order to get publications accepted in their journals.
19:03
So how do paper mills get money?
19:05
Well, they have their own advertising campaigns, basically advertising authorship positions.
19:12
So you have a paper that has been produced somehow (we will show later how), and then this paper will be attached to a number of authors, and you can buy your position.
19:24
Of course, the first and last positions are more expensive, because notoriously those are considered the more prestigious positions within the authorship, whereas the central ones are cheaper.
19:38
So far we have seen a bit of the spectrum of misconduct, right?
19:51
We've seen predatory journals; we have seen authors engaged in some form of misconduct, through plagiarism but also other kinds.
20:01
And we have also seen other entities, like paper mills. OK, but all this existed even before GenAI emerged.
20:10
They have already been there.
20:11
They were working through different patterns, different strategies, and so on.
20:16
So the message that we want to pass on here is that now, with GenAI, with generative AI, things start to be a little bit more serious, because GenAI actually boosts up this kind of behaviour.
20:33
Because nowadays GenAI can easily generate minor paper variants.
20:39
It can basically create a new version of a paper simply by being asked to rewrite it, so that it does not get detected by plagiarism checks; or it can automatically generate literature reviews, generate text paragraphs, or fabricate images. And I'm sure you have somehow tested the capabilities of GenAI, because GenAI has been here for a few years now.
21:09
And I'm sure you have tested some of its capabilities.
21:13
And so the point that we are trying to make is that with this new technology, paper mills, or misconduct by some malicious researchers, can basically be boosted up.
21:35
And here are some generative abuses.
21:38
Some of you may have seen the rat with the big balls that was published by Frontiers.
21:45
And here one thing that we should highlight is that, OK, the authors misbehaved, and that is one thing.
21:55
It's kind of understandable.
21:56
There is competition.
21:57
OK, so they bear the blame, up to a point.
22:00
There is competition, and then of course authors try to find shortcuts in order to get their things done, right?
22:09
But what is alarming is the fact that this paper went through some reviewers, went through the typesetters or type checkers, and went through the editors, and none of them was able to see or realise that this figure was fake in all aspects.
22:30
And then the paper got published, and only when it got published did people start realising that it had serious issues.
22:39
Other kinds of GenAI abuses: on the right-hand side, for example, a copy-paste gone wrong.
22:49
Somebody just threw some content into GenAI, asked it to do something, and then GenAI produced that content and left in some stray sentences, as well as the 'Regenerate response' text.
23:02
So the authors not only blatantly copied the text from GenAI, but also forgot to remove the markup text that GenAI adds, and even the text of the 'Regenerate response' button.
23:16
This is kind of hilarious in a way.
23:19
There are some countermeasures, we must say: publishers nowadays are asking authors to disclose their generative AI use.
23:32
And this is not to devalue the paper somehow or in any way.
23:41
Because nowadays we have kind of normalised the use of GenAI.
23:46
GenAI is there, it's helping us.
23:48
It's democratising, in a way, publishing, allowing anybody to publish high-quality content.
23:58
Because imagine researchers from other countries where English is not their mother tongue: GenAI is helping them improve the way they present their concepts and their results.
24:09
So it's a good thing, in a way.
24:11
So these policies, these countermeasures, are not there to devalue any content at all, but just to increase transparency.
24:20
As a reader, I want to know exactly what I am reading.
24:24
Am I reading something that has been automatically generated, or something that is genuine?
24:29
OK, so journals, or publishers, are setting up these kinds of disclosures.
24:38
And I must say that at the moment, this kind of policy is a little bit of a minefield, because depending on the journal, you need to be aware of what their policy is.
24:50
There is no unified policy yet.
24:53
So it depends on where you want to go.
24:56
You need to check exactly what's allowed and what's not, and you need to familiarise yourself with that content.
25:01
So our role now is also about familiarising ourselves with those policies.
25:07
But in general, what is being asked
25:13
is just to declare the use, and also not to list GenAI as an author, because GenAI cannot take accountability for the research, for what has been done.
25:26
And also, for the conclusions, only authors are responsible.
25:30
And of course, you're not supposed to lie, because if you get caught, your paper is in breach of policy and is likely to be retracted.
25:41
So what we have seen so far is that, on one end, there is the pressure to publish, OK, which is due to the way we are assessed, to the publish-or-perish mantra.
25:56
It's upon us, on our backs, all the time.
26:01
And then there is the journal thirst for profit.
26:04
So this is creating a toxic and dysfunctional culture, but it's not all doom and gloom, because there is a response from the community.
26:16
The community has been responding, and this has happened
26:19
even before GenAI, because, again, the thing that we want to stress here is that misconduct and predatory journals were there before the emergence of GenAI.
26:30
So how has the community been responding? There are integrity sleuths, or integrity investigators. What do they do? They scan papers, looking for bad ones, or somebody tips them off: can you look at this paper? And basically that paper goes through an investigation, and they check whether there are signs of fraudulence, manipulated images, plagiarism, or fake peer reviews and fake reviewers.
27:02
So there are some patterns that have been learned through experience, and these integrity sleuths basically go through the paper and spot those signs.
27:14
And the problem is that at the moment it's not scaling.
27:19
The number of sleuths is so low compared to the number of papers that are currently published.
27:24
Because nowadays we are publishing so many papers that there is not enough human power to go through all of them.
27:35
So, you know, the call that these people are making is that anybody can be a steward of the scientific literature.
27:42
If this fascinates you, you can go to corsic.net and learn how to become an integrity investigator.
27:51
So another kind of community response is that we need to inform researchers about this, about misconduct and misbehaviour, and we are learning as we speak; this talk today aligns very much with that mission.
28:10
So universities should invest in education on paper mills, so that you are aware of how to spot them and do not fall prey to them.
28:20
And of course we need to train staff and everybody, and also funders should thoroughly investigate and hold accountable researchers, or anybody, connected with paper mills.
28:32
Other responses have been a little more pragmatic; for example, there is Retraction Watch.
28:38
So Retraction Watch is a blog that reports on retractions of scientific papers.
28:44
They investigate individual papers.
28:46
For example, there has been a fraudulent paper, and they say why that paper is fraudulent.
28:51
Or, for example, they do a large investigation.
28:55
Also, for example, on the left-hand side you can see a particular paper being investigated, and this particular paper contains what is called a tortured phrase: rather than saying 'mean squared error',
29:08
the authors just wrote 'mean square blunder' to circumvent the anti-plagiarism check, right?
29:16
So to overcome that check, they just renamed something that is commonly known in the literature as mean squared error to something else.
29:24
Those are called tortured phrases.
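The way sleuths screen for tortured phrases can be sketched in a few lines: keep a list of known bad paraphrases of standard terms and flag any text containing them. The phrase list below is a tiny illustrative sample ('mean square blunder' is the example from the talk), not a real screening list, and the function is a toy, not the actual tool sleuths use.

```python
# Map of known tortured phrases to the standard term they paraphrase.
# Illustrative sample only; real screening tools use large curated lists.
TORTURED_PHRASES = {
    "mean square blunder": "mean squared error",
    "counterfeit consciousness": "artificial intelligence",
}

def flag_tortured_phrases(text):
    """Return (tortured phrase, expected term) pairs found in the text."""
    lowered = text.lower()
    return [(bad, good) for bad, good in TORTURED_PHRASES.items()
            if bad in lowered]
```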
29:26
So Retraction Watch investigates
29:28
these kinds of cases, which go from one single paper to, for example, a collection of papers.
29:38
For example, as you can see here on the bottom right.
29:43
A whole journal had collected $100,000 for papers that it later retracted.
29:52
So this kind of analysis goes a little more in depth, looking at what bad practices are happening there. Or, for example, Web of Science delisted a whole journal because this journal had been attacked by paper mills.
30:09
So Retraction Watch, again, is a blog that has been set up by former editors, because they realised that, you know, there is a lot of misconduct.
30:18
So they wanted to hold people to account, to expose this misconduct by informing the community, so that we can learn and be more aware of these kinds of things.
30:35
Another important resource that we have out there is the Retraction Watch Database, which is very much attached to Retraction Watch.
30:42
So Retraction Watch is a blog.
30:44
This one is a database, and what it does is collect all the papers that have been retracted.
30:51
And it's a unique place where you can find the list of all the papers that have been retracted.
30:57
And the reasons for them being retracted: because they plagiarised, or, for example, there were concerns about images.
31:09
And of course, it lists the authors, the original DOI of the paper, and so on.
31:16
And this could be interesting for some researchers, to analyse the reasons that lead people to act in this particular way and eventually get a paper retracted for misconduct.
31:31
Another resource is PubPeer.
31:34
So PubPeer is an online platform.
31:36
You can see it as a sort of social medium, OK, like Facebook or Instagram, in which every paper gets its own instance, like a page.
31:47
And you can comment on these papers, in a sort of post-publication peer review.
31:55
So you can look at a given paper and see what comments people have given later on: maybe they just have questions, or concerns, or doubts, or maybe they are just praising the paper.
32:10
All right.
32:11
But one particular feature of PubPeer is that it allows users to post anonymously.
32:17
And thanks to this feature, it becomes an opportunity for a whistleblower to raise concerns about any particular paper.
32:31
And this is because it allows them to raise questions without fear of retaliation.
32:40
And then, once a concern has been raised, potentially with that evidence, you can go to the journal and ask for retraction.
32:51
So the journal will then investigate properly, taking all the evidence, including PubPeer, contact the authors with an investigation, and then eventually possibly start a retraction.
33:03
So in general, PubPeer is considered at the moment as a kick-starter for the retraction process.
33:11
Here are some examples along the same lines as I showed you earlier.
33:15
So again, GenAI.
33:17
So here GenAI is saying: 'Certainly, here is a possible introduction for your topic.'
33:24
So the authors again forgot to remove that; very lazy, very silly.
33:29
And then the paper got published in an Elsevier journal, Surfaces and Interfaces, and then somebody asked for, I don't remember.
33:39
This is, I think, future work: 'as of my knowledge cutoff in September 2021'.
33:44
There are new discoveries, and so on.
33:46
So here they forgot 'Regenerate response', or: 'yes, this is actually the future scope'.
33:54
'As an AI language model, I cannot predict the future.'
33:58
So those are the kinds of examples that we need to be really aware of.
34:03
So yeah, GenAI is a tool.
34:08
We can use it to improve our language, but of course, as authors, we need to remember that whatever goes into the paper, we need to be accountable for that content, and we need to always check and re-edit whatever GenAI throws at us, because you should never trust it blindly.
34:27
At the end of the day, they are just predicting the likely next word.
34:32
So it is not necessarily representing reality, or the truth, which is what science wants.
34:41
So here, we're about to close.
34:44
And so, the aftermath of a retraction.
34:48
So we have seen so far what can happen, right?
34:55
If you're not careful enough, and if you use GenAI in a careless way, you can likely get a retraction.
35:07
What is a retraction?
35:08
Well, you can probably sense it already, even if you're not aware of it.
35:11
It's like your paper gets a huge red banner that says 'retracted', and basically it goes straight onto your CV, somehow.
35:22
So you end up in the database that I mentioned, the Retraction Watch Database.
35:27
So your name is going to appear there eventually.
35:30
And that is effectively like a black mark, something that stays with you permanently, right?
35:38
And then, eventually, what is the aftermath of this retraction?
35:44
Well, in some cases we have seen that dignity is at stake, as well as the career.
35:51
Like the plagiarism example that I showed you earlier, at the beginning: that researcher lost his position because he was racing against time.
36:06
He was doing a lot of things at the same time.
36:09
So he thought of copying a particular paper.
36:12
So he not only lost his dignity, but he also lost his career.
36:15
And there are a lot of examples.
36:17
If you go on Retraction Watch, you can read about a lot of people that lost their careers because of misconduct, right?
36:31
So your career can be compromised, or journals can temporarily or permanently ban you.
36:38
And here are links that you can follow.
36:41
By the way, the slides will be shared.
36:42
So you are more than welcome to dive further into all the links that I've shared here. A bit of meta-science of retraction.
36:52
So, in general, what happens is that scientists with retracted publications are typically young researchers, maybe researchers that are not completely aware of what the consequences are.
37:11
Maybe they are young, not experienced, and kind of short-sighted about what the potential consequences could be.
37:22
The particularly interesting aspect is that those are the ones that are affected the most, and the reason they are affected the most is that, technically, they are the easiest to spare.
37:32
You know, they are the culprits.
37:34
They are young; we can send them away.
37:36
It's very hard, perhaps, for an older professor to be sent away, because their career is more established, more stable, so a minor hiccup will not affect them much.
37:49
But for younger ones, it's something we need to open our eyes to.
37:54
And I consider myself a young researcher as well.
37:56
We need to really open our eyes to this.
38:00
So the takeaway message here is: resist the current high pressure to churn out large numbers of papers.
38:09
Remember: yeah, sure, quantity is being assessed to some extent, but remember, quality is key.
38:17
Beware of journal special issues.
38:19
Check the guest editors.
38:20
Journal special issues have got their own limitations.
38:26
Avoid salami publication.
38:27
So try to avoid slicing your work into small atomic papers just because you want to get as many publications as possible.
38:39
And be mindful of what you cite, because, yeah, what if you cite something that looks dodgy in principle, right?
38:51
And that particular paper is core to your approach.
38:57
You certainly don't want to establish something on top of that, because if that paper gets retracted, then your paper becomes kind of hostage to something meaningless.
39:10
You would be building something on a castle of sand.
39:13
And of course, if you're using Zotero for managing your bibliography, Zotero has got plugins for helping you identify whether a paper is problematic or has been retracted.
39:33
So you can take advantage of that.
39:35
There are a lot of technologies nowadays that you can use to be a little more careful about what you cite.
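The core of what such a retraction-checking tool does can be sketched as a simple lookup: compare the DOIs in your bibliography against a list of retracted DOIs (the Retraction Watch Database distributes such data). This is a minimal sketch, not the actual Zotero implementation, and all DOIs below are made-up placeholders, not real records.

```python
# Cross-check a bibliography against a set of retracted DOIs.
# DOIs are case-insensitive, so both sides are normalised to lower case.
def find_retracted(bibliography_dois, retracted_dois):
    retracted = {doi.lower() for doi in retracted_dois}
    return [doi for doi in bibliography_dois if doi.lower() in retracted]

# Made-up example data (placeholder DOIs, not real papers).
my_refs = ["10.1234/abc.2020.001", "10.5678/XYZ.2021.042"]
known_retracted = ["10.5678/xyz.2021.042"]
```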
39:42
So the takeaway message at the end is: AI, and particularly LLMs, are here to stay.
39:49
We are not here to demonise them but to encourage their usage.
39:53
But you need to be really careful.
39:55
So you need to use them sensibly.
39:57
Remember to take accountability, so that whatever you write, you own it.
40:05
Then declare the use according to the policy of the journal that you're publishing in.
40:11
So, of course, again, read the journal's policy, because at the moment every journal has got their own policy.
40:19
Typically, if you go to Elsevier, there is a kind of general Elsevier policy.
40:24
But of course, there are many publishers out there, not only Elsevier. And remember that a retraction is a black mark that stays with you forever.
40:34
So be diligent and strive for excellence.
40:37
A special thanks to all these people; I've learned a lot through them, and today's session would not have been possible without their guidance.
40:54
So that's all from my end.
40:56
I will stop sharing and I will open the floor for Q&A.
41:03
Thank you, Angelo and Paul for a really interesting presentation there.
41:07
We've got a few minutes for questions. For those on the call, please note that we may run over, and if we do, we'll try our best to answer those questions within the chat, so you can return and read the responses.
41:19
So I'll read a couple out now. From Claire Hedges:
41:22
Does a pay to publish model tend to exclude scholars from the global majority?
41:29
What is global majority in this case?
41:44
Should we go to another question and come back to that one?
41:47
Can you, can you repeat it?
41:48
So I'll make an attempt to reply.
41:52
So does a pay to publish model tend to exclude scholars from the global majority?
42:04
It really depends.
42:06
It really depends, because we need to define what 'global majority' means in this case.
42:13
But there is an effect there, which is, of course, that pay-to-publish has created this kind of system in which journals are asking for APCs, and there is evidence that APCs tend to grow yearly.
42:31
So of course not everybody can afford that.
42:35
So this creates a problem, and indeed that's why we have not fully transitioned into open access.
42:45
Pay-to-read is still an existing model.
42:49
Yeah.
42:49
And I would say, yeah, each of these models, the traditional and the Open Access model, excludes certain people from access to research, just in slightly different ways. Even if articles can be made generally available to the public, publishing this way is only an option for scholars that have a certain financial backing from their institution.
43:11
So these high article processing fees can definitely be a barrier to inclusivity for some scholars.
43:19
So I would say there is an inclusivity issue with either publishing model.
43:24
It's just at different levels.
43:28
OK, next question is, is the push for Open Access publication part of the problem here?
43:35
If journals were incentivized to return to a curation model, there would be no incentive to flood them with fraudulent papers.
43:41
But they can't do this as long as they're paid per article rather than being able to charge readers.
43:46
Do the speakers have any ideas on how the open research community can continue to support access to research without contributing to the quantity over quality incentives causing these problems?
44:02
That's a very good question, yes.
44:03
So, as we said during the talk, Open Access has got its own merit, which is being open to everybody.
44:18
So anyone in society can read any paper, but of course that kind of opened the back door to these kinds of incentives.
44:29
So Open Access is not going to go away.
44:32
We just need to find a way to fix it.
44:36
So, do the speakers have any ideas on how the open research community can...
44:39
Yes.
44:45
So one of the arguments, which is actually a solution that has been proposed, is to get away from for-profit journals.
45:02
That's one of the solutions.
45:04
Now there are solutions like, for example, going through community-led journals, or institution-led journals, or country-led journals.
45:14
So those are kinds of alternatives.
45:16
But actually, the concrete actions are minimal at the moment.
45:21
And the reason why people don't want to transition to this sort of model is that the journals we have at the moment have developed a sense of prestige.
45:34
So say you are publishing a paper in a particular journal; in my case, for example, Information Processing & Management.
45:42
And sorry if I name a journal; I don't have anything against Information Processing & Management.
45:45
OK.
45:45
So if anybody here is an editor there, no blame intended; but you know, it's a very good journal, right?
45:52
So it's a very prestigious journal.
45:54
And at the end of the day, as a researcher, you are motivated to go there, because you can say: oh, I have a very good paper published in this journal.
46:03
Whereas if you set up journals that are institution-led, it will take time to develop that prestige.
46:10
Maybe they will not even be able to develop this prestige, I don't know.
46:16
So, you know, there is this challenge: to break that mentality of prestige and move to the alternative.
46:25
I don't know, Paul, if you have anything to add.
46:28
I completely, completely agree with that point that I think like nobody argues against the idea of Open Access.
46:33
I think everybody stands behind that.
46:35
The problem is if there are Open Access fees that stand in no relation to the actual cost of of handling an article and that there's kind of like this direct money transferred to the to the journal industry.
46:49
And yeah, the, the problem, I mean, there are many great initiatives for, for journals that really try to, to minimise this, this for profit aspect of it.
47:00
The problem is really that the academic system has come to rely on this journal prestige as a main marker of esteem that kind of like in all all kinds of recruitment processes, the way that the the CVS of candidates are, are initially evaluated is just by looking at their, at their list of publications and see kind of like in what journals have they published.
47:19
This is a way to outsource the evaluation of the quality of an article to the journal.
47:25
And this is something that I've spoken about with quite a few people, who say they don't want to move to a system where they as an institution would actually have to read all those papers to get a good sense of how impactful the science is.
47:38
But this is just something they can say:
47:40
we can just look at where they have published.
47:42
And this is a really terrible practice, but this is just the way that the academic system has made itself completely dependent on journal prestige as a mark of esteem.
47:58
This is something that a really high proportion of academics feel attached to, because this is a system that they've been socialised with.
48:04
And this is, I think (I mean, this is my personal opinion), what has to change: that we move towards more community-led journals that don't have this for-profit aspect to them.
48:20
And that we also try to transition academic institutions away from journal prestige as the main marker of esteem, towards somehow forcing people to actually look at the original science. For example, saying that for each candidate you have to read at least one article, rather than evaluating CVs in these shallow ways.
48:43
So I think that's the only way forward.
48:45
But it's really hard, because it's a complete shift in the way that the system works at the moment.
48:50
It's really hard to ignite this kind of paradigm shift.
48:56
OK.
48:56
Next question was from Helen Turner.
48:59
Do you think there is sufficient training for early career researchers on the care needed when using Gen AI in authorship?
49:10
That's a good question.
49:15
This kind of session is something that we are holding in my university as well.
49:21
So it's kind of an initiative that I am pushing within my university to make sure that everybody is aware of this.
49:29
There are some activities, but I'm not aware of other universities.
49:33
What's the extent of the training being done in other universities?
49:37
But if it's not done, it's something that we should definitely do, because again, today we have seen the bad side of what can happen.
49:49
But of course, we need to train also on how to use it properly, and on how to use Gen AI to get the best out of it.
50:01
So that's another thing that we should do as well.
50:04
Yeah, yeah.
50:06
And I think the challenge is just that this technology is developing at such a rapid pace that, yeah, the question is a little bit: who should lead all that training when we as academics are just catching up with new technology developments?
50:21
So I completely agree with the premise of the question that this is really critical, but it's just really hard; even in our assessments at the university, it's really hard to keep up with the pace of AI development.
50:34
So this is definitely a big challenge at the moment.
50:41
OK, I've got two more questions.
50:42
Hopefully everyone can stay online for the last two we have got.
50:48
The next question, from Gwen Kent: is AI being used during all parts of the scholarly communications process, e.g. funding applications, peer review, editing, promotion, marketing, metadata?
51:01
How can we ensure integrity across the entire process, beyond the final result, i.e. the published article?
51:11
That's another good question.
51:13
So in general, yes.
51:15
Actually, I'm in the process of making an application for UKRI funding.
51:20
And the call says generative AI is allowed as long as it's used carefully, right?
51:25
So for funding applications, it is allowed. What about peer review?
51:29
There are a lot of tools out there where you can upload a paper and have it peer reviewed.
51:36
Now, this doesn't necessarily mean they are used within journals or conferences yet, but they can be used by authors to self-check a paper before submission as well.
51:49
Because one of the challenges there is that conferences and journals cannot really use automated tools to do peer review, because they don't have the copyright to upload these papers into Gen AI.
52:04
So they would be in breach of copyright if they did that.
52:09
But authors can. Yeah, promotion, marketing, metadata: yes, AI is being used across the whole process.
52:16
The problem is that we only see the interface there.
52:20
The interface is the document, the paper or the funding application.
52:24
We don't know exactly what the extent of Gen AI being there is. But actually, being a researcher in AI myself (perhaps I need to be careful about how I phrase this), in general, AI is a tool.
52:47
We have been using AI for ages.
52:49
Even if, you know, you do classification of images to identify, I don't know, a particular section in a scan, or, for example, to classify text.
52:58
So that's AI as well.
52:59
So Gen AI does not change much, right?
53:02
So Gen AI is there as a tool, and as long as authors take accountability, then we need to rely on peer review to make sure that whatever content we see in that interface, which is the paper, is actually rigorous.
53:17
So: was the experiment done in a rigorous way?
53:22
Those are the kinds of things that we need to do to ensure that integrity is still standing.
53:29
Yeah.
53:32
And then there was a follow-on question from Florencia, who had a similar question.
53:36
Do you know how journals are using Gen AI?
53:39
In what specific processes?
53:47
So I'm not aware of journals using Gen AI, or not aware of specific cases, but I am aware that they will start using Gen AI to screen papers, at least to understand whether a paper can be desk rejected or not.
54:09
That can easily be done, because nowadays journals are flooded with papers; the rate of publication is skyrocketing.
54:18
OK, the rate of submissions is skyrocketing, sorry.
54:21
So journals are being flooded, and of course peer review is also under strain.
54:28
So there are not enough people willing to do peer review, and of course you need to sift through all the submissions, and Gen AI can easily be deployed for doing desk rejection.
54:45
Now, I don't see why, in a few years, this trend wouldn't dramatically change, with Gen AI also being a reviewer.
54:54
Why not, for example, a third reviewer to support the process? Nowadays, I don't know about you, but in my journals I get two reviews, because people are not willing to do it.
55:07
So maybe in a few years we will have just one reviewer.
55:11
So Gen AI can be that second reviewer. Or, for example, Gen AI can be used to do meta-reviews, to understand automatically whether there is a positive or negative sentiment about the paper, whether it should be accepted or get major revisions, and so on.
55:26
So I don't have concrete cases yet of Gen AI being deployed.
55:35
Most likely predatory journals are using it.
55:38
But again, I don't have proof of that.
55:40
But I don't see why it wouldn't be integrated. Now, whether that is a problem or not,
55:46
it's something that we need to see and experiment with in the next few years.
55:52
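The meta-review idea described here, aggregating individual reviewer recommendations into one editorial signal, can be sketched as a toy function. The labels, weights and thresholds below are invented for illustration; they are not any journal's actual workflow.

```python
# Toy illustration of a "meta-review": combine reviewer recommendations
# into a single editorial signal. Scores and cut-offs are invented.
SCORES = {"accept": 2, "minor revision": 1, "major revision": -1, "reject": -2}

def meta_review(recommendations):
    """Return an overall editorial signal from a list of reviewer labels."""
    if not recommendations:
        raise ValueError("need at least one review")
    avg = sum(SCORES[r] for r in recommendations) / len(recommendations)
    if avg >= 1:
        return "accept"
    if avg >= 0:
        return "minor revision"
    if avg > -1.5:
        return "major revision"
    return "reject"

print(meta_review(["accept", "minor revision"]))    # accept
print(meta_review(["major revision", "reject"]))    # reject
```

A real deployment would of course weigh review text, not just labels, and, as the speakers stress, a human editor would stay accountable for the decision.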
Thank you.
55:53
I think we'll call this one the last question, just because time is running out.
55:58
So the last question is from Lackner.
56:03
Are there any metrics at all in the academic publishing world that indicate the quality of an article?
56:10
And secondly, how to identify a predatory journal?
56:14
Is there a database with a list of at least known predatory journals?
56:21
OK, so quality is a long-standing challenge to define, and that's why many research evaluation exercises rely on quantity metrics: because they are numbers. Number of citations, h-index, whatever.
56:40
It's easy.
56:40
You rank people, you rank institutions, you rank whatever you want, because you have numbers.
56:47
And actually rankings have raised a lot of issues, because basically they can be gamed.
56:52
And again, it goes back to the same problem of misconduct.
56:57
So quality is a long-standing issue.
57:01
One of the things that we do in the REF is, we have a paper,
57:06
and we look exactly at how the paper has been used, who the people that used it are,
57:11
and what the implications of the impact of this paper are.
57:14
So there isn't a number there to capture quality.
57:19
It's all about narratives.
57:20
It's about how you describe the paper and what kind of achievement it has made, right?
57:30
If you put a number on quality, then it basically becomes quantity.
57:33
So that's the whole point of it.
57:37
So the second question is how to identify predatory journals and if there is a database.
57:48
So there is one database, whose name I don't remember.
57:56
There is one database.
57:58
Maybe I should add it to the slides.
58:00
Yeah, there are some databases, but in general you can spot predatory journals quite easily.
58:15
OK, basically you get solicited via e-mail.
58:18
There is an e-mail in your mailbox that says: why don't you submit?
58:22
And you can easily spot those signs.
58:26
For example, their scope is broad.
58:31
Rather than being specific, they tackle different topics, because of course they want to attract as many submissions as possible.
58:38
Then, for example, they offer you opportunities; they try to lure you in in as many ways as possible.
58:45
For example, they offer you discounts, or they offer quick publication times.
58:54
That's another thing: submissions require time to be properly peer reviewed and analysed.
59:02
If you have a journal that promises that in 30 days your paper will be published, something is wrong there, because there is technically not enough time to do proper peer review handling and then, of course, to respond to it with a rebuttal, and so on.
59:20
So you need to look out for those signs, and I'm sure there are others beyond these. Another thing you can look at is where the e-mail is coming from.
59:31
Is it coming from a Gmail account, or is it coming from an official academic institution, an official academic e-mail address?
59:41
And there is this phenomenon called the uncanny valley, OK: something that looks almost genuine, but not genuine enough.
59:52
So that's called the uncanny valley.
59:54
So you look at the e-mail that you received, and you basically become aware of all the issues that this e-mail has.
1:00:01
That's a sign of a predatory journal.
1:00:03
So that's how you can spot it.
1:00:05
But of course, if you look on Google, there is an extensive list of all these kinds of peculiarities.
1:00:12
The problem, and I'll stop after this, is that once you expose these lists,
1:00:20
basically nothing stops a predatory journal from gaming that list: if you say that a particular feature marks a predatory journal, and a predatory journal doesn't want to be classified as predatory, they will try not to exhibit that particular feature.
1:00:36
So it's another thing.
1:00:38
So that's why it's all about experience.
1:00:42
Yeah.
1:00:42
And just one really quick thing on that.
1:00:45
I think the rule of thumb you can use is: if the journal approaches you and tries to convince you to publish with them, then it's likely a predatory journal.
1:00:53
The really serious journals are the ones that you have to approach and convince that you can publish with them.
1:01:01
That's a really simple rule of thumb that I would follow.
1:01:03
If you get lured by the journal, it's likely not a very serious outlet.
1:01:13
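The red-flag checklist discussed above (unsolicited e-mail, overly broad scope, discounts, implausibly fast publication, a free-mail sender address) could be expressed as a toy screening score. The flag names and weights below are invented for illustration; they are not an established classifier.

```python
# Toy screening score for a journal solicitation e-mail, based on the
# red flags discussed above. Flags and weights are invented for illustration.
RED_FLAGS = {
    "unsolicited_email": 2,         # you were approached out of the blue
    "overly_broad_scope": 1,        # journal covers wildly different topics
    "fee_discount_offered": 1,      # discounts used as a lure
    "fast_publication_promise": 2,  # e.g. "published within 30 days"
    "free_email_sender": 1,         # Gmail etc. rather than institutional
}

def predatory_risk(flags):
    """Sum the weights of the observed flags and bucket the result."""
    score = sum(RED_FLAGS[f] for f in flags)
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

print(predatory_risk(["unsolicited_email", "fast_publication_promise"]))  # high
print(predatory_risk(["overly_broad_scope"]))                             # low
```

As the panellists note, predatory journals adapt once such checklists are published, so a fixed rule list degrades over time; this only illustrates the heuristic, not a reliable detector.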
Thank you so much, Angelo, Paul, for your presentation and for answering the questions.
1:01:18
And thank you also to everyone who joined the call today and contributed to what was a really interesting discussion.
1:01:26
So thank you very much.
1:01:29
I think we can still post questions on here to be answered.
1:01:32
Would that work for you, Angelo and Paul?
1:01:34
Yeah, we'll go through them.
1:01:36
Yeah, fantastic.
1:01:37
Thank you very much.
1:01:39
Just one final note before I say goodbye.
1:01:41
Open Research Week is running through the remainder of the week.
1:01:45
So please do take a look at the remaining sessions and book on.
1:01:48
We hope to see you at some other sessions soon.
1:01:51
Thank you very much.
1:01:52
Have a good rest of the day.
1:01:55
Bye, Bye, bye.
This session talked about how anyone can become a citizen scientist, recording observations of nature such as fungi, British mushrooms and birds, spotting hedgehogs, and even monitoring African leopards.
Watch the video recording of the session below:
[0:00] Good morning everyone.
[0:02] It's lovely to see lots of people joining us for today's session on citizen science perspectives on enabling collaboration, innovation and impact. So
[0:10] yes, good morning and welcome to this presentation on citizen science as a route to collaboration. Thank you for joining us for today's session. My name
[0:19] is Katie Woodhouse Skinner and I'm an open research consultant in uh NTU libraries open research team. Um, and
[0:26] this session is part of Open Research Week 2026, which is a week-long cross institutional celebration of the
[0:33] practices, skills, and culture that make research more transparent,
[0:38] collaborative, and impactful. And this year, it's delivered jointly by Midlands Innovation, Nottingham Trent University and the Open University. And it's a
[0:46] program that is bringing colleagues together who are shaping the future of open knowledge. Um, and that is exactly what we are doing in today's session.
[0:54] Um, so just to give you a bit of context, citizen science has become increasingly important as an approach across disciplines because it expands
[1:02] who participates in research and challenges traditional boundaries between researchers and the communities or organizations that they work with. At
[1:11] its best, citizen science does more than just widen participation. It can generate new forms of knowledge,
[1:18] strengthen relevance and increase the societal impact of research by embedding lived experience, local insight and
[1:24] public contribution directly into the research process. So today's session, as I mentioned, is bringing together
[1:32] colleagues from the Open University as well as Nottingham Trent University to explore how citizen science works in practice across different research
[1:41] contexts. And what makes this panel particularly valuable is it focuses on the lived experience of researchers actually doing this. So they have all
[1:49] used citizen science in their own way and they're all going to be reflecting not only on the outcomes but also the practical realities of designing
[1:57] collaborative research, sustaining participation and creating meaningful partnerships. So, as I say, we'll hear
[2:04] short perspectives from each panelist before we move into a wide discussion about the opportunities, challenges, and
[2:11] potential of citizen science across different research settings. So, please do pop in questions to be answered uh at
[2:19] the end of the session into the Q&A. Add knowledge, resources, comments, queries into the chat. Um and you are set
[2:28] to be muted with cameras off to help with the quality of the session.
[2:32] Um but we can unmute if needed towards the end as part of that discussion session. So without further ado, it gives me great pleasure to introduce our
[2:41] speakers. So Janice Ansine is a senior manager for citizen science at the Open University where she leads projects
[2:49] using accessible digital tools to engage the public in scientific research including major citizen science initiatives focused on biodiversity,
[2:57] environmental monitoring and public participation. We're also joined by three colleagues from Nottingham Trent University: Rachel
[3:05] Leman, Paulina Pauloska and Hannah Jenkins. So this group brings a range of experience of applying citizen science
[3:13] methods across different research contexts: from co-designing with stakeholders in zoos, to monitoring hedgehog populations, and
[3:21] using tourist images to monitor leopard populations. We have a really rich panel today and I'm very excited to hear from
[3:29] each of them. So we're going to start with Janice Ansine. So Janice, over to you.
[3:39] Thank you, Katie. Hello, everybody. Um,
[3:42] I'm just waiting for my slides to come up. There we go.
[3:57] Perfect. So, hello everybody. As I said,
[4:00] I am senior manager for citizen science in the faculty of STEM at the OU, but I'm actually speaking to you in a
[4:07] crossmix of roles today because I'm also completing a professional doctorate in education in the Faculty of WELS,
[4:15] focused on my work. Um, so if you could move to the next slide, please. So, I'm one of those odd mixes of people where
[4:22] I'm a practitioner, researcher, and I popped this slide in not to boast or show off about anything that I've done or what I'm doing, but just to give you
[4:30] a scope of all the types of things I'm involved in, and, in terms of where I am in citizen science now,
[4:40] how important this all is in helping to connect things together, so to speak. So my practice is also my research and my
[4:48] research is also my practice. It's a very odd mix. We can talk about that more in the Q&A probably.
[4:53] But one of the key things is how much all this has come together over the years in terms of what I'm doing now. So
[5:01] if we could move to the next slide. So we all know what citizen science is.
[5:08] I'm assuming we're all here today because we know how important it is. A lot of my work is supported by
[5:15] various principles and guidelines and organizations that have been established over the years. As you know, citizen science has been around now in terms of
[5:22] the term as we know it more so particularly over the past 15 years.
[5:27] I've been at the OU now almost 17 years managing citizen science. So you know I've been involved in it from that stage. So one of
[5:34] the key things is the European Citizen Science Association, and one of the key things I was involved in was the early-days
[5:42] development of the 10 Principles of Citizen Science, and ever since, that has guided how I practice, how I work, what I
[5:50] do. And interestingly, these 10 Principles of Citizen Science have become one of the standards of citizen science
[5:58] globally; they've been translated into multiple languages and utilised in different ways. There have been many iterations of it, different
[6:06] ways it's been shaped, and more recently something has developed around the characteristics of citizen science,
[6:11] but the 10 principles still remain as a guiding framework. And I mention this here because of how important this is and how it's good to
[6:20] reflect on it. As a team of us who were involved in the development of the 10 principles, we actually wrote a paper some years ago about how
[6:28] it's evolved and grown and shaped things going forward. So it's one of the things that we're quite proud to have been involved in and particularly proud
[6:35] of that and that has impacted a lot on the things that I do in terms of enabling collaboration and also
[6:42] innovation and facilitating that kind of impact that we want. The impact that I like to focus on is around the participants:
[6:49] what they get out of the experience, what they feel, what they do, what they experience, how they're involved. And um we know they play
[6:58] various roles. Um I don't like the passive definition that exists in the Oxford English
[7:05] Dictionary, which sees them as passive people just passing on data. I like the inclusivity of citizen science.
[7:12] And as we know there are a whole range of projects that people are involved in, and it's impacted on lots of different areas. The area my work is mostly
[7:20] focused on is biodiversity. I manage different projects, but the one I'll be talking about today is iSpot (ispotnature.org),
[7:28] which is what my professional doctorate research is based on. Next slide please.
[7:38] So, about my study, and there is a lot of text in this, but bear with me. Um so the key purpose of this work is to
[7:46] look at citizen science learning communities: a case study of iSpot. Now I started this work looking at learning
[7:54] journeys but really wanted to explore the full complement of things in terms of how learning is one of the ways that
[8:03] people engage and participate and how that impacts in terms of individuals as part of a community but also the
[8:11] community itself, how that learning evolves, and also how that creates a body of knowledge. So I'm looking at this in the context of a
[8:20] community of practice, and how iSpot could be interpreted in those terms: how, through this collaborative process,
[8:27] it enables things to happen and facilitates that sort of impact. So some of the key things I'm looking at are how these contributions
[8:36] shape learning for individuals, but also how they shape participation and engagement with others, and how that
[8:44] emanates in the development of knowledge and learning. Some of the key terms I've been looking at include engagement,
[8:51] participation, online community,
[8:54] collaboration, motivation, which is a big one we all know about and there are lots of studies that have been done around that in citizen science in recent
[9:00] years. Um, learning, knowledge, citizen science itself, and of course the concept of citizen observatories; and iSpot is
[9:08] used as one of the European citizen observatories. We were involved in a European project focused on this, up to just about two years ago,
[9:16] called Cos4Cloud, and we were one of the exemplar citizen observatories in that, where we tested different technologies and tools to
[9:25] facilitate the development of citizen observatories. So um one of the key things I'm trying to emphasise to iSpot users is that by
[9:33] taking part they'll be contributing to this research project that's trying to understand how learning happens and how knowledge develops in these types of
[9:41] communities, with iSpot being the case study. Therefore this will actually impact not only on iSpot but also on how people
[9:49] understand how these types of communities are shaped and what they can do to facilitate impact in an ongoing way. Next slide please.
[10:00] So what is iSpot? And there's a lot going on here. iSpot is a citizen science platform for biodiversity.
[10:09] Um, it's an innovative tool that sort of takes on that whole practice of innovative technology
[10:18] that the OU is involved in for learning, but used in a slightly different way. We sort of combined things because we wanted to get people to do more and share more, and
[10:26] out of this has evolved a whole range of data, practice, and benefits to the individuals that participate. So the
[10:35] platform has a lot of features and tools where you can create projects, which sort of help to shape what you want to see, how you want to view it.
[10:44] You can explore what's there in terms of looking at the carousels and look at observations that people have posted.
[10:50] You can explore it through the filtered groups, especially the species groups. You can also look at the species browser
[10:57] where you can see some of the things that are there and look at different groups, whether globally or for the UK and Ireland. And one of the key things about iSpot is
[11:05] that your knowledge grows, and you can see that through your iSpot profile area, where you can see your own
[11:12] reputation, so to speak. Um and I'll be delving a little bit more into some of the tenets of how this operates in terms of the four areas of explore,
[11:22] record, identify and learn. And I'll be going into that a little bit more. Um next slide please.
[11:32] So why a case study of iSpot? Um iSpot, your place to share nature, was
[11:40] launched to the public in 2009. It's a global platform. We were part of the original programme called OPAL, Explore
[11:49] Nature, which some of you who are in citizen science will know about.
[11:52] It was one of the earliest big citizen science initiatives across the UK. And we got a big chunk of money from the Big Lottery Fund: it was over 12 million um over six years.
[12:05] And at the OU we got just over 2 million. Actually I started at the OU at that time, when we got that pot of money
[12:13] in, and helped to manage this programme and initiate it. So the aims of iSpot at the time, and still, are to lower barriers
[12:21] to identification and to build ID skills. We wanted to help to get people learning more than just watching a David Attenborough programme, but to actually get
[12:29] outdoors and do something and share it with others in a community space. We want to make nature accessible and open to all. And we also wanted to sort of
[12:38] support this whole new generation of naturalists. And those who were involved in citizen science from that time know how things were shaping, you know, the things that were influencing that whole framework at that time.
[12:48] It was during this time that ECSA evolved and all these things were happening.
[12:51] There was a lot happening not just in the UK but also globally, with the start of the Citizen Science Association, now the AAPS, based in the US,
[13:00] for example. And also we wanted to support the new generation of naturalists and develop more biological
[13:09] data recording and support. To do that we wanted a platform that was innovative but also free and easy to use, with
[13:17] integrated features and tools, and we wanted to facilitate a tool for all. So it's open to novices, experts, etc.
[13:24] You upload a wildlife photo and with the community you get help to identify it.
[13:30] Next slide please.
[13:35] So what is happening and what is being shaped by this? So as I mentioned, within iSpot we have a framework that
[13:45] we use to guide the user experience, so to speak. And this framework I'm using as part of my
[13:53] research to shape the investigation, so to speak, of engagement and participation,
[14:00] using their activity. So in terms of exploring the site, this is what anybody can do to a certain degree. But once
[14:07] you're registered, your experience is enhanced. So you can add favorites, you can do agreements, you can add observations, you can do all these types
[14:14] of activity, which you can access through your iSpot area. And this is a tracker that sort of looks at what
[14:23] you're doing, how you can track what you're doing and see your performance over time. Key of course is recording. iSpot has a range of
[14:31] different tools and features that can facilitate that in terms of adding observations. And this is core to how people participate. And
[14:40] recording observations is also documented in your iSpot area, where you can see your
[14:47] user profile. And I mentioned a little bit earlier your reputation, and that is quite unique to iSpot, where your
[14:54] reputation grows with each activity you do on iSpot. So you will then see your expertise growing in lots of areas. You
[15:03] may be an expert in more areas than one, and that's what's coming out of the experiences of the users as I've been exploring in this study.
[15:10] They themselves might be an expert in one and a novice in another, but they can see their growth in that other area.
[15:17] So it's quite interesting how this is all evolving and how you can see this changing through these experiences on
[15:23] iSpot. The third area is identify, and of course identification is a key part of not only how
[15:31] people contribute individually but also how the community contributes.
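The per-group reputation described a moment ago (expertise growing separately in each species group) can be sketched as a toy tracker. iSpot's actual scoring rules are not given in this talk, so the point values and the expert threshold below are invented for illustration.

```python
# Toy sketch of a per-group reputation profile in the spirit of iSpot.
# Point values and the "expert" threshold are invented for illustration.
from collections import defaultdict

POINTS = {"observation": 1, "identification": 3, "agreement": 1}

class Profile:
    def __init__(self):
        # Reputation is tracked per species group, so a user can be
        # an expert in birds while still a novice in fungi.
        self.reputation = defaultdict(int)

    def record(self, activity, group):
        self.reputation[group] += POINTS[activity]

    def level(self, group):
        return "expert" if self.reputation[group] >= 10 else "novice"

p = Profile()
for _ in range(4):
    p.record("identification", "birds")  # 4 x 3 = 12 points in birds
p.record("observation", "fungi")         # 1 point in fungi
print(p.level("birds"), p.level("fungi"))  # expert novice
```

The design point this illustrates is that a single global score would hide exactly the mixed expert/novice profile the speaker describes.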
[15:36] Identification happens through further, let's call it, verification: agreements by others in the community. You say,
[15:44] "I think it might be this", somebody agrees, says "yes, it is this, and it's this because...". And key here is
[15:52] what people put in the comments. This is what I've been quite interested in exploring: what people say when they're helping others. And these comments, we
[16:00] have hundreds of thousands of comments posted on all these observations. And this is such a rich source of information about
[16:08] what people are doing, what they're saying, and how they're experiencing this, not only on their own, but with
[16:14] others. And of course, there's learning. And learning is deemed one of the key motivating factors in all the studies that have been
[16:23] done around why people join in and participate in citizen science. And in iSpot, there are certain tools and features in there that
[16:30] facilitate learning, but learning also just happens naturally through the process of commenting and sharing. So I'm trying to explore this as well, and in
[16:39] addition to looking at the data that's there through the comments, I'm also doing interviews with participants to get an
[16:47] understanding of how they view some of these concepts, and it's been fascinating what's been unearthed so far. Next slide please.
[16:58] So what's coming out of this? So this is just a quick sample of some of the thematic
[17:07] analysis that has been done on some of the comments within iSpot. As you can imagine, over this amount of time we do have a lot of information within iSpot.
[17:16] I'm focusing my research at the moment on the period up to the end of 2025, and I'm doing it in different bursts, you know,
[17:23] in terms of how I look at the different bits of data. So, in this snapshot here,
[17:28] I looked at um some of the comments posted: around 100,000 comments
[17:35] posted between 2009 and 2012. And I was looking at the frequency of terms related to learning and
[17:43] knowledge creation and so on. And what was interesting is that, of the 100,000 um comments
[17:50] posted, there were close to 20,000 comments with these terms used in them. People actually saying it in what they place in their
[18:00] comments. And just a few examples here: "I don't really understand why it is so hard to ID fungi, but even when one
[18:07] looks quite distinctive it seldom matches anything in a particular guide. iSpot is really
[18:14] useful in helping to guide the learning process." Another one here: "My question was really intended to just
[18:23] understand how you had arrived at your ID." So an inquiry process is happening as well. "iSpot is a
[18:31] great resource for learning, and it's helpful when the more obscure species are identified to get the reasoning for
[18:37] the ID explained." And this happens time and time again through the hundreds of thousands of observations on iSpot.
[18:44] And another one here: "As someone who uses the site to learn and expand limited knowledge, I find debates useful.
[18:52] I've learned more about anatomical differences of birds here than anywhere else, and now put these learned skills to
[19:01] the test in the field." Just a snapshot of this rich information that's there, and this is all from the participants themselves.
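The term-frequency analysis described above, counting how many comments contain learning-related vocabulary, could be sketched roughly like this. The term list and sample comments are illustrative only, not the actual iSpot vocabulary or data.

```python
from collections import Counter
import re

# Hypothetical learning-related vocabulary; the real study's term list differs.
LEARNING_TERMS = {"learn", "learning", "learned", "understand", "knowledge"}

def comments_with_terms(comments, terms=LEARNING_TERMS):
    """Return the comments containing at least one target term."""
    hits = []
    for text in comments:
        words = set(re.findall(r"[a-z]+", text.lower()))
        if words & terms:
            hits.append(text)
    return hits

comments = [
    "iSpot is really useful in helping to guide the learning process.",
    "Lovely photo of a kingfisher!",
    "I've learned more about birds here than anywhere else.",
]
print(len(comments_with_terms(comments)))  # 2 of the 3 sample comments match
```

Scaled up to 100,000 comments, a count like this is what yields the "close to 20,000 comments" figure mentioned in the talk.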
[19:10] Um, next slide please.
[19:14] So what is coming out of this is the influence of participatory learning and how this is part of the
[19:23] iSpot user experience. And in all of this, it's looking at how this is shaped by the users
[19:32] themselves and that community-based engagement and participation. So some earlier work I had done
[19:40] with my colleagues explored this on iSpot, because the iSpot model has gone through a lot of iterations over the
[19:47] years. Previously we had a five-step model of learning, which has now come into the four-step process that I mentioned earlier, and through
[19:56] this exploration we looked at various different things, and we kept to the idea that iSpot supports participatory learning and the learner
[20:03] is an active participant. Many do this by simply exploring the site: searching, browsing, viewing observations and so on. But learning through
[20:12] interaction with the site is not just from browsing. Even if you're just doing it without having registered,
[20:20] it can be a valuable experience, and even more so when you are a registered and active participant.
[20:27] Next slide, please.
[20:31] So, one of the key things I'm quite interested in, in the work I've been doing as a practitioner researcher, is linked to one of the other projects I manage called Treezilla, the monster map
[20:39] of trees. We explored the use of storytelling, working on a project called Branching Out with other partners, and
[20:46] this is something that has really excited me: how stories show impact on people but also demonstrate and highlight
[20:54] impact more broadly. And one of the things I've been doing is looking at stories from the participants, through their experiences,
[21:03] and capturing these as individual stories but also as collaborative narratives.
[21:09] And I have here one of the first stories that came out of iSpot, which really got us on the map, so to speak.
[21:17] Just after we launched to the public in June 2009, in October a six-year-old girl found a moth on her windowsill at
[21:27] home. She showed it to her dad. He knew about iSpot, and he said, let's put it on there and see what happens. Within 24
[21:34] hours, the online community confirmed it to be the Euonymus leaf notcher, a species never previously seen in the UK.
[21:41] It was actually from Asia. And she was able to take it into show and tell at school, and then we helped to get it into the Natural
[21:49] History Museum, and that's where it sits now. This is just one example; there are many others that have come out of iSpot. It
[21:57] demonstrates the power of citizen science: how it enables anyone to participate, to make discoveries, to
[22:05] engage, to contribute to scientific knowledge, while at the same time being able to engage with and learn about
[22:11] science. And actually we have a course called Citizen Science and Global Biodiversity if anybody's interested.
[22:18] That's a free Open University OpenLearn badged open course you can take, and this example is actually explored in that course. Next slide, please.
[22:35] Next slide, please. Yes, it's come through now, thank you. So what's in the power of citizen
[22:43] science? iSpot is a social network, and going back to the theme of this panel discussion today,
[22:51] around collaboration, innovation and impact, the research that I'm involved in helps you to see how
[22:58] things like this can help you explore the impact of a
[23:05] citizen science online community of practice. That's what I'm exploring, and the tenets behind it. We all know the
[23:13] value of communities of practice and what they can do. How can this benefit citizen science further, particularly in the context of an
[23:21] online community like this, to shape and influence others? So this is just a wide-scale view, done by my colleague Mike, who is actually the
[23:29] iSpot curator, which demonstrates the range of observations and the connections between them.
[23:36] The blue dots are the actual observations around the globe, and the red lines show the links between the identifier's location and the
[23:46] location the observation was posted from. So this just shows how these interconnections happen. It's a lot to delve into. I can't delve into
[23:53] it all, but I'm trying to get a snapshot of it so we can say: this is an example of what we're doing, and this can actually be best practice for others,
[24:02] because that's another key important thing: sharing best practice and demonstrating what's actually happening on the ground. So that's it for me. Next slide, please.
[24:16] So that's it. I don't want to say anything more. I've probably gone over time. But I just wanted to share a bit of what I've been
[24:24] doing, delving more into what's been happening behind the scenes. And I also want to say a big thank you to the tens of thousands of users who've
[24:32] engaged and participated on iSpot over the years. This is actually helping to shape the research that I'm involved in now. And that's it for me. I look
[24:41] forward to answering any questions in the discussions to follow. Thank you.
That's fantastic. Thank you so much,
[24:48] Janice. And it's really good to see just how far-reaching the participants are; I think that map at the end really shows it,
[24:57] whether they are answering queries about the species or posting and asking. It's really good, and hopefully we can draw out some of that
[25:05] in the discussion later. So next up we're joined by Rachel Leman, who
[25:11] is a PhD candidate at NTU and who's going to be talking to us about citizen science as part of her project on
[25:20] African leopards and how tourist photos can be used as scientific data. Over to you, Rachel.
[25:27] Thank you. So yeah, I'm Rachel. I'm a PhD candidate at Nottingham Trent University, and my research is about
[25:35] improving wildlife monitoring for elusive species; I focus particularly on the African leopard. The core problem that I'm addressing is that
[25:44] conservation decisions depend on reliable population information, and when you have uncertain estimates it becomes
[25:51] difficult to plan effective interventions. So today I'm going to share how I use citizen science,
[25:57] specifically tourist photographs, to help close those monitoring gaps. We can move on to the next one. So leopards are a
[26:05] great study species because they're charismatic and they're also individually identifiable: you can tell the difference between them based
[26:13] on their markings. Leopards are also, though, very difficult to detect. They are really elusive and they already occur at low densities, or low population numbers.
[26:21] So, traditionally, leopards are monitored using camera trapping or GPS transmitter collars. These
[26:29] approaches are really robust, very widely used and really very good, but they can be quite expensive
[26:36] and quite logistically demanding. So, in practice, that means many studies are limited in terms of the space that they can cover,
[26:44] the time that they can be out in the field for, or both. Now, for a wide-ranging carnivore, that can be a real limitation. You often end up
[26:52] with a short snapshot rather than long-term insight. So my question is actually not whether we should abandon camera traps or collars, but whether we can
[27:00] supplement these structured monitoring techniques with a scalable, cost-effective source of additional data without sacrificing any scientific
[27:09] rigor. Next one, please. So, tourists, and specifically the images that they take. Oh, sorry about my
[27:16] slides; they're not very colorful. They were before; they've just formatted funny. But specifically, the images that tourists take have the potential
[27:25] to provide large amounts of wildlife sighting data that we currently don't integrate into the literature. And I suppose you could say that our primary
[27:33] open research practice is the project itself. It is a citizen science network that gathers leopard photographs from tourists and local guides. Now,
[27:42] although leopards are rare, they are highly photogenic. People are actively seeking them out to photograph, and,
[27:49] maybe most importantly, people love to share photos online, and this creates a sort of natural stream of data for us.
[27:56] Our project, which is managed through a social media platform, now engages over 9,700
[28:03] contributors across 10 countries. Most of our contributors are based in South Africa, where this study is based.
[28:10] Our tourists, our citizen scientists, submit their sightings and their photos through our community network, and those contributions greatly
[28:18] expand the spatial and temporal coverage beyond what camera trapping alone can typically achieve.
[28:27] So the key scientific challenge that I have is converting these photographs into analyzable data. For all the research I do, I need two things.
[28:36] Number one is reliable metadata, especially the date and the location where the photograph was taken. The second is
[28:44] individual identification: working out which leopard is which. Early in the project, many uploads lacked that
[28:52] essential metadata, like the GPS information or the timestamps. But after I demonstrated to our group administrators how that missing
[29:00] data really undermined our analysis, we introduced some really clear submission guidelines and regular reminders, and the proportion of fully detailed
[29:09] images has increased substantially. Then, analytically, I integrate the tourist data into lots of different analyses.
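The submission check described above, flagging uploads that lack a usable timestamp or GPS location before they enter the analysis, could be sketched roughly like this. The field names and records are hypothetical, not the project's actual schema.

```python
# Minimal sketch of a submission-guideline check: flag photo records missing
# the metadata needed for analysis (date and GPS location). Field names are
# invented; real EXIF or platform fields will differ.
REQUIRED_FIELDS = ("date_taken", "latitude", "longitude")

def missing_metadata(record):
    """Return the list of required fields absent or empty in a record."""
    return [f for f in REQUIRED_FIELDS if record.get(f) in (None, "")]

submissions = [
    {"photo": "leopard_01.jpg", "date_taken": "2023-06-14",
     "latitude": -24.99, "longitude": 31.59},
    {"photo": "leopard_02.jpg", "date_taken": "2023-06-15"},  # no GPS
]
complete = [s for s in submissions if not missing_metadata(s)]
print(len(complete))  # only the fully detailed record passes
```

A check like this makes it easy to report back to group administrators which submissions need a reminder, which is essentially the feedback loop the speaker describes.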
[29:16] So, population dynamics, survival, habitat use, all as part of my thesis. Importantly, this isn't just passive data donation. We've built
[29:25] an ongoing relationship where our contributors see how their contributions become part of real research. And our goal here is always the same: to
[29:33] see how these images compare against the gold-standard methods and techniques, and where they can add value. So next slide,
[29:41] please. We have learned so much from this data already. We can work out population numbers, and we've actually
[29:49] shown that combining them with those structured surveys means that we can be even more precise about population change over time. We've
[29:57] actually been able to publish that research this year in an open access journal, which is really great.
[30:02] We also already know how much cheaper it is as a monitoring technique, which really showcases that
[30:11] value. And we've had some real insight into the lives of leopards; for example, we've got a picture here of a female eating her own cub after it had been
[30:19] killed by lions, which is the first time that's been seen. We can also look at feeding preferences, we can work out their chances of surviving from one
[30:26] year to the next, and, as I mentioned earlier, we can look at their habitat use: where they want to live and raise their offspring. All of this would be so
[30:34] difficult to detect from camera trapping alone. And again, all of this data comes from something as simple as people's photographs from their holiday or their safari. Next slide, please.
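Because leopards are individually identifiable from photos, sightings from two sources can feed classic mark-recapture arithmetic. As a hedged illustration only (the speaker's actual analyses are more sophisticated spatial models, and the IDs below are invented), here is the standard Lincoln-Petersen estimator in Chapman's bias-corrected form:

```python
# Illustrative mark-recapture sketch: estimate population size from two
# "occasions" of individually identified sightings, e.g. a structured camera
# trap survey and tourist photographs. Data is invented.
def lincoln_petersen(first_ids, second_ids):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size."""
    m = len(first_ids)               # individuals identified on occasion 1
    c = len(second_ids)              # individuals identified on occasion 2
    r = len(first_ids & second_ids)  # individuals seen on both occasions
    return (m + 1) * (c + 1) / (r + 1) - 1

camera_trap_ids = {"L01", "L02", "L03", "L07"}            # structured survey
tourist_photo_ids = {"L02", "L03", "L05", "L08", "L09"}   # citizen science
print(lincoln_petersen(camera_trap_ids, tourist_photo_ids))  # 9.0
```

The point of the sketch is the mechanism: the more individuals the tourist photos "recapture", the tighter the population estimate, which is how the extra data source improves precision.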
[30:47] So, the benefits for research have been substantial and measurable. Without the tourist data, our leopard data set would
[30:55] just span fewer than 3 years, as that's the normal PhD time length. But by integrating the citizen science
[31:04] contributions, the data set now has nearly 4,000 sightings of 75 unique leopards of an otherwise incredibly
[31:13] difficult-to-monitor species. The collaboration has been really incredible. We have a whole host of admins in the project now who support
[31:21] the page, and our data quality is continuing to improve year on year. Because this is open research,
[31:28] transparency is central. We share project updates and results back to the community, and I actually maintain an open repository of reproducible code
[31:37] and workflows on GitHub so it can be reused by others. More importantly, it is their data as much as it is mine.
[31:44] They own it, they contributed to it, and it is available to them to provide feedback on; I want to hear their voices to improve the practice. At the
[31:52] same time, we do manage species security risks. So, after discussions with our admin team, we don't share geosensitive locations on our
[31:59] page. That's denning sites, you know, when those really small cubs are born, or sites that involve other at-risk species such as rhino. Now,
[32:08] the ethics and the health of these individuals is so much more important to us than just getting that snapshot or that photograph. And that's something so key. We are a group of
[32:16] animal lovers, and it's the leopards that brought this group together in the first place. We are innovative: we are one of the first studies of our kind, in
[32:25] terms of leopards, to use tourist images, your sort of holiday photographs, and we also have one of the longest-tracked populations of leopards, which is a
[32:33] success in itself. As I mentioned, our first paper is out, we have more planned, and, wonderfully, we have had some of the
[32:40] community be co-authors on that work. So the research has been really impactful and really worthwhile. We're also
[32:48] supporting other projects who have reached out and want to incorporate tourist data, so we're continually building those relationships. Finally, I
[32:55] think I just want to share my biggest lesson or takeaway from this whole experience, and that's been the importance of
[33:03] reciprocity. Data flow has to be two-way. You can't just ask and ask and ask. When I share results back
[33:11] through updates, or maybe webinars or recordings or data,
[33:15] that trust grows, the engagement is sustained, and participation becomes genuine collaboration rather than an
[33:22] extractive data process. I think that's what makes it work. I think that's why we are really lucky to be continually growing, and I think if I could rename us, a much better way to
[33:31] describe us is not really a citizen science project but a community science project, all with a shared passion. I think that's why citizen science
[33:40] and our project, like many others that we'll hear about today, are so successful. It is that community aspect. And that's actually me all done. So,
[33:48] thank you so much for listening. If you do have any questions, please pop them in and I'll get back to you.
Thank you. That was brilliant. Thank you so much,
[33:56] Rachel. It's great hearing about how you have included the community as co-authors. I think that's really nice.
[34:03] And I know that Rachel in the chat has mentioned the fact that they are made co-authors. I think it's just brilliant. But it's such a good way
[34:10] of thinking about, as you say, how do you overcome traditional issues in research: okay, we can't have a substantial number of camera traps
[34:19] covering the same area, so how do we supplement that? And using things like the markings to identify individuals is such a good idea. So thank you for that.
[34:29] So next up we are joined by Paulina Paulikoska. Paulina is going to talk to us about her PhD project,
[34:37] which is called Spotter Hedgehog: Exploring the Potential of Citizen Science for Monitoring Garden Hedgehog Populations. Can I just check you can see that as well?
[34:47] Yeah, the presentation is visible. Thank you.
[34:52] All right. So thank you. Yeah, I'm Paulina Paulikoska, and today I'm going to present my citizen science project called Spotter Hedgehog. It looks at the
[35:01] potential of using camera traps for monitoring garden hedgehog populations. We're focused on hedgehogs because, unfortunately, in the UK their population
[35:10] has been undergoing a historic decline, and, interestingly, the data that we have... can I have the next slide? Yeah.
[35:20] We don't have robust data to understand why the population is declining, and we have to look for new ways of getting that.
[35:31] So camera traps, which were already mentioned, are sensor-activated cameras that take images or videos when they are triggered,
[35:39] and they are widely applied in the area of research that my colleagues and I are working in. But we also
[35:46] realized that there is more and more interest in them, and they are increasingly used in private gardens. People put them out to observe wildlife in their vicinity, and they
[35:55] collect potentially valuable data every day. So we thought about how to make use of that data, and this is how Spotter Hedgehog
[36:02] was created, along with my supervisory team. It basically looks at the potential of those cameras in people's gardens to monitor garden
[36:11] hedgehog populations. And we can already talk about the element of innovation there, because camera trap images
[36:18] have been used in citizen science frameworks before, but what we're looking for is a very precise
[36:26] estimate of density, which is considered the gold standard in ecology.
[36:31] To do that, we need to follow a very rigorous data collection protocol, which can be a challenge in citizen
[36:40] science approaches, and we wanted to test if this would basically work. We followed the experience of the National Hedgehog Monitoring Programme, which is
[36:47] another research project that aims to estimate the hedgehog population across the
[36:54] whole country. So, to achieve that, we asked our volunteers to install a camera trap in their garden in a very specific way. Then we asked them to make a
[37:02] calibration pole and to calibrate the space in front of their camera trap, which is necessary for achieving
[37:11] high-quality estimates. All of that was then uploaded to a citizen science
[37:19] platform called MammalWeb. And as you can see here, or as I hope you can see, it required a
[37:27] lot of effort. So we were interested in the uptake, basically: are people willing to follow so many steps?
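One way to picture what the calibration pole enables: markers at known distances from the camera give reference points that map positions in the image to real-world distances, so an animal's position in frame can be converted to an approximate distance, which density estimation methods need. The values and the simple linear interpolation below are invented for illustration; the actual protocol is more involved.

```python
# Hypothetical sketch of camera calibration: (pixel_row, distance_m) pairs
# recorded from calibration images, used to interpolate the distance of an
# animal appearing at a given pixel row. All numbers are invented.
CALIBRATION = [(900, 1.0), (700, 2.0), (560, 3.0), (470, 4.0)]

def estimate_distance(pixel_row):
    """Linearly interpolate a distance (metres) from a pixel row in frame."""
    pts = sorted(CALIBRATION)  # ascending pixel row = farthest to nearest
    for (r0, d0), (r1, d1) in zip(pts, pts[1:]):
        if r0 <= pixel_row <= r1:
            t = (pixel_row - r0) / (r1 - r0)
            return d0 + t * (d1 - d0)
    raise ValueError("pixel row outside calibrated range")

print(estimate_distance(630))  # midway between the 3 m and 2 m markers: 2.5
```

This is also why calibration is so demanding for volunteers: the pole has to be photographed at precise, known distances for the mapping to be trustworthy.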
[37:37] Can I have the next slide? Thank you. Another way to get involved was to help us classify the images. Camera traps can collect enormous amounts of data.
[37:46] Sometimes they get triggered by grass moving in the background, or by many species other than hedgehogs that were captured as part of this
[37:54] project, and so it created a good opportunity for people who don't have a camera trap and still want to get involved to help us classify the
[38:02] images. As you can see here in the panel, they were choosing species from the panel on the right; this is a feature of MammalWeb that's available.
[38:12] To achieve that, we had to prepare training materials and video tutorials, to basically make sure that
[38:20] the steps of the project are communicated well. And it's part of the experience that I wanted to share: there's a lot of effort that goes into
[38:27] those preparations. But it's an investment that then helps you collect data of this high quality,
[38:35] because everyone knows what steps are involved; they can get ready and collect the data following the protocol that you need.
[38:45] So, in terms of interest, we received 237 sign-ups to the newsletter of the project, over 50 emails from
[38:53] participants, and data collection took place over two months. They could choose a window of 30 days within
[39:00] those two months where they would like to keep the camera recording information.
[39:06] In terms of results, we received data from 45 camera traps, and this
[39:12] equated to over 65,000 sequences of camera trap images. And I think we can already speak about research impact here,
[39:22] because even though we still have to
[39:30] analyze all of the footage, as you can see we have only classified 41% of the images so far, all this data is already available there on MammalWeb.
[39:39] It collects data on all species of mammals that can be captured in gardens, not just hedgehogs. So it's already
[39:46] contributing to the MammalWeb records regardless of how our project goes. And I think it's amazing to just know that the data is already
[39:55] contributing, or can contribute, in one way or another.
[40:00] Spotter Hedgehog was designed for citizen scientists, but we were also interested in their experiences. So we took it a step further and developed a
[40:08] questionnaire to learn about their experiences. And one of the questions that we wanted to address was to
[40:16] find out what's the biggest challenge of this project: what can pose a barrier to participation for people? So we reached out not only
[40:24] to people who joined our project and submitted footage, but also to people who were interested but for whatever reason
[40:30] decided to opt out. And the findings of our survey, can I have the next slide please,
[40:38] showed that 50% of people who joined said that calibrating the camera was the most challenging part of the survey,
[40:46] and for the two groups that opted out, so people who were interested, heard about the project, but
[40:52] decided not to join, and people who completed one of the steps but opted
[41:00] out along the way, both of these groups merged together, 47% of them said that the complicated protocol was the
[41:08] barrier to continuing. So what I liked about this part of our project is that we knew that calibration is a challenging step, but instead of
[41:17] assuming that this was the barrier, we quantified it, and now we know that if we wanted to go forward with this, maybe this needs addressing, or we
[41:25] should provide different ways for people to engage. So yeah, I think it's very valuable to not only assume what challenges there
[41:34] might be, but to ask people what the challenging element is.
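The "quantify rather than assume" step described above boils down to tallying which barrier each respondent group reported. A minimal sketch, with invented groups, barrier categories and responses:

```python
from collections import Counter

# Invented survey responses: (respondent group, reported barrier)
responses = [
    ("joined", "calibration"), ("joined", "calibration"),
    ("joined", "time"), ("joined", "calibration"),
    ("opted_out", "complicated protocol"), ("opted_out", "complicated protocol"),
    ("opted_out", "equipment"),
]

def barrier_shares(responses, group):
    """Percentage of a group naming each barrier."""
    answers = [barrier for g, barrier in responses if g == group]
    counts = Counter(answers)
    return {b: round(100 * n / len(answers)) for b, n in counts.items()}

print(barrier_shares(responses, "joined"))
# calibration named by 3 of 4 "joined" respondents (75%)
```

Figures like the 50% (calibration) and 47% (complicated protocol) reported in the talk come out of exactly this kind of per-group tally.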
[41:41] We also received some feedback as part of this questionnaire. Here I chose some answers that are obviously
[41:51] on the positive side, but they are basically
[41:59] very motivating to read, when you hear that people continue to capture hedgehogs and continue to monitor their garden
[42:09] with camera traps as a continuation of the project, and that it goes towards a more fulfilling life. I think it's amazing. We obviously received
[42:17] a lot of constructive feedback as well, but these are the messages that I like to come back to, and feel that, you know, it made a change basically.
[42:29] So, in terms of collaboration, citizen scientists could contribute to our project on various levels. They were involved in data collection,
[42:36] collecting camera trap images. They could also get involved in data processing, so spotting, and they also inform the future design of the project.
[42:44] So yeah, the project was fully built on a collaborative model.
[42:52] Going back to the themes of this Open Research Week: in terms of innovation, our project looked at
[43:00] developing a new monitoring method, and it used quite advanced tools, which are camera traps. The
[43:08] benefit from collaboration is that we got the potential to cover large spatial areas, which wouldn't be possible without the contribution of all our
[43:16] volunteers, including potentially under-represented habitats; there's not so much data coming from gardens so far. And it also fed back to the community. People
[43:25] got involved in the research process; they learned about camera traps and the garden wildlife around them. And the research impact is the
[43:34] potential to collect this data of gold-standard quality on a species of conservation concern in the UK.
[43:43] So, in terms of the future, we need to classify the remaining data. I highly encourage anyone who would like to
[43:49] join to scan the QR code and join us in the process. No prior experience is needed. You can classify as much or as little as you'd like.
[43:59] Then we will be able to analyze the data, and we're hoping to publish and, if possible, to use anonymized
[44:07] data in data-sharing repositories, to be able to feed back to the community and allow them to use the data as well.
[44:17] So, quick acknowledgements to my supervisory team, the National Hedgehog Monitoring Programme team, the NTU community who helped me
[44:25] test the protocols, all the amazing citizen scientists, and the NTU press team who helped me to share it. Thank you so much.
[44:36] That's fantastic. Thank you so much for that, Paulina. And if you want to put the link into the chat, Paulina, so that people can get involved that way as well, we'll do that too. The same way,
[44:45] Rachel, if you want to share the online platform for leopard spotting,
[44:50] please do. Right, we're on to our final speaker then, before we enter into our discussion section. So, it gives me
[44:57] great pleasure, last but not least, to introduce Hannah Jenkins, who's going to speak to us about how citizen science fits
[45:04] within her PhD project related to zoo species planning. Over to you, Hannah. You can see it? Okay, sharing this one.
[45:12] Yeah, we can see that. Perfect.
[45:15] Thank you. So I'm Hannah. I'm a PhD student in the Animal, Rural and Environmental Sciences school, but my
[45:22] PhD actually also covers the business school, and it's looking at citizen science from a slightly different perspective than the previous talks. So
[45:30] I'm going to go quite quickly, because I'm really worried about time, so that we have time for questions. My PhD focuses on zoo species planning and
[45:39] how we can maximize our conservation contributions through strategic planning.
[45:46] So firstly, if you're not used to zoo speak, what is species planning?
[45:51] This is the process of evaluating your zoological collection. This might be evaluating species you already have in your collection, or it might be looking
[45:59] at species you want to bring in. So if you want to bring in a new species of amphibian, trying to work out which is the best amphibian to bring into your collection to meet your goals.
[46:09] We know that zoo species composition directly and indirectly contributes to conservation. Your
[46:18] species can contribute directly to conservation through things like reintroductions and ex-situ breeding, but they can also contribute indirectly
[46:26] through fund generation: the funds that you get from visitors through your door, directly putting that money back into conservation.
[46:36] So for zoos to be successful businesses,
[46:39] they need to balance their conservation strategy, visitor strategy and business strategy, so that they're successful and
[46:45] so that they can maximize their contributions to conservation.
[46:52] So the question for my PhD was really: how do they do this? How are they balancing? How are they evaluating their species? And what better way to find out
[47:01] those answers than to ask the community. So the biggest part of the data collection for my PhD was through community
[47:08] engagement. We had a process of community workshops, where we collected lots of data, and focus groups, where we
[47:16] collected a bit more data and honed those ideas a bit more, using that data collection to create a strategic framework. We then put that framework
[47:25] back to the community to evaluate, and then the final step will be dissemination once it's all finished.
[47:32] So, those community workshops. The aim of these workshops was to gather the criteria that stakeholders consider to be important in their species planning.
[47:43] We ran one trial workshop to test out the methodology first, because we wanted to make sure that the workshop
[47:50] methods collected the data we need, made people feel comfortable contributing that data, and were successful.
[47:57] And then we ran two in-person workshops at community conferences. These workshops were made up of two different parts. So we had a pre-survey and a
[48:06] post-survey, and then we had a workshop activity where participants would give their species planning criteria,
[48:16] prioritize those criteria, and also build off previous groups. So each group would have a category that they would add to,
[48:23] and then we'd switch those categories around so they could build on each other's ideas.
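The category rotation described above is essentially a round-robin: each group starts on one category, and each round the categories shift along by one group. A small sketch (category names invented; the actual workshop used its own categories):

```python
# Sketch of the rotation protocol: in round r, group i works on the category
# r positions along from its starting one, so every group builds on earlier
# groups' additions. Category names are invented.
def rotation_schedule(categories, rounds):
    """Return, for each round, the category assigned to each group."""
    n = len(categories)
    return [
        [categories[(i + r) % n] for i in range(n)]
        for r in range(rounds)
    ]

categories = ["mammals", "birds", "reptiles", "amphibians", "fish"]
for r, assignment in enumerate(rotation_schedule(categories, 3)):
    print(f"round {r + 1}: {assignment}")
```

With five categories and three rounds, every category is touched by three different groups, matching the "added to by three different groups and prioritized by three different groups" design mentioned on the next slide.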
[48:29] The trial workshop was at the Zoological Society of London, with a mix of different animal and educational
[48:36] roles. So this really did test the methodology: test that the surveys were efficient in collecting data and not too long, and that the instructions for
[48:45] the activity were accurate, so that people could contribute the data they needed.
[48:52] And then we had two in-person workshops, as I said. On the left is the EAZA Conservation Forum, where we had predominantly in-situ and ex-situ
[49:00] conservationists as well as some animal staff, and on the right is the EAZA Annual Conference; EAZA is the European Association of Zoos and
[49:08] Aquariums. So this was filled with lots of different roles from throughout the zoo community.
[49:15] And then this is a snapshot of some of the data that we collected from those workshops. So each workshop would have five categories in total. Each category
[49:23] would be added to by three different groups and prioritized by three different groups. So you can see that they're really building on each other's ideas and um prioritizing as they would
[49:32] within their group. Um so we got a big range of different ideas and opinions.
[49:38] Next up we had the focus groups. So these were smaller online meetings. Um we used the workshop results as a starting point. So they started with the
[49:47] criteria from those groups and then they could build on them and we got into more of a discussion with those. So this is just a
[49:55] snapshot of some of the data that we got from those workshops.
[50:00] In total we had 93 participants with a range of different experiences and roles within the zoological community. And
[50:08] from those participants, we collected 457 unique criteria that people consider important when they're planning their collections.
[50:18] With the framework development, we took these 457 criteria, started to theme those criteria, and then take the median
[50:25] priority of all the criteria within the themes. Um, and we structured it in a way so that at the top you have your um zoo core missions. So conservation,
[50:35] education, recreation, research and then along the side we have our core business functions. So finance, human resources,
[50:41] operation and procurement.
[50:45] So once we had that framework, that draft framework from the um workshops and focus groups, we then took that framework to community experts and asked
[50:53] them to review it. Um so we got lots of comments and questions and um ideas for how it could be restructured.
[51:03] Um I won't go much more into that section. Um we can pick it up in the questions. But um from a community sort
[51:11] of uh engagement point of view, the workshop feedback was really important.
[51:15] We wanted to make sure that these workshops worked really well and that they were useful.
[51:21] And uh in our post-survey feedback, 91% of our participants either strongly agreed or agreed that they
[51:29] enjoyed participating in the workshop, which was really great.
[51:33] Again we had 91% strongly agree or agree that they were able to contribute their thoughts and opinions during the activity. This was again really
[51:41] important that people felt like they could openly discuss and contribute without feeling like anyone was overpowering or um intimidating. Um, we
[51:51] did make sure that we had a set of facilitators to help with all of the workshops and online focus groups so that we could keep that nice level of engagement throughout.
[52:02] And then 80% of our participants found that the workshop was useful for their own learning and understanding. And this was again really important to us. We
[52:09] didn't want to just take from the community; we wanted to make sure that this was something that the community could use by themselves as well and that they could enhance their skills and their knowledge.
[52:21] And finally, just looking at those three themes that Paulina did. Um, so innovation, this was a um completely new study. Um, we developed a brand new
[52:30] methodology for it and we've been able to test that out. Um, so it's been really um eye opening to see what goes
[52:38] on with that, and the output's been really great. Um,
[52:43] collaboration, we would not have got the data that we did without the collaboration of the zoological community. They have been a key core
[52:50] part of this um uh study. Um and they've really highlighted some areas that weren't known before. So that's been
[52:59] really great and that kind of leads into that research impact. You know, this is um a novel study. We haven't had
[53:06] anything like this before to evaluate species planning and the framework that we output from this PhD will go into uh
[53:14] association policies as well to help guide um institutions in their species planning and do a bit better for conservation, and that's a very very quick one.
[53:26] That's perfect. Thanks so much Hannah.
[53:28] That's great. So we now have an opportunity um to first off can we just thank our speakers our wonderful speakers and I'm blown away given that
[53:36] so much of this has taken place within both PhDs and within professional doctorates, and the quality and the level
[53:43] of what you're doing is just really really wonderful. Um so yes round of applause but for our wonderful speakers.
[53:50] Um I wanted to start with um a question um which sort of I was wondering about from each of your different perspectives
[53:57] here. So what for you did you find was like a core element of making citizen science successful? If you were to pick
[54:06] one thing that really sort of enabled you to succeed within your various different projects, what would it be? Um and also for everyone in uh the room, if you want
[54:14] to put questions into the Q&A, we'll just be coming into that uh Q&A in just a sec. Um, I wonder if uh I can start maybe with Janice.
[54:24] Okay. I know I put you on the spot.
[54:27] No problem. Um, well, I'm probably going to talk more generally about my practice, because that's what informs my
[54:36] research, so to speak. And I think a big thing in this is the whole
[54:43] how citizen science has in itself evolved through the practice. Yeah.
[54:51] So it has become more inclusive. It has become more collaborative, more co-creative with an emphasis on that.
[55:00] Even the terminology has changed too: it's no longer citizen science, it's community science or
[55:07] it's participatory sciences. But I think the key thing is the flexibility of the approach. It's not
[55:14] set in stone. It's adaptable and the more we do the more it can be shaped.
[55:20] Um and that impacts on the practice itself and it goes on and on, which comes first, the chicken or the egg. But it's just how I
[55:29] think it is evolving. I'll stop there and let my other colleagues speak.
[55:34] I wonder if I can come to Rachel then next like because I know that Janice was just talking about flexibility within practice. Is that something that you had to adopt as well to ensure success?
[55:43] Absolutely. So with my project, we've actually been tracking these leopards for over 10 years. So it's very
[55:51] retrospective. So obviously I started my doctorate quite a while ago, but we did have to go back in time, so there had to be real
[56:01] flexibility there because of the inconsistencies with some of the data quality originally. Um and that was something that we had to manage
[56:09] appropriately in our analysis and, you know, our research as well. I think it's really important to work with the
[56:18] group, and you really come to know your citizen scientists quite well, about what's going to work for them. How far can you push them, or where
[56:27] you're going, you know what, they're just never going to give me a perfect GPS coordinate. So, let's come up with some other workarounds that suit them
[56:35] better. So, yeah, there's there definitely is a need to make it work. Yeah, thank you so much for that,
[56:42] Rachel. I can see Paulina is smiling so I might come to Paulina next. Is that something that resonates for you Paulina?
[56:49] Yes definitely, I think it was one of the questions of spot hogen itself, but I think for me one thing that uh that yeah
[56:57] I would definitely recommend anyone starting a citizen science project to to focus on is preparations. So to allocate
[57:04] the time to work on the website or whatever materials you're able to prepare, you know, also from the
[57:11] graphical point of view to make it look engaging. Uh because I believe as you launch the project,
[57:17] the amount of questions you receive is um a lot. So to have one page
[57:24] where you can, you know, um show everyone and just say hey, could you please check there? It's an amazing
[57:31] help. And also it basically helps you later with the analysis, because if your protocol is easy to follow, or
[57:38] it's actually strict, so it uh you know restricts this flexibility, but then it provides data of the highest
[57:46] quality because it's collected in a rigorous way. I think this is what I would uh focus on.
[57:53] Perfect. Thank you Hannah. I wonder for you if it was slightly different given the way you did your co-design with sort of specialists in your area
[58:00] within within zoos. What would you say would have been the the key thing that made it successful for you or the key element?
[58:08] Yeah, I think for me um it was being open and transparent about the process and what what the outcome is and how
[58:15] we're getting there and how the data is going to be used. I think people um now are becoming more skeptical rightly so
[58:23] of what they're contributing to with data and where is where is it going and what is the outcome of it. So I think having all of that up front just laid
[58:31] out made it um a better experience that everyone was on the same page of what's being used and how and why and then they
[58:38] were able to contribute without um worrying.
[58:42] That's great and that's actually a really good point, because one of the questions that got raised in the question and answer section is could someone please explain, regarding
[58:50] data ownership in these projects, who owns the data and what are the challenges around this? I don't know if anyone wants to pick that one up as
[58:57] a closing question. Um so for me um I own the data for my study um but when
[59:06] participants um sign up so they have to do a um consent form which outlines that they consent to contributing the data
[59:14] and also any photos or videos that I take of the um study but in that it outlines what it's used for as well so
[59:22] that they know exactly what's being used and what's being collected. um but within their um participant information sheet, they've also given information on
[59:30] how they can withdraw any information as well. So at any point during the study, they're able to let us know that they don't want to contribute
[59:38] anymore and they would like to remove their data. So it does take a lot of planning and preparation to make sure that you are in a position where you can
[59:46] remove people's data as well. So making sure that you're on top of it. Um but for us, we made it clear in the um
[59:53] participant information sheet that we owned the data that came out of the workshop. And I'm imagining for for Rachel,
[1:00:00] Paulina, and Janice, is there an element of terms and conditions upon upload of uh images for you?
[1:00:08] Yeah, it's quite interesting in terms of how iSpot operates and the research I'm doing as a slightly
[1:00:16] attached but separate entity. So, iSpot has general terms and conditions, and when anybody signs up to use it, they agree to those terms and conditions, which
[1:00:24] include the use of iSpot data that you put on for research purposes. But you own, in terms of the rights, you
[1:00:32] select the Creative Commons type of ownership of the images. Generally the user owns the images, but the records are
[1:00:40] actually important and that's what we've discovered over the years. So records are actually deposited in various repositories for further biological
[1:00:49] recording and research. So for example they're based in the National Biodiversity Network Atlas in the UK, in the Global Biodiversity Information
[1:00:57] Facility (GBIF), and other recording scheme societies who utilize the site can access the data, and some of it also when I record and so on, and we try
[1:01:05] to do that more and more um to make it more accessible and used. In terms of my particular piece of research, there is actually, because this was
[1:01:13] something I debated with our ethics um and our data protection teams quite a bit, because it was to ensure that individuals were protected in terms of
[1:01:21] the interviews and so on, and the right to use what they may say in an interview. So there's a consent form and
[1:01:28] information sheet for that, and um so that means for example they can decide if they want to have their iSpot
[1:01:35] username used to identify them in the research or not, for example. Um yeah, and that's where participant information
[1:01:43] sheets are just so critical. But then of course when it comes to the general comments, it's generally looking at it and anonymizing anything that
[1:01:50] might be deemed to name a location or name an individual or so, and being very careful about that whole entire process.
[1:01:58] Perfect. So, I can see there have been a few more questions in the chat, but I'm conscious that we're out of time now. So, thank you again to
[1:02:05] our wonderful speakers, to Janice, to Rachel, Hannah, and Paulina for sharing your insights on this. Um, what we'll do
[1:02:12] is we'll collect all the questions at the end, and maybe what we can do is we can get some responses from Rachel,
[1:02:18] Hannah, Paulina, and Janice afterwards, which we can circulate with the recording perhaps. Um, so thank you everyone. Thank you for your questions, comments, and thank you to our speakers.
[1:02:27] Um, don't forget to sign up for future Open Research Week sessions. We have some really interesting ones coming up today um as well as tomorrow. And we
[1:02:34] have an interactive discussion tomorrow around what's missing, what's next, and what's working in open research, which we'd love to see you um as part of. So,
[1:02:42] enjoy your week, enjoy the other sessions. Um, and have a great day, everyone. Thanks so much.
This session looked at how open research can be used to improve health for an ageing population and among trans and gender diverse people, and how it can support better mental health by bringing peace of mind to communities such as Ireland and Northern Ireland, which experienced 30 years of conflict and intergenerational trauma.
Watch the video recording of the session below:
[0:00] Welcome everybody. It's lovely to see lots of people joining us for today's session. I'll just give it a second as people join.
[0:13] Excellent.
[0:15] Brilliant. So yeah, welcome to today's um session as part of Open Research Week on uh enabling engagement, including
[0:22] health projects and PEACE PLUS, um and using co-design as a practice. Um,
[0:27] you're all very welcome as we come together for this session as part of Open Research Week 2026. So, my name is Katy Woodhouse Skinner and I'm an open
[0:36] research consultant in NTU Libraries' open research team. Um, and Open Research Week uh this year is a week-long cross-institutional
[0:45] celebration of practices, skills, and culture that make research more transparent, collaborative, and impactful. Um and it's delivered jointly
[0:53] by Midlands Innovation, Nottingham Trent University and the Open University. And it's a program that is bringing together colleagues from across these
[1:01] institutions to shape the future of open knowledge. Um and this year our theme is enabling engagement, innovation, and
[1:09] impact, which is particularly pertinent when we look at today's session and the speakers we have joining us. So, thank you for joining us. We're going to be
[1:17] looking at how collaborative approaches to research and practice can help create health care systems, services, and educational models that are more equitable, effective, and trustworthy.
[1:29] Across health care and health related research, there is growing recognition that solutions are strongest when they're delivered with people rather
[1:37] than for them, drawing on lived experience, professional expertise, and community knowledge in meaningful and transparent ways. So today's workshop is
[1:46] bringing together four different speakers whose work demonstrates different but contemporary and complementary applications of co-design
[1:54] in action. Through their projects we will examine how different participatory methods can address complex challenges in healthcare access, public health,
[2:04] professional education and well-being whilst also raising important questions about inclusion, power and impact. So
[2:11] our four speakers today will be Cleveland Barnett, Yika Setakova, Chase Saras, and Thea Heredito. We're going to
[2:20] start today um with um Professor Cleveland Barnett, who is representing um a project which involves the NHS and
[2:28] Shantel Osler, who's unable to join us today, who both work on the data foundations project, which explores how routinely collected healthcare data can
[2:36] be used in patient centered ways to improve prosthetic care um after lower limb amputation. It has taken a patient-
[2:44] centered approach to mapping current UK data practices, gathering perspectives from patients, clinicians, and stakeholders. Um, if you want to put any
[2:53] comments or questions in the chat as we go through, please do. And we'll be monitoring the chat and question and answer. So, without further ado, I'd like to hand over to Cleveland.
[3:02] Right. Thanks, Katy. I'll just try and share my presentation. If someone can let me know when you can see it, that would be fantastic.
[3:10] Absolutely. I'll give you a thumbs up when I can see it. Is that okay?
[3:18] It's just loading now. Yes. Perfect.
[3:21] Perfect. Okay. Well, thanks a lot for the introduction, invitation to speak today. Uh really looking forward to talking about this project and how we've
[3:28] we've engaged with our end users to kind of deliver and actually move forward in this project. Um I co-lead this project. So as Katy said, I'm
[3:36] Cleveland Barnett. I'm an associate professor in the school of science and technology at Nottingham Trent University and I think my uh co-lead Dr.
[3:45] Shantel Osler is here in the audience hopefully, um is a consultant academic um physiotherapist at Portsmouth Hospitals
[3:54] University NHS Trust and Hampshire and Isle of Wight Healthcare, and is also um a member of staff at the University of Southampton. And I'd just like to
[4:02] acknowledge the rest of our project team and that this project was funded by the National Institute for Health Research.
[4:08] So as Katie's mentioned in the introduction there, we were interested in in mapping these patient and stakeholder perspectives on routine health data collection, use and sharing.
[4:19] And what I will be talking to you about today is a specific project and the work that we're doing going forwards.
[4:25] But just to give a bit of background to the problem. So routinely collected healthcare data can provide insights
[4:32] into service provision in healthcare and in turn this can lead to improvements in quality, safety and also cost effectiveness.
[4:40] The different levels that you can use this kind of data could be at an individual level to improve an individual's care, a service level to improve a service, or a system level, so in
[4:49] the UK the NHS for example to improve its operation. However, when we looked into the literature, what we saw was
[4:56] that a lot of previous data collection initiatives, so things like registries,
[5:01] tended to be criticized for a lack of patient and clinician uh centeredness.
[5:06] And this is relevant because if you want people who are using these databases and these approaches to engage with them,
[5:13] really um the evidence would suggest that you need to include them in their development and governance. And this is this has been proposed to be the reason
[5:21] that sometimes they struggle to realize the impact on patient outcomes that they might promise initially.
[5:28] We're working in prosthetic rehabilitation and there isn't currently a nationwide data collection initiative in this area of healthcare. There have
[5:37] been some smaller attempts previously but they um they didn't include outcome data and they haven't um been sustainable. They haven't lasted. So
[5:46] that's the kind of background to the problem we're trying to solve: actually, how do we start to move towards a more data-driven uh care environment in prosthetic rehabilitation.
[5:56] So I'd like to start with a bit of a spoiler alert. This is where we'd like to get to. So this is our vision, without trying to sound too grand, is that we'd like to
[6:04] try and implement a prosthetic rehabilitation learning health system.
[6:08] So I'm sure many people interested in this area might already be aware of what a learning health system is, but effectively it's a health system in which outcomes and experience are
[6:17] continually improved by applying science, informatics and incentives as well as culture to generate and use knowledge in the delivery of care. The
[6:25] idea is that you have this kind of cyclical uh approach to practice, which creates data, which informs knowledge, which informs practice, and you have a learning
[6:33] community that is made up of the people that engage with that service um that's underpinned by some technology platforms.
[6:41] So in order to do that we had to do some foundational work. We had to understand what the current state of play was. So the data foundations project that I'll
[6:49] be talking about uh as quickly as I can today um looked to explore current data collection use and sharing within UK
[6:58] prosthetic care. And the idea here was to develop a novel patient centered approach to a nationwide data collection initiative.
[7:07] And really what we're trying to enable here is empower patients to engage with the data, enable clinicians to use data
[7:14] for decision making, and actually create accessible real world research. So we're quite ambitious, I think, and we
[7:23] think we've done quite well so far making progress with this, and I'll outline some of the uh patient centered or co-designed approaches that we've taken.
[7:31] So in terms of the actual setup of the project um we have four or three funded stages and some work that we were doing
[7:39] in parallel that started before the project. But I guess for this audience, who are here I suppose to talk about co-design. We had a patient
[7:48] advisory group which is made up of um people with you know real experience of lower limb amputation or limb absence
[7:55] who worked with us on this project, the patient advisory group. Um, a member was named on the funding application. They were involved in the
[8:03] development of the application. They're involved in designing and running all of these stages. So, it's tricky to talk about how we've done co-design in which
[8:12] area because the whole approach has really been done in consultation with both patients and then also, in parallel to that, a stakeholder group. So,
[8:19] including people from uh charities that support people with limb loss uh different uh prosthetic organizations.
[8:26] But just briefly, we uh started off with a scoping review to look at different um existing patient centered data collection approaches that have been shown in different areas of healthcare.
[8:37] We then started a large scale qualitative study with patients,
[8:40] clinicians and wider stakeholders. The findings from this which I'll describe briefly shortly led to a nationwide online survey. And then we took that
[8:49] survey and engaged in some participatory action research cycles to develop recommendations for our data initiative,
[8:57] our registry. We are then working with our patient advisory group to talk about how we best disseminate our work.
[9:05] So just to give you a flavor of some of these stages. So for stage one, for example, we're engaging uh in interviews and focus groups with patients. So we uh
[9:13] recruited 20 patients through our limb centers and through social media, and our uh limb center staff. So this is people like
[9:21] physiotherapists, occupational therapists, rehab uh consultants. And then we also engaged with a stakeholder group. So we had 17 interviews with
[9:28] healthcare commissioners, the Ministry of Defence, different uh industry partners and also academic researchers in this area. And here we took a
[9:36] qualitative approach to really explore people's views and experiences of data collection, use and sharing. and particularly the perceived barriers and enablers to some of those issues.
[9:48] In the second stage, we took that qualitative information. We were working with a small number of centers, so five centers, um and we wanted to see if we
[9:58] could um get a wider range of views from the whole of of the United Kingdom.
[10:03] There are multiple limb centers in the UK. I think 44, Shantel might correct me on that. Um but you know we want to see if the things that
[10:11] we found in our qualitative study translated more broadly. So we created a survey, again in partnership with both uh
[10:18] stakeholders and with um patients. So the survey went through cognitive and usability testing, so some cognitive interviewing and usability testing with patients, to make
[10:26] sure it's fit for purpose. And we also had a multi-disciplinary team uh survey that uh we developed again with
[10:35] clinicians that would likely be the target uh participants for the survey. This was done online or in hard copy for those that wanted that, and we offered
[10:43] translation where that was required. Um and we had 111 people with limb difference or limb absence
[10:50] complete the survey. Um and 124 MDT members. So, we're really pleased; we feel that we've got a really good broad understanding of people's views on some of these issues.
[11:01] I will just go through these very briefly, just because I know that we're potentially more interested in the kind of patient involvement and co-design aspects
[11:08] of the study. But what we found um when um surveying the patients is that most patients felt that access to their data
[11:16] would be useful, but it would need to be proportionate and accessible, and that they felt that their data needed to follow them. So it's quite frustrating to change service or speak to
[11:24] different areas of the NHS and the data wasn't following them. They felt they needed help to understand their recovery and feel in control of that
[11:31] information, and it might help, um having that data, to choose the kind of health care approaches they were
[11:38] engaging with, but also communication with healthcare professionals. They felt it might help with their motivation, and they were supportive of the idea of it being used to improve services.
[11:50] There were um some interesting points that they raised in terms of data governance. There was a high level of trust in the NHS to manage the data, and they felt that their direct care team should
[11:58] have access to that information, but that consent should be sought for sharing that outside of their immediate care team, and that there needed to be clear
[12:06] and transparent um processes and a kind of governance structure to allow for that.
[12:13] In terms of the clinician findings, we found that there was an appetite for collecting data. They were more than happy to do that, although at present there was a lack of standardization, and
[12:21] data collection tended to follow particular care pathways, which we had also mapped. So at the five centers we engaged with, we mapped where
[12:28] they collected what kinds of data at the moment. There was some consistency, which was good, and the clinicians tended to be incentivized by
[12:35] policy. So where they were required to collect data, they were actually very good and keen to do that. They did tend to use spreadsheets quite uh often to
[12:44] collect data, and that I think was more of a function of a spreadsheet being something you have control over; it enables you to engage with the data in a way that is useful for you, whereas some of the
[12:52] systems that they were obliged to use maybe weren't as user friendly as they might have liked. Um in terms of the usefulness or the use of
[13:00] the data. They had some concerns around accuracy, the challenge of interpreting some of these outcomes, um and
[13:08] would really kind of aspire to understand the outcomes and benchmarking. So knowing where they were, you know, in relation to other services, or where a patient was compared to other people.
[13:18] In terms of their views on data sharing,
[13:20] they felt that there should be uh consent in place to share that information, and access to that data should be role-based, which is uh kind of mirroring a little bit what the
[13:28] patients were saying. There needed to be transparency over data um you know use and sharing and also concern that if
[13:36] patients had access to their data what they might do and how they might interpret information. So yeah, some really interesting findings that we drew
[13:42] from from these studies um and that we used to then develop some recommendations.
[13:48] So we took this initial data, some of the review information,
[13:53] some of the qualitative interviews and focus group work, and our survey, and it went through these um PAR cycles,
[14:00] these participatory action research cycles, which I've got on the figure, and we worked with our um patient advisory group on
[14:09] these. So effectively we used it as an iterative process of refinement to develop the recommendations, the foundation for the development of a data
[14:18] initiative or registry in the future. So I think we really see these as a co-produced set of patient centered recommendations, with really kind of high
[14:26] ownership from the patients and stakeholders that we worked with on these recommendations.
[14:34] So these are what we are now calling the proare data guidelines. These are the patient centered guidelines for data collection, sharing and use in prosthetic
[14:42] care settings. Um I won't go through all of them but these 11 uh recommendations all have some detail that underpin them
[14:49] and really are acting as a roadmap for us to start to design this uh data system which will underpin our learning health system in the future. So really
[14:57] exciting time for us. We're just finishing off this project uh at the moment and getting that information out.
[15:04] So we're looking to... I can see Katy's popped up, which means I need to hurry up. So, uh, the next steps are to, uh,
[15:09] develop some funding, uh, applications to actually start to fund some of this work that addresses these recommendations and and we're really
[15:17] pleased that our stakeholders and our patient group have agreed to keep working with us going forwards. Um, and the idea is that we design, govern, and operate this registry in partnership
[15:26] with patients, clinicians, and wider stakeholders going forwards.
[15:31] So, hopefully I'm not too much over time. Apologies if I am, but more than happy to pick up some questions in your discussion. Just like to highlight my
[15:38] contact details here, as well as my uh colleague Shantel Osler's. Um I've changed Shantel's email to, I think, the most current one. So hopefully if people
[15:46] want to get in touch with Shantel directly, they can do. Thank you very much. That's brilliant. Thank you so much,
[15:52] Cleveland. You weren't over at all. I was just realizing that when you did finish, I would need to turn camera and microphone on at the same time, and I should probably stagger that. Um that's
[16:00] really useful. if you want to put any links as well, Cleveland, um about the project into the chat. I know a lot of people have been enjoying having the
[16:08] links to everything um over the course of the week, they can go and investigate more of these projects for themselves.
[16:13] Thank you so much for that. I'm really looking forward to picking that up in our discussion later and asking you plenty of questions about that like I
[16:19] always do. So, next up we have Senior Lecturer Dr. Jitka Vseteckova from the Open University, whose work focuses on
[16:28] uniquely blended approaches to education via co-designing, co-producing and delivering innovative health and social care solutions with demonstrable national and international impact. So,
[16:41] Jitka is going to discuss the inclusive approach of the Take Five to Age Well, a UK-wide public health
[16:48] campaign designed to empower wide and diverse aging populations and communities. Over to you, Jitka.
[16:55] Thank you so much, and thank you, Cleveland, for the fantastic presentation. It's been really lovely to learn more about the work you do.
[17:05] I will start the slideshow. Yes. So
[17:13] my name is Jitka Vseteckova and, apart from being a senior lecturer in health and social care, I'm also chair of the Caris research group. And you might find
[17:20] that these two roles overlap slightly. I'll be talking today about how we approach engaging
[17:30] diverse communities for healthy aging, with Take Five to Age Well being one possible solution: a national public health
[17:38] campaign, with the impact that we have been achieving so far. As you can see, there are many partners here, and there are going
[17:47] to be even more. Right now our partner matrix has something around 700-plus partners and collaborators across the
[17:54] four nations. This has always been a four nations project and a four nations approach, so we've been approaching the partners in the same way.
[18:04] Thea, who will speak later today, has also been part of that journey. She had some input, especially in the early days, when
[18:12] we were shaping the Take Five to Age Well and using the citizen science platform nQuire, so you might hear more about
[18:19] nQuire today. It's a fantastic tool. We're all aware of the issues that the aging
[18:26] of populations presents for everyone: for the government, for the councils, but also for the NHS. We
[18:35] know that there is demand for more care homes and increased cost to the economy, at different levels and for different reasons. And that also
[18:42] involves care and caring, especially informal carers, who care for a large proportion of the aging population; and the
[18:51] carers are aging as well. There is a decrease in independent living, and a decrease in aging being an
[19:00] okay experience, especially the more diverse the communities we work with
[19:08] are. So there is a reason why we're really looking into diverse, minoritized and underserved communities when doing this
[19:16] work. We also know that the pressures on the NHS are huge, and that's why the three shifts
[19:25] look the way they do, with a focus on prevention, self-management and also digital inclusion.
[19:32] The reasons why people are not preventing ill health, or self-managing ill health as well as they could, are numerous. One
[19:41] of them is access to services and resources, and not just physical access, but sometimes how the resources
[19:49] and services are framed and introduced, and how people become aware of them. So the signposting and the quality of
[19:56] the service, but also the way the service is presented, because language plays a
[20:03] crucial part in how people engage or do not engage. And of course there are behavioral aspects: as we all know, supporting people in
[20:11] changing their behaviors or attitudes is not an easy task. So I'll be presenting our take on that today. The
[20:20] solutions lie in improving health literacy, in setting achievable goals, and, as we've learned, meeting
[20:27] people where they are and helping them a step further to where they would like to be. Preventing ill health and empowering
[20:34] self-management is a nationwide problem, and aging outcomes are not equal. As I've mentioned earlier, we're looking
[20:41] at accessible resources and interventions, which often do not reach underserved, minoritized
[20:49] communities. The reasons are relatively easy to pin down on a board, because the lack of inclusion of those
[20:57] communities often leads to a lack of engagement. And if people do not engage,
[21:02] they cannot get empowered. They cannot get the benefits of whatever intervention we would like to share with them and help them to adopt. So the huge
[21:11] trouble is that people are not being offered what they want or need if there is no co-design, co-production and
[21:18] inclusive approach all the way through, and people are not learning. So there is persistently low health literacy, which I'm sure you will agree
[21:26] with me is a crucial drawback to prevention and self-management nationwide. So the key to increasing health
[21:33] literacy and engagement, and to fostering real-world practical impact, is
[21:41] that interventions need to include participatory approaches and learning from conception to delivery, via co-design, co-production and co-
[21:49] dissemination, and that the co-designed, co-produced resources need to be developed from the very outset, from the planning,
[21:56] with real people who are aging, whose aging is at different stages,
[22:03] because we're all aging from the moment we're born. They're caring for others. They might live with a long-term illness or condition, or multiple conditions. Or they might be practitioners, clinicians or
[22:13] professionals supporting aging populations in their professional roles.
[22:17] And this is how we have approached supporting people with aging well since the very beginning, when I developed
[22:24] the Five Pillars for Ageing Well model that is central to the Ageing Well public talk series, through which
[22:32] we have eight years of public talks delivered by people with lived experience of living with long-term conditions, and by clinicians, professionals and
[22:41] practitioners who support aging populations in their professional roles. And based on these years and years of
[22:48] public engagement, where we have people with lived experience not only co-producing the program but also co-producing and co-designing the talks
[22:57] and co-disseminating and delivering the talks, we have learned several very important lessons. We then thought,
[23:07] okay, so how can we share the five pillars, which are nutrition, hydration, and physical, social and cognitive stimulation,
[23:14] even wider? The Ageing Well public talk series had around 100,000 people engaging after four
[23:22] years from when it started, which was fantastic, but we were sure we could actually do much better. So Take Five to Age Well is a month-long
[23:31] health challenge. It's modeled loosely on Dry January and other pledge-style challenges, and participants pick one or
[23:39] more actions from the five pillars. So we have nutrition, which we renamed Eat; hydration, which we renamed Drink;
[23:46] Move for physical activity; Engage and Connect for social stimulation; and Think for cognitive stimulation.
[23:56] We have made the Take Five available to individuals, groups and communities. We had a digital provision and we also had a
[24:05] non-digital provision. The digital provision was offered via the nQuire platform, a platform that
[24:12] enables citizen science and enables easy sign-up, short
[24:19] surveys and brief communications and messages. That was combined with us sharing, via email,
[24:28] educational newsletters that were tailored to help people learn without the annoying part of teaching
[24:35] them. So what the Take Five really does well, because it's based on the years of engagement via the Ageing Well public talk series, is that we know we're helping
[24:44] people learn. The offline delivery was piloted and then worked on by Age Scotland, who have since adopted the
[24:51] Take Five offline and have been supporting an enormous number of groups, including those living remotely, across
[24:59] the whole of Scotland. In 2023, when the Take Five launched for the first time, there were around five groups. In 2025, when
[25:07] we relaunched, also partnered by Age UK, there were 170 groups that were in touch with Age Scotland for all the
[25:15] co-produced, co-designed materials to facilitate the offline Take Five to Age Well. So we are focused on education, but on helping
[25:24] people learn without educating them, and on promoting health literacy about aging and how to age well.
[25:33] This is our main banner. The Take Five runs in 2023 and 2025 reached over 5,000 participants across
[25:41] the four nations. I was preparing this presentation a few months ago, and our network is actually now at 700-plus
[25:48] UK-wide strategic collaborative partnerships. It is embedded in the Open University's societal mission, which is
[25:57] also reflected in its Open Societal Challenges agenda, and the Take Five is one of the programs that Open Societal Challenges
[26:04] promotes. It's supported by relevant stakeholders: charities, commissioners, health and social care clinicians, practitioners, professionals and
[26:13] the wider public, inclusive of diverse, minoritized and underserved communities.
[26:18] We have noted excellent engagement online with our educational content. We know that average educational
[26:26] newsletters have around a 35% open rate, with nonprofit organization emails reaching around a 40% open rate. The Take
[26:34] Five educational newsletters had an average of 62%,
[26:39] with a high of 73.8% on day one and never dropping below 57.7%, which is absolutely staggering, and I think it
[26:47] just speaks in favor of how well the team has mastered the co-production of the tailored,
[26:54] engaging content that helps people learn without teaching them. We also partnered with Bridget K, who have created a web
[27:03] app for us that's still accessible today. It's called Age Well, and it offers five AI avatars, each
[27:13] specialized in one of the pillars: nutrition,
[27:17] hydration, or physical, social or cognitive stimulation. Feel free to explore that. We had some lovely quotes from Caroline Abrahams, who is the
[27:25] director of Age UK, and from Muir Gray. They are both Take Five to Age Well ambassadors. We had fabulous comments from people from all walks of life.
[27:37] So these are some quotes from participants about how people enjoyed taking part and learning, how much it helped them improve self-care and self-management,
[27:49] how much it helped them to change attitudes and behaviors and become more confident in taking small steps every day
[27:56] towards better health and well-being. We have some more quotes here as well.
[28:02] In 2023 we realized how low the uptake was among diverse, minoritized and underserved audiences.
[28:13] Around 4.3%
[28:15] of participants were from those communities. So in 2024, as we were running towards the
[28:23] relaunch of the Take Five in 2025, we ran workshops across the four nations, in Luton, Edinburgh, Belfast, Cardiff
[28:31] and Cardigan. We used purposive sampling via our active partnerships, and some of the findings were implemented into the Take Five in May.
[28:41] We have been able to really gauge interest from the diverse communities.
[28:46] We were praised on the existing partnerships and the logo, and the engagement with the emails and the educational
[28:54] content was very much praised. We nevertheless need to do some more work:
[28:58] create roles for community outreach champions, continue with the co-design, describe healthy
[29:06] choices a bit better, and help people find the Take Five more easily. So we got some good pointers on that. The direct outputs
[29:13] from this work were direct changes to the actions for the Take Five 2025, and a direct increase in participation from those diverse audiences to 5.8%.
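The scaling argument the speaker makes next ("based on four workshops we can all do the math") can be made explicit with a quick back-of-the-envelope sketch. This is purely illustrative and not from the campaign's materials; it uses the figures quoted in the talk, and note that the talk's quoted "1.3%" increase sits slightly below the 1.5 percentage-point difference between 4.3% and 5.8%, so one of the transcribed figures may be approximate. The 40-workshop projection also assumes strictly linear scaling, which is the speaker's assumption.

```python
# Illustrative sketch of the speaker's scaling argument (figures from the talk;
# the linear-scaling assumption is the speaker's, and the 40-workshop
# projection is hypothetical, contingent on funding).

baseline_pct = 4.3   # share of participants from minoritized/underserved communities, 2023
after_pct = 5.8      # share after the co-design workshops, 2025 rerun
workshops_run = 4

# Observed gain per workshop. (The talk rounds the total gain to "1.3%";
# 5.8 - 4.3 actually gives 1.5 percentage points.)
gain = after_pct - baseline_pct
gain_per_workshop = gain / workshops_run

# Naive linear extrapolation to 40 funded workshops.
planned_workshops = 40
projected_pct = baseline_pct + gain_per_workshop * planned_workshops

print(f"gain so far: {gain:.1f} points; projected share with "
      f"{planned_workshops} workshops: {projected_pct:.1f}%")
```

Whether the gain would really scale linearly across ten times as many workshops is of course an empirical question; the sketch only spells out the arithmetic behind the speaker's claim.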
[29:24] So we managed to have four workshops before the Take Five rerun in 2025, and we're talking about a 1.3% increase,
[29:33] which might sound very little, but based on four workshops we can all do the math: if we have funding for 40 workshops, the number is going to
[29:41] be boosted proportionally. We have created several additional resources: an intergenerational toolkit and a toolkit supporting carers of people
[29:50] with long-term conditions. We're currently working on a pre-retirement toolkit and have a dementia toolkit in the pipeline, working very closely
[29:57] with Age Scotland. These are some of the Take Five intervention development phases. So we're in the scale-
[30:05] up plan, and we've gently started thinking about funding for future plans, for rerunning in 2027 as a four nations
[30:13] campaign again. Take Five has been adopted by the NHS, where we're running a program supporting people who've been diagnosed with low-grade hematological cancer.
[30:23] So that's currently running. We're working with Hillingdon Council, the GP Federation, the NHS and the North
[30:30] Central London Integrated Care Board on using the Take Five for frailty prevention. We have been
[30:39] adopted by Haringey Council, where the Take Five inspired a beautiful experiential aging festival in 2024 and 2025.
[30:48] Age Scotland has completely adopted the non-digital Take Five, and we have also worked with Pickleborn in Scotland. We have realized we're achieving impact in
[30:56] increasing health literacy while tackling health inequalities: improved self-care and self-management, confidence to take action, changed attitudes and
[31:04] behaviors, and also an improved social fabric. We won a couple of awards recently, and we're really grateful to
[31:12] all our sponsors, partners, and funders who have been enabling us to do all this work.
[31:20] I hope I didn't overrun too much.
[31:22] That's okay. Not a problem. Thank you so much for that, Jitka. It's lovely seeing just how extensive the program
[31:30] has become, and the rate of adoption, and just how much involving participants and patients all the way throughout has changed the outcomes of
[31:39] the project, changed how you disseminate information around the project, and how that has influenced it. So our next speaker
[31:49] is Dr. Chase Styrus, whose work focuses on advancing equity in trans and gender diverse healthcare through
[31:56] participatory action research. Chase's research centers on co-production, open research practices, and transparent,
[32:03] auditable decision-making to improve healthcare access, doctor training, and system design. Please join me in welcoming Chase to the stage, the virtual
[32:12] stage. Thank you, Katie. Very nice introduction there. So, as Katie said, I'm Chase
[32:21] Styrus. I'm a postdoc in the Research Department of Medical Education within UCL Medical School. But today I'm going to be talking about some work that I conducted during my PhD,
[32:32] which was on improving healthcare for trans and gender diverse people, and that was at Nottingham Trent University.
[32:43] So, just to provide you with a little bit of context to the research before I get into the co-production and participatory elements: trans and gender diverse
[32:51] people, meaning people who identify with a gender different to their sex assigned at birth, have far poorer health outcomes in comparison to cisgender
[32:59] people, meaning people who aren't transgender. Now, this is for a variety of reasons, but one of the big driving factors is the social
[33:06] determinants of health, through experiences of discrimination, lack of stable housing, poor family relationships, and lack of a good social
[33:13] community. Yet when we look at medical services, or any health services, for trans and gender diverse people, they're predominantly
[33:22] clinically focused. So this is things like hormones and access to surgery, which have their own problems in and of
[33:28] themselves. But one of the big issues is that whilst this care is important, it doesn't attend to those social determinants of health, and it doesn't
[33:36] see trans people as holistic people with day-to-day lives outside of medically focused care. Now, this is something that has been acknowledged by the NHS, and in their long-term plans
[33:46] they've called for holistic care, not just for the general community, but specifically for trans and gender diverse care as well. Yet this is still something that, to date, hasn't been
[33:55] achieved. Now, we know that one factor that's really important as a social determinant is social support. So we looked to social support for the solution.
[34:04] We know that trans and gender diverse people themselves have identified social support as really helpful for their well-being. And we see this not just in qualitative literature,
[34:13] but also in intervention-based literature. We know that social support has a tangible impact not
[34:20] just on social and community well-being, but it actually improves physical health as well, through reducing stress responses. We know this from theoretical
[34:28] models too, because it mitigates those stress responses. Therefore we looked to implement a model within trans and gender diverse healthcare that
[34:35] begins to integrate social support as a core part of healthcare delivery.
[34:40] And we saw social prescribing, which is a way of facilitating access to meaningful social groups and something that already exists within the NHS, as a
[34:47] potential solution. So this is broadly what the project aims to do: to look at social prescribing as a solution to address the social determinants of poor health.
[34:59] So we conducted three studies. First, a systematic review to understand the psychological processes that make social prescribing effective, because the processes weren't quite clear yet,
[35:09] arriving at an understanding that it was meaningful social relationships. So, not just generic social groups, but groups where people feel connected to other
[35:18] people, where they have a sense of shared experiences, shared connectedness,
[35:21] shared trauma, those sorts of things. This is what was really important. So we then conducted a qualitative study, speaking to trans and gender
[35:29] diverse young people and also healthcare professionals, to understand what those meaningful groups might look like and, crucially, how that looks in practice:
[35:37] how they might use a social prescribing model, what they envision that to look like, and how they anticipate it will help them to navigate some of the challenges that
[35:45] they're experiencing. And then we conducted a service evaluation of a gender clinic that had begun to implement social support, to understand, once you
[35:54] provide that social support, whether it is something that they engage with, and whether there is a better way of delivering that service.
[36:00] So in order to do this, we didn't want to just focus on what the academic literature said. That's really important, but we wanted to make sure that the research was co-designed
[36:08] throughout and really included the views of the people who would be using the social prescribing service, but also the people who would be delivering
[36:16] the care itself. So we adopted five key open research practices throughout, which I'll go through in the next few slides. These were: co-design with
[36:25] two NHS gender clinics; participatory research, working with the communities who will be using and delivering the service; pre-registration
[36:34] of all our research plans, so the research was really clear and transparent; open data, so people have access to all the information that we
[36:42] had; and open publishing, to make sure everybody can see. Thank you for that message in the chat, Katie. Open
[36:49] publishing means people can have access to all the resources as well, so there's no paywall behind anything.
[36:56] But just before I get into the intricacies of how we did open research and how we took that participatory and co-production approach, I think it's
[37:04] important to pause and think about why we're doing co-production. Co-production is a word that you hear a lot about nowadays; NIHR applications require PPI elements.
[37:14] But it's really important that we think about what it adds to the research that we're doing, and that we don't just pile lots of different things on top of each other without understanding how they
[37:22] work. So, for trans and gender diverse communities in particular, how we conduct research is as important as
[37:30] what it finds. And this is because, typically, conventional research in trans healthcare has studied trans people but hasn't necessarily
[37:39] worked with them. What this has done is that a lot of professionals have been spoken to, and their perspectives on what improvements
[37:47] should be made in healthcare have been actioned. But what we see as a result is that trans and gender diverse young people's health outcomes don't improve. So what this suggests is that what
[37:56] professionals think about improvements in healthcare isn't necessarily the solution for what the community needs. So that's why it's
[38:03] really important to speak to these communities themselves. And participation itself can actually cause harm: by introducing something such as
[38:11] mandatory surveys with binary gender categories, we can invalidate trans and gender diverse identities, and this can actually limit their engagement
[38:19] in research as well, because they feel, right from the outset, that they're not respected. So what this demands in research is that we're transparent about what we're doing, so that communities can
[38:28] trust that the research is for their benefit and not to exclude them from services. We need their participation, so they shape the research questions, not
[38:36] just answer them, and we research things that are important to them as well. Openness, so that the outcomes we produce don't just get
[38:44] to academics like ourselves, who have access to journals or policy, but actually to the communities that will be accessing these services as well. And
[38:53] accountability to participants first, more so than just to your funders and your peer reviewers, making sure that the research really reflects their needs as accurately as possible.
[39:05] So the first thing we set out to do was really have that co-design with NHS clinics. As I mentioned, when you only involve professionals, you only get
[39:14] a professional-shaped solution. Having said that, we still wanted to have gender clinics on board, because this is where we planned to implement the social
[39:21] prescribing service. So we set up advisory boards: we had two gender clinics who acted as advisory boards, and we met with them at their
[39:29] pre-scheduled MDT (multidisciplinary team) sessions. This allowed us to get input not just from a specific set of doctors, or usually the clinical
[39:37] director, but from the whole team: GPs, nurses, academics, clinicians. This allowed us to engage them, but without adding the additional burden of
[39:45] taking up their time. And this meant that we could create a pathway that was feasible, and not something that would be unfeasible to implement within a gender
[39:53] identity clinic. We also included trans and gender diverse young people.
[39:59] We opted to have one-off interviews instead of a rolling advisory board, because we wanted to reduce the systemic pressures on trans and gender
[40:06] diverse people, because feeling drowned in research participation is something that they often report, while still genuinely centering that lived
[40:14] experience. And what we found from this is that they told us they actually wanted activity-based groups, not support groups, which is what professionals
[40:22] suggested. So you can see already that by having those two perspectives, we could begin to identify what's feasible, but then also what actually works for trans
[40:29] and gender diverse people, so we could come to a consensus on how to achieve both of those things. We also spoke to
[40:36] another set of professionals outside of NHS clinics, to get a broader view from people who work in the community sector,
[40:43] for example, and this enabled us to look at a completely different side, not just medical care, so we could begin to bring that social support and
[40:51] community support angle into the medical profession. And then we integrated lived experience within the research team as well. So several members of the
[41:00] research team were either of trans and gender diverse experience, or they had close family members who were, or they'd worked with these communities a lot. And
[41:07] participants reflected this back, saying that they felt that their needs had been well supported, that it felt good to help with the research, and that it's important work.
[41:18] As I mentioned, it's also important that all of this work is as open as possible, so people can have trust in the research and the outcomes that are
[41:26] coming from it. So we pre-registered the systematic review on PROSPERO and the qualitative studies on the Open
[41:33] Science Framework. This allowed us to demonstrate our research questions,
[41:36] methods, and analysis plans before data collection. And particularly in trans research, or research that's quite contested or seen as biased, this
[41:44] allowed us to show that this is what we aimed to do, this is how it helped the community, that we didn't change anything, and that everything accurately reflects their experiences.
[41:53] Open data also supported this. So some of our deidentified transcripts, with consent from participants (it was an
[42:00] optional consent), are deposited on the UK Data Service, enabling other people to look at this data. Perhaps they want to understand where the
[42:08] research suggestions have come from, or they want to use the data for their own research. What this does is it allows people, again, to trust the research findings, but also to see that
[42:18] they can conduct research with trans and gender diverse communities without putting that burden on them again to re-engage with research. And
[42:26] then, as I mentioned, open publishing. So all the findings are published in open access journals, and prior to that were shared as preprints while under review,
[42:34] with one of our preprints being downloaded over 40 times in its first two weeks. So we're seeing that having that data open as early as possible is something that's
[42:41] really valuable, ensuring that these co-produced insights are clear to everybody.
[42:49] And this is a phrase that I know Katie likes to hear over and over again:
[42:52] to have the data (thanks, Katie) as open as possible but as closed as necessary. So, making sure everybody
[43:01] has access to as much as possible, to see how we're producing meaningful outcomes.
[43:07] So, in terms of the benefits, we saw that trans and gender diverse people's lived experiences genuinely shaped our
[43:13] social prescribing pathway. We saw that the research was reliable when it could be accessed by others who might be critical of the research.
[43:23] The transparent documentation allowed the research to be reproducible, and also accessible, by allowing everybody to access the findings.
[43:32] So I want to end on a note of honesty, because no co-production or open research is perfect, and there's definitely room to improve. As I
[43:40] mentioned, we didn't have an established, dedicated youth advisory board, which might have allowed us to have deeper, more iterative collaboration. So future work
[43:48] that I'm working on is looking at lower-burden ways of having advisory boards, such as rolling advisory boards, or having online engagement as opposed to
[43:57] consistent meetings. Consent is ongoing, allowing participants to have that open control over
[44:05] how much of their data they want deposited somewhere like the UK Data Service, and being a facilitator, not a gatekeeper. So researchers should
[44:13] co-create research with and for the communities, rather than imposing predefined frameworks. So when you're conducting open research and
[44:20] co-production, it's not just about adopting every single co-production tool, but thinking about what it is you want to achieve with your research,
[44:27] and what practices best complement those aims, in order to produce something that's as accessible for the communities as possible. Thank you.
[44:38] That is fantastic. Thank you so much,
[44:40] Chase. And as you know, it makes my soul happy every time I hear "as open as possible, as closed as necessary". Finally, last but not least, we are
[44:49] joined by Professor Thea Herodotou from the Open University, whose work focuses on community citizen science and social
[44:56] justice. She is going to discuss approaches in mental health, drawing on the PEACEPLUS-funded Peace of Mind project, showcasing how collaborative,
[45:04] inclusive design approaches can support youth well-being across diverse settings within formal and informal education. So I'll hand over to Thea now. And just
[45:13] a reminder that after this presentation there will be an opportunity to ask questions, so do use the Q&A and chat if you want to put
[45:21] any questions in for our speakers. Over to you, Thea.
[45:24] Thank you so much. I hope you can see my slides well. So my talk is about the Peace of Mind project, and I want to give you a bit of the background behind the
[45:32] project, because I think it's a unique type of research project: it's research but also application at the
[45:40] same time. The project is focused on mental health for peace. What we know from research is that conflict-affected areas experience a lot of mental
[45:49] health issues. We see cycles of violence and aggression. We see intergenerational transmission, with children inheriting
[45:56] anger and pain from their parents. And we see big organizations like the United Nations talking about
[46:03] addressing psychological and social needs for building sustainable peace,
[46:08] so they have linked mental health to sustainable peace. As part of this picture, the project is focused
[46:16] on Northern Ireland and the Republic of Ireland. You may be aware that these areas experienced a lot of
[46:24] conflict for more than 30 years, with youth facing trauma, mental health issues and segregation.
[46:31] The mental health statistics for young people in the area are considerably poorer compared to the rest
[46:39] of the UK. Therefore, given this situation, the Special EU Programmes Body put out this call, the PEACEPLUS call, for
[46:48] projects that would enable peace in the area, and we were successful in getting the funding.
[46:55] Uh so this is a project that is going to last for four years. Uh it is costing uh
[47:02] the European uh Union and the UK government uh more than €7 million, and it's a cross-border, multi-partner
[47:10] collaboration that aims to tackle this intergenerational trauma and poor well-being of young people in Northern Ireland and the Republic of Ireland. Uh
[47:19] the project aims to promote positive mental health and well-being to 25,000 young people. So, we are going to
[47:27] deliver a program on mental health and well-being to 25,000 people by March
[47:33] 2028. Uh, the program uh is a six week program that is about building safe relationships, tackling stress, building
[47:41] self-confidence and um ensuring uh good use of social media and developing a sense of community. So, a little bit
[47:50] about the partners. Uh the lead partner organization is Verbal. Verbal is going to deliver the program to mainstream
[47:58] secondary schools. Uh and young people from that program are going to be trained to deliver the program further
[48:06] to their um peers uh in subsequent years. Uh the approach they are following to
[48:14] delivering the program is based on storytelling and is called creative guided reading sessions. So it is stories that uh
[48:21] are teaching uh young people uh aspects of mental health and well-being, and they aim to deliver to more than
[48:28] 18,000 youth. Now Cedar is focusing on a different uh demographic.
[48:35] Uh so it's special schools and community groups that are facing
[48:42] complex learning needs. Uh so they are trying to engage these groups with the program. Uh their version of the program
[48:50] is based on interactive group work, like uh talking and listening games and presentations, and they are going to deliver
[48:58] it to more than 2,000 young people. Uh, sorry, just a second. Uh
[49:05] Inspire is the third organization that is delivering the program to communities and is focusing more on disengaged um students and
[49:15] those that require early support and early intervention, and they're going to deliver it to more than 4,000 young people. Now the Open University is a
[49:24] partner that is linking in a way all other partners together. We are leading the research design and evaluation of
[49:30] the program, which is uh based on principles of co-design and
[49:35] co-production, and in particular I want to focus this talk on one of the aims of the project. So one of the outcomes is
[49:42] to produce a validated model of how we can do community citizen science in mental health in ways that enable
[49:50] meaningful participation of young people. The reason we have this as one of our aims is because we want to tackle some key challenges in the field of
[49:58] co-design and co-production. And I listed here five of them. So the fact that participants are only consulted in
[50:06] one stage of research, most of the time at the start of it. The fact that underrepresented groups are often excluded from co-produced research. The
[50:14] lack of theoretical frameworks that detail how we can co-produce well. And the fact that a lot of the projects do not really tackle uh issues like power
[50:23] dynamics: how we enable decision making with the people we co-design and co-produce with, and how we ensure
[50:32] that we empower all of our participants to have a say and a voice in our activities.
[50:38] So having these uh challenges in mind, we came up with this three-layered approach as to how we can achieve this level of
[50:45] co-production. The first layer is design-based research. So uh every year we are collecting evidence from the project
[50:53] that informs its iteration and refinement. Uh and this uh evidence is coming from the evaluation of the program with young people and with the
[51:02] facilitators that are delivering it, also the evaluation of our methods and the ways we are consenting youth, the delivery of the peer method
[51:10] training, and our digital resources. So these aspects are going to be uh redesigned and refined through data that
[51:18] is coming in from the project participants. We completed the first cycle of iteration and if we have time I'm going to show you how this worked in
[51:26] practice. Now the question is how do you do design-based research in practice, and we move to the middle level of the pyramid. Uh here we used what we coined as community-led citizen science.
[51:38] So our approach is that we aim to actively engage all of our participants in the scientific processes and the
[51:45] decision making of the project. So we found mechanisms through which their voice will be heard and
[51:53] inform our uh decisions throughout the project, and these mechanisms are uh grouped around a set of online tools and
[52:02] co-production activities. So, we are using the nQuire platform, which is the award-winning citizen science platform
[52:09] of the Open University to host data collection activities. But perhaps more importantly, we're going to use it as a
[52:17] tool that will help young people come up with their own ideas for mental health studies, which they're going to design and deliver to their peers. The second
[52:26] mechanism here around uh co-production is the co-design sessions and the youth advisory assemblies. So the project aims
[52:34] to deliver 12 assemblies throughout uh its duration. The aim of the assemblies is to ensure that the project activities
[52:42] reflect the values and needs of young people. While at the same time we are planning for 25 co-design sessions
[52:49] with young people uh which will help us uh co-design all aspects of the project in ways that meet the needs and requirements of diverse youth.
[53:00] So what does this look like in practice?
[53:03] Um I want to thank here uh the post-doctoral researchers working on the project, Dr. Jessar and Dr.
[53:10] Natalia Dein, who are actually working with young people directly and are delivering uh those assemblies
[53:18] and co-design sessions. So here is an example of a youth advisory assembly that was asking young people
[53:26] to feed back on the program uh they attended. Uh so uh Jess came up with this idea of the river of change, where
[53:34] young people were asked to note down calm waters, the rocks they faced during the uh delivery of the program, the
[53:42] moments they felt more supported and what they would change and do differently in the future. So they had to draw this on a piece of paper and put
[53:49] the different pieces around what worked and what didn't work on this map, let's say. Here I have an example from Natalie
[53:58] of a co-design session. So Natalie tried different game-based approaches to engaging young people with those
[54:06] sessions. Uh the young people she worked with were facing additional needs and requirements. So she observed that many
[54:14] of them were likely to say yes to everything or try to please the facilitator and the researcher. So in an effort to make all voices heard,
[54:23] she came up with this idea of the toolbox. Uh she gave them 10 tools and they had to discuss and decide
[54:32] which six of them could fit into the box. Uh so in a way each item uh brought up discussion and communication and
[54:42] made uh sure that all of uh the young people in the room had a say in deciding uh which six topics were most important to go into the toolbox.
[54:52] Um for the design-based research, we completed the first cycle of iteration, and the outcome of it was a best practice guide that we shared with all
[55:00] of the facilitators that are delivering the program uh as a means to improve their practices and ensure that what they are doing aligns with the evidence
[55:08] we are collecting from the project. The takeaways of this project are that we are doing community
[55:15] citizen science and co-production uh in ways that are tackling key challenges in the field. We are trying to involve
[55:23] participants in all stages of the process, from start to end of the project. We are including underrepresented groups intentionally,
[55:31] right from the start, in all of those processes. We are collecting data uh and evidence to show other colleagues how to co-produce well. We are
[55:40] addressing explicitly power dynamics uh in the design of the activities. And we aim and we hope that we're going to have an evidence-based framework by the end of the project that others can use.
[55:49] Thank you.
[55:54] That is great. Thank you so much for that, Thea. And can we just say a massive thank you to all of our panelists for their fantastic talks: to
[56:03] Cleveland, Chase, Thea, and Yuka. Um, so we now have an opportunity to maybe take one or two questions. Um, so if you want
[56:11] to put any questions into the chat or into the Q&A function or if you want to raise your hand and ask a question to our speakers directly, please do. Um I
[56:20] just wanted to start off um actually picking up on something that you mentioned, Chase, around that reflection process of when you're doing co-design,
[56:27] and how it's really important to reflect and think about how you refine and improve that process further. And one of the questions I wanted to put to the
[56:35] panel was how do you build past participants into the end of projects into that reflection stage? Um so that
[56:43] when you're thinking sort of about future developments, it's taking in their perspectives as well as your perspectives on how it might go
[56:50] differently. I don't know if anyone wants to sort of jump in on that first.
[56:59] Oh no. Am I going to have to pick on people? Oh, don't make me. I thought it was a question for Chase, to be honest. Oh, it's a question for everyone. Sorry,
[57:08] question for everyone. It was just spurred by something that Chase mentioned. I can go. Go for it.
[57:17] I think one of the things that I've started doing more recently in projects is a combination of member checking and workshops. Um, so once we get to a point
[57:26] in the project where we've got all of the findings, um, producing a one-page summary and going back to the participants and saying, "Does this
[57:33] accurately reflect your experience and is there anything that's missing?" Um,
[57:37] and sometimes it's the case that some things are missing and that we've missed something, but sometimes it's the case that not everything is possible within one project. Um, in which case that
[57:45] becomes a platform for developing something else. Um, and then if we have a project with um, something that's more deliverable, something that's
[57:53] actionable. Um, so at the moment we're developing some toolkits for doctors in acute care. Um, so we're running workshops with those doctors to have
[58:01] them co-produce the workshops by saying these are the insights that we've got.
[58:05] This is the um anticipated tool that you can use to manage um whatever it is that they're managing. Um, and then having them provide some insights into what
[58:13] future uh toolkits and workshops might look like as well. That's brilliant. Thank you, Chase.
[58:19] Don't know if anyone else wanted to sort of reflect on their experience of that.
[58:24] How how do you sort of manage the the expectations of your your stakeholders,
the people you're co-producing with, as well as sort of juggling that with the research side of things as well? How do you manage that scope? Yuka?
[58:40] Yeah, thank you. I'd be happy to take that one because it's been something that's been on my mind actually as I've been listening to Chase, Thea, and
[58:47] Cleveland. It's it and it made me reflect back on how we did what we did and why we did it that way. To me, it's
[58:56] really the stakeholders, people with lived experience, uh CEOs of organizations or charities who really want to fix
[59:04] a problem and leave people better uh for it. So it's listening to the
[59:11] stakeholders, who actually have ideas on how we can reach real-world practical impact, that has always been very
[59:28] that the research becomes a vehicle to help reach that impact but I'm actually putting the stakeholders in the driver's
[59:36] seat. So I really humbly sit and listen, and then tailor the research design to make sure it's a robust study, it
[59:43] delivers an evidence base, it's safe for everyone to take part in, etc.,
[59:48] around the ethics needs and all that. But it's really been putting the stakeholders, whomever they
[59:56] are, in the driver's seat. Um, which then makes it actually much easier, because as a researcher, you take a step back and you really help it happen.
[1:00:06] Uh, and then you're not so surprised that you're actually reaching an incredible impact.
[1:00:13] That's brilliant. I don't know. Do you want to come on in on that, Cleveland? As I know your work touches on that.
[1:00:18] Yeah, I can do. I think, um, in the project I've talked about, where we've had stakeholders involved in every stage, it's
[1:00:26] been a bit of a shift in the sense that we never bring them in because they're always in. They're kind of the team. Yeah. So, it's kind of a different way.
[1:00:33] I suppose it's definitely not novel, as Chase has mentioned, we're not recreating anything new here, but you know when
[1:00:41] we're talking about dissemination, that was a discussion that we were having continually, so there wasn't really a thought of how do we bring these people in, because they're kind of driving that
[1:00:49] as well, right? So when a particular charity says, you know, this is the best way to get this to our kind of membership and our audience, you go, well, okay, we'll come down and do a talk at
[1:00:57] that event you're doing then. So it's, you know, to the point where the academic papers are probably the last thing that we'll produce out of
[1:01:04] the project. It's not the least important thing, because you know the university want to see that kind of stuff, but it's the thing that's least
[1:01:12] relevant to our stakeholders. So yeah, it's because they've always been in, we've just kind of worked with them to deliver it in a way that's useful for them.
[1:01:20] Yeah. And it's something that struck me as you were all speaking how it might have changed how you think about dissemination and how you think about
[1:01:28] how it's structured, what its purpose and goals are, what platforms and vehicles for dissemination you might use. Thea, I know you had your hand up.
[1:01:36] Did you want to... What I wanted to say is that this is a general shift in how we think as researchers about our
[1:01:45] participants. Uh I think there are more and more people trying to engage communities and organizations in the
[1:01:53] projects they design, right from the start, even at the stage where they put the proposal together and are asking for funding, which I think is a great way
[1:02:01] of doing research, because it's grounded in the realities of your participants and you make sure that you are designing something that would definitely have
[1:02:08] impact on the lives and realities of uh your end users, rather than doing something that you just read about
[1:02:16] in books and uh discuss with your colleagues within the walls of a university. So I think there is a gradual shift towards
[1:02:25] engaging stakeholders more meaningfully in in research practices.
[1:02:29] I think that's a fantastic note to leave on as well. Um, oh, Cleveland, okay, is it quick? Just very quick. I just wanted to pick up on something that Chase mentioned,
[1:02:38] because I think it's a really important point around not just using all co-production tools all of the time. We
[1:02:47] were talking about data, and it turned out what we meant by data and what other people meant by data and understanding of data was very different, and sometimes people were very clear about what they
[1:02:56] didn't want to get involved in and what they felt qualified to contribute in. So I think, and I read this in some of the literature reviews we looked at, it's almost not to
[1:03:04] put too much pressure on some of the people you're working with, because they're like, "Well, you're the researcher." Like, you're bringing something as well. So, a bit of a cautionary tale of not just to go to
[1:03:12] somebody and say, "Co-produce this research for me." They say, "Well, you know, I'm not the researcher.
[1:03:16] We need to work together." It's not adding burden to those groups, I suppose. Yeah. Sorry. No, no, no. Really useful comments.
[1:03:24] It's one of those things. There's always with a really good panel of speakers,
[1:03:28] there's always more questions than there is time. Um, but I thought all of your contributions have really sort of come back to those themes that we've
[1:03:37] discussed this week around collaboration, impact, thinking about innovation. And when you're rooting
[1:03:43] research in the lives and realities of participants, as you said, I think it naturally forces us to think along the lines of collaboration, innovation,
[1:03:53] impact. It's by necessity, because you're thinking with your participants in mind, you are creating knowledge with them that changes how we do things. So,
[1:04:02] thank you so much to all of our panelists. Um, sorry for running a couple of minutes over everyone. Thank you so much for coming along and supporting our wonderful speakers and
[1:04:11] open research week. We hope to see you in some of the other sessions. Uh, one final round of applause for our wonderful, wonderful speakers. Uh, and
[1:04:18] we look forward to hopefully seeing you in other sessions.