MASHA SOMI:
Good afternoon everyone and thank you for joining us today. I'd like to begin by acknowledging the Traditional Owners of the lands we are all joining from today. I join from Ngunnawal Country. I pay my deepest respects to Elders past, present and emerging, and I acknowledge our First Nations colleagues who are here with us today. Welcome to the March 2023 Medical Research Future Fund Webinar. I'm Masha Somi, and I'm the CEO of the Health and Medical Research Office. I'm joined by Professor Caroline Homer. She's the Deputy Chair of the Australian Medical Research Advisory Board. Today, we're co-hosting a session with a wonderful panel to talk about how applications to MRFF grant opportunities are assessed. Our panel includes people with experience on MRFF Grant Assessment Committees, and we've asked them to share with you their experiences and the insights they have gained on those committees. We have Professor Walter Abhayaratna, Associate Professor Sarah Norris, Dr BJ Newton, Professor Michael Kimlin, and John Stubbs, who is the Chair of the MRFF Consumer Reference Panel.
I'll ask each panel member to give a little bit of information about their affiliations when they start their presentations. Before we move to the presentations, we wanted to share with you a little bit of information about MRFF Grant Assessment Committees. The grant assessment committees bring together expert reviewers with a range of experiences and expertise to consider applications. There is always a consumer voice on assessment committees, and we try to have consumer perspectives on all committees. We publish an honour roll each year listing all members of grant assessment committees to acknowledge their important contribution to our work. We also have a perpetually open portal that allows people to nominate to join a grant assessment committee. And there's information on the slide that we just showed about that portal. I'll now hand over to Caroline, who before joining AMRAB in 2021, contributed to many MRFF Grant Assessment Committees.
CAROLINE HOMER:
Thanks very much, Masha, and welcome to you all. It's great that many of you could join us. I'm actually joining from Thailand, so I can't Acknowledge Country where I am, but I, of course, acknowledge the Australian Aboriginal and Torres Strait Islander lands that you are all joining from today. So we hope this webinar will be useful for you to understand how assessors run through the process of looking at MRFF grants. Many of you, I know, will be really familiar with the NHMRC grant review process, and I think as we go through this afternoon, we'll show the similarities and then the differences; there are probably more similarities than differences. So as an applicant, I think it's really helpful to think about the reviewer who's going to look at your grant, because you're writing for that reviewer. And then many of you will also end up being assessors, or are already assessors, and contributing hugely to the work of the MRFF, and the MRFF, NHMRC and any other granting agency cannot exist without peer reviewers.
So we're hugely grateful to you for the work that you put in. I know it's huge and unpaid and unloved, but it is really much loved, I assure you. At MRFF the process is similar to other grant review processes: committees are established, and Masha has just shown you a slide with some of the people on a committee. MRFF ensures that we have consumer involvement, and John's going to talk a little bit more about that in a moment. We manage conflicts of interest really carefully; like for all grant review panels, people with a conflict are excluded, and if they're on the panel, they leave the room when conversations happen. So rest assured that your conflicts of interest are really carefully managed, and if there's any question about it, the panel has a conversation and the chair manages the conflict of interest decision. MRFF uses two grants hubs: one is NHMRC and the other is called the Business Grants Hub. We're not going to go through the details of those today, but they're very, very similar. You'll be familiar with the NHMRC Grants Hub; the Business Grants Hub is very similar.
There are spokespeople assigned, sometimes two, sometimes more, depending on the topic, and that initial scoring happens before the panel meets. We have an independent chair on MRFF panels, much like NHMRC, and that person does not participate in assessment or scoring; that's the role that I've held most often on MRFF panels. We run through a process on the panel of spokespeople talking to the application. We then open it up for conversation, for discussion. People score, and then we have a conversation about overall value and risk, and Michael's going to talk through how that works with us this afternoon. All members score the application. It is mostly done on Zoom these days; I don't think we do any MRFF panels face to face. And every grant gets the same amount of time. That's the role of the independent chair, because we know assessors are really smart people, and the longer you let them talk about a grant, the more holes they'll find in it. So it's really important that every person gets exactly the same amount of time, and every application gets a fair hearing.
And those challenging conversations are managed really carefully by the independent chair to make sure that it's as fair as we possibly can make it and that one person doesn't dominate the decision making on any panel. So I'm going to stop there, I think Masha, and hand back over to you. Really happy to take questions at the end, but I think we'll run through the panel and super delighted to have such fantastic people speaking today. Thank you.
MASHA SOMI:
Thank you, Caroline. And I'll hand straight over to Walter, who will be talking about the project impact criterion.
WALTER ABHAYARATNA:
Thanks very much, Masha. So, as people are probably aware, Canberra is where I'm based. I'm a cardiologist at ACT Public Health Hospitals and also the Professor of Cardiovascular Medicine at the Australian National University. So I'm on Ngunnawal Land, and yuma everyone. I'm going to be talking about Criterion One, which is project impact. People are probably aware of the importance of this: it's 40% of the weighted score. And I'm going to take the approach of speaking from the perspective of the grant assessment panellists, because I think that will give applicants an idea of what we're looking at. So I'll start with the general. When we look at the applications, perhaps the first couple of times, we look very much at the grantsmanship. That's a skill that is honed, and your senior colleagues and people who have been successful will be able to help you. Examples of things that strike us when we read it the first and the second time are the clarity, the format, the fact that it's not full of words, and that there are some illustrative diagrams and schematics that can help us understand with fewer words.
And in fact, that's probably a good general rule: less is more. If you can hone down your words and still have each word be meaningful, it's very powerful. After the second reading is where we start scoring for the cull of applications marked not for further consideration. After that, we really look at the quality, just before we look at the grants again the day before and on the day, and that's the research quality. It's not just about grantsmanship, and that's when we unpick problems that may be in the grant application. For the impact section, I'd like you to consider that the most valuable document is the grant opportunity guidelines, because that's where the grant opportunity and the desired outcomes are clearly delineated for your program. You have to almost have that dissected and very clear, because you're going to talk to that in your application. In the proposal, the two areas that are most important for the score in the project impact criterion are Section A and Section F. Section F is the measures of success statement.
Now I'm going to go through the scores and how we score it, and it should be very much alongside the grid matrix, so I'll give you a quick rundown of that in the last minute or two. The first thing is really to address the objectives and desired outcomes, and you don't want to be in the middle. You don't want to be a four or five for that. If you're unclear about that, you're stuck on a five, and it's hard to get you out of that five and into a six or a seven. So you have to be comprehensive and convincing in addressing those. The next most important thing, I think, at least in the first couple of reads, is that we really want to see a very robust, comprehensive review of the literature and an understanding of where the gap is for this program that you're going to do, for the application to be of value. The next thing I'd like to say is that in terms of consumers and end users, you have to do a stakeholder map. And my suggestion is, try not to say that you've had some collaboration with consumers when really you haven't, because it's very transparent and it really makes the panel mark you down there.
So, ideally, you've truly understood your end user and worked with your consumers to design the application, the aims and objectives, and also the methodology. It's ideal if, as part of your stakeholder mapping, you've also got relevant partners and you've got supporting letters from those relevant partners to say what they're going to do. Improving health outcomes is a little bit of a funny one. If you've got a translational program, that one won't be as heavily weighted, because we understand you're not going to immediately improve the health outcomes of Australians. So we're a bit light on that one on occasions. And lastly, with the measures of success statement, it's very important to look at the desired outcomes and state where you're going to address those in that middle column in Section F, and also where in your grant you've addressed that. It makes it easy for the panellists, the people reviewing the application, to find that information and score it appropriately.
So, Masha, I'll pause there, and I'm keen for any questions later on.
MASHA SOMI:
Thank you so much, Walter. And I'll hand straight over to Sarah, who will be talking about the project methodology assessment criterion.
SARAH NORRIS:
Many thanks, Masha. So I'm a health economist; I work in the School of Public Health at the University of Sydney. And also, like Walter, I'm a member of the Medical Services Advisory Committee. So I'm talking about project methodology. When it comes to this criterion, I do look at all of the documentation that's come along with the application. So that's the application report and the grant proposal I'm looking at, as well as the methodology, looking at things like the timeline, the milestones and the team that's been gathered to undertake the research, because all of those things go to the feasibility of the methodology as well. Just like Walter, I use the criterion matrix essentially as a rubric: I go through each of the sub-themes in that criterion and highlight where I think the proposal is sitting. And then, as I said, you look in totality at where a particular proposal is landing with regard to the scores of four, five and six.
So in terms of what constitutes a sound methodology, I think I really can't emphasise enough the importance of having very clear research questions or aims. Your project impact section, at the beginning, will have set you up to articulate what your aims are, and that then needs to flow through to the methodology section. Those aims need to be as specific as possible and clearly addressable by the research that you're proposing. And then I think the next most important thing is that it's the questions that are driving the methods and not the other way around. So if you have a particular methodology that you like to use a lot, that will come through if it's not appropriate or sufficient for the questions that you're asking. And I think a real feature of good MRFF applications is that there is truly a multidisciplinary team that's been assembled to deliver the work, and that clear thought has gone into who is on that team and what expertise they will be bringing to the project.
For me, what's impactful is when all of the pieces of the methods fit together really nicely, so it's really clear, as the reviewer, how the findings from each piece of research will fit together, how each of them will address the different aims, and how you will pull that all together at the end. And that really goes to that criterion of having a well defined, coherent design. And so, like Walter said, less is more: being succinct, being really clear in what you're proposing, and using a really good diagram here, just to illustrate the proposed methods and how all the different pieces of research will fit together, is really helpful. In terms of what scores well, as I said, it's where the scope and the design and the expertise all come together as a coherent package. And it's a bit like Goldilocks: you don't want it to be too ambitious, because then you will score down in terms of feasibility, but you also don't want it to be too light.
You don't want important aspects to be absent or not considered in your application. And I think it's really important to remember that not everybody on the panel is a subject matter expert in your field of research. So, as Walter highlighted, you need to write for quite a general audience in that regard, and you want to avoid use of jargon. It's therefore also really important to explain why particular methods have been chosen, to give a rationale for why you're selecting those methods. It's also important that you acknowledge the existing knowledge in the space that you're proposing to work in, and that you clearly describe how what you propose to do is new and how it will build on the existing research. In terms of ideas for how applications could be improved, I was going to make a very similar point to Walter, actually. It's quite obvious when things such as engagement with consumers, engagement with policy makers, or thinking about health economics have been a last minute addition to an application; it's very obvious.
What we're actually really looking for is that those aspects have been truly embedded in the whole design of the research project. And whilst some applications will describe how they will engage with, say, consumers once the project is funded, the research journey actually starts with the preparation of the proposal. So we actually want to see evidence that those consumers have truly co-designed the proposal itself as well. And I think I've said everything else. Yeah, that's it from me. Back to you, Masha.
MASHA SOMI:
Thank you so much, Sarah. And I'll now hand over to BJ, who'll be talking about the capacity, capability and resources assessment criterion.
BJ NEWTON:
Thanks, Masha. Hi, everyone. My name is BJ Newton. I am a Wiradjuri woman and I'm a Scientia Senior Research Fellow at the Social Policy Research Centre at the University of New South Wales. So I'm going to be talking about capability, capacity and resources. But I want to preface this by saying that my experience on the GACs, the grant assessment committees, has been on Indigenous-specific grants, so this is what I'll be speaking to. However, you can see how it can be applied across other schemes as well. And it's really clear to see, now that Walter and Sarah have both spoken, how interrelated the entire application is. So I'll just jump in. The team is incredibly important, and one of the first things that we're looking at when we're conducting our assessment, and then when we're having the discussions in the GACs, is who is the team, and particularly who is the CIA? Are they Aboriginal? And if they're not Aboriginal, we need a very good justification as to why they're named as the CIA and not another CI on the team.
So if this is the case, we look for evidence that there is strong Aboriginal leadership throughout the project in other ways, like other CIs being Aboriginal, or the involvement of Aboriginal community controlled health organisations, Aboriginal peaks and other Aboriginal organisations and stakeholders. And we're also asking, what does strong Aboriginal leadership mean for this project? A good application will actually spell out what this looks like for their project. We're also asking, is this genuinely Aboriginal-led research, and how do we know that? You need to provide clear evidence: what Aboriginal community controlled organisations and Aboriginal peaks or groups are involved, and in what capacity? So, for example, are they partner organisations, or are people with lived experience participating as researchers? Involvement of Aboriginal stakeholders needs to be genuine and ideally provide governance over the research. We're also asking, what are the capacity building opportunities for the project?
For example, are there postgraduate or postdoc opportunities built into the grant, and how will those researchers be mentored? What will their responsibilities be in the project, and how closely are they working with senior researchers and the CIA? And what other capacity building opportunities are there? For example, is there mentoring of junior researchers, partner organisations or community based researchers, and how much time are the senior researchers and the CIs named at the top of the grant dedicating to this, to demonstrate genuine capacity building? Then, looking at alignment with the method, have the CIs, and particularly the CIs named at the top of the application, allocated enough FTE to work on this and adequately upskill and mentor more junior researchers? For example, if there's an element of community capacity building in the project, we need clear detail of how this will look and what the intended benefits and outcomes will be. And that ties back to some of the project impact points that Walter was talking about.
If we also think in conjunction with the research design, such as the methods, the scope and the timing, is it all feasible within the skill set and the other work commitments and responsibilities of the team? In other words, are the roles allocated to the team members in a way that they can feasibly perform them within the FTE allocated to the project? The track record and research experience of the CIA are extremely important. However, the strengths of the other CIs need to demonstrate that the research is feasible, and that the feasibility lies within the whole research team rather than relying on one sole researcher. And on that, everybody needs to be very clear about their roles and capacity within the project. You need to talk about the strengths of the team as a team and note previous collaborations and professional engagements. If you're arguing or demonstrating that Aboriginal communities are heavily involved in the research, you need to demonstrate how.
So think beyond just the project advisory group meetings, and about how genuine Indigenous community leadership and governance can be achieved and maintained on an ongoing basis. We also take relative to opportunity and career disruptions into account, so be sure to include everything, so as not to disadvantage your application. For instance, if you've had a significant career disruption, this will be taken into consideration when looking at gaps in research work or publications. And just to talk about exemplary examples of capacity and capability: this will be strong Aboriginal leadership, where most or all of the researchers are Aboriginal and partnership and engagement with Aboriginal organisations is clearly planned through all areas of the research, in addition to formal Aboriginal governance structures like project advisory group meetings. Also, you can demonstrate that these relationships already exist, as Sarah and Walter have spoken about, and ideally that the research is a need and a priority of partner organisations and Aboriginal community groups.
And we're also looking at quality and feasible opportunities for capacity building and mentoring more junior Aboriginal researchers by more senior Aboriginal researchers. And that's it from me for now. Thank you.
MASHA SOMI:
Thank you so much, BJ. And I'll now hand over to Michael, who'll talk about the overall value and risk assessment criterion.
MICHAEL KIMLIN:
Thank you, Masha. Look, welcome everyone. Thanks for coming along today. My name is Michael Kimlin. Like Caroline, my involvement with the MRFF process has been in a chair role, and I find the work really, really enjoyable. And this particular section of overall value and risk is one that I certainly put my hand up to speak about, because this is where we have a slight deviation from our typical NHMRC style applications. The criterion for overall value and risk is about standing back and having a look at the proposal from a very broad view, and it actually focuses on the project's outputs: how will they meaningfully contribute to the objectives of the grant opportunity, the initiative or mission that you're applying under, and the MRFF and community more broadly? So this relates back to your measures of success statement and the risk management plan that you put in with your application. And I know a lot of folks get concerned about the risk management plan and the measures of success.
We've heard about the measures of success from other people speaking this afternoon. The risk management plan is one to think about, and it's quite important. It's looking at the 'what if' scenarios. For example, what if, for your particular project, you're recruiting a community group, but the people from that community don't want to participate in the study, or you can't find enough people for the particular type of recruitment that you want to do? So the risk management plan is a plan A, a plan B and a plan C to see what happens if things go wrong. What I'm observing is that panels take this particular criterion really seriously and look at the overall management, saying, OK, great project, great initiative, great recruitment strategy, but look, they haven't really thought about what happens if something goes wrong. So within your overall value and risk, make sure you spend time on that risk management plan. It's really important, and it ties into your measures of success. So those two can be intertwined. Also, a little bit different from NHMRC, we really don't have an in-depth budget discussion.
There is a discussion about the budget requested, but more along the lines of: is it actually enough to support successful delivery of the aims of the project that you put into your application? Is it sufficiently detailed? Is it justified, and does it represent value for money to the MRFF and also to that specific scheme more broadly? So think about your budget, your measures of success statement and your risk management plan together; they're really good things to intertwine, and a way to look at the project as a whole from a governance perspective. It's a little bit different from how we as researchers usually look at research projects, and it's an opportunity for you to bring in new ideas on risk management: how to control for things that might happen that could impact the outcomes of the project, and how to tie that in with the budget and the measures of success. So this is a really interesting category, and I would certainly strongly suggest you look at the category descriptors for overall value and risk.
And, you know, you really want to be aiming for that excellent to good category for your project. The overall value and risk category descriptors are something that should be stuck up beside your computer as you're writing the proposal. So think about this through a slightly different lens to NHMRC, but look at it as an opportunity to consider the overall project and the ways that you manage outcomes, manage risks and manage the budget. That's about all I have to say on that. Thanks Masha.
MASHA SOMI:
Thank you Michael. And I'll just hand over to John and he'll talk about consumer involvement in research projects.
JOHN STUBBS:
My name is John Stubbs. I'm the Chair of the Medical Research Future Fund Consumer Reference Panel, and I'm coming to you today from the lands of the Bundjalung Nation, up here around Byron Bay and Mullumbimby. Look, I can only reiterate what Walter and Sarah and the previous speakers have said about the importance of engaging with the consumer voice early on in the project. I think in previous grants and initiatives, consumers were seen as a little bit of a tick box and only involved at the end of the project, but it's very, very important to engage with your consumer stakeholders from the outset. We're dealing with public money, and so as a consumer you look at whether or not you're getting value for money and whether there are risks associated with it, as Michael spoke about, and we do like to see a risk matrix or a risk plan. Consumers look at areas of priority that affect not only them but also the wider community. And with Australia being such a large country, access and equity are very, very important in how you go about your research and who you engage with in your research.
So consumers, consumer organisations and community organisations can give strength and certainly add value to your project application. We can be involved in writing the grants. As some people have said, less is more, and consumers certainly agree with that concept: fewer words, and words that are understood by the general community. And I think even a number of researchers like to read things in plainer language. I know from being on the GRP grants for a number of years in the past, even researchers have said, gee, this is a well written application; it's in good plain language, the strengths come through in the way that it's explained, and the stakeholder engagement has been really, really strong from the outset. And I think we also look at the governance: what organisation is running the trial or research program? Is it part of an international program? We look for things like that. If money is given through the MRFF for clinical trials, we look at the controlling organisation, the patient numbers, and the impact on the patient and also the family.
Because if you're travelling to and from a hospital or a medical centre once a week, twice a week, that has an impact on the person involved in the trial. We like to know about side effects in relation to clinical trials, because sometimes the side effects that the medical fraternity see perhaps don't have such an impact on quality of life, which is very, very important to consumers. So those quality of life issues are a really, really important element in the research package, and we do like to see things like that. As I said, it's important to engage consumers early, from the outset: individual consumers, and representative consumers from particular organisations. Consumers all have their strengths and, like everybody, we have our weaknesses. But you can be a doctor, you can be an accountant, you can be a parent, a father, a grandfather, all of those things which bring that broader perspective to a research grant. And I think the consumer and community voice does engage with Indigenous groups and, more so now in Australia, with CALD groups, and we are able to provide a nexus to engagement with CALD groups, which I think for the past number of years have been a missing link in relation to research in this country.
So I think, just to highlight, it's public money. We like to look at the governance, who's controlling the project? Can they perform this research or this trial? Is it value for money? We do like to see that and we like to see a plan whereby consumers are involved from the outset and taken right through to conclusion. And if it is a clinical trial, the results are then passed back to the consumer organisations further down the track. So I think it's collaboration now with research and consumers and I think that's now going to be the hallmark of the way research will be done in this country. Fingers crossed. Thank you.
MASHA SOMI:
Wonderful, John, thank you so much. Before we move to the Q&A session, I'll just go back through each of the panel members and ask if they'd like to make any other comments. And if I could start with you, Caroline.
CAROLINE HOMER:
Thanks, Masha, and thank you to my fellow panellists. Fantastic advice and conversation. So, just three things. One is grantspersonship, and Walter mentioned this. I think you can't get your grant reviewed internally enough times: by your friends, by people who know your work and, importantly, by people who don't know your work, because, as a couple of people mentioned, you may not get the world's expert on your topic reviewing your grant. So grantspersonship is critical. The second one is, think about your reviewer. I don't know about you, but I review grants often at the end of the day, sometimes in a coffee shop, sometimes when I'm a bit tired, very occasionally with a glass of wine. So think about your reviewer and be kind to your reviewer. A number of panellists talked about making it easy for them: less is more, some nice diagrams. Take the reader by the hand and lead them through your grant. Lead them through your story. And finally, and I think Sarah mentioned this, have the category descriptors in front of you.
Write to those. The independent chair is going to bring the panellists, the assessors, back to those category descriptors all the time. They're going to say, you said it was good, but you gave it a six; I'm not seeing 'good' in a six. And they're going to keep challenging them so that we have fairness across the board. So, thanks, Masha.
MASHA SOMI:
Thank you, Caroline. And Walter, any comments?
WALTER ABHAYARATNA:
Yeah, just to extend something that Caroline said about, I hope I say this properly, I'm a bit lazy with this one, grantspersonship. And that is, if you don't have that polished product that makes it easy for people to read, you risk very great science not getting through to the next stage of consideration. Getting culled at the NFFC, the not for further consideration stage, is tragic. But on the other hand, if it's very polished, you'll perhaps get through that. You've got to have coherence, where each of the sections connects to the others and, as others have said, then links back to the grid, the matrix that we assess against. It's very important to self-assess where you are on that matrix and try to push up at least one score, if not more, by cutting down the excess words and being clearer on where there may be limitations. Thanks, Masha.
MASHA SOMI:
Thank you, Walter. Over to you, Sarah.
SARAH NORRIS:
Thanks, Masha. So, I've got a couple of points. One is, don't feel that you have to use the full budget and the full timeline that are available to you. You know, you can put in a grant that's a lower amount, that goes for a shorter period of time, that has a smaller CI team. And it comes back again to this linkage between the methodology and the feasibility of what you're proposing. So don't feel like you have to go big and pad it out, and that could actually be favourable for you when it comes to that overall value and risk assessment as well. So, that's one point. The other point is, so, I deliberately went on two MRFF GACs before I ever wrote an MRFF grant, and it was enormously beneficial for me and I learnt a lot from doing that. And I think as other speakers have said, the really good applications just, they stand out, they're easy to read. And then the minute you feel like you're struggling to understand what someone's proposing, you know that that's not a seven. So, you know, so do spend time on that.
And yes, so, I would just highly recommend the experience of being on a GAC.
MASHA SOMI:
Thank you, Sarah. Over to you, BJ.
BJ NEWTON:
Thank you. I've got just a few points on each criterion. For project impact, you need to clearly demonstrate why this research is a priority for Aboriginal and Torres Strait Islander people and how you know this. For example, is an Aboriginal partner organisation contributing cash or in-kind funds? What are the outcomes of the research, and how will the project ensure maximum impact from the research for Aboriginal people? This is why Aboriginal governance is so important. You need to demonstrate what this research means for Aboriginal people in an ongoing and impactful way. Regarding method, does the method align with best practice principles in Indigenous research, for example the AIATSIS Code of Ethics, and how is the research demonstrating these best practice principles? Demonstrate how this is the right method for the project; Sarah did talk about that a bit before. And demonstrate that the number of different research components is feasible within the scope, time and expertise of the team.
Regarding overall value and risk, it's been said before, but it can't be stressed enough: demonstrate value for money and have a really good justification of the budget. So, for example, if you ask for $1 million and 90% of that goes to salary costs, and then there's a very minimal amount being put into the program or the research costs, you need a strong justification that all those salary costs are essential to the research. And then I just had a final point about the importance of a polished application, which I think is important to everyone. So, thank you.
MASHA SOMI:
Thank you, BJ. Michael, would you like to make some comments?
MICHAEL KIMLIN:
Great. Thanks, Masha. Look, I agree with all the comments from the other speakers, and I'd reinforce that the scoring matrix is your friend, because that is the way we really do ensure that we can have meaningful discussions around the project. But I'd like to really focus on what John spoke about, and that is the involvement of community and end users in the research journey. Sarah touched on this as well. And this is a really important point: I think, Sarah, you said that it's very noticeable when you review a grant that hasn't had meaningful involvement with community and end users, where they're tacked on at the end. It's very obvious. So I agree with John; I think John and Sarah are on the right point here. You really have to bring those community and end users on that research journey from day one. A statement might be, it's not about us, it's about them. That's a good way to think about the research: it's outward looking, and having those partners there from the start to guide, advise, help and support really makes a meaningful difference.
And if you actually look at the scoring matrix, under assessment criterion one, project impact, there is a criterion that says: demonstrates broad and meaningful involvement of consumers, community and/or research end users in the research journey. So, there you go. I think that's sometimes missed when we're talking about the MRFF, that they need to be on the journey together. So I'm all for better and more sustained engagement of our end users and consumer groups. Thanks, Masha.
MASHA SOMI:
Thank you, Michael. And I'll just hand over to John.
JOHN STUBBS:
Thank you. Yes. Look, I can only reiterate what everybody else has said. It is about collaboration. And there are a couple of quotes. The European Patients Federation says, nothing about us without us. And also, you're doing research with us, not just about us. I think that kind of encapsulates what everybody has said. I'd just like to raise one issue in relation to the budget. I've noticed in some MRFF grants that I've sat on as an advisor and also as an observer that an amount has been included in the budget for consumer engagement. And I think this is very, very positive, because it gives the consumers gravitas, showing that they are totally involved in the project and money's being spent on them. Is it an incentive to contribute back more? Possibly not. But I think it's the recognition that you are collaborating and that you're part of a team, and I think that's very, very important throughout the whole research process. Thanks, Masha.
MASHA SOMI:
Thank you, John, and thank you, all. So we'll now move to the Questions and Answers section. Thank you to those who've already submitted questions, and a reminder for others who wish to submit questions to do so via the chat function. And I'll just hand over to Arun to ask the first question.
SPEAKER:
Thanks, Masha. The first question is, people are particularly interested in knowing whether a company-led consortium could have a competitive application. So I thought I'd send that to Michael to answer, given his health systems background, and then anyone else who'd like to add to that.
MICHAEL KIMLIN:
Interesting, because I think that would go back to Masha and co. Look, and please, MRFF folk, feel free to dive in here. If they are a registered MRFF NHMRC administering organisation, they meet the criteria and the guidelines for the MRFF call, they meet all the assessment criteria and they go through due process, I cannot see why that can't happen. Masha, do you want to add anything more to those criteria?
MASHA SOMI:
That's perfect, Michael. So we do talk about the eligible organisations under the Medical Research Future Fund Act; they're set out in legislation and then applied through all of our grant opportunities. Once you're registered with NHMRC, or for anyone who can apply through the Business Grants Hub as long as they meet those criteria, it's really down to the guidelines and the assessment criteria to work through what ends up being recommended for funding.
SPEAKER:
Thank you. The next question is, can you comment on how assessment processes consider the challenges of rural and remote Australia? Now, sometimes the ACT is considered a rural and remote area, so I might hand that over to Walter, but anyone else is welcome to add anything.
WALTER ABHAYARATNA:
Yeah. Strictly speaking, just on that point about the ACT: if you look at the Modified Monash Model, we're certainly not rural or remote, but we're surrounded by a lot of New South Wales that is MM3 and above. In terms of handling rural and remote, look, you are able to collaborate with partners who are well embedded in the services in rural and remote regions, and this comes back to the team as well as the project impact. If you do that, you will have an advantage, because it is something that I think a lot of programs are trying to do, which is to ensure that there's access to research, and clinical trial research specifically. There's been an MRFF grant, an infrastructure-enabling grant of more than $100 million, to increase the reach of clinical trials into rural areas. So if you can link onto those as partner organisations and partner programs, then I think you have an advantage. So, look for collaborations is my advice.
MASHA SOMI:
I think we might be having some troubles with your Internet, Walter. (AUDIO DISTORTS)
JOHN STUBBS:
We can't hear Michael. Oh, Walter. Can't hear Walter.
MASHA SOMI:
So, Walter, I think we're having some issues with your connection. I'm not sure... Sarah, would you like to comment on considerations of rural and remote research in particular when assessment processes happen?
SARAH NORRIS:
Look, I think what Walter was trying to say is that, essentially, rural and remote communities are priority populations. So plan your research accordingly, engage with the right partner organisations, and think about it early. Again, this is a way in which you could start embedding considerations related to health economics early in the design; of course, I'm a health economist, so I'm going to say this. You can incorporate things like equity; you can have equity-informative cost-effectiveness analyses. You could actually do something really substantive in this space if you think about it from the beginning.
MASHA SOMI:
John, would you like to add to that?
JOHN STUBBS:
Masha, yeah, if I could just add. Certainly, in relation to access and equity, I think this is one of the areas where a consumer group or consumer organisation can show strength and add some gravitas to something like this. I mean, if we look at the various elements of the Health Act, you know, with the supply of PBS drugs, with MSAC and so on, access and equity is a strong element in the approval of drugs and the format in which they're given, so that people throughout this vast country can get access to them. So I think those same elements could be included as part of the research criteria: that you have to demonstrate access and equity, with supporting elements from collaborators to say how access and equity are important. And using a health economist, certainly in relation to value and risk, is really, really important. So, Sarah, there might be some jobs for you, I think. But no, we strongly support it; give us the evidence to support something like that.
MASHA SOMI:
That's great. Thank you, John.
SPEAKER:
Thanks very much. There's a question about, how detailed does the study protocol need to be? And I might hand over to BJ on that from her experiences that she's had.
BJ NEWTON:
Thanks, Arun. So, as we were saying, less is more is important. However, it is still very important that you detail every step of the research. If there's any vagueness around the methodology, then that's going to draw questions about feasibility. And that also goes to who will actually be conducting the research. There have been some applications where it's a really fantastic method, but it's unclear who's actually going to be doing the work. Is it going to be more junior researchers, and if so, how are they going to be supported by more senior researchers? And then that goes back to the FTE issue that I was talking about before. So the protocol needs to be clearly detailed, and I suppose I'll go back to what people were saying before: when you're writing it, put yourself in the perspective of the reviewer. What questions will we be asking? Try to identify any gaps in that research protocol. I'm sure somebody else will be able to elaborate.
MASHA SOMI:
Caroline, would you like to throw in any extra points?
CAROLINE HOMER:
No, I fully agree with BJ. You're given a page limit, and that's usually a bit of a sense of how long it should be. I think showing it to other people and seeing if they get the story is really helpful, particularly people not in your field and people not in your team. And diagrams are really helpful to explain complicated interactions and relationships, or how project A fits with project B and how that fits with the next bit. So I think that's really helpful. Your reviewers are going to want to know what you're doing, who you're doing it to and what you're going to get at the end. One page isn't enough, but 25 is too much. So it's a tricky answer, but I think you can just share it with people and get a sense of whether you've got the right balance.
MASHA SOMI:
Thank you, Caroline.
SPEAKER:
Thank you. We have a few questions about the not for further consideration grants, just asking about the process. I was going to pass that to Caroline again, and then anyone else who might want to add to that from their experiences.
CAROLINE HOMER:
I might actually go to Masha for this, because it's managed before it comes to the panel. It's certainly based on the reviewers' scores, and different schemes have different numbers of reviewer scores that go into the mix to make the decision and the cut-off. So that I don't get it wrong, I might ask Masha to take that one, because we do it differently on different schemes.
MASHA SOMI:
That's right. Thanks, Caroline. So the process is that applications are provided to panel members, two, three or four, depending on the particular MRFF grant assessment committee; we do try to aim for four wherever possible, though sometimes that's not feasible. Each of the applications is assessed by those individual panel members, and then the scores are collated to create a ranked list. A cut-off is then decided, usually by the grants hub, based on criteria around which applications are taken to the grant assessment panel for discussion. Those that don't go to panel are called not for further consideration. So that's the overall process that results in some applications being rated as NFFC, or not for further consideration, and others progressing to panel for discussion.
SPEAKER:
Thank you. So the next question is around the skills of Australian researchers, for example health economists, biostatisticians and bioinformaticians, which are in short supply. Are there any thoughts about how this may be considered in an application in relation to feasibility? I'll hand over to Sarah as our health economist.
SARAH NORRIS:
So I think, certainly, if a proposal has a clear plan, they've got an experienced health economist as part of the team, and there's a clearly articulated strategy for building capacity specifically in something like health economics or health policy for EMCRs, then I would view that really favourably. I think the whole fund is a great way to enhance capacity across those areas. And you're right, they are in short supply. It's not necessarily the explicit role of the MRFF to solve that, but you can certainly, I think, highlight how you could do that within the context of a specific project in the capacity section.
SPEAKER:
Sorry, we have got lots of questions. We might do one more question, and it's: how do you demonstrate consumer engagement with hard evidence? For example, should we include results from consumer surveys within the grant or as supplementary documentation? I'll pass that to John.
JOHN STUBBS:
So, yes. Hi, thanks. Look, there are a number of ways that this can be done. Certainly there have been consumer surveys; I've been involved in grants, or supervised grants, where consumer surveys have been an important element. Focus groups as well, although I suppose if you speak to consumers, they'd probably say they've been focus grouped to death over the years. But I do think targeted surveys are a good way of getting a much broader input into what the research can do, whether it's appropriate, whether it can be effective, and also the cost and the risk. And I think it's really important that researchers look really, really closely at risk. There may be a risk in doing this particular research, but there is also perhaps a considered risk if this research is not done, and that can play out in a number of ways, in relation to patient or social outcomes, community outcomes, and better patient and family inclusion and outcomes as a result.
If we're dealing with the health system across the board, we know that the cost of health in this country is on the increase and it's going to be impactful. So those elements are important to consumers, and looking at the risk and engaging consumer feedback through surveys or focus groups is one way of establishing whether or not it would be appropriate to proceed with the research. Thank you.
MASHA SOMI:
Thank you, John. So thank you, everyone, for attending the webinar today. And also a big thank you to Caroline, Walter, Sarah, BJ, Michael and John for taking time out of their busy schedules to be with us here today, for their considered comments through the webinar, and also for their contribution to the MRFF through grant assessment committees. Just a quick reminder that you can also nominate to join an MRFF grant assessment committee; there is information on the MRFF website. We also have a webinar on the 30th of March focused on research administration offices, and it will cover consumer involvement in the MRFF, the recent refresh of the MRFF assessment criteria descriptors, and the MRFF monitoring, evaluation and learning strategy. The discussion today was recorded and will be provided on the MRFF website, along with written responses to the questions we weren't able to address today. And finally, this is our first webinar with a panel, and we'd really appreciate your feedback and any ideas for topics for upcoming webinars; please email those through to MRFF@health.gov.au.
So thank you all again for attending, and also to our great panel. (MUSIC PLAYS)
This video is a recording of the MRFF Webinar – Assessing MRFF grants: Insights from assessors on 15 March 2023. The webinar was hosted by:
- Dr Masha Somi, Chief Executive Officer, Health and Medical Research Office
- Professor Caroline Homer AO, Deputy Chair, Australian Medical Research Advisory Board.
Dr Somi and Professor Homer were joined by a panel of experts with experience in serving on MRFF Grant Assessment Committees. The panel included:
- Mr John Stubbs AM, Chair, MRFF Consumer Reference Panel
- Professor Walter Abhayaratna OAM, Canberra Hospital and Health Services
- Dr BJ Newton, University of New South Wales
- Professor Michael Kimlin, Queensland University of Technology
- Associate Professor Sarah Norris, University of Sydney.
Topics included:
- MRFF grant assessment process
- MRFF grant assessment criteria
- how to become a GAC member.
A questions and answers session followed.
Read the webinar presentation.