Cultivating DEIA Skills in Open Education Research: The Impact of Targeted Fellowship Support
Authors
Virginia Clinton-Lisell
(University of North Dakota)
Jasmine Roberts-Crews
Abstract
In order to conduct effective research in open education, one needs skills in diversity, equity, inclusion, and accessibility. To address this need, a DEIA expert provided consultation to the members and mentor of a fellowship for open education researchers. After completing the fellowship, the researchers were asked to complete a survey about DEIA research skill development during the fellowship and their perceived challenges in DEIA research. Fellows (N = 42) across cohorts indicated growth in DEIA research skills during the fellowship. Notably, fellows who received DEIA consultation described more sophisticated development of DEIA research skills than fellows without the consultation. Fellows also reported challenges in DEIA research skills due to the current political climate, limited funding opportunities, and concerns about support from their institutions.
Keywords: DEIA research, research skills, open education research, DEIA, DEIA research skills
How to Cite:
Clinton-Lisell, V. & Roberts-Crews, J. (2025) "Cultivating DEIA Skills in Open Education Research: The Impact of Targeted Fellowship Support", Journal of Open Educational Resources in Higher Education 3(3), 81-107. doi: https://doi.org/10.31274/joerhe.20113
We thank the William and Flora Hewlett Foundation for funding the Open Education Research Fellowship and the Organizational Effectiveness Grant that supported the DEIA consultant and this study.
Scope, Objectives, Content
I think the paper is well within the scope of the journal and that the topic is an important one (especially the barriers OE researchers are facing in making their research DEIA in the current climate).
Organization
Mostly, I think the organization works. I would recommend reordering the Methodology section to go Procedures, Measures, and then Participants. I would also recommend a labeled Limitations subsection. The authors address limitations briefly in the Discussion, but I think it helps to have them as their own subsection.
Approach and Conclusions
I think the findings seem sound, although I do have some questions about the survey instrument used. I wonder whether the wording of the questions could have affected how respondents answered; for instance, was it made clear to them that they were being asked about how the fellowship itself affected their learning of DEIA? Using "before/after" could open responses up to other sources of learning, especially for cohorts that took place a while ago.
I think the authors should better stress how the length of time since the fellowship could have affected results. While I can appreciate that using only a post-test survey avoids some issues, administering a post-test so long after the fact likely introduces new problems.
It would help if the authors included how many fellows were contacted in each of the two groups to give an idea of the response rate, and it would also be helpful to provide aggregate responses to each of the Likert questions, perhaps as a table.
I question this sentence on pg. 9: "Many fellows in cohorts prior to consultation did not report a development in DEIA research skills during their tenure in the program." This seems to directly conflict with the findings from the Likert-scale questions. Was this based on the open-text responses? If so, how do they compare with the responses to the Likert questions? Or is the sentence just poorly worded?
I found the first two research questions duplicative: they get at the same thing from different angles, and it wasn't clear to me what distinguishes them.
It also would have been good to see some kind of validation of the survey instrument.
Writing Style, References
I think overall the writing was good. There were some typo issues (and I suspect a find/replace error) but nothing that can't be fixed easily. I think the very first paragraph gets a little redundant, and I would also argue the first paragraph in the lit review (on why research on OER is needed) could be eliminated entirely.
I also found the labels used for the two groups confusing. I kept thinking that "pre-DEIA consultation" referred to the group that had the consultation but before they received it. Maybe something like "no DEIA consultation" or even just giving the years of the two cohorts would work better.
I do think it would be good if there were more clarity about how, if at all, the fellowship addressed DEIA prior to having the consultant. I get the sense it didn't directly, but it would be good to state that up front when the program is described.
Application
Yes, I think so, although it would perhaps be helpful to provide more details about the content covered by the DEIA consultant. What learning methods were used?
What are the stronger points/qualities of the article?
I think it provides an interesting look at an important issue for the area, and provides some important findings especially in relation to the challenges faced by researchers attempting to conduct DEIA research in this area right now. I also appreciate how the language of the open text responses clearly changed between the two groups.
What are the weaker points/qualities of the article? How could they be strengthened?
The findings didn't point to an improvement in DEIA learning between the cohorts, but I do wonder how much this is a result of surveying some cohorts so long after the fact. Honestly, I think what could prove more useful is a content analysis of the research outputs produced by fellows: did those who took part in the consultations demonstrate the concepts the consultant was teaching? This might be a more reliable way to conduct the assessment than the survey. I also acknowledge that is essentially a separate research project, so it would be good to at least mention it as an area of further study, and hopefully the authors will undertake it (if they haven't already).
Peer Review Ranking: Scope
Highly relevant
Peer Review Ranking: Clarity
clear
Peer Review Ranking: Contribution
contributes
Peer Review Ranking: Research Assessment
sound
Note: This review refers to a round of peer review and may pertain to an earlier version of the document.
Open peer review from Elisabeth White
Scope, Objectives, Content
Yes, this paper is in scope for JOERHE. The study involves a fellowship for researchers focused on open education, and the connection to DEIA research skills is an important one.
Organization
Yes, the paper is well organized. It contains all necessary sections, and the information within each section is present in a logical order.
Approach and Conclusions
The paper is factually accurate and the authors are transparent about some of the limitations of their study's methodology. Some important information is missing, however, that makes it difficult to determine how generalizable the results may be: namely, there is very little information about what the fellowship itself entailed or what the DEIA consultation looked like. It is hard to draw any conclusions without that knowledge. Not every experience with a DEIA consultant will be the same, and it's possible that the minimal impact the consultation appears to have had on outcomes could reflect shortcomings in the consultant's approach. Without knowing what the consultant did, though, it's impossible to determine. In addition, the self-reported pretest/posttest method is unreliable because it relies on participants' own subjective judgment. The study could be improved by including some direct assessment measures in which participants demonstrate their understanding and application of DEIA research skills.
The literature review is excellent -- the authors provide a thorough overview of what DEIA research skills involve and how they relate to open education practices.
Writing Style, References
The article is well-written and I did not notice any significant errors with expression or overall flow.
Application
The potential application of this paper is very limited. The results are not generalizable and the authors' conclusions are limited at best. The authors collected some data that may help them assess their own fellowship program, but they have not explained how other practitioners may apply the results to their own contexts. The main conclusion the authors draw is that working with a DEIA consultant for their specific fellowship may have had a positive impact, albeit a muted one, but it's not clear how other educators might benefit from the information gained from this study.
What are the stronger points/qualities of the article?
The literature review is the highlight of the paper. The authors have cited a good variety of relevant research and have provided a great overview of DEIA research skills. The authors did an excellent job breaking down this complex topic.
The angle that the researchers are taking on open education is a good one. I appreciated the explicit link between OERs and DEIA research skills.
What are the weaker points/qualities of the article? How could they be strengthened?
The study's methodology is a weak point. The results are based on participants' own self-assessments of their understanding, which is unreliable. The study could be improved by including some direct measures that would allow the researchers to observe each participant's level of understanding in practice.
The practical application of the article is also limited. This weakness could be addressed by providing more information about what role the DEIA consultant played in the fellowship. At a minimum, the authors should explain how other practitioners can benefit from the results of their study.
Peer Review Ranking: Scope
Highly relevant
Peer Review Ranking: Clarity
clear
Peer Review Ranking: Contribution
does not contribute
Peer Review Ranking: Research Assessment
not sound
Note: This review refers to a round of peer review and may pertain to an earlier version of the document.