Policy on the use of generative artificial intelligence tools in Cancer Research UK funding applications
1. Purpose
This policy sets out Cancer Research UK’s (CRUK) position on the use of generative Artificial Intelligence (AI) tools in CRUK funding applications.
2. Scope
This policy applies to everyone involved in CRUK funding applications or funding reviews, including researchers and their teams, Host Institutions and CRUK peer reviewers of funding applications.
3. Definitions
Generative Artificial Intelligence: a type of artificial intelligence system that identifies patterns and structures in data and generates new content, such as text, images and other media, in response to instructions (‘prompts’).
Host Institution: the university, institution or other organisation at which some or all of the research funded under a CRUK funding application will be carried out.
Research integrity: as a signatory to the UK’s Concordat to Support Research Integrity, CRUK uses the definition and description of research integrity set out in that document, with its core elements of honesty, rigour, transparency and open communication, care and respect, and accountability.
4. Key Points
CRUK aims to ensure the researchers it funds can engage with, and benefit from, the opportunities offered by generative AI tools (for example, ChatGPT), such as support for generating computer code, whilst protecting against potential ethical, legal and integrity issues so as to maintain the high standards of the research we fund.
For further guidance, researchers should refer to: CRUK’s guidance for researchers on the use of generative AI.
4.1 Requirements for CRUK funding applicants
CRUK advises researchers and their teams to exercise caution when using generative AI tools to develop their CRUK funding applications, and to stay up to date with the policies and guidance issued by their institutions, funders and the wider sector.
CRUK funding applicants must:
- Support the highest levels of research integrity as set out in CRUK’s Research Integrity: Guidelines for research conduct.
- Ensure generative AI tools are used in accordance with relevant legal and ethical standards, including data privacy where those standards exist or as they develop.
- Use generative AI tools responsibly to ensure the originality, validity, reliability and integrity of outputs created or modified by generative AI tools. This includes ensuring funding applications contain accurate information and do not contain false or misleading information.
- Correctly and explicitly attribute outputs from generative AI tools in funding applications or research: cite the generative AI source and, where practicable, name the specific model(s) and software used and specify how the content was generated (for example, by listing the prompt used).
- Adhere to Host Institution policies on the use of generative AI tools, particularly those concerning plagiarism and fabrication.
When approving a funding application submission to CRUK, Host Institutions must take responsibility for ensuring the funding application content is not in breach of CRUK’s Research Integrity: Guidelines for research conduct.
4.2 Requirements for CRUK peer reviewers
To ensure we fund the best quality science and researchers, we operate a robust, rigorous and confidential peer review process to make funding decisions.
As set out in CRUK’s Code of Practice for Funding Panels and Committees, CRUK funding committee and panel discussions, papers and correspondence relating to applications for funding are strictly confidential.
CRUK peer reviewers must not:
- Input any content from a CRUK funding application into generative AI tools. This content is provided to you by CRUK in your capacity as a peer reviewer, and sharing it with a generative AI tool constitutes a breach of CRUK’s peer review confidentiality and research integrity requirements.
- Use generative AI tools to formulate or edit your peer review critiques for funding applications or reviews. CRUK is revising its confidentiality agreements for peer reviewers to make this prohibition explicit; it reflects confidentiality and intellectual property concerns around the sharing, viewing and use of data inputted into generative AI tools.
CRUK peer reviewers will be required to sign and submit a revised confidentiality agreement to confirm compliance with the confidential nature of the review process, including not uploading or sharing content or original concepts from a CRUK funding application or peer review critique into generative AI tools.
4.3 Actions CRUK may take if our requirements are breached
Where there has been a potential breach of this policy, CRUK:
- Should be informed of the issues identified as soon as possible, by emailing policies@cancer.org.uk. CRUK recognises this is a fast-evolving field and wishes to work with and support our researchers and reviewers, and to learn from the issues raised.
- Retains the right to apply sanctions under CRUK’s Research Integrity: Guidelines for research conduct, which may include discontinuing a funding application or funded activities, restricting participation in future peer review, or applying other sanctions at its discretion.
4.4 Policy updates
Given the speed at which generative AI tools are developing, CRUK, in collaboration with other research funders, intends to monitor developments and to update this policy from time to time as appropriate.
5. Support & Advice
For any queries about this policy please contact: policies@cancer.org.uk.
6. Related documents
For more information, please see the following linked documents:
Research Funders Policy Group statement on generative AI tools
We are aligned with other UK funders on the use of generative AI tools in funding applications.