AI Use in Assessments: Your role in protecting the integrity of qualifications

30 April 2025

Executive Summary

While the potential for student artificial intelligence (AI) misuse is relatively new, most of the ways to prevent misuse and mitigate the associated risks are not; centres will already have established measures in place to ensure students are aware of the importance of submitting their own independent work for assessment, and for identifying potential malpractice. 

This document highlights the regulations that apply in relation to AI use in assessments and provides guidance to help teachers and assessors in centres. 

This document emphasises the following requirements: 

  • In accordance with section 5.3(k) of the JCQ General Regulations for Approved Centres, teachers and assessors must only accept work for qualification assessments which is the students’ own; 
  • Students who misuse AI to the extent that the work they submit for assessment is not their own will have committed malpractice in accordance with JCQ regulations and could attract severe sanctions; 
  • Students and centre staff must be aware of the risks of using AI and must be clear on what constitutes malpractice; 
  • Students must ensure work submitted for assessment is demonstrably their own. If any sections of their work are reproduced directly from AI-generated responses, the student must identify those elements and understand that, because they have not independently met the marking criteria for those sections, they will not be rewarded for them (please see the Acknowledging AI use and AI use and marking sections below, and Appendix B: Exemplification of AI use in marking student work at the end of this document); and 
  • Where teachers have doubts about the authenticity of student work submitted for assessment (for example, they suspect that parts of it have been generated by AI but this has not been acknowledged), they must investigate and take appropriate action. 

The JCQ awarding organisations’ staff, examiners and moderators have established procedures for identifying, reporting and investigating student malpractice, including the misuse of AI. 

This document refers to AI tools and AI detection tools as they were at the time of publication; the JCQ awarding organisations are continuing to monitor developments in this area and will update this document when appropriate. 

Examples of candidate AI misuse cases, and of marking candidate work where AI tools have been used, can be found in appendices A and B to this document. 

Additional support materials, aimed at teachers and students, can be found here: 

JCQ-AI-poster-for-students-2.pdf 
JCQ-AI-information-sheet-for-teachers-1.pdf 

Updating the JCQ guidance on AI Use in Assessments – JCQ website page linking to two presentations: one for senior leadership teams (SLT) to use with teachers, and one for teachers to use with students

Key changes since last version 

Section | Paragraph | Summary of change
Front cover | Document title | Added that this is for the attention of Heads of Centre, as well as teachers and assessors
Executive summary | Final paragraph | Added links to the additional support materials available on the JCQ website
1. The assessments the regulations and guidance apply to | First paragraph | Added that care must be taken around use of laptops in exams, to ensure AI tools cannot be accessed
2. What is AI use and what are the risks of using it in assessments? | Bulleted lists setting out examples of AI tools | Added additional tools, including those that can generate video content
3. What is AI misuse by students? | Second paragraph | Added links to other JCQ documents where student requirements are set out
4. Centre responsibilities | First section | Added reference to the responsibilities on Heads of Centre that are set out in the General Regulations, as well as reference to other relevant regulations
4. Centre responsibilities | Lettered list of what centres must do | One bullet removed, two new ones added
7. Preventing AI misuse in assessments | Third paragraph | Added information around the mechanisms schools must have in place
Appendix A: AI misuse examples | Examples 5 and 6 | Added two further examples from malpractice cases
Appendix C: Extracts from JCQ regulations and guidance relevant to use of AI | Full appendix | New appendix drawing together information from different published documents which is relevant to AI

1. The assessments the regulations and guidance apply to 

Students complete the majority of their exams and a large number of other assessments under close staff supervision, with limited access to authorised materials and no permitted access to the internet. The delivery of these assessments should be unaffected by developments in AI tools, as students must not be able to use such tools when completing them. However, care must be taken when a student is allowed to use a laptop or similar device for exams, to ensure they have no access to AI tools (see sections 14.20-14.27 of the Instructions for conducting examinations document). 

There are some assessments in which access to the internet is permitted in the preparatory, research or production stages. The majority of these will be non-examination assessments (NEAs), coursework and internal assessments for General Qualifications (GQs) and Vocational and Technical Qualifications (VTQs). This document is primarily intended to explain the regulations and provide supporting guidance in relation to these assessments.

2. What is AI use and what are the risks of using it in assessments? 

AI use in this context refers to the use of AI tools to obtain information and content which might be used in work produced for assessments that contribute to the award of qualifications. 

When properly referenced, such use can be acceptable. However, students cannot be credited for any work they produce for assessment which is not their own, so the benefit to them of using AI is likely to be limited, and they risk committing malpractice if AI is misused. 

AI chatbots are AI tools which generate text in response to user prompts and questions. Users can ask follow-up questions or ask the chatbot to revise the responses already provided. AI chatbots respond to prompts based upon patterns in the data sets upon which they have been trained (large language models). They generate responses which are statistically likely to be relevant and appropriate. AI chatbots can complete tasks such as the following (see the short illustrative sketch after this list): 

  • Answering questions 
  • Analysing, improving, and summarising text 
  • Authoring essays, articles, fiction, and non-fiction 
  • Writing computer code 
  • Translating text from one language to another 
  • Generating new ideas, prompts, or suggestions for a given topic or theme 
  • Generating text with specific attributes, such as tone, sentiment, or formality 
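As an illustration of how such tools generate statistically likely text, the short Python sketch below produces a continuation of a prompt. It is a minimal, assumed example using the open-source GPT-2 model via the Hugging Face transformers library; it is not one of the commercial products referred to in this document, which use far larger models and additional tuning.

    # Minimal sketch of prompt-based text generation, assuming the
    # open-source GPT-2 model and the Hugging Face "transformers" library.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("The causes of the First World War were", max_new_tokens=40)
    # Prints the prompt plus a statistically likely continuation.
    print(result[0]["generated_text"])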

AI tools available include: 

There are also AI tools which can be used to generate images, music or video, such as: 

It is important that teachers and students are aware that the range of AI tools and their capabilities is expanding quickly, and that there are limitations to their use such as producing inaccurate or inappropriate content. 

The lists of certain suppliers of AI-related products are for information purposes only and do not constitute an endorsement by JCQ. It is each centre’s or individual’s responsibility to investigate and verify any suppliers they use, including any terms and conditions which govern the sale or use of the supplier’s products. The lists provided are not exhaustive. 

AI tools may pose significant risks when used by students completing qualification assessments, not least the risk of committing malpractice, for which serious sanctions can apply. As noted above, the tools have been developed to produce responses based upon the statistical likelihood of the language selected being an appropriate response, so those responses cannot be relied upon. AI tools often produce answers which may seem convincing but contain incorrect or biased information. Some AI tools have been identified as providing answers to questions that can prompt inappropriate actions, and some can also produce fake references to books/articles.

3. What is AI misuse by students? 

In accordance with section 5.3(k) of the JCQ General Regulations for Approved Centres, students must submit work for assessments which is their own. This applies to both internal and private candidates. 

Student work submitted for assessment must be in their own words, must not be copied or paraphrased from another source (such as an AI tool), and must reflect their own independent work. Students must demonstrate their own knowledge, skills and understanding as required for the qualification in question and set out in the qualification specification. This includes demonstrating their performance in relation to the assessment objectives for the subject, relevant to the question/s or other tasks students have been set. 

The requirements for students are set out in the documents: 

• JCQ Information for candidates – Non-examination assessments 
• JCQ Information for candidates – Coursework assessments 

While AI is becoming a useful tool in the workplace, it is important for the purposes of demonstrating knowledge, understanding and skills for qualifications that students develop the knowledge, skills and understanding of the subjects they are studying, and do not rely on AI. 

Students must be able to demonstrate the final submission is the product of their own independent work and independent thinking. 

AI misuse is where a student has used one or more AI tools but has not appropriately acknowledged this use and has submitted work for assessment when it is not their own. Examples of AI misuse include, but are not limited to, the following: 

  • Copying or paraphrasing sections of AI-generated content so that the work submitted for assessment is no longer the student’s own. 
  • Copying or paraphrasing whole responses of AI-generated content. 
  • Using AI to complete parts of the assessment so that the work does not reflect the student’s own work, analysis, evaluation or calculations. 
  • Failing to acknowledge use of AI tools when they have been used as a source of information. 
  • Incomplete or poor acknowledgement of AI tools. 
  • Submitting work with intentionally incomplete or misleading references or bibliographies. 

AI misuse constitutes malpractice as defined in the JCQ Suspected Malpractice: Policies and Procedures. The malpractice sanctions available for the offences of ‘making a false declaration of authenticity’ and ‘plagiarism’ include disqualification and debarment from taking qualifications for a number of years. Students’ marks may also be affected if they have relied on AI to complete an assessment and, as noted above, the attainment they have demonstrated in relation to the requirements of the qualification does not accurately reflect their own work. 

Examples of AI misuse cases dealt with by awarding organisations may be found in Appendix A: AI misuse examples at the end of this document.

4. Centre responsibilities

In accordance with section 5.3(k) of the JCQ General Regulations for Approved Centres, the Head of Centre is responsible for having arrangements in place to ensure that students’ centre-assessed work is produced, authenticated and marked, in accordance with the awarding bodies’ instructions. This applies to all candidates, including private candidates. 

This means that centres must have agreed policies and procedures relating to assessment in place which effectively monitor and check that the work a student submits for assessment is their own. Centres must ensure these also address the risks associated with AI misuse. 

Other relevant regulations include: 

  • 5.3(z) of the JCQ General Regulations for Approved Centres requires centres to have in place, and available for inspection, a malpractice policy which must cover AI use (what it is, when it may be used and how it should be acknowledged, the risks of using AI, what AI misuse is and how this will be treated as malpractice). Section 3.3 of the JCQ Suspected Malpractice: Policies and Procedures requires that centres take all reasonable steps to prevent malpractice. 
  • Section 7 of the JCQ Instructions for conducting coursework and sections 4.1, 4.6 and 9 of the JCQ Instructions for conducting non-examination assessments explain the supervision and authentication requirements. 

To ensure compliance with the regulations, teachers, assessors and other staff must: 

  • regularly review the use of AI in qualification assessments and agree their approach to managing use of AI by students in their school, college or exam centre. 
  • make students aware of the appropriate and inappropriate use of AI, the risks of using AI, and the possible consequences of using AI inappropriately in a qualification assessment. In doing so, they may wish to use the JCQ support materials referenced in the Executive Summary. 
  • make students aware of the centre’s approach to plagiarism and the consequences of malpractice. 
  • consider how to best communicate with parents/carers to make them aware of the risks and issues and ensure they support the centre’s approach. 

and centres must: 

a) Explain to students the importance of submitting work that is a result of their own independent efforts for assessments, and stress to them and to their parents/carers the risks of malpractice;

b) Regularly review the centre’s malpractice/plagiarism policy to ensure it acknowledges the use of AI (e.g. what it is, the risks of using it, what AI misuse is, how this will be treated as malpractice, when it may be used, how it should be acknowledged and how teachers will authenticate work);

c) Ensure the centre’s malpractice/plagiarism policy includes clear guidance on how students must reference appropriately (including websites); 

d) Ensure the centre’s malpractice/plagiarism policy includes clear guidance on how students must acknowledge any use of AI to avoid misuse (see the below section on Acknowledging AI use); 

e) Ensure teachers and assessors are familiar with AI tools, their risks and AI detection tools (see the What is AI use and what are the risks of using it in assessments and the What is AI misuse by students sections);

f) Ensure, where students are using word processors or computers to complete assessments, teachers and relevant centre staff are aware of how to disable improper internet/AI access where this is prohibited; 

g) Ensure each student is issued with a copy of, and understands, the appropriate JCQ Information for Candidates (www.jcq.org.uk/exams-office/information-for-candidates-documents) document; 

h) Reinforce to students the significance of their declaration, in which they confirm that the work they submit is their own and that they have understood and followed the requirements for the subject, and the consequences of a false declaration; 

i) Remind students that awarding organisation staff, examiners and moderators have established procedures for reporting and investigating malpractice (see the Awarding Organisation actions section below and the examples of AI misuse cases dealt with by awarding organisations in Appendix A: AI misuse examples at the end of this document); 

j) Ensure teachers are aware they must not use AI tools as the sole marker of student work (see AI use and marking section below); 

k) Ensure teachers and Heads of Department are clear about their responsibility to only authenticate and submit work for assessment by the awarding organisation that they are confident is the student’s own; 

l) Have a process in place for teaching staff to follow where misuse of AI is suspected before the student has signed the declaration form; such cases do not need to be reported to the awarding organisation and must be dealt with directly by the centre.

5. Acknowledging AI use 

It is essential that students are clear about the importance of referencing the sources they have used when producing work for an assessment, and that they know how to do this. Appropriate referencing is a means of demonstrating academic integrity and is key to maintaining the integrity of assessments. If a student uses an AI tool which provides details of the sources it has used in generating content, these sources must be verified by the student and referenced in their work in the normal way. Where an AI tool does not provide such details, students must ensure they independently verify the AI-generated content – and reference the sources they have used. 

Students acknowledging the use of AI and showing clearly how they have used it allows teachers and assessors to review how AI has been used and whether the use was appropriate in the context of the particular assessment. This is particularly important given that AI-generated content is not subject to the same academic scrutiny as other published sources. 

Where AI tools have been used as a source of information, student acknowledgement must show the name of the AI source used and the date the content was generated. For example: 

ChatGPT 3.5 (https://openai.com/blog/chatgpt/), 25/01/2025. 

The student must retain a copy of the question(s) and computer-generated content for reference and authentication purposes, in a non-editable format (such as a screenshot), and provide a brief explanation of how it has been used. 

This must be included with the work the student submits for assessment, so the teacher/assessor is able to review the work, the AI-generated content and how it has been used. If this is not submitted, but the teacher/assessor suspects that the student has used AI tools, the teacher/assessor will need to consult the centre’s malpractice policy for appropriate next steps and must take action to assure themselves the work is the student’s own. Where the teacher/assessor cannot assure themselves, they must follow their centre’s internal procedures and the published guidance for assessment. 

Further guidance is set out in the JCQ Plagiarism in Assessments document (see link below). 

The JCQ regulations for candidates on referencing may be found in the following: 

The JCQ guidance for teachers on referencing may be found in the following: 

Other actions which should be considered in relation to acknowledging AI use are: 

a) Students are reminded that, as with any source, poor referencing, paraphrasing and copying sections of text may constitute malpractice, which could attract severe sanctions including disqualification. In the context of AI use, students must be clear about what is, and what is not, acceptable in respect of acknowledging AI content and the use of AI sources. For example, it would be unacceptable to simply reference ‘AI’ or ‘ChatGPT’, just as it would be unacceptable to state ‘Google’ rather than the specific website and webpages which have been consulted;

b) Students are also reminded that if they use AI they have not independently met the marking criteria and therefore will not be rewarded. Examples of how to implement this can be found in Appendix B: Exemplification of AI use in marking student work at the end of this document.

6. AI use and marking 

When marking student work in which AI use has been acknowledged, and there are no concerns of AI misuse, the assessor must still ensure the student is not rewarded if they have used AI tools such that they have not independently met the marking criteria. Depending upon the marking criteria or grade descriptors being applied, the assessor may need to take into account the student’s failure to independently demonstrate their understanding of certain aspects when determining the appropriate mark/grade to be awarded. Where such AI use has been considered, and particularly where this has had an impact upon the final marks/grades awarded by the assessor, clear records should be kept – this provides feedback to the student and provides clarity in the event of an internal appeal or the work being selected for moderation/standards verification. 

Examples of how to take into account the acknowledged use of AI tools when marking may be found in Appendix B: Exemplification of AI use in marking student work. 

Centres may determine, after careful consideration of any data privacy concerns, whether it is appropriate for their teachers and assessors to use AI tools to help mark student work. Where centres do permit AI tools to be used to mark student work, an AI tool cannot be the sole marker. A human assessor must review all the work in its entirety and determine the mark it warrants, regardless of the outcomes of an AI tool. The assessor remains responsible for the mark/grade awarded.

7. Preventing AI misuse in assessments 

While there may be benefits to using AI in some situations, there is the potential for it to be misused by students, either accidentally or intentionally. Because AI misuse involves a student submitting work for qualification assessments which is not their own, it can be considered a form of plagiarism. JCQ has published guidance on plagiarism in assessments, which covers what it is, how to prevent it, and how to detect it. 

Teachers and assessors must be assured that the work they accept for assessment and mark is the student’s own work. They are required to confirm this during the assessment process and, if they have doubts, must follow their centre’s internal procedures and published guidance for assessment. 

Centres must have mechanisms in place, as previously referenced in the section titled ‘Centre Responsibilities’, which include: 

  • the approach the centre will use to prevent and identify AI misuse in each of the subjects it delivers which include coursework or non-examination assessment, including the approach taken for any private candidates. 
  • the process to follow where there are concerns about AI misuse before the student’s work is authenticated. In this situation, the centre is responsible for determining next steps and a teacher/assessor should not refer the work to the awarding organisation for a decision. 

Those who work with the students on a regular basis and are familiar with their ability and standard of work are usually best-placed to make determinations about the misuse of AI although the relevant awarding organisation is available to provide advice and guidance to help the centre where needed. 

To prevent misuse, education and awareness among staff and students are likely to be key. Here are some actions which could be taken (many of these will already be in place in centres, as these are not new requirements): 

a) Consider restricting access to online AI tools on centre devices and networks; 

b) Ensure access to online AI tools is restricted on centre devices used for exams; 

c) Set reasonable deadlines for submission of work and provide reminders; 

d) Where appropriate, allocate time for sufficient portions of work to be completed in class under direct supervision to allow the teacher to authenticate all of each student’s work with confidence; 

e) Examine intermediate stages in the production of work, in order to ensure work is underway in a planned and timely manner and that work submitted represents a natural continuation of earlier stages; 

f) Introduce classroom activities that use the level of knowledge/understanding achieved during the course, thereby making the teacher confident the student understands the material; 

g) Consider whether it is helpful to engage students in a short verbal discussion about their work to ascertain they understand it and it reflects their own independent work; 

h) Do not accept, without further investigation, work which staff suspect has been taken from AI tools without proper acknowledgement or is otherwise plagiarised – doing so encourages the spread of this practice and is likely to constitute staff malpractice which can attract sanctions; 

i) Issue tasks for centre-devised assignments which are, wherever possible, topical, current and specific, and which require the creation of content which is less likely to be accessible to AI models trained using historic data.

8. Identifying misuse in assessments 

Identifying the misuse of AI by students requires the same skills and observation techniques teachers are already using to assure themselves student work is authentically their own. There are also some tools that may be used. These different methods are explored below. 

Comparison with previous work 

When reviewing a given piece of work to ensure its authenticity, it is useful to compare it against other work created by the student. Teachers could consider comparing newly submitted work with work completed by the student in the classroom, or under supervised conditions. Where the work is made up of writing, it is possible to make note of the following characteristics: 

  • Spelling and punctuation 
  • Grammatical usage 
  • Writing style and tone
  • Vocabulary 
  • Complexity and coherency 
  • General understanding and working level 
  • The mode of production (i.e. whether handwritten or word-processed) 

Private candidates 

Verifying the authenticity of work submitted by private candidates can be more challenging for centres, given they may not have a good understanding of the standard at which the student is currently working. Before accepting entries from a private candidate for a subject that includes NEA or coursework, the centre must consider the steps they will take to enable teachers/assessors to ensure the work submitted for assessment is the student’s own independent work. This may involve requiring the student to undertake some of the work under supervision, reviewing the student’s portfolio of evidence across a range of qualifications, or holding a short discussion with the student regarding their work. 

Potential indicators of AI misuse 

If the following are seen in student work, it may be an indication the student has misused AI: 

a) A default use of American spelling, currency, terms and other localisations. 

b) A default use of language or vocabulary which may not accord with the qualification level (though be aware AI tools may be instructed to employ different languages, registers and levels of proficiency when generating content). 

c) A lack of direct quotations and/or use of references where these are required/expected (though some AI tools will produce quotations and references). 

d) Inclusion of references which cannot be found or verified (some AI tools have provided false references to books or articles by real authors). 

e) A lack of reference to events occurring after a certain date (reflecting when an AI tool’s data source was compiled), which may be notable for some subjects.

f) Instances of incorrect and/or inconsistent use of first-person and third-person perspective where generated text is left unaltered. 

g) A difference in the language style used when compared to that used by a student in the classroom or in other previously submitted work. 

h) A variation in the style of language evidenced in a piece of work, if a student has taken significant portions of text from AI and then amended it. 

i) A lack of graphs/data tables/visual aids where these would normally be expected. 

j) A lack of specific local or topical knowledge. 

k) Content being more generic in nature rather than relating to the student themself, or a specialised task or scenario, if this is required or expected. 

l) The inadvertent inclusion by students of warnings or provisos produced by AI to highlight the limits of its ability, or the hypothetical nature of its output. 

m) The submission of student work in a typed format, where their normal output is handwritten. 

n) The unusual use of several concluding statements throughout the text, or several repetitions of an overarching essay structure within a single lengthy essay, which can be a result of AI being asked to produce an essay several times to add depth and variety or to overcome its output limit. 

o) The inclusion of strongly stated non-sequiturs or confidently incorrect statements within otherwise cohesive content. 

p) Overly verbose or hyperbolic language that may not be in keeping with the candidate’s usual style. 

Automated detection 

AI tools, as large language models, produce content by ‘guessing’ the most likely next word in a sequence. This means AI-generated content uses the most common combinations of words, unlike humans, who tend to use a variety of words in their normal writing. Several programs and services use this difference to statistically analyse written content and determine the likelihood that it was produced by AI, for example: 

These may be used as a check on student work and/or to verify concerns about the authenticity of student work. However, it should be noted that the above tools will give lower scores for AI-generated content which has been subsequently amended by students, as they base their scores on the predictability of words. Spending time getting to know how the detection tools work will help teachers and assessors understand what they are and are not capable of. 
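To illustrate the statistical principle involved, the following short Python sketch scores a passage by how predictable each word is to a language model. This is a minimal illustration of the general technique only, not the implementation of any particular detection product mentioned in this document; the open-source GPT-2 model and the Hugging Face transformers library are assumptions made for the example.

    # Illustrative sketch of predictability-based AI detection.
    # Assumes the open-source GPT-2 model and the Hugging Face
    # "transformers" and "torch" libraries; commercial detectors
    # use their own models and additional signals.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    def perplexity(text: str) -> float:
        """Average unpredictability of each token given the ones before it.

        Lower perplexity = more predictable text, which these methods
        treat as a (weak) statistical signal of AI generation.
        """
        enc = tokenizer(text, return_tensors="pt")
        with torch.no_grad():
            # Passing the input as its own labels makes the model score
            # how well it predicts each next token in the passage.
            out = model(input_ids=enc.input_ids, labels=enc.input_ids)
        return torch.exp(out.loss).item()

    print(perplexity("Paste a passage of student work here to score it."))

On a measure like this, unedited AI output tends to score as highly predictable, while text a student has substantially rewritten moves back towards human-typical scores – which is why, as noted above, detection tools give lower scores for AI-generated content that has subsequently been amended. 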

AI detection tools, including those listed above, employ a range of detection models which vary in accuracy depending on the AI tool and version used, the proportion of AI to human content, prompt types and other factors (such as an individual’s English language competency). In instances where misuse of AI is suspected it may be helpful to use more than one detection tool to provide an additional source of evidence about the authenticity of student work. 

The use of detection tools, where used, should form part of a holistic approach to considering the authenticity of students’ work; all available information must be considered when reviewing any malpractice concerns. Teachers will know their students best and so are best placed to assess the authenticity of work submitted to them for assessment – AI detection tools can be a useful part of the evidence they can consider.

The list of certain suppliers of AI-related products is for information purposes only and does not constitute an endorsement by JCQ. It is each centre’s responsibility to investigate and verify any suppliers they use, including any terms and conditions which govern the sale or use of the supplier’s products. The list provided is not exhaustive.

9. Reporting 

If a student has not signed the declaration of authentication, centres do not have to report the incident to the appropriate awarding organisation. Steps to resolve such incidents must be detailed in the centre malpractice/plagiarism policy. These steps must include: 

  • ensuring students are aware of what malpractice is, 
  • explaining how to avoid malpractice, and 
  • explaining how to properly reference sources and acknowledge AI tools. 

Teachers must not accept work which is not the student’s own. Ultimately the Head of Centre has the responsibility for ensuring students submit authentic work. 

If AI misuse is detected or suspected by the centre and the declaration of authentication has been signed by the student, the case must be reported to the relevant awarding organisation. The procedure is detailed in the JCQ Suspected Malpractice: Policies and Procedures.

10. Awarding Organisation actions 

The JCQ awarding organisations ensure staff, moderators and examiners are appropriately trained in the identification of malpractice and have established procedures for reporting and investigating suspected malpractice. 

If AI misuse is raised by or reported to an awarding organisation, full details of the allegation will usually be relayed to the centre. The relevant awarding organisation will liaise with the Head of Centre regarding the next steps of the investigation and how appropriate evidence will be obtained. The awarding organisation will then consider the case and, if necessary, impose a sanction in line with the sanctions given in the JCQ Suspected Malpractice: Policies and Procedures (https://www.jcq.org.uk/exams-office/malpractice/). The sanctions applied to a student committing plagiarism and making a false declaration of authenticity range from a warning regarding future conduct to disqualification and the student being barred from entering for one or more examinations for a set period of time. 

Examples of AI misuse cases dealt with by awarding organisations may be found in Appendix A: AI misuse examples at the end of this document. 

Awarding organisations will also take action, which can include the imposition of sanctions, where centre staff are knowingly accepting, or failing to check, inauthentic work for qualification assessments.

Appendix A: AI misuse examples 

Introduction 

The following are anonymised examples from recent malpractice cases involving the misuse of AI tools. Please note that although specific subjects are identified in the examples below, the circumstances described, and the associated actions and sanctions, could apply to any qualification. The examples have been chosen to cover a range of different contexts, including cases where centres reported AI misuse concerns and cases where awarding body assessment personnel identified potential issues. The fourth example shows what can go wrong when word processors have not been correctly set up for examinations. 

Plagiarism – AI misuse 

Awarding body: AQA
Qualification: A Level History NEA 

A centre reported that one of its A Level History teachers had concerns relating to two candidates’ NEA submissions. The concerns were that multiple sections were inconsistent with other parts of the candidates’ work and with the candidates’ usual levels and styles of writing. 

The centre used AI detection software to follow up on the teacher’s concerns. The centre’s review identified the following. 

Candidate A: The AI detection software identified the work as being highly likely to have been generated by AI. This candidate admitted using ChatGPT to generate a guideline for their own work and claimed that they had accidentally submitted the guideline instead of their own work. 

Candidate B: The AI detection software identified the work as being potentially generated by AI, and likely a combination of AI and human input. This candidate admitted using ChatGPT for some of the content of their work, for both the improvement of their own work as well as the creation of entirely new content. 

The centre reported both candidates to the awarding body and provided confirmation that the candidates had been issued all relevant ‘information for candidates’ documents and the candidates had signed the declaration of authenticity to declare that the work completed was their own. 

Both candidates were found to have committed malpractice. Candidate A was disqualified from the A Level History qualification and Candidate B received a loss of all marks gained for the A Level History NEA component. 

Awarding body: OCR
Qualification: Cambridge Nationals Enterprise and Marketing 

The moderator raised concerns of suspected plagiarism in a unit of the above qualification, due to a lack of referencing seen within candidates’ work. 

Through the use of Turnitin, two candidates were identified who may have used AI tools, or Large Language Models (LLMs), to generate content for at least one Learning Objective. This content included explanations of different business terms and financial analyses. 

One candidate admitted to using ChatGPT in the later parts of their coursework as they had not understood some of the questions and felt assistance from their teacher was “too infrequent”. They stated their logic was that it was no different from asking a teacher for advice: the AI tool would take information from across the internet and, since they were asking specific questions, the ‘reply’ from the AI tool would be the same as getting teacher advice and feedback.

The other candidate admitted they had used an AI tool to generate content for their work but could not remember which sections of the work were their own. 

Although the cohort had been told about plagiarism and how to avoid it, there had been no specific mention of AI tools – despite AI misuse being a form of plagiarism. 

Based on the evidence provided by the centre, it was determined that the two candidates would receive zero marks for the affected Learning Objectives. 

Awarding body: Pearson
Qualification: Extended Project P301 

During a regular review of work for the purposes of identifying potential AI misuse, a candidate’s Extended Project submission was identified by detection software as containing several unreferenced sections of AI-generated content. A further evaluation of the submission concluded that multiple sections of the work included extensive indicators associated with generative AI. When the centre was contacted, the candidate declined to provide a statement in response to the concerns, and the case was referred to Pearson’s Malpractice Committee for consideration. 

Following a careful review of the available evidence, the Malpractice Committee found the candidate to be in breach of the JCQ AI Use in Assessments document which defines as malpractice “copying or paraphrasing sections of AI-generated content so that the work submitted for assessment is no longer the student’s own” and “failing to acknowledge use of AI tools when they have been used as a source of information”. 

The Malpractice Committee determined that, as a result of the malpractice, the candidate should be disqualified from the qualification. 

Awarding body: AQA
Qualification: GCSE Religious Studies 

A candidate’s word processed exam script was reported to the malpractice team by the examiner marking it, because the examiner had identified frequent American spellings and felt that the highly sophisticated language and concepts it contained were not consistent with GCSE level work. 

The candidate’s word processed script was reviewed using AI detection software which returned a high probability score for the use of AI. The candidate was asked to provide a statement, in which they denied the use of AI. 

After consideration of the evidence gathered, it was decided the candidate had breached examination conditions and used AI for the production of answers in their examination. The candidate received a loss of all marks gained for a component. 

Post-results, the centre also concluded that the candidate’s marks and grades were not consistent with expectation or previous attainment. Following the outcome of this case and the disparity in performance flagged by the centre, all of the candidate’s assessments were processed through AI detection software, which showed that multiple components were affected. The outcome was that the candidate received a loss of all marks gained for the affected components. 

Following an investigation it was found that the candidate’s word processor had not been correctly set up. Internet access should have been disabled for the word processor, which would have prevented this malpractice from occurring. As part of the investigation, the awarding body sought to ensure that such incidents could not recur. The centre gave details of the steps that would be taken to prevent a recurrence of this issue, which included the re-training of invigilators on word processor set up.

Awarding body: OCR
Qualification: A Level Art and Design 

A candidate was suspected of having included AI-generated content from DeepL, an AI-powered translation tool, in their sketchbook for A Level Art and Design. The Deputy Head of the centre explained that the candidate’s approach involved researching in their own language and then translating all of this into English. The candidate admitted they used DeepL to translate source material into their sketchbook and were aware that this was not allowed for their assessment. By translating their work in this way, the candidate effectively called into question the overall authenticity of the work: from that point onward, it became unclear what ideas, knowledge and understanding presented were entirely their own. 

The teacher reported that while reviewing the candidate’s work, they identified several sections with writing inconsistencies. Through their own internal analysis, the centre estimated 98% of the content to be influenced by AI. The candidate explained that most of their academic materials were in their own language, which caused difficulties in their work. They assumed that DeepL would be a reliable and accurate translator for their needs. However, they were unaware that it utilised AI support. Despite checking DeepL’s website, they did not realise the seriousness of using AI at the time and would have avoided translation software had they known. The centre stated that candidates were made fully aware of the rules and regulations around AI use leading up to the assessment. 

The JCQ Instructions for Conducting Non-Examination Assessments prohibit candidates from using the internet or other sources without acknowledgement. It is evident that the candidate breached the assessment regulations by not acknowledging and referencing the use of an AI-powered translation tool in their work. Furthermore, the candidate was aware of the implications, as the centre had, on multiple occasions, provided guidance on referencing and AI. Additionally, the candidate signed a declaration of authenticity form confirming that the work submitted was their own and that they had clearly referenced any sources and AI tools they had used. In view of the above, and in accordance with the JCQ Suspected Malpractice: Policies and Procedures document, the candidate was sanctioned with a loss of marks, which resulted in zero marks being awarded for the component. 

Awarding body: WJEC
Qualification: Level 3 Diploma in Criminology, Unit 3 controlled assessment 

During a centre’s internal moderation process and prior to work being submitted to WJEC moderation, an assessor/teacher at the centre suspected that part of the work presented was not entirely the candidate’s own. 

The assessor suspected that the work produced against certain assessment criteria did not match the work the candidate had produced against other assessment criteria. 

With this particular qualification, candidates are permitted to take a folder of general notes into the assessment based on prior research for an unseen assignment brief. 

When comparing the assessment criteria of concern to the notes the candidate took into the assessment, it was found that the candidate had copied their notes word for word, and they were identified as being generated by AI. The candidate had not referenced the source as AI-generated and had not declared it. 

On receipt of the candidate’s work WJEC conducted further checks of the work via an AI detection tool, which provided evidence to confirm the centre’s suspicion. 

The checks of the candidate’s full body of work did not detect any further AI-generated content elsewhere. 

As part of the investigation, the candidate confirmed that they had used an AI tool for one section of their notes only, due to rushing to prepare their notes prior to the assessment taking place.

Following a careful review of the available evidence, WJEC determined that the candidate was in breach of the assessment requirements and of the JCQ AI Use in Assessments document, which defines as malpractice “copying or paraphrasing sections of AI-generated content so that the work submitted for assessment is no longer the student’s own” and “failing to acknowledge use of AI tools when they have been used as a source of information”.

WJEC decided that, as the AI misuse was confined to one assessment criterion, the candidate would receive a penalty of loss of marks for a section. This resulted in the candidate not obtaining the overall qualification, as all assessment criteria for the unit must be met and a minimum number of marks must be achieved in each assessment criterion to gain the qualification.

Appendix B: Exemplification of AI use in marking student work 

Introduction 

The following are examples of how the requirements of this document relating to students using AI tools can be applied by teachers and assessors when students have not independently met the marking criteria, as set out in the Acknowledging AI use section above: “b) Students are also reminded that if they use AI they have not independently met the marking criteria and therefore will not be rewarded.” In the examples below, students have not independently met the marking criteria because of their over-reliance on AI tools. 

Examples 

Awarding body: Pearson
Qualification: A Level History 

A candidate has produced coursework for the NEA component of the qualification which is of a good standard. The candidate has used a range of sources and AI tools which have been appropriately cited within the work. The candidate has demonstrated some understanding of the topic, using generally correct and appropriate information. The candidate has also expressed an opinion on the topic at hand and has attempted some discussion of differing viewpoints. The work is clear and coherent but does lack depth. 

The assessor marking the work at the centre consults the mark scheme for this component and identifies that the work is likely to attract marks which make it fall within Level 3. The mark scheme for this level is as follows: 

Level 3 (17-24 marks): Explains analysis and attempts evaluation

  • A range of material relevant to the enquiry has been identified from reading and appropriately cited. Information has been appropriately selected and deployed to show understanding of the overall issue in question.
  • A judgement on the question is related to some key points of view encountered in reading and discussion is attempted, albeit with limited substantiation. Contextual knowledge of some issues related to the debate is shown and linked to some of the points discussed.
  • Analyses some of the views in three chosen works by selecting and explaining some key points and indicating differences. Explanation demonstrates some understanding of the reasons for differences.
  • Attempts are made to establish valid criteria for evaluation of some arguments in the chosen works and to relate the overall judgement to them, although with weak substantiation.
  • Mostly accurate and relevant knowledge is included to demonstrate some understanding of the conceptual focus of the enquiry, but material lacks range or depth. The answer is concise and shows some organisation. The general trend of the argument is clear, but parts of it lack logic, coherence and precision.

Low level 3 (17-18 marks): The qualities of Level 3 are displayed, but material is less convincing in some aspects and it is not concise.
Mid level 3 (19-21 marks): The qualities of Level 3 are displayed, but material is less convincing in some aspects or it is not concise. 
High level 3 (22-24 marks): The qualities of Level 3 are securely displayed.

Having carefully considered the descriptors and the candidate’s work, the assessor considers the work is of a high level 3 standard, worth 22-24 marks. However, for the section in the work in which the candidate discusses some key points and differences between three historical resources, the candidate has relied solely upon an AI tool. This use has been appropriately acknowledged and a copy of the input to and output from the AI tool has been submitted with the work. As the candidate has not independently met the marking criteria they cannot be rewarded for this aspect of the descriptor (i.e. the third bullet point above). The assessor therefore places the work in the mid-level 3 category, awarding 20 marks. 

The assessor ensures this decision regarding the student’s AI use and its impact on marking is clearly recorded. This provides feedback to the student and provides clarity in the event of an internal appeal or the work being selected for moderation. 

Awarding body: Pearson
Qualification: BTEC Level 3 National Extended Diploma in Business 

A student has produced work for unit 1: Exploring Business. The student has produced work of a good standard in which they have compared two different businesses in some depth. The candidate has used a range of sources and AI tools which have been appropriately cited within the work. In the work the student has assessed the relationship with stakeholders of the two companies, analysed the two organisations’ structures, discussed the effects of the business environment on the companies – including their response to recent and potential future changes in the market, and reviewed the importance of innovation and entrepreneurship in the success of one of the companies. 

The assessor carefully reviews the assessment criteria for unit 1, which are as follows: 

Learning aim A: Explore the features of different businesses and analyse what makes them successful
Pass: A.P1 Explain the features of two contrasting businesses. A.P2 Explain how two contrasting businesses are influenced by stakeholders.
Merit: A.M1 Assess the relationship and communication with stakeholders of two contrasting businesses using independent research.
Distinction: AB.D1 Evaluate the reasons for the success of two contrasting businesses, reflecting on evidence gathered.

Learning aim B: Investigate how businesses are organised
Pass: B.P3 Explore the organisation structures, aims and objectives of two contrasting businesses.
Merit: B.M2 Analyse how the structures of two contrasting businesses allow each to achieve its aims and objectives.

Learning aim C: Examine the environment in which businesses operate
Pass: C.P4 Discuss the effect of internal, external and competitive environment on a given business. C.P5 Select a variety of techniques to undertake a situational analysis of a given business.
Merit: C.M3 Assess the effects of the business environment on a given business.
Distinction: C.D2 Evaluate the extent to which the business environment affects a given business, using a variety of situational analysis techniques.

Learning aim D: Examine business markets
Pass: D.P6 Explore how the market structure and influences on supply and demand affect the pricing and output decisions for a given business.
Merit: D.M4 Assess how a given business has responded to changes in the market.
Distinction: D.D3 Evaluate how changes in the market have impacted on a given business and how this business may react to future changes.

Learning aim E: Investigate the role and contribution of innovation and enterprise to business success
Pass: E.P7 Explore how innovation and enterprise contribute to the success of a business.
Merit: E.M5 Analyse how successful the use of innovation and enterprise has been for a given business.
Distinction: E.D4 Justify the use of innovation and enterprise for a business in relation to its changing market and environment.

The assessor is content that the work meets all Pass, Merit and Distinction criteria. However, the assessor is aware that in the section in which the student discusses how one of the businesses might react to future changes in the business environment, the student has relied upon the use of an AI tool (appropriately acknowledged, with the input and output from the AI tool submitted together with the assignment) and has not independently demonstrated their own understanding beyond this. The assessor therefore cannot award criterion D.D3 and, as the work has not met all Distinction assessment criteria (which is required to achieve an overall Distinction grade), the work is awarded a Merit grade overall. 

The assessor ensures this decision regarding the student’s AI use and its impact on marking is clearly recorded. This provides feedback to the student and provides clarity in the event of an internal appeal or the work being selected for standards verification.


Appendix C: Extracts from JCQ regulations and guidance relevant to the use of AI 

This appendix draws together in one place information from different JCQ documents which is relevant to preventing AI misuse, for ease of reference. You do not need to read the full appendix, nor does it contain any new information; the aim is to help you easily access applicable information in the extracts below. The extracts explain the responsibilities of centre staff and candidates which are relevant to managing the use of AI. 

Note that these are extracts only and do not reflect the full guidance and regulations, which can be accessed via the following links: 

General Regulations for Approved Centres 
Instructions for Conducting Examinations 
Instructions for Conducting Coursework 
Instructions for Conducting Non-Examination Assessments (GCE & GCSE Specifications) 

General Regulations for Approved Centres

Section | Content | Who is it for
5.3k | It is the responsibility of the Head of Centre to ensure that their centre has in place arrangements to co-ordinate and standardise all marking of centre-assessed components and to ensure that candidates’ centre-assessed work is produced, authenticated and marked, or assessed and quality assured, in accordance with the awarding bodies’ instructions. This applies to both internal and private candidates. | Centre staff
5.3z | It is the responsibility of the Head of Centre to ensure that their centre has in place the following policies for inspection, which must be reviewed and updated annually: a written malpractice policy which covers all qualifications delivered by the centre (the policy must detail how candidates are informed and advised to avoid committing malpractice in examinations/assessments and how suspected malpractice issues should be escalated within the centre and reported to the relevant awarding body; it must also acknowledge the use of AI, e.g. what AI is, when it may be used and how it should be acknowledged, the risks of using AI, what AI misuse is and how this will be treated as malpractice); and a written policy regarding the management of non-examination assessments including controlled assessments and coursework (for CCEA GCSE centres this would be a written controlled assessments policy). | Centre staff
5.11 | The centre will: e) conduct all examinations/assessments governed by these regulations in accordance with the following JCQ documents for the academic year 2024/25: Access Arrangements and Reasonable Adjustments; Instructions for conducting coursework; Instructions for conducting examinations; Instructions for conducting non-examination assessments. | Centre staff

Instructions for Conducting Examinations

Section | Content | Who is it for
14.25 | A word processor: f. must be used to produce work under secure conditions, otherwise the candidate’s script may not be accepted; g. must not be used to perform skills which are being assessed; h. must not give the candidate access to other applications such as a calculator (where prohibited in the examination), email, the internet, social media sites or spreadsheets; j. must not have any predictive text software or an automatic spelling and grammar check enabled unless the candidate has been permitted a scribe (a scribe cover sheet must be completed) or the awarding body’s specification permits the use of automatic spell checking. | Centre staff
SectionContentWho is it for
3.1All coursework submitted for assessment must be the candidate’s own workCandidate
5.1In many subjects, candidates will use source material, including the internet and AI, when carrying out their coursework. However, candidates must not copy such material and claim it as their own work.Candidate
Section: 5.2
Who is it for: Candidate
If candidates use material from a source, or generated from a source, which is not their own work, they must indicate the particular part/element/phrase and state where it came from. Candidates must give detailed references even when they paraphrase the original material.
Where computer-generated content has been used (such as an AI Chatbot), the reference must show the name of the AI bot used and should show the date the content was generated. For example: ChatGPT 3.5 (https://openai.com/blog/chatgpt/), 25/01/2025. Candidates should retain a copy of the computer-generated content for reference and authentication purposes.

Section: 6.1
Who is it for: Candidate
Candidates must not:
• submit work which is not their own;
• use AI, books, the internet or other sources without acknowledgement or attribution;
• misuse AI.

Section: 6.2
Who is it for: Candidate
If irregularities in coursework are discovered prior to the candidate signing the declaration of authentication, this should be dealt with under the centre’s internal procedures and does not need to be reported to the awarding body.
Details of any work which is not the candidate’s own must be recorded on the authentication form supplied by the awarding body or other appropriate place.

Section: 6.3
Who is it for: Candidate
If irregularities in coursework are identified by a centre after the candidate has signed the declaration of authentication, the Head of Centre must submit full details of the case to the relevant awarding body immediately.

Section: 6.7
Who is it for: Candidate
Heads of centre and appropriate senior leaders must ensure that those members of teaching staff involved in the direct supervision of candidates producing coursework are aware of the potential for malpractice.

Section: 7.1
Who is it for: Candidate
Each candidate (candidate being defined as someone for whom an entry is in place for the unit or qualification) must sign a declaration when submitting their coursework to their teacher for final assessment. Electronic signatures are acceptable. This is to confirm that the work is their own and that any assistance given and/or sources used have been acknowledged. Ensuring that they do so is the responsibility of the centre. Centres must record marks of ‘0’ (zero) if candidates cannot confirm the authenticity of work submitted for assessment.

Section: 7.2
Who is it for: Candidate
Teachers must confirm that all of the work submitted for assessment was completed under the required conditions and that they are satisfied the work is solely that of the individual candidate concerned. If they are unable to do so, the work must not be accepted for assessment.
All teachers must sign the declaration of authentication after the work has been completed. Electronic signatures are acceptable. Failure to sign the authentication statement may delay the processing of the candidate’s results.
If, during the external moderation process, it is found that the work has not been properly authenticated, the awarding body will set the mark(s) awarded by the centre to ‘0’ (zero).

Section: 7.3
Who is it for: Candidate
The teacher should be sufficiently aware of the candidate’s standard and level of work to be able to identify if the coursework submitted appears to be beyond that candidate’s talents.

Section: 7.4
Who is it for: Candidate
In most centres, teachers are familiar with candidates’ work through class and homework assignments. Where this is not the case, teachers should require coursework to be completed under direct supervision.

Section: 7.5
Who is it for: Candidate
In all cases, some direct supervision is necessary to ensure that the coursework submitted can be confidently authenticated as the candidate’s own.

Section: 7.6
Who is it for: Candidate
If teachers have reservations about signing the authentication statements, the following points of guidance should be followed:
• if it is believed that a candidate has received additional assistance and this is acceptable within the guidelines for the relevant specification, the teacher should award a mark which represents the candidate’s unaided achievement. The authentication statement must be signed and information given on the relevant form;
• if the teacher is unable to sign the authentication statement for a particular candidate, then the candidate’s work cannot be accepted for assessment. A mark of ‘0’ (zero) must be submitted;
• if malpractice is suspected, a member of the senior leadership team must be consulted about the procedure to be followed.

Section: 8.1
Who is it for: Candidate
When marking coursework, teachers must pay close attention to the requirements of the specification. Teachers should note that it is their responsibility to award marks for coursework in accordance with the marking criteria detailed in the awarding body’s specification and subject-specific associated documents. Teachers must show clearly how the marks have been awarded in relation to these marking criteria. The centre’s marks must reflect the relative attainment of all the candidates.
Teachers must not use artificial intelligence as the sole means of marking candidates’ work.

Appendix 2 – Information for candidates – coursework assessments effective from 1 September 2024


Who is it for: Candidate

You can demonstrate your knowledge and understanding of a subject by using information from sources, or generated from sources, which may include the internet and AI. Remember, though, information from these sources may be incorrect or biased. You must take care how you use this material – you cannot copy it and claim it as your own work.
The regulations state that:
‘the work which you submit for assessment must be your own’.
Where computer-generated content has been used (such as an AI Chatbot), your reference must show the name of the AI bot used and should show the date the content was generated. For example: ChatGPT 3.5 (https://openai.com/blog/chatgpt/), 25/01/2025. You must submit a copy of the computer-generated content with your work for reference and authentication purposes.
Don’t be tempted to use any pre-prepared or generated online solutions and try to pass them off as your own work – this is cheating. Electronic tools used by awarding bodies can detect this sort of copying.
Plagiarism involves taking someone else’s words, thoughts, ideas or outputs and trying to pass them off as your own. It is a form of cheating which is taken very seriously.

Who is it for: Centre staff
How do the awarding bodies monitor the management of non-examination assessments in centres?
Awarding bodies require each centre to have a non-examination assessment policy in place to:
• cover procedures for planning and managing non-examination assessments;
• define staff roles and responsibilities for non-examination assessments;
• manage risks associated with non-examination assessments.

Section: 4.1 Supervision
Who is it for: Centre staff
Where appropriate to the component being assessed, the following arrangements apply unless the awarding body’s specification says otherwise.
Candidates do not need to be directly supervised at all times. The use of resources, including the internet, is not tightly prescribed. Centres must always check the subject-specific requirements issued by the awarding body.
The centre must ensure that:
• there is sufficient supervision of every candidate to enable work to be authenticated;
• the work that an individual candidate submits for assessment is their own.
Work may be completed outside of the centre without direct supervision, provided that the centre is confident that the work produced is the candidate’s own.
Centres must ensure that candidates understand what they need to do to comply with the regulations for non-examination assessments. This is outlined in the JCQ document Information for candidates – non-examination assessments.
Centres must ensure that candidates:
• understand that information from all sources must be referenced;
• receive guidance on setting out references;
• are aware that they must not plagiarise other material.

Section: 4.3 Resources
Who is it for: Candidate and centre staff
What resources are allowed?
In many subjects, candidates will use source material, including the internet and AI, when researching and planning their tasks. Candidates normally have unrestricted access to resources. Centres must refer to the JCQ document AI Use in Assessments: Protecting the Integrity of Qualifications, as well as the awarding body’s specification and/or associated documentation published by the awarding bodies and the regulator.
Some subjects require candidates to produce the work for assessment in formally supervised sessions. Unless the awarding body’s specification says otherwise, for all formally supervised sessions:
• the use of resources is always tightly prescribed and normally restricted to the candidate’s preparatory notes;
• access to the internet is not permitted;
• candidates are not allowed to use their own computers or other electronic devices, e.g. mobile phones.
Are candidates allowed to introduce new resources in-between formally supervised sessions?
No. Candidates are not allowed to augment notes and resources between sessions. When work for assessment is produced over several sessions, the following material must be collected and stored securely at the end of each session (and not be accessible to candidates):
• the work to be assessed;
• preparatory work.
Additional precautions need to be taken if the centre permits candidates to use computers to store work. This may involve collecting memory sticks for secure storage between sessions or restricting candidates’ access to a specific area of the centre’s IT network.
How should sources be acknowledged?
The work submitted for assessment must include references where appropriate. To facilitate this, each candidate should keep a detailed record of their own research, planning, resources etc. The record should include all the sources used, including books, websites and audio/visual resources.
Guidance is given in the JCQ document Information for candidates – non-examination assessments.

Section: 4.6 Authentication procedures
Who is it for: Centre staff and candidate
How is candidates’ work authenticated?
Teachers must be sufficiently familiar with the candidate’s general standard to judge whether the piece of work submitted is within his/her capabilities.
Where required by the awarding body’s specification, the following procedures apply.
All candidates must sign a declaration to confirm that the work they submit for final assessment is their own unaided work.
Teachers must sign a declaration of authentication after the work has been completed, confirming that:
• the work is solely that of the candidate concerned;
• the work was completed under the required conditions.
Electronic signatures are acceptable. Typed names will be taken as being as binding as a handwritten signature.
What if the teacher has doubts about the authenticity of the work?
If teachers are unable to confirm that the work presented by a candidate is their own and has been completed under the required conditions, they must:
• not accept the candidate’s work for assessment;
• record a mark of ‘0’ (zero) for internally assessed work.
If teachers are concerned that malpractice may have occurred or are unable to authenticate the work for any other reason, they must inform a member of the senior leadership team (see section 9).
If, during the external moderation process, it is found that the work has not been properly authenticated, the awarding body will set the mark(s) awarded by the centre to ‘0’ (zero).

Section: 6.1 Marking and annotation
Who is it for: Centre staff
Teachers are responsible for marking work in accordance with the marking criteria detailed in the relevant specification and associated subject-specific documents.
Teachers must not use artificial intelligence as the sole means of marking candidates’ work.

Section: 9 Malpractice
Who is it for: Candidate and centre staff
Candidates must not:
• submit work which is not their own;
• make available their work to other candidates through any medium;
• allow other candidates to have access to their own independently sourced material;
• assist other candidates to produce work;
• use books, the internet or other sources without acknowledgement or attribution;
• submit work that has been word processed by a third party without acknowledgement;
• include inappropriate, offensive or obscene material.

Teaching staff must:
• be vigilant in relation to candidate malpractice and be fully aware of the published regulations;
• escalate and report any alleged, suspected or actual incidents of malpractice to the senior leadership team or directly to the awarding body.

Contacts

AQA

Devas Street
Manchester
M15 6EX

City & Guilds

Giltspur House
5-6 Giltspur Street
London
EC1A 9DE

CCEA

29 Clarendon Road
Clarendon Dock
Belfast
BT1 3BG

NCFE

Q6 Quorum Business Park
Benton Lane
Newcastle upon Tyne
NE12 8BT

OCR

The Triangle Building
Shaftesbury Road
Cambridge
CB2 8EA

Pearson

80 Strand
London
WC2R 0RL

TQUK

Crossgate House
Cross Street
Sale
M33 7FT

WJEC

245 Western Avenue
Cardiff
CF5 2YX