Assessment Frameworks for the Generative AI Era

10 min read
12 August, 2025
Jaco Reynecke

Permitted, Restricted, or Prohibited? Setting Clear Rules for Generative AI in Student Work

The challenge for educators is no longer whether generative AI belongs in education—it’s knowing when to allow it, when to limit it, and when to prohibit it entirely. Getting this balance right ensures we protect the integrity of assessments while giving students opportunities to develop essential skills for working with generative AI safely and effectively.

If rules are too loose, generative AI can displace the student’s own thinking. If they’re too rigid, we risk denying students access to a tool that can genuinely enhance learning—fostering understanding in a conversational, responsive way that is faster and more accessible than traditional research methods. Used well, generative AI can act like an on-demand tutor, helping students clarify concepts, think critically, and move more quickly from question to insight.

The Australian Framework for Generative AI in Schools sets a clear direction: integrate generative AI in ways that are safe, ethical, and purposeful. UNSW’s Assessment Design Procedure (4.1) reinforces the need for explicit generative AI-use statements in every assessment brief. At NexEd AI, our focus is on translating this guidance into practical, curriculum-aligned rules that teachers can apply confidently, while embedding two core principles:

Transparency – When generative AI is used, it must be disclosed.
Critical evaluation – Generative AI outputs must be carefully checked for factual accuracy and bias, with claims supported by legitimate sources.

When Generative AI Should Be Prohibited

There are many assessment contexts in which generative AI use would undermine the demonstration of the very skills, understandings, and capabilities the task is designed to assess. In these situations, prohibition is essential to ensure the evidence collected is valid and reliable. Examples include:

Externally invigilated examinations
NAPLAN, ATAR examinations, and other senior secondary external assessments where independent recall, reasoning, and problem-solving are assessed.

Closed-book practical assessments
Science experiments, technical demonstrations in VET subjects, or performance assessments in The Arts or HPE, where practical application of knowledge and technique is the focus.

Authentic role-plays or simulations
Drama performances, language role-plays, or workplace simulations in VET courses assessing oral communication, interpersonal skills, and situational judgement.

Foundation skill checks
Diagnostic or formative assessments to gauge baseline understanding without support tools.

Creative originality tasks
Original compositions, art portfolios, or creative writing tasks where unique personal expression is central.

Accreditation or compliance-based assessments
Industry-standard, licensing, or safety-critical tasks requiring demonstration of independent competence.

In these contexts, even if generative AI use were disclosed, it would compromise the validity of the judgement against the achievement standards.

When Generative AI Can Be a Powerful Learning Tool Before Assessment

Prohibiting generative AI during the assessment does not mean banning it from the learning process. In fact, when used in teacher-directed, pre-assessment contexts, generative AI can be a highly effective tool for preparing students—provided its use is disclosed and all outputs are critically evaluated for factual accuracy, bias, and source credibility.

Some of the highest-value pre-assessment uses include:

Deep research support
Advanced generative AI platforms such as ChatGPT and Claude can access or summarise information from reputable, authored sources when prompted appropriately. By teaching students to prompt for source attribution, teachers can guide them to use these tools in ways that promote academic rigour.

Ideation and content knowledge acquisition
Students can use generative AI to build background knowledge and explore key concepts before an assessment. Deep research can inform this ideation phase, giving students a stronger foundation for task planning.

Iterative brainstorming
When combined with deep research, brainstorming with generative AI can help students develop and refine ideas through an iterative process, moving from broad possibilities to well-focused, evidence-based approaches.

Draft review and critique
Students can have generative AI review, analyse, and provide feedback on a draft response. Once verified for accuracy and relevance, this feedback can be used to strengthen the work and better prepare the student for the final task.

Exemplar generation
Teachers or students can use generative AI to produce annotated exemplars aligned with achievement standards or grade descriptors. This can help clarify what is required for different performance levels and support self-assessment.

Concept clarification
Generative AI can re-explain complex concepts using simpler language, analogies, or alternative perspectives. Students can then compare these explanations to authoritative sources to confirm accuracy.

When embedded in structured classroom activities, these uses can lift student performance, improve clarity of expectations, and develop the critical literacy skills needed to evaluate the quality and credibility of generative AI outputs. Importantly, these activities occur before the assessment task, ensuring the final evidence of learning is genuinely the student’s own.

The Three Models of Generative AI Use in Assessment

Prohibited – No generative AI use allowed during the task.
Best for: High-stakes, originality-focused, or safety-critical assessments.

Restricted – Generative AI allowed for certain stages but not the final submission.
Example: Students use generative AI to plan their argument but must produce the final essay without generative AI assistance—disclosing any use and including evaluation notes on outputs.

Permitted – Generative AI actively encouraged, with use embedded in the criteria.
Example: Students create a product prototype with generative AI-assisted coding, providing a process log and evaluating the generated code for accuracy, efficiency, and ethical considerations.
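
To make these rules easier to apply consistently, some schools encode them in planning documents or learning management systems. The sketch below is a minimal, hypothetical illustration in Python; the AIUseModel enum, the AssessmentRule structure, and the example tasks are our own assumptions for illustration, not part of any mandated framework.

from dataclasses import dataclass
from enum import Enum

class AIUseModel(Enum):
    PROHIBITED = "prohibited"  # no generative AI use during the task
    RESTRICTED = "restricted"  # allowed at certain stages, not the final submission
    PERMITTED = "permitted"    # actively encouraged, embedded in the criteria

@dataclass
class AssessmentRule:
    task: str
    model: AIUseModel
    permitted_stages: tuple = ()      # empty when use is prohibited
    disclosure_required: bool = True  # the transparency principle

# Hypothetical rules mirroring the three models above
rules = [
    AssessmentRule("ATAR trial examination", AIUseModel.PROHIBITED,
                   disclosure_required=False),
    AssessmentRule("Persuasive essay", AIUseModel.RESTRICTED,
                   permitted_stages=("planning", "brainstorming")),
    AssessmentRule("Product prototype", AIUseModel.PERMITTED,
                   permitted_stages=("coding", "testing", "evaluation")),
]

Encoding the rules this way makes the permitted stages explicit for each task, so students and teachers share the same reference point before the assessment begins.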

Generative AI Tools Used by Students During Assessment

When students use generative AI during an assessment—whether in a restricted or permitted model—it must be in a way that supports, rather than replaces, the demonstration of learning.

This means:

Transparency is non-negotiable – Students must disclose which generative AI tools they used, at what stage, and for what purpose. This can be done through:

A creation statement embedded in the submission
Annotated prompts and outputs
Process logs showing the work’s development

Critical evaluation must be visible – Students must show that they have:

Checked all generative AI outputs for factual accuracy
Identified and addressed potential bias
Cross-checked claims against legitimate, authored, and reputable sources
Made final decisions themselves rather than accepting generative AI suggestions uncritically
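
Where disclosures are collected digitally, it can help to standardise what each declaration captures so that transparency and critical evaluation are recorded together. The following sketch is one hypothetical shape for such a record; field names like tool, stage, and evaluation_notes are illustrative assumptions, not a prescribed format.

from dataclasses import dataclass, field

@dataclass
class AIUseDisclosure:
    tool: str     # e.g. "ChatGPT" or "Claude"
    stage: str    # e.g. "brainstorming", "draft feedback"
    purpose: str  # what the tool was used for
    prompts: list = field(default_factory=list)          # annotated prompts and outputs
    accuracy_checked: bool = False                       # outputs verified for factual accuracy
    bias_reviewed: bool = False                          # potential bias identified and addressed
    sources_checked: list = field(default_factory=list)  # legitimate sources used for cross-checking
    evaluation_notes: str = ""                           # how outputs were adopted, adapted, or rejected

# A creation statement or process log could then be a list of these
# records, submitted alongside the student's own work.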

Examples of compliant generative AI use within an assessment include:

Using generative AI to refine a section of text for clarity, with changes explained and justified in annotations

Generating a data visualisation, then verifying accuracy against the original dataset and commenting on limitations

Producing alternative solutions to a problem and critically comparing them to the student’s own approach

Requesting targeted feedback on a draft paragraph, then selectively implementing changes with justification

By embedding these expectations into assessment design, teachers ensure generative AI becomes part of an authentic demonstration of capability rather than a shortcut that undermines the validity of the evidence collected.

Making the Rules Work in Practice

From our work with schools, NexEd AI has found that generative AI-use guidelines are most effective when they:

State the expectation of transparency
Students must declare generative AI use in a process log, creation statement, or annotated submission.

Require evidence of critical evaluation
Outputs must be checked for factual accuracy and bias, and supported by legitimate sources where applicable.

Connect rules to the learning outcome
Students should understand why certain uses of generative AI are permitted or prohibited for that task.

The Opportunity for Australian Schools

Generative AI in education is not an all-or-nothing proposition. Many assessment contexts demand prohibition to maintain integrity. But in others, generative AI can be a powerful pre-assessment learning tool—if its use is transparent and critically evaluated.

NexEd AI supports schools to create assessment-specific generative AI-use guidelines that are clear, enforceable, and aligned with the Australian Curriculum, the Australian Framework for Generative AI in Schools, and UNSW’s assessment integrity principles. By embedding transparency and critical evaluation into every permitted use of generative AI, we ensure students graduate as capable, ethical users of this transformative technology.