AFRM Entry Phase Examination

The AFRM Entry Phase Examination (EPE) assesses trainees' readiness to progress through training. You must demonstrate competence in the foundational concepts, clinical skills and knowledge outlined in the Rehabilitation Medicine (adult) new curriculum while embodying professional practice behaviours.

Overview

Assessment changes

A key change in the Rehabilitation Medicine (adult) new curriculum is the modification of the Module 2 Clinical Assessment, now named the AFRM Entry Phase Exam (EPE). This change has also been applied to the PREP General Rehabilitation Medicine curriculum.

The final Module 2 Clinical Assessment was held in August 2024. From 2025, the EPE replaces this exam for all trainees.

The EPE is mapped to both the PREP and the new curriculum. Only content common to both curricula is assessed.

Key changes
  • The EPE is aligned with the Rehabilitation Medicine (adult) new curriculum standards appropriate to the level of training and the assessment format. New trainees are recommended to attempt the EPE in their Specialty Entry Phase (first year) of advanced training.
  • The number of Objective Structured Clinical Examination (OSCE) stations has increased to 10, including at least 3 static stations.
  • To pass the exam, a candidate's aggregate score must be greater than or equal to the overall pass mark and they must pass at least 6 stations.
  • Candidates are responsible for managing their time within stations. For full details, refer to the Time Management section.
  • Progression requirements have changed and depend on when training commenced.

See the exam format and curriculum focus under the Prepare tab.

Key dates

  • Applications open: Tuesday 7 April 2026
  • Applications close: 5pm AEDT, Tuesday 21 April 2026
  • Pre-exam special consideration requests close (for provisions on exam day): 5pm AEDT, Tuesday 21 April 2026
  • Exam: Saturday 22 August 2026
  • Exam day special consideration requests close (for technical and procedural issues): Friday 28 August 2026
  • Results released: From 3pm AEST, Thursday 24 September 2026

Information is correct as of 20 January 2026.
Note: The RACP follows an alternating day rule for clinical examinations, rotating each year between Saturday and Sunday.


Location

Royal North Shore Hospital

Ambulatory Care Centre (Level 3, main entrance level)

Reserve Road, St Leonards NSW 2065

Parking: RNSH parking | Public transport: Transport NSW

Risk management and contingency planning

In addition to quality assurance, a comprehensive risk management framework safeguards exam integrity and minimises disruptions. Lessons from the COVID-19 pandemic have strengthened this framework, enhancing the ability to respond to unforeseen challenges.

Risk management plans outline strategies for mitigating potential disruptions and ensuring coordinated responses to unexpected events. Crisis communication arrangements support timely updates, and contingency plans set out alternative exam delivery methods if required.

These plans are reviewed before each exam cycle and published on our website for transparency. Through proactive risk management and rigorous quality assurance, we remain committed to upholding the highest assessment standards, providing candidates and stakeholders with confidence in the exam process.

Plan A 

The exam proceeds face-to-face on a single day in Australia on 22 August 2026. Trainees from across Australia and Aotearoa New Zealand attend one exam site at an allocated time during the scheduled exam day.

Plan B 

If the exam can't be delivered as planned on 22 August 2026 due to unforeseen circumstances, it'll be postponed and will proceed a minimum of 3 months later for all candidates.

Recommendations from the Review of the Paediatric Clinical Examination in Australia and Aotearoa New Zealand are being implemented to enhance exam safety for candidates, examiners, patients, and their families/whānau/carers. To find out more, read the Review of Paediatric Clinical Examination in Australia and Aotearoa New Zealand.

Exam format overview

The exam follows the objective structured clinical exam (OSCE) format. You progress through a series of stations, each presenting a different clinical scenario.

The Prepare and Exam Day tabs provide detailed information on the content and processes of the exam day.


Exam development

The exam content is developed by the EPE Working Party, with processes designed to support validity, reliability and alignment with the Adult Rehabilitation Medicine curriculum. Topics are planned in advance and mapped to both the PREP and new curricula to ensure appropriate coverage of learning objectives and equitable assessment across trainee cohorts. This mapping supports blueprint fidelity and reduces construct under-representation.

Working Party members collaboratively develop questions and marking guides, engaging in structured peer review to refine clinical accuracy, clarity and scoring intent. Independent reviews, including role play and open-book evaluations by Fellows outside the Working Party, provide additional checks on interpretation, relevance and applicability to Aotearoa New Zealand practice. These steps help reduce item-writing bias, improve scoring consistency and strengthen content validity.

Once submitted to the RACP, questions undergo technical editing and further review to confirm internal consistency, alignment with assessment standards and overall suitability for inclusion in the exam. Final proofreading, approval and standard-setting processes ensure that each station meets the required performance expectations before materials are prepared for examination delivery.


Performance standard criteria

Standard-setting processes ensure that all candidates are assessed fairly and consistently. The EPE uses criterion-referenced methods, meaning performance is assessed against pre-determined criteria rather than a comparative curve. This approach ensures that all candidates who meet the required standard can pass.

Performance expectations are defined through detailed assessment criteria, marking guides, the Global Assessment Rating (GAR) scale and the Professional Competencies Rating Scales. Examiners apply these criteria to assess whether a candidate has demonstrated the required level of competence in each station. Pass marks are established using recognised standard-setting methods: the Modified Angoff method for static stations and the Borderline Regression method for live stations.

Examiners participate in structured calibration activities to develop a shared understanding of the required standard, the GAR categories and the observable behaviours that represent them. These processes strengthen scoring consistency across examiners and stations and support defensible, criterion-based pass–fail decisions.


Marking

Each station is marked by consensus between 2 examiners. All stations are equally weighted.

Static stations are assessed using a pre-determined marking guide. Examiners independently review the candidate’s responses and then agree on a final score for each sub-question. Pass marks for static stations are established using the Modified Angoff method, informed by subject matter experts’ expectations of minimally competent performance.
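
As an illustration of how a Modified Angoff pass mark can be derived, the sketch below averages judges' estimates of how a minimally competent candidate would perform on each item and sums these across the station. The item identifiers, number of judges and all figures are hypothetical; this is a simplified sketch, not the College's implementation.

```python
# Illustrative Modified Angoff calculation (hypothetical figures, not RACP data).
# Each judge estimates, per item, the probability that a minimally competent
# candidate would earn the available marks; the station pass mark is the mean
# estimate per item multiplied by the item's marks, summed across the station.

judge_estimates = {
    # item_id: probability estimates from judge 1, judge 2, judge 3
    "item_1": [0.70, 0.60, 0.65],
    "item_2": [0.50, 0.55, 0.45],
    "item_3": [0.80, 0.75, 0.85],
}
item_marks = {"item_1": 2, "item_2": 3, "item_3": 1}  # marks available per item

pass_mark = sum(
    (sum(estimates) / len(estimates)) * item_marks[item]
    for item, estimates in judge_estimates.items()
)
print(f"Static station pass mark: {pass_mark:.2f} of {sum(item_marks.values())}")
```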

Live stations are assessed using a pre-determined marking guide, a Global Assessment Rating (GAR) and ratings from the Professional Competencies Rating Scale (PCRS), where relevant. Examiners independently mark the candidate’s performance, then agree on the GAR and a consensus score for each component of the station. Pass marks for live stations are set using the Borderline Regression method.
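
For context, Borderline Regression typically fits a line through candidates' station scores against their Global Assessment Ratings and takes the predicted score at the borderline rating as the pass mark. The sketch below uses invented cohort data and an assumed 5-point GAR scale with the borderline category at 3; it is illustrative only, not the College's procedure.

```python
# Illustrative Borderline Regression (hypothetical cohort data, not RACP results).
# Station scores are regressed on Global Assessment Ratings (GAR); the pass mark
# is the score the fitted line predicts at the assumed "borderline" rating.

gar_ratings = [1, 2, 2, 3, 3, 4, 4, 5]            # assumed 5-point GAR scale
station_scores = [8, 11, 12, 15, 14, 18, 19, 22]  # consensus marking-guide scores
BORDERLINE_GAR = 3                                # assumed borderline category

n = len(gar_ratings)
mean_x = sum(gar_ratings) / n
mean_y = sum(station_scores) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(gar_ratings, station_scores))
    / sum((x - mean_x) ** 2 for x in gar_ratings)
)
intercept = mean_y - slope * mean_x

pass_mark = intercept + slope * BORDERLINE_GAR
print(f"Live station pass mark: {pass_mark:.2f}")
```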

Your score for live stations is based on 2 components:

  • marking guide scores specific to that station (60%)
  • professional competencies ratings (40%)

The overall pass mark is the sum of the station pass marks. You pass the exam if your aggregate score is greater than or equal to the overall pass mark and you pass at least 6 stations.
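
To make the pass rule concrete, the sketch below applies the 60/40 weighting and the two pass conditions described above to invented scores and pass marks. Only the weights, the 10-station format and the pass rule come from this page; how the two components are scaled before weighting, and every number used, are assumptions for illustration.

```python
# Illustrative pass decision (hypothetical scores and pass marks, not RACP data).

def live_station_score(guide_score: float, competency_score: float) -> float:
    # Assumes both components are on the same scale before the 60/40 weighting.
    return 0.6 * guide_score + 0.4 * competency_score

# (station_score, station_pass_mark) for 10 equally weighted stations.
stations = [
    (live_station_score(70, 65), 60), (live_station_score(55, 60), 62),
    (68, 60), (72, 65), (58, 61), (66, 60),
    (61, 59), (74, 70), (52, 60), (69, 64),
]

aggregate = sum(score for score, _ in stations)
overall_pass_mark = sum(mark for _, mark in stations)
stations_passed = sum(score >= mark for score, mark in stations)

passed = aggregate >= overall_pass_mark and stations_passed >= 6
print(f"Aggregate {aggregate:.1f} vs overall pass mark {overall_pass_mark}; "
      f"stations passed: {stations_passed}; result: {'pass' if passed else 'fail'}")
```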

See previous exam pass rates.


Professional Competencies


Professional competencies scores are determined using a pre-set rating scale for: 

  • Quality and safety of physical examination
  • Quality and safety of history taking
  • Communication
  • Judgement and decision making
  • Medical expertise

Not all domains are relevant to every station.

Professional Competencies Rating Scales

Quality assurance

A robust quality assurance framework governs the exams to ensure fairness, accuracy and consistency across all assessments. Each stage of the exam process follows detailed business rules tailored to the exam’s purpose, format and potential risks.

From planning and development with relevant committees through to topic selection, station design, role player and patient recruitment, examiner recruitment and calibration, stringent measures uphold rigorous assessment standards.

Data integrity is prioritised through comprehensive quality checks before results are finalised, while results meetings and ratification procedures provide additional oversight, particularly for candidate results close to the minimum expected standard and for any unforeseen circumstances that may impact outcomes. 

Clear and timely communication ensures transparency for candidates and stakeholders, and structured feedback mechanisms support continuous improvement of future assessments.


Results

You'll receive your exam result by email. Ensure your contact details are up to date in MyRACP, including your current email address and phone number. If you don’t receive your result email, contact us for assistance.

To ensure fairness, accuracy, and integrity, results undergo the following quality assurance process.

  • Exam data collection: Scores are submitted by examiners.
  • Data verification: Checking for anomalies and data comparison.
  • RACP exam analysis: Data quality assurance review by the Senior Lead, Assessment, and the Data Analyst.
  • Compile results meeting agenda and documents: Compile the results meeting agenda, including results, item analysis, incident reports and post-exam special consideration applications.
  • Results meeting: Discussion of results and decisions on incidents and post-exam special consideration applications. All candidate details are de-identified.
  • Results confirmation: Confirmation of results for release.
  • Results administration: Final preparation of results for release.
  • Results release: Results are typically released mid-week and before end of day to ensure that candidates can access support within business hours.

Please note: The Reconsideration, Review and Appeals By-Law applies exclusively to decisions made by College bodies and doesn't apply to examination results, as these reflect outcomes of assessments against established criteria rather than discretionary decisions.


Candidate feedback

A personalised feedback report summarising the performance band you achieved for each station is included in your results email. For each station, there are 6 performance bands:

  • Excellent performance
  • Better than expected standard
  • Meets expected standard
  • Below expected performance
  • Poor performance
  • Very poor performance

All candidates receive a copy of the general feedback, which provides a cohort-based overview of key strengths and areas for improvement, summarised by examiners at the end of the exam.

See previous years' Module 2 Clinical Assessment and Entry Phase Examination feedback reports:


We understand that you may be seeking more detailed individualised feedback; however, additional personalised feedback can't be provided. You're encouraged to discuss your results with your Advanced Training Supervisor or mentor, who can help you reflect on your performance, recognise your strengths, and identify areas for continued development.

To support equitable access and consistency for all candidates, the AFRM Annual Trainees Meeting includes sessions on examinations, preparation strategies, and related processes, with opportunities for questions.


