Successful Clinical Evaluation Without Drama – How to Avoid the Biggest Pitfalls

Steffen Ruschinski

Article Summary

Avoid the most common MDR pitfalls in Clinical Evaluation Plans and Reports: key issues flagged by Notified Bodies and how to strengthen your CEP, CER, and PMCF for faster, compliant CE marking.

Introduction

Clinical evaluations have become one of the most scrutinised aspects of EU MDR compliance, and for good reason. Incomplete or inconsistent CEPs and CERs are among the top reasons for Notified Body nonconformities, leading to costly delays and potential product withdrawal.

For MedTech leaders, ensuring clinical evaluation success is both a regulatory necessity and a strategic imperative. From securing CE marking to demonstrating real-world clinical benefit, every claim must be backed by sufficient, state-of-the-art clinical evidence.

Top CEP Pitfalls in Clinical Evaluation Plans and How to Avoid Them

Even experienced regulatory teams can overlook critical elements in their CEPs. Below are six common pitfalls, along with practical strategies to stay compliant and confident:

  1. Incomplete Mapping of GSPRs (General Safety and Performance Requirements). Many clinical evaluation plans fail to clearly identify which GSPRs require clinical support, or inconsistently reference them across the technical file.

How to avoid it: Ensure all relevant GSPRs that depend on clinical evidence are explicitly identified in the CEP and consistently reflected in the Clinical Evaluation Report (CER), Risk Management File, and other technical documentation. 

Example: If your device has a measuring function, GSPR 15.1 must be cited and backed by sufficient clinical evidence, such as precision and accuracy data from a study. 
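As a rough illustration of this kind of cross-check, the sketch below flags GSPRs that require clinical support but are not cited in the CEP. The GSPR identifiers and document contents are purely hypothetical examples, not a real technical file.

```python
# Hypothetical sketch: flag GSPRs that need clinical evidence but are not
# cited in the CEP. GSPR IDs and lists here are illustrative only.

def uncovered_gsprs(clinical_gsprs, cep_citations):
    """Return GSPRs that require clinical support but are missing from the CEP."""
    return sorted(set(clinical_gsprs) - set(cep_citations))

# Example: GSPR 15.1 (measuring function) is required but not cited.
required = ["1", "6", "8", "15.1"]
cited_in_cep = ["1", "6", "8"]
print(uncovered_gsprs(required, cited_in_cep))  # ['15.1']
```

In practice such a list would be maintained in the GSPR checklist of the technical file; the point is simply that the mapping should be explicit and machine-checkable rather than implicit.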

  2. Vague Intended Purpose and Indications for Use. A loosely defined intended purpose can lead to nonconformity, and it is often confused or conflated with indications for use.

How to avoid it: Clearly define the intended purpose as required by MDR Article 2(12), and differentiate it from indications for use, using precise language that aligns across all documents. 

  3. Inconsistencies Across the Technical File. Discrepancies between the CEP, CER, and IFU on intended purpose, indications, or performance claims are a red flag for reviewers.

How to avoid it: Implement a cross-document review process to align language and claims across the technical file, including labelling, risk assessments, and clinical evaluation documents. 
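One minimal way to picture such a cross-document review is a consistency check on the intended-purpose wording across documents. The document names and statements below are invented for illustration; real reviews of course require human judgement, not string comparison.

```python
# Hypothetical sketch: compare the intended-purpose statement across
# technical-file documents against the CEP wording. All texts are
# illustrative examples, not real device claims.

def inconsistent_documents(statements):
    """Given {document: intended-purpose text}, return documents whose
    wording differs from the CEP (after whitespace/case normalisation)."""
    norm = lambda s: " ".join(s.lower().split())
    reference = norm(statements["CEP"])
    return sorted(d for d, text in statements.items() if norm(text) != reference)

docs = {
    "CEP": "Device X is intended for continuous monitoring of blood glucose.",
    "CER": "Device X is intended for continuous monitoring of blood glucose.",
    "IFU": "Device X is intended for spot-check measurement of blood glucose.",
}
print(inconsistent_documents(docs))  # ['IFU']
```

The IFU mismatch above is exactly the kind of discrepancy a Notified Body reviewer would flag.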

  4. Undefined Clinical Benefit. Manufacturers often overlook clearly stating how the device improves patient outcomes, public health, or clinical workflows.

How to avoid it: Define the clinical benefit early in the CEP. Tie it directly to measurable, patient-relevant outcomes and link these to your benefit-risk analysis. 

  5. Weak or Missing Acceptance Criteria. Without defined thresholds, reviewers can’t determine whether your evidence adequately supports safety and performance.

How to avoid it: Set clear clinical performance and safety benchmarks using validated methods (e.g., clinical endpoints, PROMs, published literature) and reference current standards or state-of-the-art practice. 

  6. No Clinical Development Plan (CDP) for Legacy Devices. Legacy devices often lack a structured plan for future clinical data generation.

How to avoid it: Include a Clinical Development Plan even for legacy devices. Outline how PMCF activities will supplement existing evidence, and indicate when further clinical investigations may be needed. 

Common CER Pitfalls That Undermine Compliance 

A strong CEP means little if the CER isn’t up to standard. These common issues can derail compliance.

  1. Unclear Update Frequency. Some CERs lack a clear schedule or rationale for updates, which is critical under MDR lifecycle requirements.

How to avoid it: Define when and why the CER will be updated—e.g., annually, or after PMS/PMCF triggers. Align with MEDDEV 2.7/1 Rev. 4 and ensure updates reflect the latest clinical data. 

  2. Poor Definition of State of the Art (SOTA). Manufacturers often reference “state of the art” without clearly defining it or using it to benchmark safety and performance claims.

How to avoid it: Develop a structured SOTA section, supported by literature review and clinical guidelines. Use this to justify performance thresholds and evaluate alternatives. 

  3. Insufficient Clinical Evidence. Claims made in the CER are often not fully supported by data, especially for newer or higher-risk devices.

How to avoid it: Present robust, high-quality clinical data to support each claim, whether from your own studies, PMS data, or the literature. Avoid cherry-picking or over-relying on equivalence unless it is well substantiated. 

  4. Incomplete Benefit-Risk Analysis. Manufacturers sometimes omit key data or fail to quantify clinical benefit in relation to risk.

How to avoid it: Quantify benefit in patient-relevant outcomes and ensure risk is assessed using current post-market data, clinical feedback, and any residual hazards. 

  5. Undefined Device Lifetime. MDR requires a clearly defined expected lifetime. Leaving it open-ended can delay CE marking.

How to avoid it: Provide a realistic, evidence-backed service life for the device, including rationale (e.g., material wear, calibration needs, shelf life). 

  6. Weak Clinical Evaluation Strategy. Some CERs lack a clear rationale for their chosen evaluation approach or fail to justify how MDR Article 61 applies.

How to avoid it: Clearly state your clinical evaluation strategy. This should include justification for equivalence (if used), level of evidence based on risk class, and how PMCF will bridge any gaps. 

Top 5 Common Nonconformities from Notified Bodies, and How to Prevent Them

Notified Bodies frequently flag issues like unclear safety parameters, weak equivalence claims, and unsupported clinical data. These gaps can delay CE marking and trigger rework. Addressing them early helps ensure audit readiness and compliance.

  1. Undefined Safety and Performance Parameters
  • No thresholds defined. 
  • Use of outlier data. 
  • Inappropriate benchmarks. 

Fix: Define, justify, and align all safety/performance thresholds with clinical evidence and SOTA. 

  2. Weak Equivalence Claims
  • Partial equivalence cited. 
  • Poor scientific justification. 

Fix: Follow MDCG 2020-5. Demonstrate full equivalence across clinical, technical, and biological dimensions with solid literature and data comparisons. 

  3. Gaps in PMCF Activities
  • No plan for specific populations (e.g., adolescents). 
  • PMCF data not aligned with safety/performance outcomes. 

Fix: Design robust PMCF plans that target real data gaps and generate measurable evidence. 

  4. Poorly Executed Literature Reviews
  • No separation of Device Under Evaluation (DuE) and SOTA. 
  • Incomplete or unjustified literature search methods. 

Fix: Submit a full literature search protocol, report, and full-text articles. Clearly define how each piece of literature supports SOTA or DuE. 

  5. Unsupported Clinical Claims
  • Claims in marketing materials not backed by CER data. 

Fix: Map every public clinical claim (including on your website) back to a specific section in the CER with evidence support. 
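A simple traceability table captures the idea: every public claim points at a CER section, and anything unmapped is a gap. The claim texts and CER section numbers below are invented for illustration.

```python
# Hypothetical sketch: trace each public clinical claim to the CER section
# that substantiates it, and flag unmapped claims. Claims and section
# references are illustrative examples only.

claim_to_cer = {
    "Reduces procedure time": "CER 7.2 (performance outcomes)",
    "Improves wound healing": "CER 7.3 (clinical benefit)",
}

marketing_claims = [
    "Reduces procedure time",
    "Improves wound healing",
    "Pain-free application",  # appears on the website but not in the CER
]

unsupported = [c for c in marketing_claims if c not in claim_to_cer]
print(unsupported)  # ['Pain-free application']
```

Any claim in the `unsupported` list either needs evidence added to the CER or must be removed from public materials before audit.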

Final Tip: Align with the CEAR Before Submission

Before submitting your clinical evaluation, compare it against the Clinical Evaluation Assessment Report (CEAR) template to ensure completeness, consistency, and audit-readiness. 

Staying ahead of these common pitfalls can save time, reduce costs, and ensure a smoother path to CE marking and patient safety outcomes. 

Disclaimer. The views and opinions expressed in this article are solely those of the author and do not necessarily reflect the official policy or position of Test Labs Limited. The content provided is for informational purposes only and is not intended to constitute legal or professional advice. Test Labs assumes no responsibility for any errors or omissions in the content of this article, nor for any actions taken in reliance thereon.
