Software as a Medical Device: How to Register an AI Healthcare Service in Russia in 2026


A mobile app analyzes a photo of a mole and tells the user: "High risk of melanoma. Consult an oncologist."

A neural network examines a CT scan and points a radiologist toward signs of pneumonia. An algorithm processes an Electronic Health Record (EHR) and warns a physician about drug-to-drug interactions.

Five years ago, products like these occupied a legal gray zone. Developers could label their neural networks "wellness services" and launch them without registration.

In 2026, that path is closed.

If your algorithm influences medical decisions, it is subject to the same rules as a CT scanner or a defibrillator.

This article covers the current state of Software as a Medical Device (SaMD) regulation in Russia, the Eurasian Economic Union (EAEU), and globally. We examine where the line falls between a standard app and a medical device (MD), which standards entered into force in 2025–2026, and what a developer must do to bring an AI product to market legally.


When Code Could Be Called a Service

Until recently, health-related software occupied an undefined position in the regulatory system. Historically, medical devices were physical objects: scalpels, ventilators, pacemakers. Software entered the definition of a medical device comparatively late.

Federal Law No. 323-FZ "On the Fundamentals of Health Protection" has included specialized software in the definition of medical devices since 2011. Article 38 defines a medical device as "any instruments, apparatus, appliances, equipment, materials and other products, including specialized software, intended by the manufacturer for the prevention, diagnosis, treatment, and medical rehabilitation of diseases." The practical application of this provision, however, took years to develop.

Through the early 2020s, many developers of diagnostic algorithms positioned their products as information services, sidestepping registration. The International Medical Device Regulators Forum (IMDRF) had formulated the definition of SaMD back in 2013: software intended to be used for one or more medical purposes that performs those purposes without being part of a hardware medical device. The definition covers applications running on smartphones, PCs, or in the cloud.

Russian regulators initially had no clear criteria for distinguishing medical software from general-purpose applications. A fitness tracker counting steps obviously requires no registration. An app that analyzes heart rate and recommends the user "see a cardiologist" sat in a different category, one that lacked a defined answer.

The shift came in 2020–2021, when Roszdravnadzor increased its scrutiny of digital products. The COVID-19 pandemic accelerated this: the sudden proliferation of AI diagnostic systems for lung CT analysis forced the regulator to build a working framework for software-based medical devices rapidly.


The 2026 Regulatory Landscape

What SaMD Is and How It Differs from a Wellness App

The boundary between SaMD and a standard application is determined by the product’s intended use. Per the IMDRF definition, adopted by regulators in most countries, SaMD is software intended for the prevention, diagnosis, treatment, or monitoring of diseases.

Table 1: Distinguishing SaMD from Wellness Applications

| Feature | Wellness (not an MD) | SaMD (Medical Device) |
| --- | --- | --- |
| Intention | General well-being | Medical purposes |
| Example function | Calorie counting | Analyzing diabetes risk from data |
| What it tells the user | "You walked 5,000 steps" | "Signs of arrhythmia detected" |
| Registration | Not required | Mandatory |
| Liability | Consumer protection law | Medical device regulations |

Russian legislation codifies this in FZ-323. If your application states a medical purpose, it falls under regulation as a medical device. The deciding factor is the intended use declared by the manufacturer, not the technical implementation.

Four characteristics push a product from Wellness into SaMD: reference to a specific disease ("For diabetics," "For arrhythmia detection"); data interpretation ("Signs of atrial fibrillation detected") rather than mere display ("Heart rate: 80 bpm"); prediction of disease development risk; and recommendations regarding medication or physician consultation.
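These characteristics can be read as a simple triage rule. The sketch below is a toy illustration, not a regulatory test; the assumption that any single trigger suffices to warrant a closer look is ours, and all field names are invented:

```python
def is_likely_samd(claims: dict) -> bool:
    """Toy triage of an app description against the four characteristics.
    Assumption (ours): any single trigger is enough to flag the product
    for a proper regulatory qualification, not a legal determination."""
    triggers = (
        claims.get("names_specific_disease", False),       # "For diabetics"
        claims.get("interprets_data", False),              # "Signs of AF detected"
        claims.get("predicts_disease_risk", False),        # risk prediction
        claims.get("recommends_treatment_or_visit", False) # "see a cardiologist"
    )
    return any(triggers)

print(is_likely_samd({"interprets_data": True}))  # True
print(is_likely_samd({}))                         # False
```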

SaMD Risk Classification

The IMDRF proposed a four-tier classification system for SaMD, based on two factors: the significance of the information for medical decision-making, and the severity of the patient’s health condition.

Table 2: SaMD Classification by IMDRF, EAEU, and EU MDR

| IMDRF Category | EAEU Class | EU MDR Class | Example |
| --- | --- | --- | --- |
| I (lowest risk) | 1 | I | Medication trackers, PACS without analytics |
| II | 2a | IIa | Dermatoscope apps for screening |
| III | 2b | IIb | CADe/CADx for CT/MRI analysis; CDSS |
| IV (highest risk) | 3 | III | Autonomous diagnosis without physician verification |

In Russia and the EAEU, SaMD is classified under EEC Collegium Decision No. 173 of December 22, 2015 (as amended in 2025). Software intended for diagnosis or therapy typically falls into Class 2b or 3. Most AI diagnostic systems are Class 2b.

The European MDR 2017/745 contains a dedicated rule for software. Rule 11 classifies software that provides information used in decisions for diagnostic or therapeutic purposes as Class IIa at minimum. Where such decisions could cause death or irreversible deterioration, the software moves to Class III.


New GOST Standards for Medical AI

On January 1, 2025, the first package of Russian national standards for medical AI entered into force. On January 1, 2026, additional standards followed.

Standards in force since January 1, 2025:
GOST R 71671-2024, "Clinical Decision Support Systems (CDSS) using artificial intelligence. Basic principles," sets requirements for physician-AI interaction ergonomics and for Explainable AI (XAI). The system must not merely issue a recommendation; it must provide a rationale.

Standards in force since January 1, 2026:
GOST R 72356-2025, "AI-based predictive analytics systems in clinical medicine for EHR data analysis. Test methods," governs AI systems working with text-based Electronic Health Records via NLP. It introduces specific metrics for predictive models, including the C-index and calibration curves.
GOST R 72357-2025, "CDSS using AI for drug therapy data analysis. Dataset formation methods," defines requirements for CDSS in pharmacovigilance and drug-interaction checking.
GOST R 59921.11-2025, "AI systems in clinical medicine. Datasets for algorithm testing. Methods for dataset universality and structure control," makes clear that training a model on data from a single clinic and declaring it ready for the whole country is no longer acceptable. The testing dataset must cover real-world clinical variability: equipment diversity, scanning protocols, patient demographics. The standard also sets strict requirements for Ground Truth quality, with labelling verified by an expert panel or confirmed clinically.

These standards build on the GOST R 59921 series established in 2021–2023, which covers clinical evaluation, data requirements, and information security. By 2026, Russia’s national standard ecosystem for medical AI spans approximately 50 documents.

As of September 2025, 48 AI-based medical devices had been registered in Russia, 43 of them developed by domestic companies. The primary focus areas are radiology image analysis (CT, MRI, X-ray), thoracic pathology screening, and early-stage oncology detection.


EU AI Act

The European Artificial Intelligence Act (Regulation (EU) 2024/1689) entered into force in August 2024 and applies in stages.

Table 3: EU AI Act Implementation Timeline for Medical Devices

| Date | Event |
| --- | --- |
| August 2024 | EU AI Act enters into force |
| February 2025 | AI literacy requirements for personnel |
| August 2025 | Penalties for non-compliance become enforceable |
| August 2026 | Requirements for high-risk systems (Annex III) |
| August 2027 | Full requirements for AI-enabled medical devices (Art. 6(1)) |

The EU AI Act classifies AI systems embedded in medical devices as high-risk. Per Article 6, any medical device subject to third-party assessment by a Notified Body under the MDR automatically falls under dual regulation: MDR for clinical safety, and the AI Act for ethics, transparency, and fundamental rights.

Russian companies planning to export to the EU therefore face two distinct sets of compliance requirements simultaneously.


Technical Requirements for SaMD Development

International Lifecycle Standards

Four international standards govern medical software development.

IEC 62304, "Medical device software. Software life cycle processes," defines lifecycle processes and requires documentation of requirements, design, coding, verification, release, and maintenance. The standard divides medical software into three safety classes based on potential harm from failure. Class A covers software whose failure cannot cause injury (minimal documentation required); Class B covers software whose failure may cause minor injury (architecture-level verification required); Class C covers software whose failure may cause serious injury or death (maximum requirements, including detailed design documentation and module-level verification). In November 2025, a draft revision was published proposing to replace this three-class model with a two-tier "rigor levels" framework.
ISO 14971, "Medical devices. Application of risk management to medical devices," establishes the risk management process across the full product lifecycle. For SaMD, risk identification related to software failures, faulty input data, and algorithmic errors is especially critical.
ISO 13485, "Medical devices. Quality management systems," defines requirements for the manufacturer's Quality Management System (QMS). Without a certified QMS, market entry is not possible.
IEC 82304-1, "Health software. General requirements for product safety," covers all medical software, including SaMD.

AI and Machine Learning Specifics

Traditional software is deterministic: the same input always produces the same output. Machine learning algorithms work differently. Their behaviour is governed by training data and may be opaque even to the developers.

Regulators distinguish two types of AI systems by update behaviour. Fixed AI means the algorithm is locked at the time of registration; any model change requires a new registration or a formal amendment to the registration dossier. Continuous Learning AI means the algorithm continues training on new data after market release, which requires a dedicated change control mechanism.

The FDA developed the concept of a Predetermined Change Control Plan (PCCP) to manage changes in learning algorithms. A manufacturer defines in advance the boundaries within which the algorithm may change without requiring a new authorization each time.

In Russia, an equivalent approach is implemented through GOST R 59921.3-2021, "Managing changes in continuous-learning AI systems." Although the term PCCP is not used in EEC acts, the standard requires a "Change Management Plan" covering change triggers, descriptions of retraining data, and a validation methodology for each new version.
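The core idea of such a plan can be sketched as a pre-approved performance envelope: a retrained version that stays inside the envelope may ship under the existing authorization, while one that leaves it triggers a new submission. The names and thresholds below are illustrative assumptions, not values from the standard:

```python
# Illustrative change-control gate for a continuously learning model.
# All field names and thresholds are invented for this sketch.

CHANGE_PLAN = {
    "min_sensitivity": 0.92,  # new version must not fall below this
    "min_specificity": 0.88,
    "max_auc_drop": 0.01,     # allowed AUC regression vs. the current version
}

def version_within_plan(current: dict, candidate: dict) -> bool:
    """True if the retrained model stays inside the pre-approved envelope;
    otherwise the change would require a new regulatory submission."""
    if candidate["sensitivity"] < CHANGE_PLAN["min_sensitivity"]:
        return False
    if candidate["specificity"] < CHANGE_PLAN["min_specificity"]:
        return False
    if current["auc"] - candidate["auc"] > CHANGE_PLAN["max_auc_drop"]:
        return False
    return True

current = {"sensitivity": 0.94, "specificity": 0.90, "auc": 0.960}
candidate = {"sensitivity": 0.95, "specificity": 0.91, "auc": 0.955}
print(version_within_plan(current, candidate))  # True
```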

Data Requirements and De-identification

GOST R 59921.11-2025 introduces universality controls: the test dataset must cover the variability of real clinical practice, and Ground Truth labelling must meet strict quality requirements. A model trained on data from one institution cannot be presented as nationally validated.

From September 1, 2025, new requirements for personal data de-identification under Government Decree No. 1154 apply. Five methods are permitted: Tokenization, Generalization, Splitting, Shuffling, and Statistical Processing.
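By way of illustration, here is a minimal Python sketch of three of those methods: generalization, tokenization, and shuffling. The implementations are simplified assumptions for the example, not the procedures prescribed by the decree:

```python
import hashlib
import random

def generalize_age(age: int) -> str:
    """Generalization: replace an exact age with a 10-year band."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def tokenize_id(patient_id: str, salt: str) -> str:
    """Tokenization: replace a direct identifier with a keyed token
    (keyed SHA-256 here; a real system would manage the key properly)."""
    return hashlib.sha256((salt + patient_id).encode()).hexdigest()[:16]

def shuffle_column(values: list, seed: int) -> list:
    """Shuffling: permute a quasi-identifier column across records,
    breaking the link between the value and the individual row."""
    rng = random.Random(seed)
    out = values[:]
    rng.shuffle(out)
    return out

record = {"patient_id": "RU-000123", "age": 47}
deidentified = {
    "patient_token": tokenize_id(record["patient_id"], salt="demo-salt"),
    "age_band": generalize_age(record["age"]),
}
print(deidentified["age_band"])  # 40-49
```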

Cybersecurity

From September 1, 2025, cybersecurity requirements must be included in SaMD technical documentation under Ministry of Health Order No. 181n of April 11, 2025. The cybersecurity section must cover authentication and access control methods (role-based model, multi-factor authentication for cloud-based SaMD), data integrity protection (cryptographic hashes), secure communication channels (TLS 1.3 or higher), a software support plan specifying security patch release timelines, and threat modelling specific to the medical environment.

For AI-based devices, one additional requirement applies: a mandatory module for automated data transmission to the Roszdravnadzor automated information system (AIS).
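As a small illustration of the data-integrity item above, storing a cryptographic hash alongside a diagnostic report makes any later modification detectable. A minimal sketch, with invented report content:

```python
import hashlib

def report_digest(report_bytes: bytes) -> str:
    """SHA-256 digest stored alongside the report at the time of signing;
    any later change to the report yields a different digest."""
    return hashlib.sha256(report_bytes).hexdigest()

original = b'{"study": "CT-0001", "finding": "no pathology"}'
stored_digest = report_digest(original)

tampered = b'{"study": "CT-0001", "finding": "pathology"}'
print(report_digest(original) == stored_digest)  # True
print(report_digest(tampered) == stored_digest)  # False: tampering detected
```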


Registration Procedure in Russia

National and EAEU Procedures

Until December 31, 2027, applicants may choose between the national procedure (Government Decree No. 1684 of November 30, 2024) and the EAEU procedure (EEC Council Decision No. 46 of February 12, 2016).

A protocol signed by EAEU states on December 29, 2025 extended the national registration option to the end of 2027; the original plan had been to switch exclusively to EAEU rules from January 1, 2026. Government Decree No. 2214 of December 30, 2025 extended the national registration rules through December 31, 2028.

Table 4: Comparison of Registration Procedures

| Parameter | National (Decree 1684) | EAEU (Decision No. 46) |
| --- | --- | --- |
| Application deadline | Until December 31, 2027 | Indefinite |
| MA validity area | Russia only | All EAEU countries |
| Competent authority | Roszdravnadzor | Roszdravnadzor + EEC |
| Expert organizations | VNIIIMT, TsKMIKEE | EAEU-accredited bodies |
| MA duration | Perpetual | Perpetual |
| Submission format | Digital only (registry model) | Digital |

From March 1, 2025, Decree No. 1684 introduced material changes to the national procedure: electronic-only document submission, elimination of paper Marketing Authorizations in favour of a registry model, and specific provisions for AI-based software.

Registration Stages

The registration of SaMD proceeds through several defined stages.

Technical testing is conducted in an accredited testing laboratory (such as VNIIIMT). Functional characteristics are verified, along with electrical safety where applicable and electromagnetic compatibility. For AI systems, GOST R 59921.11-2025 requirements for testing datasets apply.
Toxicological studies are generally not required for SaMD, as software does not contact the human body.
Clinical investigation of medical devices is conducted in accredited medical organizations. For diagnostic SaMD, sensitivity, specificity, and predictive values are assessed. From September 2025, results are submitted exclusively in electronic form through the Roszdravnadzor system.
Quality, efficacy, and safety assessment is performed by an expert organization on the basis of submitted technical and clinical documentation.
State registration is carried out by Roszdravnadzor following positive expert conclusions.

Clinical Investigation of AI Systems

For diagnostic AI, clinical validation carries particular weight. Demonstrating high accuracy on test data is not sufficient; the system must show that it improves clinical outcomes in real-world practice.

The key metrics for diagnostic AI: sensitivity (the share of correctly identified disease cases), specificity (the share of correctly excluded healthy cases), Positive Predictive Value (PPV, the probability of disease given a positive result), Negative Predictive Value (NPV, the probability of a disease-free status given a negative result), and Area Under the ROC Curve (AUC, the algorithm’s overall ability to distinguish between classes).
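All four count-based metrics follow directly from a 2x2 confusion matrix. The helper below is a plain-Python illustration with invented counts, not a regulatory artifact:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard diagnostic-accuracy metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # share of disease cases found
        "specificity": tn / (tn + fp),  # share of healthy correctly excluded
        "ppv": tp / (tp + fp),          # P(disease | positive result)
        "npv": tn / (tn + fn),          # P(healthy | negative result)
    }

# Hypothetical validation set: 1,000 studies, 100 with confirmed disease
m = diagnostic_metrics(tp=90, fp=45, tn=855, fn=10)
print(round(m["sensitivity"], 2))  # 0.9
print(round(m["specificity"], 2))  # 0.95
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the validation set, which is why the composition of the clinical dataset matters so much.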

For predictive models analyzing EHR data, GOST R 72356-2025 additionally requires the C-index and calibration curves.
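The C-index can be sketched in a few lines. This is a simplified Harrell's C that assumes all events were observed (real EHR survival data would need censoring-aware handling); the times and scores are invented:

```python
from itertools import combinations

def c_index(event_times: list, risk_scores: list) -> float:
    """Simplified Harrell's C-index, no censoring: the share of comparable
    patient pairs where the model assigns the higher risk to the patient
    whose event occurred earlier. Tied scores count as half-concordant."""
    concordant, comparable = 0.0, 0
    for i, j in combinations(range(len(event_times)), 2):
        if event_times[i] == event_times[j]:
            continue  # not a comparable pair
        comparable += 1
        earlier, later = (i, j) if event_times[i] < event_times[j] else (j, i)
        if risk_scores[earlier] > risk_scores[later]:
            concordant += 1
        elif risk_scores[earlier] == risk_scores[later]:
            concordant += 0.5
    return concordant / comparable

times = [2, 5, 9, 12]          # months to event (all events observed)
risks = [0.9, 0.7, 0.8, 0.1]   # model's predicted risk per patient
print(c_index(times, risks))   # 5 of 6 comparable pairs are concordant
```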

Roszdravnadzor requires that data used for clinical validation not overlap with data used during training or technical validation. Maintaining detailed Data Provenance records is essential for demonstrating this.
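One simple way to demonstrate non-overlap is to keep content fingerprints of every training study and check each validation study against them. A minimal sketch (the function names and byte payloads are illustrative):

```python
import hashlib

def fingerprint(study_bytes: bytes) -> str:
    """Content hash used as a provenance fingerprint for one study."""
    return hashlib.sha256(study_bytes).hexdigest()

def find_leaks(training_fps: set, validation_studies: list) -> list:
    """Return fingerprints of validation studies already seen in training;
    an empty result supports the non-overlap claim."""
    return [fp for fp in map(fingerprint, validation_studies)
            if fp in training_fps]

train_fps = {fingerprint(b"study-A"), fingerprint(b"study-B")}
leaks = find_leaks(train_fps, [b"study-B", b"study-C"])
print(len(leaks))  # 1: "study-B" appears in both sets
```

Raw-byte hashing only catches exact duplicates; in practice the same patient can appear as different files, so provenance records should also track patient-level identifiers before de-identification.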


Post-Market Surveillance 2.0

Registration does not end regulatory obligations. The manufacturer is required to conduct ongoing post-market surveillance for medical devices (PMS).

From July 2025 (Roszdravnadzor Order No. 4472), manufacturers of AI-based medical devices must transmit data on all failures and errors in their products to the Roszdravnadzor information system in automated mode. The requirement applies to all 48 AI systems registered in Russia.

The transmitted data includes records of technical failures and critical software errors, operational statistics (number of processed studies), input data quality metrics (percentage of images rejected by the algorithm as unsuitable), and AI algorithm output results.
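To make the shape of such a report concrete, here is a hypothetical payload covering those four categories. The field names, registration number, and values are invented for illustration; they are not the regulator's actual schema:

```python
import json

# Hypothetical monthly PMS report; all fields are illustrative assumptions.
pms_event = {
    "device_reg_number": "RZN-2025/XXXXX",   # placeholder, not a real number
    "software_version": "2.4.1",
    "period": {"from": "2026-01-01", "to": "2026-01-31"},
    "studies_processed": 12840,              # operational statistics
    "rejected_input_share": 0.031,           # inputs deemed unsuitable
    "critical_errors": [                     # technical failures
        {"code": "E-TIMEOUT", "count": 3},
    ],
    "findings_summary": {                    # algorithm output results
        "positive": 912, "negative": 11530,
    },
}

# Round-trip through JSON, as an automated transmission channel would
payload = json.dumps(pms_event)
print(json.loads(payload)["studies_processed"])  # 12840
```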

For manufacturers, this creates a real-time picture of AI performance in the field (PMS 2.0) and requires substantial architectural modifications to ensure compatibility with the regulator’s data transmission protocols.

Beyond automated monitoring, the manufacturer must collect performance data from real-world use, track adverse events and incidents, report serious incidents to Roszdravnadzor, update the risk assessment based on post-market data, and revise the device when problems are identified.


Developer’s Action Plan

The first step is confirming your product's status. If it has a medical purpose (prevention, diagnosis, treatment, or monitoring of disease), it is a medical device subject to registration. If you position it as a wellness app, confirm that its functionality and marketing materials contain no medical claims. The phrase "identifies disease risk" converts an app into a medical device.

Once the status is confirmed, you select the risk class and registration procedure. Use the EAEU classification rules (EEC Decision No. 173, as amended in 2025) or national rules to determine the class. AI diagnostic systems typically fall into Class 2b or higher. You also decide whether you need registration in Russia only (national procedure, available until December 31, 2027) or across the entire EAEU market (union procedure, indefinite).

A Quality Management System certified to ISO 13485 is a prerequisite; without it, registration is not possible. Start with a gap analysis of current processes and build a compliance roadmap. In parallel, restructure software development to conform with IEC 62304: determine the safety class of your software and establish the corresponding documentation level.

Risk management per ISO 14971 requires a risk file covering the full product lifecycle. For AI systems, risks related to data quality and algorithmic behaviour receive particular attention. On the data side, training and testing datasets must conform to GOST R 59921.11-2025, and de-identification must follow the methods prescribed in Decree No. 1154. Clinical validation must be conducted on data independent from training.

From September 1, 2025, a cybersecurity section is mandatory in any software medical device’s technical documentation. For AI-based devices, this section must include a module for automated data transmission to the Roszdravnadzor information system. Once technical testing and clinical investigation of the device are complete and yield positive results, the registration application is submitted through the Roszdravnadzor digital portal.


Sources: Federal Law No. 323-FZ of November 21, 2011 «On the Fundamentals of Health Protection» (as amended June 7, 2025); EEC Council Decision No. 27 of February 12, 2016; EEC Council Decision No. 46 of February 12, 2016 (as amended March 30, 2023); EEC Collegium Decision No. 173 of December 22, 2015 (2025 edition); Government Decree No. 1684 of November 30, 2024 (as amended October 27, 2025); Government Decree No. 2214 of December 30, 2025; Government Decree No. 1154 of September 1, 2025; Ministry of Health Order No. 181n of April 11, 2025; Roszdravnadzor Order No. 4472, July 2025; GOST R 71671-2024; GOST R 72356-2025; GOST R 72357-2025; GOST R 59921.11-2025; GOST R 59921.3-2021; IEC 62304:2006+A1:2015; ISO 14971:2019; ISO 13485:2016; Regulation (EU) 2024/1689 (EU AI Act); Regulation (EU) 2017/745 (EU MDR); IMDRF/SaMD N10, 2013.
