Digital transformation has been declared a national goal through 2030, with a view to 2036. This is institutionally enshrined in Decree of the President of the Russian Federation No. 309 of May 7, 2024, and is complemented by the constitutional provision on ensuring the security of the individual, society, and the state in the application of information technology and the circulation of digital data (clause «m» of Article 71 of the Constitution of the Russian Federation).
Against this backdrop, Russian legal scholarship is registering a shift from an auxiliary to a decisive influence of the digital environment on the content and exercise of the right to privacy, calling for new legal architectures and guarantees while simultaneously preventing «totalitarian» digital practices and algorithmic inequality.
The study draws on the 2025 Human Rights Council (HRC) report, contemporary theoretical and legal works, and the 2024–2025 legislative framework to propose a holistic model for privacy protection in Russia’s digital space [1]. In the constitutional sense, the right to privacy encompasses an individual’s guaranteed control over their personal information and the prevention of disclosure of personal data; in the context of online platforms, Big Data, and AI, this requires the restoration of «effective control» as an essential element of the content of the right. Scholarly doctrine emphasizes the vulnerability of privacy arising from the combination of mass collection, aggregation, and predictive «calculation» of hidden personal characteristics, which turns a digital trace into a detailed profile with high manipulative power. This is precisely why the concepts of the «right to be forgotten,» the «right to anonymity,» and the «right to metadata confidentiality» are developing rapidly and require regulatory specification and procedural support in Russian law [2].
The HRC report systematically identifies the sources of threats to rights: the speed of change and the «combinatorial explosion» of technologies; weak software reliability; the expansion of the «gray zone» of data circulation; the transition from the collection to the calculation of personal data; cyberfraud; hidden «invisible» decision-making in AI; the corruption potential of the digital environment; and the vulnerability of «original» digital documents. These factors radically increase the likelihood of data leaks and of discrimination through algorithmic profiling, and they replace judicial guarantees with incontestable «AI decisions,» producing parallel systems of «digital justice» without due process or presumptions. In the socio-political dimension, the report identifies risks of forced digitalization, «uberization» with the erosion of social guarantees, attempts at social ratings, and the excessive digitalization of education and healthcare without sufficient legal barriers or human rights impact assessments [1].
The strategic vector is set by Decree No. 309 [4]: alongside digital maturity, it sets the goals of network sovereignty and the creation of a system to combat ICT crimes, which form the basis for the special federal laws and regulatory measures of 2024–2025. Federal Law No. 41-FZ of 01.04.2025 establishes a state information system (GIS) for countering offenses committed using ICT and adjusts financial, banking, and telecom processes to reduce damage from cyberfraud, including restrictions on «involuntary» cash withdrawals, call labeling, and interdepartmental exchange within the GIS, phased in through 2026 [5]. Administrative liability for personal data leaks has been tightened: Federal Law No. 420-FZ of 30.11.2024 introduces differentiated offenses and «turnover fines» linked to revenue for repeat violations, as well as special prohibitions on coercing biometric identification in consumer relations, entering into force in 2025 [6].
The digital profile is becoming pervasive, blurring the line between the private and public spheres. This necessitates regulatory restrictions on the collection and processing of sensitive data, including geodata and biometrics, as well as procedures for prompt data deletion (the «right to be forgotten»). Courts and international standards have interpreted «personal data» broadly, extending it to professional and business information, which provides guidance for national lawmaking in refining definitions and the boundaries of permissible interference. Against this backdrop, the ban on imposing biometrics and the increased liability for leaks in 2024–2025 are steps toward redressing the power imbalance between data controllers and data subjects, but they must be supplemented with substantive procedural guarantees for challenging «automated decisions» and auditing algorithms [3].
The mass collection, indefinite storage, and covert resale of personal data, including profiling and algorithmic discrimination, have become a systemic reality of the Russian digital space, increasing the risks of cyberfraud and undermining privacy [1].
Federal Law No. 41 launches a state system to combat ICT crimes and a set of measures in the banking and telecom sectors to curb fraudulent transactions and interdepartmental exchange, implemented in stages through 2026.
Federal Law No. 420 introduces differentiated offenses and turnover-based fines of up to 3% of revenue (with lower and upper limits), as well as a ban on discrimination for refusing to provide biometric data, creating a new «price» for careless data handling.
A subject’s control over their information, the right to anonymity and metadata confidentiality, and the «right to be forgotten» are essential elements of the right in the digital environment that require precise legislative elaboration.
Systemic threats, including «invisible» AI, the corruption potential of digital systems, the loss of reliability of «digital originals» of documents, the risks of forced inclusion in the Unified Identification and Authentication System (ESIA) and digital platforms, and attempts at social ratings, require strict legal barriers and independent auditing.
Three regulatory challenges stand out. First, current regulation predominantly targets «collection» and «leakage,» while the key problem is the «computation» of sensitive attributes and their subsequent attribution to individuals, which circumvents the traditional consent model and reinforces hidden discrimination in lending, pricing, and employment. Second, the irreversibility and «superconductivity» of digital copies, together with the lack of procedural mechanisms for proving a causal link between an algorithmic decision and a refusal (for example, in hiring), block access to effective legal remedies. Third, the absence of a codified act on the digital environment, vague definitions of «privacy» and «personal data,» and fragmented regulatory frameworks fail to ensure predictability or a balance between individual rights and public interests in digital governance.
Mass ratings and integrated profiles, even under the guise of «individual trajectories,» threaten the presumption of innocence, equal rights, and judicial protection, transforming restrictions on rights into «automated» decisions without procedural appeal or algorithmic transparency. Doctrine and human rights assessments warn of the positive feedback loop of digital ratings (the «social trap» effect) and the inevitable criminalization of indicators, followed by the trading of «points,» which undermines public goals and erodes trust in institutions. Choosing a «middle path» (a ban on mass social ratings, a clear ban on integrated databases, algorithm audits, and the mandatory preservation of analog alternatives) appears to be the only legitimate option for minimizing damage to constitutional guarantees.
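The «turnover fine» mechanism noted above, a percentage of annual revenue clamped between a lower and an upper limit, can be sketched in a few lines. The 3% rate comes from the text; the floor and cap figures below are illustrative assumptions for the sketch, not the amounts fixed by Federal Law No. 420.

```python
def turnover_fine(annual_revenue: float, rate: float = 0.03,
                  floor: float = 20_000_000, cap: float = 500_000_000) -> float:
    """Fine proportional to revenue, clamped to the interval [floor, cap].

    The rate, floor, and cap are illustrative parameters, not statutory values.
    """
    return min(max(rate * annual_revenue, floor), cap)

# For a mid-sized operator, the percentage applies directly:
print(turnover_fine(1_000_000_000))   # 3% of 1 bn RUB -> 30 mn
# For a small operator, the lower limit dominates:
print(turnover_fine(100_000_000))     # floor applies -> 20 mn
# For a very large operator, the upper limit dominates:
print(turnover_fine(50_000_000_000))  # cap applies -> 500 mn
```

The clamp is what distinguishes a turnover fine from a flat percentage: it guarantees a deterrent minimum for small operators while capping exposure for the largest ones.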
The epidemic of remote fraud, with estimated losses of up to hundreds of billions of rubles per year and tens of millions of calls per day, requires an interdepartmental, cross-sectoral response. This became the subject of Federal Law No. 41 and Decree No. 309, which set the direction for creating a state information system and unifying practices among banks and telecom operators. The mechanisms introduced (call labeling, self-imposed bans, restrictions on involuntary cash withdrawals, and data sharing) are necessary but not sufficient, since the root cause lies in the «gray market» for data and the institutional impunity of insiders. Anti-fraud measures will become effective only with the inevitability of punishment for the «sale» of data, independent audits of information systems, and a focus on the perpetrators of leaks.
The expansion of the Unified State Health Information System (EGISZ) and the platformization of education without strict data minimization and viable offline alternatives increase the risk of privacy intrusion and the exposure of vulnerable groups, particularly children, as human rights and theoretical assessments confirm. Proportionality of interference requires prohibiting the «profiling» of children and the analysis of their behavior, limiting private intermediaries’ access to educational and medical data, and ensuring parental control and analog options. The introduction of digital certificates and «portfolios» should be accompanied by a right of objection, transparency of recommendation algorithms, and a prohibition on use beyond the primary purposes, to prevent the functional expansion of processing.
Proposed legal protection model:
- Codification: adoption of the Digital (Information) Code of the Russian Federation in two stages (framework amendments and subsequent codification), which would enshrine the principles, definitions, and institution of «human rights audit,» procedural guarantees for challenging automated decisions, and the liability of operators.
- Material prohibitions: legislative ban on social ratings and integrated «digital profiles» with negative legal consequences; prohibition of the calculation of sensitive data based on indirect indicators and their use outside of expressly permitted legal purposes.
- Liability and prevention: development of «turnover fines» (Federal Law No. 420) taking damages and repetition into account; criminalization of organized insider data trading; mandatory, operator-funded compensation funds for persons affected by personal data leaks.
- Document flow and resilience: maintaining hybrid (electronic-paper) document flow, with paper copies recognized as originals in a number of critical areas; format-migration and backup regulations; independent audits of registry integrity.
- Children and vulnerable groups: prohibiting behavioral analysis and advertising based on minors’ data; restricting educational platforms and prioritizing state-run solutions with minimal data sets.
- Transparency and control: standards for user agreements, disclosure of algorithm logic, an institution of external independent AI audits for «invisible» data processing and discrimination, an audit registry, and mandatory remediation of violations.
Decree No. 309 already prescribes the creation of a system to combat ICT crimes, and Federal Law No. 41 institutionalizes the state information system and a set of banking and telecom measures to prevent fraud, forming the infrastructural foundation for «security by default» in the data economy. Federal Law No. 420 significantly raises the «cost» of data leaks and illegal circulation by introducing turnover sanctions and new offenses, and protects consumers from coercion to provide biometric data, partially answering the Human Rights Council’s calls for increased liability. The next step is to close the «regulatory gaps» around data mining and automated decisions by adopting a codified set of substantive and procedural privacy guarantees, consistent with doctrinal conclusions on the fundamental role of the privacy principle in preventing digital forms of totalitarianism.
Thus, privacy protection in Russia is entering a phase of substantive restructuring: from fragmented regulation to a systemic architecture combining infrastructural measures against cyberfraud, increased liability for leaks, and principles of data minimization and proportional processing, including bans on social ratings and the «gray» mining of sensitive attributes. While preserving constitutional guidelines and national goals (network security and sovereignty), an effective model must enshrine codified definitions and procedural guarantees for data control, human rights auditing, and hybrid document management, ensuring a balance between the interests of the individual, society, and the state in the digital space. Scientific research and human rights expertise confirm that without restoring individuals’ «effective control» over data and the transparency of algorithms, privacy will become illusory, and digital transformation risks encroaching on fundamental freedoms. The «middle path» of intelligent digitalization combined with a strong legal privacy regime is therefore a legal imperative for development in 2025–2030.
References
1. Digital Transformation and Protection of Citizens' Rights in the Digital Space 2.0 [Electronic resource] // Presidential Council of the Russian Federation for Civil Society and Human Rights. URL: https://www.2025.doklad-spch-po-pch-v-tsifre-2.0-2025-06-27_15-54-14_624644.pdf (accessed: 29.11.2025).
2. Volkova, G.E. The Right to Privacy in the Digital Age [Electronic resource] / G.E. Volkova // Philosophy of Law. 2022. No. 4. Pp. 174–180. URL: https://www.pravo-na-neprikosnovennost-chastnoy-zhizni-v-tsifrovuyu-epohu.pdf (accessed: 29.11.2025).
3. Filimonova, E.A. Protection of the Right to Privacy in the Conditions of Digital Transformation of Society [Electronic resource] / E.A. Filimonova // Legal Order and Legal Values. 2025. Vol. 3. No. 1. Pp. 46–52. URL: https://www.zaschita-prava-na-neprikosnovennost-chastnoy-zhizni-v-usloviyah-tsifrovoy-transformatsii-obsches.pdf (accessed: 29.11.2025).
4. On the National Development Goals of the Russian Federation for the Period up to 2030 and for the Period up to 2036: Decree of the President of the Russian Federation of May 7, 2024, No. 309 [Electronic resource]. URL: https://barysh.gosuslugi.ru/natsionalnye-proekty-all/natsionalnye-proekty-2025-2030/normativnye-i-metodicheskie-dokumenty/normativnye-i-metodicheskie-dokumenty-rf/dokumenty_2963.html?ysclid=mj8imjkbxf224518198 (accessed: 29.11.2025).
5. On the Creation of a State Information System for Countering Offenses Committed Using Information and Communication Technologies, and on Amendments to Certain Legislative Acts of the Russian Federation: Federal Law of April 1, 2025, No. 41-FZ [Electronic resource] // ConsultantPlus: legal reference system. URL: https://www.consultant.ru/document/cons_doc_LAW_502182/ (accessed: 29.11.2025).
6. On Amendments to the Code of the Russian Federation on Administrative Offenses: Federal Law of November 30, 2024, No. 420-FZ [Electronic resource] // Garant.ru: legal information portal. URL: https://www.garant.ru/hotlaw/federal/1771084/ (accessed: 29.11.2025).
