Italian Supreme Court on the Geolocation of Company Vehicles: Obligation to Notify the Data Protection Authority Even in Cases of Indirect Identification.
The Italian Court of Cassation, in Order No. 3462 of February 16, 2026 (Pres. Acierno, Rel. Tricomi), clarified a key aspect of the Privacy Code (Legislative Decree 196/2003) in the version in force prior to the GDPR.
The obligation to notify the Data Protection Authority, provided for in Article 37, paragraph 1, letter a), applies to geolocation systems installed on company vehicles even when GPS data is not automatically linked to the employee’s name. It is sufficient that the employer be able to indirectly identify the employee through logical deductions based on ancillary elements, such as the assignment of a specific vehicle or tachograph data.
The trial court had deemed “automatic identification via codes” necessary; the Supreme Court, however, clarified that, for the purposes of the notification obligation, what matters is the concrete risk of indirect profiling, since even such a scenario constitutes the processing of personal data and requires compliance with the obligations set forth in the Privacy Code.
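To make the deductive step concrete, here is a minimal Python sketch, using entirely invented records, of how GPS data keyed only to a vehicle can be joined with a vehicle-assignment list to identify the driver indirectly; the field names and values are hypothetical, not drawn from the case.

```python
# Illustrative only: the GPS log carries no names, yet joining it with
# the assignment list yields the driver. All records are invented.
gps_log = [
    {"vehicle": "VAN-07", "timestamp": "2026-02-16T08:12", "lat": 45.46, "lon": 9.19},
]
assignments = {"VAN-07": "employee_4411"}  # who was assigned which vehicle

for record in gps_log:
    driver = assignments.get(record["vehicle"])  # the deductive link
    print(record["vehicle"], record["timestamp"], "->", driver)
```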
NIS2 and the supply chain.
With NIS2, the security of the ICT supply chain ceases to be merely a technical issue and becomes a matter of governance, management accountability, and the organization’s overall resilience. The directive requires risk management measures that explicitly include the supply chain and relationships with direct and indirect suppliers.
This aspect is directly linked to the GDPR: Article 28 requires the data controller to select only data processors that offer sufficient guarantees, while Article 32 requires technical and organizational measures appropriate to the risk. It follows that the management of ICT suppliers cannot be limited to standard contracts or merely formal due diligence, but requires a concrete assessment of the supplier’s criticality, operational dependencies, data processed, security measures adopted, use of subcontractors, incident response times, and audit rights.
Supply chain risk must be managed ex ante through the selection, classification, verification, and continuous monitoring of critical suppliers. Operationally, this involves structured due diligence, security questionnaires, the collection of documentary evidence, periodic audits, verification of subcontractors, and specific clauses regarding notifications, vulnerabilities, response times, and cooperation in the event of an incident. Added to this is the contractualization of risk: for ICT suppliers that impact relevant services or processing operations, generic clauses are insufficient; clear provisions are required regarding technical and organizational measures, incident reporting, recovery support, vulnerability management, traceability of subcontractors, and data return or deletion.
From a NIS2 perspective, these clauses govern supply chain risk; from a GDPR perspective, they provide concrete implementation of Articles 28 and 32, since the security of the supply chain depends on defined, verifiable, and effectively enforceable obligations.
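As a purely illustrative sketch of the assessment and clause checks described above, the following Python fragment models one way a supplier register might be classified; every field name, threshold, and the scoring rule are our own assumptions, not criteria taken from NIS2 or the GDPR.

```python
from dataclasses import dataclass
from enum import Enum

class Criticality(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

@dataclass
class SupplierAssessment:
    name: str
    processes_personal_data: bool      # GDPR Art. 28 relevance
    supports_essential_service: bool   # NIS2 relevance
    uses_subcontractors: bool
    incident_notification_hours: int   # contractually agreed deadline
    audit_rights_granted: bool

    def criticality(self) -> Criticality:
        # Hypothetical scoring: data sensitivity, operational dependency,
        # and subcontracting drive the classification.
        score = sum([
            self.processes_personal_data,
            self.supports_essential_service,
            self.uses_subcontractors,
        ])
        return Criticality(max(1, score))

    def open_gaps(self) -> list[str]:
        gaps = []
        if not self.audit_rights_granted:
            gaps.append("missing audit-rights clause")
        if self.incident_notification_hours > 24:
            gaps.append("incident notification deadline above 24h")
        return gaps

vendor = SupplierAssessment(
    name="CloudHost Example Srl",      # fictitious supplier
    processes_personal_data=True,
    supports_essential_service=True,
    uses_subcontractors=False,
    incident_notification_hours=48,
    audit_rights_granted=False,
)
print(vendor.criticality(), vendor.open_gaps())
```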
The appointment of a DPO is a mandatory requirement.
By Order No. 68 of February 12, 2026, the Data Protection Authority imposed a fine of 3,000 euros on the Municipality of Aversa for violating the obligations set forth in Article 37, paragraphs 1 and 7, of the GDPR, regarding the designation of the Data Protection Officer (DPO) and the communication of the DPO’s contact information to the Authority.
The ex officio investigation established that the entity had neither formally appointed the DPO nor published the relevant contact details on its official website. The defenses put forward by the Municipality—structural staff shortages, a financial restructuring process, and the timing of upcoming elections—were not deemed sufficient to justify the non-compliance, since the role of the DPO has been, since May 2018, a mandatory and non-negotiable component of public entities’ privacy governance.
Although the Municipality took steps to rectify the situation during the proceedings, the Data Protection Authority nevertheless deemed it necessary to impose the sanction and order the publication of the decision, emphasizing the deterrent function of the measure and reiterating that organizational difficulties do not exempt the data controller from complying with fundamental data protection obligations.
A university has been fined 50,000 euros for the unlawful processing of biometric data.
In a ruling dated January 29, 2026, the Italian Data Protection Authority fined a university 50,000 euros for unlawfully processing the biometric data of numerous participants in online teacher certification courses. The university used a facial recognition system to verify identity and attendance during classes, based on the creation of biometric models derived from students’ facial images and identification documents.
The Authority found that there was no suitable legal basis to justify the use of biometric technologies, which, as they pertain to special categories of data, require strict conditions and enhanced safeguards, especially given the availability of less invasive alternative solutions for attendance monitoring. It also emerged that a Data Protection Impact Assessment (DPIA) had not been conducted prior to the system’s activation, despite the very high number of data subjects involved—over 450 students per class.
During the investigation, the system remained partially in use, with corrective measures deemed inadequate, until its final deactivation. In determining the amount of the fine, the Data Protection Authority took into account, as a mitigating factor, the university’s cooperation and the voluntary cessation of processing.
EDPB and EDPS: Joint Opinion No. 2/2026 on the proposed Digital Omnibus Regulation.
On February 10, 2026, the EDPB and the EDPS adopted Joint Opinion No. 2/2026 on the proposed “Digital Omnibus” Regulation, through which the Commission aims to simplify and streamline the EU digital rules framework, including certain aspects of the GDPR. The Authorities support the goal of reducing complexity but reiterate a key point: simplification cannot come at the cost of weakening the concept of personal data or removing pseudonymized data that can still be traced back to natural persons from the scope of the GDPR. The Opinion welcomes measures that enhance harmonization and legal certainty—such as common templates for data breach notifications and DPIAs—but raises concerns about changes that affect the core of fundamental rights. Simplification is not a compromise on protections, but a call to apply the GDPR in a more straightforward, informed, and demonstrable manner.
Robust compliance thus becomes a factor in reliability and decision-making speed in digital and AI projects, not merely a bureaucratic cost.
AI versus AE: A Lesson on the Connection Between AI, Authors, and Editors Under the AI Act.
Nicola Tilli will give a lecture on the connection between the publishing sector and Artificial Intelligence.
The lecture will take place in Milan, at IULM University, on May 4, 2026, as part of the Master’s program in professions related to literature and publishing, aimed at novelists, authors in general, and allied arts.

AEPD: Agent-Based Artificial Intelligence: How to Govern Autonomy in Data Processing.
On February 18, 2026, the Spanish Data Protection Agency (AEPD) published guidelines dedicated to “agent-based” artificial intelligence, i.e., systems that do not merely generate responses but interact autonomously with the digital environment to pursue complex objectives. From a legal standpoint, the agent’s autonomy radically expands privacy risks: it is no longer sufficient to evaluate inputs and outputs; rather, one must govern a dynamic, adaptive, and only partially predictable process.
In this scenario, principles such as transparency, data minimization, purpose limitation, and privacy by design cannot be managed using standard approaches. If the agent learns from the context, selects sources, performs actions, and modifies its own behavior, the data subject’s control risks becoming merely theoretical, just as generic notices, abstract internal instructions, or purely formal forms of supervision prove insufficient.
The robustness of the system is measured by the organization’s ability to correctly assign roles and responsibilities, reconstruct information flows, define the agent’s operational scope in advance, ensure effective human oversight, and align these aspects with policies, internal procedures, supplier relationships, and accountability mechanisms. Operationally, this requires at least: mapping the areas in which the agent operates and the data it uses; defining functional limits, instructions, and thresholds for human intervention; adapting governance and documentation to actual operations; assessing the impacts on rights in advance; and establishing continuous supervision with periodic review. With agentic AI, therefore, it is not enough to control the tool: its actual behavior must be governed over time.
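To show what “defining the agent’s operational scope in advance” and “thresholds for human intervention” can look like in practice, here is a minimal Python sketch; the policy fields, action names, and numeric limits are illustrative assumptions, not values prescribed by the AEPD.

```python
from dataclasses import dataclass

@dataclass
class AgentPolicy:
    allowed_actions: set[str]          # the agent's defined operational scope
    allowed_data_categories: set[str]  # purpose limitation / minimization
    max_records_per_action: int
    require_human_above: int           # escalation threshold for oversight

@dataclass
class ActionRequest:
    action: str
    data_category: str
    record_count: int

def authorize(policy: AgentPolicy, req: ActionRequest) -> str:
    """Return 'allow', 'escalate' (human review), or 'deny'."""
    if req.action not in policy.allowed_actions:
        return "deny"
    if req.data_category not in policy.allowed_data_categories:
        return "deny"
    if req.record_count > policy.max_records_per_action:
        return "deny"
    if req.record_count > policy.require_human_above:
        return "escalate"   # effective, non-formal human oversight
    return "allow"

policy = AgentPolicy(
    allowed_actions={"read_calendar", "draft_email"},
    allowed_data_categories={"contact", "scheduling"},
    max_records_per_action=500,
    require_human_above=50,
)
print(authorize(policy, ActionRequest("draft_email", "contact", 120)))  # escalate
```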
A fine of €15,000 was imposed on a legal training and publishing company for GDPR violations regarding consent and access to data.
The Italian Data Protection Authority, in Decision No. 87 of February 12, 2026, imposed a fine of 15,000 euros on a company active in legal training and publishing for violations of Articles 6, 12, and 15 of the GDPR and Article 130 of the Privacy Code. The proceedings stem from complaints filed by two professionals who had received unsolicited promotional emails and whose requests for access to their personal data had gone unanswered.
The company attributed the incidents to “human error” and organizational restructuring, but the Authority deemed these justifications irrelevant, reiterating the central importance of explicit consent for sending automated commercial communications and the obligation to respond fully and promptly to requests from data subjects.
In addition to the monetary penalty, the Data Protection Authority issued a formal warning to the data controller, urging strict compliance with the regulations governing direct marketing and access rights.
EDPB: The 2026 Coordinated Action on the Right to be Forgotten and Technical Challenges.
On February 18, 2026, the European Data Protection Board (EDPB) published the final report of the coordinated action on the right to erasure under Article 17 of the GDPR. The survey, conducted in 2025 by 32 supervisory authorities among 764 data controllers—including businesses and public bodies—highlighted seven main critical issues. Among these, the lack of effective internal procedures, insufficient information provided to data subjects, the use of insecure anonymization techniques, and uncertainty regarding retention periods and the deletion of data in backups stand out.
The report highlights the gap that still exists between the technical concept of “erasure” and its correct legal application. In fact, many data controllers treat operations such as hashing (transforming data into a fixed alphanumeric string via an algorithm), masking (partially obscuring data, such as hiding part of a tax ID), or other non-permanent techniques (which make the link to the individual less visible but do not eliminate it) as equivalent to permanent deletion. However, if the data remains traceable to an individual, it continues to be subject to the provisions of the GDPR.
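A short Python example makes the linkability point concrete: a deterministic hash can always be re-computed by anyone who holds (or guesses) the original identifier, so the link to the individual survives. The tax ID below is a made-up value and the snippet is illustrative only.

```python
import hashlib

tax_id = "RSSMRA80A01H501U"  # fictitious Italian tax ID

# "Hashing": a stable alphanumeric string, identical on every run.
pseudonym = hashlib.sha256(tax_id.encode()).hexdigest()

# "Masking": partial obscuring; the full value still exists elsewhere.
masked = tax_id[:4] + "*" * (len(tax_id) - 4)

# Re-computing the hash from the known identifier confirms the match,
# so the hashed value remains traceable, and thus personal data.
assert hashlib.sha256(tax_id.encode()).hexdigest() == pseudonym
print(pseudonym, masked)
```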
Similarly, the obligation is not considered fulfilled if the information remains in backup or disaster recovery systems, even if it is no longer visible in the operational systems.
The EDPB emphasizes that the right to be forgotten is no longer measured solely in formal terms, but in terms of the actual organizational and technological capacity to ensure the effective and definitive erasure of data.
France - CNIL: €5 Million Fine. Accountability Beyond Documentary Formalities.
On January 22, 2026, the CNIL imposed a €5 million fine on France Travail (formerly Pôle Emploi), underscoring a fundamental principle: compliance with the GDPR is not merely a matter of documentation, but an obligation of effective accountability. The investigation revealed a serious discrepancy between the stated procedural framework and the technical security measures actually implemented.
In this case, the CNIL found that the authentication systems were ineffective and that access monitoring protocols were lacking, leaving the data of millions of data subjects vulnerable. A further penalty was imposed for breach of the storage limitation principle: the continued storage of datasets relating to users who had been inactive for years revealed the absence of automated deletion or anonymization procedures.
Although the risks had been identified in the impact assessments, the organization failed to implement the necessary countermeasures. In addition to the fine, a penalty of 5,000 euros per day was imposed for the delay in compliance.
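By way of illustration of the kind of automated deletion routine whose absence the CNIL censured, here is a minimal Python sketch; the record schema, field names, and the 24-month retention period are hypothetical assumptions, not details taken from the decision.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=730)  # assumed policy: 24 months of inactivity

def purge_inactive(users: list[dict], now: datetime) -> list[dict]:
    """Keep only records still within the retention period."""
    kept = []
    for user in users:
        if now - user["last_activity"] <= RETENTION:
            kept.append(user)
        # else: delete the record, or route it to anonymization
    return kept

now = datetime(2026, 1, 22, tzinfo=timezone.utc)
users = [
    {"id": 1, "last_activity": datetime(2025, 12, 1, tzinfo=timezone.utc)},
    {"id": 2, "last_activity": datetime(2021, 3, 15, tzinfo=timezone.utc)},
]
print([u["id"] for u in purge_inactive(users, now)])  # [1]
```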