
Privacy, artificial intelligence, data collection and security in the cookieless era have long been hot topics for managers, players in the adtech and martech worlds, and institutions. 2023, in particular, was a year full of regulatory news for everything related to the protection and safeguarding of personal data, and much of what was written and regulated will have repercussions as early as 2024. Let’s look back together at the main regulatory changes that have been introduced.
- AI Act: Europe, a trailblazer in artificial intelligence legislation
- Digital Services Act (DSA): full compliance now underway
AI Act: Europe, a trailblazer in artificial intelligence legislation
March 13, 2024: a date to be carved in stone. With 523 votes in favor, 46 against, and 49 abstentions, the European Parliament approved the AI Act, equipping European countries with the world’s first artificial intelligence legislation.
Described by many as the law that puts the rights and freedoms of individuals at the heart of AI development, striking a fair balance between innovation and protection, the AI Act adopts a risk-based approach: the greater the risk attributed to an AI system, the greater the responsibilities of those who use and/or develop that system. The classification ranges from high risk, covering all AI uses with the potential to compromise security and fundamental rights, down to limited risk.
In this context, all uses involving cognitive behavioral manipulation of vulnerable people or groups, social scoring, and real-time remote biometric identification are classified as unacceptable risk. The European Parliament’s priority is to ensure that the AI systems used in member states are transparent, safe, traceable, non-discriminatory, environmentally friendly and, finally, supervised by people in order to prevent harmful consequences.
Governance measures, including measures to support innovation, and a turnover-based penalty system complete the picture of a regulation that is destined to serve as an example for many other countries as well.

Digital Services Act (DSA): full compliance now underway
Opening the magic box of algorithms and going further to bring more transparency to profiling and to how platforms operate: the Digital Services Act, approved by the European Parliament in July 2022, finally became fully applicable last February 17. It affects not only the dominant players in the market (read: Meta, Google, TikTok, YouTube, Bing, Pinterest, just to name a few), but also all smaller entities with fewer than 45 million monthly active users and, in general, all online intermediaries such as cloud and hosting providers, search engines, ecommerce and online services.
Empowering and protecting online users through the mitigation of “systemic risks” and the application of “robust content moderation tools”: the regulatory framework of the Digital Services Act revolves, once again, around the concepts of transparency, information and, above all, accountability.
The Digital Services Act was created with the primary goal of enabling better moderation of content on platforms, especially social media.
In addition, however, the legislation introduces a wide range of new obligations for platforms, including some that require them to disclose to regulators how their algorithms work. The DSA sets out specific requirements such as the obligation to be transparent about the data collected, the obligation to inform users about content moderation, and the obligation to offer the option not to receive suggestions based on profiling, just to name a few.
These obligations are part of a broader framework of objectives, ranging from protecting consumer rights and controlling the dissemination of illegal content to giving consumers a wider choice of digital services and establishing a clear regulatory framework for the transparency and accountability of online platforms.
For platform users, all this translates into the right to receive clear information about the data collected, as well as the right to opt out of content personalization based on profiling. Another aspect that should not be underestimated concerns advertisements, which can no longer be based on sensitive data. Information about the ads and the sponsoring companies must now accompany them, and platforms that deliver advertising content are required to process reports of illegal content through a dedicated mechanism.
Rules and regulations, but that’s not all: as of February 17, a pan-European supervisory architecture is also definitively in place which, while reporting to the Commission, the sole authority responsible for supervising platforms and search engines, will in fact work closely with national authorities.
The connected tech world is evolving rapidly, and institutions and regulators are trying to keep up, but striking a balance is often difficult in the face of layered and complex regulations. The focus remains, ever more crucially, on personal data protection, and the challenge is increasingly played out on the fine line between data protection and innovation in the name of personalization.