IAB Tech Lab has used its annual summit to announce two major initiatives aimed at modernizing digital advertising infrastructure and content governance, as well as addressing some of the fundamental challenges that generative AI and LLMs pose to content monetization.
The two-part announcement details two key initiatives: the LLM Content Ingest API Initiative, which responds to publishers' concerns about AI agents, large language models and AI-driven search summaries eroding publisher traffic, and a containerization project focused on the development and maintenance of programmatic infrastructure (see more below).
- The LLM Content Ingest API Initiative offers a technical framework to help publishers and brands control how their content is accessed, monetized and represented by AI systems, aiming to address the traffic and revenue losses caused by generative AI. These APIs can then be used to control LLMs' access to a publisher's content, with the two parties then able to agree on monetization models.
- The containerization project introduces standardized container technology for OpenRTB to streamline ad tech deployment, improve scalability and reduce latency in the programmatic supply chain. It responds to the proliferation of specialized bid enrichment and evaluation partners, scaling challenges (particularly for live events), fragmented systems and uneven performance, all of which have made the current foundations difficult to evolve.
IAB Tech Lab is inviting publishers, brands, LLM platforms and AI agent developers to provide feedback on the proposals, with a workshop for the LLM Content Ingest API plan scheduled for next month. Elsewhere, a Tech Lab working group is responsible for steering the separate containerization effort, with its representatives also requesting comments on that initiative.
The two-pronged announcement caps the standards body's earlier commitment to publish up to 31 new specifications or updates this year, with efforts targeting industry sub-sectors including CTV, conversion tracking and curation.
However, it is the rising tide of generative AI and LLMs that has proven the most fundamentally unsettling change of recent years, with the number of related job losses in 2025 only sharpening the concern. Anthony Katsur, CEO of IAB Tech Lab, discussed the issue and the latest initiatives with Digiday ahead of the IAB Tech Lab summit this week, detailing his conviction that every publisher should ink licensing agreements with LLMs, and how brands, too, need to protect themselves amid the "contextual soup."
When asked about mass layoffs in the sector, Katsur also offered recommendations on how individuals can steer their careers to thrive in this new internet-economy paradigm.
The conversation below has been lightly edited for brevity and clarity.
Many publishers are wary of this latest era of the internet, with layoffs taking place across the industry. How will the latest initiative help?
Some publishers are starting to strike content licensing agreements with LLMs, and every publisher should conclude a comprehensive content licensing agreement with them.
Every publisher should know which LLMs are crawling their content, so make a licensing offer, stop the bleeding and get paid for your content. Any LLM that scrapes publishers' content without paying for it is committing intellectual property theft, in my opinion.
The challenge, however, is that we don't believe the crawling approach is a viable long-term approach. By introducing a standardized API set [the LLM Content Ingest API], we can get the industry to lock arms and shut off the on-ramps, blocking them at the IP level. Then we can create an open-source, standardized API that gives structure to this content, and that structure does a number of things.
For example, you can now create a gateway that allows LLMs access in a way that reflects the commercial terms of the contract you've signed with them. The thing is, publishers have different tiers of content: there's your archival, always-on content, and then there's your up-to-the-day content, and the same [monetization or paywall model] should exist for LLMs.
There's a logging component in the API, so you can now audit the logs and make sure you're charging properly and getting paid appropriately for your content. And then fourth, and I'd say perhaps most important, is the tokenization of the content, which establishes a source of truth. The problem with LLMs, while promising, is that they're still early in their development, and they are prone to [making factual] errors.
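To make those pieces concrete, here is a minimal, hypothetical Go sketch of a publisher-side gateway along the lines Katsur describes: it checks a license credential against agreed content tiers, logs every access for later audit and billing, and refuses tiers that aren't covered. The header name, tier names and license table are invented for illustration; the actual LLM Content Ingest API spec is still in its comment period and is not shown here.

```go
package main

import (
	"log"
	"net/http"
	"strings"
)

// licenses maps an API key issued under a signed licensing deal to the content
// tiers that deal covers. The key and tier names here are purely illustrative.
var licenses = map[string][]string{
	"demo-llm-key": {"archive"}, // this hypothetical LLM has licensed archival content only
}

// tierOf applies a made-up routing rule: paths under /archive/ are archival,
// everything else counts as current, up-to-the-day content.
func tierOf(path string) string {
	if strings.HasPrefix(path, "/archive/") {
		return "archive"
	}
	return "current"
}

func gateway(w http.ResponseWriter, r *http.Request) {
	key := r.Header.Get("X-License-Key") // hypothetical header carrying the license credential
	tier := tierOf(r.URL.Path)

	licensed := false
	for _, t := range licenses[key] {
		if t == tier {
			licensed = true
			break
		}
	}

	// Logging component: record every access so crawls can be audited and billed.
	log.Printf("llm-access key=%q tier=%s path=%s licensed=%v", key, tier, r.URL.Path, licensed)

	if !licensed {
		// 402 signals that this tier sits behind a commercial agreement the caller doesn't have.
		http.Error(w, "content tier not covered by your license", http.StatusPaymentRequired)
		return
	}
	w.Write([]byte("structured article payload would go here\n"))
}

func main() {
	http.HandleFunc("/", gateway)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```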
Can you explain more about why the containerization project is needed right now?
Containerization is arguably the biggest development in programmatic since the advent of OpenRTB. In today's server-to-server architecture, OpenRTB is a [meta-protocol] and is an HTTP request that makes a wide-area network call, so even if a DSP and an SSP are in the same data center, it's not necessarily smart enough to know to stay within that data center.
The beauty of containerization is that you can take advantage of the gRPC protocol and protocol buffers to make a containerized version of the RTB protocol. So what we're doing is taking those 300 to 500 milliseconds and potentially reducing them to 50 to 100 milliseconds … and what you can do with that [saved] time is a lot.
The connection between the DSP and the SSP will open and close much faster, or you can keep the connection open and simply keep streaming new requests through, which works really well at live-event scale [opening programmatic up to new content types such as live sports].
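OpenRTB today typically rides on per-request HTTP calls; the containerized version Katsur describes would swap that for gRPC and protocol buffers. As a rough illustration only, here is a Go sketch of what a long-lived, bidirectional bid stream could look like, assuming a hypothetical BidService with generated stubs (the `example.com/rtb/gen` package, its StreamBids RPC and the message fields are all invented; the Tech Lab's actual schema is not shown in this piece).

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"

	// Hypothetical stubs generated by protoc from a containerized-RTB .proto;
	// not an actual IAB Tech Lab artifact.
	pb "example.com/rtb/gen"
)

func main() {
	// One long-lived connection to the co-located DSP container, instead of a
	// fresh wide-area HTTP call per bid request.
	conn, err := grpc.NewClient("dsp.local:9000",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := pb.NewBidServiceClient(conn) // hypothetical generated client

	ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
	defer cancel()

	// Bidirectional stream: keep pushing bid requests and reading responses over
	// the same open connection, which suits bursty live-event traffic.
	stream, err := client.StreamBids(ctx)
	if err != nil {
		log.Fatal(err)
	}

	go func() {
		for i := 0; i < 100; i++ {
			if err := stream.Send(&pb.BidRequest{Id: fmt.Sprintf("req-%d", i)}); err != nil {
				return
			}
		}
		stream.CloseSend()
	}()

	for {
		resp, err := stream.Recv()
		if err != nil {
			break // io.EOF once the server finishes responding
		}
		log.Printf("bid response %s: price=%.2f", resp.Id, resp.Price)
	}
}
```

The detail that matters is not the specific types but the connection model: the channel stays open and requests stream through it, rather than each bid request paying the cost of a separate wide-area HTTP call.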
There are many people in the industry who have recently lost their jobs amid the developments cited above. What advice would you have for people who are weighing such developments?
Agentic AI, or goal-oriented AI that isn't just task-oriented, is what comes into play in terms of media buying and optimization, and fighting fraud. These agents will be able to identify patterns in the supply chain, performance patterns, or opportunities for creative optimization.
I think my advice to anyone in our ecosystem is to learn and become expert at working with these tools. It's early days, but I think those who embrace it can gain an advantage from a learning-curve point of view.