EML - Ask, Tell and Listen

Zamboni, Alessio
2025

Abstract

In the last decade, the capabilities of AI technologies have advanced at a fast pace; corporations and governments have already started adapting to this change, and the general population is beginning to familiarise itself with LLM agents and personal assistants. Communication to and between AI agents today happens through the ambiguity of natural language, under the mistaken assumption that the semantics associated with each word does not change. The alternative technique for inter-agent communication derives from KQML and dates back to 1993. It requires agents to implement specific protocols based on a set of complex performatives: unambiguous symbols with a precise semantics attached to them. This method is challenging to implement and works well only as long as the context does not change. At the same time, most user data is not clean and comes from unsupervised machine learning or sensors. To maximise the value of the data gathered, it is essential to reuse it as much as possible. To achieve better results, we require a protocol that, despite being based on natural language, facilitates user understanding and interaction, supports data reuse methodologies, represents formal and unambiguous symbols, and enhances inter-context communication. Enter the Entity Markup Language (EML), a framework made up of several elements: a language for multilingual natural-language representation that keeps knowledge separated yet composable across four layers, namely language, concepts, contextual purpose, and entities. EML's built-in knowledge-layer separation supports the iTelos methodology for data reuse, while its more formal, unambiguous, language-agnostic conceptual layer can be used for inter-agent communication. The result is agents that can not only "ask" and "tell" (as with KQML performatives) but also perform an active "listen", allowing them not merely to receive messages but to proactively interpret them in light of context. This shift supports interoperability among diverse agents, enhances explainability by making all knowledge exchange explicitly representable, and provides a foundation for integrating both advanced AI models and efficient legacy systems. Beyond syntax, EML also defines a runtime protocol and a descriptive layer that together enable privacy-aware digital assistants, distributed data ownership, and transparent governance of information flows. This document presents the work done on EML, including its peculiarities, structure, syntax, methodological underpinnings, and supporting tools developed for its adoption. A set of application scenarios illustrates how EML can serve as a unifying framework for agent communication languages (ACLs), boosting data reuse and debuggability.
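The thesis itself defines the actual EML syntax and protocol, which are not reproduced here. Purely as an illustration of the ideas summarised in the abstract (ask/tell performatives plus an active "listen", and the separation of language, concepts, contextual purpose and entities), the following Python sketch models a toy exchange between two agents. All names (EMLMessage, Agent, their fields and methods) are hypothetical and do not come from the EML specification.

# Illustrative sketch only: hypothetical structures inspired by the abstract.
# None of these names are taken from the actual EML specification.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class EMLMessage:
    performative: str                 # "ask", "tell" (KQML-style) or a "listen" subscription
    language: str                     # natural-language surface layer, e.g. "en", "it"
    concepts: dict[str, str]          # language-agnostic concept identifiers
    context: str                      # contextual purpose, e.g. "museum", "mobility"
    entities: dict[str, str] = field(default_factory=dict)  # concrete entity references

class Agent:
    """A toy agent that can ask, tell, and actively listen in context."""

    def __init__(self, name: str):
        self.name = name
        self.knowledge: dict[tuple[str, str], str] = {}   # (context, concept) -> value
        self.listeners: list[Callable[[EMLMessage], None]] = []

    def tell(self, other: "Agent", msg: EMLMessage) -> None:
        other.receive(msg)

    def ask(self, other: "Agent", msg: EMLMessage) -> str | None:
        # Answers are interpreted relative to the message's context, not globally.
        key = (msg.context, msg.concepts.get("topic", ""))
        return other.knowledge.get(key)

    def listen(self, handler: Callable[[EMLMessage], None]) -> None:
        # "Active listen": register an interpretation step that runs on every
        # incoming message, so content is read in light of the current context.
        self.listeners.append(handler)

    def receive(self, msg: EMLMessage) -> None:
        key = (msg.context, msg.concepts.get("topic", ""))
        self.knowledge[key] = msg.entities.get("value", "")
        for handler in self.listeners:
            handler(msg)

if __name__ == "__main__":
    alice, bob = Agent("alice"), Agent("bob")
    bob.listen(lambda m: print(f"bob interprets '{m.concepts['topic']}' in context '{m.context}'"))
    alice.tell(bob, EMLMessage("tell", "en", {"topic": "opening-hours"},
                               context="museum", entities={"value": "9-17"}))
    print(alice.ask(bob, EMLMessage("ask", "en", {"topic": "opening-hours"}, context="museum")))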
7 October 2025
English
multi-agent systems
Giunchiglia, Fausto
Università degli studi di Trento
TRENTO
108
Files in this item:

2025_Alessio_EML_PhD_thesis-10.pdf
  Embargo until 30/09/2027
  Licence: All rights reserved
  Size: 1.98 MB
  Format: Adobe PDF

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14242/303795
The NBN code of this thesis is URN:NBN:IT:UNITN-303795