Short course: Modern Conversational AI — From Classic NLU to LLMs

This short course covers the foundations of conversational systems—classic NLU (intents, entities, slot filling, dialogue design) and modern LLM workflows (prompt engineering, function calling, RAG). Participants build a practical chatbot grounded in their own documents, evaluate quality and safety, and deploy a lightweight interface. An HPC module is included for large-scale embeddings and offline evaluation/load testing.

  • Date: 21.11.2025 at 11:45
  • Venue: PS, UDG
  • Registration required: https://forms.gle/SRW6GYiRAbi8pFBe8
  • Designed for: students, researchers, and professionals with basic Python and web/API skills.

Course content overview

Session 1 (90 min) – theoretical framework

  • From classic NLU (intents/entities/slots) to LLM “agents”
  • Dialogue design: state machines vs. tools/functions
  • RAG essentials: indexing, chunking, hybrid search, source citations
  • Evaluation & safety: relevance/groundedness, moderation, PII
  • HPC view: when batch embeddings and batch evaluation matter
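To make the RAG essentials above concrete, here is a minimal sketch of chunking, indexing, and retrieval with source citations. All names are illustrative, and a toy bag-of-words similarity stands in for a real embedding model:

```python
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def chunk(doc_id, text, size=8):
    """Split a document into fixed-size word chunks, keeping the source id."""
    words = text.split()
    return [(doc_id, " ".join(words[i:i + size])) for i in range(0, len(words), size)]

def retrieve(index, query, k=2):
    """Return the top-k chunks, each paired with its source citation."""
    q = embed(query)
    scored = sorted(index, key=lambda c: cosine(q, embed(c[1])), reverse=True)
    return scored[:k]

index = chunk("handbook.pdf", "Refunds are processed within 14 days of purchase. "
                              "Shipping is free for orders above 50 euros.")
for source, text in retrieve(index, "how long do refunds take?", k=1):
    print(f"[{source}] {text}")
```

A production pipeline would add hybrid (keyword + dense) search and pass the retrieved chunks, with their citations, into the LLM prompt.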

Session 2 (90 min) – hands-on lab

  • Project setup and starter RAG pipeline
  • Document import/index, prompt + function calling
  • Quick evaluation and guardrails
  • Deploy a web chat
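The function-calling step in the lab follows a common pattern: the model is given tool schemas and replies with a structured tool call, which the application dispatches. A minimal sketch, with an assumed JSON call format rather than any specific vendor API:

```python
import json

# Illustrative tool registry; the schema layout and tool names are
# assumptions for this sketch, not a specific provider's format.
TOOLS = {
    "get_order_status": {
        "description": "Look up the status of an order by id.",
        "parameters": {"order_id": "string"},
        "fn": lambda order_id: {"order_id": order_id, "status": "shipped"},
    },
}

def dispatch(model_reply: str):
    """Parse a model's JSON tool call and execute the matching function."""
    call = json.loads(model_reply)
    tool = TOOLS[call["name"]]
    return tool["fn"](**call["arguments"])

# Simulated model output requesting a tool call:
reply = '{"name": "get_order_status", "arguments": {"order_id": "A-123"}}'
print(dispatch(reply))  # {'order_id': 'A-123', 'status': 'shipped'}
```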

Learning outcomes

  • Contrast intent-based vs. LLM-based chatbots.
  • Design dialogue and implement a grounded RAG pipeline with citations.
  • Ship a lightweight production chatbot with evaluation and safety checks.
  • Apply HPC techniques to scale embeddings and offline performance testing.
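The batch-embedding pattern behind the HPC outcome can be sketched as follows; a trivial word-count function stands in for a real embedding model, and the parallelism shown is the batching pattern an HPC job would use, minus the scheduler and GPU specifics:

```python
from concurrent.futures import ThreadPoolExecutor

def embed_batch(batch):
    """Stand-in embedder: a real pipeline would call an embedding model here."""
    return [len(text.split()) for text in batch]

def batch_embed(texts, batch_size=64, workers=4):
    """Embed a large corpus as parallel fixed-size batches."""
    batches = [texts[i:i + batch_size] for i in range(0, len(texts), batch_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(embed_batch, batches)
    # Flatten per-batch results back into one vector list, in corpus order.
    return [vec for batch in results for vec in batch]

corpus = [f"document number {i}" for i in range(1000)]
vectors = batch_embed(corpus, batch_size=128)
print(len(vectors))  # 1000
```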