Cognitive Systems Group

[Picture of a blue apple]

In the Cognitive Systems research group we study formal and distributional models (and anything in between) of language used by situated agents interacting with each other and with the physical world around them through action and perception. We investigate areas such as representations of meaning in computational approaches to language, action, and perception, for example spatial descriptions, generation and interpretation of scene descriptions, multi-modal communication, situated dialogue systems, and others.

Members

Previous members

Several other members of CLASP have occasionally collaborated with the group.

Masters students (theses)

  • Dominik Künkele, learning through interaction
  • Ekaterina (Katya) Voloshina, probing grounded language models
  • Chen Xi, grounding relations in object affordances

Join as a postdoc or a PhD student in the associated Grandma Karl research environment

Join us every other Friday in our reading group

Attend a doctoral course

News

  • 2023-05-15: We are co-organising the Workshop on Resources and representations for under-resourced languages and domains (RESOURCEFUL-2023) at NoDaLiDa, website
  • 2023-05-11: PhD position within the Grandma Karl research environment, deadline 27 June, more details
  • 2023-05-02: Postdoc position, deadline 1 June: Postdoctoral Researcher in Computational linguistics with specialisation in language grounding to vision, robotics, and beyond, more details
  • 2023-04-28: article in GU-Journal 02-2023: Language models with a human touch, read here
  • 2023-02-02: annual group report
  • 2022-05-25: PhD thesis defence: Chatrine Qwaider: Resources and Applications for Dialectal Arabic: the Case of Levantine, more details

Courses

Masters in Language Technology (MLT) and free-standing online courses

Doctoral courses

Resources

Contact

  • Mailing list cogsys (at) listserv (dot) gu (dot) se, subscribe
  • Discord (to get added, send Simon your Discord ID)

Picture of the blue apple taken from here.