Teadmiste formaliseerimine (Formalisation of Knowledge)


Name: Knowledge representation
Code: ITI8700
Lecturer: Tanel Tammet
Labs: Evelin Halling and Tanel Tammet
Contact: tanel.tammet@ttu.ee, 6203457, TTÜ ICT-426
Archives of previous years: 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008, 2007, 2006, older.


Time, place, result

Lectures: Wednesdays 12:00-13:30 room U06A-229
Practical work: Wednesdays 14:00-15:30, room ICT-121, ICT-122
Practical work will give 40% and the exam 60% of the points underlying the final grade. The exam will consist of several small exercises.

The first practical work slot, on 30. January at 14:00, will be used as a second conventional lecture of the day.

Some of the last lecture times at the end of the course may be used as additional seminars/labs.

Assumed background

You should have studied the course "Basics of AI and Machine Learning" or acquaint yourself with the logic and probability parts of that course on your own.

In particular, it is useful to read these course materials and exercises: logic AIMA book, wumpus world in AIMA book, uncertainty in AIMA book, probability models in AIMA book, prolog lab, bayes lab

Focus

The main focus of the course is on knowledge representation and reasoning (KR), on a spectrum from simple to very complex: representing and using knowledge in databases, sentences in natural language and commonsense knowledge.

The overall goal of the labwork is to understand the issues involved in building a natural-language question-answering system a la Watson, and to see and experiment with existing parts of the solution.

The course contains the following blocks:

  • Background and basics. Representing and using simple facts and rules.
  • Knowledge conveyed by natural sentences: both statistical methods like word2vec and logic-based methods.
  • General-knowledge databases: wikidata, wordnet, yago, conceptnet, nell, cyc.
  • Reasoners and question answering systems.
  • Context, time, location, events, causality.
  • Different kinds and dimensions of uncertain knowledge.
  • Indexes and search.

Check out this description of the whole KR area.

Books to use

Observe that a noticeable part of the course contents is not covered by these books: use the course materials and the provided links to papers, standards and tutorials.

Practical work

There are three labs. They are all steps in a single project: build a simple natural-language question-answering system.

The labs have to be presented to the course teachers and all students present at labwork time.

The labs can be prepared alone or by teams of two people. The first lab task will be given on 6. February.

First, read the explanation of the overall task. The following labs are steps on the path to the end goal.

First lab

Deadline: 14. March (submissions after this date incur a penalty).

If you manage to fulfil the task earlier, start doing the second lab.

Please read about the task for the first lab.

Second lab

The task in the second lab is to actually answer simple questions about the input text, using a reasoner and a small set of rules you write yourself.
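As a rough illustration of the idea (not the actual lab setup), answering a question with rules and a reasoner amounts to deriving new facts from given ones until the queried fact appears. A minimal forward-chaining sketch in Python; all facts, rules and names below are invented for the example:

```python
# Minimal forward-chaining reasoner sketch: facts are (subject, relation,
# object) triples, and each rule says "if (X, rel1, obj1) holds, add
# (X, rel2, obj2)". Hypothetical toy example, not the course's reasoner.

facts = {("John", "is_a", "boy")}

rules = [
    (("is_a", "boy"), ("is_a", "person")),
    (("is_a", "person"), ("can", "speak")),
]

def forward_chain(facts, rules):
    """Repeatedly apply all rules until no new facts appear (a fixpoint)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (prem_rel, prem_obj), (con_rel, con_obj) in rules:
            for (subj, rel, obj) in list(facts):
                if rel == prem_rel and obj == prem_obj:
                    new = (subj, con_rel, con_obj)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

derived = forward_chain(facts, rules)
# Answering "Can John speak?" = checking whether the fact was derived.
print(("John", "can", "speak") in derived)  # True
```

A real lab solution would additionally need to extract the initial facts from the input text and match the question against the derived facts.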

Third lab

The third lab is optional: it gives as many points as lab 1 or 2 towards the final result, changing the grade split to practical work 60% and exam 40%.

The task in the third lab is the open-world scenario: answering questions based on wikipedia, using a reasoner with large downloaded rule sets a la yago, wordnet etc.

It is a plus to be able to give uncertain answers (likely, unlikely) and to handle fuzzy properties like boy, girl, grownup, rich etc.
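To give a flavour of what handling a fuzzy property could mean: instead of a hard yes/no, membership in a class like "grownup" is a degree in [0, 1], which can then be mapped to an uncertain answer. A sketch with made-up thresholds, purely for illustration:

```python
# Fuzzy-membership sketch: age maps to a degree of being a "grownup".
# The thresholds and the ramp shape are invented for this example.

def grownup_degree(age):
    """Linear ramp: clearly not a grownup below 13, clearly one at 20+."""
    if age <= 13:
        return 0.0
    if age >= 20:
        return 1.0
    return (age - 13) / (20 - 13)

def verdict(degree):
    """Turn a membership degree into an uncertain answer."""
    if degree >= 0.8:
        return "likely"
    if degree <= 0.2:
        return "unlikely"
    return "unknown"

print(verdict(grownup_degree(25)))  # likely
print(verdict(grownup_degree(10)))  # unlikely
print(verdict(grownup_degree(16)))  # unknown
```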


Lecture block 1: basics and representing simple facts.

Lectures 1 and 2: Overview of the course. Background and basics: SQL, logic, NLP

Lecture materials:


Lecture 2: RDF and RDFS (and OWL)

Lecture materials:
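To give a flavour of the RDF/RDFS material: RDF stores data as subject-predicate-object triples, and RDFS adds simple schema-level inference such as rdfs:subClassOf. A toy illustration in plain Python (deliberately not using an actual RDF library; the data is invented):

```python
# Toy RDF-style triple store with one RDFS inference rule:
# if (C, rdfs:subClassOf, D) and (x, rdf:type, C), then (x, rdf:type, D).

triples = {
    ("Dog", "rdfs:subClassOf", "Animal"),
    ("rex", "rdf:type", "Dog"),
}

def infer_types(triples):
    """Apply the subclass rule repeatedly until a fixpoint is reached."""
    triples = set(triples)
    changed = True
    while changed:
        changed = False
        for (c, p1, d) in list(triples):
            if p1 != "rdfs:subClassOf":
                continue
            for (x, p2, c2) in list(triples):
                if p2 == "rdf:type" and c2 == c:
                    new = (x, "rdf:type", d)
                    if new not in triples:
                        triples.add(new)
                        changed = True
    return triples

# rex is a Dog, Dog is a subclass of Animal, so rex is an Animal.
print(("rex", "rdf:type", "Animal") in infer_types(triples))  # True
```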

Lecture block 2: capturing meaning in natural language

Lecture 3: Intro to homework and NLP

Lecture materials:

Lecture 4: vector representation of words

This lecture will be given by Priit Järv.

Lecture materials:
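The core idea behind vector representations of words: each word maps to a vector, and similarity of meaning is approximated by the cosine of the angle between the vectors. A toy sketch with made-up 3-dimensional vectors (real word2vec embeddings have hundreds of dimensions and are learned from large text corpora):

```python
import math

# Made-up toy vectors for illustration only; word2vec learns such
# vectors automatically from text.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Semantically related words should get a higher score than unrelated ones.
print(cosine(vectors["king"], vectors["queen"]) >
      cosine(vectors["king"], vectors["apple"]))  # True
```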

Useful additional materials from (roughly) easier to more complex:

Probabilistic models:

Also interesting to read:

Lecture block 3: large common-sense knowledge bases

Lecture 5: First look into main large knowledge bases

We will have a look at the goals, main content and differences between: