Qualifying Exam -- Part 2
Monday, December 13, 1999
Your Name:
SSN:
Topic | Your Score | Max
---|---|---
Logical Reasoning I | .. | 18
Logical Reasoning II | .. | 8
Learning | .. | 6
Neural Networks | .. | 16
Planning | .. | 10
Total | .. | 60
You have 75 minutes to complete your exam. Prefix your
answers with the correct problem number.
1) LOGICAL REASONING I [18]
Prove, using resolution (and no other method!), that
A1, A2, A3, A4, A5 |- A
with
(A1) "Every student owns at least one car."
(A2) "Everybody, who owns a red car, is rich."
(A3) "At least two students own a red car."
(A4) "Fred owns a red car."
(A5) "Every student located in Houston owns at least one blue car"
(A) "Some students are rich."
a) Transform the above natural-language statements into first-order
predicate logic formulas!
b) Convert the FOPL formulas into clauses!
c) Show, using resolution, that the entailment stated above holds!
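As an illustration of the kind of notation expected in parts a) and b) (a sketch only; the predicate symbols Car, Red, and Owns are a free choice of this example, not prescribed by the exam), statement (A4) could be formalized and clausified as follows:

```latex
% (A4) "Fred owns a red car." -- one possible FOPL reading
\exists x \,\bigl( \mathit{Car}(x) \wedge \mathit{Red}(x) \wedge \mathit{Owns}(\mathit{Fred}, x) \bigr)
% Skolemizing the existential quantifier with a fresh constant c_1
% yields three unit clauses:
\mathit{Car}(c_1) \qquad \mathit{Red}(c_1) \qquad \mathit{Owns}(\mathit{Fred}, c_1)
```

The remaining statements are handled analogously; universally quantified implications such as (A2) become clauses whose antecedent literals appear negated.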
2) LOGICAL REASONING II [8]
a) What is unification and what is its role in resolution proofs? [2]
b) Will a "good" resolution theorem prover always terminate if a
statement cannot be proven? (You may assume there are no limitations
with respect to storage and runtime.) [2]
c) Most PROLOG implementations rely on Horn clause
resolution. What is Horn clause resolution, and how is it more restrictive
than "general" resolution? What are the consequences of
these restrictions for PROLOG? [4]
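To make part a) concrete, here is a minimal sketch of syntactic unification (illustrative only: the term encoding, the variable convention, and the omission of the occurs check are assumptions of this sketch, not part of the question):

```python
# Minimal sketch of syntactic unification (no occurs check).
# Variables: lowercase strings, e.g. 'x'.  Compound terms and constants:
# tuples ('functor', arg1, ...), so the constant A is written ('A',).

def is_var(t):
    return isinstance(t, str) and t[:1].islower()

def walk(t, theta):
    """Apply the substitution theta to term t."""
    if is_var(t):
        return walk(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):
        return (t[0],) + tuple(walk(a, theta) for a in t[1:])
    return t

def unify(s, t, theta=None):
    """Return a most general unifier of s and t, or None if they don't unify."""
    theta = {} if theta is None else theta
    s, t = walk(s, theta), walk(t, theta)
    if s == t:
        return theta
    if is_var(s):
        return {**theta, s: t}
    if is_var(t):
        return {**theta, t: s}
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

# Example: unify P(x, f(A)) with P(B, y) -> {'x': ('B',), 'y': ('f', ('A',))}
print(unify(('P', 'x', ('f', ('A',))), ('P', ('B',), 'y')))
```

The example unifies P(x, f(A)) with P(B, y) and prints the most general unifier {x -> B, y -> f(A)}; in a resolution step, such a substitution is applied to both parent clauses before the resolvent is formed.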
3) LEARNING IN GENERAL [6]
Explain: what are supervised, unsupervised, and reinforcement learning?
How do the three approaches differ from each other?
4) NEURAL NETWORKS [16]
a) What role do activation functions play in neural network
architectures? [2]
b) Assume the following Boolean function with three input
parameters A, B, C is given:
A | B | C | Output
---|---|---|---
0 | 0 | 0 | 0
0 | 0 | 1 | 0
0 | 1 | 0 | 0
0 | 1 | 1 | 1
1 | 0 | 0 | 1
1 | 0 | 1 | 1
1 | 1 | 0 | 1
1 | 1 | 1 | 1
Can the above function be computed by a perceptron? If your
answer is yes, give a perceptron that computes the function (use
the notation of Fig. 19.6 of the textbook and clearly identify
which activation functions you assume in your solution).
If your answer is no, explain why it is impossible to compute
the above function! [8] (A sketch for sanity-checking a candidate
answer appears after part c.)
c) How do multi-layer feed-forward networks learn? (The answer does
not need to be very detailed!) What makes learning in multi-layer
feed-forward networks more difficult than learning in perceptrons? [6]
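For part b), one quick way to sanity-check a candidate answer is to evaluate a single threshold unit on all eight rows of the table. The sketch below only illustrates that check: the weights and bias shown are placeholders (they do not implement the table) and should be replaced by your own candidate.

```python
# Sanity check for part 4b): evaluate a single threshold unit on the truth table.
# The weights and bias are placeholders -- substitute your own candidate values.

TABLE = {  # (A, B, C) -> desired output, copied from the exam table
    (0, 0, 0): 0, (0, 0, 1): 0, (0, 1, 0): 0, (0, 1, 1): 1,
    (1, 0, 0): 1, (1, 0, 1): 1, (1, 1, 0): 1, (1, 1, 1): 1,
}

def step(x):
    """Hard threshold activation: 1 if the weighted sum reaches 0, else 0."""
    return 1 if x >= 0 else 0

def perceptron(inputs, weights, bias):
    return step(sum(w * x for w, x in zip(weights, inputs)) + bias)

# Placeholder candidate (NOT claimed to be correct -- try your own values).
weights, bias = (1.0, 1.0, 1.0), -1.5

for inputs, desired in TABLE.items():
    got = perceptron(inputs, weights, bias)
    print(inputs, "desired", desired, "got", got,
          "OK" if got == desired else "MISMATCH")
```

With the placeholder values shown, the input (1, 0, 0) is misclassified and reported as a MISMATCH, so this particular unit is not a solution; the script merely shows how to verify whichever weights and bias you propose.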
5) PLANNING [10]
Write a small essay that discusses how planning systems cope with
the following problems:
- representation of actions
- representation of states
- representation of plans
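As one concrete point of reference for the essay (an assumed illustration, not a required formalism): in a STRIPS-style encoding, states are sets of ground atoms, actions are schemas with precondition, add, and delete lists, and plans are ordered sequences of actions. A minimal sketch:

```python
# A minimal STRIPS-style sketch (one common encoding; all names are illustrative).
# States: sets of ground atoms. Actions: precondition / add / delete lists.
# Plans: sequences of actions applied in order.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    precond: frozenset   # atoms that must hold before the action
    add: frozenset       # atoms made true by the action
    delete: frozenset    # atoms made false by the action

def apply(state, action):
    if not action.precond <= state:
        raise ValueError(f"{action.name}: preconditions not satisfied")
    return (state - action.delete) | action.add

# Hypothetical example domain: a robot moving between two rooms.
go_a_to_b = Action("go(A,B)",
                   precond=frozenset({"at(robot,A)"}),
                   add=frozenset({"at(robot,B)"}),
                   delete=frozenset({"at(robot,A)"}))

state = frozenset({"at(robot,A)"})
plan = [go_a_to_b]          # here a plan is simply an ordered list of actions
for act in plan:
    state = apply(state, act)
print(state)                # frozenset({'at(robot,B)'})
```

Partial-order planners represent plans differently (as sets of steps plus ordering and causal-link constraints), which is worth contrasting with the simple sequence used above.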