
FIT5047 Intelligent Systems

Practice Questions for Final Exam

1. [Agents] Consider the vacuum agent presented in class, but assume that the room

has 8 squares and a square can get dirty after it has been cleaned.

(a) Use PEAS to specify the task environment.

(b) Specify the attributes of the environment.

2. [Agents] Consider shopping for used AI books on the Internet.

(a) Use PEAS to specify the task environment.

(b) Specify the attributes of the environment.

3. [Backtrack] Specify a state representation, operators and a goal test to solve the N-queens problem with N=4:

Four queens must be placed on a 4×4 board such that no row, no column, and no diagonal contains more than one queen.

Illustrate the workings of the backtracking algorithm (Backtrack1) to solve this problem.
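A minimal backtracking sketch for the 4-queens case may help when tracing Backtrack1 by hand. The column-per-row state representation used here is one common choice, not the only valid answer to part of this question:

```python
# State: a partial list `cols` where cols[r] is the column of the queen in
# row r (one queen per row by construction). Operators place a queen in the
# next row; the goal test is a complete, conflict-free assignment.
def safe(cols, col):
    r = len(cols)  # row about to be filled
    return all(c != col and abs(c - col) != r - pr
               for pr, c in enumerate(cols))

def backtrack(cols, n=4):
    if len(cols) == n:          # goal test: all n queens placed
        return cols
    for col in range(n):        # operators: try each column in this row
        if safe(cols, col):
            result = backtrack(cols + [col], n)
            if result is not None:
                return result
    return None                 # dead end: undo and backtrack

solution = backtrack([])
```

With columns tried left to right, the first solution found is queens in columns 1, 3, 0, 2 of rows 0-3.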

4. [GraphSearch] Suppose we want to use the algorithm A* on the graph below to find

the shortest path from node S to node G. Each node is labeled by a capital letter and

the value of a heuristic function. Each edge is labeled by the cost to traverse that edge.

For this problem:

• Perform the algorithm A* on this graph, filling in the table below. Indicate the f, g, and h values of each node on the queue, as shown in the first two rows of the table. You need not write the contents of the (priority) queue in order in the table.

• Assume that if you find a path to a node already on the queue, you update its cost (using the lower f value) instead of adding another copy of that node to the queue.

• Show the path found by the algorithm A* on the graph above.

iteration   node expanded   Priority queue at end of this iteration
    0                       S = 0+6 = 6
    1             S         A = 2+4 = 6; B = 3+4 = 7
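The graph figure is not reproduced here, so the following sketch runs A* on a small hypothetical graph consistent with the two table rows shown (h(S)=6, edge S-A of cost 2 with h(A)=4, edge S-B of cost 3 with h(B)=4); the edges into G and their costs are invented for illustration only:

```python
import heapq

# Hypothetical graph: adjacency lists of (neighbour, edge cost).
graph = {'S': [('A', 2), ('B', 3)],
         'A': [('G', 4)],
         'B': [('G', 5)],
         'G': []}
h = {'S': 6, 'A': 4, 'B': 4, 'G': 0}   # heuristic values

def astar(start, goal):
    # Priority queue holds (f, g, node, path). Stale duplicate entries are
    # skipped on pop, which has the same effect as updating a queued node's
    # cost when a cheaper path to it is found.
    frontier = [(h[start], 0, start, [start])]
    best_g = {}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in best_g and best_g[node] <= g:
            continue                     # already expanded more cheaply
        best_g[node] = g
        for nbr, cost in graph[node]:
            g2 = g + cost
            heapq.heappush(frontier, (g2 + h[nbr], g2, nbr, path + [nbr]))
    return None, float('inf')

path, cost = astar('S', 'G')
```

Tracing the pops of this sketch reproduces the table rows above: first S with f=6, then A with f=2+4=6 ahead of B with f=3+4=7.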

5. [Algorithm A*] The evaluation function f(n) = d(n) + W(n), where d(n) is the cost

of arriving at node n and W(n) is the sum of Manhattan Distance for each tile, is used

in conjunction with the algorithm A* to search from the start node:

2 8 3
1 6 4
7 _ 5

to the goal node:

1 2 3
8 _ 4
7 6 5

(the underscore marks the blank tile)

Use this evaluation function to search forward (from start node to goal node) and

backward (from the goal node to the start node). Where would the backward search

meet the forward search?
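As a check on the heuristic term, here is a short sketch computing W(n) for the start state shown above (0 encodes the blank, which does not contribute to the sum):

```python
# Sum of Manhattan distances of each tile from its goal position.
start = [[2, 8, 3],
         [1, 6, 4],
         [7, 0, 5]]
goal  = [[1, 2, 3],
         [8, 0, 4],
         [7, 6, 5]]

def manhattan(state, goal):
    # Map each tile to its goal coordinates, then sum |dr| + |dc| per tile.
    where = {goal[r][c]: (r, c) for r in range(3) for c in range(3)}
    total = 0
    for r in range(3):
        for c in range(3):
            tile = state[r][c]
            if tile != 0:                # the blank is not a tile
                gr, gc = where[tile]
                total += abs(r - gr) + abs(c - gc)
    return total

w_start = manhattan(start, goal)
```

The same function with the arguments swapped gives the heuristic for the backward search, since Manhattan distance is symmetric.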

6. [Search] Which of the following are true and which are false? Explain your answers.

(a) Depth-first search always expands at least as many nodes as A* search with an

admissible heuristic.

(b) h(n) = 0 is an admissible heuristic for the 8-puzzle.



7. [Game playing] Consider a game tree with branching factor 2 and depth 5. Assume

that it is MAX’s turn to play, and that the evaluation function at the leaf nodes is in

increasing order from left to right, such that its value for the leftmost node is 1, and

for the rightmost node is 16 (the leaf nodes are MAX nodes).

Conduct an α-β search of this game tree, expanding the leftmost node first. In your α-β tree, clearly indicate the propagation of the α and β values, the cutoffs performed, and the leaf nodes inspected.

Upon completion of the search, state the final backed-up value of the root node and

the recommended move (Left or Right). Also state the number of α and β cutoffs

performed, and the number of leaf nodes generated.
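The bookkeeping in this question can be checked with a short sketch. It assumes a perfect binary tree with 16 leaves valued 1..16 left to right and a MAX root, which matches the leaf values described above; the question's depth convention may count levels differently:

```python
import math

leaves_seen = []   # leaf values actually evaluated, in order
cutoffs = 0        # number of alpha/beta cutoffs performed

def alphabeta(lo, hi, alpha, beta, is_max):
    """Search the implicit subtree covering leaf values lo+1 .. hi."""
    global cutoffs
    if hi - lo == 1:                      # a leaf: its value is lo + 1
        leaves_seen.append(lo + 1)
        return lo + 1
    mid = (lo + hi) // 2
    v = -math.inf if is_max else math.inf
    for child_lo, child_hi in ((lo, mid), (mid, hi)):
        w = alphabeta(child_lo, child_hi, alpha, beta, not is_max)
        if is_max:
            v = max(v, w)
            alpha = max(alpha, v)
        else:
            v = min(v, w)
            beta = min(beta, v)
        if alpha >= beta:                 # remaining siblings are pruned
            cutoffs += 1
            break
    return v

root_value = alphabeta(0, 16, -math.inf, math.inf, True)
```

Printing `leaves_seen` after the search shows exactly which leaves a correct hand trace should inspect.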

8. [First order logic] Assuming predicates PARENT(p, q) and FEMALE(p) and constants Joan and Kevin, with the obvious meanings, express each of the following sentences in first-order logic. (You may use the abbreviation ∃1 to mean "there exists exactly one.")

(a) Joan has a daughter (possibly more than one, and possibly sons as well).

(b) Joan has exactly one daughter (but may have sons as well).

(c) Joan has exactly one child, a daughter.

(d) Joan and Kevin have exactly one child together.

(e) Joan has at least one child with Kevin, and no children with anyone else.

9. [First order logic] Consider a vocabulary with the following symbols:

OCCUPATION (p, o): Predicate. Person p has occupation o.

CUSTOMER(p1, p2): Predicate. Person p1 is a customer of person p2.

BOSS(p1, p2): Predicate. Person p1 is a boss of person p2.

Doctor, Surgeon, Lawyer, Actor: Constants denoting occupations.

Emily, Joe: Constants denoting people.

Use these symbols to write the following assertions in first-order logic:

(a) Emily is either a surgeon or a lawyer.

(b) Joe is an actor, but he also holds another job.

(c) All surgeons are doctors.

(d) Joe does not have a lawyer (i.e., is not a customer of any lawyer).


(e) Emily has a boss who is a lawyer.

(f) There exists a lawyer all of whose customers are doctors.

(g) Every surgeon has a lawyer.

10. [First-order logic] Indicate which of the following Predicate Calculus sentences are

correct representations of the corresponding English sentences, and explain why or why

not.

1 [∀x PERSON(x)] ⇒ [∃y MOTHER(y,x)]

Everybody has a mother

2 OLD(DOG(Fido))

Fido is an old dog

3 DOG(Fido) ∧ OLD(Fido)

Fido is an old dog

4 ∀x [ MATHEMATICAL-THEORY(x) ⇒ x ]

All mathematical theories are true

5 ∃s [ARISTOTLE-SAID(s) ∧ ¬TRUE(s)]

Aristotle told a lie

6 ¬∃v ∃b [BUTCHER(b) ∧ VEGETARIAN(v)]

There are no vegetarian butchers

7 ¬∃b ∃d BUTCHER(b) ∧ DOG(d) ∧ OWNS(b,d)

No butcher owns a dog

11. [First-order logic] The expression LAST(x,y) means that y is the last element of the

list x. We have the following axioms:

(a) LAST({u,NIL},u), where u is an element.

(b) LAST(y,z) ⇒ LAST({x,y},z), where x and z are elements, and y is a list.

Use the answer extraction method to find v, the last element of the list (2,1):

LAST({2,1,NIL},v)
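The two axioms define a simple recursion: axiom (a) is the base case for a one-element list {u, NIL}, and axiom (b) recurses into the tail. A sketch with lists encoded as nested pairs, so that {2, 1, NIL} becomes (2, (1, 'NIL')):

```python
def last(lst):
    head, tail = lst
    if tail == 'NIL':        # axiom (a): LAST({u, NIL}, u)
        return head
    return last(tail)        # axiom (b): LAST(y, z) => LAST({x, y}, z)

v = last((2, (1, 'NIL')))
```

Answer extraction in the resolution proof binds v the same way the recursion does here: the proof bottoms out at axiom (a) with v bound to the last element.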

12. [Unification] For each of the following pairs of literals indicate whether or not they

unify. If they do unify, show a most general unifier. If they do not unify, explain why

not.

(a) STRONGER(x, LexLuthor), STRONGER(LoisLane,y)

(b) HOLDING(x, Girlfriend(x)), HOLDING(Superman,y)

(c) FLY(Girlfriend(x)), ¬FLY(LoisLane)

(d) ¬LIKES(x,Enemy(x)), ¬LIKES(Enemy(y), y)


13. [Unification] For each pair of atomic sentences, give the most general unifier if it

exists. If they do not unify, explain why not.

(a) P (A, B, B), P (x, y, z).

(b) Q(y, G(A, B)), Q(G(x, x), y).

(c) OLDER(FATHER(y), y), OLDER(FATHER(x), John).

(d) KNOWS(FATHER(y), y), KNOWS(x, x).
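A minimal unification sketch can be used to check answers to both unification questions. It is an illustrative toy, not the lecture version: terms are strings (lowercase strings are variables, capitalised strings are constants) or tuples such as ('FATHER', 'y') for compound terms, and it includes an occurs check:

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def walk(t, s):
    # Follow variable bindings in substitution s to a representative term.
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    # Occurs check: does variable v appear inside term t under s?
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(a, b, s=None):
    """Return a most general unifier as a dict, or None on failure."""
    s = {} if s is None else s
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return None if occurs(a, b, s) else {**s, a: b}
    if is_var(b):
        return unify(b, a, s)
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and len(a) == len(b) and a[0] == b[0]):
        for x, y in zip(a[1:], b[1:]):     # unify arguments pairwise
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None                            # constant clash or arity mismatch
```

For example, `unify(('P','A','B','B'), ('P','x','y','z'))` returns the MGU {x/A, y/B, z/B}, while pair (b) above fails because x would have to be both A and B.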

14. [Resolution] From "Horses are animals", it follows that "The head of a horse is the head of an animal." Demonstrate that this inference is valid by carrying out the following steps:

(a) Translate the premise and the conclusion into well-formed formulas (wffs) in first-order logic. Use three predicates: HEADOF(h, x) (meaning "h is the head of x"), HORSE(x), and ANIMAL(x).

(b) Negate the conclusion, and convert the premise and the negated conclusion into

CNF.

(c) Use resolution to show that the conclusion follows from the premise.

15. [D-separation] According to the figure, determine whether each of the following claims is true or false, and justify your answers.

(a) B ⊥ G|A

(b) C ⊥ D|F

(c) C ⊥ D|A

(d) H ⊥ B|C, F


16. [Supervised Machine Learning] Consider the following dataset consisting of 3 binary features and a binary class variable.

Feature 1   Feature 2   Feature 3   Class
    1           0           0         0
    1           0           1         1
    1           1           1         1
    1           1           1         1
    0           1           1         0
    0           0           1         0

(a) Based on the dataset above, estimate the prior probability of Class 1.

(b) What is the initial entropy of the class labels over all the data? Show the formula before plugging in numbers.

(c) If we were to split the data into 2 groups based on the value of Feature 1, what

would be the entropy for each group?

(d) What is the Information Gain of Feature 1?

(e) You are told the Information Gain for Feature 2 and Feature 3 after splitting on Feature 1 is 0.08 and 0.19 respectively. Draw a decision tree for this problem.

(f) We will now build a Naïve Bayes model for predicting the class from these data. Draw the Bayesian network that corresponds to the Naïve Bayes model.

(g) From the data table, write down the conditional probability estimates required for the Naïve Bayes model.

(h) What class does the Naïve Bayes model predict for a new data item with values (1, 1, 1) for Features 1, 2 and 3 respectively? What is the probability of each class? How does the Naïve Bayes classification compare to the decision tree classification?
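The counting in parts (a)-(d) can be checked with a short script (a sketch for verification, not the required worked answer; entropy is taken in base 2, as is standard):

```python
from math import log2
from collections import Counter

data = [  # (feature1, feature2, feature3, class)
    (1, 0, 0, 0), (1, 0, 1, 1), (1, 1, 1, 1),
    (1, 1, 1, 1), (0, 1, 1, 0), (0, 0, 1, 0),
]

def entropy(labels):
    # H = -sum over classes c of p_c * log2(p_c)
    n = len(labels)
    return -sum((k / n) * log2(k / n) for k in Counter(labels).values())

labels = [row[3] for row in data]
prior_1 = labels.count(1) / len(labels)      # part (a)
h_all = entropy(labels)                      # part (b)

# Parts (c)-(d): split on Feature 1, weight each group's entropy by size.
groups = {v: [r[3] for r in data if r[0] == v] for v in (0, 1)}
remainder = sum(len(g) / len(data) * entropy(g) for g in groups.values())
info_gain_f1 = h_all - remainder
```

The Feature 1 = 0 group is pure (all class 0), so only the Feature 1 = 1 group contributes to the remainder.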

17. [Decision Tree] Assume that you are given the set of labeled training examples

below, where each of three features has three possible values: a, b, or c. You choose

to learn a decision tree from these data.

      F1   F2   F3   Output
ex1   c    b    b      +
ex2   a    a    c      +
ex3   b    c    c      +
ex4   b    c    a      -
ex5   a    b    c      -
ex6   c    a    b      -

What score would the information gain calculation assign to feature F1 when deciding which feature to use as the root node for a decision tree being built? Be sure to show all your work.
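As with the previous question, the gain can be checked mechanically. This sketch assumes the sixth example is labelled ex6 (the table prints it as ex1, which appears to be a typo) and splits on F1's three values:

```python
from math import log2
from collections import Counter

examples = [  # (F1, F2, F3, output)
    ('c', 'b', 'b', '+'), ('a', 'a', 'c', '+'), ('b', 'c', 'c', '+'),
    ('b', 'c', 'a', '-'), ('a', 'b', 'c', '-'), ('c', 'a', 'b', '-'),
]

def entropy(labels):
    n = len(labels)
    return -sum((k / n) * log2(k / n) for k in Counter(labels).values())

outputs = [e[3] for e in examples]
# Weighted remainder over the three F1 groups (each holds one + and one -).
remainder = sum(
    len(group) / len(examples) * entropy(group)
    for v in ('a', 'b', 'c')
    if (group := [e[3] for e in examples if e[0] == v])
)
gain_f1 = entropy(outputs) - remainder
```

Each F1 value covers exactly one positive and one negative example, so the split leaves every group at maximum entropy.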


18. [K-nearest-neighbour] Assume we have the age, loan amount and pay status of some customers of a bank. We need to predict the pay status of a new loan applicant given only their age and loan amount. The data are shown below:

Age   Loan Amount (K)   Pay Status
 20          20         Paid
 25          45         Paid
 22          95         Default
 35          53         Paid
 35         120         Paid
 33         150         Default
 40          54         Default
 45          75         Paid
 52          20         Paid
 49         220         Default
 60         100         Default

A new bank customer aged 42 applies for a loan of AUD$140K. How would the 3-nearest-neighbour algorithm classify this customer: Paid or Default? Be sure to show all your work.
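A sketch of the 3-NN vote, assuming plain Euclidean distance on the raw (age, loan) pairs; note that without normalisation the loan amounts dominate the distances, which is worth commenting on in a full answer:

```python
from math import dist
from collections import Counter

data = [  # (age, loan amount in K, pay status)
    (20,  20, 'Paid'),    (25,  45, 'Paid'),   (22,  95, 'Default'),
    (35,  53, 'Paid'),    (35, 120, 'Paid'),   (33, 150, 'Default'),
    (40,  54, 'Default'), (45,  75, 'Paid'),   (52,  20, 'Paid'),
    (49, 220, 'Default'), (60, 100, 'Default'),
]

def knn_predict(query, k=3):
    # Sort the training points by Euclidean distance to the query,
    # then take a majority vote over the k nearest labels.
    nearest = sorted(data, key=lambda r: dist(query, r[:2]))[:k]
    votes = Counter(label for _, _, label in nearest)
    return votes.most_common(1)[0][0]

prediction = knn_predict((42, 140))
```

On the raw scale, the three nearest customers to (42, 140) are (33, 150, Default), (35, 120, Paid) and (60, 100, Default), so the vote is 2-1 for Default.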