Final Exam CS 170
Last Name:______
First Name:______
Student ID:______
Note: Please write carefully; if necessary, PRINT.
Do not touch this exam until you are told to do so!
This exam will be collected along with your scantron. If you need to do calculations, do them on this exam.
1) If I had a Knowledge Base that contains only true facts and I used an algorithm XXX to prove that 2 = 3, you could say that algorithm XXX was not
A) complete / B) optimal / C) admissible / D) sound
2) Suppose I had a Knowledge Base that contains only true facts about arithmetic and using algorithm ZZZ I was unable to prove that 7 is greater than 2. This would suggest that ZZZ was not
A) complete / B) optimal / C) admissible / D) decidable
3) Which of the following sentences is the best representation of the sentence “You can fool some of the people all of the time”?
A) ∃x ∃t (person(x) ∧ time(t)) ∧ can-fool(x,t) / B) ∃x ∀t (person(x) ∧ time(t)) ⇒ can-fool(x,t) / C) ∀t ∃x (person(x) ∧ time(t)) ∨ can-fool(x,t) / D) ¬∃x ∀t (person(x) ∧ time(t)) ⇔ can-fool(x,t)
4) Which of the following sentences is the best representation of the sentence “You can fool all of the people some of the time”?
A) ∀x ∀t (person(x) ∧ time(t) ⇒ can-fool(x,t)) / B) ∀x ∃t (person(x) ∧ time(t) ⇒ can-fool(x,t)) / C) ∀x ∃t (person(x) ∧ time(t) ⇒ can-fool(x,t)) / D) ∀x ¬∃t (person(x) ⇒ time(t) ∧ can-fool(x,t))
5) The sentence A ∧ ¬A is
A) Valid / B) satisfiable, but not valid / C) True / D) unsatisfiable
6) Which of the following sentences is in Horn Clause Format? (just one is)
A) A ∧ B ⇒ C / B) A ⇒ B ⇒ C / C) A ∧ B ⇔ C / D) A ∨ B ⇒ C
7) The sentence A ∨ ¬A is
A) Valid / B) satisfiable, but not valid / C) false / D) complete
8) The following definition, “Specifies the symbols in the language and how they can be combined to form sentences”, refers to which of the following terms?
A) Modulation / B) Inference / C) Semantics / D) Syntax
9) Suppose I am working in propositional logic, trying to prove a fact about A, B, C, D, E and F. The truth table I build will have
A) 6 rows / B) 256 rows / C) 64 rows / D) infinite rows (you can’t build a truth table for this problem)
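For intuition on question 9: a truth table over n propositional symbols has 2^n rows, since each symbol independently takes one of two truth values. A minimal Python sketch (the symbol names come from the question; everything else is illustrative):

```python
from itertools import product

# Each of the six propositional symbols (A..F) is independently
# True or False, so the truth table enumerates 2**6 combinations.
symbols = ["A", "B", "C", "D", "E", "F"]
rows = list(product([False, True], repeat=len(symbols)))
print(len(rows))  # 64
```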
10) Resolution Refutation is the best algorithm for proving something in FOL because…
A) There is no search involved / B) The search space is polynomial. / C) It is complete / D) It is simple to implement.
11) “If Joe gets a good grade on the final, he will pass the class. Joe did get a good grade; therefore he will pass.” This kind of reasoning is known as
A) resolution / B) modus ponens / C) and-elimination / D) existential introduction
12) In FOL, an application of a function returns
A) a term / B) true or false / C) a sentence / D) a real number
13) To resolve P(w) ∨ Q(w) with ¬Q(y) ∨ S(y) I should make the following substitution
A) SUBST{y/w} / B) SUBST{w/w} / C) SUBST{P/Q} / D) SUBST{P/S}
14) After resolving P(w) ∨ Q(w) with ¬Q(y) ∨ S(y) I am left with…
A) P(w) ∨ S(w) / B) Q(w) / C) a contradiction (i.e. Q(w) ∧ ¬Q(y)) / D) Q(y)
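Questions 13 and 14 can be checked mechanically: apply the substitution, then cancel one complementary literal pair. A minimal sketch, assuming a (negated, predicate, argument) tuple encoding of literals (this encoding is an illustrative choice, not course notation):

```python
def resolve(clause1, clause2, theta):
    """Apply substitution theta (variable -> term) to both clauses,
    then cancel one complementary literal pair, returning the resolvent."""
    def subst(lit):
        neg, pred, arg = lit
        return (neg, pred, theta.get(arg, arg))
    c1 = {subst(l) for l in clause1}
    c2 = {subst(l) for l in clause2}
    for lit in c1:
        complement = (not lit[0],) + lit[1:]
        if complement in c2:
            return (c1 - {lit}) | (c2 - {complement})
    return None  # no complementary pair: the clauses do not resolve

# P(w) ∨ Q(w) resolved with ¬Q(y) ∨ S(y) under SUBST{y/w}
c1 = {(False, "P", "w"), (False, "Q", "w")}
c2 = {(True, "Q", "y"), (False, "S", "y")}
resolvent = resolve(c1, c2, {"y": "w"})
print(resolvent)  # P(w) ∨ S(w)
```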
15) Consider Figure 1, assume that both F and K are goal nodes. Which algorithms will find the optimal solution? (DFS: depth-first search, BFS: breadth-first search, ID: iterative deepening)
A) BFS only / B) BFS and ID only / C) DFS, BFS, and ID / D) DFS and ID only
16) Consider Figure 2, what would be the next node to be expanded by uniform cost search?
A) node A / B) node B / C) node C / D) node D
17) Consider Figure 3, what would be the next node expanded by A*?
A) node A / B) node B / C) node C / D) node D
18) Consider Figure 3, what is the effective branching factor for this problem?
A) exactly 1 / B) exactly 2 / C) about 1.6 / D) about 0.625
19) Suppose we have two admissible heuristics h(n) and h’(n), for solving the left-handed dyslexic monkey problem. Which of the following is also admissible?
A) max( h(n), h’(n) ) / B) h(n) + h’(n) / C) h(n) × h’(n) / D) h(n) / h’(n) (if h’(n) ≠ 0)
20) If you know for sure that the single goal state for a problem is at depth 19, and the search space has a large branching factor, then the best search algorithm is
A) Depth-first search / B) Breadth-first search / C) Depth-limited search (limit set to 19) / D) Iterative-deepening
21) Consider figure 4. Using 3 random restarts of hill-climbing search, the probability of finding the optimal solution is about…
A) 8% / B) 100% (exactly) / C) 87% / D) 99.999999999999999999999999999999%
22) Consider figure 5. Using 4 random restarts of hill-climbing search, the probability of finding the optimal solution is about…
A) 16% / B) 100% (exactly) / C) 87% / D) 53.5%
23) The worst-case space complexity of A* search is
A) O(d^b) / B) O(b^d) / C) O(1) (i.e. constant space) / D) O(bd)
24) Suppose I run Minimax on a complete search tree, and find my best payoff under the normal assumptions of Minimax is 7. Min however, for some reason plays a sub-optimal game. I can now expect my payoff to be
A) greater than or equal to 7 / B) exactly 7 / C) less than or equal to 7 / D) strictly greater than 7
25) We know that Minimax has a time complexity of O(b^d). In the worst case, using Minimax with alpha-beta pruning has a time complexity of
A) O(b^(d/2)) / B) O(b^d) / C) O(cube_root(b^d)) / D) O(b^(d-1))
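As background for questions 24 through 27: Minimax backs leaf values up the tree, and alpha-beta skips branches that cannot change the root value. A minimal sketch over a nested-list game tree (the list encoding and the example tree are illustrative assumptions, not a figure from this exam):

```python
import math

def alphabeta(node, maximizing, alpha=-math.inf, beta=math.inf):
    """Minimax with alpha-beta pruning; leaves are numbers,
    internal nodes are lists of children."""
    if not isinstance(node, list):
        return node
    if maximizing:
        value = -math.inf
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: Min would never allow this branch
    else:
        value = math.inf
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cutoff: Max already has a better option
    return value

# Max to move at the root; Min at the second level.
print(alphabeta([[3, 5], [2, 9]], True))  # 3
```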
26) Consider Figure 6, what value will be backed up to the root node? (Equivalently, what is Max’s expected payoff?)
A) 1 / B) 2 / C) 3 / D) 17
27) Consider Figure 6, what nodes would be pruned by alpha-beta?
A) none / B) A22 and A23 / C) A33 / D) A22, A23, A32 and A33
28) Consider the machine learning problem in Figure 7 (only four examples of each class are shown for simplicity; imagine that we have 1,000 examples to train our machine learning algorithms). Suppose I learn a decision tree for this problem. The final tree would be…
A) a single test node (a decision stump) / B) A deep, “bushy”, and generally balanced tree / C) A highly unbalanced tree / D) Exactly of depth 2 (i.e. root, two children, and four terminal nodes)
29) Consider the machine learning problem in Figure 8. Suppose I learn a decision tree for this problem. The final tree would be…
A) a single test node (a decision stump) / B) A highly unbalanced tree. / C) A “bushy” and generally balanced tree which is very deep, at least depth 10. / D) Exactly of depth 2 (i.e. root, two children, and four terminal nodes)
30) Consider the machine learning problem in Figure 8. Suppose I learn a linear classifier for this problem. The resulting classifier would have an accuracy rate of about…
A) 0 to 10% / B) 20 to 25% / C) 45 to 60% / D) 90 to 100%
Use the following information to answer the next 3 questions. Assume that each question is independent of the others.
I am working for the government on a secret project. They send me a dataset with a million instances and 10 continuous features. I have no idea what the features are. I quickly test both the nearest-neighbor and decision-tree classifiers; they both have about 95% accuracy on this problem, so I have no reason to choose one over the other…
31) Given the above, suppose that the government tells me that it now knows that features 5 and 6 are useless for the problem at hand. Using this information I can
A) Make nearest neighbor more accurate. / B) Make the decision tree more accurate. / C) Make both algorithms more accurate / D) Do nothing
32) Given the above, suppose that the government tells me that the problem is to classify whether an image on a radar screen is a civilian airplane or an incoming missile. In the latter case the government has 1.2 seconds to try to shoot the missile down. This information would make me lean towards recommending
A) Nearest neighbor / B) Decision tree / C) Does not affect my recommendation / D)
33) Given the above, suppose that the government tells me that one of the goals of the project is to gain some understanding of the data, not just to build an accurate classifier. This information would make me lean towards recommending
A) Nearest neighbor / B) Decision tree / C) Does not affect my recommendation / D)
34) An admissible heuristic…
A) always overestimates the cost to reach the goal / B) never overestimates the cost to reach the goal / C) rarely overestimates the cost to reach the goal / D) returns the exact distance to the goal
35) The fingerprint matching problem can be viewed as a linear search problem. All the algorithms we have seen are really defined for trees (or graphs); nevertheless, we can speed up fingerprint matching by
A) clustering all the fingerprints / B) using heuristics to change the order in which we test the fingerprints / C) resolving all fingerprints / D) converting linear search to binary search
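The idea behind option B of question 35 can be sketched as follows: score every stored fingerprint with a cheap heuristic and test the most promising candidates first, so the expensive matcher usually succeeds early. Everything here (`cheap_signature`, the vector encoding, the toy database) is a hypothetical stand-in, not a real matcher:

```python
def cheap_signature(print_vec):
    # Hypothetical cheap feature; a real system would use something
    # like minutiae counts, not a plain sum.
    return sum(print_vec)

def ordered_search(query, database, expensive_match):
    """Linear search, but ordered so that candidates whose cheap
    signature is closest to the query's are tested first."""
    q_sig = cheap_signature(query)
    ordered = sorted(database, key=lambda p: abs(cheap_signature(p) - q_sig))
    for tested, candidate in enumerate(ordered, start=1):
        if expensive_match(query, candidate):
            return candidate, tested  # found after `tested` comparisons
    return None, len(database)

database = [(1, 1), (9, 9), (5, 5)]
match, tested = ordered_search((5, 5), database, lambda a, b: a == b)
print(match, tested)  # (5, 5) 1
```

The search is still linear in the worst case; the heuristic only changes the expected number of expensive comparisons.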
36) According to Dr. Keogh, the key to solving the “gang-sign” indexing problem is
A) converting an intrinsically linear search problem to a binary search problem. / B) first clustering all the “gang-signs” by geographical or racial origin / C) changing the representation of the image (i.e. a bitmap) to some other representation (e.g. a Boolean vector, or a time series). This idea of changing the representation seems universally useful in AI / D) exploiting the bilateral symmetry of the human body
Figure 1: A search space of depth 4 and branching factor 2. / Figure 2: A search space where operators have different costs.
Figure 3: A search space. For clarity the g(n) values are not shown, but each operator has a cost of 1. / Figure 4: A 2D function (the left peak is 1% higher than the right peak); we would like to find the maximum value in the range (-0.5 < x < 0.5) and (-0.5 < y < 0.5).
Figure 5: A 2D function; we would like to find the maximum value in the visible range. / Figure 6: A game tree. Max (as always) makes the first move.
Figure 7: A learning problem. (The rule used to determine the class: if the sum of the two heights is less than or equal to ten, it is class A; otherwise it is class B.) / Figure 8: A two-class learning problem.