COMP9444
Neural Networks and Deep Learning
Term 2, 2025

Week 1 Tutorial: Perceptrons


Note: if your Tutorial is on Monday or Tuesday, you should watch these three short videos before the Week 1 Tutorial, in order to familiarize yourself with the material:


  1. Introduce yourselves, get to know your fellow students and your tutor.

  2. Perceptron Learning

    1. Construct by hand a Perceptron which correctly classifies the following data; use your knowledge of plane geometry to choose appropriate values for the weights w0, w1 and w2.

      Training Example   x1   x2   Class
      a.                  0    1    –1
      b.                  2    0    –1
      c.                  1    1    +1

    2. Demonstrate the Perceptron Learning Algorithm on the above data, using a learning rate of 1.0 and initial weight values of
      w0 = –1.5
      w1 =   0
      w2 =   2
      In your answer, you should clearly indicate the new weight values at the end of each training step. The first three steps are shown here:

      Iteration   w0    w1   w2   Training Example   x1   x2   Class   s = w0 + w1x1 + w2x2   Action
      1          –1.5    0    2   a.                  0    1    –1     +0.5                   Subtract
      2          –2.5    0    1   b.                  2    0    –1     –2.5                   None
      3          –2.5    0    1   c.                  1    1    +1     –1.5                   Add

      Continue the table until all items are correctly classified.
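    Once you have completed the table by hand, you can check your answer against a short simulation. This is a minimal sketch of the Perceptron Learning Algorithm on the data above, using the given learning rate and initial weights (the print format is our own, chosen to mirror the table):

    ```python
    # Perceptron Learning on the tutorial data: (name, (x1, x2), class)
    data = [("a", (0, 1), -1),
            ("b", (2, 0), -1),
            ("c", (1, 1), +1)]

    w0, w1, w2 = -1.5, 0.0, 2.0   # initial weights (w0 is the bias)
    eta = 1.0                      # learning rate
    iteration = 0

    converged = False
    while not converged:
        converged = True           # stays True only if a full pass has no updates
        for name, (x1, x2), cls in data:
            iteration += 1
            s = w0 + w1 * x1 + w2 * x2
            correct = (s >= 0) == (cls == +1)
            action = "None" if correct else ("Add" if cls == +1 else "Subtract")
            print(f"{iteration:2d}  w=({w0:+.1f}, {w1:+.1f}, {w2:+.1f})  "
                  f"{name}  s={s:+.1f}  {action}")
            if not correct:
                converged = False
                # add the (bias-augmented) input for a false negative,
                # subtract it for a false positive
                w0 += eta * cls
                w1 += eta * cls * x1
                w2 += eta * cls * x2
    ```

    Each printed line shows the weights at the start of that training step, matching the layout of the table above.
    
    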

  3. Computing any Logical Function with a 2-layer Network

    Recall that any logical function can be converted into Conjunctive Normal Form (CNF), which means a conjunction of clauses, where each clause is a disjunction of literals (variables or their negations). This is an example of an expression in CNF:

    (A ∨ B) ∧ (¬ B ∨ C ∨ ¬ D) ∧ (D ∨ ¬ E)
    Assuming False=0 and True=1, explain how each of the following could be constructed. You should include the bias for each node, as well as the values of all the weights (input-to-output or input-to-hidden and hidden-to-output, as appropriate).

    1. Perceptron to compute the OR function of m inputs,
    2. Perceptron to compute the AND function of n inputs,
    3. Two-layer Neural Network to compute the function
      (A ∨ B) ∧ (¬ B ∨ C ∨ ¬ D) ∧ (D ∨ ¬ E).

      With reference to this example, explain how a two-layer neural network could be constructed to compute any (given) logical expression, assuming it is written in Conjunctive Normal Form.

      Hint: first consider how to construct a Perceptron to compute the OR function of m inputs, with k of the m inputs negated.
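      To check a construction of this kind numerically, here is a sketch using one standard choice of weights (ours, not necessarily the intended solution): OR of m inputs uses weight 1 on each input and bias –0.5; AND of n inputs uses weight 1 on each input and bias –(n – 0.5); and, following the hint, negating an input flips its weight to –1 and adds 1 to the bias:

      ```python
      # Verify threshold-unit constructions for OR, AND, and OR-with-negations
      # exhaustively over all 0/1 input combinations.
      from itertools import product

      def perceptron(weights, bias, inputs):
          """Threshold unit: output 1 if the weighted sum plus bias is >= 0."""
          return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias >= 0 else 0

      m = 3
      or_w, or_b = [1] * m, -0.5            # fires if at least one input is 1
      and_w, and_b = [1] * m, -(m - 0.5)    # fires only if all m inputs are 1

      for xs in product([0, 1], repeat=m):
          assert perceptron(or_w, or_b, xs) == (1 if any(xs) else 0)
          assert perceptron(and_w, and_b, xs) == (1 if all(xs) else 0)

      # OR with k of the inputs negated: weight -1 on each negated input,
      # bias -0.5 + k.  Example clause: (not B) or C or (not D), so k = 2.
      neg_w, neg_b = [-1, 1, -1], -0.5 + 2
      for b, c, d in product([0, 1], repeat=3):
          want = 1 if ((1 - b) or c or (1 - d)) else 0
          assert perceptron(neg_w, neg_b, (b, c, d)) == want
      ```

      With one such clause unit per disjunction feeding an AND unit at the output, the same recipe extends to any CNF expression, which is the idea the question is after.
      
      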

  4. XOR Network

    Construct by hand a Neural Network (or Multi-Layer Perceptron) that computes the XOR function of two inputs. Make sure the connections, weights and biases of your network are clearly visible.

    Challenge: Can you construct a Neural Network to compute XOR which has only one hidden unit, but also includes shortcut connections from the two inputs directly to the (one) output?
    Hint: start with a network that computes the inclusive OR, and then try to think of how it could be modified.
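    After drawing your network, you can sanity-check it in a few lines. The weights below are one possible solution (ours, not the official answer): one hidden unit computes OR of the inputs, the other computes AND, and the output fires when OR is true but AND is not:

    ```python
    # One hand-built two-layer network for XOR, using threshold units.
    def step(s):
        return 1 if s >= 0 else 0

    def xor_net(x1, x2):
        h1 = step(x1 + x2 - 0.5)    # hidden unit 1: OR of the inputs
        h2 = step(x1 + x2 - 1.5)    # hidden unit 2: AND of the inputs
        return step(h1 - h2 - 0.5)  # output: h1 AND (NOT h2)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, xor_net(x1, x2))
    ```

    The challenge version (one hidden unit plus input-to-output shortcut connections) is left for you to work out; the same `step`-based check will verify it.
    
    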

  5. Checkpoint #1: Python Refresher and Tensor Basics

    Checkpoints are designed to make sure you are keeping up with the weekly tasks and consistently working to learn the technical material and skills required for successful completion of the assignment and the group project. Each checkpoint is worth 1% of your final grade. You will need to show your work to your tutor during the tutorial or mentoring session to get the weekly checkpoint mark.
    In the last 45 minutes of the Week 1 Tutorial, you will work on refreshing your Python and learning the basics of NumPy arrays and PyTorch tensors, by following these links:
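    As a warm-up, here are a few NumPy operations worth trying before the tutorial; PyTorch tensors behave very similarly (`torch.tensor`, `.shape`, `.T`, `@`, and broadcasting all carry over directly), so practising with one prepares you for the other:

    ```python
    # NumPy array basics: creation, shape, transpose, matrix product, broadcasting.
    import numpy as np

    a = np.array([[1., 2.], [3., 4.]])  # a 2x2 array
    print(a.shape)                       # (2, 2)
    print(a.T)                           # transpose
    print(a @ a)                         # matrix multiplication
    print(a * 2 + 1)                     # elementwise ops, scalars broadcast
    v = np.array([10., 20.])
    print(a + v)                         # v is broadcast across the rows of a
    ```
    
    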

  6. Implications of Deep Learning

    What potential benefits and dangers might Deep Learning pose for education, entertainment, the economy, and society in general?