# Lab 04: Predicting Text

## CS371: Cognitive Science, Bryn Mawr College, Fall 2016, Prof. Blank

In this lab, we will explore a network's ability to predict what comes next in a text.

# Network with no predefined inputs

First, let's explore a Network with no predefined inputs.

In these experiments, we will use the idea of a subclass.

We create a DynamicInputsNetwork class based on Network:

In [ ]:
from conx import Network

class DynamicInputsNetwork(Network):
    def initialize_inputs(self):
        # Do some initialization here.
        # Shuffle the inputs, if self.inputs exists and "shuffle" is set:
        # if self.settings["shuffle"]:
        #     self.shuffle_inputs()
        pass

    def inputs_size(self):
        # Return the number of inputs:
        return 4

    def get_inputs(self, i):
        # Return a pattern:
        temp = [[0, 0],
                [0, 1],
                [1, 0],
                [1, 1]]
        return temp[i]

In [ ]:
net = DynamicInputsNetwork(2, 3, 1)

def target_function(inputs):
    # Target is the XOR of the two inputs:
    return [int(bool(inputs[0]) != bool(inputs[1]))]

net.target_function = target_function
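To see exactly what the network is being asked to learn, here is a minimal standalone sketch (plain Python, no conx; `xor_target` is just an illustrative name) of the input-to-target mapping over the four patterns:

```python
def xor_target(inputs):
    # XOR: target is 1 when exactly one input is on
    return [int(bool(inputs[0]) != bool(inputs[1]))]

for pattern in [[0, 0], [0, 1], [1, 0], [1, 1]]:
    print(pattern, "->", xor_target(pattern))
# [0, 0] -> [0]
# [0, 1] -> [1]
# [1, 0] -> [1]
# [1, 1] -> [0]
```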

In [ ]:
net.train(max_training_epochs=5000,
          tolerance=0.3,
          epsilon=0.1,
          shuffle=True)

In [ ]:
net.test()


Run the network a few times, and make sure you understand how it works.

# No predefined inputs, with memory

Now, let's try an SRN (a simple recurrent network, one with memory):

In [ ]:
from conx import SRN

class DynamicInputsSRN(SRN):
    def initialize_inputs(self):
        # Do some initialization here.
        # Shuffle the inputs, if self.inputs exists and "shuffle" is set:
        # if self.settings["shuffle"]:
        #     self.shuffle_inputs()
        pass

    def inputs_size(self):
        # Return the number of inputs:
        return 8

    def get_inputs(self, i):
        # Return a one-element pattern:
        temp = [0, 0,
                0, 1,
                1, 0,
                1, 1]
        return [temp[i]]

In [ ]:
net = DynamicInputsSRN(1, 5, 1)

In [ ]:
last = 0

def target_function(inputs):
    # Target is the XOR of the current input and the previous one:
    global last
    retval = [int(bool(inputs[0]) != bool(last))]
    last = inputs[0]
    return retval

net.target_function = target_function
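Outside of conx, the same temporal target can be traced over the whole input sequence. A sketch (`temporal_xor_targets` is an illustrative helper, not part of conx):

```python
def temporal_xor_targets(sequence):
    # Each target is the XOR of the current input with the previous one;
    # the "previous" value starts at 0.
    last = 0
    targets = []
    for x in sequence:
        targets.append(int(bool(x) != bool(last)))
        last = x
    return targets

print(temporal_xor_targets([0, 0, 0, 1, 1, 0, 1, 1]))
# [0, 0, 0, 1, 0, 1, 1, 0]
```

Notice that the same input (say, 1) requires different outputs depending on what came before it, which is why this task needs memory.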

In [ ]:
net.train(max_training_epochs=5000,
          tolerance=0.3,
          epsilon=0.1,
          shuffle=True)

In [ ]:
net.test()


Test out the above network, and make sure you understand how it works.

# Predicting text

And now, we explore reading through a text, predicting what letter comes next:

In [ ]:
from conx import SRN

text = ("This is a test. Ok. What comes next? Depends? Yes. " +
        "This is also a way of testing prediction. Ok. " +
        "This is fine. Need lots of data. Ok?")

letters = list(set(text))

def encode(letter):
    # One-hot encoding: a vector of zeros with a 1 at the letter's index
    index = letters.index(letter)
    binary = [0] * len(letters)
    binary[index] = 1
    return binary

patterns = {letter: encode(letter) for letter in letters}
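One way to convince yourself the encoding is one-hot is a standalone check on a toy text (plain Python, no conx; `toy_text` and `one_hot` are illustrative names):

```python
toy_text = "abca"
toy_letters = list(set(toy_text))  # 3 distinct letters

def one_hot(letter):
    # A vector of zeros with a single 1 at the letter's index
    vec = [0] * len(toy_letters)
    vec[toy_letters.index(letter)] = 1
    return vec

vec = one_hot("b")
print(len(vec), sum(vec))  # vector length 3, exactly one bit on
```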

In [ ]:
class Predict(SRN):
    def initialize_inputs(self):
        pass

    def inputs_size(self):
        # Return the number of inputs:
        return len(text)

    def get_inputs(self, i):
        # Return the pattern for the letter at position i:
        letter = text[i]
        return patterns[letter]

In [ ]:
net = Predict(len(encode("T")), 5, len(encode("T")))

In [ ]:
def target_function(inputs):
    # Target is the encoding of the next letter (a space after the last one):
    index = net.current_input_index
    if index + 1 < len(text):
        letter = text[index + 1]
    else:
        letter = " "
    return patterns[letter]

net.target_function = target_function
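The target function above simply pairs each letter with the one that follows it, using a space after the final letter. A conx-free sketch of that pairing on a tiny string:

```python
tiny = "Ok."
# Pair each letter with its successor; the last letter pairs with a space
pairs = [(tiny[i], tiny[i + 1] if i + 1 < len(tiny) else " ")
         for i in range(len(tiny))]
print(pairs)  # [('O', 'k'), ('k', '.'), ('.', ' ')]
```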

In [ ]:
net.train(max_training_epochs=1000,
          report_rate=100,
          tolerance=0.3,
          epsilon=0.1,
          shuffle=True)

In [ ]:
net.test()


Your mission is to train a network on your own text. How well does it do?

You can use a trained network to generate text. How?

Use your network to generate some text.

# Reflections

As usual, please reflect deeply on this week's lab. What was challenging, easy, or surprising? Connect these topics to what you already know.