Functions | Variables
make_training Namespace Reference

Functions

def train_model (model, dataset, validation_ratio=0.2, batch_size=64)
 

Variables

string dataset = "train.h5"
 
 hf = h5py.File(dataset, 'r')
 
 n1 = hf.get('data')
 
 total_count = n1.shape[0]
 
 model = models.CVN(5)
 
float learning_rate = 0.02
 
float decay_rate = 0.1
 
float momentum = 0.9
 
 opt = SGD(lr=learning_rate, momentum=momentum, decay=decay_rate, nesterov=False)
 
 loss
 
 optimizer
 
 metrics
 

Function Documentation

def make_training.train_model(model, dataset, validation_ratio=0.2, batch_size=64)

Definition at line 30 of file make_training.py.

References makeTrainCVSamples.int, open(), print, generator.produce_seq(), and PandAna.Demos.demo1.range.

30 def train_model(model, dataset, validation_ratio=0.2, batch_size=64):
31     with h5py.File(dataset, "r") as data:
32 
33         total_ids = range(0, total_count)
34         total_ids = np.random.permutation(total_ids)
35         train_total_ids = total_ids[0:int((1 - validation_ratio) * total_count)]
36         test_total_ids = total_ids[int((1 - validation_ratio) * total_count):]
37 
38         training_sequence_generator = generator.produce_seq(batch_size=batch_size,
39                                                             data=data, sample_ids=train_total_ids)
40         validation_sequence_generator = generator.produce_seq(batch_size=batch_size,
41                                                               data=data, sample_ids=test_total_ids)
42 
43         history = model.fit_generator(generator=training_sequence_generator,
44                                       validation_data=validation_sequence_generator,
45                                       samples_per_epoch=len(train_total_ids),
46                                       nb_val_samples=len(test_total_ids),
47                                       nb_epoch=1,
48                                       max_q_size=1,
49                                       verbose=1,
50                                       class_weight=None,
51                                       nb_worker=1)
52 
53         directory = 'logs/'
54         if not os.path.exists(directory):
55             os.makedirs(directory)
56 
57         # Save the history/dictionary to plot it later
58         with open(directory + 'history.pickle', 'wb') as handle:
59             pickle.dump(history.history, handle, protocol=2)
60         print("The training/testing logs saved")
61 
62         # serialize model to JSON
63         model_json = model.to_json()
64         with open(directory + "model.json", "w") as json_file:
65             json_file.write(model_json)
66         print("The arch saved")
67 
68         # serialize weights to HDF5
69         model.save_weights(directory + "model_weights.h5")
70         model.save(directory + 'model_4recover.h5')
71         print("The weights and model saved")
72 
73 
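The shuffle-and-slice split at lines 33-36 can be sketched in isolation. The helper name `split_ids` and the fixed seed below are illustrative only, not part of make_training.py:

```python
import numpy as np

def split_ids(total_count, validation_ratio=0.2, seed=0):
    """Shuffle sample indices, then slice off the last
    validation_ratio fraction as the validation set."""
    rng = np.random.default_rng(seed)
    ids = rng.permutation(total_count)  # random order, no repeats
    cut = int((1 - validation_ratio) * total_count)
    return ids[:cut], ids[cut:]

train_ids, test_ids = split_ids(100, validation_ratio=0.2)
```

Because both slices come from a single permutation, every sample index lands in exactly one of the two sets.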
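The history-saving step at lines 57-59 writes `history.history` (a plain dict mapping metric names to per-epoch value lists) with pickle protocol 2 so it can be reloaded later for plotting. A minimal round-trip sketch, with made-up placeholder metric values:

```python
import os
import pickle
import tempfile

# Stand-in for Keras' history.history: metric name -> per-epoch values.
history = {"loss": [1.2, 0.8], "val_loss": [1.1, 0.9]}

directory = tempfile.mkdtemp()
path = os.path.join(directory, "history.pickle")

# Save the history dict (protocol 2, matching the source listing).
with open(path, "wb") as handle:
    pickle.dump(history, handle, protocol=2)

# Reload it later, e.g. from a plotting script.
with open(path, "rb") as handle:
    restored = pickle.load(handle)
```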

Variable Documentation

string make_training.dataset = "train.h5"

float make_training.decay_rate = 0.1

Definition at line 77 of file make_training.py.

make_training.hf = h5py.File(dataset, 'r')

Definition at line 25 of file make_training.py.

float make_training.learning_rate = 0.02

Definition at line 76 of file make_training.py.

make_training.loss

Definition at line 80 of file make_training.py.

make_training.metrics

Definition at line 80 of file make_training.py.

make_training.model = models.CVN(5)

Definition at line 74 of file make_training.py.

float make_training.momentum = 0.9

make_training.n1 = hf.get('data')

make_training.opt = SGD(lr=learning_rate, momentum=momentum, decay=decay_rate, nesterov=False)

Definition at line 79 of file make_training.py.
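With `decay` set, the classic Keras SGD shrinks the effective learning rate each update as lr / (1 + decay * iterations). This is an assumption about the (legacy) Keras version in use; newer Keras releases replace the `decay` argument with learning-rate schedule objects. A minimal sketch of that schedule:

```python
def decayed_lr(lr, decay, iteration):
    """Classic Keras time-based decay: lr / (1 + decay * iterations).
    (Assumed schedule; newer Keras versions drop the `decay` argument.)"""
    return lr / (1.0 + decay * iteration)

# With the module's settings (learning_rate=0.02, decay_rate=0.1):
start = decayed_lr(0.02, 0.1, 0)    # 0.02 at the first update
later = decayed_lr(0.02, 0.1, 10)   # halved after 10 updates: 0.01
```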

make_training.optimizer

Definition at line 80 of file make_training.py.

Referenced by containment_optimization(), and fiducial_optimization().

make_training.total_count = n1.shape[0]

Definition at line 27 of file make_training.py.