# deepchem.models.tensorgraph package¶

## deepchem.models.tensorgraph.IRV module¶

class deepchem.models.tensorgraph.IRV.IRVLayer(n_tasks, K, **kwargs)[source]

Core layer of IRV classifier, architecture described in: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2750043/

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (dict) – maps existing layers to the layers that should replace them (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this dict, so on exit it contains a complete record of all layers that were copied, along with a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers, so the newly created layers share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node.
- summary_description (object, optional) – optional summary_pb2.SummaryDescription().
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy.
class deepchem.models.tensorgraph.IRV.IRVRegularize(IRVLayer, penalty=0.0, **kwargs)[source]

Extracts the trainable weights in an IRVLayer and returns their L2-norm. No in_layers is required, but this layer should be built after the target IRVLayer.
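As an illustration of what such a regularization term computes, here is a hedged NumPy sketch (the names and scaling are illustrative, not the DeepChem internals): the penalty is the sum of squared entries of each trainable weight array, scaled by a coefficient.

```python
import numpy as np

# Illustrative sketch, not the DeepChem implementation: an L2 weight
# penalty sums the squared entries of each weight array and scales the
# total by the `penalty` coefficient.
def l2_penalty(weights, penalty=0.01):
    return penalty * sum(float(np.sum(w * w)) for w in weights)
```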

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (dict) – maps existing layers to the layers that should replace them (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this dict, so on exit it contains a complete record of all layers that were copied, along with a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers, so the newly created layers share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node.
- summary_description (object, optional) – optional summary_pb2.SummaryDescription().
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy.
class deepchem.models.tensorgraph.IRV.Slice(slice_num, axis=1, **kwargs)[source]

Chooses a slice of the input along a given axis. For example, if the input x has two dimensions and axis=1, the output is f(x) = x[:, slice_num:slice_num+1].
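The NumPy equivalent of this operation (illustrative only; the layer itself operates on TF tensors) keeps the sliced dimension:

```python
import numpy as np

# NumPy equivalent of the layer's output for slice_num=2 on axis 1.
x = np.arange(12).reshape(3, 4)
sliced = x[:, 2:3]   # shape (3, 1): the sliced axis is kept, not dropped
```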

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (dict) – maps existing layers to the layers that should replace them (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this dict, so on exit it contains a complete record of all layers that were copied, along with a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers, so the newly created layers share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node.
- summary_description (object, optional) – optional summary_pb2.SummaryDescription().
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy.
class deepchem.models.tensorgraph.IRV.TensorflowMultiTaskIRVClassifier(n_tasks, K=10, penalty=0.0, mode='classification', **kwargs)[source]
add_output(layer)
build()
build_graph()[source]

Constructs the graph architecture of IRV as described in:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2750043/

create_submodel(layers=None, loss=None, optimizer=None)

Create an alternate objective for training one piece of a TensorGraph.

A TensorGraph consists of a set of layers, and specifies a loss function and optimizer to use for training those layers. Usually this is sufficient, but there are cases where you want to train different parts of a model separately. For example, a GAN consists of a generator and a discriminator. They are trained separately, and they use different loss functions.

A submodel defines an alternate objective to use in cases like this. It may optionally specify any of the following: a subset of layers in the model to train; a different loss function; and a different optimizer to use. This method creates a submodel, which you can then pass to fit() to use it for training.

Parameters:

- layers (list) – the list of layers to train. If None, all layers in the model will be trained.
- loss (Layer) – the loss function to optimize. If None, the model’s main loss function will be used.
- optimizer (Optimizer) – the optimizer to use for training. If None, the model’s main optimizer will be used.

Returns: the newly created submodel, which can be passed to any of the fitting methods.
default_generator(dataset, epochs=1, predict=False, deterministic=True, pad_batches=True)[source]

TensorGraph-style implementation of the default batch generator.

evaluate(dataset, metrics, transformers=[], per_task_metrics=False)

Evaluates the performance of this model on specified dataset.

Parameters:

- dataset (dc.data.Dataset) – Dataset object.
- metrics (deepchem.metrics.Metric) – evaluation metric.
- transformers (list) – list of deepchem.transformers.Transformer.
- per_task_metrics (bool) – if True, also return per-task scores, mapping tasks to scores under each metric.

Returns: dict
evaluate_generator(feed_dict_generator, metrics, transformers=[], labels=None, outputs=None, weights=[], per_task_metrics=False)
fit(dataset, nb_epoch=10, max_checkpoints_to_keep=5, checkpoint_interval=1000, deterministic=False, restore=False, submodel=None, **kwargs)

Train this model on a dataset.

Parameters:

- dataset (Dataset) – the Dataset to train on.
- nb_epoch (int) – the number of epochs to train for.
- max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
- checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing.
- deterministic (bool) – if True, the samples are processed in order. If False, a different random order is used for each epoch.
- restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch.
- submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().
fit_generator(feed_dict_generator, max_checkpoints_to_keep=5, checkpoint_interval=1000, restore=False, submodel=None)

Train this model on data from a generator.

Parameters:

- feed_dict_generator (generator) – this should generate batches, each represented as a dict that maps Layers to values.
- max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
- checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing.
- restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch.
- submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().

Returns: the average loss over the most recent checkpoint interval.
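A minimal sketch of such a batch generator (hedged: the string keys here stand in for the Layer objects, e.g. Features/Labels, that a real TensorGraph generator would use as dict keys):

```python
import numpy as np

# Sketch of a feed_dict_generator: yields one dict per batch. In real
# TensorGraph code the keys would be Layer objects, not strings.
def batch_generator(X, y, batch_size):
    for start in range(0, len(X), batch_size):
        yield {"features": X[start:start + batch_size],
               "labels": y[start:start + batch_size]}
```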
fit_on_batch(X, y, w, submodel=None)
get_checkpoints()

Get a list of all available checkpoint files.

get_global_step()
get_layer_variables(layer)

Get the list of trainable variables in a layer of the graph.

get_model_filename(model_dir)

Given model directory, obtain filename for the model itself.

get_num_tasks()
get_params(deep=True)

Get parameters for this estimator.

Parameters: deep (boolean, optional) – if True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns: params – parameter names mapped to their values (mapping of string to any).
get_params_filename(model_dir)

Given model directory, obtain filename for the model parameters.

get_pickling_errors(obj, seen=None)
get_pre_q_input(input_layer)
get_task_type()

Currently models can only be classifiers or regressors.

load_from_dir(model_dir, restore=True)
predict(dataset, transformers=[], outputs=None)[source]
predict_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters:

- X (ndarray) – the input data, as a Numpy array.
- transformers (list) – list of dc.trans.Transformers.

Returns: a Numpy array of predictions.
predict_on_generator(generator, transformers=[], outputs=None)
Parameters:

- generator (Generator) – generator that constructs feed dictionaries for TensorGraph.
- transformers (list) – list of dc.trans.Transformers.
- outputs (object) – if None, assumes outputs = self.outputs. If a Layer/Tensor, evaluates and returns it as a single ndarray. If a list of Layers/Tensors, returns a list of ndarrays.

Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks).
predict_proba(dataset, transformers=[], outputs=None)[source]
predict_proba_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters:

- X (ndarray) – the input data, as a Numpy array.
- transformers (list) – list of dc.trans.Transformers.

Returns: a Numpy array of predictions.
predict_proba_on_generator(generator, transformers=[], outputs=None)
Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks).
reload()

Reload trained model from disk.

restore(checkpoint=None)

Reload the values of all variables from a checkpoint file.

Parameters: checkpoint (str) – the path to the checkpoint file to load. If this is None, the most recent checkpoint will be chosen automatically. Call get_checkpoints() to get a list of all available checkpoints.
save()
save_checkpoint(max_checkpoints_to_keep=5)

Save a checkpoint to disk.

Usually you do not need to call this method, since fit() saves checkpoints automatically. If you have disabled automatic checkpointing during fitting, this can be called to manually write checkpoints.

Parameters: max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
set_loss(layer)
set_optimizer(optimizer)

Set the optimizer to use for fitting.

set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns: self
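The `<component>__<parameter>` convention above can be sketched in plain Python (the parameter names here are hypothetical examples, not DeepChem attributes):

```python
# Sketch of scikit-learn style nested parameter name resolution:
# 'component__parameter' targets `parameter` on the sub-object `component`.
def split_nested_param(key):
    component, sep, parameter = key.partition("__")
    if not sep:
        return None, key          # plain, non-nested parameter
    return component, parameter
```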
topsort()

## deepchem.models.tensorgraph.activations module¶

Activations for models.

deepchem.models.tensorgraph.activations.elu(x, alpha=1.0)[source]
deepchem.models.tensorgraph.activations.get(identifier)[source]
deepchem.models.tensorgraph.activations.get_from_module(identifier, module_params, module_name, instantiate=False, kwargs=None)[source]

Retrieves a class or function member of a module.

Parameters:

- identifier – the object to retrieve. It may be specified by name (as a string) or by dict; in any other case, identifier itself is returned without any changes.
- module_params – the members of a module (e.g. the output of globals()).
- module_name (string) – the name of the target module; only used to format error messages.
- instantiate – whether to instantiate the returned object (if it is a class).
- kwargs – a dictionary of keyword arguments to pass to the class constructor if instantiate is True.

Returns: the target object.

Raises: ValueError – if the identifier cannot be found.
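The lookup logic can be sketched as follows (a minimal sketch assuming the documented signature; not DeepChem's exact implementation):

```python
# Sketch: retrieve a member by string name from a dict such as globals();
# non-string identifiers pass through unchanged.
def get_from_module(identifier, module_params, module_name="module",
                    instantiate=False, kwargs=None):
    if isinstance(identifier, str):
        if identifier not in module_params:
            raise ValueError("Invalid %s: %s" % (module_name, identifier))
        obj = module_params[identifier]
        if instantiate and isinstance(obj, type):
            return obj(**(kwargs or {}))   # instantiate if it is a class
        return obj
    return identifier
```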
deepchem.models.tensorgraph.activations.hard_sigmoid(x)[source]

The hard sigmoidal activation function

Piecewise-linear approximation to sigmoid.

Parameters: x (tf.Tensor) – Input tensor
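A common form of this piecewise-linear approximation is clip(0.2*x + 0.5, 0, 1) (the Keras convention; the exact slope and offset used here are an assumption):

```python
import numpy as np

# Piecewise-linear sigmoid approximation (Keras-style slope/offset assumed).
def hard_sigmoid(x):
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)
```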
deepchem.models.tensorgraph.activations.linear(x)[source]

A linear activation function.

Note that a linear activation function is simply the identity.

Parameters: x (tf.Tensor) – Input tensor
deepchem.models.tensorgraph.activations.relu(x, alpha=0.0, max_value=None)[source]

The rectified linear activation function

Wrapper around model_ops.relu.

Parameters: x (tf.Tensor) – Input tensor
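The extra arguments give a leaky slope below zero and an optional cap; a NumPy sketch mirroring the documented signature (implementation assumed, not model_ops.relu itself):

```python
import numpy as np

# Sketch of a ReLU with leaky slope `alpha` and optional cap `max_value`.
def relu(x, alpha=0.0, max_value=None):
    x = np.asarray(x, dtype=float)
    out = np.where(x > 0, x, alpha * x)   # leaky below zero
    if max_value is not None:
        out = np.minimum(out, max_value)  # cap large activations
    return out
```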
deepchem.models.tensorgraph.activations.selu(x)[source]
deepchem.models.tensorgraph.activations.sigmoid(x)[source]

The sigmoidal activation function

Wrapper around tf.nn.sigmoid.

Parameters: x (tf.Tensor) – Input tensor
deepchem.models.tensorgraph.activations.softmax(x)[source]
deepchem.models.tensorgraph.activations.softplus(x)[source]
deepchem.models.tensorgraph.activations.softsign(x)[source]
deepchem.models.tensorgraph.activations.tanh(x)[source]

The hyperbolic tangent activation function

Wrapper around tf.nn.tanh.

Parameters: x (tf.Tensor) – Input tensor

## deepchem.models.tensorgraph.fcnet module¶

TensorFlow implementation of fully connected networks.

class deepchem.models.tensorgraph.fcnet.MultiTaskClassifier(n_tasks, n_features, layer_sizes=[1000], weight_init_stddevs=0.02, bias_init_consts=1.0, weight_decay_penalty=0.0, weight_decay_penalty_type='l2', dropouts=0.5, activation_fns=<function relu>, n_classes=2, **kwargs)[source]
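As a shape-level illustration of the architecture the constructor arguments describe (a hedged NumPy sketch with random placeholder weights, not DeepChem's trained implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: stacked dense+ReLU hidden layers, then a (n_tasks, n_classes)
# logit head. Weights are random stand-ins; DeepChem trains the real ones.
def multitask_forward(X, n_tasks, layer_sizes=(1000,), n_classes=2):
    h = X
    for size in layer_sizes:
        W = rng.standard_normal((h.shape[1], size)) * 0.02  # ~weight_init_stddevs
        h = np.maximum(h @ W + 1.0, 0.0)                    # ~bias_init_consts, ReLU
    W_out = rng.standard_normal((h.shape[1], n_tasks * n_classes)) * 0.02
    return (h @ W_out).reshape(-1, n_tasks, n_classes)
```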
add_output(layer)
build()
create_submodel(layers=None, loss=None, optimizer=None)

Create an alternate objective for training one piece of a TensorGraph.

A TensorGraph consists of a set of layers, and specifies a loss function and optimizer to use for training those layers. Usually this is sufficient, but there are cases where you want to train different parts of a model separately. For example, a GAN consists of a generator and a discriminator. They are trained separately, and they use different loss functions.

A submodel defines an alternate objective to use in cases like this. It may optionally specify any of the following: a subset of layers in the model to train; a different loss function; and a different optimizer to use. This method creates a submodel, which you can then pass to fit() to use it for training.

Parameters:

- layers (list) – the list of layers to train. If None, all layers in the model will be trained.
- loss (Layer) – the loss function to optimize. If None, the model’s main loss function will be used.
- optimizer (Optimizer) – the optimizer to use for training. If None, the model’s main optimizer will be used.

Returns: the newly created submodel, which can be passed to any of the fitting methods.
default_generator(dataset, epochs=1, predict=False, deterministic=True, pad_batches=True)[source]
evaluate(dataset, metrics, transformers=[], per_task_metrics=False)

Evaluates the performance of this model on specified dataset.

Parameters:

- dataset (dc.data.Dataset) – Dataset object.
- metrics (deepchem.metrics.Metric) – evaluation metric.
- transformers (list) – list of deepchem.transformers.Transformer.
- per_task_metrics (bool) – if True, also return per-task scores, mapping tasks to scores under each metric.

Returns: dict
evaluate_generator(feed_dict_generator, metrics, transformers=[], labels=None, outputs=None, weights=[], per_task_metrics=False)
fit(dataset, nb_epoch=10, max_checkpoints_to_keep=5, checkpoint_interval=1000, deterministic=False, restore=False, submodel=None, **kwargs)

Train this model on a dataset.

Parameters:

- dataset (Dataset) – the Dataset to train on.
- nb_epoch (int) – the number of epochs to train for.
- max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
- checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing.
- deterministic (bool) – if True, the samples are processed in order. If False, a different random order is used for each epoch.
- restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch.
- submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().
fit_generator(feed_dict_generator, max_checkpoints_to_keep=5, checkpoint_interval=1000, restore=False, submodel=None)

Train this model on data from a generator.

Parameters:

- feed_dict_generator (generator) – this should generate batches, each represented as a dict that maps Layers to values.
- max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
- checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing.
- restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch.
- submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().

Returns: the average loss over the most recent checkpoint interval.
fit_on_batch(X, y, w, submodel=None)
get_checkpoints()

Get a list of all available checkpoint files.

get_global_step()
get_layer_variables(layer)

Get the list of trainable variables in a layer of the graph.

get_model_filename(model_dir)

Given model directory, obtain filename for the model itself.

get_num_tasks()
get_params(deep=True)

Get parameters for this estimator.

Parameters: deep (boolean, optional) – if True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns: params – parameter names mapped to their values (mapping of string to any).
get_params_filename(model_dir)

Given model directory, obtain filename for the model parameters.

get_pickling_errors(obj, seen=None)
get_pre_q_input(input_layer)
get_task_type()

Currently models can only be classifiers or regressors.

load_from_dir(model_dir, restore=True)
predict(dataset, transformers=[], outputs=None)[source]

Uses self to make predictions on provided Dataset object.

Parameters:

- dataset (dc.data.Dataset) – Dataset to make predictions on.
- transformers (list) – list of dc.trans.Transformers.
- outputs (object) – if None, assumes outputs = self.outputs[0] (single output). If a Layer/Tensor, evaluates and returns it as a single ndarray. If a list of Layers/Tensors, returns a list of ndarrays.

Returns: y_pred – numpy ndarray or list of numpy ndarrays.
predict_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters:

- X (ndarray) – the input data, as a Numpy array.
- transformers (list) – list of dc.trans.Transformers.

Returns: a Numpy array of predictions.
predict_on_generator(generator, transformers=[], outputs=None)
Parameters:

- generator (Generator) – generator that constructs feed dictionaries for TensorGraph.
- transformers (list) – list of dc.trans.Transformers.
- outputs (object) – if None, assumes outputs = self.outputs. If a Layer/Tensor, evaluates and returns it as a single ndarray. If a list of Layers/Tensors, returns a list of ndarrays.

Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks).
predict_proba(dataset, transformers=[], outputs=None)[source]
predict_proba_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters:

- X (ndarray) – the input data, as a Numpy array.
- transformers (list) – list of dc.trans.Transformers.

Returns: a Numpy array of predictions.
predict_proba_on_generator(generator, transformers=[], outputs=None)
Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks).
reload()

Reload trained model from disk.

restore(checkpoint=None)

Reload the values of all variables from a checkpoint file.

Parameters: checkpoint (str) – the path to the checkpoint file to load. If this is None, the most recent checkpoint will be chosen automatically. Call get_checkpoints() to get a list of all available checkpoints.
save()
save_checkpoint(max_checkpoints_to_keep=5)

Save a checkpoint to disk.

Usually you do not need to call this method, since fit() saves checkpoints automatically. If you have disabled automatic checkpointing during fitting, this can be called to manually write checkpoints.

Parameters: max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
set_loss(layer)
set_optimizer(optimizer)

Set the optimizer to use for fitting.

set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns: self
topsort()
class deepchem.models.tensorgraph.fcnet.MultiTaskFitTransformRegressor(n_tasks, n_features, fit_transformers=[], n_evals=1, batch_size=50, **kwargs)[source]

Implements a MultiTaskRegressor that performs on-the-fly transformation during fit/predict.

Example:

>>> n_samples = 10
>>> n_features = 3
>>> n_tasks = 1
>>> ids = np.arange(n_samples)
>>> X = np.random.rand(n_samples, n_features, n_features)
>>> y = np.zeros((n_samples, n_tasks))
>>> w = np.ones((n_samples, n_tasks))
>>> dataset = dc.data.NumpyDataset(X, y, w, ids)
>>> fit_transformers = [dc.trans.CoulombFitTransformer(dataset)]
>>> model = dc.models.MultiTaskFitTransformRegressor(n_tasks, [n_features, n_features],
...     dropouts=[0.], learning_rate=0.003, weight_init_stddevs=[np.sqrt(6)/np.sqrt(1000)],
...     batch_size=n_samples, fit_transformers=fit_transformers, n_evals=1)
n_features after fit_transform: 12

add_output(layer)
build()
create_submodel(layers=None, loss=None, optimizer=None)

Create an alternate objective for training one piece of a TensorGraph.

A TensorGraph consists of a set of layers, and specifies a loss function and optimizer to use for training those layers. Usually this is sufficient, but there are cases where you want to train different parts of a model separately. For example, a GAN consists of a generator and a discriminator. They are trained separately, and they use different loss functions.

A submodel defines an alternate objective to use in cases like this. It may optionally specify any of the following: a subset of layers in the model to train; a different loss function; and a different optimizer to use. This method creates a submodel, which you can then pass to fit() to use it for training.

Parameters:

- layers (list) – the list of layers to train. If None, all layers in the model will be trained.
- loss (Layer) – the loss function to optimize. If None, the model’s main loss function will be used.
- optimizer (Optimizer) – the optimizer to use for training. If None, the model’s main optimizer will be used.

Returns: the newly created submodel, which can be passed to any of the fitting methods.
default_generator(dataset, epochs=1, predict=False, deterministic=True, pad_batches=True)[source]
evaluate(dataset, metrics, transformers=[], per_task_metrics=False)

Evaluates the performance of this model on specified dataset.

Parameters:

- dataset (dc.data.Dataset) – Dataset object.
- metrics (deepchem.metrics.Metric) – evaluation metric.
- transformers (list) – list of deepchem.transformers.Transformer.
- per_task_metrics (bool) – if True, also return per-task scores, mapping tasks to scores under each metric.

Returns: dict
evaluate_generator(feed_dict_generator, metrics, transformers=[], labels=None, outputs=None, weights=[], per_task_metrics=False)
fit(dataset, nb_epoch=10, max_checkpoints_to_keep=5, checkpoint_interval=1000, deterministic=False, restore=False, submodel=None, **kwargs)

Train this model on a dataset.

Parameters: dataset (Dataset) – the Dataset to train on nb_epoch (int) – the number of epochs to train for max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded. checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing. deterministic (bool) – if True, the samples are processed in order. If False, a different random order is used for each epoch. restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch. submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().
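The interaction of `deterministic` and `checkpoint_interval` described above can be sketched in plain Python (this mirrors the documented semantics, not the DeepChem implementation):

```python
import random

def training_order(n_samples, nb_epoch, deterministic, seed=0):
    """Yield sample indices for every training step across all epochs."""
    rng = random.Random(seed)
    for _ in range(nb_epoch):
        order = list(range(n_samples))
        if not deterministic:
            rng.shuffle(order)  # a fresh random order each epoch
        yield from order

# A checkpoint fires every `checkpoint_interval` steps; 0 disables it.
checkpoint_interval = 4
checkpoints = []
for step, idx in enumerate(training_order(6, 2, deterministic=True), start=1):
    if checkpoint_interval and step % checkpoint_interval == 0:
        checkpoints.append(step)

print(checkpoints)  # checkpoints written at steps 4, 8 and 12
```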
fit_generator(feed_dict_generator, max_checkpoints_to_keep=5, checkpoint_interval=1000, restore=False, submodel=None)

Train this model on data from a generator.

Parameters: feed_dict_generator (generator) – this should generate batches, each represented as a dict that maps Layers to values. max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded. checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing. restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch. submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel(). Returns: the average loss over the most recent checkpoint interval.
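A feed_dict generator yields one dict per batch. In real DeepChem code the keys are Layer objects (e.g. Feature and Label instances); strings stand in for them here only to keep the sketch self-contained:

```python
# Minimal sketch of a feed_dict generator: each yielded item maps
# input names to that batch's values.

def batch_generator(X, y, batch_size):
    for start in range(0, len(X), batch_size):
        yield {
            "features": X[start:start + batch_size],
            "labels": y[start:start + batch_size],
        }

X = [[float(i)] for i in range(10)]
y = [i % 2 for i in range(10)]
batches = list(batch_generator(X, y, batch_size=4))
print(len(batches), [len(b["features"]) for b in batches])  # 3 [4, 4, 2]
```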
fit_on_batch(X, y, w, submodel=None)
get_checkpoints()

Get a list of all available checkpoint files.

get_global_step()
get_layer_variables(layer)

Get the list of trainable variables in a layer of the graph.

get_model_filename(model_dir)

Given model directory, obtain filename for the model itself.

get_num_tasks()
get_params(deep=True)

Get parameters for this estimator.

Parameters: deep (boolean, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators. Returns: params (mapping of string to any) – Parameter names mapped to their values.
get_params_filename(model_dir)

Given model directory, obtain filename for the model parameters.

get_pickling_errors(obj, seen=None)
get_pre_q_input(input_layer)
get_task_type()

Currently models can only be classifiers or regressors.

load_from_dir(model_dir, restore=True)
predict(dataset, transformers=[], outputs=None)

Uses self to make predictions on provided Dataset object.

Parameters: dataset (dc.data.Dataset) – Dataset to make prediction on. transformers (list) – List of dc.trans.Transformers. outputs (object) – If outputs is None, then will assume outputs = self.outputs[0] (single output). If outputs is a Layer/Tensor, then will evaluate and return as a single ndarray. If outputs is a list of Layers/Tensors, will return a list of ndarrays. Returns: results (numpy ndarray or list of numpy ndarrays)
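The `outputs` dispatch rule above (None falls back to the first output, a single output yields one ndarray, a list yields a list of ndarrays) can be sketched as follows. `resolve_outputs` and `model_outputs` are hypothetical stand-ins, not DeepChem names:

```python
# Sketch of the `outputs` argument dispatch described in the docstring.

def resolve_outputs(outputs, model_outputs):
    """Return (outputs to evaluate, whether a single ndarray is returned)."""
    if outputs is None:
        return [model_outputs[0]], True   # default: first model output
    if not isinstance(outputs, list):
        return [outputs], True            # single Layer/Tensor -> ndarray
    return outputs, False                 # list -> list of ndarrays

model_outputs = ["out0", "out1"]
print(resolve_outputs(None, model_outputs))                # (['out0'], True)
print(resolve_outputs(model_outputs, model_outputs))       # (['out0', 'out1'], False)
```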
predict_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters: X (ndarray) – the input data, as a Numpy array. transformers (List) – List of dc.trans.Transformers. Returns: a Numpy array of predictions.
predict_on_generator(generator, transformers=[], outputs=None)[source]
predict_proba(dataset, transformers=[], outputs=None)
Parameters: dataset (dc.data.Dataset) – Dataset to make prediction on. transformers (list) – List of dc.trans.Transformers. outputs (object) – If outputs is None, then will assume outputs = self.outputs[0] (single output). If outputs is a Layer/Tensor, then will evaluate and return as a single ndarray. If outputs is a list of Layers/Tensors, will return a list of ndarrays. Returns: y_pred (numpy ndarray or list of numpy ndarrays)
predict_proba_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters: X (ndarray) – the input data, as a Numpy array. transformers (List) – List of dc.trans.Transformers A Numpy array of predictions.
predict_proba_on_generator(generator, transformers=[], outputs=None)
Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks)
reload()

Reload trained model from disk.

restore(checkpoint=None)

Reload the values of all variables from a checkpoint file.

Parameters: checkpoint (str) – the path to the checkpoint file to load. If this is None, the most recent checkpoint will be chosen automatically. Call get_checkpoints() to get a list of all available checkpoints.
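When `checkpoint` is None, the most recent checkpoint is selected. TensorFlow-style checkpoint names end in a global step number, so "most recent" means the highest step; this sketch (with illustrative filenames) shows the idea:

```python
# Sketch of "most recent checkpoint" selection when checkpoint=None.

def latest_checkpoint(checkpoints):
    """Pick the checkpoint with the highest global step, or None if empty."""
    if not checkpoints:
        return None
    return max(checkpoints, key=lambda name: int(name.rsplit("-", 1)[1]))

ckpts = ["model.ckpt-1000", "model.ckpt-500", "model.ckpt-2000"]
print(latest_checkpoint(ckpts))  # model.ckpt-2000
```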
save()
save_checkpoint(max_checkpoints_to_keep=5)

Save a checkpoint to disk.

Usually you do not need to call this method, since fit() saves checkpoints automatically. If you have disabled automatic checkpointing during fitting, this can be called to manually write checkpoints.

Parameters: max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
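The `max_checkpoints_to_keep` behaviour amounts to keep-last-N pruning. A real implementation deletes files from disk; this sketch just tracks names:

```python
# Sketch of max_checkpoints_to_keep: once more than N checkpoints exist,
# the oldest are discarded.

def save_checkpoint(kept, name, max_checkpoints_to_keep=5):
    kept.append(name)
    while len(kept) > max_checkpoints_to_keep:
        kept.pop(0)  # discard the oldest checkpoint
    return kept

kept = []
for step in range(0, 8000, 1000):
    save_checkpoint(kept, f"model.ckpt-{step}", max_checkpoints_to_keep=5)

print(kept)  # only the five most recent checkpoints remain
```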
set_loss(layer)
set_optimizer(optimizer)

Set the optimizer to use for fitting.

set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns: self
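The `<component>__<parameter>` convention for nested objects can be sketched with plain dicts standing in for estimators (the `set_params` helper here is illustrative, not the scikit-learn implementation):

```python
# Sketch of the scikit-learn nested-parameter convention: keys like
# "optimizer__beta1" descend into the nested component before assignment.

def set_params(params, **updates):
    for key, value in updates.items():
        target = params
        *path, leaf = key.split("__")
        for component in path:
            target = target[component]  # descend into the nested component
        target[leaf] = value
    return params

model = {"learning_rate": 0.01, "optimizer": {"beta1": 0.9}}
set_params(model, learning_rate=0.001, optimizer__beta1=0.95)
print(model)
```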
topsort()
class deepchem.models.tensorgraph.fcnet.MultiTaskRegressor(n_tasks, n_features, layer_sizes=[1000], weight_init_stddevs=0.02, bias_init_consts=1.0, weight_decay_penalty=0.0, weight_decay_penalty_type='l2', dropouts=0.5, activation_fns=<function relu>, **kwargs)[source]
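The `weight_decay_penalty` and `weight_decay_penalty_type` arguments above add a regularization term to the loss: 'l2' contributes penalty * sum(w²), 'l1' contributes penalty * sum(|w|). A sketch of that term (this mirrors the argument semantics, not the DeepChem implementation):

```python
# Sketch of the weight decay penalty controlled by weight_decay_penalty
# and weight_decay_penalty_type.

def weight_decay(weights, penalty, penalty_type="l2"):
    if penalty_type == "l2":
        return penalty * sum(w * w for w in weights)
    if penalty_type == "l1":
        return penalty * sum(abs(w) for w in weights)
    raise ValueError("penalty_type must be 'l1' or 'l2'")

w = [0.5, -1.0, 2.0]
print(weight_decay(w, 0.01, "l2"))  # 0.01 * (0.25 + 1 + 4) = 0.0525
print(weight_decay(w, 0.01, "l1"))  # 0.01 * (0.5 + 1 + 2) = 0.035
```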
add_output(layer)
build()
create_submodel(layers=None, loss=None, optimizer=None)

Create an alternate objective for training one piece of a TensorGraph.

A TensorGraph consists of a set of layers, and specifies a loss function and optimizer to use for training those layers. Usually this is sufficient, but there are cases where you want to train different parts of a model separately. For example, a GAN consists of a generator and a discriminator. They are trained separately, and they use different loss functions.

A submodel defines an alternate objective to use in cases like this. It may optionally specify any of the following: a subset of layers in the model to train; a different loss function; and a different optimizer to use. This method creates a submodel, which you can then pass to fit() to use it for training.

Parameters: layers (list) – the list of layers to train. If None, all layers in the model will be trained. loss (Layer) – the loss function to optimize. If None, the model’s main loss function will be used. optimizer (Optimizer) – the optimizer to use for training. If None, the model’s main optimizer will be used. Returns: the newly created submodel, which can be passed to any of the fitting methods.
default_generator(dataset, epochs=1, predict=False, deterministic=True, pad_batches=True)[source]
evaluate(dataset, metrics, transformers=[], per_task_metrics=False)

Evaluates the performance of this model on specified dataset.

Parameters: dataset (dc.data.Dataset) – Dataset object. metrics (deepchem.metrics.Metric) – Evaluation metric. transformers (list) – List of deepchem.transformers.Transformer. per_task_metrics (bool) – If True, return per-task scores. Returns: dict – Maps tasks to scores under metric.
evaluate_generator(feed_dict_generator, metrics, transformers=[], labels=None, outputs=None, weights=[], per_task_metrics=False)
fit(dataset, nb_epoch=10, max_checkpoints_to_keep=5, checkpoint_interval=1000, deterministic=False, restore=False, submodel=None, **kwargs)

Train this model on a dataset.

Parameters: dataset (Dataset) – the Dataset to train on nb_epoch (int) – the number of epochs to train for max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded. checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing. deterministic (bool) – if True, the samples are processed in order. If False, a different random order is used for each epoch. restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch. submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel().
fit_generator(feed_dict_generator, max_checkpoints_to_keep=5, checkpoint_interval=1000, restore=False, submodel=None)

Train this model on data from a generator.

Parameters: feed_dict_generator (generator) – this should generate batches, each represented as a dict that maps Layers to values. max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded. checkpoint_interval (int) – the frequency at which to write checkpoints, measured in training steps. Set this to 0 to disable automatic checkpointing. restore (bool) – if True, restore the model from the most recent checkpoint and continue training from there. If False, retrain the model from scratch. submodel (Submodel) – an alternate training objective to use. This should have been created by calling create_submodel(). Returns: the average loss over the most recent checkpoint interval.
fit_on_batch(X, y, w, submodel=None)
get_checkpoints()

Get a list of all available checkpoint files.

get_global_step()
get_layer_variables(layer)

Get the list of trainable variables in a layer of the graph.

get_model_filename(model_dir)

Given model directory, obtain filename for the model itself.

get_num_tasks()
get_params(deep=True)

Get parameters for this estimator.

Parameters: deep (boolean, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators. Returns: params (mapping of string to any) – Parameter names mapped to their values.
get_params_filename(model_dir)

Given model directory, obtain filename for the model parameters.

get_pickling_errors(obj, seen=None)
get_pre_q_input(input_layer)
get_task_type()

Currently models can only be classifiers or regressors.

load_from_dir(model_dir, restore=True)
predict(dataset, transformers=[], outputs=None)

Uses self to make predictions on provided Dataset object.

Parameters: dataset (dc.data.Dataset) – Dataset to make prediction on. transformers (list) – List of dc.trans.Transformers. outputs (object) – If outputs is None, then will assume outputs = self.outputs[0] (single output). If outputs is a Layer/Tensor, then will evaluate and return as a single ndarray. If outputs is a list of Layers/Tensors, will return a list of ndarrays. Returns: results (numpy ndarray or list of numpy ndarrays)
predict_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters: X (ndarray) – the input data, as a Numpy array. transformers (List) – List of dc.trans.Transformers. Returns: a Numpy array of predictions.
predict_on_generator(generator, transformers=[], outputs=None)
Parameters: generator (Generator) – Generator that constructs feed dictionaries for TensorGraph. transformers (list) – List of dc.trans.Transformers. outputs (object) – If outputs is None, then will assume outputs = self.outputs. If outputs is a Layer/Tensor, then will evaluate and return as a single ndarray. If outputs is a list of Layers/Tensors, will return a list of ndarrays. Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks)
predict_proba(dataset, transformers=[], outputs=None)
Parameters: dataset (dc.data.Dataset) – Dataset to make prediction on. transformers (list) – List of dc.trans.Transformers. outputs (object) – If outputs is None, then will assume outputs = self.outputs[0] (single output). If outputs is a Layer/Tensor, then will evaluate and return as a single ndarray. If outputs is a list of Layers/Tensors, will return a list of ndarrays. Returns: y_pred (numpy ndarray or list of numpy ndarrays)
predict_proba_on_batch(X, transformers=[], outputs=None)

Generates predictions for input samples, processing samples in a batch.

Parameters: X (ndarray) – the input data, as a Numpy array. transformers (List) – List of dc.trans.Transformers. Returns: a Numpy array of predictions.
predict_proba_on_generator(generator, transformers=[], outputs=None)
Returns: y_pred – numpy ndarray of shape (n_samples, n_classes*n_tasks)
reload()

Reload trained model from disk.

restore(checkpoint=None)

Reload the values of all variables from a checkpoint file.

Parameters: checkpoint (str) – the path to the checkpoint file to load. If this is None, the most recent checkpoint will be chosen automatically. Call get_checkpoints() to get a list of all available checkpoints.
save()
save_checkpoint(max_checkpoints_to_keep=5)

Save a checkpoint to disk.

Usually you do not need to call this method, since fit() saves checkpoints automatically. If you have disabled automatic checkpointing during fitting, this can be called to manually write checkpoints.

Parameters: max_checkpoints_to_keep (int) – the maximum number of checkpoints to keep. Older checkpoints are discarded.
set_loss(layer)
set_optimizer(optimizer)

Set the optimizer to use for fitting.

set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns: self
topsort()

## deepchem.models.tensorgraph.graph_layers module¶

Created on Thu Mar 30 14:02:04 2017

@author: michael

class deepchem.models.tensorgraph.graph_layers.DAGGather(n_graph_feat=30, n_outputs=30, max_atoms=50, layer_sizes=[100], init='glorot_uniform', activation='relu', dropout=None, **kwargs)[source]

TensorGraph style implementation

DAGgraph_step(batch_inputs, W_list, b_list)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_features, membership

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.
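The positional contract above (values matched to variables in get_layer_variables() order) can be sketched with plain lists standing in for TF variables; ToyLayer is hypothetical:

```python
# Sketch of set_variable_initial_values: initial values are applied to
# variables positionally, in the order the layer reports its variables.

class ToyLayer:
    def __init__(self, variable_names):
        self.variable_names = variable_names
        self.variable_values = [None] * len(variable_names)

    def set_variable_initial_values(self, values):
        assert len(values) == len(self.variable_names)
        self.variable_values = list(values)  # same order as the variables

layer = ToyLayer(["W", "b"])
layer.set_variable_initial_values([[1.0, 2.0], [0.0]])
print(dict(zip(layer.variable_names, layer.variable_values)))
```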

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy. Returns: Layer – the newly created layer that shares variables with this one.
class deepchem.models.tensorgraph.graph_layers.DAGLayer(n_graph_feat=30, n_atom_feat=75, max_atoms=50, layer_sizes=[100], init='glorot_uniform', activation='relu', dropout=None, batch_size=64, **kwargs)[source]

TensorGraph style implementation

DAGgraph_step(batch_inputs, W_list, b_list)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_features, parents, calculation_orders, calculation_masks, n_atoms

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy. Returns: Layer – the newly created layer that shares variables with this one.
class deepchem.models.tensorgraph.graph_layers.DTNNEmbedding(n_embedding=30, periodic_table_length=30, init='glorot_uniform', **kwargs)[source]

TensorGraph style implementation

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_number

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy. Returns: Layer – the newly created layer that shares variables with this one.
class deepchem.models.tensorgraph.graph_layers.DTNNExtract(task_id, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy. Returns: Layer – the newly created layer that shares variables with this one.
class deepchem.models.tensorgraph.graph_layers.DTNNGather(n_embedding=30, n_outputs=100, layer_sizes=[100], output_activation=True, init='glorot_uniform', activation='tanh', **kwargs)[source]

TensorGraph style implementation

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_features, atom_membership

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.graph_layers.DTNNStep(n_embedding=30, n_distance=100, n_hidden=60, init='glorot_uniform', activation='tanh', **kwargs)[source]

TensorGraph style implementation

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_features, distance, distance_membership_i, distance_membership_j

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.graph_layers.EdgeNetwork(pair_features, n_pair_features=8, n_hidden=100, init='glorot_uniform')[source]

Bases: object

Submodule for Message Passing

forward(atom_features, atom_to_pair)[source]
none_tensors()[source]
set_tensors(tensor)[source]
class deepchem.models.tensorgraph.graph_layers.GatedRecurrentUnit(n_hidden=100, init='glorot_uniform')[source]

Bases: object

Submodule for Message Passing

forward(inputs, messages)[source]
none_tensors()[source]
set_tensors(tensor)[source]
class deepchem.models.tensorgraph.graph_layers.MessagePassing(T, message_fn='enn', update_fn='gru', n_hidden=100, **kwargs)[source]

General class for MPNNs; default structures are built according to https://arxiv.org/abs/1511.06391

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build(pair_features, n_pair_features)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Perform T steps of message passing
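Each message-passing step alternates a message phase (every edge computes a message from its source atom's features) with an update phase (every atom folds its aggregated incoming messages into its state). A toy NumPy sketch of one step, using a plain linear message function and a tanh update in place of DeepChem's enn/GRU submodules:

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_feat = 4, 8
atom_feats = rng.normal(size=(n_atoms, n_feat))
# directed edge list (source, target) for a small toy graph
edges = np.array([[0, 1], [1, 0], [1, 2], [2, 1], [2, 3], [3, 2]])
W_msg = rng.normal(size=(n_feat, n_feat))

# message phase: each edge sends a transformed copy of its source features
msgs = atom_feats[edges[:, 0]] @ W_msg
# aggregation: sum the incoming messages at each target atom
agg = np.zeros_like(atom_feats)
np.add.at(agg, edges[:, 1], msgs)
# update phase (toy): tanh of current state plus aggregated messages
atom_feats = np.tanh(atom_feats + agg)
```

Running this T times gives the T passing steps the layer performs.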

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.graph_layers.SetGather(M, batch_size, n_hidden=100, init='orthogonal', **kwargs)[source]

set2set gather layer for graph-based models. Models using this layer must set pad_batches=True.

LSTMStep(h, c, x=None)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Perform M steps of set2set gather, detailed descriptions in: https://arxiv.org/abs/1511.06391
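Each set2set iteration computes an attention-weighted, order-invariant readout over a molecule's atom features. A simplified NumPy sketch of the readout step, with a fixed zero query standing in for the LSTM hidden state that the real layer evolves over its M steps:

```python
import numpy as np

rng = np.random.default_rng(0)
atom_feats = rng.normal(size=(5, 16))   # one molecule's atoms
q = np.zeros(16)                        # query (LSTM hidden state in the real layer)

e = atom_feats @ q                      # attention logits, one per atom
a = np.exp(e - e.max())
a /= a.sum()                            # softmax over atoms
readout = a @ atom_feats                # order-invariant readout, shape (16,)
```

With a zero query every atom gets equal attention, so the readout reduces to the mean of the atom features; a trained query biases it toward informative atoms.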

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.graph_layers.WeaveGather(batch_size, n_input=128, gaussian_expand=False, init='glorot_uniform', activation='tanh', epsilon=0.001, momentum=0.99, **kwargs)[source]

TensorGraph style implementation

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

parent layers: atom_features, atom_split

gaussian_histogram(x)[source]
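When gaussian_expand is enabled, gaussian_histogram expands each scalar feature into normalized memberships over a set of Gaussian bins. A toy NumPy sketch of that expansion (the bin means and widths below are illustrative assumptions, not DeepChem's actual values):

```python
import numpy as np

def gaussian_histogram(x, means, stds):
    """Expand each scalar into normalized Gaussian-bin memberships."""
    x = np.asarray(x)[..., None]                     # (..., 1)
    dens = np.exp(-0.5 * ((x - means) / stds) ** 2)  # (..., n_bins)
    return dens / dens.sum(axis=-1, keepdims=True)   # memberships sum to 1

means = np.linspace(-2.0, 2.0, 11)   # assumed bin centers
stds = np.full(11, 0.4)              # assumed bin widths
out = gaussian_histogram([[0.0, 1.0]], means, stds)  # shape (1, 2, 11)
```

Each scalar thus becomes a soft one-hot vector over the bins, which gives downstream layers a richer, distribution-like encoding of the feature.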
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.graph_layers.WeaveLayer(n_atom_input_feat=75, n_pair_input_feat=14, n_atom_output_feat=50, n_pair_output_feat=50, n_hidden_AA=50, n_hidden_PA=50, n_hidden_AP=50, n_hidden_PP=50, update_pair=True, init='glorot_uniform', activation='relu', dropout=None, **kwargs)[source]

TensorGraph style implementation. Note: use WeaveLayerFactory to construct this layer.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

TODO(rbharath): Need to make this not set instance variables to follow style in other layers.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Creates weave tensors.

parent layers: [atom_features, pair_features], pair_split, atom_to_pair

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
deepchem.models.tensorgraph.graph_layers.WeaveLayerFactory(**kwargs)[source]

## deepchem.models.tensorgraph.initializations module¶

Ops for tensor initialization

deepchem.models.tensorgraph.initializations.get(identifier, **kwargs)[source]
deepchem.models.tensorgraph.initializations.get_fans(shape)[source]
deepchem.models.tensorgraph.initializations.glorot_normal(shape, name=None)[source]

Glorot normal variance scaling initializer.

# References
Glorot & Bengio, AISTATS 2010
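The variance-scaling rule behind this initializer draws weights from a normal distribution with variance 2 / (fan_in + fan_out). A minimal NumPy sketch for 2-D shapes (the DeepChem version presumably uses get_fans() for higher-rank shapes and returns a TensorFlow variable rather than an array):

```python
import numpy as np

def glorot_normal(shape, rng=np.random.default_rng(0)):
    """Sketch of Glorot (Xavier) normal initialization:
    samples from N(0, 2 / (fan_in + fan_out))."""
    fan_in, fan_out = shape[0], shape[1]
    stddev = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, stddev, size=shape)

W = glorot_normal((100, 50))
```

Scaling by the fans keeps the variance of activations and gradients roughly constant across layers, which is what makes deep stacks trainable from a cold start.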
deepchem.models.tensorgraph.initializations.glorot_uniform(shape, name=None)[source]
deepchem.models.tensorgraph.initializations.he_normal(shape, name=None)[source]

He normal variance scaling initializer.

# References
He et al., http://arxiv.org/abs/1502.01852
deepchem.models.tensorgraph.initializations.he_uniform(shape, name=None)[source]

He uniform variance scaling initializer.

deepchem.models.tensorgraph.initializations.identity(shape, scale=1, name=None)[source]
deepchem.models.tensorgraph.initializations.lecun_uniform(shape, name=None)[source]

LeCun uniform variance scaling initializer.

# References
LeCun 98, Efficient Backprop, http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf
deepchem.models.tensorgraph.initializations.normal(shape, scale=0.05, name=None)[source]
deepchem.models.tensorgraph.initializations.one(shape, name=None)[source]
deepchem.models.tensorgraph.initializations.orthogonal(shape, scale=1.1, name=None)[source]

Orthogonal initializer.

# References
Saxe et al., http://arxiv.org/abs/1312.6120
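The Saxe et al. scheme initializes a weight matrix to a scaled copy of the orthogonal factor of a random Gaussian matrix. A NumPy sketch of that construction:

```python
import numpy as np

def orthogonal(shape, scale=1.1, rng=np.random.default_rng(0)):
    """Sketch of the Saxe et al. orthogonal initializer: take the
    orthogonal factor (via SVD) of a random Gaussian matrix, then rescale."""
    a = rng.normal(0.0, 1.0, size=shape)
    u, _, vt = np.linalg.svd(a, full_matrices=False)
    q = u if u.shape == shape else vt
    return scale * q

W = orthogonal((6, 4), scale=1.0)
# with scale=1.0 the columns are orthonormal: W.T @ W is the identity
```

Orthogonal weights preserve the norm of signals passed through the layer, which helps gradients survive deep or recurrent computation.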
deepchem.models.tensorgraph.initializations.uniform(shape, scale=0.05, name=None)[source]
deepchem.models.tensorgraph.initializations.zero(shape, name=None)[source]

## deepchem.models.tensorgraph.layers module¶

class deepchem.models.tensorgraph.layers.ANIFeat(in_layers, max_atoms=23, radial_cutoff=4.6, angular_cutoff=3.1, radial_length=32, angular_length=8, atom_cases=[1, 6, 7, 8, 16], atomic_number_differentiated=True, coordinates_in_bohr=True, **kwargs)[source]

Performs transform from 3D coordinates to ANI symmetry functions

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

angular_symmetry(d_cutoff, d, atom_numbers, coordinates)[source]

Angular Symmetry Function

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Input layers should have dtype tf.float32 and shape (None, self.max_atoms, 4).

distance_cutoff(d, cutoff, flags)[source]

Generate distance matrix with trainable cutoff

distance_matrix(coordinates, flags)[source]

Generate distance matrix

get_num_feats()[source]
layer_number_dict = {}
none_tensors()
radial_symmetry(d_cutoff, d, atom_numbers)[source]

Radial Symmetry Function

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy.
class deepchem.models.tensorgraph.layers.Add(in_layers=None, weights=None, **kwargs)[source]

Compute the (optionally weighted) sum of the input layers.
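The computation is an elementwise sum over the input layers' outputs, with an optional per-input weight. A toy NumPy sketch:

```python
import numpy as np

# Two same-shaped "layer outputs" and one weight per input
inputs = [np.ones((2, 3)), 2 * np.ones((2, 3))]
weights = [0.5, 1.0]

# weighted elementwise sum, as the Add layer computes it
out = sum(w * x for w, x in zip(weights, inputs))
# each element: 0.5 * 1 + 1.0 * 2 = 2.5
```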

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node with
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
deepchem.models.tensorgraph.layers.AlphaShare(in_layers=None, **kwargs)[source]

This method should be used when constructing AlphaShare layers for sluice networks.

Parameters:
in_layers (list of Layers or tensors) – the tensors in the list must all be the same size, and the list must include two or more tensors

Returns:
output_layers (list of Layers or tensors, the same size as in_layers)

References:
Sluice networks: Learning what to share between loosely related tasks. https://arxiv.org/abs/1705.08142
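The alpha-sharing operation can be sketched in plain NumPy, outside the TensorGraph API: each output is a learned linear mixture of the equally-sized task inputs. The mixing matrix here is fixed for illustration; in the real layer it is a trainable variable.

```python
import numpy as np

# Hedged sketch of alpha sharing from sluice networks (arXiv:1705.08142),
# written in NumPy rather than as a TensorGraph layer. Each output tensor
# is a linear mixture of the input tensors, controlled by an alpha matrix.
def alpha_share(inputs, alphas):
    # inputs: list of T arrays, each of shape (batch, features)
    # alphas: (T, T) mixing matrix (learned in the real layer)
    stacked = np.stack(inputs)                 # (T, batch, features)
    mixed = np.tensordot(alphas, stacked, 1)   # (T, batch, features)
    return [mixed[i] for i in range(len(inputs))]

task_a = np.ones((4, 8))
task_b = 2 * np.ones((4, 8))
alphas = np.array([[0.75, 0.25],
                   [0.25, 0.75]])
out_a, out_b = alpha_share([task_a, task_b], alphas)
print(out_a[0, 0], out_b[0, 0])  # -> 1.25 1.75
```

With alphas close to the identity the tasks stay nearly independent; off-diagonal weight shares representation between them.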
class deepchem.models.tensorgraph.layers.AlphaShareLayer(**kwargs)[source]

Part of a sluice network. Adds alpha parameters to control sharing between the main and auxiliary tasks.

The factory method AlphaShare should be used for construction.

Parameters:
in_layers (list of Layers or tensors) – the tensors in the list must all be the same size, and the list must include two or more tensors

Returns:
out_tensor (a tensor with shape [len(in_layers), x, y], where x and y are the original layer dimensions)
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.AtomicConvolution(atom_types=None, radial_params=[], boxsize=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
Parameters:
X (tf.Tensor of shape (B, N, d)) – Coordinates/features.
Nbrs (tf.Tensor of shape (B, N, M)) – Neighbor list.
Nbrs_Z (tf.Tensor of shape (B, N, M)) – Atomic numbers of neighbor atoms.

Returns:
layer (tf.Tensor of shape (B, N, l)) – A new tensor representing the output of the atomic convolution layer.
distance_matrix(D)[source]

Calculates the distance matrix from the distance tensor.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:
D (tf.Tensor of shape (B, N, M, d)) – Distance tensor.

Returns:
R (tf.Tensor of shape (B, N, M)) – Distance matrix.
distance_tensor(X, Nbrs, boxsize, B, N, M, d)[source]

Calculates distance tensor for batch of molecules.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:
X (tf.Tensor of shape (B, N, d)) – Coordinates/features tensor.
Nbrs (tf.Tensor of shape (B, N, M)) – Neighbor list tensor.
boxsize (float or None) – Simulation box length [Angstrom].

Returns:
D (tf.Tensor of shape (B, N, M, d)) – Coordinates/features distance tensor.
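The computation above can be sketched in NumPy: for each atom, gather its M neighbors' coordinates and subtract the atom's own coordinates; when a periodic boxsize is given, wrap the differences. The minimum-image wrapping shown here is an assumption about the periodic case; the real layer's wrapping may differ.

```python
import numpy as np

# Hedged NumPy sketch of the distance-tensor computation.
# X: (B, N, d) coordinates; Nbrs: (B, N, M) integer neighbor indices.
def distance_tensor(X, Nbrs, boxsize=None):
    B = X.shape[0]
    # gather neighbor coordinates via advanced indexing: (B, N, M, d)
    batch_idx = np.arange(B)[:, None, None]
    nbr_coords = X[batch_idx, Nbrs]
    D = nbr_coords - X[:, :, None, :]          # displacement vectors
    if boxsize is not None:
        D -= boxsize * np.round(D / boxsize)   # minimum-image convention (assumed)
    return D

X = np.array([[[0.0, 0.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 2.0, 0.0]]])               # B=1, N=3, d=3
Nbrs = np.array([[[1, 2], [0, 2], [0, 1]]])     # M=2 neighbors per atom
D = distance_tensor(X, Nbrs)
print(D.shape)     # -> (1, 3, 2, 3)
print(D[0, 0, 0])  # -> [1. 0. 0.]
```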
gather_neighbors(X, nbr_indices, B, N, M, d)[source]

Gathers the neighbor subsets of the atoms in X.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:
X (tf.Tensor of shape (B, N, d)) – Coordinates/features tensor.
nbr_indices (tf.Tensor of shape (B, M)) – Neighbor list for a single atom.

Returns:
neighbors (tf.Tensor of shape (B, M, d)) – Neighbor coordinates/features tensor for a single atom.
gaussian_distance_matrix(R, rs, e)[source]

Calculates the Gaussian distance matrix.

B = batch_size, N = max_num_atoms, M = max_num_neighbors

Parameters:
R (tf.Tensor of shape (B, N, M)) – Distance matrix.
rs (tf.Variable) – Gaussian distance matrix mean.
e (tf.Variable) – Gaussian distance matrix width (e = 0.5/std**2).

Returns:
retval (tf.Tensor of shape (B, N, M)) – Gaussian distance matrix.
layer_number_dict = {}
none_tensors()
radial_cutoff(R, rc)[source]

Calculates radial cutoff matrix.

B = batch_size, N = max_num_atoms, M = max_num_neighbors

Parameters:
R (tf.Tensor of shape (B, N, M)) – Distance matrix.
rc (tf.Variable) – Interaction cutoff [Angstrom].

Returns:
FC (tf.Tensor of shape (B, N, M)) – Radial cutoff matrix.
radial_symmetry_function(R, rc, rs, e)[source]

Calculates radial symmetry function.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_filters

Parameters:
R (tf.Tensor of shape (B, N, M)) – Distance matrix.
rc (float) – Interaction cutoff [Angstrom].
rs (float) – Gaussian distance matrix mean.
e (float) – Gaussian distance matrix width.

Returns:
retval (tf.Tensor of shape (B, N, M)) – Radial symmetry function (before summation).
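The three helpers above compose naturally: a Gaussian centered at rs with width e, damped by a cutoff that goes smoothly to zero at rc. The cosine cutoff form below is the standard atomic-convolution construction and is an assumption about this implementation's exact formula.

```python
import numpy as np

# Hedged NumPy sketch of the radial symmetry function built from the
# radial cutoff and Gaussian distance matrix. The cosine cutoff form
# is assumed (standard in atomic convolution networks).
def radial_cutoff(R, rc):
    FC = 0.5 * (np.cos(np.pi * R / rc) + 1.0)
    return np.where(R < rc, FC, 0.0)          # zero beyond the cutoff

def gaussian_distance_matrix(R, rs, e):
    return np.exp(-e * (R - rs) ** 2)         # e = 0.5 / std**2

def radial_symmetry_function(R, rc, rs, e):
    return gaussian_distance_matrix(R, rs, e) * radial_cutoff(R, rc)

R = np.array([[[1.0, 3.0, 6.0]]])  # (B=1, N=1, M=3) distance matrix
out = radial_symmetry_function(R, rc=5.0, rs=1.0, e=2.0)
print(out.shape)     # -> (1, 1, 3)
print(out[0, 0, 2])  # distance 6.0 is beyond rc=5.0 -> 0.0
```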
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.AttnLSTMEmbedding(n_test, n_support, n_feat, max_depth, **kwargs)[source]

Implements AttnLSTM as in matching networks paper.

The AttnLSTM embedding adjusts two sets of vectors, the “test” and “support” sets. The “support” consists of a set of evidence vectors. Think of these as the small training set for low-data machine learning. The “test” consists of the queries we wish to answer with the small amount of available data. The AttnLSTMEmbedding allows us to modify the embedding of the “test” set depending on the contents of the “support”. The AttnLSTMEmbedding is thus a type of learnable metric that allows a network to modify its internal notion of distance.

References: Matching Networks for One Shot Learning https://arxiv.org/pdf/1606.04080v1.pdf

Order Matters: Sequence to sequence for sets https://arxiv.org/abs/1511.06391
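The core idea can be sketched with a single attention step in NumPy: adjust each test embedding using a softmax-attention readout over the support set. The real layer iterates this with an LSTM for max_depth steps; this one-step version only illustrates the attention read.

```python
import numpy as np

# Hedged single-step sketch of the matching-networks attention that
# underlies the AttnLSTM embedding (the real layer wraps this in an
# LSTM and repeats it for max_depth iterations).
def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    ez = np.exp(z)
    return ez / ez.sum(axis=axis, keepdims=True)

def attention_read(X, Xp):
    # X: (n_test, n_feat) queries; Xp: (n_support, n_feat) evidence
    attn = softmax(X @ Xp.T)  # (n_test, n_support) attention weights
    r = attn @ Xp             # readout over the support set
    return X + r              # adjusted test embedding, same shape as X

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Xp = rng.normal(size=(10, 8))
print(attention_read(X, Xp).shape)  # -> (4, 8)
```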

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters:
in_layers (list) – List of two tensors (X, Xp). X should be of shape (n_test, n_feat) and Xp should be of shape (n_support, n_feat), where n_test is the size of the test set, n_support that of the support set, and n_feat is the number of per-atom features.

Returns:
list – Two tensors of the same shapes as the inputs, namely [(n_test, n_feat), (n_support, n_feat)].
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.BatchNorm(in_layers=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.BatchNormalization(epsilon=1e-05, axis=-1, momentum=0.99, beta_init='zero', gamma_init='one', **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

add_weight(shape, initializer, name=None)[source]
build(input_shape)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.BetaShare(**kwargs)[source]

Part of a sluice network. Adds beta parameters to control which layer outputs are used for prediction.

Parameters:
in_layers (list of Layers or tensors) – the tensors in the list must all be the same size, and the list must include two or more tensors

Returns:
output_layers (list of Layers or tensors, the same size as in_layers)
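The beta-sharing step can be sketched in NumPy as a learned weighted combination of equally-sized layer outputs. The fixed weights below stand in for the trainable beta variables of the real layer.

```python
import numpy as np

# Hedged NumPy sketch of beta sharing: combine the outputs of several
# layers into one prediction input using learned weights (fixed here
# for illustration; trainable in the real layer).
def beta_share(inputs, betas):
    # inputs: list of T arrays with identical shapes; betas: (T,) weights
    stacked = np.stack(inputs)              # (T, batch, features)
    return np.tensordot(betas, stacked, 1)  # weighted sum -> (batch, features)

layer1 = np.ones((4, 8))
layer2 = 3 * np.ones((4, 8))
out = beta_share([layer1, layer2], np.array([0.5, 0.5]))
print(out[0, 0])  # -> 2.0
```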
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

The sizes of the input layers must all be the same.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
in_layers (list of tensors) – input tensors for the shared layer

Returns:
Layer
class deepchem.models.tensorgraph.layers.Cast(in_layers=None, dtype=None, **kwargs)[source]

Wrapper around tf.cast. Changes the dtype of a single layer.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – The new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
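The difference between clone() and shared() can be illustrated with plain Python objects (a sketch of the variable-sharing contract, not DeepChem code): a clone gets its own variable storage, while a shared copy points at the same storage as the original.

```python
# Sketch: clone() yields independent variables; shared() aliases them.

class Vars:
    def __init__(self, values):
        self.values = list(values)

class ToyLayer:
    def __init__(self, variables):
        self.variables = variables

    def clone(self):
        # Independent copy: fresh variable storage.
        return ToyLayer(Vars(self.variables.values))

    def shared(self):
        # Shared copy: same variable storage object.
        return ToyLayer(self.variables)

original = ToyLayer(Vars([1.0, 2.0]))
cloned = original.clone()
tied = original.shared()

original.variables.values[0] = 9.0   # mutate the original's variables
# cloned is unaffected; tied sees the change.
```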
class deepchem.models.tensorgraph.layers.CombineMeanStd(in_layers=None, training_only=False, noise_epsilon=0.01, **kwargs)[source]

Generate Gaussian noise.

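The usual reparameterization-style pattern behind such a layer, sketched in plain Python (the exact DeepChem formula may differ; the training flag and noise_epsilon mirror the constructor's training_only and noise_epsilon arguments): at training time the output is the mean plus scaled Gaussian noise, and at inference the mean is returned unchanged.

```python
import random

def combine_mean_std(mean, std, training=True, noise_epsilon=0.01, rng=None):
    # Reparameterization-style sampling sketch:
    # training:  sample = mean + std * N(0, noise_epsilon)
    # inference: sample = mean
    if not training:
        return list(mean)
    rng = rng or random.Random(0)
    return [m + s * rng.gauss(0.0, noise_epsilon) for m, s in zip(mean, std)]
```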
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Concat(in_layers=None, axis=1, **kwargs)[source]
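Judging from the signature, Concat joins its input tensors along the given axis (axis=1, the feature dimension, by default). For two batches of row vectors, axis-1 concatenation appends the features of each row; a plain-Python sketch of those semantics:

```python
def concat_axis1(a, b):
    # Concatenate two batches (lists of rows) along axis 1:
    # each output row is the corresponding rows of a and b joined end to end.
    assert len(a) == len(b), "batch sizes must match"
    return [ra + rb for ra, rb in zip(a, b)]
```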
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Constant(value, dtype=tf.float32, **kwargs)[source]

Output a constant value.

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Conv1D(filters, kernel_size, strides=1, padding='valid', dilation_rate=1, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, in_layers=None, **kwargs)[source]

A 1D convolution on the input.

This layer expects its input to be a three dimensional tensor of shape (batch size, width, # channels). If there is only one channel, the third dimension may optionally be omitted.

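The output width follows standard convolution arithmetic (not DeepChem-specific): with 'valid' padding it is floor((width - dilated_kernel_size) / strides) + 1, where the dilated kernel size is dilation_rate * (kernel_size - 1) + 1, and with 'same' padding it is ceil(width / strides):

```python
import math

def conv1d_output_width(width, kernel_size, strides=1,
                        padding="valid", dilation_rate=1):
    # Standard 1D convolution output-size arithmetic.
    effective_kernel = dilation_rate * (kernel_size - 1) + 1
    if padding == "valid":
        return (width - effective_kernel) // strides + 1
    if padding == "same":
        return math.ceil(width / strides)
    raise ValueError("unknown padding: %r" % padding)
```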
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Conv2D(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

A 2D convolution on the input.

This layer expects its input to be a four dimensional tensor of shape (batch size, height, width, # channels). If there is only one channel, the fourth dimension may optionally be omitted.

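A quick sanity check that follows from the signature: a square-kernel 2D convolution has kernel_size x kernel_size x in_channels weights per output channel, plus one bias per output channel (standard convolution parameter counting, not DeepChem-specific):

```python
def conv2d_param_count(kernel_size, in_channels, num_outputs, use_bias=True):
    # Trainable parameters in a square-kernel 2D convolution:
    # one (kernel_size x kernel_size x in_channels) filter per output
    # channel, plus an optional bias per output channel.
    weights = kernel_size * kernel_size * in_channels * num_outputs
    biases = num_outputs if use_bias else 0
    return weights + biases
```

For example, the default kernel_size=5 applied to an RGB image (3 channels) with 16 output channels gives 5 * 5 * 3 * 16 + 16 parameters.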
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Conv2DTranspose(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

A transposed 2D convolution on the input.

This layer is typically used for upsampling in a deconvolutional network. It expects its input to be a four dimensional tensor of shape (batch size, height, width, # channels). If there is only one channel, the fourth dimension may optionally be omitted.

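The upsampling factor follows standard transposed-convolution arithmetic (not DeepChem-specific): with 'SAME' padding the output spatial size is the input size times the stride, and with 'VALID' padding it is (size - 1) * stride + kernel_size:

```python
def conv2d_transpose_output_size(size, stride, padding="SAME", kernel_size=5):
    # Standard transposed-convolution output-size arithmetic for one
    # spatial dimension (apply to height and width independently).
    if padding == "SAME":
        return size * stride
    if padding == "VALID":
        return (size - 1) * stride + kernel_size
    raise ValueError("unknown padding: %r" % padding)
```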
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Conv3D(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

A 3D convolution on the input.

This layer expects its input to be a five dimensional tensor of shape (batch size, height, width, depth, # channels). If there is only one channel, the fifth dimension may optionally be omitted.

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Conv3DTranspose(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

A transposed 3D convolution on the input.

This layer is typically used for upsampling in a deconvolutional network. It expects its input to be a five dimensional tensor of shape (batch size, height, width, depth, # channels). If there is only one channel, the fifth dimension may optionally be omitted.

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Dense(out_channels, activation_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function variance_scaling_initializer>, time_series=False, **kwargs)[source]
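A Dense layer computes a fully connected transformation, out = activation_fn(x · W + b), where W has shape (in_channels, out_channels). The core arithmetic, sketched in plain Python without the activation (illustrative, not DeepChem code):

```python
def dense_forward(batch, weights, biases):
    # One fully connected (Dense) step on a batch of row vectors:
    # output[i][j] = sum_k batch[i][k] * weights[k][j] + biases[j]
    out = []
    for row in batch:
        out.append([
            sum(x * w for x, w in zip(row, col)) + b
            for col, b in zip(zip(*weights), biases)   # iterate columns of W
        ])
    return out
```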
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Divide(in_layers=None, **kwargs)[source]

Compute the ratio of the input layers.
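As a rough sketch of the semantics (a NumPy stand-in, not the TensorGraph layer itself), the output is the elementwise ratio of the first input to the second:

```python
import numpy as np

numerator = np.array([2.0, 6.0, 9.0])
denominator = np.array([2.0, 3.0, 3.0])
ratio = numerator / denominator  # elementwise division, as Divide computes
assert np.allclose(ratio, [1.0, 2.0, 3.0])
```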

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.Dropout(dropout_prob, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.Exp(in_layers=None, **kwargs)[source]

Compute the exponential of the input.

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.Feature(**kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q()
create_tensor(in_layers=None, set_tensors=True, **kwargs)
get_pre_q_name()
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.Flatten(in_layers=None, **kwargs)[source]

Flatten every dimension except the first.
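In plain NumPy terms (a sketch, not the layer's implementation), flattening every dimension except the first is a reshape to (batch, -1):

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)    # shape (batch, 3, 4)
flat = x.reshape(x.shape[0], -1)      # flatten everything but the first dimension
assert flat.shape == (2, 12)
```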

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.GRU(n_hidden, batch_size, **kwargs)[source]

A Gated Recurrent Unit.

This layer expects its input to be of shape (batch_size, sequence_length, ...). The input consists of a set of independent sequences (one for each element in the batch), each of which is propagated independently through the GRU.
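A minimal NumPy sketch of that shape convention (illustrative only, not deepchem's implementation; parameter initialization here is hypothetical), showing each batch element's sequence being propagated independently:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_forward(x, n_hidden, rng):
    """x: (batch_size, sequence_length, n_features) -> (batch, seq, n_hidden)."""
    batch, seq_len, n_feat = x.shape
    # Randomly initialized parameters for the update (z), reset (r),
    # and candidate (h) gates; a real layer would learn these.
    Wz, Wr, Wh = (rng.standard_normal((n_feat + n_hidden, n_hidden)) * 0.1
                  for _ in range(3))
    h = np.zeros((batch, n_hidden))
    outputs = []
    for t in range(seq_len):
        xh = np.concatenate([x[:, t, :], h], axis=1)
        z = sigmoid(xh @ Wz)           # update gate
        r = sigmoid(xh @ Wr)           # reset gate
        xh_r = np.concatenate([x[:, t, :], r * h], axis=1)
        h_cand = np.tanh(xh_r @ Wh)    # candidate state
        h = (1 - z) * h + z * h_cand
        outputs.append(h)
    return np.stack(outputs, axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 7, 5))     # (batch, sequence_length, n_features)
out = gru_forward(x, n_hidden=3, rng=rng)
assert out.shape == (4, 7, 3)
```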

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.Gather(in_layers=None, indices=None, **kwargs)[source]

Gather elements or slices from the input.
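In NumPy terms (a sketch of the semantics, not the layer itself), gathering rows by index looks like fancy indexing:

```python
import numpy as np

x = np.array([[1, 2], [3, 4], [5, 6]])
indices = [2, 0]
gathered = x[indices]                  # take rows 2 and 0, in that order
assert (gathered == [[5, 6], [1, 2]]).all()
```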

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.GraphCNN(num_filters, **kwargs)[source]

GraphCNN layer, from “Robust Spatial Filtering with Graph Convolutional Neural Networks” (https://arxiv.org/abs/1703.00792).

Spatial-domain convolutions can be defined as H = h_0 I + h_1 A + h_2 A^2 + ... + h_k A^k, where H ∈ R^(N×N).

We approximate this by H ≈ h_0 I + h_1 A.

A convolution can then be defined as applying several of these linear filters over edges of different types (think up, down, left, right, and diagonal in images), where each edge type has its own adjacency matrix: H ≈ h_0 I + h_1 A_1 + h_2 A_2 + ... + h_(L−1) A_(L−1).

V_out = sum_{c=1}^{C} H^c V^c + b
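A hedged NumPy sketch of the first-order filter above, accumulated over edge types (the filter taps h0 and h1 are hypothetical fixed values here; the real layer learns them and handles batching):

```python
import numpy as np

rng = np.random.default_rng(0)
N, F = 5, 4                              # nodes, features per node
V = rng.standard_normal((N, F))          # node feature matrix
adjacencies = [rng.integers(0, 2, size=(N, N)) for _ in range(2)]  # one per edge type
h0, h1 = 0.5, 0.25                       # filter taps (hypothetical fixed values)
b = np.zeros(F)                          # bias

V_out = np.zeros((N, F)) + b
for A in adjacencies:
    H = h0 * np.eye(N) + h1 * A          # first-order filter H ≈ h0*I + h1*A
    V_out = V_out + H @ V                # accumulate the filtered features
assert V_out.shape == (N, F)
```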

add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None.

batch_mat_mult(A, B)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
graphConvolution(V, A)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
deepchem.models.tensorgraph.layers.GraphCNNPool(num_vertices, **kwargs)[source]
class deepchem.models.tensorgraph.layers.GraphConv(out_channel, min_deg=0, max_deg=10, activation_fn=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensors)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
sum_neigh(atoms, deg_adj_lists)[source]

Store the summed atoms by degree

class deepchem.models.tensorgraph.layers.GraphEmbedPoolLayer(num_vertices, **kwargs)[source]

GraphCNNPool Layer from Robust Spatial Filtering with Graph Convolutional Neural Networks https://arxiv.org/abs/1703.00792

This is a learnable pool operation. It constructs a new adjacency matrix for a graph with a specified number of nodes.

This differs from our other pool operations, which set vertices to a function value without altering the adjacency matrix.

V_emb = SpatialGraphCNN(V_in)
V_out = sigma(V_emb)^T * V_in
A_out = V_emb^T * A_in * V_emb
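The pooling equations can be sketched in NumPy as follows (a hypothetical illustration, not DeepChem's code; sigma is taken here to be a softmax over the output-vertex axis, and W is a hypothetical projection weight standing in for the SpatialGraphCNN embedding step):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_embed_pool(V, A, W):
    """Pool a graph down to W.shape[1] output vertices.

    V: (N, C) vertex features; A: (N, N) adjacency; W: (C, num_vertices).
    Returns pooled features (num_vertices, C) and adjacency
    (num_vertices, num_vertices).
    """
    V_emb = softmax(V @ W, axis=1)   # soft assignment of each input vertex
    V_out = V_emb.T @ V              # V_out = sigma(V_emb)^T V_in
    A_out = V_emb.T @ A @ V_emb      # A_out = V_emb^T A_in V_emb
    return V_out, A_out
```

Because each row of V_emb sums to one, the total edge weight of A is preserved in A_out, which is a handy invariant for testing.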

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
Parameters:
num_filters (int) – number of filters in the output
in_layers (list of Layers or tensors) – [V, A, mask]. V are the vertex features and must be of shape (batch, vertex, channel). A are the adjacency matrices for each graph, of shape (batch, from_vertex, adj_matrix, to_vertex). mask is optional, to be used when not every graph has the same number of vertices.

Returns: a tf.Tensor with a graph convolution applied. The shape will be (batch, vertex, self.num_filters).
embedding_factors(V, no_filters, name='default')[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
softmax_factors(V, axis=1, name=None)[source]
class deepchem.models.tensorgraph.layers.GraphGather(batch_size, activation_fn=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.GraphPool(min_degree=0, max_degree=10, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Highway(activation_fn=<function relu>, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function variance_scaling_initializer>, **kwargs)[source]

Create a highway layer: y = H(x) * T(x) + x * (1 - T(x)), where H(x) = activation_fn(matmul(W_H, x) + b_H) is the non-linear transformed output and T(x) = sigmoid(matmul(W_T, x) + b_T) is the transform gate.

reference: https://arxiv.org/pdf/1505.00387.pdf

This layer expects its input to be a two-dimensional tensor of shape (batch size, # input features). Outputs have the same shape.
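As a concrete illustration, the highway equations can be sketched in NumPy (the weights here are hypothetical placeholders, not trained values; with the input laid out as (batch, features), matmul(W, x) becomes x @ W):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway(x, W_H, b_H, W_T, b_T, activation=np.tanh):
    """Highway layer: y = H(x)*T(x) + x*(1 - T(x)).

    x: (batch, features); all weight matrices are (features, features).
    Output has the same shape as x.
    """
    H = activation(x @ W_H + b_H)   # non-linear transformed output
    T = sigmoid(x @ W_T + b_T)      # transform gate in (0, 1)
    return H * T + x * (1.0 - T)    # gated mix of transform and carry paths
```

When the gate saturates near zero the layer reduces to an identity mapping, which is the property that lets very deep highway stacks train.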

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.Input(shape, dtype=tf.float32, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q()[source]
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
get_pre_q_name()[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.InputFifoQueue(shapes, names, capacity=5, **kwargs)[source]

This queue is used to allow asynchronous batching of inputs during the fitting process.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, along with a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate the node; summary_description (object, optional) – optional summary_pb2.SummaryDescription(); collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES].
set_tensors(tensors)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer's variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – the input layers for the shared copy
class deepchem.models.tensorgraph.layers.InteratomicL2Distances(N_atoms, M_nbrs, ndim, **kwargs)[source]

Compute (squared) L2 distances between atoms given neighbors.
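A minimal NumPy sketch of this computation (a hypothetical illustration, not DeepChem's implementation), given atom coordinates and a precomputed neighbor list:

```python
import numpy as np

def interatomic_l2_distances(coords, nbr_list):
    """Squared L2 distance from each atom to each of its listed neighbors.

    coords: (N_atoms, ndim) coordinates;
    nbr_list: (N_atoms, M_nbrs) integer indices of each atom's neighbors.
    Returns a (N_atoms, M_nbrs) array of squared distances.
    """
    expanded = coords[:, None, :]    # (N_atoms, 1, ndim), broadcast over nbrs
    nbr_coords = coords[nbr_list]    # (N_atoms, M_nbrs, ndim) gathered coords
    diff = expanded - nbr_coords
    return np.sum(diff * diff, axis=-1)
```

Keeping the distances squared avoids a square root, which is all that is needed when the result feeds a radial symmetry function or a cutoff comparison.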

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.IterRefLSTMEmbedding(n_test, n_support, n_feat, max_depth, **kwargs)[source]

Implements the Iterative Refinement LSTM.

Much like AttnLSTMEmbedding, the IterRefLSTMEmbedding is another type of learnable metric which adjusts “test” and “support.” Recall that “support” is the small amount of data available in a low-data machine learning problem, and that “test” is the query. The AttnLSTMEmbedding only modifies the “test” based on the contents of the support. However, the IterRefLSTM modifies both the “support” and “test” based on each other. This allows the learnable metric to be more malleable than that of AttnLSTMEmbedding.
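The mutual-refinement idea can be sketched as follows. This is a conceptual NumPy sketch in which the layer's LSTM updates are replaced by simple additive attention readouts, so the names and update rules here are illustrative assumptions, not the layer's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def iterative_refinement(test, support, max_depth=3):
    # Alternately refine test and support embeddings using attention over
    # each other. The real layer performs these updates with LSTM cells;
    # the additive readout here is a simplification.
    x, xp = test.copy(), support.copy()
    for _ in range(max_depth):
        a = softmax(x @ xp.T)   # (n_test, n_support): test attends to support
        x = x + a @ xp          # refine test from support readout
        b = softmax(xp @ x.T)   # (n_support, n_test): support attends to test
        xp = xp + b @ x         # refine support from test readout
    return x, xp

rng = np.random.RandomState(0)
test, support = rng.randn(4, 8), rng.randn(6, 8)
refined_test, refined_support = iterative_refinement(test, support)
```

Note that both outputs keep the shapes of their inputs, matching the layer's documented output shapes [(n_test, n_feat), (n_support, n_feat)].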

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters: in_layers (list) – List of two tensors (X, Xp). X should be of shape (n_test, n_feat) and Xp should be of shape (n_support, n_feat), where n_test is the size of the test set, n_support that of the support set, and n_feat the number of per-atom features. Returns: a list of two tensors with the same shapes as the inputs, [(n_test, n_feat), (n_support, n_feat)].
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.L1Loss(in_layers=None, **kwargs)[source]

Compute the mean absolute difference between the elements of the inputs.

This layer should have two or three inputs. If there is a third input, the difference between the first two inputs is multiplied by the third one to produce a weighted error.
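The described behavior can be sketched in NumPy. This is an illustrative sketch of the math, not the layer's TensorFlow implementation, and the function name is an assumption.

```python
import numpy as np

def l1_loss(guess, labels, weights=None):
    # Mean absolute difference; an optional third input scales the
    # elementwise error, mirroring the weighted case described above.
    err = np.abs(guess - labels)
    if weights is not None:
        err = err * weights
    return err.mean()

guess = np.array([1.0, 2.0, 4.0])
labels = np.array([1.0, 3.0, 2.0])
plain = l1_loss(guess, labels)                                 # mean of [0, 1, 2]
weighted = l1_loss(guess, labels, np.array([1.0, 0.0, 1.0]))   # mean of [0, 0, 2]
```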

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.L2Loss(in_layers=None, **kwargs)[source]

Compute the mean squared difference between the elements of the inputs.

This layer should have two or three inputs. If there is a third input, the squared difference between the first two inputs is multiplied by the third one to produce a weighted error.
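As with L1Loss, the weighted squared error can be sketched in NumPy. This is an illustrative sketch only; the function name is an assumption.

```python
import numpy as np

def l2_loss(guess, labels, weights=None):
    # Mean squared difference; the optional third input multiplies the
    # squared error elementwise to produce a weighted error.
    err = (guess - labels) ** 2
    if weights is not None:
        err = err * weights
    return err.mean()

guess = np.array([1.0, 2.0, 4.0])
labels = np.array([1.0, 3.0, 2.0])
plain = l2_loss(guess, labels)                                 # mean of [0, 1, 4]
weighted = l2_loss(guess, labels, np.array([1.0, 0.0, 1.0]))   # mean of [0, 0, 4]
```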

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.LSTM(n_hidden, batch_size, **kwargs)[source]

A Long Short-Term Memory (LSTM) layer.

This layer expects its input to be of shape (batch_size, sequence_length, ...): a set of independent sequences (one for each element in the batch), each propagated independently through the LSTM.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.LSTMStep(output_dim, input_dim, init_fn=<function glorot_uniform>, inner_init_fn=<function orthogonal>, activation_fn=<function tanh>, inner_activation_fn=<function hard_sigmoid>, **kwargs)[source]

Layer that performs a single-step LSTM update.

Note that this is a single update step, not a full LSTM recurrent network. The LSTMStep layer is useful as a primitive for designing layers such as the AttnLSTMEmbedding or the IterRefLSTMEmbedding below.
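A single LSTM update step can be sketched in NumPy using the standard gate equations. This is an illustrative sketch under common conventions; the actual layer uses the configurable init/activation functions from its constructor, and all names here are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_tm1, c_tm1, W, U, b):
    """One LSTM update.

    x:     (batch, input_dim)   current input
    h_tm1: (batch, output_dim)  previous hidden state ("h at t-1")
    c_tm1: (batch, output_dim)  previous cell state
    W: (input_dim, 4*output_dim), U: (output_dim, 4*output_dim), b: (4*output_dim,)
    """
    z = x @ W + h_tm1 @ U + b
    i, f, c_hat, o = np.split(z, 4, axis=1)          # input/forget/candidate/output
    c = sigmoid(f) * c_tm1 + sigmoid(i) * np.tanh(c_hat)
    h = sigmoid(o) * np.tanh(c)
    return h, c

rng = np.random.RandomState(0)
batch, input_dim, output_dim = 2, 5, 3
x = rng.randn(batch, input_dim)
h0 = np.zeros((batch, output_dim))
c0 = np.zeros((batch, output_dim))
W = rng.randn(input_dim, 4 * output_dim) * 0.1
U = rng.randn(output_dim, 4 * output_dim) * 0.1
b = np.zeros(4 * output_dim)
h1, c1 = lstm_step(x, h0, c0, W, U, b)
```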

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Constructs learnable weights for this layer.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters: in_layers (list) – List of three tensors (x, h_tm1, c_tm1), where h_tm1 means “h at t-1”. Returns: h, [h + c] (list).
get_initial_states(input_shape)[source]
layer_number_dict = {}
none_tensors()[source]

Zeros out stored tensors for pickling.

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]

Sets all stored tensors.

set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.Label(**kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q()
create_tensor(in_layers=None, set_tensors=True, **kwargs)
get_pre_q_name()
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.Layer(in_layers=None, **kwargs)[source]

Bases: object

add_summary_to_tg()[source]

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)[source]

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)[source]

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters: replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one. variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model. shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)[source]

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters: summary_op (str) – summary operation to annotate node summary_description (object, optional) – Optional summary_pb2.SummaryDescription() collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)[source]

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)[source]

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list) – input layers for the shared layer. Returns: Layer
class deepchem.models.tensorgraph.layers.LayerSplitter(output_num, **kwargs)[source]

Layer which takes a tensor at a given index from in_layers[0].out_tensors. Only layers which need to output multiple tensors set and use the variable self.out_tensors. This is a utility for those special layers: it returns a layer wrapping one specific tensor in in_layers[0].out_tensors.
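Conceptually, this layer just indexes into the out_tensors list of its first input. A minimal sketch with hypothetical classes (MultiOutputLayer and ToySplitter are illustrative names, not DeepChem classes):

```python
# Illustrative sketch only; MultiOutputLayer and ToySplitter are
# hypothetical names, not DeepChem classes.

class MultiOutputLayer:
    def __init__(self, out_tensors):
        self.out_tensors = out_tensors

class ToySplitter:
    def __init__(self, output_num):
        self.output_num = output_num

    def create_tensor(self, in_layers):
        # Select one tensor from the first input layer's outputs.
        return in_layers[0].out_tensors[self.output_num]

multi = MultiOutputLayer(out_tensors=["t0", "t1", "t2"])
print(ToySplitter(1).create_tensor([multi]))  # t1
```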

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.Log(in_layers=None, **kwargs)[source]

Compute the natural log of the input.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.MaxPool1D(window_shape=2, strides=1, padding='SAME', **kwargs)[source]

A 1D max pooling on the input.

This layer expects its input to be a three dimensional tensor of shape (batch size, width, # channels).
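To make the window and stride semantics concrete, here is a toy single-channel 1D max pool in plain Python. It approximates 'SAME' padding (output length is the ceiling of width divided by stride); max_pool_1d is a hypothetical helper for illustration, not the layer's actual implementation:

```python
import math

def max_pool_1d(x, window=2, stride=1):
    # Toy single-channel 1D max pool with 'SAME'-style padding:
    # output length is ceil(len(x) / stride).
    out_len = math.ceil(len(x) / stride)
    out = []
    for i in range(out_len):
        start = i * stride
        # Slices near the end may be shorter than `window`,
        # mimicking zero-padding at the boundary.
        out.append(max(x[start:start + window]))
    return out

print(max_pool_1d([1, 3, 2, 5], window=2, stride=1))  # [3, 3, 5, 5]
```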

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.MaxPool2D(ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME', **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.MaxPool3D(ksize=[1, 2, 2, 2, 1], strides=[1, 2, 2, 2, 1], padding='SAME', **kwargs)[source]

A 3D max pooling on the input.

This layer expects its input to be a five dimensional tensor of shape (batch size, height, width, depth, # channels).

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.Multiply(in_layers=None, **kwargs)[source]

Compute the product of the input layers.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.NeighborList(N_atoms, M_nbrs, ndim, nbr_cutoff, start, stop, **kwargs)[source]

Computes a neighbor-list in Tensorflow.

Neighbor-lists (also called Verlet lists) are a tool for grouping atoms which are close to each other spatially.

TODO(rbharath): Make this layer support batching.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

compute_nbr_list(coords)[source]

Get closest neighbors for atoms.

Needs to handle padding for atoms with no neighbors.

Parameters: coords (tf.Tensor) – Shape (N_atoms, ndim)
Returns: nbr_list (tf.Tensor) – Shape (N_atoms, M_nbrs), of atom indices
copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Creates tensors associated with neighbor-listing.

get_atoms_in_nbrs(coords, cells)[source]

Get the atoms in neighboring cells for each cell.

Returns: atoms_in_nbrs – Shape (N_atoms, n_nbr_cells, M_nbrs)
get_cells()[source]

Returns the locations of all grid points in box.

Suppose start is -10 Angstrom, stop is 10 Angstrom, and nbr_cutoff is 1. Then this would return a list of length 20^3 whose entries would be [(-10, -10, -10), (-10, -10, -9), ..., (9, 9, 9)].

Returns: cells (tf.Tensor) – Shape (n_cells, ndim)
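The grid construction described above can be sketched in pure Python. get_cells here is a toy re-implementation for illustration, not the TensorFlow version used by the layer:

```python
from itertools import product

def get_cells(start, stop, nbr_cutoff, ndim=3):
    # Lattice points spaced nbr_cutoff apart along each dimension.
    ticks = []
    x = start
    while x < stop:
        ticks.append(x)
        x += nbr_cutoff
    # The Cartesian product gives every grid point in the box.
    return list(product(ticks, repeat=ndim))

cells = get_cells(-10, 10, 1)
print(len(cells))               # 8000 == 20**3
print(cells[0], cells[-1])      # (-10, -10, -10) (9, 9, 9)
```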
get_cells_for_atoms(coords, cells)[source]

Compute the cells each atom belongs to.

Parameters: coords (tf.Tensor) – Shape (N_atoms, ndim)
cells (tf.Tensor) – Shape (n_cells, ndim)
Returns: cells_for_atoms (tf.Tensor) – Shape (N_atoms, 1)
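A sketch of this assignment in plain Python, assuming each atom maps to its nearest cell center (an assumption for illustration; this is not taken from the layer's implementation, and cells_for_atoms is a hypothetical helper):

```python
def cells_for_atoms(coords, cells):
    # Map each atom to the index of its nearest cell center
    # (squared Euclidean distance). Illustrative assumption only.
    out = []
    for atom in coords:
        dists = [sum((a - c) ** 2 for a, c in zip(atom, cell))
                 for cell in cells]
        out.append(dists.index(min(dists)))
    return out

cells = [(0.0,), (1.0,), (2.0,)]                 # three cells on a 1-D grid
print(cells_for_atoms([(0.2,), (1.7,)], cells))  # [0, 2]
```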
get_closest_atoms(coords, cells)[source]

For each cell, find M_nbrs closest atoms.

Let N_atoms be the number of atoms.

Parameters: coords (tf.Tensor) – Shape (N_atoms, ndim)
cells (tf.Tensor) – Shape (n_cells, ndim)
Returns: closest_inds (tf.Tensor) – Shape (n_cells, M_nbrs)
get_neighbor_cells(cells)[source]

Compute neighbors of cells in grid.

TODO(rbharath): Do we need to handle periodic boundary conditions properly here?
TODO(rbharath): This doesn’t handle boundaries well. We hard-code looking for n_nbr_cells neighbors, which isn’t right for boundary cells in the cube.

Parameters: cells (tf.Tensor) – Shape (n_cells, ndim)
Returns: nbr_cells (tf.Tensor) – Shape (n_cells, n_nbr_cells)
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared copy
Returns: Layer
class deepchem.models.tensorgraph.layers.ReLU(in_layers=None, **kwargs)[source]

Compute the ReLU activation of the input: f(x) = relu(x). Only one input is allowed; the output will have the same shape as the input.

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:
replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a map of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
summary_op (str) – summary operation to annotate the node
summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
collections (list of graph collection keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.
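The positional matching of values to variables can be sketched as follows. `ToyLayer` is hypothetical (not the DeepChem API); it only shows that values are paired with variables by position, in the order the layer reports them:

```python
# Sketch of the initial-value override: values are matched to variables
# by position, in the order the layer reports them.
# "ToyLayer" is a hypothetical class, not the DeepChem API.
class ToyLayer:
    def __init__(self):
        # dict insertion order doubles as the reporting order here
        self._variables = {"W": None, "b": None}
        self._initial_values = None

    def get_layer_variables(self):
        return list(self._variables)

    def set_variable_initial_values(self, values):
        self._initial_values = list(values)

    def initialize(self):
        # Pair each variable with its value positionally.
        for name, value in zip(self.get_layer_variables(), self._initial_values):
            self._variables[name] = value

layer = ToyLayer()
layer.set_variable_initial_values([[1.0, 2.0], [0.5]])  # W first, then b
layer.initialize()
print(layer._variables["b"])  # [0.5]
```

Because the pairing is purely positional, supplying values in a different order than get_layer_variables() reports them would silently assign the wrong values.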

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
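The difference between clone() and shared() comes down to variable storage. The sketch below uses a hypothetical `ToyDense` class (not the DeepChem API) in which the shared copy reuses the original layer's mutable variable storage, so an update to one is visible in the other:

```python
# Sketch of shared(): two layer objects hold references to the SAME
# variable storage, so updating one updates the other.
# "ToyDense" is a hypothetical class, not the DeepChem API.
class ToyDense:
    def __init__(self, weights=None):
        # Variables live in a mutable list so they can be shared by reference.
        self.variables = weights if weights is not None else [0.0]

    def shared(self):
        # The shared copy reuses this layer's variable storage.
        return ToyDense(weights=self.variables)

    def clone(self):
        # A clone gets an independent copy of the variables.
        return ToyDense(weights=list(self.variables))

a = ToyDense()
b = a.shared()
c = a.clone()
a.variables[0] = 3.14
print(b.variables[0])  # 3.14 — the shared copy sees the update
print(c.variables[0])  # 0.0  — the clone does not
```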
class deepchem.models.tensorgraph.layers.ReduceMax(in_layers=None, axis=None, **kwargs)[source]
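The source does not spell out the reduction semantics, but ReduceMax presumably delegates to TensorFlow's reduce_max, whose axis behavior can be illustrated with NumPy (the same axis semantics apply to the other Reduce* layers below):

```python
import numpy as np

# Axis semantics shared by the Reduce* layers, illustrated with NumPy.
x = np.array([[1.0, 5.0, 3.0],
              [4.0, 2.0, 6.0]])

print(np.max(x))          # axis=None: reduce over all elements -> 6.0
print(np.max(x, axis=0))  # reduce across rows    -> [4. 5. 6.]
print(np.max(x, axis=1))  # reduce across columns -> [5. 6.]
```

With the default axis=None, the layer's output is a scalar; with an integer axis, that dimension is removed from the output shape.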
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.ReduceMean(in_layers=None, axis=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.ReduceSquareDifference(in_layers=None, axis=None, **kwargs)[source]
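Unlike the single-input reductions above, this layer takes two input tensors and reduces their elementwise squared difference (an MSE-style quantity). The sketch below illustrates that computation with NumPy; whether the layer reduces by mean or by sum is an assumption here, so check the source before relying on it:

```python
import numpy as np

# Squared-difference reduction (MSE-like), illustrated with NumPy.
# The choice of mean (vs. sum) reduction is an assumption.
pred   = np.array([1.0, 2.0, 3.0])
labels = np.array([1.0, 2.5, 2.0])

sq_diff = (pred - labels) ** 2  # elementwise squared difference
print(sq_diff)                  # [0.   0.25 1.  ]
print(sq_diff.mean())           # mean reduction over all elements
```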
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:

- summary_op (str) – summary operation to annotate the node
- summary_description (object, optional) – optional summary_pb2.SummaryDescription()
- collections (list of graph collection keys, optional) – the new summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layer) – input layers for the shared layer
class deepchem.models.tensorgraph.layers.ReduceSum(in_layers=None, axis=None, **kwargs)[source]
add_summary_to_tg()

Can only be called after self.create_layer, to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)


The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> new_input = Feature(shape=(None, 100))
>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)

Parameters:

- replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
- variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
- shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.