deepchem.models.tensorgraph package

Submodules

deepchem.models.tensorgraph.graph_layers module

class deepchem.models.tensorgraph.graph_layers.DAGGather(n_graph_feat=30, n_outputs=30, max_atoms=50, layer_sizes=[100], init='glorot_uniform', activation='relu', dropout=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.DAGGather.
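
A minimal construction sketch, assuming the usual imports (tensorflow as tf; Feature from deepchem.models.tensorgraph.layers). Shapes, dtypes, and parameter values are illustrative assumptions; the parents follow the list under create_tensor below:

>>> atom_features = Feature(shape=(None, 30))
>>> membership = Feature(shape=(None,), dtype=tf.int32)
>>> dag_gather = DAGGather(n_graph_feat=30, n_outputs=30, max_atoms=50,
...                        in_layers=[atom_features, membership])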

DAGgraph_step(batch_inputs, W_list, b_list)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.DAGGather. Parent layers: atom_features, membership.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.DAGLayer(n_graph_feat=30, n_atom_feat=75, max_atoms=50, layer_sizes=[100], init='glorot_uniform', activation='relu', dropout=None, batch_size=64, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.DAGLayer.
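
A minimal construction sketch (shapes and dtypes are illustrative assumptions matching the parents listed under create_tensor):

>>> atom_features = Feature(shape=(None, 75))
>>> parents = Feature(shape=(None, 50, 50), dtype=tf.int32)
>>> calculation_orders = Feature(shape=(None, 50), dtype=tf.int32)
>>> calculation_masks = Feature(shape=(None, 50), dtype=tf.bool)
>>> n_atoms = Feature(shape=(), dtype=tf.int32)
>>> dag_layer = DAGLayer(n_graph_feat=30, n_atom_feat=75, max_atoms=50, batch_size=64,
...                      in_layers=[atom_features, parents, calculation_orders,
...                                 calculation_masks, n_atoms])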

DAGgraph_step(batch_inputs, W_list, b_list)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.DAGLayer. Parent layers: atom_features, parents, calculation_orders, calculation_masks, n_atoms.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.DTNNEmbedding(n_embedding=30, periodic_table_length=30, init='glorot_uniform', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.DTNNEmbedding.
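
A minimal construction sketch (the parent is the atom_number input listed under create_tensor; the dtype is an assumption):

>>> atom_numbers = Feature(shape=(None,), dtype=tf.int32)
>>> embedding = DTNNEmbedding(n_embedding=30, periodic_table_length=30,
...                           in_layers=[atom_numbers])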

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.DTNNEmbedding. Parent layers: atom_number.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.DTNNExtract(task_id, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.DTNNGather(n_embedding=30, n_outputs=100, layer_sizes=[100], output_activation=True, init='glorot_uniform', activation='tanh', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.DTNNGather.
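
A minimal construction sketch (shapes and dtypes are illustrative assumptions; the parents follow the list under create_tensor):

>>> atom_features = Feature(shape=(None, 30))
>>> atom_membership = Feature(shape=(None,), dtype=tf.int32)
>>> dtnn_gather = DTNNGather(n_embedding=30, n_outputs=100,
...                          in_layers=[atom_features, atom_membership])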

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.DTNNGather. Parent layers: atom_features, atom_membership.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.DTNNStep(n_embedding=30, n_distance=100, n_hidden=60, init='glorot_uniform', activation='tanh', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.DTNNStep.
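
A minimal construction sketch (shapes and dtypes are illustrative assumptions; the parents follow the list under create_tensor):

>>> atom_features = Feature(shape=(None, 30))
>>> distance = Feature(shape=(None, 100))
>>> distance_membership_i = Feature(shape=(None,), dtype=tf.int32)
>>> distance_membership_j = Feature(shape=(None,), dtype=tf.int32)
>>> dtnn_step = DTNNStep(n_embedding=30, n_distance=100,
...                      in_layers=[atom_features, distance,
...                                 distance_membership_i, distance_membership_j])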

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.DTNNStep. Parent layers: atom_features, distance, distance_membership_i, distance_membership_j.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.EdgeNetwork(pair_features, n_pair_features=8, n_hidden=100, init='glorot_uniform')[source]

Bases: object

Submodule for Message Passing

forward(atom_features, atom_to_pair)[source]
none_tensors()[source]
set_tensors(tensor)[source]
class deepchem.models.tensorgraph.graph_layers.GatedRecurrentUnit(n_hidden=100, init='glorot_uniform')[source]

Bases: object

Submodule for Message Passing

forward(inputs, messages)[source]
none_tensors()[source]
set_tensors(tensor)[source]
class deepchem.models.tensorgraph.graph_layers.MessagePassing(T, message_fn='enn', update_fn='gru', n_hidden=100, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

General class for MPNN; default structures are built according to https://arxiv.org/abs/1511.06391.
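
A minimal construction sketch; it assumes (shapes included) that the layer’s parents are atom features, pair features, and an atom-to-pair index map:

>>> atom_features = Feature(shape=(None, 75))
>>> pair_features = Feature(shape=(None, 14))
>>> atom_to_pair = Feature(shape=(None, 2), dtype=tf.int32)
>>> mp = MessagePassing(T=3, message_fn='enn', update_fn='gru', n_hidden=100,
...                     in_layers=[atom_features, pair_features, atom_to_pair])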

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build(pair_features, n_pair_features)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Perform T steps of message passing

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.SetGather(M, batch_size, n_hidden=100, init='orthogonal', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Set2set gather layer for graph-based models. Models using this layer must set pad_batches=True.
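
A minimal construction sketch; it assumes the parents are the per-atom features to be gathered and an atom-to-molecule membership vector (shapes are illustrative):

>>> atom_features = Feature(shape=(None, 100))
>>> atom_split = Feature(shape=(None,), dtype=tf.int32)
>>> readout = SetGather(M=6, batch_size=64, n_hidden=100,
...                     in_layers=[atom_features, atom_split])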

LSTMStep(h, c, x=None)[source]
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Perform M steps of set2set gather; detailed description in https://arxiv.org/abs/1511.06391.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.WeaveGather(batch_size, n_input=128, gaussian_expand=False, init='glorot_uniform', activation='tanh', epsilon=0.001, momentum=0.99, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.WeaveGather.
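
A minimal construction sketch (shapes and dtypes are illustrative assumptions; the parents follow the list under create_tensor):

>>> atom_features = Feature(shape=(None, 128))
>>> atom_split = Feature(shape=(None,), dtype=tf.int32)
>>> weave_gather = WeaveGather(batch_size=64, n_input=128, gaussian_expand=True,
...                            in_layers=[atom_features, atom_split])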

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.WeaveGather. Parent layers: atom_features, atom_split.

gaussian_histogram(x)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.graph_layers.WeaveLayer(n_atom_input_feat=75, n_pair_input_feat=14, n_atom_output_feat=50, n_pair_output_feat=50, n_hidden_AA=50, n_hidden_PA=50, n_hidden_AP=50, n_hidden_PP=50, update_pair=True, init='glorot_uniform', activation='relu', dropout=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

TensorGraph-style implementation; the same as deepchem.nn.WeaveLayer. Note: use WeaveLayerFactory to construct this layer.
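
A minimal construction sketch using the factory, as the note advises (shapes and dtypes are illustrative assumptions; the parents follow the list under create_tensor):

>>> atom_features = Feature(shape=(None, 75))
>>> pair_features = Feature(shape=(None, 14))
>>> pair_split = Feature(shape=(None,), dtype=tf.int32)
>>> atom_to_pair = Feature(shape=(None, 2), dtype=tf.int32)
>>> weave = WeaveLayerFactory(n_atom_input_feat=75, n_pair_input_feat=14,
...                           in_layers=[atom_features, pair_features,
...                                      pair_split, atom_to_pair])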

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Construct internal trainable weights.

TODO(rbharath): Need to make this not set instance variables to follow style in other layers.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

For description and explanation, refer to deepchem.nn.WeaveLayer. Parent layers: [atom_features, pair_features], pair_split, atom_to_pair.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

deepchem.models.tensorgraph.graph_layers.WeaveLayerFactory(**kwargs)[source]

deepchem.models.tensorgraph.layers module

class deepchem.models.tensorgraph.layers.ANIFeat(in_layers, max_atoms=23, radial_cutoff=4.6, angular_cutoff=3.1, radial_length=32, angular_length=8, atom_cases=[1, 6, 7, 8, 16], atomic_number_differentiated=True, coordinates_in_bohr=True, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Performs the transform from 3D coordinates to ANI symmetry functions.
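
A minimal construction sketch; per create_tensor below, the input layer carries an atomic number plus 3D coordinates for each atom (the layout of the 4 columns is an assumption):

>>> conformers = Feature(shape=(None, 23, 4))
>>> ani_feat = ANIFeat(in_layers=[conformers], max_atoms=23)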

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

angular_symmetry(d_cutoff, d, atom_numbers, coordinates)[source]

Angular Symmetry Function

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Input layers should be of dtype tf.float32 and shape (None, self.max_atoms, 4).

distance_cutoff(d, cutoff, flags)[source]

Generate distance matrix with trainable cutoff

distance_matrix(coordinates, flags)[source]

Generate distance matrix

get_num_feats()[source]
layer_number_dict = {}
none_tensors()
radial_symmetry(d_cutoff, d, atom_numbers)[source]

Radial Symmetry Function

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.layers.Add(in_layers=None, weights=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute the (optionally weighted) sum of the input layers.
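
A minimal sketch of a weighted sum of two inputs (shapes and weights are illustrative assumptions):

>>> a = Feature(shape=(None, 100))
>>> b = Feature(shape=(None, 100))
>>> total = Add(in_layers=[a, b], weights=[1.0, 2.0])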

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

deepchem.models.tensorgraph.layers.AlphaShare(in_layers=None, **kwargs)[source]

This function should be used when constructing AlphaShare layers for sluice networks.

Parameters: in_layers (list of Layers or tensors) – tensors in the list must be the same size, and the list must include two or more tensors
Returns: output_layers (list of Layers or tensors with the same size as in_layers)

References:
Sluice networks: Learning what to share between loosely related tasks, https://arxiv.org/abs/1705.08142
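
A minimal usage sketch; it assumes two same-sized task layers built on a common input, with one output layer returned per input:

>>> feat = Feature(shape=(None, 100))
>>> task1 = Dense(100, in_layers=feat)
>>> task2 = Dense(100, in_layers=feat)
>>> shared1, shared2 = AlphaShare(in_layers=[task1, task2])
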
class deepchem.models.tensorgraph.layers.AlphaShareLayer(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Part of a sluice network. Adds alpha parameters to control sharing between the main and auxiliary tasks.

Factory method AlphaShare should be used for construction

Parameters: in_layers (list of Layers or tensors) – tensors in the list must be the same size, and the list must include two or more tensors
Returns: out_tensor (a tensor with shape [len(in_layers), x, y], where x, y are the original layer dimensions)
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters: in_layers (list of Layers) – input layers for the shared layer.
Return type: Layer

class deepchem.models.tensorgraph.layers.AtomicConvolution(atom_types=None, radial_params=[], boxsize=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
Parameters:
  • X (tf.Tensor of shape (B, N, d)) – Coordinates/features.
  • Nbrs (tf.Tensor of shape (B, N, M)) – Neighbor list.
  • Nbrs_Z (tf.Tensor of shape (B, N, M)) – Atomic numbers of neighbor atoms.
Returns:

layer – A new tensor representing the output of the atomic convolution layer.

Return type:

tf.Tensor of shape (B, N, l)
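
For orientation, here is a minimal sketch of wiring these inputs up as parent layers. The shapes, atom types, and the [cutoff, mean, width] layout of radial_params are illustrative assumptions, not values prescribed by this documentation.

>>> import tensorflow as tf
>>> from deepchem.models.tensorgraph.layers import AtomicConvolution, Feature
>>> B, N, M, d = 16, 50, 6, 3  # assumed batch/atom/neighbor/feature sizes
>>> X = Feature(shape=(B, N, d))                     # coordinates
>>> Nbrs = Feature(shape=(B, N, M), dtype=tf.int32)  # neighbor-list indices
>>> Nbrs_Z = Feature(shape=(B, N, M))                # neighbor atomic numbers
>>> conv = AtomicConvolution(atom_types=[1., 6., 7., 8.],
...                          radial_params=[[12.0, 0.0, 0.04]],
...                          in_layers=[X, Nbrs, Nbrs_Z])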

distance_matrix(D)[source]

Calculates the distance matrix from the distance tensor.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:D (tf.Tensor of shape (B, N, M, d)) – Distance tensor.
Returns:R – Distance matrix.
Return type:tf.Tensor of shape (B, N, M)
distance_tensor(X, Nbrs, boxsize, B, N, M, d)[source]

Calculates distance tensor for batch of molecules.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:
  • X (tf.Tensor of shape (B, N, d)) – Coordinates/features tensor.
  • Nbrs (tf.Tensor of shape (B, N, M)) – Neighbor list tensor.
  • boxsize (float or None) – Simulation box length [Angstrom].
Returns:

D – Coordinates/features distance tensor.

Return type:

tf.Tensor of shape (B, N, M, d)
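
The following NumPy sketch (a hedged reconstruction for a single batch entry, not the layer’s actual TensorFlow code) illustrates what distance_tensor and distance_matrix compute, including the periodic wrapping applied when boxsize is given.

>>> import numpy as np
>>> def distance_tensor_np(X, Nbrs, boxsize=None):
...   # X: (N, d) coordinates; Nbrs: (N, M) integer neighbor indices
...   D = X[Nbrs] - X[:, None, :]  # (N, M, d) displacement vectors
...   if boxsize is not None:
...     D -= boxsize * np.round(D / boxsize)  # wrap into the periodic box
...   return D
>>> def distance_matrix_np(D):
...   return np.sqrt(np.sum(D ** 2, axis=-1))  # (N, M) Euclidean distances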

gather_neighbors(X, nbr_indices, B, N, M, d)[source]

Gathers the neighbor subsets of the atoms in X.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_features

Parameters:
  • X (tf.Tensor of shape (B, N, d)) – Coordinates/features tensor.
  • nbr_indices (tf.Tensor of shape (B, M)) – Neighbor list for a single atom.
Returns:

neighbors – Neighbor coordinates/features tensor for single atom.

Return type:

tf.Tensor of shape (B, M, d)

gaussian_distance_matrix(R, rs, e)[source]

Calculates gaussian distance matrix.

B = batch_size, N = max_num_atoms, M = max_num_neighbors

Parameters:
  • R (tf.Tensor of shape (B, N, M)) – Distance matrix.
  • rs (tf.Variable) – Gaussian distance matrix mean.
  • e (tf.Variable) – Gaussian distance matrix width (e = .5/std**2).
Returns:

retval – Gaussian distance matrix.

Return type:

tf.Tensor of shape (B, N, M)

layer_number_dict = {}
none_tensors()
radial_cutoff(R, rc)[source]

Calculates radial cutoff matrix.

B = batch_size, N = max_num_atoms, M = max_num_neighbors

Parameters:
  • R (tf.Tensor of shape (B, N, M)) – Distance matrix.
  • rc (tf.Variable) – Interaction cutoff [Angstrom].
Returns:

FC – Radial cutoff matrix.

Return type:

tf.Tensor of shape (B, N, M)

radial_symmetry_function(R, rc, rs, e)[source]

Calculates radial symmetry function.

B = batch_size, N = max_num_atoms, M = max_num_neighbors, d = num_filters

Parameters:
  • R (tf.Tensor of shape (B, N, M)) – Distance matrix.
  • rc (float) – Interaction cutoff [Angstrom].
  • rs (float) – Gaussian distance matrix mean.
  • e (float) – Gaussian distance matrix width.
Returns:

retval – Radial symmetry function (before summation)

Return type:

tf.Tensor of shape (B, N, M)
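
Read together with radial_cutoff and gaussian_distance_matrix above, this appears to be a Gaussian of the distances masked by a smooth cosine cutoff. The NumPy sketch below is a hedged reconstruction from the parameter descriptions, not the layer’s exact code.

>>> import numpy as np
>>> def radial_symmetry_function_np(R, rc, rs, e):
...   fc = np.where(R <= rc, 0.5 * (np.cos(np.pi * R / rc) + 1.0), 0.0)
...   return np.exp(-e * (R - rs) ** 2) * fc  # (B, N, M), before summation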

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.AttnLSTMEmbedding(n_test, n_support, n_feat, max_depth, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Implements AttnLSTM as in matching networks paper.

The AttnLSTM embedding adjusts two sets of vectors, the “test” and “support” sets. The “support” set consists of a set of evidence vectors. Think of these as the small training set for low-data machine learning. The “test” set consists of the queries we wish to answer with the small amount of available data. The AttnLSTMEmbedding allows us to modify the embedding of the “test” set depending on the contents of the “support” set. The AttnLSTMEmbedding is thus a type of learnable metric that allows a network to modify its internal notion of distance.

References: Matching Networks for One Shot Learning https://arxiv.org/pdf/1606.04080v1.pdf

Order Matters: Sequence to sequence for sets https://arxiv.org/abs/1511.06391

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters:in_layers (list) – List of two tensors (X, Xp). X should be of shape (n_test, n_feat) and Xp should be of shape (n_support, n_feat) where n_test is the size of the test set, n_support that of the support set, and n_feat is the number of per-atom features.
Returns:Two tensors with the same shapes as the inputs; that is, the output shapes will be [(n_test, n_feat), (n_support, n_feat)]
Return type:list
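
A minimal usage sketch; the sizes below are illustrative assumptions chosen to match the shape description above.

>>> from deepchem.models.tensorgraph.layers import AttnLSTMEmbedding, Feature
>>> n_test, n_support, n_feat = 10, 5, 75    # assumed sizes
>>> X = Feature(shape=(n_test, n_feat))      # queries
>>> Xp = Feature(shape=(n_support, n_feat))  # evidence vectors
>>> embedding = AttnLSTMEmbedding(n_test, n_support, n_feat, max_depth=3,
...                               in_layers=[X, Xp])
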
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.BatchNorm(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.BatchNormalization(epsilon=1e-05, axis=-1, momentum=0.99, beta_init='zero', gamma_init='one', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

add_weight(shape, initializer, name=None)[source]
build(input_shape)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.BetaShare(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Part of a sluice network. Adds beta parameters that control which layer outputs are used for prediction.

Parameters:in_layers (list of Layers or tensors) – tensors in the list must all be the same size, and the list must include two or more tensors
Returns:output_layers – a beta-weighted combination of the input layers
Return type:list of Layers or tensors with same size as in_layers
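
A minimal usage sketch (the Dense towers and sizes below are illustrative assumptions): BetaShare takes two or more same-sized parent layers, such as the outputs of parallel towers in a sluice network.

>>> from deepchem.models.tensorgraph.layers import BetaShare, Dense, Feature
>>> features = Feature(shape=(None, 64))
>>> tower1 = Dense(32, in_layers=features)
>>> tower2 = Dense(32, in_layers=features)
>>> combined = BetaShare(in_layers=[tower1, tower2])  # inputs must match in size
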
add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

The sizes of the input layers must all be the same.

layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.CombineMeanStd(in_layers=None, training_only=False, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Generate Gaussian noise.
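
A hedged sketch of the usual reparameterization pattern, assuming the first parent layer supplies means and the second standard deviations (the names and sizes below are illustrative).

>>> from deepchem.models.tensorgraph.layers import CombineMeanStd, Dense, Feature
>>> encoded = Feature(shape=(None, 32))
>>> mean = Dense(16, in_layers=encoded)
>>> std = Dense(16, in_layers=encoded)
>>> latent = CombineMeanStd(in_layers=[mean, std], training_only=True)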

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Concat(in_layers=None, axis=1, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Constant(value, dtype=tf.float32, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Output a constant value.
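
A minimal illustrative example:

>>> import numpy as np
>>> from deepchem.models.tensorgraph.layers import Constant
>>> bias = Constant(np.array([1.0, 2.0, 3.0]))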

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Conv1D(width, out_channels, stride=1, padding='SAME', activation_fn=<function relu>, biases_initializer=<class 'tensorflow.python.ops.init_ops.RandomNormal'>, weights_initializer=<class 'tensorflow.python.ops.init_ops.RandomNormal'>, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

A 1D convolution on the input.

This layer expects its input to be a three dimensional tensor of shape (batch size, width, # channels). If there is only one channel, the third dimension may optionally be omitted.
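
A minimal shape sketch (the width and channel counts below are illustrative assumptions):

>>> from deepchem.models.tensorgraph.layers import Conv1D, Feature
>>> signal = Feature(shape=(None, 128, 4))  # (batch, width, channels)
>>> conv = Conv1D(width=8, out_channels=16, in_layers=signal)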

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of tensors) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Conv2D(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.SharedVariableScope

A 2D convolution on the input.

This layer expects its input to be a four dimensional tensor of shape (batch size, height, width, # channels). If there is only one channel, the fourth dimension may optionally be omitted.
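
A minimal shape sketch (the image size and filter counts below are illustrative assumptions):

>>> from deepchem.models.tensorgraph.layers import Conv2D, Feature
>>> images = Feature(shape=(None, 28, 28, 1))  # (batch, height, width, channels)
>>> conv = Conv2D(num_outputs=32, kernel_size=3, stride=2, in_layers=images)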

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

class deepchem.models.tensorgraph.layers.Conv2DTranspose(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.SharedVariableScope

A transposed 2D convolution on the input.

This layer is typically used for upsampling in a deconvolutional network. It expects its input to be a four dimensional tensor of shape (batch size, height, width, # channels). If there is only one channel, the fourth dimension may optionally be omitted.
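
A hedged upsampling sketch; with stride 2, the spatial dimensions should roughly double (the sizes below are illustrative assumptions).

>>> from deepchem.models.tensorgraph.layers import Conv2DTranspose, Feature
>>> low_res = Feature(shape=(None, 7, 7, 64))
>>> upsampled = Conv2DTranspose(num_outputs=32, kernel_size=5, stride=2,
...                             in_layers=low_res)  # about (batch, 14, 14, 32)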

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

class deepchem.models.tensorgraph.layers.Conv3D(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.SharedVariableScope

A 3D convolution on the input.

This layer expects its input to be a five dimensional tensor of shape (batch size, height, width, depth, # channels). If there is only one channel, the fifth dimension may optionally be omitted.
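
A minimal shape sketch with an illustrative single-channel volume:

>>> from deepchem.models.tensorgraph.layers import Conv3D, Feature
>>> volume = Feature(shape=(None, 16, 16, 16, 1))  # (batch, h, w, depth, channels)
>>> conv = Conv3D(num_outputs=8, kernel_size=3, in_layers=volume)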

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

class deepchem.models.tensorgraph.layers.Conv3DTranspose(num_outputs, kernel_size=5, stride=1, padding='SAME', activation_fn=<function relu>, normalizer_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function xavier_initializer>, scope_name=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.SharedVariableScope

A transposed 3D convolution on the input.

This layer is typically used for upsampling in a deconvolutional network. It expects its input to be a five dimensional tensor of shape (batch size, height, width, depth, # channels). If there is only one channel, the fifth dimension may optionally be omitted.
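
A hedged 3D upsampling sketch, mirroring the 2D case above (sizes below are illustrative assumptions):

>>> from deepchem.models.tensorgraph.layers import Conv3DTranspose, Feature
>>> low_res = Feature(shape=(None, 8, 8, 8, 32))
>>> upsampled = Conv3DTranspose(num_outputs=16, kernel_size=5, stride=2,
...                             in_layers=low_res)  # about (batch, 16, 16, 16, 16)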

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

class deepchem.models.tensorgraph.layers.Dense(out_channels, activation_fn=None, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function variance_scaling_initializer>, time_series=False, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.SharedVariableScope

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)
class deepchem.models.tensorgraph.layers.Divide(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute the ratio of the input layers.
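
A small sketch of the elementwise behavior (assuming, as elsewhere in this API, that generators yield dicts mapping input layers to arrays and that outputs can be evaluated with predict_on_generator without setting a training loss):

>>> import numpy as np
>>> import deepchem as dc
>>> from deepchem.models.tensorgraph.layers import Divide, Feature
>>> tg = dc.models.TensorGraph()
>>> numerator = Feature(shape=(None, 1))
>>> denominator = Feature(shape=(None, 1))
>>> ratio = Divide(in_layers=[numerator, denominator])  # elementwise numerator / denominator
>>> tg.add_output(ratio)
>>> batch = {numerator: np.array([[6.0]]), denominator: np.array([[2.0]])}
>>> tg.predict_on_generator([batch])  # approximately [[3.0]]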

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer
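
A brief sketch contrasting clone() and shared() (layer names below are illustrative):

>>> from deepchem.models.tensorgraph.layers import Dense, Feature
>>> input1 = Feature(shape=(None, 100))
>>> input2 = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input1)
>>> dense1_clone = dense1.clone([input2])    # independent copy with its own variables
>>> dense1_shared = dense1.shared([input2])  # copy that reuses dense1's variables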

class deepchem.models.tensorgraph.layers.Dropout(dropout_prob, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer
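
For example (a sketch; dropout_prob is taken here to be the probability of dropping each activation during training, with dropout disabled at prediction time):

>>> import tensorflow as tf
>>> from deepchem.models.tensorgraph.layers import Dense, Dropout, Feature
>>> features = Feature(shape=(None, 100))
>>> hidden = Dense(out_channels=256, activation_fn=tf.nn.relu, in_layers=[features])
>>> regularized = Dropout(dropout_prob=0.5, in_layers=[hidden])  # drop half the activations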

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Exp(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute the exponential of the input.
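
A small sketch (same hedged prediction pattern as the Divide example above):

>>> import numpy as np
>>> import deepchem as dc
>>> from deepchem.models.tensorgraph.layers import Exp, Feature
>>> tg = dc.models.TensorGraph()
>>> x = Feature(shape=(None, 3))
>>> y = Exp(in_layers=[x])  # elementwise e**x
>>> tg.add_output(y)
>>> tg.predict_on_generator([{x: np.array([[0.0, 1.0, 2.0]])}])  # ~[[1.0, 2.718, 7.389]]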

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Feature(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Input

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q(batch_size)
create_tensor(in_layers=None, set_tensors=True, **kwargs)
get_pre_q_name()
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Flatten(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Flatten every dimension except the first.
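
For example (a sketch; the shape below is hypothetical):

>>> from deepchem.models.tensorgraph.layers import Feature, Flatten
>>> images = Feature(shape=(None, 8, 8, 3))   # (batch, height, width, channel)
>>> flat = Flatten(in_layers=[images])        # flattened to shape (batch, 192)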

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.GRU(n_hidden, batch_size, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

A Gated Recurrent Unit.

This layer expects its input to be of shape (batch_size, sequence_length, ...). It consists of a set of independent sequences (one for each element in the batch), each of which is propagated independently through the GRU.
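
For example (a sketch; the sequence length and feature width are hypothetical, and the output is assumed to be the per-step hidden states of shape (batch_size, sequence_length, n_hidden)):

>>> from deepchem.models.tensorgraph.layers import Feature, GRU
>>> batch_size = 16
>>> sequences = Feature(shape=(batch_size, 20, 32))  # 20 steps of 32 features each
>>> gru = GRU(n_hidden=64, batch_size=batch_size, in_layers=[sequences])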

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Gather(in_layers=None, indices=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Gather elements or slices from the input.
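
For example (a sketch, assuming Gather wraps tf.gather_nd, so each index is itself a list giving a position along the leading dimensions):

>>> from deepchem.models.tensorgraph.layers import Feature, Gather
>>> x = Feature(shape=(4, 2))
>>> rows = Gather(in_layers=[x], indices=[[0], [2]])  # select rows 0 and 2, shape (2, 2)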

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.GraphCNN(num_filters, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

GraphCNN Layer from Robust Spatial Filtering with Graph Convolutional Neural Networks https://arxiv.org/abs/1703.00792

Spatial-domain convolutions can be defined as

$H = h_0 I + h_1 A + h_2 A^2 + \dots + h_k A^k$, where $H \in \mathbb{R}^{N \times N}$

We approximate this by $H \approx h_0 I + h_1 A$

A convolution can then be defined as applying several of these linear filters over edges of different types (think up, down, left, right, diagonal in images), where each edge type has its own adjacency matrix:

$H \approx h_0 I + h_1 A_1 + h_2 A_2 + \dots + h_{L-1} A_{L-1}$

$V_{out} = \sum_{c=1}^{C} H^{(c)} V^{(c)} + b$
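
To make the filter concrete, here is a NumPy sketch of the one-hop approximation $H \approx h_0 I + h_1 A$ acting on vertex features (illustrative only; in the layer itself the filter taps are learned per channel in TensorFlow):

>>> import numpy as np
>>> N, C = 4, 3                     # vertices, channels
>>> A = np.random.rand(N, N)        # adjacency matrix for one edge type
>>> V = np.random.rand(N, C)        # vertex features
>>> h0, h1 = 0.5, 0.25              # filter taps, fixed here for illustration
>>> H = h0 * np.eye(N) + h1 * A     # one-hop spatial filter
>>> V_out = H @ V                   # filtered vertex features, shape (N, C)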

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

batch_mat_mult(A, B)[source]
clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
graphConvolution(V, A)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

deepchem.models.tensorgraph.layers.GraphCNNPool(num_vertices, **kwargs)[source]
class deepchem.models.tensorgraph.layers.GraphConv(out_channel, min_deg=0, max_deg=10, activation_fn=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensors)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list tensor) –
  • in tensors for the shared layer (List) –
Returns:

Return type:

Layer

sum_neigh(atoms, deg_adj_lists)[source]

Store the summed atoms by degree

class deepchem.models.tensorgraph.layers.GraphEmbedPoolLayer(num_vertices, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

GraphCNNPool Layer from Robust Spatial Filtering with Graph Convolutional Neural Networks https://arxiv.org/abs/1703.00792

This is a learnable pooling operation. It constructs a new adjacency matrix for a graph with a specified number of nodes.

This differs from our other pooling operations, which set vertices to a function value without altering the adjacency matrix.

$V_{emb} = \mathrm{SpatialGraphCNN}(V_{in})$

$V_{out} = \sigma(V_{emb})^{T} V_{in}$

$A_{out} = V_{emb}^{T} A_{in} V_{emb}$
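
A NumPy sketch of the pooling equations above, with $\sigma$ taken to be a softmax over the embedding dimension (illustrative only; in the layer, V_emb comes from a learned SpatialGraphCNN):

>>> import numpy as np
>>> N, C, K = 10, 8, 4                          # input vertices, channels, pooled vertices
>>> V_in, A_in = np.random.rand(N, C), np.random.rand(N, N)
>>> V_emb = np.random.rand(N, K)                # stand-in for the SpatialGraphCNN output
>>> e = np.exp(V_emb - V_emb.max(axis=-1, keepdims=True))
>>> P = e / e.sum(axis=-1, keepdims=True)       # sigma(V_emb): soft vertex assignments
>>> V_out = P.T @ V_in                          # (K, C) pooled vertex features
>>> A_out = V_emb.T @ A_in @ V_emb              # (K, K) pooled adjacency matrix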

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
Parameters:
  • num_filters (int) – number of filters in the output
  • in_layers (list of Layers or tensors) – [V, A, mask]:

    V are the vertex features and must be of shape (batch, vertex, channel).

    A are the adjacency matrices for each graph, of shape (batch, from_vertex, adj_matrix, to_vertex).

    mask is optional, to be used when not every graph has the same number of vertices.

Returns:

A tf.Tensor with a graph convolution applied; the shape will be (batch, vertex, self.num_filters).
embedding_factors(V, no_filters, name='default')[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

softmax_factors(V, axis=1, name=None)[source]
class deepchem.models.tensorgraph.layers.GraphGather(batch_size, activation_fn=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.GraphPool(min_degree=0, max_degree=10, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Highway(activation_fn=<function relu>, biases_initializer=<class 'tensorflow.python.ops.init_ops.Zeros'>, weights_initializer=<function variance_scaling_initializer>, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Create a highway layer:

y = H(x) * T(x) + x * (1 - T(x))

where H(x) = activation_fn(matmul(W_H, x) + b_H) is the non-linear transformed output and T(x) = sigmoid(matmul(W_T, x) + b_T) is the transform gate.

reference: https://arxiv.org/pdf/1505.00387.pdf

This layer expects its input to be a two-dimensional tensor of shape (batch size, # input features). The output will have the same shape.
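
For example (a sketch; the Dense layer fixes the feature width the highway layer operates on):

>>> import tensorflow as tf
>>> from deepchem.models.tensorgraph.layers import Dense, Feature, Highway
>>> features = Feature(shape=(None, 128))
>>> hidden = Dense(out_channels=128, activation_fn=tf.nn.relu, in_layers=[features])
>>> gated = Highway(in_layers=[hidden])  # y = H(x) * T(x) + x * (1 - T(x))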

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list of Layer) – the input layers for the shared copy
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Input(shape, dtype=tf.float32, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q(batch_size)[source]
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
get_pre_q_name()[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.InputFifoQueue(shapes, names, capacity=5, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

This queue is used to allow asynchronous batching of inputs during the fitting process.
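
As a minimal sketch, the queue can be constructed directly, although TensorGraph normally creates it itself when fitting asynchronously (the shapes and names below are illustrative, not taken from the library):

>>> from deepchem.models.tensorgraph.layers import InputFifoQueue
>>> queue = InputFifoQueue(shapes=[(None, 100), (None,)],  # one shape per queued tensor
...                        names=['features', 'labels'],   # hypothetical input names
...                        capacity=5)                      # max batches buffered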

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensors)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.InteratomicL2Distances(N_atoms, M_nbrs, ndim, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute (squared) L2 distances between atoms given neighbors.
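
A minimal sketch, assuming Feature inputs holding coordinates and neighbor indices (the shapes follow the constructor arguments above; the input ordering is an assumption):

>>> import tensorflow as tf
>>> from deepchem.models.tensorgraph.layers import Feature, InteratomicL2Distances
>>> N_atoms, M_nbrs, ndim = 50, 12, 3                        # illustrative sizes
>>> coords = Feature(shape=(N_atoms, ndim))                  # atom coordinates
>>> nbrs = Feature(shape=(N_atoms, M_nbrs), dtype=tf.int32)  # neighbor indices
>>> dists = InteratomicL2Distances(N_atoms, M_nbrs, ndim,
...                                in_layers=[coords, nbrs])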

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.IterRefLSTMEmbedding(n_test, n_support, n_feat, max_depth, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Implements the Iterative Refinement LSTM.

Much like AttnLSTMEmbedding, the IterRefLSTMEmbedding is another type of learnable metric which adjusts “test” and “support.” Recall that “support” is the small amount of data available in a low-data machine learning problem, and that “test” is the query. The AttnLSTMEmbedding only modifies the “test” based on the contents of the support. However, the IterRefLSTM modifies both the “support” and “test” based on each other. This allows the learnable metric to be more malleable than that from AttnLSTMEmbedding.
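
Following the create_tensor() contract documented below, a minimal sketch (the sizes are illustrative):

>>> from deepchem.models.tensorgraph.layers import Feature, IterRefLSTMEmbedding
>>> n_test, n_support, n_feat, max_depth = 10, 20, 75, 3  # illustrative sizes
>>> test = Feature(shape=(n_test, n_feat))                # query examples
>>> support = Feature(shape=(n_support, n_feat))          # support examples
>>> embedding = IterRefLSTMEmbedding(n_test, n_support, n_feat, max_depth,
...                                  in_layers=[test, support])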

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters:in_layers (list) – List of two tensors (X, Xp). X should be of shape (n_test, n_feat) and Xp should be of shape (n_support, n_feat) where n_test is the size of the test set, n_support that of the support set, and n_feat is the number of per-atom features.
Returns:Returns two tensors of the same shape as the inputs. Namely, the output shape will be [(n_test, n_feat), (n_support, n_feat)]
Return type:list
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.L1Loss(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.L2Loss(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer
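
No description is given here; judging by the class name and the in_layers constructor argument, this layer presumably computes a squared-error loss between two input layers. That reading is an assumption, not something stated above. A sketch under that assumption:

>>> from deepchem.models.tensorgraph.layers import Feature, L2Loss, Label
>>> labels = Label(shape=(None, 1))          # ground truth
>>> predictions = Feature(shape=(None, 1))   # model output
>>> loss = L2Loss(in_layers=[labels, predictions])  # assumed squared-error loss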

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.LSTMStep(output_dim, input_dim, init_fn=<function glorot_uniform>, inner_init_fn=<function orthogonal>, activation_fn=<function tanh>, inner_activation_fn=<function hard_sigmoid>, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Layer that performs a single step LSTM update.

This layer performs a single step LSTM update. Note that it is not a full LSTM recurrent network. The LSTMStep layer is useful as a primitive for designing layers such as the AttnLSTMEmbedding or the IterRefLSTMEmbedding below.
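
A minimal sketch wiring up the three inputs named in the create_tensor() documentation below (dimensions are illustrative):

>>> from deepchem.models.tensorgraph.layers import Feature, LSTMStep
>>> input_dim, output_dim = 50, 64             # illustrative sizes
>>> x = Feature(shape=(None, input_dim))       # current input
>>> h_tm1 = Feature(shape=(None, output_dim))  # previous hidden state
>>> c_tm1 = Feature(shape=(None, output_dim))  # previous cell state
>>> step = LSTMStep(output_dim, input_dim, in_layers=[x, h_tm1, c_tm1])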

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

build()[source]

Constructs learnable weights for this layer.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Execute this layer on input tensors.

Parameters:in_layers (list) – List of three tensors (x, h_tm1, c_tm1). h_tm1 means “h, t-1”.
Returns:Returns h, [h + c]
Return type:list
get_initial_states(input_shape)[source]
layer_number_dict = {}
none_tensors()[source]

Zeros out stored tensors for pickling.

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]

Sets all stored tensors.

set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Label(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Input

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q(batch_size)
create_tensor(in_layers=None, set_tensors=True, **kwargs)
get_pre_q_name()
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Layer(in_layers=None, **kwargs)[source]

Bases: object

add_summary_to_tg()[source]

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)[source]

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)[source]

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()[source]
set_summary(summary_op, summary_description=None, collections=None)[source]

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)[source]
set_variable_initial_values(values)[source]

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)[source]

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.LayerSplitter(output_num, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Layer which takes the tensor at a given index from in_layers[0].out_tensors. Only layers which need to output multiple tensors set and use the variable self.out_tensors. This is a utility for those special layers: it returns a layer wrapping a specific tensor in in_layers[0].out_tensors.
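
For example, the IterRefLSTMEmbedding documented above produces two output tensors; a sketch of splitting them (it assumes the outputs appear in out_tensors in the order listed in that layer’s documentation):

>>> from deepchem.models.tensorgraph.layers import (Feature,
...     IterRefLSTMEmbedding, LayerSplitter)
>>> test = Feature(shape=(10, 75))
>>> support = Feature(shape=(20, 75))
>>> embedding = IterRefLSTMEmbedding(10, 20, 75, 3, in_layers=[test, support])
>>> test_out = LayerSplitter(0, in_layers=[embedding])     # assumed first output
>>> support_out = LayerSplitter(1, in_layers=[embedding])  # assumed second output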

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Log(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute the natural log of the input.
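
A minimal sketch:

>>> from deepchem.models.tensorgraph.layers import Feature, Log
>>> values = Feature(shape=(None, 10))
>>> log_values = Log(in_layers=[values])  # element-wise natural log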

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.MaxPool1D(window_shape=2, strides=1, padding='SAME', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

A 1D max pooling on the input.

This layer expects its input to be a three-dimensional tensor of shape (batch size, width, # channels).
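
A minimal sketch using the constructor arguments documented above (the input shape is illustrative):

>>> from deepchem.models.tensorgraph.layers import Feature, MaxPool1D
>>> signal = Feature(shape=(None, 128, 16))  # (batch, width, channels)
>>> pooled = MaxPool1D(window_shape=2, strides=2, in_layers=[signal])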

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.MaxPool2D(ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.MaxPool3D(ksize=[1, 2, 2, 2, 1], strides=[1, 2, 2, 2, 1], padding='SAME', **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

A 3D max pooling on the input.

This layer expects its input to be a five-dimensional tensor of shape (batch size, height, width, depth, # channels).
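
A minimal sketch with the default 2x2x2 pooling window documented above (the input shape is illustrative):

>>> from deepchem.models.tensorgraph.layers import Feature, MaxPool3D
>>> volumes = Feature(shape=(None, 16, 16, 16, 4))  # (batch, h, w, d, channels)
>>> pooled = MaxPool3D(in_layers=[volumes])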

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list) – input tensors for the shared layer
Return type:Layer

class deepchem.models.tensorgraph.layers.Multiply(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Compute the product of the input layers.
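
A minimal sketch multiplying two layers of matching shape (any broadcasting behavior is not documented here):

>>> from deepchem.models.tensorgraph.layers import Feature, Multiply
>>> a = Feature(shape=(None, 10))
>>> b = Feature(shape=(None, 10))
>>> product = Multiply(in_layers=[a, b])  # element-wise product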

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.NeighborList(N_atoms, M_nbrs, ndim, nbr_cutoff, start, stop, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Computes a neighbor-list in TensorFlow.

Neighbor-lists (also called Verlet lists) are a tool for grouping atoms that are spatially close to each other.

TODO(rbharath): Make this layer support batching.
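
A hedged construction sketch; the numeric values are illustrative, and passing the coordinates via in_layers is an assumption about the usual Layer wiring:

>>> N_atoms, M_nbrs, ndim = 100, 8, 3
>>> coords = Feature(shape=(N_atoms, ndim))
>>> nbrs = NeighborList(N_atoms, M_nbrs, ndim, nbr_cutoff=1.0,
...                     start=-10.0, stop=10.0, in_layers=[coords])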

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

compute_nbr_list(coords)[source]

Get closest neighbors for atoms.

Needs to handle padding for atoms with no neighbors.

Parameters:coords (tf.Tensor) – Shape (N_atoms, ndim)
Returns:nbr_list – Shape (N_atoms, M_nbrs) of atom indices
Return type:tf.Tensor
copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]

Creates tensors associated with neighbor-listing.

get_atoms_in_nbrs(coords, cells)[source]

Get the atoms in the neighboring cells of each cell.

Returns:atoms_in_nbrs – Shape (N_atoms, n_nbr_cells, M_nbrs)
Return type:tf.Tensor
get_cells()[source]

Returns the locations of all grid points in the box.

Suppose start is -10 Angstroms, stop is 10 Angstroms, and nbr_cutoff is 1. Then this would return a list of length 20^3 whose entries would be [(-10, -10, -10), (-10, -10, -9), ..., (9, 9, 9)]

Returns:cells – (n_cells, ndim) shape.
Return type:tf.Tensor
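
The grid itself is easy to picture with a NumPy sketch (illustrative only, not the layer's actual TensorFlow implementation):

>>> import numpy as np
>>> start, stop, nbr_cutoff, ndim = -10.0, 10.0, 1.0, 3
>>> ticks = np.arange(start, stop, nbr_cutoff)  # 20 points per axis
>>> grid = np.meshgrid(*([ticks] * ndim), indexing='ij')
>>> cells = np.stack([g.ravel() for g in grid], axis=1)
>>> cells.shape  # grid points run from (-10, -10, -10) to (9, 9, 9)
(8000, 3)
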
get_cells_for_atoms(coords, cells)[source]

Compute the cells each atom belongs to.

Parameters:
  • coords (tf.Tensor) – Shape (N_atoms, ndim)
  • cells (tf.Tensor) – (n_cells, ndim) shape.
Returns:

cells_for_atoms – Shape (N_atoms, 1)

Return type:

tf.Tensor

get_closest_atoms(coords, cells)[source]

For each cell, find the M_nbrs closest atoms.

Let N_atoms be the number of atoms.

Parameters:
  • coords (tf.Tensor) – (N_atoms, ndim) shape.
  • cells (tf.Tensor) – (n_cells, ndim) shape.
Returns:

closest_inds – Of shape (n_cells, M_nbrs)

Return type:

tf.Tensor

get_neighbor_cells(cells)[source]

Compute the neighbors of each cell in the grid.

TODO(rbharath): Do we need to handle periodic boundary conditions properly here?

TODO(rbharath): This doesn’t handle boundaries well. We hard-code looking for n_nbr_cells neighbors, which isn’t right for boundary cells in the cube.

Parameters:cells (tf.Tensor) – (n_cells, ndim) shape.
Returns:nbr_cells – (n_cells, n_nbr_cells)
Return type:tf.Tensor
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.ReduceMean(in_layers=None, axis=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.ReduceSquareDifference(in_layers=None, axis=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.ReduceSum(in_layers=None, axis=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.Repeat(n_times, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.Reshape(shape, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.SharedVariableScope(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

A Layer that can share variables with another layer via name scope.

This abstract class can be used as a parent for any layer that implements shared() by means of the variable name scope. It exists to avoid duplicated code.
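
The pattern this class abstracts looks roughly like the following TF1-style sketch (illustrative only, not the actual implementation):

>>> import tensorflow as tf
>>> def dense_block(x, scope, reuse=False):
...     with tf.variable_scope(scope, reuse=reuse):
...         w = tf.get_variable('w', shape=(100, 32))
...         b = tf.get_variable('b', shape=(32,))
...         return tf.matmul(x, w) + b
>>> x1 = tf.placeholder(tf.float32, (None, 100))
>>> x2 = tf.placeholder(tf.float32, (None, 100))
>>> y1 = dense_block(x1, 'enc')               # creates the variables
>>> y2 = dense_block(x2, 'enc', reuse=True)   # reuses the same variables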

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)[source]
class deepchem.models.tensorgraph.layers.SluiceLoss(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Calculates the loss in a Sluice Network. Every input into an AlphaShare should also be used in SluiceLoss.
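
A hedged usage sketch; AlphaShare is documented elsewhere in this module, and its exact outputs are elided here:

>>> t1 = Feature(shape=(None, 64))
>>> t2 = Feature(shape=(None, 64))
>>> shared_out = AlphaShare(in_layers=[t1, t2])
>>> # Every input into the AlphaShare is also fed into SluiceLoss:
>>> sluice_loss = SluiceLoss(in_layers=[t1, t2])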

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.SoftMax(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.SoftMaxCrossEntropy(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.SparseSoftMaxCrossEntropy(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.Squeeze(in_layers=None, squeeze_dims=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.Stack(in_layers=None, axis=1, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:in_layers (list of Layer) – input layers for the shared layer
Returns:a copy of this layer that shares variables with it
Return type:Layer

class deepchem.models.tensorgraph.layers.StopGradient(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Block the flow of gradients.

This layer copies its input directly to its output, but reports that all gradients of its output are zero. This means, for example, that optimizers will not try to optimize anything “upstream” of this layer.

For example, suppose you have pre-trained a stack of layers to perform a calculation. You want to use the result of that calculation as the input to another layer, but because they are already pre-trained, you do not want the optimizer to modify them. You can wrap the output in a StopGradient layer, then use that as the input to the next layer.
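
A minimal sketch of that freezing pattern, with a single Dense layer standing in for the pre-trained stack:

>>> inputs = Feature(shape=(None, 100))
>>> pretrained = Dense(64, in_layers=inputs)      # stands in for trained layers
>>> frozen = StopGradient(in_layers=pretrained)   # gradients stop here
>>> head = Dense(10, in_layers=frozen)            # only the head is optimized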

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. It collects data from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list, which contains the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.TensorWrapper(out_tensor, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Used to wrap a raw TensorFlow tensor so it can be used as a Layer.
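
For example, a minimal sketch that makes a raw tensor usable as a layer input (assuming the TensorFlow 1.x API that TensorGraph is built on):

>>> import tensorflow as tf
>>> wrapped = TensorWrapper(tf.constant([[1.0, 2.0, 3.0]]))
>>> dense = Dense(10, in_layers=wrapped)

In practice TensorWrapper is rarely constructed by hand; convert_to_layers() (documented below) applies it automatically when a raw tensor is passed where a layer is expected.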

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, **kwargs)[source]

Take no actions.

layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.TimeSeriesDense(out_channels, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.ToFloat(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Transpose(perm, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Variable(initial_value, dtype=tf.float32, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Output a trainable value.
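
For example, a minimal sketch of a trainable bias added to another layer’s output (dense is illustrative, and Add is assumed to be the element-wise addition layer from this module):

>>> import numpy as np
>>> bias = Variable(np.zeros(10))
>>> shifted = Add(in_layers=[dense, bias])

The initial_value may be any array-like object; the resulting variable is trained along with the rest of the model.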

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.VinaFreeEnergy(N_atoms, M_nbrs, ndim, nbr_cutoff, start, stop, stddev=0.3, Nrot=1, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Computes free-energy as defined by Autodock Vina.

TODO(rbharath): Make this layer support batching.
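
For example, a hedged construction sketch (the parameter values and the coords / atomic_numbers feature layers are illustrative; see create_tensor below for the inputs this layer expects):

>>> vina = VinaFreeEnergy(N_atoms=100, M_nbrs=10, ndim=3,
...                       nbr_cutoff=8.0, start=0.0, stop=8.0,
...                       in_layers=[coords, atomic_numbers])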

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
Parameters:
  • X (tf.Tensor of shape (N, d)) – Coordinates/features.
  • Z (tf.Tensor of shape (N)) – Atomic numbers of neighbor atoms.
Returns:

layer – The free energy of each complex in the batch.

Return type:

tf.Tensor of shape (B)

cutoff(d, x)[source]
gaussian_first(d)[source]

Computes Autodock Vina’s first Gaussian interaction term.

gaussian_second(d)[source]

Computes Autodock Vina’s second Gaussian interaction term.

hydrogen_bond(d)[source]

Computes Autodock Vina’s hydrogen bond interaction term.

hydrophobic(d)[source]

Computes Autodock Vina’s hydrophobic interaction term.

layer_number_dict = {}
none_tensors()
nonlinearity(c)[source]

Computes non-linearity used in Vina.

repulsion(d)[source]

Computes Autodock Vina’s repulsion interaction term.
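
For reference, the interaction terms that gaussian_first, gaussian_second, repulsion, hydrophobic, hydrogen_bond, and nonlinearity correspond to in the published Vina scoring function are shown below (standard Vina constants; whether this implementation uses exactly these values should be checked against the source):

\begin{aligned}
\mathrm{gauss}_1(d) &= e^{-(d/0.5)^2} \\
\mathrm{gauss}_2(d) &= e^{-((d-3)/2)^2} \\
\mathrm{repulsion}(d) &= \begin{cases} d^2 & d < 0 \\ 0 & d \ge 0 \end{cases} \\
\mathrm{hydrophobic}(d) &= \begin{cases} 1 & d < 0.5 \\ 1.5 - d & 0.5 \le d \le 1.5 \\ 0 & d > 1.5 \end{cases} \\
\mathrm{hbond}(d) &= \begin{cases} 1 & d < -0.7 \\ -d/0.7 & -0.7 \le d \le 0 \\ 0 & d > 0 \end{cases} \\
g(c) &= \frac{c}{1 + w\,N_{rot}}
\end{aligned}

Here d is a surface distance in angstroms and w weights the penalty for N_rot rotatable bonds.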

set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.WeightDecay(penalty, penalty_type, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Apply a weight decay penalty.

The input should be the loss value. This layer adds a weight decay penalty to it and outputs the sum.
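
For example, a minimal sketch (loss is assumed to be an existing loss layer, and 'l2' is shown as the penalty type; the exact set of accepted penalty_type strings is an assumption to verify against the source):

>>> regularized = WeightDecay(0.001, 'l2', in_layers=loss)
>>> model.set_loss(regularized)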

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.WeightedError(in_layers=None, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.WeightedLinearCombo(in_layers=None, std=0.3, **kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Layer

Computes a weighted linear combination of input layers, with the weights defined by trainable variables.
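
For two inputs this computes w1*x1 + w2*x2, where the scalar weights w_i are trainable variables (presumably initialized using the std argument). A minimal sketch (layer1 and layer2 are illustrative):

>>> combo = WeightedLinearCombo(in_layers=[layer1, layer2])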

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_tensor(in_layers=None, set_tensors=True, **kwargs)[source]
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

class deepchem.models.tensorgraph.layers.Weights(**kwargs)[source]

Bases: deepchem.models.tensorgraph.layers.Input

add_summary_to_tg()

Can only be called after self.create_layer to guarantee that the name is not None.

clone(in_layers)

Create a copy of this layer with different inputs.

copy(replacements={}, variables_graph=None, shared=False)

Duplicate this Layer and all its inputs.

This is similar to clone(), but instead of only cloning one layer, it also recursively calls copy() on all of this layer’s inputs to clone the entire hierarchy of layers. In the process, you can optionally tell it to replace particular layers with specific existing ones. For example, you can clone a stack of layers, while connecting the topmost ones to different inputs.

For example, consider a stack of dense layers that depend on an input:

>>> input = Feature(shape=(None, 100))
>>> dense1 = Dense(100, in_layers=input)
>>> dense2 = Dense(100, in_layers=dense1)
>>> dense3 = Dense(100, in_layers=dense2)

The following will clone all three dense layers, but not the input layer. Instead, the input to the first dense layer will be a different layer specified in the replacements map.

>>> replacements = {input: new_input}
>>> dense3_copy = dense3.copy(replacements)
Parameters:
  • replacements (map) – specifies existing layers, and the layers to replace them with (instead of cloning them). This argument serves two purposes. First, you can pass in a list of replacements to control which layers get cloned. In addition, as each layer is cloned, it is added to this map. On exit, it therefore contains a complete record of all layers that were copied, and a reference to the copy of each one.
  • variables_graph (TensorGraph) – an optional TensorGraph from which to take variables. If this is specified, the current value of each variable in each layer is recorded, and the copy has that value specified as its initial value. This allows a piece of a pre-trained model to be copied to another model.
  • shared (bool) – if True, create new layers by calling shared() on the input layers. This means the newly created layers will share variables with the original ones.
create_pre_q(batch_size)
create_tensor(in_layers=None, set_tensors=True, **kwargs)
get_pre_q_name()
layer_number_dict = {}
none_tensors()
set_summary(summary_op, summary_description=None, collections=None)

Annotates a tensor with a tf.summary operation. Data is collected from self.out_tensor by default, but this can be changed by setting self.tb_input to another tensor in create_tensor.

Parameters:
  • summary_op (str) – summary operation to annotate node
  • summary_description (object, optional) – Optional summary_pb2.SummaryDescription()
  • collections (list of graph collections keys, optional) – New summary op is added to these collections. Defaults to [GraphKeys.SUMMARIES]
set_tensors(tensor)
set_variable_initial_values(values)

Set the initial values of all variables.

This takes a list containing the initial values to use for all of this layer’s variables (in the same order returned by TensorGraph.get_layer_variables()). When this layer is used in a TensorGraph, it will automatically initialize each variable to the value specified in the list. Note that some layers also have separate mechanisms for specifying variable initializers; this method overrides them. The purpose of this method is to let a Layer object represent a pre-trained layer, complete with trained values for its variables.

shape

Get the shape of this Layer’s output.

shared(in_layers)

Create a copy of this layer that shares variables with it.

This is similar to clone(), but where clone() creates two independent layers, this causes the layers to share variables with each other.

Parameters:
  • in_layers (list) – input layers for the shared layer
Returns:

Return type:

Layer

deepchem.models.tensorgraph.layers.convert_to_layers(in_layers)[source]

Wrap all raw tensor inputs in Layer objects if necessary.

deepchem.models.tensorgraph.optimizers module

Optimizers and related classes for use with TensorGraph.

class deepchem.models.tensorgraph.optimizers.Adam(learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: deepchem.models.tensorgraph.optimizers.Optimizer

The Adam optimization algorithm.
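
For example, a minimal usage sketch (model is assumed to be a TensorGraph; set_optimizer is its method for choosing the optimizer):

>>> model.set_optimizer(Adam(learning_rate=0.001, beta1=0.9, beta2=0.999))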

class deepchem.models.tensorgraph.optimizers.ExponentialDecay(initial_rate, decay_rate, decay_steps, staircase=True)[source]

Bases: deepchem.models.tensorgraph.optimizers.LearningRateSchedule

A learning rate that decreases exponentially with the number of training steps.
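
The decayed rate presumably follows initial_rate * decay_rate ** (step / decay_steps), with the exponent floored to an integer when staircase=True, mirroring tf.train.exponential_decay. A schedule object can be passed wherever an optimizer expects a learning rate:

>>> schedule = ExponentialDecay(initial_rate=0.001, decay_rate=0.95, decay_steps=1000)
>>> model.set_optimizer(Adam(learning_rate=schedule))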

class deepchem.models.tensorgraph.optimizers.GradientDescent(learning_rate=0.001)[source]

Bases: deepchem.models.tensorgraph.optimizers.Optimizer

The gradient descent optimization algorithm.

class deepchem.models.tensorgraph.optimizers.LearningRateSchedule[source]

Bases: object

A schedule for changing the learning rate over the course of optimization.

This is an abstract class. Subclasses represent specific schedules.

class deepchem.models.tensorgraph.optimizers.Optimizer[source]

Bases: object

An algorithm for optimizing a TensorGraph based model.

This is an abstract class. Subclasses represent specific optimization algorithms.

class deepchem.models.tensorgraph.optimizers.PolynomialDecay(initial_rate, final_rate, decay_steps, power=1.0)[source]

Bases: deepchem.models.tensorgraph.optimizers.LearningRateSchedule

A learning rate that decreases from an initial value to a final value over a fixed number of training steps, following a polynomial of the given power.