
# Layering Nested Operations

We show how to create multiple operations on a computational graph and how to visualize them using Tensorboard.

# Working with Multiple Layers

Here we extend the usage of the computational graph to create multiple layers and show how they appear in Tensorboard.
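The idea of layering can be sketched without TensorBoard: the output of one operation becomes the input of the next, exactly as nodes chain in a computational graph. This is a minimal NumPy sketch; the function names and constants are illustrative, not from the chapter's code.

```python
import numpy as np

def layer1(x):
    # first "layer": an affine transform of the input
    return 2.0 * x + 1.0

def layer2(x):
    # second "layer": a sigmoid squashing nonlinearity
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([0.0, 1.0, 2.0])
out = layer2(layer1(x))   # nested operations: layer2 consumes layer1's output
print(out)
```

In TensorFlow, each of these calls would appear as its own node (or named scope) in the graph, which is what makes the layering visible in TensorBoard.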

# Implementing Loss Functions

In order to train a model, we must be able to evaluate how well it is doing. This is given by loss functions. We plot various loss functions and talk about the benefits and limitations of some.
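As a concrete illustration of the trade-offs, here is a NumPy sketch of two of the most common loss functions. The L2 (squared) loss penalizes large errors heavily and is smooth near the target; the L1 (absolute) loss is more robust to outliers but is not differentiable at zero. The sample points are illustrative.

```python
import numpy as np

target = 0.0
x = np.linspace(-1.0, 1.0, 5)     # candidate predictions around the target

l2_loss = np.square(x - target)   # L2: (pred - target)^2
l1_loss = np.abs(x - target)      # L1: |pred - target|

print(l2_loss)   # grows quadratically away from the target
print(l1_loss)   # grows linearly away from the target
```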

# Implementing Back Propagation

Here we show how to use loss functions to iterate through data and back propagate errors for regression and classification.
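The regression case can be sketched by hand in NumPy: fit a single multiplier `A` so that `A * x` matches a target generated from the true value 10, by repeatedly descending the gradient of the L2 loss. The hyperparameters (learning rate, step count, data distribution) are illustrative, not the chapter's exact values.

```python
import numpy as np

rng = np.random.default_rng(0)
A = 0.0                            # model parameter to learn
lr = 0.02                          # learning rate

for _ in range(200):
    x = rng.normal(1.0, 0.1)       # one random input
    y = 10.0 * x                   # target generated from true A = 10
    pred = A * x
    # dL/dA of the L2 loss (pred - y)^2 is 2 * (pred - y) * x
    grad = 2.0 * (pred - y) * x
    A -= lr * grad                 # back propagate the error into A

print(A)   # converges toward 10
```

TensorFlow automates exactly the gradient step computed by hand here.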

# Stochastic and Batch Training

TensorFlow makes it easy to use both batch and stochastic training. We show how to implement both and talk about the benefits and limitations of each.
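The contrast can be sketched on the same toy regression problem: stochastic training updates on one example at a time (noisy but cheap steps), while batch training averages the gradient over several examples (smoother but costlier steps). The batch sizes and step counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(batch_size, steps=200, lr=0.02):
    A = 0.0
    for _ in range(steps):
        x = rng.normal(1.0, 0.1, size=batch_size)
        y = 10.0 * x
        grad = np.mean(2.0 * (A * x - y) * x)   # gradient averaged over the batch
        A -= lr * grad
    return A

a_stochastic = train(batch_size=1)    # stochastic: one example per update
a_batch = train(batch_size=25)        # batch: smoother loss trajectory
print(a_stochastic, a_batch)
```

Both reach roughly the same answer; what differs is the variance of the path taken to get there.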

# Combining Everything Together

We now combine everything we have learned and create a simple classifier.

# Evaluating Models

Any model is only as good as its evaluation. Here we show two examples: (1) evaluating a regression algorithm and (2) evaluating a classification algorithm.
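The two evaluation styles can be sketched side by side: regression is scored with a continuous error such as mean squared error, while classification is scored with accuracy against held-out labels. The data below is illustrative.

```python
import numpy as np

# (1) regression: mean squared error between predictions and targets
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
mse = np.mean((y_pred - y_true) ** 2)

# (2) classification: accuracy of thresholded predictions
labels = np.array([0, 1, 1, 0])
logits = np.array([-2.0, 1.5, 0.3, 0.8])
preds = (logits > 0).astype(int)       # threshold raw scores at zero
accuracy = np.mean(preds == labels)

print(mse, accuracy)
```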

# Modules Covered in This Chapter

## tensorflow.zeros

Creates a tensor with all elements set to zero.

This operation returns a tensor of type `dtype` with shape `shape` and all elements set to zero.

>>> tf.zeros([3, 4], tf.int32)
<tf.Tensor: shape=(3, 4), dtype=int32, numpy=
array([[0, 0, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]], dtype=int32)>

param shape: A list of integers, a tuple of integers, or a 1-D Tensor of type int32.

param dtype: The DType of an element in the resulting Tensor.

param name: Optional string. A name for the operation.

returns: A Tensor with all elements set to zero.

## tensorflow.ones

Creates a tensor with all elements set to one (1).

>>> tf.ones([3, 4], tf.int32)
<tf.Tensor: shape=(3, 4), dtype=int32, numpy=
array([[1, 1, 1, 1],
       [1, 1, 1, 1],
       [1, 1, 1, 1]], dtype=int32)>