Getting Started with TensorFlow 2.0
1 .Getting Started with TensorFlow 2.0 Eliyar Eziz Google Developers Expert @ Machine Learning
2-4 . Eliyar Eziz, Google Developers Expert @ Machine Learning, Yodo1 AI Specialist
Open-Source Projects
• Kashgari: NLP transfer learning framework | 1.3k stars
• BMPlayer: iOS Swift video player | 1.4k stars
• Blog: https://eliyar.biz
• Github: https://github.com/BrikerMan
• Email: eliyar917@gmail.com
5 . Topics
All New 2.0
• Show Cases
• What's New
Keras
• API Map
• Keras for Beginners
• Keras for Experts
• Workflow and modules
And More
• TF Lite
• TF Hub
• TF Agents
• TensorBoard
Learning More
• Books and courses
6 .Amazing things (1 of 3) Art and science in one. Deep learning as representation learning. bit.ly/mini-dream https://www.tensorflow.org/tutorials/generative/style_transfer
7 . Amazing things (2 of 3) Encoders / decoders. Deep learning as compression.
• Sentence -> RNN -> vector
sentences = [
    ("Do you want a cup of coffee?", "¿Quieres una taza de café?"),
    ...
]
https://bit.ly/mini-nmt
https://tensorflow.org/tutorials/text/nmt_with_attention
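The "sentence -> RNN -> vector" idea above can be sketched without TensorFlow. The following is a toy illustration only, assuming a tiny hand-built vocabulary and random embedding table (neither comes from the tutorial; a real encoder would use a trained RNN, not an average):

```python
import numpy as np

# Hypothetical toy vocabulary and embedding table (not from the NMT tutorial).
vocab = {"do": 0, "you": 1, "want": 2, "a": 3, "cup": 4, "of": 5, "coffee": 6}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))  # 8-dim word vectors

def encode(sentence):
    """Compress a sentence into one fixed-size vector (mean of word embeddings)."""
    ids = [vocab[w] for w in sentence.lower().replace("?", "").split()]
    return embeddings[ids].mean(axis=0)

vec = encode("Do you want a cup of coffee?")
print(vec.shape)  # (8,) -- the whole sentence reduced to a single vector
```

The point is the shape, not the method: whatever the encoder is, its output is a fixed-size vector that the decoder then expands back into a sentence.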
8 . Amazing things (3 of 3)
Encoder
• CNN -> Vector
Decoder
• Nearly identical to the previous tutorial
• Vector -> RNN -> Sentence
"A surfer riding on a wave"
https://tensorflow.org/tutorials/text/image_captioning
9 .We’ve learned a lot since 1.0
11-13 . TF1 vs TF2: TensorFlow 1.x
import tensorflow as tf

a = tf.Variable(3)
b = tf.Variable(7)
c = a * b
print(c)  # Tensor("mul_1:0", shape=(), dtype=int32) -- a graph node, not a value yet

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # replaces the deprecated initialize_all_variables()
    result = sess.run(c)
    print(result)  # 21
14 . TF1 vs TF2: TensorFlow 2.x
import tensorflow as tf

a = tf.Variable(3)
b = tf.Variable(7)
c = a * b
print(c)  # tf.Tensor(21, shape=(), dtype=int32) -- eager execution runs the op immediately
15 . TensorFlow 2.0: Usability
- tf.keras as the recommended high-level API
- Eager execution by default

16 . TensorFlow 2.0: Clarity
- Remove duplicate functionality
- Consistent, intuitive syntax across APIs
- Compatibility throughout the TensorFlow ecosystem

17 . TensorFlow 2.0: Flexibility
- Full lower-level API
- Internal ops accessible in tf.raw_ops
- Inheritable interfaces for variables, checkpoints, layers
18 .All with one API
23 .Keras for Beginners
25 . Sequential Model
L = tf.keras.layers
model = tf.keras.models.Sequential([
    L.Flatten(),
    L.Dense(512, activation='relu'),
    L.Dropout(0.2),
    L.Dense(10, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
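To make concrete what the Sequential stack above computes at inference time, here is a numpy sketch of the forward pass. The weights are random stand-ins (in Keras they would be learned by model.fit), and Dropout is skipped because it is inactive at inference:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=(1, 28, 28))      # one MNIST-sized input

# Random stand-in weights; shapes match the Dense layers in the slide.
W1 = rng.normal(size=(784, 512)) * 0.05
b1 = np.zeros(512)
W2 = rng.normal(size=(512, 10)) * 0.05
b2 = np.zeros(10)

h = x.reshape(1, -1)                  # Flatten: (1, 28, 28) -> (1, 784)
h = np.maximum(h @ W1 + b1, 0.0)      # Dense(512, activation='relu')
logits = h @ W2 + b2                  # Dense(10)
e = np.exp(logits - logits.max())     # softmax (shifted for numerical stability)
probs = e / e.sum()

print(probs.shape)                    # (1, 10)
```

Each line maps one-to-one onto a layer in the Sequential list, which is why the list-of-layers API reads like the computation itself.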
26 .Keras for Experts
28 . Subclass Model
class MyModel(tf.keras.Model):
    def __init__(self, num_classes=10):
        super(MyModel, self).__init__(name='my_model')
        self.dense_1 = tf.keras.layers.Dense(32, activation='relu')
        self.dense_2 = tf.keras.layers.Dense(num_classes, activation='sigmoid')

    def call(self, inputs):
        # Define your forward pass here.
        x = self.dense_1(inputs)
        return self.dense_2(x)
29 . Custom Training Loop
model = MyModel()
with tf.GradientTape() as tape:
    logits = model(images)
    loss_value = loss(logits, labels)
grads = tape.gradient(loss_value, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
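What GradientTape and apply_gradients do can be sketched by hand for the simplest possible case: a single weight fitted by gradient descent on mean squared error. This is a toy stand-in, not the slide's MyModel, and the gradient is written out manually where tape.gradient would compute it automatically:

```python
import numpy as np

# Toy data: y = 3x, one trainable weight w, SGD with a fixed learning rate.
x = np.array([1.0, 2.0, 3.0])
y = 3.0 * x
w = 0.0
lr = 0.1

for _ in range(100):
    pred = w * x
    loss = ((pred - y) ** 2).mean()     # role of loss(logits, labels)
    grad = (2 * (pred - y) * x).mean()  # role of tape.gradient(loss, variables)
    w -= lr * grad                      # role of optimizer.apply_gradients (plain SGD)

print(round(w, 3))  # 3.0
```

The custom loop on the slide is the same three steps, with autodiff replacing the hand-derived gradient and the optimizer replacing the bare SGD update.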