Python Data Science Cheat Sheet: Keras

This document summarizes Keras, a powerful and easy-to-use deep learning library for developing and evaluating deep learning models. It provides high-level APIs built on TensorFlow or Theano. It describes common neural network architectures like multilayer perceptrons, convolutional neural networks, and recurrent neural networks. It also covers model training, evaluation, prediction, and saving/loading models in Keras.

Uploaded by Keith Ng

Model Architecture

Sequential model
>>> from keras.models import Sequential
>>> model = Sequential()
>>> model2 = Sequential()
>>> model3 = Sequential()

Inspect the model
>>> model.output_shape   # model output shape
>>> model.summary()      # print a model summary
>>> model.get_config()   # model configuration
>>> model.get_weights()  # list all of the model's weight tensors
Keras

Keras is a powerful, easy-to-use deep learning library that provides high-level neural network APIs on top of Theano and TensorFlow for developing and evaluating deep learning models.

Multilayer Perceptron (MLP)

Binary classification
>>> from keras.layers import Dense
>>> model.add(Dense(12,
              input_dim=8,
              kernel_initializer='uniform',
              activation='relu'))
>>> model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
>>> model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))

Compile the model

MLP: binary classification
>>> model.compile(optimizer='adam',
                  loss='binary_crossentropy',
                  metrics=['accuracy'])

MLP: multi-class classification
>>> model.compile(optimizer='rmsprop',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

MLP: regression
>>> model.compile(optimizer='rmsprop',
                  loss='mse',
                  metrics=['mae'])

RNN
>>> model3.compile(loss='binary_crossentropy',
                   optimizer='adam',
                   metrics=['accuracy'])
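`compile()` attaches a loss function and metrics to the model. To make those choices concrete, here is a NumPy sketch of what `binary_crossentropy` and the `accuracy` metric reduce to for a sigmoid output (illustrative arithmetic only, not Keras's actual implementation):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # mean of -[y*log(p) + (1-y)*log(1-p)], with p clipped for numerical safety
    p = np.clip(y_pred, eps, 1 - eps)
    return float(np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))))

def binary_accuracy(y_true, y_pred):
    # fraction of predictions on the correct side of the 0.5 threshold
    return float(np.mean((y_pred > 0.5).astype(int) == y_true))

y_true = np.array([1, 0, 1, 1])
y_pred = np.array([0.9, 0.1, 0.8, 0.4])
print(binary_accuracy(y_true, y_pred))  # 0.75: three of four past the threshold
```

A confident wrong prediction (like the 0.4 above for a true 1) hurts the loss far more than it hurts the accuracy, which is why the two numbers can move independently during training.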
Example

>>> import numpy as np
>>> from keras.models import Sequential
>>> from keras.layers import Dense
>>> data = np.random.random((1000,100))
>>> labels = np.random.randint(2, size=(1000,1))
>>> model = Sequential()
>>> model.add(Dense(32,
              activation='relu',
              input_dim=100))
>>> model.add(Dense(1, activation='sigmoid'))
>>> model.compile(optimizer='rmsprop',
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
>>> model.fit(data, labels, epochs=10, batch_size=32)
>>> predictions = model.predict(data)

Multi-class classification
>>> from keras.layers import Dropout
>>> model.add(Dense(512, activation='relu', input_shape=(784,)))
>>> model.add(Dropout(0.2))
>>> model.add(Dense(512, activation='relu'))
>>> model.add(Dropout(0.2))
>>> model.add(Dense(10, activation='softmax'))

Regression
>>> model.add(Dense(64, activation='relu', input_dim=train_data.shape[1]))
>>> model.add(Dense(1))

Model Training

>>> model3.fit(x_train4,
               y_train4,
               batch_size=32,
               epochs=15,
               verbose=1,
               validation_data=(x_test4, y_test4))
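During `fit()`, each epoch walks the training data once in mini-batches of `batch_size` samples. The batching itself is just array slicing; a minimal sketch of that loop (not Keras internals, which also shuffle by default):

```python
import numpy as np

def iterate_minibatches(data, labels, batch_size):
    # yield successive (x, y) slices; the last batch may be smaller
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size], labels[start:start + batch_size]

data = np.random.random((1000, 100))
labels = np.random.randint(2, size=(1000, 1))

batches = list(iterate_minibatches(data, labels, batch_size=32))
print(len(batches))          # 32 batches: 31 full ones plus a final batch of 8
print(batches[-1][0].shape)  # (8, 100)
```

So with `epochs=10` and `batch_size=32` over 1000 samples, the weights are updated 10 × 32 = 320 times.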
Data (see also NumPy, Pandas & Scikit-Learn)

Data should be stored as NumPy arrays or as lists of NumPy arrays. Split it into training and test sets with train_test_split from sklearn.model_selection.

Keras datasets
>>> from keras.datasets import boston_housing, mnist, cifar10, imdb
>>> (x_train, y_train), (x_test, y_test) = mnist.load_data()
>>> (x_train2, y_train2), (x_test2, y_test2) = boston_housing.load_data()
>>> (x_train3, y_train3), (x_test3, y_test3) = cifar10.load_data()
>>> (x_train4, y_train4), (x_test4, y_test4) = imdb.load_data(num_words=20000)
>>> num_classes = 10

Convolutional Neural Network (CNN)
>>> from keras.layers import Activation, Conv2D, MaxPooling2D, Flatten
>>> model2.add(Conv2D(32, (3,3), padding='same', input_shape=x_train.shape[1:]))
>>> model2.add(Activation('relu'))
>>> model2.add(Conv2D(32, (3,3)))
>>> model2.add(Activation('relu'))
>>> model2.add(MaxPooling2D(pool_size=(2,2)))
>>> model2.add(Dropout(0.25))
>>> model2.add(Conv2D(64, (3,3), padding='same'))
>>> model2.add(Activation('relu'))
>>> model2.add(Conv2D(64, (3,3)))
>>> model2.add(Activation('relu'))
>>> model2.add(MaxPooling2D(pool_size=(2,2)))
>>> model2.add(Dropout(0.25))
>>> model2.add(Flatten())
>>> model2.add(Dense(512))
>>> model2.add(Activation('relu'))
>>> model2.add(Dropout(0.5))
>>> model2.add(Dense(num_classes))
>>> model2.add(Activation('softmax'))

Evaluate model performance
>>> score = model3.evaluate(x_test,
                            y_test,
                            batch_size=32)

Prediction
>>> model3.predict(x_test4, batch_size=32)
>>> model3.predict_classes(x_test4, batch_size=32)

Save/Load models
>>> from keras.models import load_model
>>> model3.save('model_file.h5')
>>> my_model = load_model('my_model.h5')
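The difference between `predict` and `predict_classes` is a small post-processing step: `predict` returns raw probabilities, and the class labels are recovered by an argmax (softmax output) or a 0.5 threshold (sigmoid output). A NumPy sketch of that conversion, under those two assumed output shapes:

```python
import numpy as np

def probs_to_classes(probs):
    # softmax output of shape (n, k>1): take the most probable column;
    # sigmoid output of shape (n, 1): threshold at 0.5
    if probs.shape[-1] > 1:
        return probs.argmax(axis=-1)
    return (probs > 0.5).astype(int).ravel()

softmax_out = np.array([[0.1, 0.7, 0.2],   # class 1 most probable
                        [0.8, 0.1, 0.1]])  # class 0 most probable
sigmoid_out = np.array([[0.3], [0.9]])

print(probs_to_classes(softmax_out))  # [1 0]
print(probs_to_classes(sigmoid_out))  # [0 1]
```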
Recurrent Neural Network (RNN)
>>> from keras.layers import Embedding, LSTM
>>> model3.add(Embedding(20000, 128))
>>> model3.add(LSTM(128, dropout=0.2, recurrent_dropout=0.2))
>>> model3.add(Dense(1, activation='sigmoid'))

Other data
>>> from urllib.request import urlopen
>>> data = np.loadtxt(urlopen("http://archive.ics.uci.edu/ml/machine-learning-databases/pima-indians-diabetes/pima-indians-diabetes.data"), delimiter=",")
>>> X = data[:, 0:8]
>>> y = data[:, 8]

Model Fine-tuning

Optimization parameters
>>> from keras.optimizers import RMSprop
>>> opt = RMSprop(lr=0.0001, decay=1e-6)
>>> model2.compile(loss='categorical_crossentropy',
                   optimizer=opt,
                   metrics=['accuracy'])
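Passing an optimizer object like `RMSprop(lr=0.0001)` instead of the string `'rmsprop'` is what lets you tune its parameters. To show what those parameters control, here is the core RMSprop update rule as a scalar NumPy sketch (a simplified illustration minimizing f(w) = w², not the Keras optimizer itself):

```python
import numpy as np

def rmsprop_step(w, grad, cache, lr=0.0001, rho=0.9, eps=1e-7):
    # keep an exponential moving average of squared gradients,
    # then divide the step by its square root so step sizes self-normalize
    cache = rho * cache + (1 - rho) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

# minimize f(w) = w^2 starting from w = 1.0; the gradient is 2w
w, cache = 1.0, 0.0
for _ in range(200):
    w, cache = rmsprop_step(w, 2 * w, cache)
print(w)  # slightly below 1.0: each step is roughly lr in size
```

Because the denominator tracks the gradient magnitude, the effective step settles near `lr` regardless of the gradient's scale, which is why `lr` is the first knob to tune.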

Early stopping
>>> from keras.callbacks import EarlyStopping
>>> early_stopping_monitor = EarlyStopping(patience=2)
>>> model3.fit(x_train4,
               y_train4,
               batch_size=32,
               epochs=15,
               validation_data=(x_test4, y_test4),
               callbacks=[early_stopping_monitor])

Preprocessing (see also NumPy & Scikit-Learn)

Sequence padding
>>> from keras.preprocessing import sequence
>>> x_train4 = sequence.pad_sequences(x_train4, maxlen=80)
>>> x_test4 = sequence.pad_sequences(x_test4, maxlen=80)

One-hot encoding
>>> from keras.utils import to_categorical
>>> Y_train = to_categorical(y_train, num_classes)
>>> Y_test = to_categorical(y_test, num_classes)
>>> Y_train3 = to_categorical(y_train3, num_classes)
>>> Y_test3 = to_categorical(y_test3, num_classes)

Train and test sets
>>> from sklearn.model_selection import train_test_split
>>> X_train5, X_test5, y_train5, y_test5 = train_test_split(X,
                                                            y,
                                                            test_size=0.33,
                                                            random_state=42)

Standardization/Normalization
>>> from sklearn.preprocessing import StandardScaler
>>> scaler = StandardScaler().fit(x_train2)
>>> standardized_X = scaler.transform(x_train2)
>>> standardized_X_test = scaler.transform(x_test2)
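The two Keras preprocessing helpers above have simple semantics worth seeing spelled out. A NumPy sketch of what `to_categorical` and `pad_sequences` produce (assuming the Keras defaults of left-padding and left-truncating for `pad_sequences`; illustrative, not the library code):

```python
import numpy as np

def to_categorical_np(y, num_classes):
    # one-hot encode: row i gets a 1 in column y[i]
    out = np.zeros((len(y), num_classes))
    out[np.arange(len(y)), y] = 1
    return out

def pad_sequences_np(seqs, maxlen):
    # left-pad with zeros and left-truncate to maxlen
    out = np.zeros((len(seqs), maxlen), dtype=int)
    for i, s in enumerate(seqs):
        s = s[-maxlen:]                 # keep only the last maxlen tokens
        out[i, maxlen - len(s):] = s    # align to the right edge
    return out

print(to_categorical_np(np.array([0, 2, 1]), 3))
print(pad_sequences_np([[1, 2], [3, 4, 5, 6]], maxlen=3))  # [[0 1 2], [4 5 6]]
```

Padding is what turns the variable-length IMDB reviews into the fixed (n, 80) matrix the Embedding layer of the RNN above expects.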
Original author: DataCamp
Learn Python for Data Science Interactively
