Loading Data with torch

Date: 2021-10-11
This article introduces data loading in torch, covering usage examples, practical tips, a summary of the basics, and points to watch out for; it may serve as a useful reference.

"""
批训练,把数据变成一小批一小批数据进行训练。
DataLoader就是用来包装所使用的数据,每次抛出一批数据
"""
import torch
import torch.utils.data as Data

BATCH_SIZE = 3

x = torch.linspace(1, 10, 10)  # linspace: returns a 1-D tensor of `steps` points evenly spaced over [start, end]
y = torch.linspace(10, 1, 10)

Wrap the tensors in a dataset:

torch_dataset = Data.TensorDataset(x, y)

print(torch_dataset[1])

loader = Data.DataLoader(
    # draw batch_size samples from the dataset on each iteration
    dataset=torch_dataset,
    batch_size=BATCH_SIZE,
    shuffle=True,
    num_workers=3,
)

def show_batch():
    for epoch in range(3):  # run 3 epochs, i.e. 3 full passes over the dataset
        print('Epoch:', epoch)
        for batch_id, (batch_x, batch_y) in enumerate(loader):
            print(" batch_id:{}, batch_x:{}, batch_y:{}".format(batch_id, batch_x, batch_y))
            # print(f' batch_id:{batch_id}, batch_x:{batch_x}, batch_y:{batch_y}')

# the __main__ guard is required when num_workers > 0 on platforms that spawn subprocesses
if __name__ == '__main__':
    show_batch()


Output:

(tensor(2.), tensor(9.))
Epoch: 0
batch_id:0, batch_x:tensor([4., 5., 3.]), batch_y:tensor([7., 6., 8.])
batch_id:1, batch_x:tensor([1., 8., 9.]), batch_y:tensor([10., 3., 2.])
batch_id:2, batch_x:tensor([ 2., 7., 10.]), batch_y:tensor([9., 4., 1.])
batch_id:3, batch_x:tensor([6.]), batch_y:tensor([5.])
Epoch: 1
batch_id:0, batch_x:tensor([2., 6., 4.]), batch_y:tensor([9., 5., 7.])
batch_id:1, batch_x:tensor([9., 8., 3.]), batch_y:tensor([2., 3., 8.])
batch_id:2, batch_x:tensor([ 1., 7., 10.]), batch_y:tensor([10., 4., 1.])
batch_id:3, batch_x:tensor([5.]), batch_y:tensor([6.])
Epoch: 2
batch_id:0, batch_x:tensor([2., 7., 6.]), batch_y:tensor([9., 4., 5.])
batch_id:1, batch_x:tensor([ 3., 10., 4.]), batch_y:tensor([8., 1., 7.])
batch_id:2, batch_x:tensor([5., 1., 8.]), batch_y:tensor([ 6., 10., 3.])
batch_id:3, batch_x:tensor([9.]), batch_y:tensor([2.])
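Note that the last batch in each epoch above holds only one sample, because 10 samples do not divide evenly into batches of 3. If partial batches are undesirable, DataLoader's drop_last option discards them. A minimal sketch (with shuffle=False so the order is deterministic):

```python
import torch
import torch.utils.data as Data

x = torch.linspace(1, 10, 10)
y = torch.linspace(10, 1, 10)
dataset = Data.TensorDataset(x, y)

# drop_last=True discards the trailing incomplete batch (the single-sample
# batch_id:3 seen in the output above)
loader = Data.DataLoader(dataset, batch_size=3, shuffle=False, drop_last=True)

batches = list(loader)
print(len(batches))          # 3 full batches; the leftover 10th sample is dropped
print(batches[0][0])         # first batch_x: tensor([1., 2., 3.])
```

With shuffle=True the sample order would differ between runs, but the batch count and sizes stay the same.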

Original article: https://www.cnblogs.com/wana-/p/15394352.html