PyTorch Intermediate (3): BiLSTM

Bi-directional Long Short-Term Memory (BiLSTM): a bidirectional LSTM network.

Sometimes a prediction is more accurate when it is conditioned on both the preceding inputs and the following inputs. Bidirectional recurrent neural networks were proposed for exactly this case; the network structure is shown in the figure above.


When constructing the LSTM, pass bidirectional=True to get a bidirectional model. Because the forward and backward hidden states are concatenated, the input feature count of the fully connected layer doubles (hidden_size * 2), and the initial states h0 and c0 must be sized accordingly (num_layers * 2 in the first dimension).
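Before the full example, the effect of bidirectional=True on tensor shapes can be checked with a minimal sketch (the sizes mirror the hyperparameters used below; the random input is just a stand-in for a batch):

```python
import torch
import torch.nn as nn

# A bidirectional LSTM runs a forward and a backward pass and concatenates
# their hidden states, so the output feature dimension is hidden_size * 2.
lstm = nn.LSTM(input_size=28, hidden_size=128, num_layers=2,
               batch_first=True, bidirectional=True)

x = torch.randn(4, 28, 28)           # (batch, seq_len, input_size)
h0 = torch.zeros(2 * 2, 4, 128)      # (num_layers * 2, batch, hidden_size)
c0 = torch.zeros(2 * 2, 4, 128)

out, (hn, cn) = lstm(x, (h0, c0))
print(out.shape)   # torch.Size([4, 28, 256])  -> hidden_size * 2
print(hn.shape)    # torch.Size([4, 4, 128])   -> num_layers * 2
```

This is why the fully connected layer in the model below takes hidden_size*2 inputs.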

```python
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms

# Use the GPU when available
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Hyperparameters: each 28x28 MNIST image is read as a sequence of 28 rows
sequence_length = 28
input_size = 28
hidden_size = 128
num_layers = 2
num_classes = 10
batch_size = 100
num_epochs = 2
learning_rate = 0.003

# MNIST dataset
train_dataset = torchvision.datasets.MNIST(root='./data/',
                                           train=True,
                                           transform=transforms.ToTensor(),
                                           download=True)

test_dataset = torchvision.datasets.MNIST(root='./data/',
                                          train=False,
                                          transform=transforms.ToTensor())

# Data loaders
train_loader = torch.utils.data.DataLoader(dataset=train_dataset,
                                           batch_size=batch_size,
                                           shuffle=True)

test_loader = torch.utils.data.DataLoader(dataset=test_dataset,
                                          batch_size=batch_size,
                                          shuffle=False)
```

```python
class BiRNN(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, num_classes):
        super(BiRNN, self).__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(hidden_size * 2, num_classes)  # 2 for bidirection

    def forward(self, x):
        # Set initial states
        h0 = torch.zeros(self.num_layers * 2, x.size(0), self.hidden_size).to(device)  # 2 for bidirection
        c0 = torch.zeros(self.num_layers * 2, x.size(0), self.hidden_size).to(device)

        # Forward propagate LSTM
        out, _ = self.lstm(x, (h0, c0))  # out: tensor of shape (batch_size, seq_length, hidden_size*2)

        # Decode the hidden state of the last time step
        out = self.fc(out[:, -1, :])
        return out
```

```python
model = BiRNN(input_size, hidden_size, num_layers, num_classes).to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

# Training loop
total_step = len(train_loader)
for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        # Reshape each image into a sequence of rows
        images = images.reshape(-1, sequence_length, input_size).to(device)
        labels = labels.to(device)

        # Forward pass
        outputs = model(images)
        loss = criterion(outputs, labels)

        # Backward and optimize
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if (i + 1) % 100 == 0:
            print('Epoch [{}/{}], Step [{}/{}], Loss: {:.4f}'
                  .format(epoch + 1, num_epochs, i + 1, total_step, loss.item()))
```
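The code above trains but never measures test accuracy. A standard evaluation pass can be sketched as a helper function so it is self-contained; the name `evaluate` and its defaults are assumptions, not from the original post:

```python
import torch

def evaluate(model, loader, device, sequence_length=28, input_size=28):
    """Return classification accuracy (%) over a loader of (images, labels)."""
    model.eval()  # disable dropout/batch-norm training behavior
    correct, total = 0, 0
    with torch.no_grad():  # no gradients needed for evaluation
        for images, labels in loader:
            images = images.reshape(-1, sequence_length, input_size).to(device)
            labels = labels.to(device)
            outputs = model(images)
            predicted = outputs.argmax(dim=1)  # class with highest score
            total += labels.size(0)
            correct += (predicted == labels).sum().item()
    return 100.0 * correct / total
```

With the names from the blocks above, usage would be `print('Test Accuracy: {:.2f} %'.format(evaluate(model, test_loader, device)))`.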