Hands-On with the Temporal Convolutional Network (TCN) in MATLAB

Digest | Tech | 2024-05-12 10:06 | Guizhou
    Today I'd like to share the temporal convolutional network (TCN), covering both the algorithm principle and a hands-on code walkthrough. If you want more algorithm code, click "Read the original" at the bottom left of this article to get it. If you would like to learn about intelligent algorithms, machine learning, deep learning, or signal-processing theory, send the account a private message, and the next post will cover what you want to see.

1. Algorithm Principles
    The Temporal Convolutional Network (TCN) is built from dilated, causal 1-D convolutional layers whose input and output sequences have the same length; its structure is shown in Figure 1. The design is quite elegant and differs from ConvLSTM: ConvLSTM introduces convolution so that an LSTM can process image data, but each of its convolutions operates only on the input image at a single time step, whereas the TCN uses convolution directly to extract features across time steps.
Figure 1: TCN architecture
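    As a quick check on how the dilation factors accumulate, the short snippet below (not part of the original post) computes the receptive field of the stacked dilated causal convolutions, assuming the same hyperparameters as the script in Section 2: filter size 5, 4 residual blocks, and 2 causal convolutions per block.

% Receptive field of stacked dilated causal 1-D convolutions.
% Each causal convolution with kernel size filterSize and dilation d
% lets an output step see (filterSize-1)*d additional past steps.
filterSize    = 5;
numBlocks     = 4;
convsPerBlock = 2;

receptiveField = 1;
for i = 1:numBlocks
    dilationFactor = 2^(i-1);
    receptiveField = receptiveField + convsPerBlock*(filterSize-1)*dilationFactor;
end
disp(receptiveField)   % 121 time steps for these settings

    With these settings, each output step can therefore see roughly the last 121 input steps, which is why the dilation factor is doubled in every block instead of stacking many non-dilated layers.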

 

    Even so, the model can still be fairly deep, and deeper architectures may run into problems such as vanishing gradients. To deal with this, the TCN uses a structure similar to the residual block in ResNet: each block's output is added back to its input through a skip connection (with a 1x1 convolution on the skip path when the channel counts differ), which makes the resulting TCN more generic and easier to train, as shown in the figure below.
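    As an illustration, a single residual block can be wired up as follows. This is a minimal sketch with assumed layer names (it requires the Deep Learning Toolbox and is not part of the original post); the full script in Section 2 repeats this pattern in a loop with a doubling dilation factor, and only the first block needs the 1x1 convolution on the skip path to match channel counts.

% One TCN residual block: two dilated causal convolutions on the main
% path, a 1x1 convolution on the skip path, merged by an addition layer.
numFeatures = 3;    % illustrative number of input features
numFilters  = 64;
filterSize  = 5;
dilation    = 1;

mainPath = [
    convolution1dLayer(filterSize,numFilters,DilationFactor=dilation,Padding="causal",Name="conv1")
    layerNormalizationLayer
    convolution1dLayer(filterSize,numFilters,DilationFactor=dilation,Padding="causal",Name="conv2")
    layerNormalizationLayer
    reluLayer
    additionLayer(2,Name="add")];

lg = layerGraph([sequenceInputLayer(numFeatures,Name="input"); mainPath]);

% Skip connection: project the input to numFilters channels and feed it
% into the second input of the addition layer.
lg = addLayers(lg,convolution1dLayer(1,numFilters,Name="convSkip"));
lg = connectLayers(lg,"input","convSkip");
lg = connectLayers(lg,"convSkip","add/in2");

figure
plot(lg)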

2. Hands-On Code

clc
clear

%% Load the training and test sets
% HumanActivityTrain.mat / HumanActivityTest.mat hold the human-activity
% sensor sequences used by the corresponding MATLAB example.
s = load("HumanActivityTrain.mat");
XTrain = s.XTrain;
TTrain = s.YTrain;

numObservations = numel(XTrain);
classes = categories(TTrain{1});
numClasses = numel(classes);
numFeatures = size(s.XTrain{1},1);

s1 = load("HumanActivityTest.mat");
XTest = s1.XTest;
TTest = s1.YTest;

%% Build the TCN network
numFilters = 64;
filterSize = 5;
dropoutFactor = 0.005;
numBlocks = 4;

layer = sequenceInputLayer(numFeatures,Normalization="rescale-symmetric",Name="input");
lgraph = layerGraph(layer);

outputName = layer.Name;

for i = 1:numBlocks
    dilationFactor = 2^(i-1);

    layers = [
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
        layerNormalizationLayer
        spatialDropoutLayer(dropoutFactor)
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal")
        layerNormalizationLayer
        reluLayer
        spatialDropoutLayer(dropoutFactor)
        additionLayer(2,Name="add_"+i)];

    % Add and connect layers.
    lgraph = addLayers(lgraph,layers);
    lgraph = connectLayers(lgraph,outputName,"conv1_"+i);

    % Skip connection.
    if i == 1
        % Include a 1x1 convolution in the first skip connection so the
        % channel counts match at the addition layer.
        layer = convolution1dLayer(1,numFilters,Name="convSkip");

        lgraph = addLayers(lgraph,layer);
        lgraph = connectLayers(lgraph,outputName,"convSkip");
        lgraph = connectLayers(lgraph,"convSkip","add_" + i + "/in2");
    else
        lgraph = connectLayers(lgraph,outputName,"add_" + i + "/in2");
    end

    % Update layer output name.
    outputName = "add_" + i;
end

layers = [
    fullyConnectedLayer(numClasses,Name="fc")
    softmaxLayer
    classificationLayer];
lgraph = addLayers(lgraph,layers);
lgraph = connectLayers(lgraph,outputName,"fc");

%% Plot the network structure
figure
plot(lgraph)
title("Temporal Convolutional Network")

%% Set the training options
options = trainingOptions("adam", ...
    MaxEpochs=60, ...
    MiniBatchSize=1, ...
    Plots="training-progress", ...
    Verbose=0);

%% Train the network
net = trainNetwork(XTrain,TTrain,lgraph,options);

%% Test the network
YPred = classify(net,XTest);

figure
confusionchart(TTest{1},YPred{1})

accuracy = mean(YPred{1} == TTest{1})
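    Note that spatialDropoutLayer is not a built-in layer: in the MATLAB example this code follows, it is provided as a supporting custom layer file. If you do not have that file, the sketch below is one possible minimal implementation (save it as spatialDropoutLayer.m on the path); the property name and masking details here are illustrative choices, not necessarily identical to the original supporting file. It drops whole channels during training and is the identity at prediction time.

classdef spatialDropoutLayer < nnet.layer.Layer & nnet.layer.Formattable
    % Minimal spatial (channel-wise) dropout layer for sequence data.
    properties
        DropoutFactor   % probability of zeroing an entire channel
    end
    methods
        function layer = spatialDropoutLayer(dropoutFactor,args)
            arguments
                dropoutFactor = 0.005
                args.Name = ""
            end
            layer.Name = args.Name;
            layer.DropoutFactor = dropoutFactor;
            layer.Description = "Spatial dropout, factor " + dropoutFactor;
        end
        function Z = predict(layer,X)
            % Dropout is the identity at prediction time.
            Z = X;
        end
        function Z = forward(layer,X)
            % During training, drop entire channels: the mask is constant
            % along the spatial (S) and time (T) dimensions, and surviving
            % channels are rescaled to keep the expected activation.
            p = layer.DropoutFactor;
            maskSize = size(X);
            maskSize(ismember(dims(X),'ST')) = 1;
            mask = (rand(maskSize,"like",X) > p) / single(1 - p);
            Z = X .* mask;
        end
    end
end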

Simulation results:

The complete code is available for free: give this post a like and share it to your Moments, then reply "TCN算法" to the account to receive it.

    Some of this material is drawn from online sources; if anything infringes your rights, please contact the author to have it removed.


    That's all for today's share. If you would like future posts on intelligent algorithms, machine learning, deep learning, or signal-processing theory, send the account a private message. Your shares, likes, and bookmarks are what keep the content coming!


Author | 华夏

Editor | 华夏

Proofreader | 华夏


matlab学习之家
Sharing knowledge on MATLAB modeling and MATLAB programming