The project describes the teaching process of a multi-layer neural network employing the backpropagation algorithm. To illustrate the process, a three-layer neural network with two inputs and one output is used (the original figures are not reproduced here).

Each neuron is composed of two units. The first unit adds the products of the weight coefficients and the input signals. The second unit realizes a nonlinear function, called the neuron activation function. Signal e is the adder output signal, and y = f(e) is the output signal of the nonlinear element. Signal y is also the output signal of the neuron.

To teach the neural network we need a training data set. The training data set consists of input signals (x1 and x2) together with the corresponding target (desired output) z. Network training is an iterative process: in each iteration the weight coefficients of the nodes are modified using new data from the training data set. The modification is calculated with the algorithm described below.

Each teaching step starts with presenting both input signals from the training set. After this stage the output signal values of every neuron in every network layer can be determined. The original figures illustrate how the signal propagates through the network. The symbols w(xm)n represent the weights of the connections between network input xm and neuron n in the input layer; the symbols yn represent the output signal of neuron n.

The signal then propagates through the hidden layer. The symbols wmn represent the weights of the connections between the output of neuron m and the input of neuron n in the next layer. Finally, the signal propagates through the output layer.

In the next algorithm step the output signal of the network, y, is compared with the desired output value (the target) found in the training data set. The difference is called the error signal δ of the output-layer neuron.

It is impossible to compute the error signal for the internal neurons directly, because the output values of these neurons are unknown. For many years an effective method for training multi-layer networks was unknown; only in the mid-eighties was the backpropagation algorithm worked out. Its idea is to propagate the error signal δ (computed in a single teaching step) back to all neurons whose output signals were inputs of the neuron in question.

The weight coefficients wmn used to propagate the errors back are the same as those used when computing the output value; only the direction of data flow is reversed (signals are propagated from outputs to inputs, one layer after the other). This technique is applied to all network layers. If the propagated errors come from several neurons, they are added.

When the error signal of each neuron has been computed, the weight coefficients of each neuron's input connections may be modified. In the formulas below, df(e)/de represents the derivative of the activation function of the neuron whose weights are being modified. For example, by the chain rule, for the weight connecting input x1 to neuron 1:

df1(e)/dw(x1)1 = [df1(e)/de] * [de/dw(x1)1] = [df1(e)/de] * x1

The coefficient η affects the network teaching speed. There are a few techniques for selecting this parameter. The first method is to start the teaching process with a large value of the parameter; while the weight coefficients are being established, the parameter is gradually decreased. The second, more complicated, method starts teaching with a small parameter value; during the teaching process the parameter is increased as teaching advances and then decreased again in the final stage. Starting the teaching process with a low parameter value makes it possible to determine the signs of the weight coefficients.
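The weight-modification formulas that the text refers to ("in the formulas below") appeared only as figures in the original and are missing here. The following is a hedged reconstruction, consistent with the chain-rule expression above and with standard backpropagation, but not taken verbatim from the original document:

```latex
\begin{align*}
  % assumed reconstruction of the weight-update rules (standard backpropagation),
  % using the symbols introduced above: \eta = teaching-speed coefficient,
  % \delta = error signal, df(e)/de = activation-function derivative
  w'_{(x_1)1} &= w_{(x_1)1} + \eta \, \delta_1 \, \frac{df_1(e)}{de} \, x_1 \\
  w'_{mn}     &= w_{mn}     + \eta \, \delta_n \, \frac{df_n(e)}{de} \, y_m
\end{align*}
```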
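To make the whole procedure concrete, here is a minimal Python sketch of one teaching step. The sigmoid activation function, the single hidden layer of three neurons, the absence of bias terms, and every identifier name (train_step, w_hidden, w_output, eta, delta) are assumptions made for illustration; the text above does not fix them, and the original network drawing may use a different topology.

```python
# Minimal sketch of one backpropagation teaching step, under the assumptions
# stated above (sigmoid activation, one hidden layer of three neurons, no biases).
import math
import random

def f(e):
    """Neuron activation function (assumed sigmoid); y = f(e)."""
    return 1.0 / (1.0 + math.exp(-e))

def df_de(e):
    """Derivative of the assumed sigmoid: df(e)/de = f(e) * (1 - f(e))."""
    fe = f(e)
    return fe * (1.0 - fe)

def train_step(x, z, w_hidden, w_output, eta):
    """One teaching step: forward pass, error backpropagation, weight update."""
    # Forward pass: each neuron first forms the adder signal e = sum(w * input),
    # then the nonlinear element produces y = f(e).
    e_hidden = [sum(w_i * x_i for w_i, x_i in zip(w_n, x)) for w_n in w_hidden]
    y_hidden = [f(e_n) for e_n in e_hidden]
    e_out = sum(w_i * y_i for w_i, y_i in zip(w_output, y_hidden))
    y = f(e_out)

    # Error signal of the output-layer neuron: target minus network output.
    delta_out = z - y

    # Backpropagation: the same weights used in the forward pass carry the error
    # back; errors arriving from several neurons would be summed (here there is
    # only a single output neuron).
    delta_hidden = [w_i * delta_out for w_i in w_output]

    # Weight modification: w' = w + eta * delta * df(e)/de * input_signal.
    new_w_output = [w_i + eta * delta_out * df_de(e_out) * y_i
                    for w_i, y_i in zip(w_output, y_hidden)]
    new_w_hidden = [[w_i + eta * d_n * df_de(e_n) * x_i
                     for w_i, x_i in zip(w_n, x)]
                    for w_n, d_n, e_n in zip(w_hidden, delta_hidden, e_hidden)]
    return new_w_hidden, new_w_output, y

# Example usage: one teaching step on the training pair (x1, x2) = (1, 0), z = 1.
random.seed(0)
w_hidden = [[random.uniform(-1.0, 1.0) for _ in range(2)] for _ in range(3)]
w_output = [random.uniform(-1.0, 1.0) for _ in range(3)]
w_hidden, w_output, y = train_step([1.0, 0.0], 1.0, w_hidden, w_output, eta=0.5)
print("network output before update:", y)
```

A teaching-speed schedule of the kind discussed above (starting large and decreasing, or starting small, increasing, and then decreasing again) can be realized simply by changing the eta value passed to train_step from one iteration to the next.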
References: Ryszard Tadeusiewicz, Sieci neuronowe, Kraków 1992.