Zhu et al. BMC Bioinformatics 2019, 20(Suppl 18):575 https://doi.org/10.1186/s12859-019-3131-8

RESEARCH - Open Access

Attention-based recurrent neural network for influenza epidemic prediction

Xianglei Zhu1, Bofeng Fu1, Yaodong Yang1, Yu Ma3, Jianye Hao1*, Siqi Chen1, Shuang Liu1, Tiegang Li3, Sen Liu2, Weiming Guo2 and Zhenyu Liao4,5

From Biological Ontologies and Knowledgebases workshop at IEEE BIBM 2018, Madrid, Spain,
3-6 December 2018

Abstract

Background: Influenza is an infectious respiratory disease that can cause serious public health hazards. Due to its huge threat to society, precise real-time forecasting of influenza outbreaks is of great value to the public.

Results: In this paper, we propose a new deep neural network structure that forecasts the real-time influenza-like illness rate (ILI%) in Guangzhou, China. Long short-term memory (LSTM) neural networks are applied for precise forecasting because of the long-term attributes and diversity of influenza epidemic data. We devise a multi-channel LSTM neural network that can draw information from different types of inputs. We also add an attention mechanism to improve forecasting accuracy. With this structure, we can deal with the relationships between multiple inputs more appropriately. Our model fully considers the information in the dataset and addresses the practical problem of influenza epidemic forecasting in Guangzhou in a targeted way.

Conclusion: We assess the performance of our model by comparing it with different neural network structures and other state-of-the-art methods. The experimental results indicate that our model is strongly competitive and can provide effective real-time influenza epidemic forecasting.
Keywords: Influenza epidemic prediction, Attention mechanism, Multi-channel LSTM neural network

Background
Influenza is an infectious respiratory disease that can cause serious public health hazards. It can aggravate the original underlying disease after infection, causing secondary bacterial pneumonia and acute exacerbation of chronic heart and lung disease. Furthermore, the 2009 H1N1 pandemic caused between 151,700 and 575,400 deaths worldwide during the first year the virus circulated [1]. Therefore, precise on-line monitoring and forecasting of influenza epidemic outbreaks is of great value to public health departments. Influenza detection and surveillance systems provide epidemiologic information that can help public health sectors develop preventive measures and assist local medical institutions in deployment planning [2].

(*Correspondence: haojianye@gmail.com. Xianglei Zhu and Bofeng Fu contributed equally to this work. 1College of Intelligence and Computing, Tianjin University, Peiyang Park Campus: No. 135 Yaguan Road, Haihe Education Park, 300350 Tianjin, China. Full list of author information is available at the end of the article.)

Influenza-like illness (ILI) is an infectious respiratory infection measure defined by the World Health Organization (WHO): ILI is defined by a measured fever higher than 38 °C and cough, with onset within the previous 10 days [3]. Our prediction target, ILI%, is the ratio of the number of influenza-like cases to the number of visiting patients. In the field of influenza surveillance, ILI% is often used as an indicator to help determine whether a possible influenza epidemic exists. When the ILI% baseline is exceeded, the influenza season has arrived, reminding the health administrations to take timely preventive measures.

In recent years, more and more researchers have concentrated on precise on-line monitoring, early detection, and forecasting of influenza epidemic outbreaks. Thus, influenza epidemic outbreak forecasting has become a most active research direction.
The information from website searches and social network applications, such as Twitter and Google Correlate [4-6], provides sufficient data support for this research area.
Previous methods are commonly built on linear models, such as least absolute shrinkage and selection operator (LASSO) or penalized regression [4, 6, 7]. Some researchers have also implemented deep learning models to solve influenza epidemic forecasting problems [8, 9]. However, these methods cannot efficiently provide a precise forecast of ILI% one week in advance. First, the online data is not accurate enough and lacks necessary features, so it cannot fully reflect the trend of the influenza epidemic. Second, influenza epidemic data is usually complex, non-stationary, and very noisy, and traditional linear models cannot handle multi-variable inputs appropriately. Third, previously proposed deep learning methods did not consider the time-sequence property of influenza epidemic data.

In this paper, we use influenza surveillance data provided by the Guangzhou Center for Disease Control and Prevention as our dataset. This dataset includes multiple features and is counted separately for each district in Guangzhou. Our approach takes advantage of these two characteristics. Meanwhile, we consider the time-sequence property, so that our approach solves the influenza epidemic forecasting problem in Guangzhou with pertinence. Due to the relevant specifications of data collection, our method is also applicable in other regions.
We concentrate on implementing deep learning models to address the influenza outbreak forecasting problem. Recently, deep learning methods have obtained remarkable performance in various research areas, from computer vision and speech recognition to climate forecasting [10-12]. We implement long short-term memory (LSTM) neural networks [13] as the fundamental forecasting method, because influenza epidemic data naturally has a time series attribute. Considering that different types of input data correspond to different characteristics, a single LSTM with a specific filter may not capture the time series information comprehensively. By using a multi-channel architecture, we can better capture the time series attributes of the data. This not only ensures the integration of various relevant descriptors in the high-level network, but also ensures that the input data will not interfere with each other in the underlying network. The structured LSTM can provide robust fitting ability, which has been proven in several papers [14, 15]. We further enhance our method with an attention mechanism. In the attention layer, the probability of occurrence of each value in the output sequence depends on the values in the input sequence. By designing this architecture, we can deal with the input stream relationships among multiple regions more appropriately. We name our model Att-MCLSTM, which stands for attention-based multi-channel LSTM.
Our main contributions can be summarized as follows: (1) We test our model on the Guangzhou influenza surveillance dataset, which is authentic and reliable and contains multiple attributes and time series features. (2) We propose an attention-based multi-channel LSTM structure that combines different well-behaved approaches. The structure takes both the forecasting problem and the attributes of influenza epidemic data into account. The proposed model can be seen as an alternative for forecasting influenza epidemic outbreaks in other districts. (3) The proposed model makes full use of the information in the dataset, solving the actual problem of influenza epidemic forecasting in Guangzhou with pertinence. The experimental results demonstrate the validity of our method. To the best of our knowledge, this is the first study that applies LSTM neural networks to the influenza outbreak forecasting problem.

The rest of this paper is organized as follows. In the second section, we illustrate the details of our method. In the third section, we evaluate the performance of our method by comparing it with different neural network structures and other prior-art methods. In the fourth section, we discuss conclusions and prospects for future work.
Methods
The accuracy of forecasting can be enhanced by combining multiple models [16-26]. In this paper, we devise a novel LSTM neural network structure to settle the influenza epidemic forecasting problem in Guangzhou, China. Our model can extract characteristics from time series data more effectively and takes the different impacts of different parts of the data into consideration. In order to present our model clearly, we first illustrate our dataset. The following sections give further illustrations of the dataset, the overall idea of our model, details of LSTM neural networks, the attention mechanism, attention-based multi-channel LSTM, data normalization, and the evaluation method.

Dataset description
The influenza surveillance data we used covers 9 years. Statistics on influenza epidemic data in 9 regions are counted each year. The dataset includes 6 modules, and each of these modules has multiple features. The dataset has one record each week, and data for 52 weeks is counted each year.
Design of the proposed model
In Fig. 1, we demonstrate the flow diagram of our method. The integrated flow diagram has two parts, a training part and a test part. In the training part, we first select 19 relevant features after the data cleaning and normalization processes. We further illustrate the chosen modules and features in Table 1. Table 1 does not include the basic information module, which contains time information, districts, and population.

Fig. 1 The flowchart of attention-based multi-channel LSTM

We use model-based ranking as our feature selection method. To implement it, we delete one feature at a time and input the remaining features into the same forecasting model each time. If the forecasting accuracy drops, the feature we removed is relevant to our forecasting objective. After ranking all the forecasting accuracies, we select the 19 features that are most relevant to the forecasting objective. Then we separate the dataset into a training dataset and a test dataset. The training dataset contains 80 percent of the data, from which the model extracts the annual trend and seasonal periodicity. In the test part, we test our model on the test dataset. Then, we perform a denormalization process to reconstruct the original values. Finally, we assess our model and compare it with other models.
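The model-based ranking step can be sketched as follows. This is an illustrative reconstruction, not the authors' code: `lstsq_mape` is a stand-in evaluator (a least-squares fit scored by MAPE) for the actual forecasting model, and the feature-matrix layout is an assumption.

```python
import numpy as np

def lstsq_mape(X, y):
    """Stand-in evaluator for the forecasting model: fit a least-squares
    model on X and score it by MAPE (lower is better)."""
    A = np.c_[X, np.ones(len(X))]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ w
    return float(np.mean(np.abs((y - pred) / y)) * 100)

def rank_features_by_removal(X, y, evaluate=lstsq_mape):
    """Model-based ranking: delete one feature at a time, evaluate the model
    on the remaining features, and rank features by how much the error grows
    when they are removed (bigger growth = more relevant feature)."""
    base_error = evaluate(X, y)
    growth = {j: evaluate(np.delete(X, j, axis=1), y) - base_error
              for j in range(X.shape[1])}
    return sorted(growth, key=growth.get, reverse=True)
```

Keeping the top 19 entries of the returned ranking mirrors the paper's selection step.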
Data normalization
Min-Max normalization is a linear transformation strategy [27]. This method maintains the relationships among all the original data. Min-Max normalization transforms a value x to y, where y is defined as Eq. 1:

    y = (x - min) / (max - min)    (1)

where min is the smallest value in the data and max is the biggest value in the data. After data normalization, the features of the data are scaled between 0 and 1. We perform a denormalization process to reconstruct the original data. Given a normalized value y, its original value x is defined as Eq. 2:

    x = (max - min) * y + min    (2)
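Eqs. 1 and 2 translate directly into code. A minimal sketch (column-wise over a feature matrix, assuming max > min in every column):

```python
import numpy as np

def minmax_normalize(x):
    """Eq. 1: scale each feature column into [0, 1]; also return (min, max)
    so the transformation can be inverted later."""
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / (hi - lo), lo, hi

def minmax_denormalize(y, lo, hi):
    """Eq. 2: map normalized values back to the original scale."""
    return (hi - lo) * y + lo
```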
Table 1 Modules and features description for Section 2.1

Legal influenza cases report module
  Legal influenza cases numbers: the number of influenza cases in the national infectious disease reporting system.
Epidemic monitoring module
  Influenza outbreaks numbers: more than 10 influenza-like cases occurred within one week in the same unit.
  Affected cases numbers: the total number of people affected by the epidemic.
Symptom monitoring module
  Influenza-like cases numbers (0-5 age): the number of influenza-like cases aged 0-5.
  Influenza-like cases numbers (5-15 age): the number of influenza-like cases aged 5-15.
  Influenza-like cases numbers (15-25 age): the number of influenza-like cases aged 15-25.
  Influenza-like cases numbers (25-60 age): the number of influenza-like cases aged 25-60.
  Influenza-like cases numbers (>60 age): the number of influenza-like cases aged over 60.
  Total influenza-like cases numbers: the total number of influenza-like cases.
  Total visiting patients numbers: the total number of visiting patients.
  Upper respiratory tract infections numbers: the number of upper respiratory tract infections.
Pharmacy monitoring module
  Chinese patent cold medicines: sales of Chinese patent cold medicines.
  Other cold medicines: sales of other cold medicines.
Climate data module
  Average temperature (°C), maximum temperature (°C), minimum temperature (°C), rainfall (mm), air pressure (hPa), relative humidity (%).
Long short-term memory neural network
Recurrent neural networks have the ability to dynamically combine experiences because of their internal recurrence [28]. Different from other traditional RNNs, LSTM can deal with the gradient vanishing problem [29]. The memory units of LSTM cells retain the time series attributes of a given context [29]. Some studies have proven that LSTM neural networks can yield better performance than other traditional RNNs when dealing with long-term time series data [30]. The structure of a single LSTM cell is illustrated in Fig. 2.

Fig. 2 The structure of a single LSTM cell

The gates control the flow of information, that is, the interactions between different cells and the cell itself. The input gate controls the memory state updating process. The output gate controls whether the output flow can alter other cells' memory states. The forget gate can choose to remember or forget its previous state. LSTM is implemented by the following composite functions:

    i_t = σ(W_xi x_t + W_hi h_{t-1} + W_ci c_{t-1} + b_i)    (3)
    f_t = σ(W_xf x_t + W_hf h_{t-1} + W_cf c_{t-1} + b_f)    (4)
    c_t = f_t ⊙ c_{t-1} + i_t ⊙ tanh(W_xc x_t + W_hc h_{t-1} + b_c)    (5)
    o_t = σ(W_xo x_t + W_ho h_{t-1} + W_co c_t + b_o)    (6)
    h_t = o_t ⊙ tanh(c_t)    (7)

where σ represents the logistic sigmoid function; i, f, o, and c represent the input gate, forget gate, output gate, and cell input activation vectors respectively; and h represents the hidden vector. The weight matrix subscripts have the intuitive meaning; for example, W_hi is the hidden-input gate matrix.
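Eqs. 3-7 amount to the single cell update below. This is an illustrative NumPy sketch of the peephole LSTM written in the equations (the gates also see the cell state, with diagonal peephole weights W_ci, W_cf, W_co applied elementwise), not the authors' implementation; the parameter dictionary `p` is an assumed layout.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM cell update following Eqs. 3-7."""
    i = sigmoid(p["Wxi"] @ x_t + p["Whi"] @ h_prev + p["Wci"] * c_prev + p["bi"])  # Eq. 3
    f = sigmoid(p["Wxf"] @ x_t + p["Whf"] @ h_prev + p["Wcf"] * c_prev + p["bf"])  # Eq. 4
    c = f * c_prev + i * np.tanh(p["Wxc"] @ x_t + p["Whc"] @ h_prev + p["bc"])     # Eq. 5
    o = sigmoid(p["Wxo"] @ x_t + p["Who"] @ h_prev + p["Wco"] * c + p["bo"])       # Eq. 6
    h = o * np.tanh(c)                                                             # Eq. 7
    return h, c
```

Calling `lstm_step` over the weeks of one input window, carrying (h, c) forward, yields the channel's hidden sequence.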
Attention mechanism
Traditional encoder-decoder structures typically encode an input sequence into a fixed-length vector representation. However, this model has drawbacks: when the input sequence is very long, it is difficult to learn a feasible vector representation. One fundamental idea of the attention mechanism [31] is to abandon the conventional encoder-decoder structure. The attention mechanism trains a model that selectively learns from the input streams by conserving the intermediate outputs of the LSTM. In the attention structure, the output sequences are affiliated with the input sequences. In other words, the probability of occurrence of each value in the output sequence depends on the values in the input sequence.

Figure 3 illustrates the attention mechanism. The attention layer calculates the weighted distribution of X_1, ..., X_T. The input of S_t contains the output of the attention layer. The probability of occurrence of the output sequence ..., y_{t-1}, y_t, ... depends on the input sequence X_1, X_2, ..., X_T. h_i represents the hidden vector, and A_{t,i} represents the weight of the i-th input at time step t.

The attention layer takes n inputs y_1, ..., y_n and a context sequence c, and outputs a vector z, the weighted distribution of the y_i for the given context c. The attention mechanism is implemented by the following composite functions:

    m_i = tanh(W_cm c + W_ym y_i)    (8)
    s_i ∝ exp(⟨w_m, m_i⟩)    (9)
    Σ_i s_i = 1    (10)
    z = Σ_i s_i y_i    (11)

where m_i is calculated by a tanh layer, and s_i is the softmax of the m_i projected on a learned direction w_m. The output z is the weighted arithmetic mean of all the y_i, and the W matrices represent the relevance of each variable according to the context c.

Fig. 3 The diagram of the attention mechanism. The attention layer calculates the weighted distribution of X_1, ..., X_T; A_{t,i} represents the weight of the i-th input at time step t
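Eqs. 8-11 reduce to a softmax-weighted average of the intermediate outputs. A minimal NumPy sketch (parameter names W_cm, W_ym, w_m follow the equations; the shapes are assumptions for illustration):

```python
import numpy as np

def attention_pool(ys, c, Wcm, Wym, wm):
    """ys: (n, d) intermediate outputs y_1..y_n; c: (dc,) context vector.
    Returns z (Eq. 11) and the softmax weights s (Eqs. 9-10)."""
    m = np.tanh((Wcm @ c)[:, None] + Wym @ ys.T)  # Eq. 8: one column m_i per y_i
    e = wm @ m                                    # project each m_i on direction w_m
    s = np.exp(e - e.max())
    s /= s.sum()                                  # Eqs. 9-10: weights sum to 1
    z = s @ ys                                    # Eq. 11: weighted mean of the y_i
    return z, s
```

Subtracting `e.max()` before exponentiating is a standard numerically stable softmax and leaves the weights unchanged.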
Attention-based multi-channel LSTM
In Fig. 4, we illustrate the overall architecture of our model. We separate our dataset into two categories. First, we classify average temperature, maximum temperature, minimum temperature, rainfall, air pressure, and relative humidity together as the climate-related data category. The rest of the features are classified together as the influenza-related data category. In our dataset, each region has its own influenza-related data, and all regions share the same climate-related data every week.

Because our dataset has the above characteristics, the input of Att-MCLSTM contains two parts. First, the influenza-related data is input into a series of LSTM neural networks (LSTM1, ..., LSTM9) to capture correlative features. Second, the climate-related data is input into a single LSTM neural network (LSTM10) to capture the long-term time series attribute of influenza epidemic data. In the first part, each LSTM neural network acquires the influenza-related data from one distinct region. In order to make full use of the complementarity among the regions, the outputs of the LSTM neural networks (LSTM1, ..., LSTM9) are concatenated in a higher layer (Merge1). This higher layer obtains the fused descriptors of the underlying neural networks. After we capture the features of every region, we still want to weight the intermediate sequences, because the data of each region has a different influence on the final forecasting result. Therefore, the intermediate sequences pass through an attention layer (Attention) and a fully connected layer (Dense1) in turn. Thereafter, we concatenate the outputs of these two parts together (Merge2). Finally, the intermediate sequences are passed through two fully connected layers (Dense2, Dense3). At this point we have acquired the high-level features of the input data, and they are used to solve the influenza epidemic forecasting problem.

By designing a multi-channel structure, we can better extract the time-sequence property of each type of data. This not only ensures the integration of various relevant descriptors in the high-level network, but also ensures that the input data will not interfere with each other in the underlying network. In the attention layer, the probability of occurrence of each value in the output sequence depends on the values in the input sequence. This structure allows us to handle the relationships of input data between different districts more appropriately.
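The data flow just described can be traced at the shape level. The sketch below is not the trained model: each LSTM channel is replaced by a stand-in summary function (a fixed random projection of the last week) and all weights are random; it only shows how the 9 regional channels and the shared climate channel merge into a single ILI% output, with layer names matching Fig. 4.

```python
import numpy as np

rng = np.random.default_rng(0)
WEEKS, REGIONS, FLU_FEATS, CLIM_FEATS, HIDDEN = 10, 9, 13, 6, 32

def channel_summary(seq, hidden):
    """Stand-in for an LSTM channel: a fixed projection of the last time step."""
    W = rng.normal(size=(seq.shape[-1], hidden))
    return np.tanh(seq[-1] @ W)

# one input sample: 9 regional influenza channels plus one shared climate channel
flu = [rng.normal(size=(WEEKS, FLU_FEATS)) for _ in range(REGIONS)]
climate = rng.normal(size=(WEEKS, CLIM_FEATS))

codes = np.stack([channel_summary(x, HIDDEN) for x in flu])  # LSTM1..9, Merge1 -> (9, 32)
scores = np.exp(codes @ rng.normal(size=HIDDEN))
weights = scores / scores.sum()                              # Attention over the 9 regions
attended = weights @ codes                                   # weighted fusion -> (32,)
dense1 = np.tanh(attended @ rng.normal(size=(HIDDEN, 16)))   # Dense1 -> (16,)
climate_code = channel_summary(climate, HIDDEN)              # LSTM10 -> (32,)
merged2 = np.concatenate([dense1, climate_code])             # Merge2 -> (48,)
dense2 = np.tanh(merged2 @ rng.normal(size=(48, 10)))        # Dense2 -> (10,)
ili_pred = float(dense2 @ rng.normal(size=10))               # Dense3 -> scalar ILI%
```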
Evaluation method
To evaluate our method, we use the mean absolute percentage error (MAPE) as the evaluation criterion. Its formula is expressed as Eq. 12:

    MAPE = (1/n) Σ_{i=1}^{n} |(y_i - x_i) / y_i| × 100    (12)

where y_i denotes the i-th actual value and x_i denotes the i-th predicted value. The lower the MAPE, the higher the accuracy of the method.

Fig. 4 The structure of attention-based multi-channel LSTM
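Eq. 12 in code (a minimal sketch; it assumes no actual value y_i is zero):

```python
import numpy as np

def mape(actual, predicted):
    """Eq. 12: mean absolute percentage error; lower means more accurate."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)
```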
Experiments
In this section, we conduct two experiments to verify the Att-MCLSTM model. In the first experiment, we evaluate the number of consecutive weeks of data needed to forecast ILI% for the next week. In the second experiment, we compare our model with different neural network structures and other methods. Each experimental result is the average of 10 repeated trials.
Selection of consecutive weeks
In this experiment, we set the number of consecutive weeks to 6, 8, 10, 12, and 14 respectively. The hyper-parameters of each layer are listed in Table 2. The activation functions we used are linear. The loss function and optimizer are MAPE and Adam respectively. We use the first 370 consecutive weeks' data in the training phase and the remaining data in the test phase. Each data sample includes 6 features in the climate-related data category and the influenza-related data of 9 different districts; each district's influenza-related data contains 13 features. The climate-related data and each district's influenza-related data are input into the climate-related channel and the influenza-related channels respectively. The forecasting results are shown in Table 3.
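The windowing implied by this setup can be sketched as follows. This is illustrative only: it assumes the weekly records are stacked row-wise and that column 0 holds the ILI% target, a layout chosen here rather than stated in the paper.

```python
import numpy as np

def make_windows(series, n_weeks=10, target_col=0):
    """Turn a (T, features) weekly series into supervised samples: each
    sample is n_weeks consecutive weeks of features, labeled with the next
    week's value of the target column (ILI%)."""
    X, y = [], []
    for t in range(len(series) - n_weeks):
        X.append(series[t:t + n_weeks])
        y.append(series[t + n_weeks, target_col])
    return np.array(X), np.array(y)
```

The paper's split (first 370 weeks for training, the remainder for test) can then be applied to the resulting windows.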
Performance validation
In this experiment, we verify the validity of our model. First, we compare Att-MCLSTM with MCLSTM in terms of forecasting accuracy, in order to verify the effect of the attention mechanism. Both models use the same multi-channel architecture (as shown in Fig. 4); the only difference is that the attention layer is deleted in MCLSTM. The parameter settings and data input method are as described in the first experiment.

Table 2 The size of every unit in the Att-MCLSTM neural network for Section 3.1

  Layer name          Units number
  LSTM1, ..., LSTM9   32
  LSTM10              32
  Dense1              16
  Dense2              10
  Dense3              1

Table 3 The MAPE of the prediction results for Section 3.1

  Number of weeks   MAPE
  6                 0.107
  8                 0.092
  10                0.086
  12                0.106
  14                0.109

Second, we compare MCLSTM with LSTM in terms of forecasting accuracy, in order to verify the effect of the multi-channel structure. For MCLSTM, the parameter settings and data input method are as described in the first experiment. For LSTM, we input all features into one LSTM layer to capture the fused descriptors. Instead of separating the dataset according to different regions, we sum the corresponding influenza-related features of each week over all regions, so each data record includes the 19 selected features. The data containing these 19 features is passed through a fully connected layer to acquire high-level features. The unit numbers of the LSTM layer and the fully connected layer are 32 and 1 respectively. Third, we demonstrate that LSTMs can yield better performance than RNNs when dealing with time series data.
Results
(1) As can be seen from Table 3, 10 consecutive weeks' data yields the best performance. (2) Table 4 shows that Att-MCLSTM is strongly competitive and can provide effective real-time influenza epidemic forecasting.

Discussion
The results of the first experiment indicate that 10 consecutive weeks' data can appropriately reflect the time series attribute of influenza data. If the length of the input data is shorter than 10 weeks, the input does not contain enough time series information. On the contrary, if the length is longer than 10 weeks, the noise inside the input data increases, leading to a decrease in forecasting accuracy. Therefore, in our experiments, each data record includes 10 consecutive weeks' data.

Table 4 The MAPE of the prediction results for Section 3.2

  Schemes      MAPE
  Att-MCLSTM   0.086
  MCLSTM       0.105
  LSTM         0.118
  RNN          0.132
The results of the second experiment show that Att-MCLSTM yields the best performance. In Table 4, from the first two rows, we can conclude that using the attention mechanism improves the MAPE from 0.105 to 0.086. The reason is that the attention layer deals with the relationships of the input streams among the regions more appropriately. From the second and third rows, we can conclude that using the multi-channel structure improves the MAPE from 0.118 to 0.105. The reason is that the multi-channel structure can better capture the time series attributes of the different input streams. From the last two rows, we can conclude that using LSTM improves the MAPE from 0.132 to 0.118. The reason is that the LSTM neural network deals with time series data better. This result also demonstrates the time series attribute of influenza epidemic data.

Figure 5 shows the actual values and the predicted values of the four models. We can see that the result of Att-MCLSTM is close to the actual output, while there are more obvious differences between the predicted results and the actual values for the other three models. This verifies that adopting Att-MCLSTM to analyze the sequential information helps extract the time-sequence characteristics more accurately and comprehensively.
Conclusion and future work
In this paper, we propose a new deep neural network structure (Att-MCLSTM) to forecast the ILI% in Guangzhou, China. First, we implement the multi-channel architecture to capture time series attributes from different input streams. Then, the attention mechanism is applied to weight the fused feature sequences, which allows us to deal with the relationships between different input streams more appropriately. Our model fully considers the information in the dataset, solving the practical problem of influenza epidemic forecasting in Guangzhou in a targeted way. We assess the performance of our model by comparing it with different neural network structures and other state-of-the-art models. The experimental results indicate that our model is strongly competitive and can provide effective real-time influenza epidemic forecasting. To the best of our knowledge, this is the first study that applies LSTM neural networks to influenza outbreak forecasting. Continuing work will further improve the extensibility of our model by introducing transfer learning.
Fig. 5 The results of one-week-ahead prediction by the four individual models. a shows the comparison of Att-MCLSTM and real data; b shows the comparison of MCLSTM and real data; c shows the comparison of LSTM and real data; d shows the comparison of traditional RNN and real data. In each figure, the blue line denotes the actual values, and the orange line denotes the predicted values

Abbreviations
ILI: Influenza-like illness; LSTM: Long short-term memory; LASSO: Least absolute shrinkage and selection operator; MAPE: Mean absolute percentage error

Acknowledgements
We thank the reviewers for their valuable comments on improving the quality of this work.
About this supplement
This article has been published as part of BMC Bioinformatics Volume 20 Supplement 18, 2019: Selected articles from the Biological Ontologies and Knowledgebases workshop 2018. The full contents of the supplement are available online at https://bmcbioinformatics.biomedcentral.com/articles/supplements/volume-20-supplement-18.

Authors' contributions
XZ and BF contributed equally to the algorithm design and theoretical analysis. YY, YM, JH, SC, SL, TL, SL, WG, and ZL contributed equally to the quality control and document reviewing. All authors read and approved the final manuscript.

Funding
Publication costs are funded by The National Natural Science Foundation of China (Grant No. U1836214), the Tianjin Development Program for Innovation and Entrepreneurship, and the Special Program of Artificial Intelligence of Tianjin Municipal Science and Technology Commission (No. 17ZXRGGX00150).

Availability of data and materials
All data information analyzed during this study is included in this article.

Ethics approval and consent to participate
Not applicable.

Consent for publication
Not applicable.

Competing interests
The authors declare that they have no competing interests.

Author details
1College of Intelligence and Computing, Tianjin University, Peiyang Park Campus: No. 135 Yaguan Road, Haihe Education Park, 300350 Tianjin, China. 2Automotive Data Center, China Automotive Technology & Research, 300300 Tianjin, China. 3Guangzhou Center for Disease Control and Prevention, 510440 Guangzhou, China. 4Pony Testing International Group, 300051 Tianjin, China. 5Tianjin Food Safety Inspection Technology Institute, 300300 Tianjin, China.
Published: 25 November 2019

References
1. Yang S, Santillana M, Kou SC. Accurate estimation of influenza epidemics using Google search data via ARGO. Proc Natl Acad Sci. 2015;112(47):14473–8.
2. Brownstein JS, Mandl KD. Reengineering real time outbreak detection systems for influenza epidemic monitoring. In: AMIA Annual Symposium Proceedings, vol. 2006. American Medical Informatics Association; 2006. p. 866.
3. Organization WH, et al. WHO interim global epidemiological surveillance standards for influenza. 2012;1–61.
4. Santillana M, Zhang DW, Althouse BM, Ayers JW. What can digital disease detection learn from (an external revision to) Google Flu Trends? Am J Prev Med. 2014;47(3):341–7.
5. Achrekar H, Gandhe A, Lazarus R, Yu S-H, Liu B. Predicting flu trends using Twitter data. In: Computer Communications Workshops (INFOCOM WKSHPS), 2011 IEEE Conference On. IEEE; 2011. p. 702–7. https://doi.org/10.1109/infcomw.2011.5928903.
6. Broniatowski DA, Paul MJ, Dredze M. National and local influenza surveillance through Twitter: an analysis of the 2012-2013 influenza epidemic. PLoS ONE. 2013;8(12):83672.
7. Santillana M, Nsoesie EO, Mekaru SR, Scales D, Brownstein JS. Using clinicians' search query data to monitor influenza epidemics. Clin Infect Dis Off Publ Infect Dis Soc Am. 2014;59(10):1446.
8. Xu Q, Gel YR, Ramirez LLR, Nezafati K, Zhang Q, Tsui K-L. Forecasting influenza in Hong Kong with Google search queries and statistical model fusion. PLoS ONE. 2017;12(5):0176690.
9. Hu H, Wang H, Wang F, Langley D, Avram A, Liu M. Prediction of influenza-like illness based on the improved artificial tree algorithm and artificial neural network. Sci Rep. 2018;8(1):4895.
10. Hinton GE, Salakhutdinov RR. Reducing the dimensionality of data with neural networks. Science. 2006;313(5786):504–7.
11. Zou B, Lampos V, Gorton R, Cox IJ. On infectious intestinal disease surveillance using social media content. In: Proceedings of the 6th International Conference on Digital Health Conference. ACM; 2016. p. 157–61. https://doi.org/10.1145/2896338.2896372.
12. Huang W, Song G, Hong H, Xie K. Deep architecture for traffic flow prediction: deep belief networks with multitask learning. IEEE Trans Intell Transp Syst. 2014;15(5):2191–201.
13. How DNT, Loo CK, Sahari KSM. Behavior recognition for humanoid robots using long short-term memory. Int J Adv Robot Syst. 2016;13(6):1729881416663369.
14. Yang Y, Hao J, Sun M, Wang Z, Fan C, Strbac G. Recurrent deep multiagent Q-learning for autonomous brokers in smart grid. In: IJCAI, vol. 18; 2018. p. 569–75. https://doi.org/10.24963/ijcai.2018/79.
15. Yang Y, Hao J, Wang Z, Sun M, Strbac G. Recurrent deep multiagent Q-learning for autonomous agents in future smart grid. In: Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems. International Foundation for Autonomous Agents and Multiagent Systems; 2018. p. 2136–8. https://doi.org/10.24963/ijcai.2018/79.
16. Shafie-Khah M, Moghaddam MP, Sheikh-El-Eslami M. Price forecasting of day-ahead electricity markets using a hybrid forecast method. Energy Convers Manag. 2011;52(5):2165–9.
17. Xiaotian H, Weixun W, Jianye H, Yaodong Y. Independent generative adversarial self-imitation learning in cooperative multiagent systems. In: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems. International Foundation for Autonomous Agents and Multiagent Systems; 2019. p. 1315–23.
18. Yaodong Y, Jianye H, Yan Z, Xiaotian H, Bofeng F. Large-scale home energy management using entropy-based collective multiagent reinforcement learning framework. In: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems; 2019. https://doi.org/10.24963/ijcai.2019/89.
19. Hongyao T, Jianye H, Tangjie Lv, Yingfeng C, Zongzhang Z, Hangtian J, Chunxu R, Yan Z, Changjie F, Li W. Hierarchical deep multiagent reinforcement learning with temporal abstraction. arXiv preprint arXiv:1809.09332; 2018.
20. Peng J, Guan J, Shang X. Predicting Parkinson's disease genes based on node2vec and autoencoder. Front Genet. 2019;10:226.
21. Peng J, Zhu L, Wang Y, Chen J. Mining relationships among multiple entities in biological networks. IEEE/ACM Trans Comput Biol Bioinforma. 2019. https://doi.org/10.1109/tcbb.2019.2904965.
22. Peng J, Xue H, Shao Y, Shang X, Wang Y, Chen J. A novel method to measure the semantic similarity of HPO terms. IJDMB. 2017;17(2):173–88.
23. Cheng L, Hu Y, Sun J, Zhou M, Jiang Q. DincRNA: a comprehensive web-based bioinformatics toolkit for exploring disease associations and ncRNA function. Bioinformatics. 2018;34(11):1953–6.
24. Cheng L, Wang P, Tian R, Wang S, Guo Q, Luo M, Zhou W, Liu G, Jiang H, Jiang Q. LncRNA2Target v2.0: a comprehensive database for target genes of lncRNAs in human and mouse. Nucleic Acids Res. 2018;47(D1):140–4.
25. Hu Y, Zhao T, Zang T, Zhang Y, Cheng L. Identification of Alzheimer's disease-related genes based on data integration method. Front Genet. 2018;9. https://doi.org/10.3389/fgene.2018.00703.
26. Peng J, Hui W, Li Q, Chen B, Jiang Q, Wei Z, Shang X. A learning-based framework for miRNA-disease association prediction using neural networks. bioRxiv. 2018;276048. https://doi.org/10.1101/276048.
27. Panda SK, Jana PK. Efficient task scheduling algorithms for heterogeneous multi-cloud environment. J Supercomput. 2015;71(4):1505–33.
28. Murtagh F, Starck J-L, Renaud O. On neuro-wavelet modeling. Dec Support Syst. 2004;37(4):475–84.
29. Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput. 1997;9(8):1735–80.
30. Palangi H, Deng L, Shen Y, Gao J, He X, Chen J, Song X, Ward R. Deep sentence embedding using long short-term memory networks: analysis and application to information retrieval. IEEE/ACM Trans Audio Speech Lang Process (TASLP). 2016;24(4):694–707.
31. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser, Polosukhin I. Attention is all you need. In: Advances in Neural Information Processing Systems; 2017. p. 5998–6008.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
