ICIC Express Letters, ICIC International ©2010, ISSN 1881-803X
Volume 4, Number 5, October 2010, pp. 1-9

An Unconstrained Optimization Method Based on BP Neural Network

Fulin Wang*, Huixia Zhu, Jiquan Wang
College of Engineering, Northeast Agricultural University, Harbin 150030, China
Email: fulinwang@yahoo.com.cn

ABSTRACT.
An unconstrained optimization method based on the back propagation (BP) neural network is proposed in this paper. The method is mainly applied to solving black box problems: when a BP neural network is used for fitting, the method adjusts the input values of the network so as to locate the maximum and minimum output values. The application of BP neural networks is thereby expanded by combining the network's fitting and optimization together. In addition, the research provides a new way to solve black box optimization problems and sets up a platform for further study of constrained optimization problems based on BP neural networks. A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar Sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the unconstrained optimization method is given, the partial derivative equations of the BP neural network's output with respect to its input are derived, and an algorithm for the unconstrained optimization method is proposed. The model is validated by demonstration calculations and the results show that the algorithm is an effective method.
Keywords: BP neural network; Unconstrained; Model; Optimization method

1. Introduction.
Back propagation (BP) neural networks are one of the important research fields in intelligent control and intelligent automation [1,2]. A BP neural network is composed of many simple parallel processing modules, which are similar to biological neurons. A neural network is a nonlinear dynamic system characterized by distributed information storage and parallel cooperative processing [3]. The structure of a single neuron is simple and its function is limited, but a network system containing a large number of neurons has rich functionality and can be used in many applications. The BP neural network model, one of the most important artificial neural network models, is a multi-layer feed-forward neural network and is the most widely studied and used at present. Theory has proved that a three-layer BP neural network with enough hidden-layer nodes can approximate any complex nonlinear mapping [4-7], which indicates that BP neural networks fit rather easily.

The optimization of BP neural networks has been studied in the literature. Much work has been done on learning rate, weights, thresholds and network structure optimization in order to overcome problems of BP neural networks such as fluctuation, oscillation, slow fitting speed and unbounded network structure [8-15]. In practice, however, people are concerned not only with the fitting effect of a neural network but also with how to reach the maximum or minimum output value by adjusting the input values. The existing literature on BP neural network optimization is mainly about identifying the relation between network input and output to obtain the required output values [16-19]. These studies should actually be regarded as simulation rather than optimization, because the selected optimal scheme is based on simulation results. This is the starting point of this paper, which investigates an optimization method based on the BP neural network itself.

An unconstrained optimization method based on the BP neural network is presented in this research to solve black box problems: experimental or observed data are available without knowing the functional relation between input and output, and the output is optimized by adjusting the input values. When a BP neural network is used for fitting, the method adjusts the input values of the network so as to locate the maximum and minimum output values. The fitting and optimization of the BP neural network are thus combined. This combination expands the application domain of BP neural networks, solves black box optimization problems, and sets up a platform for further study of constrained optimization problems based on BP neural networks; constrained optimization problems will be discussed in further studies.

This paper is divided into five parts. The first part discusses the aim and significance of the research and the state of the art. The second part describes the BP neural network's structure and its algorithm. The unconstrained optimization method based on the BP neural network is given in the third part. An illustration of the method follows in the fourth part. Finally, the conclusions of the research are presented.
2. BP neural network structure and its algorithm.

2.1. BP neural network structure. The BP neural network is a multi-layer feed-forward network; the three-layer network structure is usually used [20]. Its structure is shown in Figure 1.
Figure 1. Three-layer BP network structure

2.2. BP neural network algorithm.
(1) Forward propagation process. The input signal, starting from the input layer, passes through the hidden layer units and is transmitted to the output layer, where the output signal is generated. If the output signal meets the given output requirement, the calculation is terminated; if it does not, the calculation turns to error signal back-propagation.
The forward propagation process is calculated as follows. Suppose the input layer has q+1 input signals, any one of which is indexed by i; the hidden layer has p+1 neurons, any one of which is indexed by j; and the output layer has o output neurons, any one of which is indexed by k. The weights between the input layer and the hidden layer are denoted v_ij (i=0,1,2,…,q; j=1,2,…,p), where v_0j is the hidden layer threshold; the weights between the hidden layer and the output layer are denoted u_jk (j=0,1,2,…,p; k=1,2,…,o), where u_0k is the output layer threshold. Let the hidden layer input be net_j (j=1,2,…,p), the hidden layer output be y_j (j=1,2,…,p), the input of the output layer be net_k (k=1,2,…,o), and the output of the output layer be z_k (k=1,2,…,o). Suppose the training sample set is X=[X_1, X_2, …, X_r, …, X_n], where any training sample is X_r=[x_r0, x_r1, x_r2, …, x_rq] (r=1,2,…,n; x_r0=-1). The actual output and the desired output are z_r=[z_r1, z_r2, …, z_ro]^T and d_r=[d_r1, d_r2, …, d_ro]^T respectively. Let m be the iteration number; the weights and the actual output are then functions of m. For the signal forward propagation process, with X_r as the network input sample, we have

    net_j = Σ_{i=0..q} v_ij x_ri                          (1)
    y_j = f(net_j)                                        (2)
    net_k = Σ_{j=0..p} u_jk y_j,  with y_0 = -1           (3)
    z_k = f(net_k)                                        (4)

From the above equations, the k-th output neuron's error signal is e_k = d_k - z_k, and the k-th neuron's error energy is defined as (1/2)e_k^2. The sum of the error energy over all neurons of the output layer is

    E = (1/2) Σ_{k=1..o} e_k^2                            (5)

If E ≤ e (where e denotes the expected calculating accuracy), the calculation is finished; otherwise, back-propagation computing is carried out.
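For concreteness, the following is a minimal NumPy sketch of the forward pass, equations (1)-(5). The vectorized shapes, the convention of folding the thresholds into row 0 of each weight matrix with a fixed -1 input, and the use of the unipolar Sigmoid of equation (12) below as f are our reading of the definitions above, not code from the paper.

import numpy as np

def sigmoid(x):
    # unipolar Sigmoid transfer function, equation (12)
    return 1.0 / (1.0 + np.exp(-x))

def forward(V, U, x):
    # V: (q+1, p) input-to-hidden weights, row 0 holding the thresholds v_0j
    # U: (p+1, o) hidden-to-output weights, row 0 holding the thresholds u_0k
    # x: (q,) one input sample
    xb = np.concatenate(([-1.0], x))   # prepend x_0 = -1
    y = sigmoid(xb @ V)                # equations (1)-(2)
    yb = np.concatenate(([-1.0], y))   # prepend y_0 = -1
    z = sigmoid(yb @ U)                # equations (3)-(4)
    return y, z

def error_energy(z, d):
    return 0.5 * np.sum((d - z) ** 2)  # equation (5)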
(2) Error back-propagation process. The error signal is the margin between the actual network output and the desired output. Starting from the output layer, the error signal propagates backward layer by layer; this is the back-propagation of the error signal. During error back-propagation, the network weights are adjusted by error feedback; through repeated modification of the weights, the actual network output gradually approaches the desired output. The error back-propagation process is calculated as follows:

    δ_k = e_k z_k (1 - z_k)                               (6)
    u_jk(m+1) = u_jk(m) + η δ_k y_j                       (7)
    δ_j = y_j (1 - y_j) Σ_{k=1..o} δ_k u_jk               (8)
    v_ij(m+1) = v_ij(m) + η δ_j x_ri                      (9)

where η is the learning rate, a given constant. After the new weights of the various layers are calculated, the calculation turns back to the forward propagation process.
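Continuing the sketch, one weight update per equations (6)-(9); the in-place update and the default learning rate are illustrative assumptions.

def train_step(V, U, x, d, eta=0.5):
    # one forward/backward pass for the sample (x, d); V and U are updated in place
    xb = np.concatenate(([-1.0], x))
    y, z = forward(V, U, x)
    yb = np.concatenate(([-1.0], y))
    e = d - z                                    # error signal e_k
    delta_k = e * z * (1.0 - z)                  # equation (6)
    delta_j = y * (1.0 - y) * (U[1:] @ delta_k)  # equation (8), pre-update weights
    U += eta * np.outer(yb, delta_k)             # equation (7)
    V += eta * np.outer(xb, delta_j)             # equation (9)
    return error_energy(z, d)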
3. The unconstrained optimization method based on BP neural network.

3.1. Mathematical model. For convenience, maximizing the network output is used as the example to illustrate the problem in this paper. Let F(X) be the relation between input and output; the mathematical model of unconstrained optimization based on the BP neural network can then be expressed as

    max Z = F(X)                                          (10)

where X is the input vector, X = (x_1, x_2, …, x_q)^T, and Z is the output of the BP neural network.
3.2. Basic ideas. The gradient of the output at the point X(0) is calculated first, based on an artificially or randomly selected initial point X(0). If the gradient at X(0) is not 0, it must be possible to find a new point X(1) better than X(0) in the direction of the gradient at X(0), and the gradient at X(1) can then be computed. If the gradient at X(1) is not 0, a new point X(2) better than X(1) is calculated in the direction of the gradient at X(1). This process continues until the gradient becomes 0 or a better point cannot be found (at this point, the product of the gradient and the step size is less than or equal to the computer's threshold value). The X value at this point is the optimal input and the corresponding network output is the optimal output.
3.3. The partial derivatives of the network output with respect to the input. The gradient vector of the function F(X) is

    ∇F(X) = (∂F/∂x_1, ∂F/∂x_2, …, ∂F/∂x_q)^T              (11)

Thus, as long as the partial derivatives of F(X) are calculated, the gradient of F(X) can be obtained. The following is the procedure to derive the partial derivatives of the BP neural network output with respect to its input. The BP neural network transfer function is generally the unipolar Sigmoid function

    f(x) = 1 / (1 + e^(-x))                               (12)

Take this transfer function as the example to derive the partial derivatives of the network output with respect to the input. Since the derivative of f(x) is

    f'(x) = f(x) (1 - f(x))                               (13)

we have, by the chain rule (k=1,2,…,o; i=1,2,…,q; j=1,2,…,p),

    ∂z_k/∂x_i = Σ_{j=1..p} (∂z_k/∂net_k)(∂net_k/∂y_j)(∂y_j/∂net_j)(∂net_j/∂x_i)   (14)
    ∂z_k/∂net_k = z_k (1 - z_k)                           (15)
    ∂net_k/∂y_j = u_jk                                    (16)
    ∂y_j/∂net_j = y_j (1 - y_j)                           (17)
    ∂net_j/∂x_i = v_ij                                    (18)

So

    ∂z_k/∂x_i = z_k (1 - z_k) Σ_{j=1..p} u_jk y_j (1 - y_j) v_ij   (19)

If we let

    A_k = z_k (1 - z_k)                                   (20)
    B_j = y_j (1 - y_j)                                   (21)

then we have

    ∂z_k/∂x_i = A_k Σ_{j=1..p} u_jk B_j v_ij              (22)
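A minimal sketch of equations (20)-(22), reusing forward above; the function name output_gradient is ours. Stacking the result over i gives the gradient vector of equation (11).

def output_gradient(V, U, x, k=0):
    # dz_k/dx_i for i = 1..q, equation (22)
    y, z = forward(V, U, x)
    A_k = z[k] * (1.0 - z[k])               # equation (20)
    B = y * (1.0 - y)                       # equation (21)
    return A_k * (V[1:] @ (U[1:, k] * B))   # shape (q,)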
3.4. The unconstrained optimization method. Let X(0) be an artificially or randomly selected initial point and let X(m) be the point obtained at the m-th iteration. The gradient at X(m) is calculated componentwise from equation (22):

    ∇F(X(m)) = (∂F/∂x_1, …, ∂F/∂x_q)^T evaluated at X = X(m)   (23)

If X(m) meets the iteration termination condition given by

    α ‖∇F(X(m))‖ ≤ ε                                      (24)

where α is the step factor (α > 0) and ε is the pre-specified precision value, then the optimal solution is

    X* = X(m)                                             (25)

and the corresponding Z* is the optimal value.
If X(m) does not satisfy equation (24), then let

    X(m+1) = X(m) + α ∇F(X(m))                            (26)
    Z(m+1) = F(X(m+1))                                    (27)

The adjustment of X(m+1) is described in the following two cases.

Case 1: X(m+1) is not superior to X(m). Let

    α = α/2                                               (28)

According to equations (26) and (27), a new X(m+1) value is obtained by recalculating; it is then judged whether X(m+1) is superior to X(m), namely whether it satisfies

    F(X(m+1)) > F(X(m))                                   (29)

If X(m+1) satisfies equation (29), the next iteration begins. If X(m+1) does not satisfy equation (29), equation (28) is used to reduce the step size again and equations (26) and (27) are used to recalculate a new X(m+1), which is checked against equation (29). These steps are repeated until the newly calculated X(m+1) is superior to X(m), and this iteration comes to an end.

Case 2: X(m+1) is superior to X(m). Let

    α = 2α                                                (30)

and a new X(m+1) value is obtained by recalculating with equations (26) and (27); it is then determined whether this X(m+1) is superior to the previous one, i.e. whether it satisfies equation (29). If not, the previous X(m+1) is retained,

    X(m+1) = X(m+1)_previous                              (31)

the step size is reduced by half at the same time, and this iteration is done. If the newly calculated X(m+1) satisfies equation (29), the X(m+1) value is assigned to X(m) by

    X(m) = X(m+1)                                         (32)

At this point the step size continues to be increased using equation (30) and X(m+1) is recalculated by equations (26) and (27) until an X(m+1) that is not superior to X(m) is obtained; X(m) is then replaced by the last superior X(m+1), the step size is reduced by half at the same time, and this iteration is over.

After completion of each iteration, it has to be judged whether the result meets the iteration termination condition, i.e. whether it satisfies equation (24). If it satisfies equation (24), then let

    X* = X(m+1)                                           (33)

where X* is the optimal solution and the corresponding Z* is the required optimal value. If it does not satisfy equation (24), the next iteration begins, until the X(m) values satisfy equation (24) and the optimization is over.
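The following is a minimal sketch of the whole procedure of Section 3.4 under our reading of equations (23)-(33): the doubling/halving step-size schedule follows the text, while the function names, the initial step alpha0 = 1.0 and the iteration cap are illustrative assumptions.

def net_output(V, U, x):
    # scalar network output z = F(X) for a single-output network
    return forward(V, U, x)[1][0]

def maximize_output(V, U, x0, alpha0=1.0, eps=1e-8, max_iter=10000):
    # gradient ascent with an inherited step size between iterations
    x = np.asarray(x0, dtype=float)
    alpha = alpha0
    for _ in range(max_iter):
        g = output_gradient(V, U, x)                  # equation (23)
        if alpha * np.linalg.norm(g) <= eps:          # termination, equation (24)
            break
        x_new = x + alpha * g                         # equations (26)-(27)
        if net_output(V, U, x_new) > net_output(V, U, x):   # case 2: superior
            while True:
                alpha *= 2.0                          # equation (30)
                x_try = x + alpha * g
                if net_output(V, U, x_try) > net_output(V, U, x_new):
                    x_new = x_try                     # equation (32)
                else:
                    alpha /= 2.0                      # retain previous point, equation (31)
                    break
        else:                                         # case 1: not superior
            while net_output(V, U, x_new) <= net_output(V, U, x):
                alpha /= 2.0                          # equation (28)
                if alpha * np.linalg.norm(g) <= eps:
                    return x                          # no better point can be found
                x_new = x + alpha * g                 # retry equations (26)-(27)
        x = x_new
    return x                                          # X*, equations (25) and (33)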
4. Demonstration calculation. Because the optimal value of a black box problem is unknown, it is difficult to validate the accuracy and stability of the optimization method. To overcome this problem, we select two known functions for discretization and then use the discretized samples to train the BP neural network. The optimal value of the network output is obtained from the fitted network, so that the optimized results can be compared with the theoretical optimal values.

4.1. Example 1. Let equation (34) be the known function

    (34)

whose theoretical maximum value is max F(X) = F(57.625, 51.136, 1) = 2045.412. In this example, the BP neural network is used to fit the function, and the maximum of the fitted network output is then sought. The first step is the discretization of the function F(X). Six equally spaced points are selected for each of x_1, x_2 and x_3 in the ranges 30 to 80, 25 to 75 and -10 to 15 respectively, a total of 216 points. The corresponding values of F(X) are calculated from equation (34) and the BP neural network is used to fit the function. At this step, the network structure is 3-25-1. When the network reaches the pre-specified precision e = 10^-4, the network weights and thresholds are kept. Under this condition, the average relative error of the network fitting is 0.002854%.
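The following is a minimal sketch of the Example 1 set-up under the assumptions of the earlier sketches. Equation (34) itself is not recoverable from this text, so F34 below is a hypothetical stand-in with the same maximizer, not the paper's function; the 6x6x6 grid, the variable ranges and the 3-25-1 structure follow the text, and the target scaling for the Sigmoid output layer is our practical assumption.

import numpy as np
from itertools import product

def F34(x1, x2, x3):
    # hypothetical stand-in for the paper's equation (34); chosen only so that
    # the maximum sits at (57.625, 51.136, 1) with value 2045.412
    return 2045.412 - (x1 - 57.625)**2 - (x2 - 51.136)**2 - (x3 - 1.0)**2

grid = np.array(list(product(np.linspace(30, 80, 6),
                             np.linspace(25, 75, 6),
                             np.linspace(-10, 15, 6))))   # 216 sample points
targets = np.array([F34(*p) for p in grid])
t = 0.1 + 0.8 * (targets - targets.min()) / (targets.max() - targets.min())
# targets scaled into the Sigmoid's open output range; a 3-25-1 network
# (V: 4x25, U: 26x1) would now be trained with train_step over (grid, t)
# until E <= 1e-4, after which maximize_output is applied to the fitted
# weights and the result mapped back to the original scale.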
Next, max F(X) is calculated using the optimization method given in this paper. Table 1 shows the same max F values calculated from ten different initial points at ε = 0. In Table 1, maxF_avg (the last column) is the average value of the optimized results over the 10 runs, and β is the stability indicator for measuring the optimized results, here taken as the ratio of each optimized value to their mean, equation (35):
    β_i = maxF_i / maxF_avg                               (35)

Table 1. Calculation results of F(X) values

Num.    1         2         3         4         5         6         7         8         9         10        Average
x1(0)   30        40        50        60        70        80        30        40        50        80
x2(0)   25        35        45        55        65        75        45        65        75        35
x3(0)   -10       -5        0         5         10        15        5         10        0         10
x1      57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697
x2      51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167
x3      1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000
maxF    2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329
β_i     1         1         1         1         1         1         1         1         1         1

4.2. Example 2.
Let equation (36) be the known function

    F(x) = x^2 - 20x + 205                                (36)

whose theoretical minimum value is min F(x) = F(10) = 105. The following is the process of fitting the function and seeking the minimum network output with the BP neural network. First, the minimization of F(x) is turned into a maximization using equation (37),

    min F(x) = -max[-F(x)]                                (37)

Then the function F(x) is discretized: 81 points are taken in the range 0 to 20 at intervals of 0.25. The corresponding values of -F(x) are calculated according to equation (37) and the BP neural network is used to fit them. At this step, the network structure is 1-10-1. When the network reaches the pre-specified precision e = 10^-5, the network weights and thresholds are kept; at this point, the average relative error of the network fitting is 0.1004%. Then min F(x) can be calculated using the optimization method given in this paper.
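Under the same assumptions as the earlier sketches, the conversion of equation (37) amounts to maximizing the network fitted to -F; the wrapper name minimize_output is illustrative, not from the paper.

def minimize_output(V, U, x0, **kw):
    # (V, U) is assumed trained on (scaled) targets -F(x) per equation (37);
    # maximizing the fitted surrogate then locates the minimizer of F
    x_star = maximize_output(V, U, x0, **kw)
    return x_star, -net_output(V, U, x_star)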
Table 2 shows the results calculated from ten different initial points at ε = 0. In Table 2, minF_avg (the last column) is the average value of the optimization results over the 10 runs.
Table 2. Calculation results of F(x)

Num.   1        2        3        4        5        6        7        8        9        10       Average
X(0)   0        2.25     4.75     7.25     9.75     12.25    14.75    17.25    19.75    20
x      9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971
minF   105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003
β_i    1        1        1        1        1        1        1        1        1        1

Table 1 and Table 2 show that the results of the optimization method are very stable and that the optimal values of F are very close to the theoretical optimal values.
In Example 1, the average relative errors of x_1, x_2 and x_3 are 0.125%, 0.061% and 0% respectively. The maximum function value found by the BP neural network optimization is also very close to the theoretical maximum, with a relative error of only 0.00406%. In Example 2, the average relative error of x is 0.29%, and the optimal value found by the optimization is likewise very close to the theoretical optimum, with a relative error of only 0.00286%. In addition, these errors also include the fitting errors. The results indicate that the accuracy of the network optimization is relatively high.
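As a quick check of the quoted figures (our arithmetic from the table values): |57.697 - 57.625| / 57.625 ≈ 0.125%, |51.167 - 51.136| / 51.136 ≈ 0.061%, |2045.412 - 2045.329| / 2045.412 ≈ 0.00406%, |10 - 9.971| / 10 = 0.29%, and |105.003 - 105| / 105 ≈ 0.00286%.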
5. Conclusion.

(1) An unconstrained optimization method based on the BP neural network is proposed, which is effective for solving black box problems.

(2) When a BP neural network is used for fitting, the method adjusts the input values of the network so as to locate the maximum and minimum output values. The application of BP neural networks is thereby expanded by combining the network's fitting and optimization together. In addition, the research provides a new way to solve black box optimization problems and sets up a platform for further study of constrained optimization problems based on BP neural networks.

(3) A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar Sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the unconstrained optimization method is given and the partial derivative equations of the BP neural network's output with respect to its input are derived.

(4) The step size of the optimization method presented in this paper is inherited between iterations, which accelerates the optimization.

(5) The model is validated by demonstration calculations and the results show that the algorithm is an effective method.
Acknowledgment. The research is supported by the National Natural Science Foundation of China and the National High Technology Research and Development Program of China (Grant Nos. 31071331, 2006AA10A310-1).

REFERENCES

[1] YU Wei-ping, PENG Yi-gong. Intelligent control technology research [C]. Proceedings of the 8th Conference on Industrial Instrumentation and Automation, 2007, 415-418.
[2] LI Shu-rong, YANG Qing, GUO Shu-hui. Neural network based adaptive control for a class of nonaffine nonlinear systems [J]. Systems Science and Mathematics, 2007, 27(2): 161-169.
[3] CHEN Ming-jie, NI Jin-ren, CHA Ke-mai, et al. Application of genetic algorithm-based artificial neural networks in 2D tidal flow simulation [J]. Journal of Hydraulic Engineering, 2003, (10): 1-12.
[4] ZHOU Ling, SUN Jun, YUAN Yu-bo. Effects of combined activation function on BP algorithm's convergence speed [J]. Journal of Hohai University, 1999, 27(5): 107-108.
[5] TANG Wan-mei. The study of the optimal structure of BP neural network [J]. Systems Engineering Theory & Practice, 2005, (10): 95-100.
[6] Funahashi K. On the approximate realization of continuous mappings by neural networks [J]. Neural Networks, 1989, 2(7): 183-192.
[7] Hecht-Nielson R. Theory of the backpropagation neural networks [M]. Washington D.C.: Proceedings of IEEE International Joint Conference on Neural Networks, 1989.
[8] Zhang Y., Wu L. Weights optimization of neural network via improved BCO approach [J]. Progress In Electromagnetics Research, PIER 83, 185-198, 2008.
[9] WANG Wen-jian. The optimization of BP neural networks [J]. Computer Engineering and Design, 2000, 21(6): 8-10.
[10] ZHANG Shan, HE Jiannong. Research on optimized algorithm for BP neural networks [J]. Computer and Modernization, 2009, (1): 73-80.
[11] Xing Hihua, Lin Hngyan, Chen Huandong, et al. Sensitivity analysis of BP neural network optimized by genetic algorithm and its applications to feature reduction [J]. International Review on Computers and Software, 2012, 7(6): 3084-3089.
[12] Chunsheng Dong, Liu Dong, Mingming Yang. The application of the BP neural network in the nonlinear optimization [J]. Advances in Intelligent and Soft Computing, 2010, 78: 727-732.
[13] Shifei Ding, Chunyang Su, Junzhao Yu. An optimizing BP neural network algorithm based on genetic algorithm [J]. Artificial Intelligence Review, 2011, 36(2): 153-162.
[14] Li Song, Liu Lijun, Zhai Man. Prediction for short-term traffic flow based on modified PSO optimized BP neural network [J]. Systems Engineering - Theory & Practice, 2012, 39(9): 2045-2049.
[15] Xing Hihua, Lin Hngyan. An intelligent method optimizing BP neural network model [C]. 2nd International Conference on Materials and Products Manufacturing Technology, ICMPMT 2012, 2012, 2470-2474.
[16] Merad L., Bendimerad F. T., Meriah S. M., et al. Neural networks for synthesis and optimization of antenna arrays [J]. Radioengineering Journal, 2007, 16(1): 23-30.
[17] Gulati T., Chakrabarti M., Singh A., et al. Comparative study of response surface methodology, artificial neural network and genetic algorithms for optimization of soybean hydration [J]. Food Technol Biotechnol, 2010, 1(48): 11-18.
[18] WANG Xin-min, ZHAO Bin, WANG Xian-lai. Optimization of drilling and blasting parameters based on back-propagation neural network [J]. Journal of Central South University (Natural Science), 2009, 40(5): 1411-1416.
[19] LIU Lei. Index tracking optimization method based on genetic neural network [J]. Systems Engineering Theory & Practice, 2010, 30(1): 22-29.
[20] HAN Li-qun. Artificial Neural Network Tutorial [M]. Beijing University of Posts and Telecommunications Press, 2006, 12.