ICIC Express Letters, ICIC International, ISSN 1881-803X, Volume 4, Number 5, October 2010, pp. 1-9

An Unconstrained Optimization Method Based on BP Neural Network

Fulin Wang*, Huixia Zhu, Jiquan Wang
College of Engineering, Northeast Agricultural University, Harbin 150030, China
Email: fulinwang@yahoo.com.cn

ABSTRACT.
An unconstrained optimization method based on back propagation (BP) neural network is proposed in this paper. The method is mainly applied to solving black-box problems. Once a BP neural network has been fitted, the method adjusts the network's input values so as to locate the maximum or minimum output value. The application domain of BP neural networks is thereby expanded by combining the network's fitting and optimization capabilities. In addition, the research provides a new way to solve black-box optimization problems and sets up a platform for further study of constrained optimization problems based on BP neural networks. A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the unconstrained optimization method is given, the partial derivatives of the BP neural network's output with respect to its input are derived, and an algorithm implementing the method is proposed. The model is validated by sample calculations, and the results show that the algorithm is effective.

Keywords: BP neural network; unconstrained; model; optimization method

1. Introduction.
Back propagation (BP) neural networks are one of the important research fields in intelligent control and intelligent automation [1,2]. A BP neural network is composed of many simple parallel processing modules, which are similar to the neurons of a biological nervous system. A neural network is a nonlinear dynamic system characterized by distributed information storage and parallel cooperative processing [3]. The structure of a single neuron is simple and its function is limited; however, a network system containing a large number of neurons has rich functionality and can be used in many applications. The BP neural network model, one of the most important artificial neural network models, is a multi-layer feedforward network and is at present the most widely studied and used. Theory has proved that a three-layer BP neural network with enough hidden-layer nodes can approximate any complex nonlinear mapping [4-7], which indicates that BP neural networks fit rather easily.

Research on the optimization of BP neural networks is recorded in the literature. Much work has been done on learning rate, weights, thresholds, and network structure optimization in order to solve problems of BP neural networks such as fluctuation, oscillation, slow fitting speed, and unbounded network structure [8-15]. In practical situations, however, people are concerned not only with the fitting quality of a neural network but also with how to achieve the maximum or minimum output value by adjusting the input values. The existing literature on BP neural network optimization is mainly about identifying the relation between network input and output in order to obtain required output values [16-19]. Such studies should really be regarded as simulation rather than optimization, because the selected optimum scheme is drawn from the simulation results. The starting point of this paper is therefore an optimization method based on a true BP neural network.

An unconstrained optimization method is presented in this research to solve the black-box problem based on a BP neural network: the network is trained on experimental or observed data, without knowing the functional relation between input and output, and the output is then optimized by adjusting the input values. Once the BP neural network has been fitted, the method adjusts the network's input values to locate the maximum or minimum output value, so the fitting and optimization of the BP neural network are combined. This combination expands the application domain of BP neural networks and solves black-box optimization problems, and a platform is also set up for further study of constrained optimization problems based on BP neural networks, which will be discussed in future work.

This paper is divided into five parts. The first part discusses the aim and significance of the research and the state of the art. The second part describes the BP neural network's structure and algorithm. The unconstrained optimization method based on BP neural network is given in the third part, an illustration of the method is given in the fourth part, and the achievements of the research are summarized in the last part.
2. BP Neural Network Structure and Its Algorithm.

2.1. BP neural network structure. A BP neural network is a multi-layer feedforward network; a three-layer network structure is usually used [20], as shown in Figure 1.

Figure 1. Three-layer BP network structure

2.2. BP neural network algorithm.
(1) Forward propagation process. The input signal, starting from the input layer, passes through the hidden-layer units and is transmitted to the output layer, where the output signal is generated. If the output signal meets the given output requirement, the calculation terminates; otherwise, the calculation turns to error signal back-propagation. The forward propagation process is calculated as follows.

Suppose the input layer has q+1 input signals, any one of which is denoted by i; the hidden layer has p+1 neurons, any one denoted by j; and the output layer has o output neurons, any one denoted by k. The weights between the input layer and the hidden layer are denoted v_{ij} (i = 0, 1, 2, …, q; j = 1, 2, …, p), where v_{0j} is the hidden-layer threshold; the weights between the hidden layer and the output layer are denoted u_{jk} (j = 0, 1, 2, …, p; k = 1, 2, …, o), where u_{0k} is the output-layer threshold. Let the hidden-layer input be net_j (j = 1, 2, …, p), the hidden-layer output be y_j (j = 1, 2, …, p), the output-layer input be net_k (k = 1, 2, …, o), and the output-layer output be z_k (k = 1, 2, …, o). Suppose the training sample set is X = [X_1, X_2, …, X_r, …, X_n], where any training sample is X_r = [x_{r0}, x_{r1}, x_{r2}, …, x_{rq}] (r = 1, 2, …, n; x_{r0} = -1). The actual output and the desired output are z_r = [z_{r1}, z_{r2}, …, z_{ro}]^T and d_r = [d_{r1}, d_{r2}, …, d_{ro}]^T respectively. Letting m be the iteration number, the weights and the actual output are functions of m.

For the signal forward propagation process, letting X_r be the network input training sample, we have:

net_j = \sum_{i=0}^{q} v_{ij} x_{ri},  j = 1, 2, …, p    (1)

y_j = f(net_j),  j = 1, 2, …, p    (2)

net_k = \sum_{j=0}^{p} u_{jk} y_j,  k = 1, 2, …, o  (with y_0 = -1)    (3)

z_k = f(net_k),  k = 1, 2, …, o    (4)

From the above equations, the k-th output neuron's error signal is e_k = d_{rk} - z_{rk}, and the k-th neuron's error energy is defined as e_k^2 / 2. The sum of the error energy over all neurons of the output layer is:

E = (1/2) \sum_{k=1}^{o} e_k^2    (5)

If E ≤ e (where e expresses the expected calculation accuracy), the calculation is finished; otherwise, the back-propagation computation is carried out.
(2) Error back-propagation process. The error signal is the difference between the actual network output and the desired output. Starting from the output layer, the error signal propagates backward layer by layer; this is the back-propagation of the error signal. During this process, the network weights are adjusted by error feedback, and through repeated modification of the weights the actual network output gradually approaches the desired output. The error back-propagation process is calculated as follows:

δ_k = e_k f'(net_k) = e_k z_k (1 - z_k)    (6)

u_{jk}(m+1) = u_{jk}(m) + η δ_k y_j    (7)

δ_j = f'(net_j) \sum_{k=1}^{o} δ_k u_{jk} = y_j (1 - y_j) \sum_{k=1}^{o} δ_k u_{jk}    (8)

v_{ij}(m+1) = v_{ij}(m) + η δ_j x_{ri}    (9)

where η is the learning rate, a given constant. After the new weights of the various layers are calculated, the calculation returns to the forward propagation process.
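One weight update by error back-propagation might be sketched as follows. This assumes the same weight layout as the forward pass above; `eta` stands for the learning rate η, and the function name is an illustrative assumption.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_step(x, d, V, U, eta=0.5):
    """One weight update by error back-propagation (sketch of eqs. (6)-(9)).

    x, d : one training input and its desired output
    V, U : weights with threshold rows at index 0, as in the forward pass
    Returns updated copies of V and U.
    """
    # forward pass, eqs. (1)-(4)
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)
    y_aug = np.concatenate(([-1.0], y))
    z = sigmoid(y_aug @ U)

    e = d - z                                        # output error signal
    delta_k = e * z * (1.0 - z)                      # output local gradient, eq. (6)
    delta_j = y * (1.0 - y) * (U[1:, :] @ delta_k)   # hidden local gradient, eq. (8)

    U_new = U + eta * np.outer(y_aug, delta_k)       # eq. (7)
    V_new = V + eta * np.outer(x_aug, delta_j)       # eq. (9)
    return V_new, U_new
```

Repeating this update on a sample drives the error energy E of equation (5) downward, which is the loop described above.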
3. The Unconstrained Optimization Method Based on BP Neural Network.

3.1. Mathematical model. Since min F(X) = -max[-F(X)], maximizing the network output is used as the example in this paper for convenience of illustration. Letting F(X) be the relation between input and output, a mathematical model of unconstrained optimization based on BP neural network can be expressed as follows:

max Z = F(X)    (10)

where X is the input vector, X = (x_1, x_2, …, x_q)^T, and Z is the output of the BP neural network.
3.2. Basic idea. The gradient of the output at an initial point X(0), selected artificially or at random, is calculated first. If the gradient at X(0) is not 0, it must be possible to find a new point X(1) better than X(0) in the direction of the gradient at X(0), and the gradient at X(1) can then be calculated. If the gradient at X(1) is not 0, a new point X(2) better than X(1) is found in the direction of the gradient at X(1). This process continues until the gradient is 0 or a better point cannot be found (at that stage, the product of the gradient and the step size is less than or equal to the computational threshold). At this point, the X value is the optimal input and the corresponding network output is the optimal output.
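The iteration just described is plain gradient ascent over the network input. A minimal sketch, under assumptions: `grad` stands in for the gradient of Section 3.3, and a fixed step factor `alpha` is used here (the paper's adaptive step rule appears in Section 3.4).

```python
import numpy as np

def gradient_ascent(grad, x0, alpha=0.1, eps=1e-8, max_iter=10_000):
    """Gradient ascent over the network input, as in Section 3.2.

    grad : function returning the gradient of the output w.r.t. X
    x0   : initial point X(0), chosen by hand or at random
    Stops when the product of step size and gradient norm drops below eps.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if alpha * np.linalg.norm(g) <= eps:   # "better point cannot be found"
            break
        x = x + alpha * g                      # move uphill along the gradient
    return x
```

For a concave objective such as -(x - 3)^2, whose gradient is -2(x - 3), the iterate converges to the maximizer x = 3.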
3.3. The partial derivatives of the network output with respect to the input. The gradient vector of the function F(X) is

∇F(X) = (∂F/∂x_1, ∂F/∂x_2, …, ∂F/∂x_q)^T    (11)

Thus, as long as the partial derivatives of F(X) are calculated, the gradient of F(X) can be obtained. The following is the procedure to derive the partial derivatives of the BP neural network output with respect to its input. The BP neural network transfer function is generally the unipolar sigmoid function

f(x) = 1 / (1 + e^{-x})    (12)

Taking this transfer function as the example, the partial derivatives of the network output with respect to the input are derived as follows. The derivative of f(x) is

f'(x) = f(x)(1 - f(x))    (13)

For k = 1, 2, …, o; i = 1, 2, …, q; j = 1, 2, …, p, the chain rule gives

∂z_k/∂net_k = z_k(1 - z_k)    (14)

∂net_k/∂y_j = u_{jk}    (15)

∂y_j/∂net_j = y_j(1 - y_j)    (16)

∂net_j/∂x_i = v_{ij}    (17)

∂z_k/∂x_i = \sum_{j=1}^{p} (∂z_k/∂net_k)(∂net_k/∂y_j)(∂y_j/∂net_j)(∂net_j/∂x_i)    (18)

So

∂z_k/∂x_i = z_k(1 - z_k) \sum_{j=1}^{p} u_{jk} y_j(1 - y_j) v_{ij}    (19)

If we let

a_k = z_k(1 - z_k)    (20)

b_j = y_j(1 - y_j)    (21)

then we have

∂z_k/∂x_i = a_k \sum_{j=1}^{p} u_{jk} b_j v_{ij}    (22)
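The chain-rule result of equation (22) translates directly into code. A sketch under the same assumed weight layout as the forward pass; the threshold rows are dropped because they do not depend on the inputs x_i.

```python
import numpy as np

def output_input_gradient(x, V, U, k=0):
    """Gradient of output z_k with respect to the input X (eq. (22)).

    With unipolar sigmoids in both layers,
    dz_k/dx_i = z_k (1 - z_k) * sum_j u_jk * y_j (1 - y_j) * v_ij.
    V : (q+1, p), U : (p+1, o), threshold weights in row 0.
    """
    x_aug = np.concatenate(([-1.0], x))
    y = 1.0 / (1.0 + np.exp(-(x_aug @ V)))
    y_aug = np.concatenate(([-1.0], y))
    z = 1.0 / (1.0 + np.exp(-(y_aug @ U)))
    # V[1:, :] and U[1:, k] drop the threshold rows, since thresholds
    # are constant with respect to the inputs x_i
    return z[k] * (1.0 - z[k]) * (V[1:, :] @ (U[1:, k] * y * (1.0 - y)))
```

A useful sanity check is to compare this analytic gradient against a central finite difference of the forward pass; the two agree to numerical precision.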
3.4. The unconstrained optimization method.
If X(0) is selected artificially or at random as the initial point, and X(m) is the point obtained at the m-th iteration, then the gradient at X(m) can be calculated by equation (23):

∇F(X(m)) = (∂F/∂x_1, ∂F/∂x_2, …, ∂F/∂x_q)^T |_{X = X(m)}    (23)

If X(m) meets the iteration termination condition given by

α ||∇F(X(m))|| ≤ ε    (24)

where α is the step factor (α > 0) and ε is the pre-specified precision value, then the optimal solution is

X* = X(m)    (25)

and the corresponding Z* is the optimal value. If equation (24) is not satisfied, then let

S(m) = ∇F(X(m))    (26)

X(m+1) = X(m) + α S(m)    (27)

The adjustment of X(m+1) is described in the following two cases.

Case 1: X(m+1) is not superior to X(m). Let

α = α / 2    (28)

According to equations (26) and (27), a new X(m+1) is obtained by recalculating; it is then judged whether X(m+1) is superior to X(m), namely whether it satisfies equation (29):

F(X(m+1)) > F(X(m))    (29)

If X(m+1) satisfies equation (29), the next iteration begins. If X(m+1) does not meet equation (29), equation (28) is used again to reduce the step size, equations (26) and (27) are used to recalculate a new X(m+1), and the test of equation (29) is repeated. These steps continue until the newly calculated X(m+1) is superior to X(m), and this iteration then comes to an end.

Case 2: X(m+1) is superior to X(m). Let

α = 2α    (30)

and a new X(m+1) is obtained by recalculating with equations (26) and (27); then determine whether X(m+1) still satisfies equation (29). If not, let

α = α / 2    (31)

that is, the step size is reduced by half, and this iteration is done. If the newly calculated X(m+1) satisfies equation (29), assign the X(m+1) value to X(m) using equation (32):

X(m) = X(m+1)    (32)

At this time, the step size continues to be increased using equation (30), and X(m+1) is recalculated by equations (26) and (27) until an X(m+1) that is not superior to X(m) is obtained; X(m+1) is then replaced by X(m), the step size is reduced by half, and this iteration is over.

After each iteration, it has to be judged whether the result meets the iteration termination condition, i.e., whether it satisfies equation (24). If it does, then let

X* = X(m+1)    (33)

where X* is the optimal solution and the corresponding Z* is the required optimal value. If it does not satisfy equation (24), the next iteration begins, continuing until X(m) satisfies equation (24); the optimization is then complete.
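The two-case step adjustment can be sketched as follows. This is a simplified reading of the procedure, not the authors' exact algorithm; `f` and `grad` are hypothetical stand-ins for the fitted network output and the input gradient of equation (22).

```python
import numpy as np

def optimize(f, grad, x0, alpha=1.0, eps=1e-8, max_iter=1000):
    """Gradient ascent with the halving/doubling step rule of Section 3.4.

    A failed trial step halves the step size (Case 1, eq. (28)); a
    successful one doubles it while the objective keeps improving
    (Case 2, eq. (30)).  The step size carries over between iterations,
    which is the "inheritance" the paper credits for its speed.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if alpha * np.linalg.norm(g) <= eps:      # termination, eq. (24)
            break
        x_new = x + alpha * g                     # trial step, eqs. (26)-(27)
        if f(x_new) > f(x):                       # Case 2: success, grow the step
            while True:
                alpha *= 2.0                      # eq. (30)
                x_try = x + alpha * g
                if f(x_try) > f(x_new):
                    x_new = x_try
                else:
                    alpha /= 2.0                  # keep the last good step, eq. (31)
                    break
        else:                                     # Case 1: failure, shrink the step
            while f(x_new) <= f(x):
                alpha /= 2.0                      # eq. (28)
                if alpha * np.linalg.norm(g) <= eps:
                    return x                      # no better point can be found
                x_new = x + alpha * g
        x = x_new
    return x
```

On a concave quadratic such as f(x) = -(x1 - 2)^2 - (x2 + 1)^2 the procedure converges to the maximizer (2, -1).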
4. Demonstration Calculation.

Because the optimal value of a black-box problem is unknown, it is difficult to validate the accuracy and stability of an optimization method. To overcome this problem, we select two known functions for discretization and then use the discretized samples to train a BP neural network. The optimal network output obtained after fitting can then be compared with the theoretical optimal value.

4.1. Example 1. Let the known function be given by equation (34):

(34)

The theoretical maximum value of this function is max F(X) = F(57.625, 51.136, 1) = 2045.412. In this example, a BP neural network is used to fit the function, and the maximum network output after fitting is then sought.

The first step is discretization of the function F(X). Six points with equal intervals are selected for x1, x2, and x3 in the ranges 30 to 80, 25 to 75, and -10 to 15 respectively, giving a total of 216 points. The corresponding values of F(X) are calculated by equation (34), and the BP neural network is then used to fit the function.
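The 216-point training grid can be built as follows. Only the sampling grid is shown, a sketch: the explicit form of equation (34) is not reproduced in this copy, so no function values are computed here.

```python
import numpy as np

# Six equally spaced values per variable over the stated ranges
x1 = np.linspace(30, 80, 6)    # 30, 40, ..., 80 (step 10)
x2 = np.linspace(25, 75, 6)    # 25, 35, ..., 75 (step 10)
x3 = np.linspace(-10, 15, 6)   # -10, -5, ..., 15 (step 5)

# Cartesian product: 6 * 6 * 6 = 216 training inputs, one row per sample
grid = np.array(np.meshgrid(x1, x2, x3, indexing="ij")).reshape(3, -1).T
```

Each row of `grid` is one input sample (x1, x2, x3); the corresponding F values would form the training targets.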
At this step, the network structure is 3-25-1. When the network reaches the pre-specified precision e = 10^-4, the network weights and thresholds are kept. Under this condition, the average relative error of the network fitting is 0.002854%.

Second, max F(X) is calculated using the optimization method given in this article. Table 1 shows that the same max F values are obtained from ten different initial points at ε = 0. In Table 1, the final row of results averages the optimized values over the 10 runs, and βi is the stability indicator for measuring the optimized results (equation (35)).

(35)

Table 1. Calculation results of F(X)

Num.        1     2     3     4     5     6     7     8     9     10
X(0): x1    30    40    50    60    70    80    30    40    50    80
X(0): x2    25    35    45    55    65    75    45    65    75    35
X(0): x3    -10   -5    0     5     10    15    5     10    0     10
x1          57.697 in every run
x2          51.167 in every run
x3          1.000 in every run
max F       2045.329 in every run (average 2045.329)
βi          1 in every run

4.2. Example 2.
Let the known function be given by equation (36):

F(x) = x^2 - 20x + 205    (36)

The minimum value of this function is min F(x) = F(10) = 105. The following is the process of fitting the function with a BP neural network and then seeking the minimum network output. First, the minimization of F(x) is turned into a maximization using equation (37):

min F(x) = -max[-F(x)]    (37)

Then the function F(x) is discretized: 81 points are taken in the range 0 to 20 at intervals of 0.25. The corresponding function values are calculated via equation (37), and the BP neural network is used to fit the function.
At this step, the network structure is 1-10-1. When the network meets the pre-specified precision e = 10^-5, the network weights and thresholds are kept; at this point, the average relative error of the network fitting is 0.1004%.

Then min F(x) is calculated using the optimization method given in this paper. Table 2 shows the results calculated from ten different initial points at ε = 0. In Table 2, the final row of results averages the optimized values over the 10 runs.

Table 2. Calculation results of F(x)

Num.    1     2      3      4      5      6      7      8      9      10
X(0)    0     2.25   4.75   7.25   9.75   12.25  14.75  17.25  19.75  20
x       9.971 in every run
min F   105.003 in every run (average 105.003)
βi      1 in every run

Table 1 and Table 2 show that the result of the optimization method is very stable and that the optimal value of F(X) is very close to the theoretical optimal value.
In Example 1, the average relative errors of x1, x2, and x3 are 0.125%, 0.061%, and 0% respectively. The maximum function value found by BP neural network optimization is also very close to the theoretical maximum; the relative error is only 0.00406%. In Example 2, the average relative error of x is 0.29%, and the optimized value is again very close to the theoretical optimum, with a relative error of only 0.00286%. Moreover, these errors also include the fitting errors. The results indicate that the accuracy of the network optimization is relatively high.
5. Conclusions.

(1) An unconstrained optimization method based on BP neural network is proposed, which is effective for solving black-box problems.

(2) Once a BP neural network has been fitted, the method adjusts the network's input values to locate the maximum or minimum output value, so the network's fitting and optimization are combined. This combination expands the application domain of BP neural networks. In addition, the research provides a new way to solve black-box optimization problems and sets up a platform for further study of constrained optimization problems based on BP neural networks.

(3) A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the unconstrained optimization method is given, and the partial derivatives of the BP neural network's output with respect to its input are derived.

(4) The step size of the optimization method presented in this paper is inherited between iterations, which accelerates the optimization.

(5) The model is validated by sample calculations, and the results show that the algorithm is effective.
Acknowledgment. The research is supported by the National Natural Science Foundation of China and the National High Technology Research and Development Program of China (Grant Nos. 31071331 and 2006AA10A310-1).

REFERENCES

[1] YU Wei-ping, PENG Yi-gong. Intelligent control technology research [C]. Proceedings of the Eighth Conference on Industrial Instrumentation and Automation, 2007, 415-418.
[2] LI Shu-rong, YANG Qing, GUO Shu-hui. Neural network based adaptive control for a class of nonaffine nonlinear systems [J]. Systems Science and Mathematics, 2007, 27(2): 161-169.
[3] CHEN Ming-jie, NI Jin-ren, CHA Ke-mai, et al. Application of genetic algorithm-based artificial neural networks in 2D tidal flow simulation [J]. Journal of Hydraulic Engineering, 2003, (10): 1-12.
[4] ZHOU Ling, SUN Jun, YUAN Yu-bo. Effects of combined activation function on BP algorithm's convergence speed [J]. Journal of Hohai University, 1999, 27(5): 107-108.
[5] TANG Wan-mei. The study of the optimal structure of BP neural network [J]. Systems Engineering Theory & Practice, 2005, (10): 95-100.
[6] Funahashi K. On the approximate realization of continuous mappings by neural networks [J]. Neural Networks, 1989, 2(7): 183-192.
[7] Hecht-Nielson R. Theory of the backpropagation neural network [C]. Proceedings of the IEEE International Joint Conference on Neural Networks, Washington D.C., 1989.
[8] Zhang Y., Wu L. Weights optimization of neural network via improved BCO approach [J]. Progress In Electromagnetics Research, PIER 83, 185-198, 2008.
[9] WANG Wen-jian. The optimization of BP neural networks [J]. Computer Engineering and Design, 2000, 21(6): 8-10.
[10] ZHANG Shan, HE Jiannong. Research on optimized algorithm for BP neural networks [J]. Computer and Modernization, 2009, (1): 73-80.
[11] Xing Hihua, Lin Hngyan, Chen Huandong, et al. Sensitivity analysis of BP neural network optimized by genetic algorithm and its applications to feature reduction [J]. International Review on Computers and Software, 2012, 7(6): 3084-3089.
[12] Chunsheng Dong, Liu Dong, Mingming Yang. The application of the BP neural network in the nonlinear optimization [J]. Advances in Intelligent and Soft Computing, 2010, 78: 727-732.
[13] Shifei Ding, Chunyang Su, Junzhao Yu. An optimizing BP neural network algorithm based on genetic algorithm [J]. Artificial Intelligence Review, 2011, 36(2): 153-162.
[14] Li Song, Liu Lijun, Zhai Man. Prediction for short-term traffic flow based on modified PSO optimized BP neural network [J]. Systems Engineering - Theory & Practice, 2012, 39(9): 2045-2049.
[15] Xing Hihua, Lin Hngyan. An intelligent method optimizing BP neural network model [C]. 2nd International Conference on Materials and Products Manufacturing Technology, ICMPMT 2012, 2012, 2470-2474.
[16] Merad L., Bendimerad F. T., Meriah S. M., et al. Neural networks for synthesis and optimization of antenna arrays [J]. Radioengineering Journal, 2007, 16(1): 23-30.
[17] Gulati T., Chakrabarti M., Singh A., et al. Comparative study of response surface methodology, artificial neural network and genetic algorithms for optimization of soybean hydration [J]. Food Technol Biotechnol, 2010, 1(48): 11-18.
[18] WANG Xin-min, ZHAO Bin, WANG Xian-lai. Optimization of drilling and blasting parameters based on back-propagation neural network [J]. Journal of Central South University (Natural Science), 2009, 40(5): 1411-1416.
[19] LIU Lei. Index tracking optimization method based on genetic neural network [J]. Systems Engineering Theory & Practice, 2010, 30(1): 22-29.
[20] HAN Li-qun. Artificial Neural Network Tutorial [M]. Beijing University of Posts and Telecommunications Press, 2006.
