ISSN: 0974-7435  Volume 10 Issue 24
BioTechnology: An Indian Journal
FULL PAPER  BTAIJ, 10(24), 2014 [16338-16346]

Application research of decision tree algorithm in English grade analysis

Zhao Kun
Beihua University, Teacher's College, Jilin, (CHINA)

ABSTRACT
This paper introduces and analyses data mining in the management of students' grades. We use the decision tree in the analysis of grades and investigate the attribute selection measure, including data cleaning. We take the course scores of an institute of English language as an example and produce a decision tree with the ID3 algorithm, giving the detailed calculation process. Because the original algorithm lacks a termination condition, we propose an improved algorithm which can help us find the latent factors that affect the grades.

KEYWORDS
Decision tree algorithm; English grade analysis; ID3 algorithm; Classification.
INTRODUCTION
With the rapid development of higher education, English grade analysis, as an important guarantee of scientific management, constitutes a main part of English educational assessment. Research on the application of data mining in the management of students' grades addresses how to extract useful hidden information from large amounts of data by combining data mining with grade management [1-5]. This paper introduces and analyses data mining in the management of students' grades. It uses the decision tree in the analysis of grades. It describes the function, status and deficiencies of the management of students' grades. It tells us how to employ the decision tree in the management of students' grades. It improves the ID3 algorithm to analyze the students' grades so that we can find the latent factors that affect the grades. If we find these factors, we can offer decision-making information to teachers. This also advances the quality of teaching [6-10]. English grade analysis helps teachers to improve teaching quality and provides decision support for school leaders.
The decision tree-based classification model is widely used because of its unique advantages. Firstly, the structure of the decision tree method is simple and it generates rules that are easy to understand. Secondly, the high efficiency of the decision tree model makes it well suited to training sets with a large amount of data. Furthermore, the computation required by the decision tree algorithm is relatively small. The decision tree method usually does not require prior knowledge of the training data and is good at handling non-numeric data. Finally, the decision tree method has high classification accuracy; it identifies the common characteristics of the objects in a collection and classifies them in accordance with the classification model.
The original decision tree algorithm uses a top-down recursive approach [11-12]. Attribute values are compared at the internal nodes of the decision tree, and according to the different attribute values we judge which branch to follow down from the node. The conclusion is obtained at a leaf node of the decision tree. Therefore, a path from the root to a leaf node corresponds to a conjunctive rule, and the entire decision tree corresponds to a set of disjunctive rules.
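To illustrate this correspondence, the following small Python sketch (a made-up example, not the tree built later in this paper; the attribute names and tree shape are chosen only for illustration) enumerates every root-to-leaf path of a toy tree and prints it as a conjunctive rule.

# Each root-to-leaf path of a decision tree is a conjunctive rule;
# the whole tree is the disjunction of these rules.
# The toy tree below is a hypothetical example, not the tree derived in this paper.
toy_tree = {
    "attribute": "paper difficulty",
    "branches": {
        "high": {"leaf": "medium"},
        "medium": {
            "attribute": "whether re-learning",
            "branches": {"yes": {"leaf": "medium"}, "no": {"leaf": "outstanding"}},
        },
        "low": {"leaf": "general"},
    },
}

def extract_rules(node, conditions=()):
    """Yield (conjunctive conditions, class label) for every root-to-leaf path."""
    if "leaf" in node:
        yield conditions, node["leaf"]
        return
    for value, child in node["branches"].items():
        yield from extract_rules(child, conditions + ((node["attribute"], value),))

for conds, label in extract_rules(toy_tree):
    print("IF " + " AND ".join(f"{a} = {v}" for a, v in conds) + f" THEN score = {label}")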
The decision tree generation algorithm is divided into two steps [13-15]. The first step is the generation of the tree: at the beginning all the data is in the root node, and the data is then partitioned recursively. The second step is tree pruning, which removes some of the noisy or abnormal data. The decision tree stops splitting a node when the data at the node belongs to the same category or there are no attributes left with which to split the data.
In the next section, we introduce the construction of the decision tree. In Section 3 we introduce the attribute selection measure. In Section 4, we carry out empirical research based on the ID3 algorithm and propose an improved algorithm. In Section 5 we conclude the paper and give some remarks.
CONSTRUCTION OF DECISION TREE USING ID3
The growing step of the decision tree is shown in Figure 1. The decision tree generation algorithm is described as follows. The name of the algorithm is Generate_decision_tree, which produces a decision tree from the given training data. The input is the training samples, represented by discrete-valued attributes, together with the candidate attribute set attribute_list. The output is a decision tree.
Step 1. Create a node N. If all samples are in the same class C, then return N as a leaf node and label it with C.
Step 2. If attribute_list is empty, then return N as a leaf node and label it with the most common class in samples.
Step 3. Choose the test_attribute with the highest information gain in attribute_list, and label N with test_attribute.
Step 4. For each value a_i of test_attribute, perform the following operations.
Step 5. Node N grows a branch that meets the condition test_attribute = a_i.
Step 6. Suppose s_i is the set of samples with test_attribute = a_i. If s_i is empty, then attach a leaf and label it with the most common class; otherwise attach the node returned by Generate_decision_tree(s_i, attribute_list - test_attribute).
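The following Python sketch is one possible reading of these steps; it is not the author's implementation, and the record format (a list of (attribute dict, class label, count) tuples, mirroring the statistical-data column of TABLE 2) is an assumption made for illustration.

import math
from collections import Counter, defaultdict

def class_counts(records):
    """Count samples per class; each record is (attributes: dict, label: str, count: int)."""
    counts = Counter()
    for _attrs, label, count in records:
        counts[label] += count
    return counts

def expected_info(counts):
    """Formula (1): I(s1, ..., sm) = -sum_i p_i * log2(p_i)."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c)

def info_gain(records, attribute):
    """Formula (4): Gain(A) = I(s1, ..., sm) - E(A)."""
    total = sum(count for _, _, count in records)
    subsets = defaultdict(list)
    for rec in records:
        subsets[rec[0][attribute]].append(rec)
    e_a = sum(sum(c for _, _, c in sub) / total * expected_info(class_counts(sub))
              for sub in subsets.values())
    return expected_info(class_counts(records)) - e_a

def generate_decision_tree(records, attribute_list):
    """Steps 1-6 of the ID3 construction described above."""
    counts = class_counts(records)
    if len(counts) == 1:                                   # Step 1: all samples in one class C
        return {"leaf": next(iter(counts))}
    if not attribute_list:                                 # Step 2: no attributes left
        return {"leaf": counts.most_common(1)[0][0]}
    test_attribute = max(attribute_list, key=lambda a: info_gain(records, a))   # Step 3
    node = {"attribute": test_attribute, "branches": {}}
    # Steps 4-6: one branch per observed value; since branches come from observed
    # values, the empty-subset case of Step 6 does not arise here.
    for value in {rec[0][test_attribute] for rec in records}:
        subset = [rec for rec in records if rec[0][test_attribute] == value]
        remaining = [a for a in attribute_list if a != test_attribute]
        node["branches"][value] = generate_decision_tree(subset, remaining)
    return node

# Tiny usage example with two invented records.
print(generate_decision_tree(
    [({"paper difficulty": "high"}, "medium", 3), ({"paper difficulty": "low"}, "general", 2)],
    ["paper difficulty"]))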
Figure 1: Growing step of the decision tree

AN IMPROVED ALGORITHM
Attribute selection measure
Suppose S is a set of s data samples and the class label attribute has m distinct values defining m classes C_i (i = 1, 2, ..., m). Suppose s_i is the number of samples of class C_i in S. The expected information needed to classify a given sample is given by formula (1) [11-12]:

I(s_1, s_2, \ldots, s_m) = -\sum_{i=1}^{m} p_i \log_2 p_i    (1)

where p_i is the probability that a random sample belongs to C_i and is estimated by s_i / s. Suppose attribute A has V different values {a_1, a_2, ..., a_V}. Attribute A can be used to partition S into V subsets {S_1, S_2, ..., S_V}. Suppose s_ij is the number of samples of class C_i in subset S_j. The expected information (entropy) of the partition induced by A is given by formula (2):

E(A) = \sum_{j=1}^{V} \frac{s_{1j} + s_{2j} + \cdots + s_{mj}}{s} I(s_{1j}, s_{2j}, \ldots, s_{mj})    (2)

where (s_{1j} + s_{2j} + ... + s_{mj}) / s is the weight of the j-th subset. For a given subset S_j, formula (3) holds [13]:

I(s_{1j}, s_{2j}, \ldots, s_{mj}) = -\sum_{i=1}^{m} p_{ij} \log_2 p_{ij}    (3)

where p_ij = s_ij / |S_j| is the probability that a sample of S_j belongs to class C_i. If we branch on A, the information gain is given by formula (4) [14]:

Gain(A) = I(s_1, s_2, \ldots, s_m) - E(A)    (4)
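As a quick sanity check of formulas (1)-(4) on a toy split that is not taken from the paper's data: suppose 6 samples (4 in C_1, 2 in C_2) are split by an attribute A into two pure subsets of sizes 4 and 2. Then

I(4, 2) = -(4/6)\log_2(4/6) - (2/6)\log_2(2/6) \approx 0.918
E(A) = (4/6) I(4, 0) + (2/6) I(0, 2) = 0
Gain(A) = I(4, 2) - E(A) \approx 0.918

so a split that produces pure subsets recovers the full expected information.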
The improved algorithm
The improved algorithm is as follows.
Function Generate_decision_tree(training samples, candidate attribute set attribute_list)
{
Create node N;
If samples are all in the same class C then return N as a leaf node and label it with C; record the statistical data meeting the conditions on the leaf node;
If attribute_list is empty then return N as a leaf node and label it with the most common class of samples; record the statistical data meeting the conditions on the leaf node;
Suppose GainMax = max(Gain_1, Gain_2, ..., Gain_n);
If GainMax = 0 then return N as a leaf node and label it with the most common class of samples; otherwise split on the attribute whose gain is GainMax and recurse as in the original algorithm;
}

Data cleaning is performed on the grade table ks before classification:
Update ks set ci_pj='outstanding' where ci_pj>='85'
Update ks set ci_pj='medium' where ci_pj>='75' and ci_pj<'85'
Update ks set ci_pj='general' where ci_pj>='60' and ci_pj<'75'
Update ks set sjnd='high' where sjnd='1'
Update ks set sjnd='medium' where sjnd='2'
Update ks set sjnd='low' where sjnd='3'

Result of ID3 algorithm
TABLE 2 is the training set of student test score information after data cleaning.
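The same cleaning step can be sketched in Python; the column names ci_pj and sjnd and the cut-offs 85/75/60 below simply mirror the reconstructed Update statements above and should be read as assumptions about the original table ks rather than as the paper's exact schema.

# Discretize a raw score and a paper-difficulty code the way the Update statements do.
# Column names and thresholds are assumptions mirroring the statements above.
def clean_record(record):
    score = record["ci_pj"]                       # raw percentage score
    if score >= 85:
        record["ci_pj"] = "outstanding"
    elif score >= 75:
        record["ci_pj"] = "medium"
    elif score >= 60:
        record["ci_pj"] = "general"
    # scores below 60 are left unchanged, as in the Update statements
    record["sjnd"] = {"1": "high", "2": "medium", "3": "low"}[record["sjnd"]]
    return record

print(clean_record({"ci_pj": 92, "sjnd": "2"}))   # {'ci_pj': 'outstanding', 'sjnd': 'medium'}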
We classify the samples into three categories: C_1 = "outstanding", C_2 = "medium", C_3 = "general", with s_1 = 300, s_2 = 1950, s_3 = 880 and s = 3130. According to formula (1), we obtain
I(s_1, s_2, s_3) = I(300, 1950, 880) = -(300/3130)log_2(300/3130) - (1950/3130)log_2(1950/3130) - (880/3130)log_2(880/3130) = 1.256003.
The entropy of every attribute is calculated as follows.
Firstly, calculate whether re-learning. For "yes": s_11 = 210, s_21 = 950, s_31 = 580, and
I(s_11, s_21, s_31) = I(210, 950, 580) = -(210/1740)log_2(210/1740) - (950/1740)log_2(950/1740) - (580/1740)log_2(580/1740) = 1.074901.
For "no": s_12 = 90, s_22 = 1000, s_32 = 300, and
I(s_12, s_22, s_32) = I(90, 1000, 300) = -(90/1390)log_2(90/1390) - (1000/1390)log_2(1000/1390) - (300/1390)log_2(300/1390) = 1.373186.
If the samples are classified according to whether re-learning, the expected information is
E(whether re-learning) = (1740/3130) I(s_11, s_21, s_31) + (1390/3130) I(s_12, s_22, s_32) = 0.555911 × 1.074901 + 0.444089 × 1.373186 = 1.240721.
So the information gain is
Gain(whether re-learning) = I(s_1, s_2, s_3) - E(whether re-learning) = 1.256003 - 1.240721 = 0.015282.
Secondly, calculate course type. For course type A: s_11 = 110, s_21 = 200, s_31 = 580, and
I(110, 200, 580) = -(110/890)log_2(110/890) - (200/890)log_2(200/890) - (580/890)log_2(580/890) = 1.259382.
For course type B: s_12 = 100, s_22 = 400, s_32 = 0, and
I(100, 400, 0) = -(100/500)log_2(100/500) - (400/500)log_2(400/500) - 0 = 0.721928.
For course type C: s_13 = 0, s_23 = 550, s_33 = 0, and
I(0, 550, 0) = -0 - (550/550)log_2(550/550) - 0 = 0.
For course type D: s_14 = 90, s_24 = 800, s_34 = 300, and
I(90, 800, 300) = -(90/1190)log_2(90/1190) - (800/1190)log_2(800/1190) - (300/1190)log_2(300/1190) = 1.168009.
E(course type) = (890/3130) I(s_11, s_21, s_31) + (500/3130) I(s_12, s_22, s_32) + (550/3130) I(s_13, s_23, s_33) + (1190/3130) I(s_14, s_24, s_34) = 0.91749.
Gain(course type) = 1.256003 - 0.91749 = 0.338513.
Thirdly, calculate paper difficulty. For "high": s_11 = 110, s_21 = 900, s_31 = 280, and
I(110, 900, 280) = -(110/1290)log_2(110/1290) - (900/1290)log_2(900/1290) - (280/1290)log_2(280/1290) = 1.14385.
For "medium": s_12 = 190, s_22 = 700, s_32 = 300, and
I(190, 700, 300) = -(190/1190)log_2(190/1190) - (700/1190)log_2(700/1190) - (300/1190)log_2(300/1190) = 1.374086.
For "low": s_13 = 0, s_23 = 350, s_33 = 300, and
I(0, 350, 300) = -0 - (350/650)log_2(350/650) - (300/650)log_2(300/650) = 0.995727.
E(paper difficulty) = (1290/3130) I(s_11, s_21, s_31) + (1190/3130) I(s_12, s_22, s_32) + (650/3130) I(s_13, s_23, s_33) = 1.200512.
Gain(paper difficulty) = 1.256003 - 1.200512 = 0.055491.
Fourthly, calculate whether required course. For "yes": s_11 = 210, s_21 = 850, s_31 = 600, and
I(210, 850, 600) = -(210/1660)log_2(210/1660) - (850/1660)log_2(850/1660) - (600/1660)log_2(600/1660) = 1.220681.
For "no": s_12 = 90, s_22 = 1100, s_32 = 280, and
I(90, 1100, 280) = -(90/1470)log_2(90/1470) - (1100/1470)log_2(1100/1470) - (280/1470)log_2(280/1470) = 1.015442.
E(whether required) = (1660/3130) I(s_11, s_21, s_31) + (1470/3130) I(s_12, s_22, s_32) = 1.220681.
Gain(whether required) = 1.256003 - 1.220681 = 0.035322.
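For readers who want to retrace this arithmetic, the short script below applies formulas (1)-(4) to the per-subset class counts listed above and prints the resulting information gain for each attribute; it is a recomputation sketch, not code from the paper.

import math

def expected_information(counts):
    """Formulas (1)/(3): I = -sum p * log2(p) over the non-zero class counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

def gain(total_counts, subset_counts):
    """Formulas (2) and (4): Gain(A) = I(total) - sum_j (|S_j| / s) * I(S_j)."""
    s = sum(total_counts)
    e_a = sum(sum(sub) / s * expected_information(sub) for sub in subset_counts)
    return expected_information(total_counts) - e_a

total = (300, 1950, 880)      # (outstanding, medium, general)
attributes = {
    "whether re-learning": [(210, 950, 580), (90, 1000, 300)],
    "course type":         [(110, 200, 580), (100, 400, 0), (0, 550, 0), (90, 800, 300)],
    "paper difficulty":    [(110, 900, 280), (190, 700, 300), (0, 350, 300)],
    "whether required":    [(210, 850, 600), (90, 1100, 280)],
}
for name, subsets in attributes.items():
    print(f"Gain({name}) = {gain(total, subsets):.6f}")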
TABLE 2: Training set of student test scores

Course type | Whether re-learning | Paper difficulty | Whether required | Score       | Statistical data
D           | no                  | medium           | no               | outstanding | 90
B           | yes                 | medium           | yes              | outstanding | 100
A           | yes                 | high             | yes              | medium      | 200
D           | no                  | low              | no               | medium      | 350
C           | yes                 | medium           | yes              | general     | 300
A           | yes                 | high             | no               | medium      | 250
B           | no                  | high             | no               | medium      | 300
A           | yes                 | high             | yes              | outstanding | 110
D           | yes                 | medium           | yes              | medium      | 500
D           | no                  | low              | yes              | general     | 300
A           | yes                 | high             | no               | general     | 280
B           | no                  | high             | yes              | medium      | 150
C           | no                  | medium           | no               | medium      | 200

Result of improved algorithm
The original algorithm lacks a termination condition.
There are only two records left for a subtree to be classified, which is shown in TABLE 3.

TABLE 3: Special case for classification of the subtree

Course type | Whether re-learning | Paper difficulty | Whether required | Score   | Statistical data
A           | no                  | high             | yes              | medium  | 15
A           | no                  | high             | yes              | general | 20

For these records all the calculated gains are 0.00, so GainMax = 0.00, which does not conform to the recursive termination condition of the original algorithm. The tree obtained is therefore not reasonable, so we adopt the improved algorithm; the decision tree produced by the improved algorithm is shown in Figure 2.

Figure 2: Decision tree using improved algorithm
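The situation in TABLE 3 can be checked mechanically: both records share the same value for every attribute, so any candidate split produces a single subset and the information gain is exactly zero, and under the reconstructed improved rule the node becomes a leaf labelled with the most frequent class. The snippet below demonstrates this; the tuple record format is the same illustrative assumption used earlier.

import math
from collections import Counter, defaultdict

records = [  # the two TABLE 3 records: (attributes, class, statistical data)
    ({"course type": "A", "whether re-learning": "no", "paper difficulty": "high", "whether required": "yes"}, "medium", 15),
    ({"course type": "A", "whether re-learning": "no", "paper difficulty": "high", "whether required": "yes"}, "general", 20),
]

def entropy(recs):
    counts = Counter()
    for _, label, n in recs:
        counts[label] += n
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values() if c)

def gain(recs, attribute):
    total = sum(n for _, _, n in recs)
    subsets = defaultdict(list)
    for rec in recs:
        subsets[rec[0][attribute]].append(rec)
    e_a = sum(sum(n for _, _, n in sub) / total * entropy(sub) for sub in subsets.values())
    return entropy(recs) - e_a

for attribute in records[0][0]:
    print(attribute, round(gain(records, attribute), 2))      # every gain prints 0.0

majority = Counter()
for _, label, n in records:
    majority[label] += n
print("leaf label under the improved rule:", majority.most_common(1)[0][0])   # 'general'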
CONCLUSIONS
In this paper we study the construction of the decision tree and the attribute selection measure. Because the original algorithm lacks a termination condition, we propose an improved algorithm. We take the course scores of an institute of English language as an example, and with the improved algorithm we can find the latent factors that affect the grades.
REFERENCES
[1] Xuelei Xu, Chunwei Lou; "Applying Decision Tree Algorithms in English Vocabulary Test Item Selection", IJACT: International Journal of Advancements in Computing Technology, 4(4), 165-173 (2012).
[2] Huawei Zhang; "Lazy Decision Tree Method for Distributed Privacy Preserving Data Mining", IJACT: International Journal of Advancements in Computing Technology, 4(14), 458-465 (2012).
[3] Xin-hua Zhu, Jin-ling Zhang, Jiang-tao Lu; "An Education Decision Support System Based on Data Mining Technology", JDCTA: International Journal of Digital Content Technology and its Applications, 6(23), 354-363 (2012).
[4] Zhen Liu, Xian Feng Yang; "An application model of fuzzy clustering analysis and decision tree algorithms in building web mining", JDCTA: International Journal of Digital Content Technology and its Applications, 6(23), 492-500 (2012).
[5] Guang-xian Ji; "The research of decision tree learning algorithm in technology of data mining classification", JCIT: Journal of Convergence Information Technology, 7(10), 216-223 (2012).
[6] Fuxian Huang; "Research of an Algorithm for Generating Cost-Sensitive Decision Tree Based on Attribute Significance", JDCTA: International Journal of Digital Content Technology and its Applications, 6(12), 308-316 (2012).
[7] M. Sudheep Elayidom, Sumam Mary Idikkula, Joseph Alexander; "Design and Performance analysis of Data mining techniques Based on Decision trees and Naive Bayes classifier For", JCIT: Journal of Convergence Information Technology, 6(5), 89-98 (2011).
[8] Marjan Bahrololum, Elham Salahi, Mahmoud Khaleghi; "An Improved Intrusion Detection Technique based on two Strategies Using Decision Tree and Neural Network", JCIT: Journal of Convergence Information Technology, 4(4), 96-101 (2009).
[9] Bor-tyng Wang, Tian-Wei Sheu, Jung-Chin Liang, Jian-Wei Tzeng, Nagai Masatake; "The Study of Soft Computing on the Field of English Education: Applying Grey S-P Chart in English Writing Assessment", JDCTA: International Journal of Digital Content Technology and its Applications, 5(9), 379-388 (2011).
[10] Mohamad Farhan Mohamad Mohsin, Mohd Helmy Abd Wahab, Mohd Fairuz Zaiyadi, Cik Fazilah Hibadullah; "An Investigation into Influence Factor of Student Programming Grade Using Association Rule Mining", AISS: Advances in Information Sciences and Service Sciences, 2(2), 19-27 (2010).
[11] Hao Xin; "Assessment and Analysis of Hierarchical and Progressive Bilingual English Education Based on Neuro-Fuzzy approach", AISS: Advances in Information Sciences and Service Sciences, 5(1), 269-276 (2013).
[12] Hong-chao Chen, Jin-ling Zhang, Ya-qiong Deng; "Application of Mixed-Weighted-Association-Rules-Based Data Mining Technology in College Examination Grades Analysis", JDCTA: International Journal of Digital Content Technology and its Applications, 6(10), 336-344 (2012).
[13] Yuan Wang, Lan Zheng; "Endocrine Hormones Association Rules Mining Based on Improved Apriori Algorithm", JCIT: Journal of Convergence Information Technology, 7(7), 72-82 (2012).
[14] Tian Bai, Jinchao Ji, Zhe Wang, Chunguang Zhou; "Application of a Global Categorical Data Clustering Method in Medical Data Analysis", AISS: Advances in Information Sciences and Service Sciences, 4(7), 182-190 (2012).
[15] Hong Yan Mei, Yan Wang, Jun Zhou; "Decision Rules Extraction Based on Necessary and Sufficient Strength and Classification Algorithm", AISS: Advances in Information Sciences and Service Sciences, 4(14), 441-449 (2012).
[16] Liu Yong; "The Building of Data Mining Systems based on Transaction Data Mining Language using Java", JDCTA: International Journal of Digital Content Technology and its Applications, 6(14), 298-305 (2012).