This page lists all submissions to all tasks of SemEval-2012.

Task 1 English Lexical Simplification

Rank Participant System Score (Kappa) TRnk n=1 n=2 n=3 File Uploader
1 UOW SHEF-SimpLex 0.496 0.602 0.575 0.689 0.769 task1-WLV-SHEF-SimpLex.zip sujay89
2 UNT SimpRank-Simple 0.471 0.585 0.559 0.681 0.76 task1-UNT-SIMPRANK.zip ravisinha
2 Baseline Simple Freq. 0.471 0.585 0.559 0.681 0.76
2 ANNLOR simple 0.465 0.564 0.538 0.674 0.768 task1-annlor-simple.zip annlor
3 UNT SimpleRankL 0.449 0.567 0.541 0.674 0.753 task1-UNT-SIMPRANK.zip ravisinha
4 EMNLPCPH ORD1 0.405 0.539 0.513 0.645 0.727 TASK1_EMNLPCPH_ORD1.zip ajohannsen
5 EMNLPCPH ORD2 0.393 0.53 0.503 0.637 0.722 TASK1_EMNLPCPH_ORD2.zip ajohannsen
6 SB mmSystem 0.289 0.477 0.452 0.632 0.748 task1-SB-mmSystem.zip m_amoia
7 ANNLOR lmbing 0.199 0.336 0.316 0.494 0.647 task1-annlor-lmbing.zip annlor
8 Baseline L-Sub Gold 0.106 0.454 0.427 0.667 0.959
9 Baseline Random 0.013 0.34 0.321 0.612 0.825
10 UNT SaLSA -0.082 0.146 0.137 0.364 0.532 task1-UNT-SaLSA.zip ravisinha
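The Score (Kappa) column is a kappa-style agreement statistic between each system's ranking and the gold-standard ranking. As a generic illustration only (not the task's exact scorer), Cohen's kappa over two equal-length label sequences can be computed as:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(a) == len(b) and a
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa("aabbb", "aabbc"), 3))
```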




Task 2 Measuring Degrees of Relational Similarity

Participant System Spearman's ρ p<0.05 p<0.01 MaxDiff File Uploader
BUAP BUAP 0.014 2 0 31.7 task2-BUAP-RUN1.zip mireyatovar
UTD NB 0.229 22 16 39.4 task2-UTDRelSim-nb-Run2.zip bryan
UTD SVM 0.116 11 5 34.7 task2-UTDRelSim-svmrank-Run1.zip bryan
Duluth V0 0.05 9 3 32.4 task2-duluth-vector0.zip mireyatovar
Duluth V1 0.039 10 4 31.5 task2-duluth-vector1.zip mireyatovar
Duluth V2 0.038 7 3 31.1 task2-duluth-vector2.zip mireyatovar
Baselines Random 0.018 4 0 31.2    
Baselines PMI 0.112 15 7 33.9    
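Spearman's ρ is the rank correlation between the system scores and the gold relational-similarity ratings. A minimal implementation (without tie correction, so only a sketch of what the official scorer computes):

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation between two paired score lists (no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))  # squared rank differences
    return 1 - 6 * d2 / (n * (n * n - 1))

print(spearman_rho([1, 2, 3, 4], [1, 3, 2, 4]))  # 0.8
```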



Task 3 Spatial Role Labeling

Participant System Precision Recall F1 File Uploader
KUL SKIP-CHAIN-CRF 0.745 0.773 0.758    
UTDSPRL SUPERVISED2 0.773 0.679 0.723 task3-UTDSpRL-supervised2.zip kirk
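Precision, recall, and F1 here follow the standard definitions over predicted versus gold annotations. For reference:

```python
def prf1(tp, fp, fn):
    """Precision/recall/F1 from true-positive, false-positive, false-negative counts."""
    p = tp / (tp + fp) if tp + fp else 0.0   # fraction of predictions that are correct
    r = tp / (tp + fn) if tp + fn else 0.0   # fraction of gold items that were found
    f = 2 * p * r / (p + r) if p + r else 0.0  # harmonic mean of p and r
    return p, r, f

print(prf1(6, 2, 2))  # (0.75, 0.75, 0.75)
```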



Task 4 Evaluating Chinese Word Similarity

Rank Participant System Score (Tau) File Uploader
1 NJU MIXCC 0.05 task4-NJU-MIXCC.zip lib
2 NJU MIXCD 0.04 task4-NJU-MIXDC.zip lib
3 Gfp1987 Guo-ngram 0.007 task4-Guo-ngram.zip gfp1987
4 Gfp1987 Guo-words -0.011 task4-Guo-words.zip gfp1987
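Score (Tau) is Kendall's rank correlation against the gold similarity ordering. A naive O(n²) sketch without tie handling (tau-a), for illustration only:

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall's tau-a over paired scores: (concordant - discordant) / all pairs."""
    n = len(xs)
    conc = disc = 0
    for i, j in combinations(range(n), 2):
        s = (xs[i] - xs[j]) * (ys[i] - ys[j])
        if s > 0:
            conc += 1   # pair ordered the same way in both lists
        elif s < 0:
            disc += 1   # pair ordered oppositely
    return (conc - disc) / (n * (n - 1) / 2)

print(round(kendall_tau([1, 2, 3, 4], [1, 3, 2, 4]), 3))
```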



Task 5 Chinese Semantic Dependency Parsing

System LAS UAS File Uploader
Zhou Qiaoli-3   61.84 80.6 task5-SAU.rar zhou_qiao_li
NJU-Parser-2   61.64 80.29 task5-NJU-contrast.zip lib
NJU-Parser-1   61.63 80.35 task5-NJU-subsentence.zip lib
Zhijun Wu-1   61.58 80.64 task5-ZhijunWu-APPROACH.zip 244501294
Zhou Qiaoli-1   61.15 80.41 task5-SAU.rar zhou_qiao_li
Zhou Qiaoli-2   57.55 78.55 task5-SAU.rar zhou_qiao_li
ICT-1   56.31 73.2    
Giuseppe Attardi-SVM-1-R   44.46 60.83 task5-a_v_miceli_barone__g_attardi-desr_SVM_1_R.zip attardi
Giuseppe Attardi-SVM-1-Rev   21.86 40.47 task5-a_v_miceli_barone__g_attardi-desr_SVM_1_rev.zip attardi
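LAS and UAS are the standard dependency-parsing attachment scores: UAS is the percentage of tokens whose predicted head is correct, and LAS additionally requires the correct dependency label. A minimal scorer (the per-token `(head, label)` format is a simplifying assumption, not the task's data format):

```python
def attachment_scores(gold, pred):
    """gold/pred: lists of (head_index, label) per token. Returns (LAS, UAS) in %."""
    assert len(gold) == len(pred)
    uas = sum(g[0] == p[0] for g, p in zip(gold, pred))  # head correct
    las = sum(g == p for g, p in zip(gold, pred))        # head and label correct
    n = len(gold)
    return 100 * las / n, 100 * uas / n

gold = [(0, "root"), (1, "obj"), (1, "punct")]
pred = [(0, "root"), (1, "nsubj"), (1, "punct")]
print(attachment_scores(gold, pred))
```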



Task 6 A Pilot on Semantic Textual Similarity

Run ALL Rank ALLnrm Rank Mean Rank MSRpar MSRvid SMT-eur On-WN SMT-news File Uploader
00-baseline/task6-baseline 0.311 87 0.6732 85 0.4356 70 0.4334 0.2996 0.4542 0.5864 0.3908    
aca08ls/task6-University_Of_Sheffield-Hybrid 0.6485 34 0.8238 15 0.61 18 0.5166 0.8187 0.4859 0.6676 0.428 task6-University_Of_Sheffield-Hybrid.zip aca08ls
aca08ls/task6-University_Of_Sheffield-Machine_Learning 0.7241 17 0.8169 18 0.575 38 0.5166 0.8187 0.4859 0.639 0.2089 task6-University_Of_Sheffield-Machine_Learning.zip aca08ls
aca08ls/task6-University_Of_Sheffield-Vector_Space 0.6054 48 0.7946 44 0.5943 27 0.546 0.7241 0.4858 0.6676 0.428 task6-University_Of_Sheffield-Vector_Space.zip aca08ls
acaputo/task6-UNIBA-DEPRI 0.6141 46 0.8027 38 0.5891 31 0.4542 0.7673 0.5126 0.6593 0.4636 task6-UNIBA-DEPRI.zip acaputo
acaputo/task6-UNIBA-LSARI 0.6221 44 0.8079 30 0.5728 40 0.3886 0.7908 0.4679 0.6826 0.4238 task6-UNIBA-LSARI.zip acaputo
acaputo/task6-UNIBA-RI 0.6285 41 0.7951 43 0.5651 45 0.4128 0.7612 0.4531 0.6306 0.4887 task6-UNIBA-RI.zip acaputo
baer/task6-UKP-run1 0.8117 4 0.8559 4 0.6708 4 0.6821 0.8708 0.5118 0.6649 0.4672 task6-UKP-run1 baer
baer/task6-UKP-run2-plus_postprocessing_smt_twsi 0.8239 1 0.8579 2 0.6773 1 0.683 0.8739 0.528 0.6641 0.4937 task6-UKP-run2_plus_postprocessing_smt_twsi baer
baer/task6-UKP-run3-plus_random 0.779 8 0.8166 19 0.432 71 0.683 0.8739 0.528 -0.062 -0.052 task6-UKP-run3_plus_random baer
croce/task6-UNITOR-1-REGRESSION_BEST_FEATURES 0.7474 13 0.8292 12 0.6316 10 0.5695 0.8217 0.5168 0.6591 0.4713 task6-UNITOR-1_REGRESSION_BEST_FEATURES.zip croce
croce/task6-UNITOR-2-REGRESSION_ALL_FEATURES 0.7475 12 0.8297 11 0.6323 9 0.5763 0.8217 0.5102 0.6591 0.4713 task6-UNITOR-2_REGRESSION_ALL_FEATURES.zip croce
croce/task6-UNITOR-3-REGRESSION_ALL_FEATURES_ALL_DOMAINS 0.6289 40 0.815 21 0.5939 28 0.4686 0.8027 0.4574 0.6591 0.4713 task6-UNITOR-3_REGRESSION_ALL_FEATURES_ALL_DOMAINS.zip croce
csjxu/task6-PolyUCOMP-RUN1 0.6528 31 0.7642 59 0.5492 51 0.4728 0.6593 0.4835 0.6196 0.429 task6-PolyUCOMP-RUN1.zip csjxu
danielcer/stanford_fsa† 0.6354 38 0.7212 70 0.4848 66 0.3795 0.535 0.4377 0.6052 0.4164 task6-stanford-fsa.zip danielcer
danielcer/stanford_pdaAll† 0.4229 77 0.716 72 0.5044 62 0.4409 0.4698 0.4558 0.6468 0.4769 task6-stanford-pdaAll.zip danielcer
danielcer/stanford_rte† 0.5589 55 0.7807 55 0.4674 67 0.4374 0.8037 0.3533 0.3077 0.3235 task6-stanford-rte.zip danielcer
davide-buscaldi/task6-IRIT-pg1 0.428 76 0.7379 65 0.5009 63 0.4295 0.6125 0.4952 0.5387 0.3614 task6-IRIT-pg1.zip davide_buscaldi
davide-buscaldi/task6-IRIT-pg3 0.4813 68 0.7569 61 0.5202 58 0.4171 0.6728 0.5179 0.5526 0.3693 task6-IRIT-pg3.zip davide_buscaldi
davide-buscaldi/task6-IRIT-wu 0.4064 81 0.7287 69 0.4898 65 0.4326 0.5833 0.4856 0.5317 0.348 task6-IRIT-wu.zip davide_buscaldi
demetrios-glinos/task6-ATA-BASE 0.3454 83 0.699 81 0.2772 87 0.1684 0.6256 0.2244 0.1648 0.0988 task6-ATA-BASE.zip demetrios_glinos
demetrios-glinos/task6-ATA-CHNK 0.4976 64 0.716 73 0.3215 86 0.2312 0.6595 0.1504 0.2735 0.1426 task6-ATA-CHNK.zip demetrios_glinos
demetrios-glinos/task6-ATA-STAT 0.4165 79 0.7129 75 0.3312 85 0.1887 0.6482 0.2769 0.295 0.1336 task6-ATA-STAT.zip demetrios_glinos
desouza/task6-FBK-run1 0.5633 54 0.7127 76 0.3628 82 0.2494 0.6117 0.1495 0.4212 0.2439 task6-FBK-run1.zip desouza
desouza/task6-FBK-run2 0.6438 35 0.808 29 0.5888 32 0.5128 0.7807 0.3796 0.6228 0.5474 task6-FBK-run2.zip desouza
desouza/task6-FBK-run3 0.6517 32 0.8106 25 0.6077 20 0.5169 0.7773 0.4419 0.6298 0.6085 task6-FBK-run3.zip desouza
dvilarinoayala/task6-BUAP-RUN-1 0.4997 63 0.7568 62 0.526 56 0.4037 0.6532 0.4521 0.605 0.4537 task6-BUAP-RUN-1.zip dvilarinoayala
dvilarinoayala/task6-BUAP-RUN-2 -0.026 89 0.5933 89 0.1016 89 0.1109 0.0057 0.0348 0.1788 0.1964 task6-BUAP-RUN-2.zip dvilarinoayala
dvilarinoayala/task6-BUAP-RUN-3 0.663 25 0.7474 64 0.5105 59 0.4018 0.6378 0.4758 0.5691 0.4057 task6-BUAP-RUN-3.zip dvilarinoayala
enrique/task6-UNED-H34measures 0.4381 75 0.7518 63 0.5577 48 0.5328 0.5788 0.4785 0.6692 0.4465 task6-UNED-H34measures.zip enrique
enrique/task6-UNED-HallMeasures 0.2791 88 0.6694 87 0.4286 72 0.3861 0.257 0.4086 0.6006 0.5305 task6-UNED-HallMeasures.zip enrique
enrique/task6-UNED-SP-INIST 0.468 69 0.7625 60 0.5615 47 0.5166 0.6303 0.4625 0.6442 0.4753 task6-UNED-SP_INIST.zip enrique
georgiana-dinu/task6-SAARLAND-ALIGN-VSSIM 0.4952 65 0.7871 50 0.5065 60 0.4043 0.7718 0.2686 0.5721 0.3505 task6-SAARLAND-ALIGN_VSSIM.zip georgiana_dinu
georgiana-dinu/task6-SAARLAND-MIXT-VSSIM 0.4548 71 0.8258 13 0.5662 43 0.631 0.8312 0.1391 0.5966 0.3806 task6-SAARLAND-MIXT_VSSIM.zip georgiana_dinu
jan-snajder/task6-takelab-simple 0.8133 3 0.8635 1 0.6753 2 0.7343 0.8803 0.4771 0.6797 0.3989 task6-takelab-simple.zip jan_snajder
jan-snajder/task6-takelab-syntax 0.8138 2 0.8569 3 0.6601 5 0.6985 0.862 0.3612 0.7049 0.4683 task6-takelab-syntax.zip jan_snajder
janardhan/task6-janardhan-UNL-matching 0.3431 84 0.6878 84 0.3481 83 0.1936 0.5504 0.3755 0.2888 0.3387 task6-janardhan-UNL_matching.zip janardhan
jhasneha/task6-Penn-ELReg 0.6622 27 0.8048 34 0.5654 44 0.548 0.7844 0.3513 0.604 0.3607 task6-Penn-ELReg.zip jhasneha
jhasneha/task6-Penn-ERReg 0.6573 28 0.8083 28 0.5755 37 0.561 0.7857 0.3568 0.6214 0.3732 task6-Penn-ERReg.zip jhasneha
jhasneha/task6-Penn-LReg 0.6497 33 0.8043 36 0.5699 41 0.546 0.7818 0.3547 0.5969 0.4137 task6-Penn-LReg.zip jhasneha
jotacastillo/task6-SAGAN-RUN1 0.5522 57 0.7904 47 0.5906 29 0.5659 0.7113 0.4739 0.6542 0.4253 task6-SAGAN-RUN1.zip jotacastillo
jotacastillo/task6-SAGAN-RUN2 0.6272 42 0.8032 37 0.5838 34 0.5538 0.7706 0.448 0.6135 0.3894 task6-SAGAN-RUN2.zip jotacastillo
jotacastillo/task6-SAGAN-RUN3 0.6311 39 0.7943 45 0.5649 46 0.5394 0.756 0.4181 0.5904 0.3746 task6-SAGAN-RUN3.zip jotacastillo
Konstantin-Z/task6-ABBYY-General 0.5636 53 0.8052 33 0.5759 36 0.4797 0.7821 0.4576 0.6488 0.3682 task6-ABBYY-General.zip Konstantin_Z
M-Rios/task6-UOW-LEX-PARA 0.6397 36 0.7187 71 0.3825 80 0.3628 0.6426 0.3074 0.2806 0.2082 task6-UOW-LEX_PARA.zip M_Rios
M-Rios/task6-UOW-LEX-PARA-SEM 0.5981 49 0.6955 82 0.3473 84 0.3529 0.5724 0.3066 0.2643 0.1164 task6-UOW-LEX_PARA_SEM.zip M_Rios
M-Rios/task6-UOW-SEM 0.5361 59 0.6287 88 0.2567 88 0.2995 0.291 0.1611 0.2571 0.2212 task6-UOW-SEM.zip M_Rios
mheilman/task6-ETS-PERP 0.7808 7 0.8064 32 0.6305 11 0.6211 0.721 0.4722 0.708 0.5149 task6-ETS-PERP.zip mheilman
mheilman/task6-ETS-PERPphrases 0.7834 6 0.8089 27 0.6399 7 0.6397 0.72 0.485 0.7124 0.5312 task6-ETS-PERPphrases.zip mheilman
mheilman/task6-ETS-TERp 0.4477 73 0.7291 68 0.5253 57 0.5049 0.5217 0.4748 0.6169 0.4566 task6-ETS-TERp.zip mheilman
nitish-aggarwal/task6-aggarwal-run1? 0.5777 52 0.8158 20 0.5466 52 0.3675 0.8427 0.3534 0.603 0.443 SemEval.STS.Results nitish_aggarwal
nitish-aggarwal/task6-aggarwal-run2? 0.5833 51 0.8183 17 0.5683 42 0.372 0.833 0.4238 0.6513 0.4499 SemEval.STS.Results nitish_aggarwal
nitish-aggarwal/task6-aggarwal-run3 0.4911 67 0.7696 57 0.5377 53 0.532 0.6874 0.4514 0.5827 0.2818 SemEval.STS.Results nitish_aggarwal
nmalandrakis/task6-DeepPurple-DeepPurple-hierarchical 0.6228 43 0.81 26 0.5979 23 0.5984 0.7717 0.4292 0.648 0.3702 task6-DeepPurple-DeepPurple_hierarchical.zip nmalandrakis
nmalandrakis/task6-DeepPurple-DeepPurple-sigmoid 0.554 56 0.7997 41 0.5558 50 0.596 0.7616 0.2628 0.6016 0.3446 task6-DeepPurple-DeepPurple_sigmoid.zip nmalandrakis
nmalandrakis/task6-DeepPurple-DeepPurple-single 0.4918 66 0.7646 58 0.5061 61 0.4989 0.7092 0.4437 0.4879 0.2441 task6-DeepPurple-DeepPurple_single.zip nmalandrakis
parthapakray/task6-JU-CSE_NLP-Semantic_Syntactic_Approach∗ 0.388 82 0.6706 86 0.4111 76 0.3427 0.3549 0.4271 0.5298 0.4034 task6-JU_CSE_NLP-Semantic_Syntactic_Approach.zip parthapakray
rada/task6-UNT-CombinedRegression 0.7418 14 0.8406 7 0.6159 14 0.5032 0.8695 0.4797 0.6715 0.4033 task6-UNT-CombinedRegression.zip rada
rada/task6-UNT-IndividualDecTree 0.7677 9 0.8389 9 0.5947 25 0.5693 0.8688 0.4203 0.6491 0.2256 task6-UNT-IndividualDecTree.zip rada
rada/task6-UNT-IndividualRegression 0.7846 5 0.844 6 0.6162 13 0.5353 0.875 0.4203 0.6715 0.4033 task6-UNT-IndividualRegression.zip rada
sbdlrhmn/task6-sbdlrhmn-Run1 0.6663 23 0.7842 53 0.5376 54 0.544 0.7335 0.383 0.586 0.2445 Run1.zip sbdlrhmn
sbdlrhmn/task6-sbdlrhmn-Run2 0.4169 78 0.7104 77 0.4986 64 0.4617 0.4489 0.4719 0.6353 0.4353 Run2.zip sbdlrhmn
sgjimenezv/task6-SOFT-CARDINALITY 0.7331 15 0.8526 5 0.6708 3 0.6405 0.8562 0.5152 0.7109 0.4833 task6-SOFT-CARDINALITY-ONE-FUNCTION.zip sgjimenezv
sgjimenezv/task6-SOFT-CARDINALITY-ONE-FUNCTION 0.7107 19 0.8397 8 0.6486 6 0.6316 0.8237 0.432 0.7109 0.4833 task6-SOFT-CARDINALITY.zip sgjimenezv
siva/task6-DSS-alignheuristic 0.5253 60 0.7962 42 0.603 21 0.5735 0.7123 0.4781 0.6984 0.4177 task6-DSS-alignheuristic.zip siva
siva/task6-DSS-average 0.549 58 0.8047 35 0.5943 26 0.502 0.7645 0.4875 0.6677 0.4324 task6-DSS-average.zip siva
siva/task6-DSS-wordsim 0.513 61 0.7895 49 0.5287 55 0.3765 0.7761 0.4161 0.5728 0.3964 task6-DSS-wordsim.zip siva
skamler/task6-EHU-RUN1v2?† 0.3129 86 0.6935 83 0.3889 79 0.3605 0.5187 0.2259 0.4098 0.3465 task6-EHU-RUN1.zip skamler_
sokolov/task6-LIMSI-cosprod 0.6392 37 0.7344 67 0.394 78 0.3948 0.6597 0.0143 0.4157 0.2889 task6-LIMSI-cosprod.zip sokolov
sokolov/task6-LIMSI-gradtree 0.6789 22 0.7377 66 0.4118 75 0.4848 0.6636 0.0934 0.3706 0.2455 task6-LIMSI-gradtree.zip sokolov
sokolov/task6-LIMSI-sumdiff 0.6196 45 0.7101 78 0.4131 74 0.4295 0.5724 0.2842 0.3989 0.2575 task6-LIMSI-sumdiff.zip sokolov
spirin2/task6-UIUC-MLNLP-Blend 0.4592 70 0.78 56 0.5782 35 0.6523 0.6691 0.3566 0.6117 0.4603 task6-UIUC-MLNLP-Blend.zip spirin2
spirin2/task6-UIUC-MLNLP-CCM 0.7269 16 0.8217 16 0.6104 17 0.5769 0.8203 0.4667 0.5835 0.4945 task6-UIUC-MLNLP-CCM.zip spirin2
spirin2/task6-UIUC-MLNLP-Puzzle 0.3216 85 0.7857 51 0.4376 69 0.5635 0.8056 0.063 0.2774 0.2409 task6-UIUC-MLNLP-Puzzle.zip spirin2
sranjans/task6-sranjans-1 0.6529 30 0.8018 39 0.6249 12 0.6124 0.724 0.5581 0.6703 0.4533 task6-sranjans-1.zip sranjans
sranjans/task6-sranjans-2 0.6651 24 0.8128 22 0.6366 8 0.6254 0.7538 0.5328 0.6649 0.5036 task6-sranjans-2.zip sranjans
sranjans/task6-sranjans-3 0.5045 62 0.7846 52 0.5905 30 0.6167 0.7061 0.5666 0.5664 0.3968 task6-sranjans-3.zip sranjans
tiantianzhu7/task6-tiantianzhu7-1 0.4533 72 0.7134 74 0.4192 73 0.4184 0.563 0.2083 0.4822 0.2745 task6-tiantianzhu7-1.zip tiantianzhu7
tiantianzhu7/task6-tiantianzhu7-2 0.4157 80 0.7099 79 0.396 77 0.426 0.5628 0.1546 0.4552 0.1923 task6-tiantianzhu7-2.zip tiantianzhu7
tiantianzhu7/task6-tiantianzhu7-3 0.4446 74 0.7097 80 0.374 81 0.3411 0.5946 0.1868 0.4029 0.1823 task6-tiantianzhu7-3.zip tiantianzhu7
weiwei/task6-weiwei-run1?† 0.6946 20 0.8303 10 0.6081 19 0.4106 0.8351 0.5128 0.7273 0.4383    
yeh/task6-SRIUBC-SYSTEM1† 0.7513 11 0.8017 40 0.5997 22 0.6084 0.7458 0.4688 0.6315 0.3994 task6-SRIUBC-SYSTEM1.zip yeh
yeh/task6-SRIUBC-SYSTEM2† 0.7562 10 0.8111 24 0.5858 33 0.605 0.7939 0.4294 0.5871 0.3366 task6-SRIUBC-SYSTEM2.zip yeh
yeh/task6-SRIUBC-SYSTEM3† 0.6876 21 0.7812 54 0.4668 68 0.4791 0.7901 0.2159 0.3843 0.2801 task6-SRIUBC-SYSTEM3.zip yeh
ygutierrez/task6-UMCC-DLSI-MultiLex 0.663 26 0.7922 46 0.556 49 0.6022 0.7709 0.4435 0.4327 0.4264 task6-UMCC_DLSI-MultiLex.zip ygutierrez
ygutierrez/task6-UMCC-DLSI-MultiSem 0.6529 29 0.8115 23 0.6116 16 0.5269 0.7756 0.4688 0.6539 0.547 task6-UMCC_DLSI-MultiSem.zip ygutierrez
ygutierrez/task6-UMCC-DLSI-MultiSemLex 0.7213 18 0.8239 14 0.6158 15 0.6205 0.8104 0.4325 0.6256 0.434 task6-UMCC_DLSI-MultiSemLex.zip ygutierrez
yrkakde/task6-yrkakde-DiceWordnet 0.5977 50 0.7902 48 0.5742 39 0.5294 0.747 0.5531 0.5698 0.3659 task6-yrkakde-DiceWordnet.zip yrkakde
yrkakde/task6-yrkakde-JaccNERPenalty 0.6067 47 0.8078 31 0.5955 24 0.5757 0.7765 0.4989 0.6257 0.3468 task6-yrkakde-JaccNERPenalty.zip yrkakde
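The Task 6 score columns are Pearson correlations between system and gold similarity scores (ALL, ALLnrm, and Mean are different aggregations of the per-dataset correlations). A minimal Pearson r, for illustration:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance (unnormalized)
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

print(round(pearson_r([0.0, 1.0, 2.0, 3.0], [0.1, 0.9, 2.2, 2.8]), 4))
```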


Task 7 Choice of Plausible Alternatives: An Evaluation of Commonsense Causal Reasoning

System Accuracy File Uploader
UTDHLT Bigram PMI   61.8% task7-UTDHLT-bigram_pmi.test travis
UTDHLT SVM Combined   63.4% task7-UTDHLT-svm_combined.test travis
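The "Bigram PMI" system name refers to pointwise mutual information. As a generic illustration of the statistic (not the team's implementation), PMI of a bigram from raw corpus counts:

```python
from math import log2

def bigram_pmi(count_xy, count_x, count_y, total_bigrams, total_unigrams):
    """PMI(x, y) = log2( P(x, y) / (P(x) * P(y)) ) estimated from raw counts."""
    p_xy = count_xy / total_bigrams   # joint probability of the bigram
    p_x = count_x / total_unigrams    # marginal probability of x
    p_y = count_y / total_unigrams    # marginal probability of y
    return log2(p_xy / (p_x * p_y))

print(round(bigram_pmi(8, 20, 40, 1000, 1000), 3))
```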



Task 8 Cross-lingual Textual Entailment for Content Synchronization

System Run SP-EN IT-EN FR-EN DE-EN File Uploader
BUAP run1 0.35 0.336 0.334 0.33 task8-BUAP-Run-1.zip dvilarinoayala
BUAP run2 0.366 0.344 0.342 0.268 task8-BUAP-Run-2.zip dvilarinoayala
celi run1 0.276 0.278 0.278 0.28 task8_celi_run1.zip kouylekov
celi run2 0.336 0.338 0.3 0.352 task8_celi_run2.zip kouylekov
celi run3 0.322 0.334 0.298 0.35 task8_celi_run3.zip kouylekov
celi run4 0.268 0.28 0.28 0.274 task8_celi_run4.zip kouylekov
DirRelCond3 run1 0.3 0.28 0.362 0.336 task8_DirRelCond3_spa-eng_run1.zip, task8_DirRelCond3_ita-eng_run1.zip, task8_DirRelCond3_fra-eng_run1.zip, task8_DirRelCond3_deu-eng_run1.zip palpar
DirRelCond3 run2 0.3 0.284 0.36 0.336 task8_DirRelCond3_spa-eng_run2.zip, task8_DirRelCond3_ita-eng_run2.zip, task8_DirRelCond3_fra-eng_run2.zip, task8_DirRelCond3_deu-eng_run2.zip palpar
DirRelCond3 run3 0.3 0.338 0.384 0.364 task8_DirRelCond3_spa-eng_run3.zip, task8_DirRelCond3_ita-eng_run3.zip, task8_DirRelCond3_fra-eng_run3.zip, task8_DirRelCond3_deu-eng_run3.zip palpar
DirRelCond3 run4 0.344 0.316 0.384 0.374 task8_DirRelCond3_spa-eng_run4.zip, task8_DirRelCond3_ita-eng_run4.zip, task8_DirRelCond3_fra-eng_run4.zip, task8_DirRelCond3_deu-eng_run4.zip palpar
FBK run1* 0.502 - - - task8-FBK-CLTE.zip mehdad
FBK run2* 0.49 - - - task8-FBK-CLTE.zip mehdad
FBK run3* 0.504 - - - task8-FBK-CLTE.zip mehdad
FBK run4* 0.5 - - - task8-FBK-CLTE.zip mehdad
HDU run1 0.63 0.554 0.564 0.558 task8_HDU_spa-eng_run1.zip, task8_HDU_ita-eng_run1.zip, task8_HDU_fra-eng_run1.zip, task8_HDU_deu-eng_run1.zip waeschle
HDU run2 0.632 0.562 0.57 0.552 task8_HDU_spa-eng_run2.zip, task8_HDU_ita-eng_run2.zip, task8_HDU_fra-eng_run2.zip, task8_HDU_deu-eng_run2.zip waeschle
ICT run1 0.448 0.454 0.456 0.46 task8_ICT_run.zip, task8_ICT_spa-eng_run1.zip, task8_ICT_ita-eng_run1.zip, task8_ICT_fra-eng_run1.zip, task8_ICT_deu-eng_run1.zip xionghao
JU-CSE-NLP run1 0.274 0.316 0.288 0.262 Task8_JU-CSE-NLP_spa-eng_run1.zip, Task8_JU-CSE-NLP_ita-eng_run1.zip, Task8_JU-CSE-NLP_fra-eng_run1.zip, Task8_JU-CSE-NLP_deu-eng_run1.zip parthapakray
JU-CSE-NLP run2 0.266 0.326 0.294 0.296 Task8_JU-CSE-NLP_spa-eng_run2.zip, Task8_JU-CSE-NLP_ita-eng_run2.zip, Task8_JU-CSE-NLP_fra-eng_run2.zip, Task8_JU-CSE-NLP_deu-eng_run2.zip parthapakray
JU-CSE-NLP run3 0.272 0.314 0.296 0.264 Task8_JU-CSE-NLP_spa-eng_run3.zip, Task8_JU-CSE-NLP_ita-eng_run3.zip, Task8_JU-CSE-NLP_fra-eng_run3.zip, Task8_JU-CSE-NLP_deu-eng_run3.zip parthapakray
Sagan run1 0.342 0.352 0.346 0.342 task8_Sagan_spa-eng_run1.zip, task8_Sagan_ita-eng_run1.zip, task8_Sagan_fra-eng_run1.zip, task8_Sagan_ger-eng_run1.zip jotacastillo
Sagan run2 0.328 0.352 0.336 0.31 task8_Sagan_spa-eng_run2.zip, task8_Sagan_ita-eng_run2.zip, task8_Sagan_fra-eng_run2.zip, task8_Sagan_ger-eng_run2.zip jotacastillo
Sagan run3 0.346 0.356 0.33 0.332 task8_Sagan_spa-eng_run3.zip, task8_Sagan_ita-eng_run3.zip, task8_Sagan_fra-eng_run3.zip, task8_Sagan_ger-eng_run3.zip jotacastillo
Sagan run4 0.34 0.33 0.31 0.31 task8_Sagan_spa-eng_run4.zip, task8_Sagan_ita-eng_run4.zip, task8_Sagan_fra-eng_run4.zip, task8_Sagan_ger-eng_run4.zip jotacastillo
SoftCard run1 0.552 0.566 0.57 0.55 sgjimenezv_spa-eng_run1.xml, sgjimenezv_ita-eng_run1.xml, sgjimenezv_fra-eng_run1.xml, sgjimenezv_deu-eng_run1.xml  
UAlacant run1-LATE 0.598 - - - task8_UAlacant_spa-eng_run1.zip mespla
UAlacant run2 0.582 - - - task8_UAlacant_spa-eng_run2.zip mespla
UAlacant run3-LATE 0.51 - - - task8_UAlacant_spa-eng_run3.zip mespla
UAlacant run4 0.514 - - - task8_UAlacant_spa-eng_run4.zip mespla
Highest   0.632 0.566 0.57 0.558    
Average   0.44 0.411 0.408 0.408    
Median   0.407 0.35 0.365 0.363    
Lowest   0.274 0.326 0.296 0.296    
               
Accuracy results (92 runs) over the four language combinations. Highest, average, median, and lowest scores are computed over the best run of each team (* marks the task organizers' system).
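The Highest/Average/Median/Lowest rows keep only each team's best run before aggregating. A small sketch of that procedure, with made-up team scores:

```python
from statistics import mean, median

def summarize(runs):
    """runs: {team: [accuracies of its runs]}. Keep each team's best run, then aggregate."""
    best = [max(scores) for scores in runs.values()]  # one score per team
    return {"highest": max(best), "average": round(mean(best), 3),
            "median": round(median(best), 3), "lowest": min(best)}

runs = {"A": [0.63, 0.632], "B": [0.35, 0.366], "C": [0.502]}
print(summarize(runs))
```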