Why is the R-squared value coming out negative?

21 views (in the last 30 days)
Devendra
Devendra on 8 Mar 2024
I am doing linear regression on PCA scores, but my R-squared value is coming out negative. I am attaching the code and input file. I would appreciate it if you could take a look at the code and suggest why this happens.
Devendra
% Load data
df = readtable('input_file.csv')
df = 100×460 table
[table display trimmed: 100 rows; columns are name, t1 ... t454 (time-series features), act_y_ha, act_per_ha, field_yield, total_FS, and variety]
dependent_vars = {'act_y_ha', 'act_per_ha'};
for jj = 1:numel(dependent_vars)
    var = dependent_vars{jj};
    % Prepare features and target variable
    features = [];
    for f = 1:454
        features = [features, "t" + num2str(f)];
    end
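    % features is now the 1-by-454 string array ["t1", "t2", ..., "t454"]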
    X = df{:, features};
    Y = df{:, var};
    % PCA
    [coeff, score, ~, ~, explained] = pca(X);
    X_pca = score(:, 1:10);
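    % Note: X_pca (and coeff/explained) are computed but never used below.
    % The split and fit operate on the raw X, whose 454 columns exceed the
    % roughly 80 training rows, which is what triggers the rank-deficiency
    % warnings in the output.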
    % Split data
    cv = cvpartition(size(X, 1), 'HoldOut', 0.2);
    idxTrain = training(cv);
    idxTest = test(cv);
    X_train = X(idxTrain, :);
    X_test = X(idxTest, :);
    Y_train = Y(idxTrain);
    Y_test = Y(idxTest);
    % Model fitting
    reg = fitlm(X_train, Y_train);
    % Predict
    Y_pred = predict(reg, X_test);
    % Calculate metrics
    r2_reg = 1 - sum((Y_pred - Y_test).^2) / sum((Y_test - mean(Y_test)).^2);
    rmse = sqrt(mean((Y_pred - Y_test).^2));
    disp([var, ' ', num2str(r2_reg), ' ', num2str(rmse)]);
end
Warning: Regression design matrix is rank deficient to within machine precision.
act_y_ha -5.3373 1154.401
Warning: Regression design matrix is rank deficient to within machine precision.
act_per_ha -1.2859 85.2228
  2 comments
Walter Roberson
Walter Roberson on 9 Mar 2024
R^2 scores come out negative when the proposed fit is worse than a linear fit.
John D'Errico
John D'Errico on 9 Mar 2024
That is not true. R^2 will almost always be negative when there is no constant term in the model. This happens when the model fit is worse than a CONSTANT model, NOT a linear model.


Answers (1)

John D'Errico
John D'Errico on 9 Mar 2024
Edited: John D'Errico on 9 Mar 2024
Actually, no. R^2 can be negative for two potential reasons:
  1. If the R^2 was computed incorrectly.
  2. If the model is worse than a CONSTANT model. NOT a linear model. This almost always arises because no constant term was allowed in the model.
The latter is almost always the important factor; when we see a negative R^2 reported, that is almost always the cause. Why is this the case?
The formula for R^2 itself is easy enough to write.
Essentially it is just:
R^2 = 1 - ssmodel/sstotal
where ssmodel is the residual sum of squares for the model, and sstotal is the sum of squares of the residuals about the mean. Since the mean of your data is the best least squares constant model, the formula should make sense.
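As a small illustration (an editor's sketch with synthetic data, not part of the original answer), here is what happens when data with a large offset is fit by a model forced through the origin, with R^2 computed by exactly this formula:
rng(0)                                    % reproducible noise
x = (1:50)';
y = 0.5*x + 100 + randn(50,1);            % data with a large constant offset
mdl0 = fitlm(x, y, 'Intercept', false);   % suppress the constant term
yhat = predict(mdl0, x);
r2 = 1 - sum((y - yhat).^2) / sum((y - mean(y)).^2)
% r2 comes out far below zero: the origin-constrained line misses the
% offset badly, so it predicts worse than the constant model mean(y).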
In the case here, the formula used to compute R^2 was just:
r2_reg = 1 - sum((Y_pred - Y_test).^2) / sum((Y_test - mean(Y_test)).^2);
And that is clearly MATLAB code for the formula I wrote, so the computation itself is correct. That points at a missing constant term in the model. Does fitlm omit one?
help fitlm
 FITLM  Create linear regression model by fitting to data.
    LM = FITLM(TBL,MODELSPEC) fits the model specified by MODELSPEC to
    variables in the table/dataset array TBL, and returns the linear model
    LM. MODELSPEC can be any of the following:
        'linear'          Linear (main effect) terms only.
        'interactions'    Linear and pairwise interaction terms.
        'purequadratic'   Linear and squared terms.
        'quadratic'       Linear, pairwise interactions, and squares.
        'polyIJK...'      Polynomial with all terms up to power I for the
                          first predictor, J for the second, K for the
                          third, and so on.
        FORMULA           a string such as 'y ~ x1 + x2 + x3*x4' defining
                          the response and the predictor terms.
    Some examples of FORMULA term expressions when the predictors are
    x1, x2, and x3:
        'x1+x2+x3'        Linear model including constant
        'x1+x2+x3-1'      Linear model without constant
        'x1^2+x2^2'       Constant, linear, squared terms
    LM = FITLM(X,Y) fits a linear regression model using the column vector
    Y as a response variable and the columns of the matrix X as predictor
    variables, and returns the linear model LM.
    LM = FITLM(X,Y,MODELSPEC) fits the model specified by MODELSPEC. If
    MODELSPEC is a formula, it should specify the response as y and the
    predictors as x1,x2,.... The default is 'linear'.
    LM = FITLM(...,PARAM1,VAL1,PARAM2,VAL2,...) specifies one or more
    name/value pairs, including:
        'Intercept'       true (default) to include a constant term in the
                          model, or false to omit it.
        'RobustOpts'      'off' (default) to perform least squares, 'on'
                          to perform robust regression using the bisquare
                          weighting function, or a structure.
    [remaining name/value pairs and examples trimmed]
    See also LinearModel, STEPWISELM.
In there, we see that when fitlm is called just as fitlm(X,Y) with no modelspec, the default modelspec is 'linear', and the 'Intercept' name/value pair defaults to true. So fitlm DOES include a constant term by default, and a missing intercept is not the explanation for this code.
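A quick way to verify that default, as a throwaway sketch on random data:
mdl = fitlm(rand(10,2), rand(10,1));
mdl.CoefficientNames    % {'(Intercept)', 'x1', 'x2'}: the constant is there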
That leaves the other clue: the design matrix was reported to be rank deficient. With 454 predictors and only about 80 training rows (100 rows, 20% held out), the design matrix cannot have full column rank. A fit that badly overparameterized can easily predict held-out data worse than the test-set mean, which is exactly what a negative R^2 on the test set means.
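For completeness, here is a minimal sketch (an editor's illustration, not part of the original answer) of regressing on the PCA scores that the question computed but never used. With 10 score columns against roughly 80 training rows the design matrix has full column rank, so the rank-deficiency warning goes away; whether the holdout R^2 turns positive still depends on the data. Strictly, pca should be fit on the training rows only to avoid leaking test information; this sketch keeps the question's simpler structure.
% Regress on the first 10 principal-component scores instead of all 454 columns
[~, score] = pca(X);                      % X and Y as in the question's loop
X_pca = score(:, 1:10);
cv = cvpartition(size(X_pca, 1), 'HoldOut', 0.2);
mdl = fitlm(X_pca(training(cv), :), Y(training(cv)));   % intercept on by default
Y_test = Y(test(cv));
Y_pred = predict(mdl, X_pca(test(cv), :));
r2 = 1 - sum((Y_pred - Y_test).^2) / sum((Y_test - mean(Y_test)).^2)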

Release

R2023b
