[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-03a80d45cc3addfe-preprocessing-swings-cnn-accuracy-from-65-to-87-on-summary":3,"summaries-facets-categories":94,"summary-related-03a80d45cc3addfe-preprocessing-swings-cnn-accuracy-from-65-to-87-on-summary":3663},{"id":4,"title":5,"ai":6,"body":13,"categories":59,"created_at":61,"date_modified":61,"description":53,"extension":62,"faq":61,"featured":63,"kicker_label":61,"meta":64,"navigation":75,"path":76,"published_at":77,"question":61,"scraped_at":78,"seo":79,"sitemap":80,"source_id":81,"source_name":82,"source_type":83,"source_url":84,"stem":85,"tags":86,"thumbnail_url":61,"tldr":91,"tweet":61,"unknown_tags":92,"__hash__":93},"summaries\u002Fsummaries\u002F03a80d45cc3addfe-preprocessing-swings-cnn-accuracy-from-65-to-87-on-summary.md","Preprocessing Swings CNN Accuracy from 65% to 87% on CIFAR-10",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8876,1567,16564,0.00205185,{"type":14,"value":15,"toc":52},"minimark",[16,21,34,38,45,49],[17,18,20],"h2",{"id":19},"scale-pixels-to-stabilize-gradients-and-boost-baseline-performance","Scale Pixels to Stabilize Gradients and Boost Baseline Performance",[22,23,24,25,29,30,33],"p",{},"Train CNNs on raw CIFAR-10 images (32x32x3 pixels, 0-255 range) without preprocessing for a 65.47% test accuracy baseline after 10 epochs using Adam optimizer and sparse categorical cross-entropy. Large pixel values (up to 255) cause exploding gradients: ∂L\u002F∂w ≈ 255 × δ, leading to overshooting and oscillations in weight updates. Normalize by dividing by 255.0 to scale to ",[26,27,28],"span",{},"0,1",", reducing gradients to 1 × δ for smooth convergence, raising accuracy to 69.38%. Standardization (Z-score: (x - μ)\u002Fσ per channel) matches this at 69.38%, centering data at mean 0 and std 1—E",[26,31,32],{},"z"," = 0 and Var(z) = 1 proven via linearity of expectation and variance properties—but offers no extra gain for CNNs on images, as basic normalization suffices for stable training.",[17,35,37],{"id":36},"use-geometric-augmentation-for-invariance-but-avoid-photometric-overkill","Use Geometric Augmentation for Invariance but Avoid Photometric Overkill",[22,39,40,41,44],{},"Apply geometric augmentations (RandomFlip horizontal, RandomRotation 0.1, RandomZoom 0.1) after normalization, training 20 epochs: accuracy dips to 67.13% on simple CNN, as added variability challenges the model without deeper capacity. These create rotation\u002Fscale\u002Fflip invariance via affine transformations—e.g., flip: x' = -x, rotation: ",[26,42,43],{},"cosθ -sinθ; sinθ cosθ",", zoom: s scaling—forcing feature learning (wheels, wings) over memorization. Photometric augmentations (RandomBrightness\u002FContrast 0.2) after normalization catastrophically drop accuracy to 20.62%: clipping saturates pixels to 0\u002F1 (e.g., 0.9 + 0.2 → 1.0), destroying edges\u002Ftextures in low-res 32x32 images, worsening signal-to-noise ratio and erasing discriminative features like airplane wings or cat eyes.",[17,46,48],{"id":47},"stack-normalization-geometric-augs-and-architecture-for-87-accuracy","Stack Normalization, Geometric Augs, and Architecture for 87% Accuracy",[22,50,51],{},"Combine Z-score standardization ((X - mean)\u002Fstd, ε=1e-7), geometric augmentations (add RandomTranslation 0.1,0.1), one-hot labels with 0.1 label smoothing (y_smooth = (1-α)y_true + α\u002FK, injecting 0.01 uniform noise across 10 classes to curb overconfidence), and deeper CNN (64-128-256 filters in padded conv blocks, BatchNorm, Dropout 0.2-0.5, MaxPool): achieves 87.32% test accuracy with EarlyStopping (patience=8 on val_acc) and ReduceLROnPlateau (factor=0.5, patience=3). BatchNorm normalizes layer activations: ˆx = (x - μ_B)\u002F√(σ²_B + ε), then γˆx + β for learnable scaling\u002Fshift, stabilizing internal distributions. This pipeline aligns preprocessing with model capacity, proving no single technique wins—success demands tailored combinations avoiding info destruction while enforcing generalization.",{"title":53,"searchDepth":54,"depth":54,"links":55},"",2,[56,57,58],{"id":19,"depth":54,"text":20},{"id":36,"depth":54,"text":37},{"id":47,"depth":54,"text":48},[60],"Data Science & Visualization",null,"md",false,{"content_references":65,"triage":70},[66],{"type":67,"title":68,"context":69},"dataset","CIFAR-10","mentioned",{"relevance":71,"novelty":71,"quality":72,"actionability":72,"composite":73,"reasoning":74},3,4,3.45,"Category: Data Science & Visualization. The article discusses preprocessing techniques that significantly improve CNN accuracy on the CIFAR-10 dataset, which is relevant for AI product builders looking to enhance model performance. It provides actionable insights on normalization and augmentation strategies that can be directly applied in practice.",true,"\u002Fsummaries\u002F03a80d45cc3addfe-preprocessing-swings-cnn-accuracy-from-65-to-87-on-summary","2026-04-20 16:07:06","2026-04-21 15:25:42",{"title":5,"description":53},{"loc":76},"03a80d45cc3addfe","Level Up Coding","article","https:\u002F\u002Flevelup.gitconnected.com\u002Fwhen-preprocessing-helps-and-when-it-hurts-why-your-image-classification-models-accuracy-varies-a6761f20e09e?source=rss----5517fd7b58a6---4","summaries\u002F03a80d45cc3addfe-preprocessing-swings-cnn-accuracy-from-65-to-87-on-summary",[87,88,89,90],"machine-learning","deep-learning","data-science","python","Raw CIFAR-10 pixels yield 65% test accuracy; normalization\u002Fstandardization lift to 69%; geometric augmentation maintains ~67%; photometric brightness\u002Fcontrast crashes to 20%; combined pipeline with deeper CNN hits 87%.",[],"w3EW0KbB8oA66HfhVLdBPkpkFdwzACfoXmZTwcAFNMQ",[95,98,101,104,107,110,112,114,116,118,120,122,125,127,129,131,133,135,137,139,141,143,146,148,150,152,155,157,159,162,164,166,168,170,172,174,176,178,180,182,184,186,188,190,192,194,196,198,200,202,204,206,208,210,212,214,216,218,220,222,224,226,228,230,232,234,236,238,240,242,244,246,248,250,252,254,256,258,260,262,264,266,268,270,272,274,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,487,489,491,493,495,497,499,501,503,505,507,509,511,513,515,517,519,521,523,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,555,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1051,1053,1055,1057,1059,1061,1063,1065,1067,1069,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1851,1853,1855,1857,1859,1861,1863,1865,1867,1869,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2651,2653,2655,2657,2659,2661,2663,2665,2667,2669,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3451,3453,3455,3457,3459,3461,3463,3465,3467,3469,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659,3661],{"categories":96},[97],"Developer Productivity",{"categories":99},[100],"Business & SaaS",{"categories":102},[103],"AI & LLMs",{"categories":105},[106],"AI Automation",{"categories":108},[109],"Product Strategy",{"categories":111},[103],{"categories":113},[97],{"categories":115},[100],{"categories":117},[],{"categories":119},[103],{"categories":121},[],{"categories":123},[124],"AI News & Trends",{"categories":126},[106],{"categories":128},[124],{"categories":130},[106],{"categories":132},[106],{"categories":134},[103],{"categories":136},[103],{"categories":138},[124],{"categories":140},[103],{"categories":142},[],{"categories":144},[145],"Design & Frontend",{"categories":147},[60],{"categories":149},[124],{"categories":151},[],{"categories":153},[154],"Software Engineering",{"categories":156},[103],{"categories":158},[106],{"categories":160},[161],"Marketing & Growth",{"categories":163},[103],{"categories":165},[106],{"categories":167},[],{"categories":169},[],{"categories":171},[145],{"categories":173},[106],{"categories":175},[97],{"categories":177},[145],{"categories":179},[103],{"categories":181},[106],{"categories":183},[124],{"categories":185},[],{"categories":187},[],{"categories":189},[106],{"categories":191},[154],{"categories":193},[],{"categories":195},[100],{"categories":197},[],{"categories":199},[],{"categories":201},[106],{"categories":203},[106],{"categories":205},[103],{"categories":207},[],{"categories":209},[154],{"categories":211},[],{"categories":213},[],{"categories":215},[],{"categories":217},[103],{"categories":219},[161],{"categories":221},[145],{"categories":223},[145],{"categories":225},[103],{"categories":227},[106],{"categories":229},[103],{"categories":231},[103],{"categories":233},[106],{"categories":235},[106],{"categories":237},[60],{"categories":239},[124],{"categories":241},[106],{"categories":243},[161],{"categories":245},[106],{"categories":247},[109],{"categories":249},[],{"categories":251},[106],{"categories":253},[],{"categories":255},[106],{"categories":257},[154],{"categories":259},[145],{"categories":261},[103],{"categories":263},[],{"categories":265},[],{"categories":267},[106],{"categories":269},[],{"categories":271},[103],{"categories":273},[],{"categories":275},[97],{"categories":277},[154],{"categories":279},[100],{"categories":281},[124],{"categories":283},[103],{"categories":285},[],{"categories":287},[103],{"categories":289},[],{"categories":291},[154],{"categories":293},[60],{"categories":295},[],{"categories":297},[103],{"categories":299},[145],{"categories":301},[],{"categories":303},[145],{"categories":305},[106],{"categories":307},[],{"categories":309},[106],{"categories":311},[124],{"categories":313},[103],{"categories":315},[],{"categories":317},[106],{"categories":319},[103],{"categories":321},[109],{"categories":323},[],{"categories":325},[103],{"categories":327},[106],{"categories":329},[106],{"categories":331},[],{"categories":333},[60],{"categories":335},[103],{"categories":337},[],{"categories":339},[97],{"categories":341},[100],{"categories":343},[103],{"categories":345},[106],{"categories":347},[154],{"categories":349},[103],{"categories":351},[],{"categories":353},[],{"categories":355},[103],{"categories":357},[],{"categories":359},[145],{"categories":361},[],{"categories":363},[103],{"categories":365},[],{"categories":367},[106],{"categories":369},[103],{"categories":371},[145],{"categories":373},[],{"categories":375},[103],{"categories":377},[103],{"categories":379},[100],{"categories":381},[106],{"categories":383},[103],{"categories":385},[145],{"categories":387},[106],{"categories":389},[],{"categories":391},[],{"categories":393},[124],{"categories":395},[],{"categories":397},[103],{"categories":399},[100,161],{"categories":401},[],{"categories":403},[103],{"categories":405},[],{"categories":407},[],{"categories":409},[103],{"categories":411},[],{"categories":413},[103],{"categories":415},[416],"DevOps & Cloud",{"categories":418},[],{"categories":420},[124],{"categories":422},[145],{"categories":424},[],{"categories":426},[124],{"categories":428},[124],{"categories":430},[103],{"categories":432},[161],{"categories":434},[],{"categories":436},[100],{"categories":438},[],{"categories":440},[103,416],{"categories":442},[103],{"categories":444},[103],{"categories":446},[106],{"categories":448},[103,154],{"categories":450},[60],{"categories":452},[103],{"categories":454},[161],{"categories":456},[106],{"categories":458},[106],{"categories":460},[],{"categories":462},[106],{"categories":464},[103,100],{"categories":466},[],{"categories":468},[145],{"categories":470},[145],{"categories":472},[],{"categories":474},[],{"categories":476},[124],{"categories":478},[],{"categories":480},[97],{"categories":482},[154],{"categories":484},[103],{"categories":486},[145],{"categories":488},[106],{"categories":490},[154],{"categories":492},[124],{"categories":494},[145],{"categories":496},[],{"categories":498},[103],{"categories":500},[103],{"categories":502},[103],{"categories":504},[124],{"categories":506},[97],{"categories":508},[103],{"categories":510},[106],{"categories":512},[416],{"categories":514},[145],{"categories":516},[106],{"categories":518},[],{"categories":520},[],{"categories":522},[145],{"categories":524},[124],{"categories":526},[60],{"categories":528},[],{"categories":530},[103],{"categories":532},[103],{"categories":534},[100],{"categories":536},[103],{"categories":538},[103],{"categories":540},[124],{"categories":542},[],{"categories":544},[106],{"categories":546},[154],{"categories":548},[],{"categories":550},[103],{"categories":552},[103],{"categories":554},[106],{"categories":556},[],{"categories":558},[],{"categories":560},[103],{"categories":562},[],{"categories":564},[100],{"categories":566},[106],{"categories":568},[],{"categories":570},[97],{"categories":572},[103],{"categories":574},[100],{"categories":576},[124],{"categories":578},[],{"categories":580},[],{"categories":582},[],{"categories":584},[124],{"categories":586},[124],{"categories":588},[],{"categories":590},[],{"categories":592},[100],{"categories":594},[],{"categories":596},[],{"categories":598},[97],{"categories":600},[],{"categories":602},[161],{"categories":604},[106],{"categories":606},[100],{"categories":608},[106],{"categories":610},[],{"categories":612},[109],{"categories":614},[145],{"categories":616},[154],{"categories":618},[103],{"categories":620},[106],{"categories":622},[100],{"categories":624},[103],{"categories":626},[],{"categories":628},[],{"categories":630},[154],{"categories":632},[60],{"categories":634},[109],{"categories":636},[106],{"categories":638},[103],{"categories":640},[],{"categories":642},[416],{"categories":644},[],{"categories":646},[106],{"categories":648},[],{"categories":650},[],{"categories":652},[103],{"categories":654},[145],{"categories":656},[161],{"categories":658},[106],{"categories":660},[],{"categories":662},[97],{"categories":664},[],{"categories":666},[124],{"categories":668},[103,416],{"categories":670},[124],{"categories":672},[103],{"categories":674},[100],{"categories":676},[103],{"categories":678},[],{"categories":680},[100],{"categories":682},[],{"categories":684},[154],{"categories":686},[145],{"categories":688},[124],{"categories":690},[60],{"categories":692},[97],{"categories":694},[103],{"categories":696},[154],{"categories":698},[],{"categories":700},[],{"categories":702},[109],{"categories":704},[],{"categories":706},[103],{"categories":708},[],{"categories":710},[145],{"categories":712},[145],{"categories":714},[145],{"categories":716},[],{"categories":718},[],{"categories":720},[124],{"categories":722},[106],{"categories":724},[103],{"categories":726},[103],{"categories":728},[103],{"categories":730},[100],{"categories":732},[103],{"categories":734},[],{"categories":736},[154],{"categories":738},[154],{"categories":740},[100],{"categories":742},[],{"categories":744},[103],{"categories":746},[103],{"categories":748},[100],{"categories":750},[124],{"categories":752},[161],{"categories":754},[106],{"categories":756},[],{"categories":758},[145],{"categories":760},[],{"categories":762},[103],{"categories":764},[],{"categories":766},[100],{"categories":768},[106],{"categories":770},[],{"categories":772},[416],{"categories":774},[60],{"categories":776},[154],{"categories":778},[161],{"categories":780},[154],{"categories":782},[106],{"categories":784},[],{"categories":786},[],{"categories":788},[106],{"categories":790},[97],{"categories":792},[106],{"categories":794},[109],{"categories":796},[100],{"categories":798},[],{"categories":800},[103],{"categories":802},[109],{"categories":804},[103],{"categories":806},[103],{"categories":808},[161],{"categories":810},[145],{"categories":812},[106],{"categories":814},[],{"categories":816},[],{"categories":818},[416],{"categories":820},[154],{"categories":822},[],{"categories":824},[106],{"categories":826},[103],{"categories":828},[145,103],{"categories":830},[97],{"categories":832},[],{"categories":834},[103],{"categories":836},[97],{"categories":838},[145],{"categories":840},[106],{"categories":842},[154],{"categories":844},[],{"categories":846},[103],{"categories":848},[],{"categories":850},[97],{"categories":852},[],{"categories":854},[106],{"categories":856},[109],{"categories":858},[103],{"categories":860},[103],{"categories":862},[145],{"categories":864},[106],{"categories":866},[416],{"categories":868},[145],{"categories":870},[106],{"categories":872},[103],{"categories":874},[103],{"categories":876},[103],{"categories":878},[124],{"categories":880},[],{"categories":882},[109],{"categories":884},[106],{"categories":886},[145],{"categories":888},[106],{"categories":890},[154],{"categories":892},[145],{"categories":894},[106],{"categories":896},[124],{"categories":898},[],{"categories":900},[103],{"categories":902},[145],{"categories":904},[103],{"categories":906},[97],{"categories":908},[124],{"categories":910},[103],{"categories":912},[161],{"categories":914},[103],{"categories":916},[103],{"categories":918},[106],{"categories":920},[106],{"categories":922},[103],{"categories":924},[106],{"categories":926},[145],{"categories":928},[103],{"categories":930},[],{"categories":932},[],{"categories":934},[154],{"categories":936},[],{"categories":938},[97],{"categories":940},[416],{"categories":942},[],{"categories":944},[97],{"categories":946},[100],{"categories":948},[161],{"categories":950},[],{"categories":952},[100],{"categories":954},[],{"categories":956},[],{"categories":958},[],{"categories":960},[],{"categories":962},[],{"categories":964},[103],{"categories":966},[106],{"categories":968},[416],{"categories":970},[97],{"categories":972},[103],{"categories":974},[154],{"categories":976},[109],{"categories":978},[103],{"categories":980},[161],{"categories":982},[103],{"categories":984},[103],{"categories":986},[103],{"categories":988},[103,97],{"categories":990},[154],{"categories":992},[154],{"categories":994},[145],{"categories":996},[103],{"categories":998},[],{"categories":1000},[],{"categories":1002},[],{"categories":1004},[154],{"categories":1006},[60],{"categories":1008},[124],{"categories":1010},[145],{"categories":1012},[],{"categories":1014},[103],{"categories":1016},[103],{"categories":1018},[],{"categories":1020},[],{"categories":1022},[106],{"categories":1024},[103],{"categories":1026},[100],{"categories":1028},[],{"categories":1030},[97],{"categories":1032},[103],{"categories":1034},[97],{"categories":1036},[103],{"categories":1038},[154],{"categories":1040},[161],{"categories":1042},[103,145],{"categories":1044},[124],{"categories":1046},[145],{"categories":1048},[],{"categories":1050},[416],{"categories":1052},[145],{"categories":1054},[106],{"categories":1056},[],{"categories":1058},[],{"categories":1060},[],{"categories":1062},[],{"categories":1064},[154],{"categories":1066},[106],{"categories":1068},[106],{"categories":1070},[103],{"categories":1072},[103],{"categories":1074},[],{"categories":1076},[145],{"categories":1078},[],{"categories":1080},[],{"categories":1082},[106],{"categories":1084},[],{"categories":1086},[],{"categories":1088},[161],{"categories":1090},[161],{"categories":1092},[106],{"categories":1094},[],{"categories":1096},[103],{"categories":1098},[103],{"categories":1100},[154],{"categories":1102},[145],{"categories":1104},[145],{"categories":1106},[106],{"categories":1108},[97],{"categories":1110},[103],{"categories":1112},[145],{"categories":1114},[145],{"categories":1116},[106],{"categories":1118},[106],{"categories":1120},[103],{"categories":1122},[],{"categories":1124},[],{"categories":1126},[103],{"categories":1128},[106],{"categories":1130},[124],{"categories":1132},[154],{"categories":1134},[97],{"categories":1136},[103],{"categories":1138},[],{"categories":1140},[106],{"categories":1142},[106],{"categories":1144},[],{"categories":1146},[97],{"categories":1148},[103],{"categories":1150},[97],{"categories":1152},[97],{"categories":1154},[],{"categories":1156},[],{"categories":1158},[106],{"categories":1160},[106],{"categories":1162},[103],{"categories":1164},[103],{"categories":1166},[124],{"categories":1168},[60],{"categories":1170},[109],{"categories":1172},[124],{"categories":1174},[145],{"categories":1176},[],{"categories":1178},[124],{"categories":1180},[],{"categories":1182},[],{"categories":1184},[],{"categories":1186},[],{"categories":1188},[154],{"categories":1190},[60],{"categories":1192},[],{"categories":1194},[103],{"categories":1196},[103],{"categories":1198},[60],{"categories":1200},[154],{"categories":1202},[],{"categories":1204},[],{"categories":1206},[106],{"categories":1208},[124],{"categories":1210},[124],{"categories":1212},[106],{"categories":1214},[97],{"categories":1216},[103,416],{"categories":1218},[],{"categories":1220},[145],{"categories":1222},[97],{"categories":1224},[106],{"categories":1226},[145],{"categories":1228},[],{"categories":1230},[106],{"categories":1232},[106],{"categories":1234},[103],{"categories":1236},[161],{"categories":1238},[154],{"categories":1240},[145],{"categories":1242},[],{"categories":1244},[106],{"categories":1246},[103],{"categories":1248},[106],{"categories":1250},[106],{"categories":1252},[106],{"categories":1254},[161],{"categories":1256},[106],{"categories":1258},[103],{"categories":1260},[],{"categories":1262},[161],{"categories":1264},[124],{"categories":1266},[106],{"categories":1268},[],{"categories":1270},[],{"categories":1272},[103],{"categories":1274},[106],{"categories":1276},[124],{"categories":1278},[106],{"categories":1280},[],{"categories":1282},[],{"categories":1284},[],{"categories":1286},[106],{"categories":1288},[],{"categories":1290},[],{"categories":1292},[60],{"categories":1294},[103],{"categories":1296},[60],{"categories":1298},[124],{"categories":1300},[103],{"categories":1302},[103],{"categories":1304},[106],{"categories":1306},[103],{"categories":1308},[],{"categories":1310},[],{"categories":1312},[416],{"categories":1314},[],{"categories":1316},[],{"categories":1318},[97],{"categories":1320},[],{"categories":1322},[],{"categories":1324},[],{"categories":1326},[],{"categories":1328},[154],{"categories":1330},[124],{"categories":1332},[161],{"categories":1334},[100],{"categories":1336},[103],{"categories":1338},[103],{"categories":1340},[100],{"categories":1342},[],{"categories":1344},[145],{"categories":1346},[106],{"categories":1348},[100],{"categories":1350},[103],{"categories":1352},[103],{"categories":1354},[97],{"categories":1356},[],{"categories":1358},[97],{"categories":1360},[103],{"categories":1362},[161],{"categories":1364},[106],{"categories":1366},[124],{"categories":1368},[100],{"categories":1370},[103],{"categories":1372},[106],{"categories":1374},[],{"categories":1376},[103],{"categories":1378},[97],{"categories":1380},[103],{"categories":1382},[],{"categories":1384},[124],{"categories":1386},[103],{"categories":1388},[],{"categories":1390},[100],{"categories":1392},[103],{"categories":1394},[],{"categories":1396},[],{"categories":1398},[],{"categories":1400},[103],{"categories":1402},[],{"categories":1404},[416],{"categories":1406},[103],{"categories":1408},[],{"categories":1410},[103],{"categories":1412},[103],{"categories":1414},[103],{"categories":1416},[103,416],{"categories":1418},[103],{"categories":1420},[103],{"categories":1422},[145],{"categories":1424},[106],{"categories":1426},[],{"categories":1428},[106],{"categories":1430},[103],{"categories":1432},[103],{"categories":1434},[103],{"categories":1436},[97],{"categories":1438},[97],{"categories":1440},[154],{"categories":1442},[145],{"categories":1444},[106],{"categories":1446},[],{"categories":1448},[103],{"categories":1450},[124],{"categories":1452},[103],{"categories":1454},[100],{"categories":1456},[],{"categories":1458},[416],{"categories":1460},[145],{"categories":1462},[145],{"categories":1464},[106],{"categories":1466},[124],{"categories":1468},[106],{"categories":1470},[103],{"categories":1472},[],{"categories":1474},[103],{"categories":1476},[],{"categories":1478},[],{"categories":1480},[103],{"categories":1482},[103],{"categories":1484},[103],{"categories":1486},[106],{"categories":1488},[103],{"categories":1490},[],{"categories":1492},[60],{"categories":1494},[106],{"categories":1496},[],{"categories":1498},[103],{"categories":1500},[124],{"categories":1502},[],{"categories":1504},[145],{"categories":1506},[416],{"categories":1508},[124],{"categories":1510},[154],{"categories":1512},[154],{"categories":1514},[124],{"categories":1516},[124],{"categories":1518},[416],{"categories":1520},[],{"categories":1522},[124],{"categories":1524},[103],{"categories":1526},[97],{"categories":1528},[124],{"categories":1530},[],{"categories":1532},[60],{"categories":1534},[124],{"categories":1536},[154],{"categories":1538},[124],{"categories":1540},[416],{"categories":1542},[103],{"categories":1544},[103],{"categories":1546},[],{"categories":1548},[100],{"categories":1550},[],{"categories":1552},[],{"categories":1554},[103],{"categories":1556},[103],{"categories":1558},[103],{"categories":1560},[103],{"categories":1562},[],{"categories":1564},[60],{"categories":1566},[97],{"categories":1568},[],{"categories":1570},[103],{"categories":1572},[103],{"categories":1574},[416],{"categories":1576},[416],{"categories":1578},[],{"categories":1580},[106],{"categories":1582},[124],{"categories":1584},[124],{"categories":1586},[103],{"categories":1588},[106],{"categories":1590},[],{"categories":1592},[145],{"categories":1594},[103],{"categories":1596},[103],{"categories":1598},[],{"categories":1600},[],{"categories":1602},[416],{"categories":1604},[103],{"categories":1606},[154],{"categories":1608},[100],{"categories":1610},[103],{"categories":1612},[],{"categories":1614},[106],{"categories":1616},[97],{"categories":1618},[97],{"categories":1620},[],{"categories":1622},[103],{"categories":1624},[145],{"categories":1626},[106],{"categories":1628},[],{"categories":1630},[103],{"categories":1632},[103],{"categories":1634},[106],{"categories":1636},[],{"categories":1638},[106],{"categories":1640},[154],{"categories":1642},[],{"categories":1644},[103],{"categories":1646},[],{"categories":1648},[103],{"categories":1650},[],{"categories":1652},[103],{"categories":1654},[103],{"categories":1656},[],{"categories":1658},[103],{"categories":1660},[124],{"categories":1662},[103],{"categories":1664},[103],{"categories":1666},[97],{"categories":1668},[103],{"categories":1670},[124],{"categories":1672},[106],{"categories":1674},[],{"categories":1676},[103],{"categories":1678},[161],{"categories":1680},[],{"categories":1682},[],{"categories":1684},[],{"categories":1686},[97],{"categories":1688},[124],{"categories":1690},[106],{"categories":1692},[103],{"categories":1694},[145],{"categories":1696},[106],{"categories":1698},[],{"categories":1700},[106],{"categories":1702},[],{"categories":1704},[103],{"categories":1706},[106],{"categories":1708},[103],{"categories":1710},[],{"categories":1712},[103],{"categories":1714},[103],{"categories":1716},[124],{"categories":1718},[145],{"categories":1720},[106],{"categories":1722},[145],{"categories":1724},[100],{"categories":1726},[],{"categories":1728},[],{"categories":1730},[103],{"categories":1732},[97],{"categories":1734},[124],{"categories":1736},[],{"categories":1738},[],{"categories":1740},[154],{"categories":1742},[145],{"categories":1744},[],{"categories":1746},[103],{"categories":1748},[],{"categories":1750},[161],{"categories":1752},[103],{"categories":1754},[416],{"categories":1756},[154],{"categories":1758},[],{"categories":1760},[106],{"categories":1762},[103],{"categories":1764},[106],{"categories":1766},[106],{"categories":1768},[103],{"categories":1770},[],{"categories":1772},[97],{"categories":1774},[103],{"categories":1776},[100],{"categories":1778},[154],{"categories":1780},[145],{"categories":1782},[],{"categories":1784},[],{"categories":1786},[],{"categories":1788},[106],{"categories":1790},[145],{"categories":1792},[124],{"categories":1794},[103],{"categories":1796},[124],{"categories":1798},[145],{"categories":1800},[],{"categories":1802},[145],{"categories":1804},[124],{"categories":1806},[100],{"categories":1808},[103],{"categories":1810},[124],{"categories":1812},[161],{"categories":1814},[],{"categories":1816},[],{"categories":1818},[60],{"categories":1820},[103,154],{"categories":1822},[124],{"categories":1824},[103],{"categories":1826},[106],{"categories":1828},[106],{"categories":1830},[103],{"categories":1832},[],{"categories":1834},[154],{"categories":1836},[103],{"categories":1838},[60],{"categories":1840},[106],{"categories":1842},[161],{"categories":1844},[416],{"categories":1846},[],{"categories":1848},[97],{"categories":1850},[106],{"categories":1852},[106],{"categories":1854},[154],{"categories":1856},[103],{"categories":1858},[103],{"categories":1860},[],{"categories":1862},[],{"categories":1864},[],{"categories":1866},[416],{"categories":1868},[124],{"categories":1870},[103],{"categories":1872},[103],{"categories":1874},[103],{"categories":1876},[],{"categories":1878},[60],{"categories":1880},[100],{"categories":1882},[],{"categories":1884},[106],{"categories":1886},[416],{"categories":1888},[],{"categories":1890},[145],{"categories":1892},[145],{"categories":1894},[],{"categories":1896},[154],{"categories":1898},[145],{"categories":1900},[103],{"categories":1902},[],{"categories":1904},[124],{"categories":1906},[103],{"categories":1908},[145],{"categories":1910},[106],{"categories":1912},[124],{"categories":1914},[],{"categories":1916},[106],{"categories":1918},[145],{"categories":1920},[103],{"categories":1922},[],{"categories":1924},[103],{"categories":1926},[103],{"categories":1928},[416],{"categories":1930},[124],{"categories":1932},[60],{"categories":1934},[60],{"categories":1936},[],{"categories":1938},[],{"categories":1940},[],{"categories":1942},[106],{"categories":1944},[154],{"categories":1946},[154],{"categories":1948},[],{"categories":1950},[],{"categories":1952},[103],{"categories":1954},[],{"categories":1956},[106],{"categories":1958},[103],{"categories":1960},[],{"categories":1962},[103],{"categories":1964},[100],{"categories":1966},[103],{"categories":1968},[161],{"categories":1970},[106],{"categories":1972},[103],{"categories":1974},[154],{"categories":1976},[124],{"categories":1978},[106],{"categories":1980},[],{"categories":1982},[124],{"categories":1984},[106],{"categories":1986},[106],{"categories":1988},[],{"categories":1990},[100],{"categories":1992},[106],{"categories":1994},[],{"categories":1996},[103],{"categories":1998},[97],{"categories":2000},[124],{"categories":2002},[416],{"categories":2004},[106],{"categories":2006},[106],{"categories":2008},[97],{"categories":2010},[103],{"categories":2012},[],{"categories":2014},[],{"categories":2016},[145],{"categories":2018},[103,100],{"categories":2020},[],{"categories":2022},[97],{"categories":2024},[60],{"categories":2026},[103],{"categories":2028},[154],{"categories":2030},[103],{"categories":2032},[106],{"categories":2034},[103],{"categories":2036},[103],{"categories":2038},[124],{"categories":2040},[106],{"categories":2042},[],{"categories":2044},[],{"categories":2046},[106],{"categories":2048},[103],{"categories":2050},[416],{"categories":2052},[],{"categories":2054},[103],{"categories":2056},[106],{"categories":2058},[],{"categories":2060},[103],{"categories":2062},[161],{"categories":2064},[60],{"categories":2066},[106],{"categories":2068},[103],{"categories":2070},[416],{"categories":2072},[],{"categories":2074},[103],{"categories":2076},[161],{"categories":2078},[145],{"categories":2080},[103],{"categories":2082},[],{"categories":2084},[161],{"categories":2086},[124],{"categories":2088},[103],{"categories":2090},[103],{"categories":2092},[97],{"categories":2094},[],{"categories":2096},[],{"categories":2098},[145],{"categories":2100},[103],{"categories":2102},[60],{"categories":2104},[161],{"categories":2106},[161],{"categories":2108},[124],{"categories":2110},[],{"categories":2112},[],{"categories":2114},[103],{"categories":2116},[],{"categories":2118},[103,154],{"categories":2120},[124],{"categories":2122},[106],{"categories":2124},[154],{"categories":2126},[103],{"categories":2128},[97],{"categories":2130},[],{"categories":2132},[],{"categories":2134},[97],{"categories":2136},[161],{"categories":2138},[103],{"categories":2140},[],{"categories":2142},[145,103],{"categories":2144},[416],{"categories":2146},[97],{"categories":2148},[],{"categories":2150},[100],{"categories":2152},[100],{"categories":2154},[103],{"categories":2156},[154],{"categories":2158},[106],{"categories":2160},[124],{"categories":2162},[161],{"categories":2164},[145],{"categories":2166},[103],{"categories":2168},[103],{"categories":2170},[103],{"categories":2172},[97],{"categories":2174},[103],{"categories":2176},[106],{"categories":2178},[124],{"categories":2180},[],{"categories":2182},[],{"categories":2184},[60],{"categories":2186},[154],{"categories":2188},[103],{"categories":2190},[145],{"categories":2192},[60],{"categories":2194},[103],{"categories":2196},[103],{"categories":2198},[106],{"categories":2200},[106],{"categories":2202},[103,100],{"categories":2204},[],{"categories":2206},[145],{"categories":2208},[],{"categories":2210},[103],{"categories":2212},[124],{"categories":2214},[97],{"categories":2216},[97],{"categories":2218},[106],{"categories":2220},[103],{"categories":2222},[100],{"categories":2224},[154],{"categories":2226},[161],{"categories":2228},[],{"categories":2230},[124],{"categories":2232},[103],{"categories":2234},[103],{"categories":2236},[124],{"categories":2238},[154],{"categories":2240},[103],{"categories":2242},[106],{"categories":2244},[124],{"categories":2246},[103],{"categories":2248},[145],{"categories":2250},[103],{"categories":2252},[103],{"categories":2254},[416],{"categories":2256},[109],{"categories":2258},[106],{"categories":2260},[103],{"categories":2262},[124],{"categories":2264},[106],{"categories":2266},[161],{"categories":2268},[103],{"categories":2270},[],{"categories":2272},[103],{"categories":2274},[],{"categories":2276},[],{"categories":2278},[],{"categories":2280},[100],{"categories":2282},[103],{"categories":2284},[106],{"categories":2286},[124],{"categories":2288},[124],{"categories":2290},[124],{"categories":2292},[124],{"categories":2294},[],{"categories":2296},[97],{"categories":2298},[106],{"categories":2300},[124],{"categories":2302},[97],{"categories":2304},[106],{"categories":2306},[103],{"categories":2308},[103,106],{"categories":2310},[106],{"categories":2312},[416],{"categories":2314},[124],{"categories":2316},[124],{"categories":2318},[106],{"categories":2320},[103],{"categories":2322},[],{"categories":2324},[124],{"categories":2326},[161],{"categories":2328},[97],{"categories":2330},[103],{"categories":2332},[103],{"categories":2334},[],{"categories":2336},[154],{"categories":2338},[],{"categories":2340},[97],{"categories":2342},[106],{"categories":2344},[124],{"categories":2346},[103],{"categories":2348},[124],{"categories":2350},[97],{"categories":2352},[124],{"categories":2354},[124],{"categories":2356},[],{"categories":2358},[100],{"categories":2360},[106],{"categories":2362},[124],{"categories":2364},[124],{"categories":2366},[124],{"categories":2368},[124],{"categories":2370},[124],{"categories":2372},[124],{"categories":2374},[124],{"categories":2376},[124],{"categories":2378},[124],{"categories":2380},[124],{"categories":2382},[60],{"categories":2384},[97],{"categories":2386},[103],{"categories":2388},[103],{"categories":2390},[],{"categories":2392},[103,97],{"categories":2394},[],{"categories":2396},[106],{"categories":2398},[124],{"categories":2400},[106],{"categories":2402},[103],{"categories":2404},[103],{"categories":2406},[103],{"categories":2408},[103],{"categories":2410},[103],{"categories":2412},[106],{"categories":2414},[100],{"categories":2416},[145],{"categories":2418},[124],{"categories":2420},[103],{"categories":2422},[],{"categories":2424},[],{"categories":2426},[106],{"categories":2428},[145],{"categories":2430},[103],{"categories":2432},[],{"categories":2434},[],{"categories":2436},[161],{"categories":2438},[103],{"categories":2440},[],{"categories":2442},[],{"categories":2444},[97],{"categories":2446},[100],{"categories":2448},[103],{"categories":2450},[100],{"categories":2452},[145],{"categories":2454},[],{"categories":2456},[124],{"categories":2458},[],{"categories":2460},[145],{"categories":2462},[103],{"categories":2464},[161],{"categories":2466},[],{"categories":2468},[161],{"categories":2470},[],{"categories":2472},[],{"categories":2474},[106],{"categories":2476},[],{"categories":2478},[100],{"categories":2480},[97],{"categories":2482},[145],{"categories":2484},[154],{"categories":2486},[],{"categories":2488},[],{"categories":2490},[103],{"categories":2492},[97],{"categories":2494},[161],{"categories":2496},[],{"categories":2498},[106],{"categories":2500},[106],{"categories":2502},[124],{"categories":2504},[103],{"categories":2506},[106],{"categories":2508},[103],{"categories":2510},[106],{"categories":2512},[103],{"categories":2514},[109],{"categories":2516},[124],{"categories":2518},[],{"categories":2520},[161],{"categories":2522},[154],{"categories":2524},[106],{"categories":2526},[],{"categories":2528},[103],{"categories":2530},[106],{"categories":2532},[100],{"categories":2534},[97],{"categories":2536},[103],{"categories":2538},[145],{"categories":2540},[154],{"categories":2542},[154],{"categories":2544},[103],{"categories":2546},[60],{"categories":2548},[103],{"categories":2550},[106],{"categories":2552},[100],{"categories":2554},[106],{"categories":2556},[103],{"categories":2558},[103],{"categories":2560},[106],{"categories":2562},[124],{"categories":2564},[],{"categories":2566},[97],{"categories":2568},[103],{"categories":2570},[106],{"categories":2572},[103],{"categories":2574},[103],{"categories":2576},[],{"categories":2578},[145],{"categories":2580},[100],{"categories":2582},[124],{"categories":2584},[103],{"categories":2586},[103],{"categories":2588},[145],{"categories":2590},[161],{"categories":2592},[60],{"categories":2594},[103],{"categories":2596},[124],{"categories":2598},[103],{"categories":2600},[106],{"categories":2602},[416],{"categories":2604},[103],{"categories":2606},[106],{"categories":2608},[60],{"categories":2610},[],{"categories":2612},[106],{"categories":2614},[154],{"categories":2616},[145],{"categories":2618},[103],{"categories":2620},[97],{"categories":2622},[100],{"categories":2624},[154],{"categories":2626},[],{"categories":2628},[106],{"categories":2630},[103],{"categories":2632},[],{"categories":2634},[124],{"categories":2636},[],{"categories":2638},[124],{"categories":2640},[103],{"categories":2642},[106],{"categories":2644},[106],{"categories":2646},[106],{"categories":2648},[],{"categories":2650},[],{"categories":2652},[103],{"categories":2654},[103],{"categories":2656},[],{"categories":2658},[145],{"categories":2660},[106],{"categories":2662},[161],{"categories":2664},[97],{"categories":2666},[],{"categories":2668},[],{"categories":2670},[124],{"categories":2672},[154],{"categories":2674},[103],{"categories":2676},[103],{"categories":2678},[103],{"categories":2680},[154],{"categories":2682},[124],{"categories":2684},[145],{"categories":2686},[103],{"categories":2688},[103],{"categories":2690},[103],{"categories":2692},[124],{"categories":2694},[103],{"categories":2696},[124],{"categories":2698},[106],{"categories":2700},[106],{"categories":2702},[154],{"categories":2704},[106],{"categories":2706},[103],{"categories":2708},[154],{"categories":2710},[145],{"categories":2712},[],{"categories":2714},[106],{"categories":2716},[],{"categories":2718},[],{"categories":2720},[100],{"categories":2722},[103],{"categories":2724},[106],{"categories":2726},[97],{"categories":2728},[106],{"categories":2730},[161],{"categories":2732},[],{"categories":2734},[106],{"categories":2736},[],{"categories":2738},[97],{"categories":2740},[106],{"categories":2742},[],{"categories":2744},[106],{"categories":2746},[103],{"categories":2748},[124],{"categories":2750},[103],{"categories":2752},[106],{"categories":2754},[124],{"categories":2756},[106],{"categories":2758},[154],{"categories":2760},[145],{"categories":2762},[97],{"categories":2764},[],{"categories":2766},[106],{"categories":2768},[145],{"categories":2770},[124],{"categories":2772},[103],{"categories":2774},[145],{"categories":2776},[97],{"categories":2778},[],{"categories":2780},[106],{"categories":2782},[106],{"categories":2784},[103],{"categories":2786},[],{"categories":2788},[106],{"categories":2790},[109],{"categories":2792},[124],{"categories":2794},[106],{"categories":2796},[100],{"categories":2798},[],{"categories":2800},[103],{"categories":2802},[109],{"categories":2804},[103],{"categories":2806},[106],{"categories":2808},[124],{"categories":2810},[97],{"categories":2812},[416],{"categories":2814},[103],{"categories":2816},[103],{"categories":2818},[103],{"categories":2820},[124],{"categories":2822},[100],{"categories":2824},[103],{"categories":2826},[145],{"categories":2828},[124],{"categories":2830},[416],{"categories":2832},[103],{"categories":2834},[],{"categories":2836},[],{"categories":2838},[416],{"categories":2840},[60],{"categories":2842},[106],{"categories":2844},[106],{"categories":2846},[124],{"categories":2848},[103],{"categories":2850},[97],{"categories":2852},[145],{"categories":2854},[106],{"categories":2856},[103],{"categories":2858},[161],{"categories":2860},[103],{"categories":2862},[106],{"categories":2864},[],{"categories":2866},[103],{"categories":2868},[103],{"categories":2870},[124],{"categories":2872},[97],{"categories":2874},[],{"categories":2876},[103],{"categories":2878},[103],{"categories":2880},[154],{"categories":2882},[145],{"categories":2884},[103,106],{"categories":2886},[161,100],{"categories":2888},[103],{"categories":2890},[],{"categories":2892},[106],{"categories":2894},[],{"categories":2896},[154],{"categories":2898},[103],{"categories":2900},[124],{"categories":2902},[],{"categories":2904},[106],{"categories":2906},[],{"categories":2908},[106],{"categories":2910},[97],{"categories":2912},[106],{"categories":2914},[103],{"categories":2916},[416],{"categories":2918},[161],{"categories":2920},[100],{"categories":2922},[100],{"categories":2924},[97],{"categories":2926},[97],{"categories":2928},[103],{"categories":2930},[106],{"categories":2932},[103],{"categories":2934},[103],{"categories":2936},[97],{"categories":2938},[103],{"categories":2940},[161],{"categories":2942},[124],{"categories":2944},[103],{"categories":2946},[106],{"categories":2948},[103],{"categories":2950},[],{"categories":2952},[154],{"categories":2954},[],{"categories":2956},[106],{"categories":2958},[97],{"categories":2960},[],{"categories":2962},[416],{"categories":2964},[103],{"categories":2966},[],{"categories":2968},[124],{"categories":2970},[106],{"categories":2972},[154],{"categories":2974},[103],{"categories":2976},[106],{"categories":2978},[154],{"categories":2980},[106],{"categories":2982},[124],{"categories":2984},[97],{"categories":2986},[124],{"categories":2988},[154],{"categories":2990},[103],{"categories":2992},[145],{"categories":2994},[103],{"categories":2996},[103],{"categories":2998},[103],{"categories":3000},[103],{"categories":3002},[106],{"categories":3004},[103],{"categories":3006},[106],{"categories":3008},[103],{"categories":3010},[97],{"categories":3012},[103],{"categories":3014},[106],{"categories":3016},[145],{"categories":3018},[97],{"categories":3020},[106],{"categories":3022},[145],{"categories":3024},[],{"categories":3026},[103],{"categories":3028},[103],{"categories":3030},[154],{"categories":3032},[],{"categories":3034},[106],{"categories":3036},[161],{"categories":3038},[103],{"categories":3040},[124],{"categories":3042},[161],{"categories":3044},[106],{"categories":3046},[100],{"categories":3048},[100],{"categories":3050},[103],{"categories":3052},[97],{"categories":3054},[],{"categories":3056},[103],{"categories":3058},[],{"categories":3060},[97],{"categories":3062},[103],{"categories":3064},[106],{"categories":3066},[106],{"categories":3068},[],{"categories":3070},[154],{"categories":3072},[154],{"categories":3074},[161],{"categories":3076},[145],{"categories":3078},[],{"categories":3080},[103],{"categories":3082},[97],{"categories":3084},[103],{"categories":3086},[154],{"categories":3088},[97],{"categories":3090},[124],{"categories":3092},[124],{"categories":3094},[],{"categories":3096},[124],{"categories":3098},[106],{"categories":3100},[145],{"categories":3102},[60],{"categories":3104},[103],{"categories":3106},[],{"categories":3108},[124],{"categories":3110},[154],{"categories":3112},[100],{"categories":3114},[103],{"categories":3116},[97],{"categories":3118},[416],{"categories":3120},[97],{"categories":3122},[],{"categories":3124},[],{"categories":3126},[124],{"categories":3128},[],{"categories":3130},[106],{"categories":3132},[106],{"categories":3134},[106],{"categories":3136},[],{"categories":3138},[103],{"categories":3140},[],{"categories":3142},[124],{"categories":3144},[97],{"categories":3146},[145],{"categories":3148},[103],{"categories":3150},[124],{"categories":3152},[124],{"categories":3154},[],{"categories":3156},[124],{"categories":3158},[97],{"categories":3160},[103],{"categories":3162},[],{"categories":3164},[106],{"categories":3166},[106],{"categories":3168},[97],{"categories":3170},[],{"categories":3172},[],{"categories":3174},[],{"categories":3176},[145],{"categories":3178},[106],{"categories":3180},[103],{"categories":3182},[],{"categories":3184},[],{"categories":3186},[],{"categories":3188},[145],{"categories":3190},[],{"categories":3192},[97],{"categories":3194},[],{"categories":3196},[],{"categories":3198},[145],{"categories":3200},[103],{"categories":3202},[124],{"categories":3204},[],{"categories":3206},[161],{"categories":3208},[124],{"categories":3210},[161],{"categories":3212},[103],{"categories":3214},[],{"categories":3216},[],{"categories":3218},[106],{"categories":3220},[],{"categories":3222},[],{"categories":3224},[106],{"categories":3226},[103],{"categories":3228},[],{"categories":3230},[106],{"categories":3232},[124],{"categories":3234},[161],{"categories":3236},[60],{"categories":3238},[106],{"categories":3240},[106],{"categories":3242},[],{"categories":3244},[],{"categories":3246},[],{"categories":3248},[124],{"categories":3250},[],{"categories":3252},[],{"categories":3254},[145],{"categories":3256},[97],{"categories":3258},[],{"categories":3260},[100],{"categories":3262},[161],{"categories":3264},[103],{"categories":3266},[154],{"categories":3268},[97],{"categories":3270},[60],{"categories":3272},[100],{"categories":3274},[154],{"categories":3276},[],{"categories":3278},[],{"categories":3280},[106],{"categories":3282},[97],{"categories":3284},[145],{"categories":3286},[97],{"categories":3288},[106],{"categories":3290},[416],{"categories":3292},[106],{"categories":3294},[],{"categories":3296},[103],{"categories":3298},[124],{"categories":3300},[154],{"categories":3302},[],{"categories":3304},[145],{"categories":3306},[124],{"categories":3308},[97],{"categories":3310},[106],{"categories":3312},[103],{"categories":3314},[100],{"categories":3316},[106,416],{"categories":3318},[106],{"categories":3320},[154],{"categories":3322},[103],{"categories":3324},[60],{"categories":3326},[161],{"categories":3328},[106],{"categories":3330},[],{"categories":3332},[106],{"categories":3334},[103],{"categories":3336},[100],{"categories":3338},[],{"categories":3340},[],{"categories":3342},[103],{"categories":3344},[60],{"categories":3346},[103],{"categories":3348},[],{"categories":3350},[124],{"categories":3352},[],{"categories":3354},[124],{"categories":3356},[154],{"categories":3358},[106],{"categories":3360},[103],{"categories":3362},[161],{"categories":3364},[154],{"categories":3366},[],{"categories":3368},[124],{"categories":3370},[103],{"categories":3372},[],{"categories":3374},[103],{"categories":3376},[106],{"categories":3378},[103],{"categories":3380},[106],{"categories":3382},[103],{"categories":3384},[103],{"categories":3386},[103],{"categories":3388},[103],{"categories":3390},[100],{"categories":3392},[],{"categories":3394},[109],{"categories":3396},[124],{"categories":3398},[103],{"categories":3400},[],{"categories":3402},[154],{"categories":3404},[103],{"categories":3406},[103],{"categories":3408},[106],{"categories":3410},[124],{"categories":3412},[103],{"categories":3414},[103],{"categories":3416},[100],{"categories":3418},[106],{"categories":3420},[145],{"categories":3422},[],{"categories":3424},[60],{"categories":3426},[103],{"categories":3428},[],{"categories":3430},[124],{"categories":3432},[161],{"categories":3434},[],{"categories":3436},[],{"categories":3438},[124],{"categories":3440},[124],{"categories":3442},[161],{"categories":3444},[97],{"categories":3446},[106],{"categories":3448},[106],{"categories":3450},[103],{"categories":3452},[100],{"categories":3454},[],{"categories":3456},[],{"categories":3458},[124],{"categories":3460},[60],{"categories":3462},[154],{"categories":3464},[106],{"categories":3466},[145],{"categories":3468},[60],{"categories":3470},[60],{"categories":3472},[],{"categories":3474},[124],{"categories":3476},[103],{"categories":3478},[103],{"categories":3480},[154],{"categories":3482},[],{"categories":3484},[124],{"categories":3486},[124],{"categories":3488},[124],{"categories":3490},[],{"categories":3492},[106],{"categories":3494},[103],{"categories":3496},[],{"categories":3498},[97],{"categories":3500},[100],{"categories":3502},[],{"categories":3504},[103],{"categories":3506},[103],{"categories":3508},[],{"categories":3510},[154],{"categories":3512},[],{"categories":3514},[],{"categories":3516},[],{"categories":3518},[],{"categories":3520},[103],{"categories":3522},[124],{"categories":3524},[],{"categories":3526},[],{"categories":3528},[103],{"categories":3530},[103],{"categories":3532},[103],{"categories":3534},[60],{"categories":3536},[103],{"categories":3538},[60],{"categories":3540},[],{"categories":3542},[60],{"categories":3544},[60],{"categories":3546},[416],{"categories":3548},[106],{"categories":3550},[154],{"categories":3552},[],{"categories":3554},[],{"categories":3556},[60],{"categories":3558},[154],{"categories":3560},[154],{"categories":3562},[154],{"categories":3564},[],{"categories":3566},[97],{"categories":3568},[154],{"categories":3570},[154],{"categories":3572},[97],{"categories":3574},[154],{"categories":3576},[100],{"categories":3578},[154],{"categories":3580},[154],{"categories":3582},[154],{"categories":3584},[60],{"categories":3586},[124],{"categories":3588},[124],{"categories":3590},[103],{"categories":3592},[154],{"categories":3594},[60],{"categories":3596},[416],{"categories":3598},[60],{"categories":3600},[60],{"categories":3602},[60],{"categories":3604},[],{"categories":3606},[100],{"categories":3608},[],{"categories":3610},[416],{"categories":3612},[154],{"categories":3614},[154],{"categories":3616},[154],{"categories":3618},[106],{"categories":3620},[124,100],{"categories":3622},[60],{"categories":3624},[],{"categories":3626},[],{"categories":3628},[60],{"categories":3630},[],{"categories":3632},[60],{"categories":3634},[124],{"categories":3636},[106],{"categories":3638},[],{"categories":3640},[154],{"categories":3642},[103],{"categories":3644},[145],{"categories":3646},[],{"categories":3648},[103],{"categories":3650},[],{"categories":3652},[124],{"categories":3654},[97],{"categories":3656},[60],{"categories":3658},[],{"categories":3660},[154],{"categories":3662},[124],[3664,3893,4137,4269],{"id":3665,"title":3666,"ai":3667,"body":3672,"categories":3867,"created_at":61,"date_modified":61,"description":53,"extension":62,"faq":61,"featured":63,"kicker_label":61,"meta":3868,"navigation":75,"path":3880,"published_at":3881,"question":61,"scraped_at":3882,"seo":3883,"sitemap":3884,"source_id":3885,"source_name":3886,"source_type":83,"source_url":3887,"stem":3888,"tags":3889,"thumbnail_url":61,"tldr":3890,"tweet":61,"unknown_tags":3891,"__hash__":3892},"summaries\u002Fsummaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary.md","skfolio: Build & Tune Portfolio Optimizers in Python",{"provider":7,"model":8,"input_tokens":3668,"output_tokens":3669,"processing_time_ms":3670,"cost_usd":3671},9292,2519,30098,0.00309525,{"type":14,"value":3673,"toc":3861},[3674,3678,3710,3714,3763,3767,3832,3836],[17,3675,3677],{"id":3676},"data-prep-and-baseline-benchmarks-deliver-quick-wins","Data Prep and Baseline Benchmarks Deliver Quick Wins",[22,3679,3680,3681,3685,3686,3689,3690,3693,3694,3697,3698,3701,3702,3705,3706,3709],{},"Load S&P 500 prices via ",[3682,3683,3684],"code",{},"skfolio.datasets.load_sp500_dataset()",", convert to returns with ",[3682,3687,3688],{},"prices_to_returns()",", and split chronologically (",[3682,3691,3692],{},"train_test_split(shuffle=False, test_size=0.33)",") to prevent look-ahead bias—training spans ~67% historical days, testing the rest. Baselines like ",[3682,3695,3696],{},"EqualWeighted()",", ",[3682,3699,3700],{},"InverseVolatility()",", and ",[3682,3703,3704],{},"Random()"," fit on train, predict on test, yielding metrics like annualized Sharpe (printed via ",[3682,3707,3708],{},"ptf.annualized_sharpe_ratio","), mean return, and volatility. These expose naive strategies' flaws: equal-weight ignores volatility, random adds noise—use them to benchmark any optimizer.",[17,3711,3713],{"id":3712},"mean-variance-risk-measures-and-clustering-beat-baselines","Mean-Variance, Risk Measures, and Clustering Beat Baselines",[22,3715,3716,3719,3720,3723,3724,3727,3728,3731,3732,3697,3735,3738,3739,3742,3743,3746,3747,3750,3751,3754,3755,3758,3759,3762],{},[3682,3717,3718],{},"MeanRisk(risk_measure=RiskMeasure.VARIANCE)"," minimizes variance or maximizes Sharpe (",[3682,3721,3722],{},"ObjectiveFunction.MAXIMIZE_RATIO","), generating efficient frontiers (",[3682,3725,3726],{},"efficient_frontier_size=20",") plotted by risk vs. Sharpe. Swap risks to ",[3682,3729,3730],{},"CVaR"," (95%), ",[3682,3733,3734],{},"SEMI_VARIANCE",[3682,3736,3737],{},"CDAR",", or ",[3682,3740,3741],{},"MAX_DRAWDOWN"," for tail-focused portfolios that cut CVaR@95% and max drawdown vs. variance. ",[3682,3744,3745],{},"RiskBudgeting()"," equalizes contributions (variance or CVaR). Hierarchical methods shine: ",[3682,3748,3749],{},"HierarchicalRiskParity()"," clusters assets via dendrograms for stable weights; ",[3682,3752,3753],{},"NestedClustersOptimization()"," nests ",[3682,3756,3757],{},"MeanRisk(CVAR)"," inside ",[3682,3760,3761],{},"RiskBudgeting(VARIANCE)"," with 5-fold CV, capturing correlations without covariance pitfalls.",[17,3764,3766],{"id":3765},"robust-priors-constraints-and-views-stabilize-real-world-use","Robust Priors, Constraints, and Views Stabilize Real-World Use",[22,3768,3769,3770,3773,3774,3777,3778,3697,3781,3697,3784,3738,3787,3790,3791,3794,3795,3697,3798,3697,3801,3697,3804,3807,3808,3811,3812,3815,3816,3819,3820,3823,3824,3827,3828,3831],{},"Replace ",[3682,3771,3772],{},"EmpiricalCovariance()","\u002F",[3682,3775,3776],{},"EmpiricalMu()"," with ",[3682,3779,3780],{},"DenoiseCovariance()",[3682,3782,3783],{},"ShrunkMu()",[3682,3785,3786],{},"GerberCovariance()",[3682,3788,3789],{},"EWMu(alpha=0.1)"," in ",[3682,3792,3793],{},"EmpiricalPrior()"," for max-Sharpe portfolios resilient to estimation error. Add realism via ",[3682,3796,3797],{},"min_weights=0.0",[3682,3799,3800],{},"max_weights=0.20",[3682,3802,3803],{},"transaction_costs=0.0005",[3682,3805,3806],{},"groups"," (e.g., GroupA \u003C=0.6, GroupB>=0.2), ",[3682,3809,3810],{},"l2_coef=0.01",". ",[3682,3813,3814],{},"BlackLitterman(views=[\"AAPL == 0.0008\", \"JPM - BAC == 0.0002\"])"," blends market priors with views. ",[3682,3817,3818],{},"FactorModel()"," on ",[3682,3821,3822],{},"load_factors_dataset()"," explains returns via external factors, boosting Sharpe. Pipelines like ",[3682,3825,3826],{},"SelectKExtremes(k=8)"," + ",[3682,3829,3830],{},"MeanRisk()"," prune to top performers.",[17,3833,3835],{"id":3834},"walk-forward-cv-and-tuning-ensure-out-of-sample-performance","Walk-Forward CV and Tuning Ensure Out-of-Sample Performance",[22,3837,3838,3777,3841,3844,3845,3848,3849,3852,3853,3856,3857,3860],{},[3682,3839,3840],{},"cross_val_predict()",[3682,3842,3843],{},"WalkForward(train_size=252*2, test_size=63)"," simulates rolling 2-year trains\u002F3-month tests, computing portfolio Sharpe\u002FCalmar. ",[3682,3846,3847],{},"GridSearchCV()"," tunes ",[3682,3850,3851],{},"l2_coef=[0.0,0.01,0.1]"," and ",[3682,3854,3855],{},"mu_estimator__alpha=[0.05,0.1,0.2,0.5]"," on max-Sharpe, selecting best CV Sharpe. Final ",[3682,3858,3859],{},"Population()"," of 18 strategies compares annualized mean\u002Fvol\u002FSharpe\u002FSortino\u002FCVaR@95%\u002Fdrawdowns (sorted by test Sharpe), with plots for cumulative returns, weights, risk contributions—revealing hierarchical\u002Frisk-parity often top variance-based in stability.",{"title":53,"searchDepth":54,"depth":54,"links":3862},[3863,3864,3865,3866],{"id":3676,"depth":54,"text":3677},{"id":3712,"depth":54,"text":3713},{"id":3765,"depth":54,"text":3766},{"id":3834,"depth":54,"text":3835},[60],{"content_references":3869,"triage":3878},[3870,3874],{"type":3871,"title":3872,"url":3873,"context":69},"tool","skfolio","https:\u002F\u002Fgithub.com\u002Fskfolio\u002Fskfolio",{"type":3875,"title":3876,"url":3877,"context":69},"other","Full Codes","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002Fportfolio_optimization_with_skfolio_Marktechpost.ipynb",{"relevance":71,"novelty":71,"quality":72,"actionability":72,"composite":73,"reasoning":3879},"Category: Data Science & Visualization. The article provides a practical guide on using the skfolio library for portfolio optimization, which aligns with the audience's interest in actionable AI and data science tools. It includes specific code examples and methodologies that can be directly applied, making it useful for developers looking to implement AI in financial products.","\u002Fsummaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary","2026-05-12 07:05:02","2026-05-12 15:01:25",{"title":3666,"description":53},{"loc":3880},"ff126f8e0954389e","MarkTechPost","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Fa-coding-implementation-to-portfolio-optimization-with-skfolio-for-building-testing-tuning-and-comparing-modern-investment-strategies\u002F","summaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary",[90,89,87],"skfolio's scikit-learn API lets you construct, validate, and compare 18+ portfolio strategies—from baselines to HRP, Black-Litterman, factors, and tuned models—on S&P 500 returns with walk-forward CV and GridSearchCV.",[],"s9QUFNF_HWzNZV61Dh6PEETN3C3-K3FsZalb0rd3HRQ",{"id":3894,"title":3895,"ai":3896,"body":3901,"categories":4108,"created_at":61,"date_modified":61,"description":53,"extension":62,"faq":61,"featured":63,"kicker_label":61,"meta":4109,"navigation":75,"path":4125,"published_at":4126,"question":61,"scraped_at":4127,"seo":4128,"sitemap":4129,"source_id":4130,"source_name":3886,"source_type":83,"source_url":4131,"stem":4132,"tags":4133,"thumbnail_url":61,"tldr":4134,"tweet":61,"unknown_tags":4135,"__hash__":4136},"summaries\u002Fsummaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary.md","Scanpy Pipeline for PBMC scRNA-seq Clustering & Trajectories",{"provider":7,"model":8,"input_tokens":3897,"output_tokens":3898,"processing_time_ms":3899,"cost_usd":3900},9209,2235,26831,0.0029368,{"type":14,"value":3902,"toc":4102},[3903,3907,3939,3965,3969,3992,4008,4012,4035,4053,4057,4088],[17,3904,3906],{"id":3905},"rigorous-qc-and-filtering-removes-noise-for-reliable-downstream-analysis","Rigorous QC and Filtering Removes Noise for Reliable Downstream Analysis",[22,3908,3909,3910,3913,3914,3917,3918,3921,3922,3925,3926,3929,3930,3697,3933,3697,3936,3938],{},"Load PBMC-3k via ",[3682,3911,3912],{},"sc.datasets.pbmc3k()"," (2700 cells, ~2k genes\u002Fcell). Compute QC metrics for mitochondrial (",[3682,3915,3916],{},"MT-"," prefix, filter \u003C5% ",[3682,3919,3920],{},"pct_counts_mt",") and ribosomal (",[3682,3923,3924],{},"RPS\u002FRPL",") genes using ",[3682,3927,3928],{},"sc.pp.calculate_qc_metrics",". Visualize with violin plots (",[3682,3931,3932],{},"n_genes_by_counts",[3682,3934,3935],{},"total_counts",[3682,3937,3920],{},") and scatters to spot outliers.",[22,3940,3941,3942,3697,3945,3948,3949,3952,3953,3956,3957,3960,3961,3964],{},"Filter: ",[3682,3943,3944],{},"min_genes=200",[3682,3946,3947],{},"min_cells=3",", upper ",[3682,3950,3951],{},"n_genes_by_counts \u003C2500",". Detect doublets via ",[3682,3954,3955],{},"sc.pp.scrublet"," (removes ~sum of ",[3682,3958,3959],{},"predicted_doublet","). Preserve raw in ",[3682,3962,3963],{},"layers[\"counts\"]",". This yields cleaner data, preventing artifacts in clustering.",[17,3966,3968],{"id":3967},"normalization-hvgs-and-cell-cycle-correction-focus-on-biological-signal","Normalization, HVGs, and Cell-Cycle Correction Focus on Biological Signal",[22,3970,3971,3972,3975,3976,3979,3980,3983,3984,3987,3988,3991],{},"Normalize to 10k counts (",[3682,3973,3974],{},"sc.pp.normalize_total(target_sum=1e4)","), log-transform (",[3682,3977,3978],{},"sc.pp.log1p","). Identify highly variable genes (",[3682,3981,3982],{},"sc.pp.highly_variable_genes(min_mean=0.0125, max_mean=3, min_disp=0.5)","), subset to them (",[3682,3985,3986],{},"adata = adata[:, adata.var.highly_variable]","). Store raw in ",[3682,3989,3990],{},"adata.raw",".",[22,3993,3994,3995,3697,3997,3999,4000,4003,4004,4007],{},"Score S\u002FG2M phases with 40+ predefined markers (e.g., S: MCM5,PCNA; G2M: HMGB2,CDK1, filter to dataset genes). Regress out ",[3682,3996,3935],{},[3682,3998,3920],{}," (",[3682,4001,4002],{},"sc.pp.regress_out","). Scale (",[3682,4005,4006],{},"sc.pp.scale(max_value=10)","). These steps isolate biological variance, regressing technical noise for accurate modeling.",[17,4009,4011],{"id":4010},"dimensionality-reduction-leiden-clustering-and-marker-based-annotation-reveals-cell-types","Dimensionality Reduction, Leiden Clustering, and Marker-Based Annotation Reveals Cell Types",[22,4013,4014,4015,4018,4019,4022,4023,4026,4027,4030,4031,4034],{},"PCA (",[3682,4016,4017],{},"sc.tl.pca(svd_solver=\"arpack\")",", check ",[3682,4020,4021],{},"n_pcs=50"," variance). Neighbors (",[3682,4024,4025],{},"sc.pp.neighbors(n_neighbors=10, n_pcs=40)","). Embeddings: UMAP (",[3682,4028,4029],{},"sc.tl.umap","), t-SNE (",[3682,4032,4033],{},"sc.tl.tsne(n_pcs=40)",").",[22,4036,4037,4038,4041,4042,4045,4046,3697,4049,4052],{},"Cluster with Leiden (",[3682,4039,4040],{},"sc.tl.leiden(resolution=0.5, flavor=\"igraph\", n_iterations=2)","). Rank markers (",[3682,4043,4044],{},"sc.tl.rank_genes_groups(method=\"wilcoxon\")",", top 10\u002Fcluster via Wilcoxon). Annotate using PBMC markers: B-cell (CD79A,MS4A1), CD8 T (CD8A,CD8B), CD4 T (IL7R,CD4), NK (GNLY,NKG7), CD14 Mono (CD14,LYZ), FCGR3A Mono (FCGR3A,MS4A7), Dendritic (FCER1A,CST3), Mega (PPBP). Confirm via ",[3682,4047,4048],{},"sc.pl.dotplot",[3682,4050,4051],{},"sc.pl.stacked_violin(groupby=\"leiden\")",". Visualizes 8-9 clusters matching immune subsets.",[17,4054,4056],{"id":4055},"paga-trajectories-pseudotime-and-custom-scores-enable-developmental-insights","PAGA Trajectories, Pseudotime, and Custom Scores Enable Developmental Insights",[22,4058,4059,4060,4063,4064,4067,4068,4071,4072,4075,4076,4079,4080,4083,4084,4087],{},"Graph-based trajectories: ",[3682,4061,4062],{},"sc.tl.paga(groups=\"leiden\")",", threshold=0.1, init UMAP (",[3682,4065,4066],{},"sc.tl.umap(init_pos=\"paga\")","). Diffusion maps (",[3682,4069,4070],{},"sc.tl.diffmap","), recompute neighbors on ",[3682,4073,4074],{},"X_diffmap",", root at cluster 0 (",[3682,4077,4078],{},"adata.uns[\"iroot\"]","), pseudotime (",[3682,4081,4082],{},"sc.tl.dpt","). Plot ",[3682,4085,4086],{},"dpt_pseudotime"," on UMAP.",[22,4089,4090,4091,3697,4094,4097,4098,4101],{},"Custom score: IFN-response genes (ISG15,IFI6,IFIT1,IFIT3,MX1,OAS1,STAT1,IRF7) via ",[3682,4092,4093],{},"sc.tl.score_genes(score_name=\"IFN_score\")",[3682,4095,4096],{},"cmap=\"viridis\"",". Save full AnnData (",[3682,4099,4100],{},"adata.write(\"pbmc3k_analyzed.h5ad\")",") with embeddings, clusters, scores for reuse. Extends basic clustering to infer progression and response states.",{"title":53,"searchDepth":54,"depth":54,"links":4103},[4104,4105,4106,4107],{"id":3905,"depth":54,"text":3906},{"id":3967,"depth":54,"text":3968},{"id":4010,"depth":54,"text":4011},{"id":4055,"depth":54,"text":4056},[60],{"content_references":4110,"triage":4122},[4111,4114,4116,4118],{"type":3871,"title":4112,"url":4113,"context":69},"Scanpy","https:\u002F\u002Fgithub.com\u002Fscverse\u002Fscanpy",{"type":67,"title":4115,"context":69},"PBMC-3k",{"type":3871,"title":4117,"context":69},"Scrublet",{"type":3875,"title":4119,"url":4120,"context":4121},"Full Codes with Notebook","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002Fscanpy_pbmc3k_single_cell_rnaseq_analysis_Marktechpost.ipynb","recommended",{"relevance":71,"novelty":54,"quality":72,"actionability":71,"composite":4123,"reasoning":4124},3.05,"Category: Data Science & Visualization. The article provides a detailed overview of building a single-cell RNA-seq analysis pipeline using Scanpy, which is relevant for data scientists working with biological data. However, it primarily focuses on a specific use case without broader implications or insights that could apply to a wider audience.","\u002Fsummaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary","2026-05-08 21:32:12","2026-05-09 15:37:24",{"title":3895,"description":53},{"loc":4125},"a59df2d47dafe018","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F08\u002Fhow-to-build-a-single-cell-rna-seq-analysis-pipeline-with-scanpy-for-pbmc-clustering-annotation-and-trajectory-discovery\u002F","summaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary",[89,87,90],"Process PBMC-3k data with Scanpy: filter cells (min 200 genes, \u003C2500 genes, \u003C5% mt), remove Scrublet doublets, select HVGs (min_mean=0.0125, max_mean=3, min_disp=0.5), Leiden cluster at res=0.5, annotate via markers, infer PAGA\u002FDPT trajectories, score IFN response.",[],"jTCku7xsp8M-LiBcwiNLzHzB68G5RjE-UBMIb_cET-c",{"id":4138,"title":4139,"ai":4140,"body":4145,"categories":4246,"created_at":61,"date_modified":61,"description":53,"extension":62,"faq":61,"featured":63,"kicker_label":61,"meta":4247,"navigation":75,"path":4257,"published_at":4258,"question":61,"scraped_at":4259,"seo":4260,"sitemap":4261,"source_id":4262,"source_name":3886,"source_type":83,"source_url":4263,"stem":4264,"tags":4265,"thumbnail_url":61,"tldr":4266,"tweet":61,"unknown_tags":4267,"__hash__":4268},"summaries\u002Fsummaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with--summary.md","TabPFN Beats Tree Models on Tabular Accuracy with Zero Training",{"provider":7,"model":8,"input_tokens":4141,"output_tokens":4142,"processing_time_ms":4143,"cost_usd":4144},9215,1914,16447,0.00277735,{"type":14,"value":4146,"toc":4241},[4147,4151,4154,4165,4194,4197,4201,4204,4227,4230,4234,4237],[17,4148,4150],{"id":4149},"tabpfns-pretraining-enables-direct-inference-on-tabular-tasks","TabPFN's Pretraining Enables Direct Inference on Tabular Tasks",[22,4152,4153],{},"TabPFN is a foundation model pretrained on millions of synthetic tabular datasets from causal processes, allowing it to perform supervised classification without dataset-specific training. Provide your training data during the .fit() call, which loads pretrained weights in 0.47 seconds—no hyperparameter tuning or iterative optimization needed. Predictions use in-context learning: the model conditions on your full training set (e.g., 4,000 samples) alongside test inputs at inference time, mimicking LLM prompting but for structured data. TabPFN-2.5 extends this to larger datasets up to millions of rows, outperforming tuned XGBoost, CatBoost, and ensembles like AutoGluon on benchmarks by capturing general tabular patterns.",[22,4155,4156,4157,4160,4161,4164],{},"To implement, install via ",[3682,4158,4159],{},"pip install tabpfn-client scikit-learn catboost",", set ",[3682,4162,4163],{},"TABPFN_TOKEN"," from priorlabs.ai, then:",[4166,4167,4170],"pre",{"className":4168,"code":4169,"language":90,"meta":53,"style":53},"language-python shiki shiki-themes github-light github-dark","from tabpfn_client import TabPFNClassifier\ntabpfn = TabPFNClassifier()\ntabpfn.fit(X_train, y_train)  # Loads weights\ntabpfn_preds = tabpfn.predict(X_test)\n",[3682,4171,4172,4179,4184,4189],{"__ignoreMap":53},[26,4173,4176],{"class":4174,"line":4175},"line",1,[26,4177,4178],{},"from tabpfn_client import TabPFNClassifier\n",[26,4180,4181],{"class":4174,"line":54},[26,4182,4183],{},"tabpfn = TabPFNClassifier()\n",[26,4185,4186],{"class":4174,"line":71},[26,4187,4188],{},"tabpfn.fit(X_train, y_train)  # Loads weights\n",[26,4190,4191],{"class":4174,"line":72},[26,4192,4193],{},"tabpfn_preds = tabpfn.predict(X_test)\n",[22,4195,4196],{},"This shifts computation from training to inference, ideal for rapid prototyping where setup speed trumps everything.",[17,4198,4200],{"id":4199},"quantified-wins-over-tree-based-baselines","Quantified Wins Over Tree-Based Baselines",[22,4202,4203],{},"Tested on scikit-learn's synthetic binary classification: 5,000 samples, 20 features (10 informative, 5 redundant), 80\u002F20 train\u002Ftest split.",[4205,4206,4207,4215,4221],"ul",{},[4208,4209,4210,4214],"li",{},[4211,4212,4213],"strong",{},"Random Forest"," (200 trees): 95.5% accuracy, 9.56s train, 0.0627s infer. Robust bagging handles noise but plateaus on complex interactions.",[4208,4216,4217,4220],{},[4211,4218,4219],{},"CatBoost"," (500 iterations, depth=6, lr=0.1): 96.7% accuracy, 8.15s train, 0.0119s infer. Boosting edges out RF via error correction, excels in low-latency production.",[4208,4222,4223,4226],{},[4211,4224,4225],{},"TabPFN",": 98.8% accuracy, 0.47s fit, 2.21s infer. Gains 2.1-3.3% accuracy by leveraging pretrained priors on noisy features.",[22,4228,4229],{},"TabPFN wins on accuracy and setup for small-to-medium data (\u003C10k rows), eliminating tuning that tree models demand.",[17,4231,4233],{"id":4232},"inference-cost-and-distillation-for-production","Inference Cost and Distillation for Production",[22,4235,4236],{},"TabPFN's 2.21s inference (vs \u003C0.1s for trees) arises from joint processing of train+test data—scales with training set size, unsuitable for real-time apps or huge datasets without tweaks. Solution: distillation engine converts predictions to compact neural nets or tree ensembles, preserving ~98% of accuracy while slashing inference to milliseconds. Use for offline analysis, A\u002FB tests, or batch scoring; distill for deployment. Best for dev speed on tabular tasks where trees fall short, like healthcare\u002Ffinance with mixed types—no preprocessing grind required.",[4238,4239,4240],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":53,"searchDepth":54,"depth":54,"links":4242},[4243,4244,4245],{"id":4149,"depth":54,"text":4150},{"id":4199,"depth":54,"text":4200},{"id":4232,"depth":54,"text":4233},[60],{"content_references":4248,"triage":4253},[4249,4251],{"type":3871,"title":4225,"url":4250,"context":69},"https:\u002F\u002Fux.priorlabs.ai\u002Fhome",{"type":3875,"title":4119,"url":4252,"context":69},"https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002FTabPFN.ipynb",{"relevance":4254,"novelty":72,"quality":72,"actionability":72,"composite":4255,"reasoning":4256},5,4.35,"Category: AI & LLMs. The article provides a detailed comparison of TabPFN with traditional tree models, addressing the audience's need for practical AI applications in product development. It includes specific implementation steps for using TabPFN, making it actionable for developers looking to integrate this model into their workflows.","\u002Fsummaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with-summary","2026-04-19 19:11:03","2026-04-21 15:26:59",{"title":4139,"description":53},{"loc":4257},"a50c8b812151a371","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fhow-tabpfn-leverages-in-context-learning-to-achieve-superior-accuracy-on-tabular-datasets-compared-to-random-forest-and-catboost\u002F","summaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with--summary",[87,89,90],"On a 5k-sample tabular dataset, TabPFN hits 98.8% accuracy vs CatBoost's 96.7% and Random Forest's 95.5%, with 0.47s setup but 2.21s inference due to in-context learning at predict time.",[],"ib8Gsg5sdpFcbFssj_HpjZUdd84YjROCIhNcU90X7HE",{"id":4270,"title":4271,"ai":4272,"body":4277,"categories":4438,"created_at":61,"date_modified":61,"description":53,"extension":62,"faq":61,"featured":63,"kicker_label":61,"meta":4439,"navigation":75,"path":4440,"published_at":4441,"question":61,"scraped_at":61,"seo":4442,"sitemap":4443,"source_id":4444,"source_name":4445,"source_type":83,"source_url":4446,"stem":4447,"tags":4448,"thumbnail_url":61,"tldr":4449,"tweet":61,"unknown_tags":4450,"__hash__":4451},"summaries\u002Fsummaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary.md","Minimal NumPy RNN for Char-Level Text Gen",{"provider":7,"model":8,"input_tokens":4273,"output_tokens":4274,"processing_time_ms":4275,"cost_usd":4276},10743,1482,11844,0.0024192,{"type":14,"value":4278,"toc":4433},[4279,4283,4298,4321,4339,4343,4360,4384,4391,4395,4405,4420,4430],[17,4280,4282],{"id":4281},"rnn-architecture-and-one-hot-encoding","RNN Architecture and One-Hot Encoding",[22,4284,4285,4286,4289,4290,4293,4294,4297],{},"Load text from 'input.txt' into ",[3682,4287,4288],{},"data",", extract unique ",[3682,4291,4292],{},"chars"," for vocabulary (vocab_size = len(chars)). Map chars to indices with ",[3682,4295,4296],{},"char_to_ix"," and reverse. Use one-hot encoding: inputs are lists of indices turned into (vocab_size, 1) vectors with 1 at input index.",[22,4299,4300,4301,4304,4305,4308,4309,4312,4313,4316,4317,4320],{},"Hidden layer size fixed at 100 neurons (",[3682,4302,4303],{},"hidden_size=100","), sequence length 25 (",[3682,4306,4307],{},"seq_length=25","), learning rate 0.1. Weights initialized small: ",[3682,4310,4311],{},"Wxh = np.random.randn(100, vocab_size)*0.01"," (input-to-hidden), ",[3682,4314,4315],{},"Whh"," (hidden-to-hidden, 100x100), ",[3682,4318,4319],{},"Why"," (hidden-to-output, vocab_size x 100). Biases zero-initialized. Scaling by 0.01 keeps initial activations small for tanh stability and breaks symmetry so hidden units learn distinct features.",[22,4322,4323,4324,4327,4328,4331,4332,4335,4336,4034],{},"Forward step per timestep t: ",[3682,4325,4326],{},"hs[t] = tanh(Wxh @ xs[t] + Whh @ hs[t-1] + bh)",", then ",[3682,4329,4330],{},"ys[t] = Why @ hs[t] + by",", softmax ",[3682,4333,4334],{},"ps[t] = exp(ys[t])\u002Fsum(exp(ys[t]))"," for next-char probs. Loss is negative log-likelihood: sum -log(ps[t]",[26,4337,4338],{},"target",[17,4340,4342],{"id":4341},"backpropagation-through-time-and-gradients","Backpropagation Through Time and Gradients",[22,4344,4345,4346,4349,4350,4353,4354,3697,4357,3991],{},"In ",[3682,4347,4348],{},"lossFun(inputs, targets, hprev)",": forward pass stores xs, hs, ys, ps for all timesteps. Backward pass starts from output: ",[3682,4351,4352],{},"dy = ps[t].copy(); dy[target] -= 1"," (softmax + cross-entropy gradient simplifies to this). Accumulate ",[3682,4355,4356],{},"dWhy += dy @ hs[t].T",[3682,4358,4359],{},"dby += dy",[22,4361,4362,4363,4366,4367,4370,4371,3697,4374,3697,4377,3697,4380,4383],{},"Propagate to hidden: ",[3682,4364,4365],{},"dh = Why.T @ dy + dhnext"," (dhnext from future timestep), ",[3682,4368,4369],{},"dhraw = (1 - hs[t]^2) * dh"," (tanh derivative), then ",[3682,4372,4373],{},"dbh += dhraw",[3682,4375,4376],{},"dWxh += dhraw @ xs[t].T",[3682,4378,4379],{},"dWhh += dhraw @ hs[t-1].T",[3682,4381,4382],{},"dhnext = Whh.T @ dhraw"," for prior timestep.",[22,4385,4386,4387,4390],{},"Clip all gradients to ",[26,4388,4389],{},"-5, 5"," to prevent exploding gradients. Returns total loss, all dparams, final h for next sequence.",[17,4392,4394],{"id":4393},"adagrad-training-and-text-sampling","Adagrad Training and Text Sampling",[22,4396,4397,4398,4401,4402,3991],{},"Infinite loop sweeps data left-to-right in seq_length=25 chunks: reset hprev=zeros every epoch (when p >= len(data)). Compute inputs\u002Ftargets as char indices for data",[26,4399,4400],{},"p:p+25"," and shifted ",[26,4403,4404],{},"p+1:p+26",[22,4406,4407,4408,4411,4412,4415,4416,4419],{},"Every 100 iterations: sample 200 chars from model starting with inputs",[26,4409,4410],{},"0"," seed: forward like training but pick ",[3682,4413,4414],{},"ix = np.random.choice(vocab_size, p=ps.ravel())",", decode to text, print. Smooth loss: ",[3682,4417,4418],{},"smooth_loss *= 0.999 + loss * 0.001",", print every 100 iters.",[22,4421,4422,4423,3697,4426,4429],{},"Update with Adagrad: mem vars track ",[3682,4424,4425],{},"mem += dparam**2",[3682,4427,4428],{},"param -= lr * dparam \u002F sqrt(mem + 1e-8)",". Advance p by 25, n +=1. Initial smooth_loss = -log(1\u002Fvocab_size)*25.",[22,4431,4432],{},"Common issues: input.txt must exceed seq_length+1 chars (else IndexError in loss); large datasets like Shakespeare need 100k+ iters for loss ~3.0 and coherent text.",{"title":53,"searchDepth":54,"depth":54,"links":4434},[4435,4436,4437],{"id":4281,"depth":54,"text":4282},{"id":4341,"depth":54,"text":4342},{"id":4393,"depth":54,"text":4394},[60],{},"\u002Fsummaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary","2026-04-08 21:21:20",{"title":4271,"description":53},{"loc":4440},"7fdb0ca0899660d5","Andrej Karpathy Gists","https:\u002F\u002Funknown","summaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary",[90,87,88],"Build a vanilla RNN language model from scratch in ~170 lines of NumPy: processes text chunks of 25 chars, trains with BPTT and Adagrad, generates samples after 100 iterations.",[],"ytSsn8v5OXyfyPKcCX7WUMYHNuzZlEdeMutJWRk1eM0"]