[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-ee34e33691a72ff0-gpus-accelerate-pandas-100x-on-google-cloud-summary":3,"summaries-facets-categories":165,"summary-related-ee34e33691a72ff0-gpus-accelerate-pandas-100x-on-google-cloud-summary":3734},{"id":4,"title":5,"ai":6,"body":13,"categories":139,"created_at":141,"date_modified":141,"description":142,"extension":143,"faq":141,"featured":144,"kicker_label":141,"meta":145,"navigation":146,"path":147,"published_at":148,"question":141,"scraped_at":149,"seo":150,"sitemap":151,"source_id":152,"source_name":153,"source_type":154,"source_url":155,"stem":156,"tags":157,"thumbnail_url":141,"tldr":162,"tweet":141,"unknown_tags":163,"__hash__":164},"summaries\u002Fsummaries\u002Fee34e33691a72ff0-gpus-accelerate-pandas-100x-on-google-cloud-summary.md","GPUs Accelerate Pandas 100x on Google Cloud",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8779,2245,18738,0.0028558,{"type":14,"value":15,"toc":129},"minimark",[16,21,25,28,31,35,38,41,44,48,51,59,62,66,69,72,75,79,82,85,88,92],[17,18,20],"h2",{"id":19},"blazing-fast-queries-on-340-million-rows","Blazing-Fast Queries on 340 Million Rows",[22,23,24],"p",{},"Jeff Nelson from Google Cloud demoed a climate analytics dashboard powered by NVIDIA's cuDF library on a Cloud Run instance with an NVIDIA L4 GPU. Users input any city—New York, Los Angeles, Ho Chi Minh City, Bengaluru, London—and it instantly returns insights like hottest day, max rainfall, and coldest temperature from the Global Climatology Network dataset. This dataset spans 340 million weather records from thousands of stations, some dating to the 1700s, plus station metadata for geospatial matching.",[22,26,27],{},"\"We're chewing through 340 million records... it took about 88 milliseconds,\" Jeff explained. The dashboard finds the nearest station (e.g., 0.8 miles from Bengaluru) and filters to ~40,000 relevant records for London in under 100ms. All data loads into GPU memory; no pre-aggregation tricks. Side-by-side with a CPU-only Pandas version on the same Cloud Run setup showed stark differences: GPU handled 340M rows in 95ms for New Orleans; CPU managed only 113M sampled rows in 9 seconds—nearly 100x slower, with less accurate results due to sampling.",[22,29,30],{},"Jeff emphasized greater accuracy from full datasets: \"On the CPU side, we're only able to go back so far... On the GPU, we're able to ingest all of the data.\"",[17,32,34],{"id":33},"gpu-vs-cpu-parallel-power-for-data-frames","GPU vs. CPU: Parallel Power for Data Frames",[22,36,37],{},"William Hill from NVIDIA broke down why GPUs excel for data workloads. CPUs handle sequential tasks like OS operations with complex branching; GPUs thrive on parallel matrix operations, ideal for Pandas data frames or SQL scans.",[22,39,40],{},"\"A GPU was designed to operate in parallel on large matrices... it's basically a supercomputer for doing tons of floating point operations in parallel,\" Will said. The stack starts with NVIDIA data center GPUs (e.g., L4, A100, H100), layered with CUDA (C\u002FC++ API for GPU control), and topped by open-source CUDA-X Python libraries like cuDF (Pandas accelerator) and cuML (scikit-learn accelerator).",[22,42,43],{},"These libraries are drop-in replacements: \"If you know pandas, then you already know how to use it.\" cuDF accelerates Pandas, Polars, SQL, and Spark; cuML handles ML pipelines. No code rewrites needed—cuGraph even speeds NetworkX for graphs. Will shared his motivation: \"I want to go fast, but I don't want to write C++.\"",[17,45,47],{"id":46},"one-line-code-change-unlocks-gpu-speed","One-Line Code Change Unlocks GPU Speed",[22,49,50],{},"In Vertex AI Workbench's Colab Enterprise, Jeff loaded 113M rows (10GB) into Pandas on CPU, generating histograms across all stations in 3 seconds while monitoring RAM via the resources pane to avoid crashes. Replicating dashboard logic—geospatial nearest-station lookup for Fairbanks, Alaska, then aggregating extremes—took seconds on CPU.",[22,52,53,54,58],{},"The \"magic\" switch: ",[55,56,57],"code",{},"%load_ext cuDF.pandas",". Restart runtime, reload data, and Pandas operations auto-accelerate on GPU, falling back to CPU if needed. Jeff timed identical functions: GPU slashed latencies dramatically, enabling full 340M-row analysis without sampling.",[22,60,61],{},"\"All you need to do is add this one line... and all of a sudden you're running on GPUs using cuDF,\" Jeff noted. Pre-installed in Colab Enterprise and other services, it requires zero manual setup.",[17,63,65],{"id":64},"google-cloud-gpu-setup-templates-and-cost-guards","Google Cloud GPU Setup: Templates and Cost Guards",[22,67,68],{},"Google Cloud integrates NVIDIA GPUs across services. Jeff created a runtime template in Colab Enterprise: Select G2 machine type (L4 GPUs), A2 (A100s), or A3 (H100s); set idle shutdown (10min–1day) to curb bills.",[22,70,71],{},"\"One of the worst feelings... is getting a bill about a week later because I left my GPU running,\" Jeff warned. He recommends 30 minutes: long enough for coffee breaks, short enough for safety. Boot takes minutes; attach to notebooks. Cloud Run supports GPU attachments similarly for apps.",[22,73,74],{},"Resources pane tracks RAM\u002Fusage spikes—critical for Pandas OOM errors. Full climate notebook code mirrors the dashboard, proving production viability.",[17,76,78],{"id":77},"efficiency-expensive-hardware-pays-off","Efficiency: \"Expensive\" Hardware Pays Off",[22,80,81],{},"Speakers addressed GPU cost perceptions. Faster completion means less runtime, offsetting higher hourly rates. Live benchmark scanned 340M rows on-screen; Q&A covered hardware acceleration queries. Greg Baugues hosted, prompting city inputs from chat (Netherlands, New Orleans) to showcase real-time responsiveness.",[22,83,84],{},"\"How 'expensive' hardware is actually cheaper when it finishes the job in seconds,\" per event description. Jeff's dashboard on Cloud Run proves scalable, interactive analytics without precompute hacks.",[22,86,87],{},"\"Jeff Nelson argues that... the GPU has about three times as much data and it's almost 100 times faster.\"",[17,89,91],{"id":90},"key-takeaways","Key Takeaways",[93,94,95,99,105,108,111,114,117,120,123,126],"ul",{},[96,97,98],"li",{},"Load 340M+ row datasets into GPU memory on Google Cloud (Cloud Run, Colab Enterprise) for sub-100ms queries using cuDF—no sampling needed for accuracy.",[96,100,101,102,104],{},"Add ",[55,103,57],{}," to accelerate existing Pandas code; cuML does the same for scikit-learn—zero rewrites.",[96,106,107],{},"Choose machine types like G2 (L4), A2 (A100), A3 (H100) via runtime templates; always set 10-30min idle shutdown to avoid surprise bills.",[96,109,110],{},"Monitor RAM in Colab resources pane to prevent Pandas OOM crashes; start with 113M rows to test scaling.",[96,112,113],{},"Use Global Climatology Network for weather benchmarks—replicate Jeff's notebook for geospatial joins, aggregations, histograms.",[96,115,116],{},"Pair cuDF with cuML for end-to-end data science: ETL to ML on GPUs.",[96,118,119],{},"Test side-by-side: CPU Pandas limits scale; GPU handles 3x data at 100x speed.",[96,121,122],{},"Explore CUDA-X ecosystem (cuGraph for graphs) for broader acceleration.",[96,124,125],{},"Provision GPUs in Vertex AI Workbench for notebooks; deploy to Cloud Run for apps.",[96,127,128],{},"Prioritize parallel workloads (data frames, matrices) for max GPU ROI over sequential tasks.",{"title":130,"searchDepth":131,"depth":131,"links":132},"",2,[133,134,135,136,137,138],{"id":19,"depth":131,"text":20},{"id":33,"depth":131,"text":34},{"id":46,"depth":131,"text":47},{"id":64,"depth":131,"text":65},{"id":77,"depth":131,"text":78},{"id":90,"depth":131,"text":91},[140],"Data Science & Visualization",null,"* Speed up data analytics on GPUs → https:\u002F\u002Fgoo.gle\u002Fspeed-up-data-analytics-GPUs\n* Accelerated machine learning with GPUs → https:\u002F\u002Fgoo.gle\u002Faccelerated-machine-learning-with-google-cloud-and-nvidia\n\nIf your datasets are growing but your processing speed isn't, you're losing momentum. Join us as Jeff Nelson (Google) and William Hill (NVIDIA) demonstrate how to inject massive speed into your standard data analytics.\n\nThis livestream covers:\n* Live benchmark: A 340-million-row data scan, live on screen.\n* The efficiency win: How \"expensive\" hardware is actually cheaper when it finishes the job in seconds.\n* Expert Q&A: We're answering your hardware acceleration questions in the chat.\n\n🔔 Subscribe to Google Cloud Tech → https:\u002F\u002Fgoo.gle\u002FGoogleCloudTech\n\nThis livestream originally aired on April 7, 2026 at 9:00 A.M. PDT \u002F 12:00 P.M. EDT.\n\n#GPUs #NVIDIA #GoogleCloud\n\nSpeakers: Greg Baugues, Jeff Nelson, William Hill (NVIDIA)\nProducts Mentioned: Google Cloud Dataproc, GPUs","md",false,{},true,"\u002Fsummaries\u002Fee34e33691a72ff0-gpus-accelerate-pandas-100x-on-google-cloud-summary","2026-04-07 17:04:21","2026-04-08 14:51:34",{"title":5,"description":142},{"loc":147},"ee34e33691a72ff0","Google Cloud Tech","video","https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=yBxRoYj-i28","summaries\u002Fee34e33691a72ff0-gpus-accelerate-pandas-100x-on-google-cloud-summary",[158,159,160,161],"data-science","machine-learning","python","cloud","NVIDIA cuDF and cuML libraries turn Pandas and scikit-learn into GPU-accelerated drop-ins, querying 340M rows in 88ms vs. 9s on CPU—add one line of code.",[],"VAGmk4S_yW3jM9qvL7ys46QaA6b9a80lLLzS3ZmfnXI",[166,169,172,175,178,181,183,185,187,189,191,193,196,198,200,202,204,206,208,210,212,214,217,219,221,223,226,228,230,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688,3690,3692,3694,3696,3698,3700,3702,3704,3706,3708,3710,3712,3714,3716,3718,3720,3722,3724,3726,3728,3730,3732],{"categories":167},[168],"Developer Productivity",{"categories":170},[171],"Business & SaaS",{"categories":173},[174],"AI & LLMs",{"categories":176},[177],"AI Automation",{"categories":179},[180],"Product Strategy",{"categories":182},[174],{"categories":184},[168],{"categories":186},[171],{"categories":188},[],{"categories":190},[174],{"categories":192},[],{"categories":194},[195],"AI News & Trends",{"categories":197},[177],{"categories":199},[195],{"categories":201},[177],{"categories":203},[177],{"categories":205},[174],{"categories":207},[174],{"categories":209},[195],{"categories":211},[174],{"categories":213},[],{"categories":215},[216],"Design & Frontend",{"categories":218},[140],{"categories":220},[195],{"categories":222},[],{"categories":224},[225],"Software Engineering",{"categories":227},[174],{"categories":229},[177],{"categories":231},[232],"Marketing & Growth",{"categories":234},[174],{"categories":236},[177],{"categories":238},[],{"categories":240},[],{"categories":242},[216],{"categories":244},[177],{"categories":246},[168],{"categories":248},[216],{"categories":250},[174],{"categories":252},[177],{"categories":254},[195],{"categories":256},[],{"categories":258},[],{"categories":260},[177],{"categories":262},[225],{"categories":264},[],{"categories":266},[171],{"categories":268},[],{"categories":270},[],{"categories":272},[177],{"categories":274},[177],{"categories":276},[174],{"categories":278},[],{"categories":280},[225],{"categories":282},[],{"categories":284},[],{"categories":286},[],{"categories":288},[174],{"categories":290},[232],{"categories":292},[216],{"categories":294},[216],{"categories":296},[174],{"categories":298},[177],{"categories":300},[174],{"categories":302},[174],{"categories":304},[177],{"categories":306},[177],{"categories":308},[140],{"categories":310},[195],{"categories":312},[177],{"categories":314},[232],{"categories":316},[177],{"categories":318},[180],{"categories":320},[],{"categories":322},[177],{"categories":324},[],{"categories":326},[177],{"categories":328},[225],{"categories":330},[216],{"categories":332},[174],{"categories":334},[],{"categories":336},[],{"categories":338},[177],{"categories":340},[],{"categories":342},[174],{"categories":344},[],{"categories":346},[168],{"categories":348},[225],{"categories":350},[171],{"categories":352},[195],{"categories":354},[174],{"categories":356},[],{"categories":358},[174],{"categories":360},[],{"categories":362},[225],{"categories":364},[140],{"categories":366},[],{"categories":368},[174],{"categories":370},[216],{"categories":372},[],{"categories":374},[216],{"categories":376},[177],{"categories":378},[],{"categories":380},[177],{"categories":382},[195],{"categories":384},[174],{"categories":386},[],{"categories":388},[177],{"categories":390},[174],{"categories":392},[180],{"categories":394},[],{"categories":396},[174],{"categories":398},[177],{"categories":400},[177],{"categories":402},[],{"categories":404},[140],{"categories":406},[174],{"categories":408},[],{"categories":410},[168],{"categories":412},[171],{"categories":414},[174],{"categories":416},[177],{"categories":418},[225],{"categories":420},[174],{"categories":422},[],{"categories":424},[],{"categories":426},[174],{"categories":428},[],{"categories":430},[216],{"categories":432},[],{"categories":434},[174],{"categories":436},[],{"categories":438},[177],{"categories":440},[174],{"categories":442},[216],{"categories":444},[],{"categories":446},[174],{"categories":448},[174],{"categories":450},[171],{"categories":452},[177],{"categories":454},[174],{"categories":456},[216],{"categories":458},[177],{"categories":460},[],{"categories":462},[],{"categories":464},[195],{"categories":466},[],{"categories":468},[174],{"categories":470},[171,232],{"categories":472},[],{"categories":474},[174],{"categories":476},[],{"categories":478},[],{"categories":480},[174],{"categories":482},[],{"categories":484},[174],{"categories":486},[487],"DevOps & Cloud",{"categories":489},[],{"categories":491},[195],{"categories":493},[216],{"categories":495},[],{"categories":497},[195],{"categories":499},[195],{"categories":501},[174],{"categories":503},[232],{"categories":505},[],{"categories":507},[171],{"categories":509},[],{"categories":511},[174,487],{"categories":513},[174],{"categories":515},[174],{"categories":517},[177],{"categories":519},[174,225],{"categories":521},[140],{"categories":523},[174],{"categories":525},[232],{"categories":527},[177],{"categories":529},[177],{"categories":531},[],{"categories":533},[177],{"categories":535},[174,171],{"categories":537},[],{"categories":539},[216],{"categories":541},[216],{"categories":543},[],{"categories":545},[],{"categories":547},[195],{"categories":549},[],{"categories":551},[168],{"categories":553},[225],{"categories":555},[174],{"categories":557},[216],{"categories":559},[177],{"categories":561},[225],{"categories":563},[195],{"categories":565},[216],{"categories":567},[],{"categories":569},[174],{"categories":571},[174],{"categories":573},[174],{"categories":575},[195],{"categories":577},[168],{"categories":579},[174],{"categories":581},[177],{"categories":583},[487],{"categories":585},[216],{"categories":587},[177],{"categories":589},[],{"categories":591},[],{"categories":593},[216],{"categories":595},[195],{"categories":597},[140],{"categories":599},[],{"categories":601},[174],{"categories":603},[174],{"categories":605},[171],{"categories":607},[174],{"categories":609},[174],{"categories":611},[195],{"categories":613},[],{"categories":615},[177],{"categories":617},[225],{"categories":619},[],{"categories":621},[174],{"categories":623},[174],{"categories":625},[177],{"categories":627},[],{"categories":629},[],{"categories":631},[174],{"categories":633},[],{"categories":635},[171],{"categories":637},[177],{"categories":639},[],{"categories":641},[168],{"categories":643},[174],{"categories":645},[171],{"categories":647},[195],{"categories":649},[],{"categories":651},[],{"categories":653},[],{"categories":655},[195],{"categories":657},[195],{"categories":659},[],{"categories":661},[],{"categories":663},[171],{"categories":665},[],{"categories":667},[],{"categories":669},[168],{"categories":671},[],{"categories":673},[232],{"categories":675},[177],{"categories":677},[171],{"categories":679},[177],{"categories":681},[],{"categories":683},[180],{"categories":685},[216],{"categories":687},[225],{"categories":689},[174],{"categories":691},[177],{"categories":693},[171],{"categories":695},[174],{"categories":697},[],{"categories":699},[],{"categories":701},[225],{"categories":703},[140],{"categories":705},[180],{"categories":707},[177],{"categories":709},[174],{"categories":711},[],{"categories":713},[487],{"categories":715},[],{"categories":717},[177],{"categories":719},[],{"categories":721},[],{"categories":723},[174],{"categories":725},[216],{"categories":727},[232],{"categories":729},[177],{"categories":731},[],{"categories":733},[168],{"categories":735},[],{"categories":737},[195],{"categories":739},[174,487],{"categories":741},[195],{"categories":743},[174],{"categories":745},[171],{"categories":747},[174],{"categories":749},[],{"categories":751},[171],{"categories":753},[],{"categories":755},[225],{"categories":757},[216],{"categories":759},[195],{"categories":761},[140],{"categories":763},[168],{"categories":765},[174],{"categories":767},[225],{"categories":769},[],{"categories":771},[],{"categories":773},[180],{"categories":775},[],{"categories":777},[174],{"categories":779},[],{"categories":781},[216],{"categories":783},[216],{"categories":785},[216],{"categories":787},[],{"categories":789},[],{"categories":791},[195],{"categories":793},[177],{"categories":795},[174],{"categories":797},[174],{"categories":799},[174],{"categories":801},[171],{"categories":803},[174],{"categories":805},[],{"categories":807},[225],{"categories":809},[225],{"categories":811},[171],{"categories":813},[],{"categories":815},[174],{"categories":817},[174],{"categories":819},[171],{"categories":821},[195],{"categories":823},[232],{"categories":825},[177],{"categories":827},[],{"categories":829},[216],{"categories":831},[],{"categories":833},[174],{"categories":835},[],{"categories":837},[171],{"categories":839},[177],{"categories":841},[],{"categories":843},[487],{"categories":845},[140],{"categories":847},[225],{"categories":849},[232],{"categories":851},[225],{"categories":853},[177],{"categories":855},[],{"categories":857},[],{"categories":859},[177],{"categories":861},[168],{"categories":863},[177],{"categories":865},[180],{"categories":867},[171],{"categories":869},[],{"categories":871},[174],{"categories":873},[180],{"categories":875},[174],{"categories":877},[174],{"categories":879},[232],{"categories":881},[216],{"categories":883},[177],{"categories":885},[],{"categories":887},[],{"categories":889},[487],{"categories":891},[225],{"categories":893},[],{"categories":895},[177],{"categories":897},[174],{"categories":899},[216,174],{"categories":901},[168],{"categories":903},[],{"categories":905},[174],{"categories":907},[168],{"categories":909},[216],{"categories":911},[177],{"categories":913},[225],{"categories":915},[],{"categories":917},[174],{"categories":919},[],{"categories":921},[168],{"categories":923},[],{"categories":925},[177],{"categories":927},[180],{"categories":929},[174],{"categories":931},[174],{"categories":933},[216],{"categories":935},[177],{"categories":937},[487],{"categories":939},[216],{"categories":941},[177],{"categories":943},[174],{"categories":945},[174],{"categories":947},[174],{"categories":949},[195],{"categories":951},[],{"categories":953},[180],{"categories":955},[177],{"categories":957},[216],{"categories":959},[177],{"categories":961},[225],{"categories":963},[216],{"categories":965},[177],{"categories":967},[195],{"categories":969},[],{"categories":971},[174],{"categories":973},[216],{"categories":975},[174],{"categories":977},[168],{"categories":979},[195],{"categories":981},[174],{"categories":983},[232],{"categories":985},[174],{"categories":987},[174],{"categories":989},[177],{"categories":991},[177],{"categories":993},[174],{"categories":995},[177],{"categories":997},[216],{"categories":999},[174],{"categories":1001},[],{"categories":1003},[],{"categories":1005},[225],{"categories":1007},[],{"categories":1009},[168],{"categories":1011},[487],{"categories":1013},[],{"categories":1015},[168],{"categories":1017},[171],{"categories":1019},[232],{"categories":1021},[],{"categories":1023},[171],{"categories":1025},[],{"categories":1027},[],{"categories":1029},[],{"categories":1031},[],{"categories":1033},[],{"categories":1035},[174],{"categories":1037},[177],{"categories":1039},[487],{"categories":1041},[168],{"categories":1043},[174],{"categories":1045},[225],{"categories":1047},[180],{"categories":1049},[174],{"categories":1051},[232],{"categories":1053},[174],{"categories":1055},[174],{"categories":1057},[174],{"categories":1059},[174,168],{"categories":1061},[225],{"categories":1063},[225],{"categories":1065},[216],{"categories":1067},[174],{"categories":1069},[],{"categories":1071},[],{"categories":1073},[],{"categories":1075},[225],{"categories":1077},[140],{"categories":1079},[195],{"categories":1081},[216],{"categories":1083},[],{"categories":1085},[174],{"categories":1087},[174],{"categories":1089},[],{"categories":1091},[],{"categories":1093},[177],{"categories":1095},[174],{"categories":1097},[171],{"categories":1099},[],{"categories":1101},[168],{"categories":1103},[174],{"categories":1105},[168],{"categories":1107},[174],{"categories":1109},[225],{"categories":1111},[232],{"categories":1113},[174,216],{"categories":1115},[195],{"categories":1117},[216],{"categories":1119},[],{"categories":1121},[487],{"categories":1123},[216],{"categories":1125},[177],{"categories":1127},[],{"categories":1129},[],{"categories":1131},[],{"categories":1133},[],{"categories":1135},[225],{"categories":1137},[177],{"categories":1139},[177],{"categories":1141},[174],{"categories":1143},[174],{"categories":1145},[],{"categories":1147},[216],{"categories":1149},[],{"categories":1151},[],{"categories":1153},[177],{"categories":1155},[],{"categories":1157},[],{"categories":1159},[232],{"categories":1161},[232],{"categories":1163},[177],{"categories":1165},[],{"categories":1167},[174],{"categories":1169},[174],{"categories":1171},[225],{"categories":1173},[216],{"categories":1175},[216],{"categories":1177},[177],{"categories":1179},[168],{"categories":1181},[174],{"categories":1183},[216],{"categories":1185},[216],{"categories":1187},[177],{"categories":1189},[177],{"categories":1191},[174],{"categories":1193},[],{"categories":1195},[],{"categories":1197},[174],{"categories":1199},[177],{"categories":1201},[195],{"categories":1203},[225],{"categories":1205},[168],{"categories":1207},[174],{"categories":1209},[],{"categories":1211},[177],{"categories":1213},[177],{"categories":1215},[],{"categories":1217},[168],{"categories":1219},[174],{"categories":1221},[168],{"categories":1223},[168],{"categories":1225},[],{"categories":1227},[],{"categories":1229},[177],{"categories":1231},[177],{"categories":1233},[174],{"categories":1235},[174],{"categories":1237},[195],{"categories":1239},[140],{"categories":1241},[180],{"categories":1243},[195],{"categories":1245},[216],{"categories":1247},[],{"categories":1249},[195],{"categories":1251},[],{"categories":1253},[],{"categories":1255},[],{"categories":1257},[],{"categories":1259},[225],{"categories":1261},[140],{"categories":1263},[],{"categories":1265},[174],{"categories":1267},[174],{"categories":1269},[140],{"categories":1271},[225],{"categories":1273},[],{"categories":1275},[],{"categories":1277},[177],{"categories":1279},[195],{"categories":1281},[195],{"categories":1283},[177],{"categories":1285},[168],{"categories":1287},[174,487],{"categories":1289},[],{"categories":1291},[216],{"categories":1293},[168],{"categories":1295},[177],{"categories":1297},[216],{"categories":1299},[],{"categories":1301},[177],{"categories":1303},[177],{"categories":1305},[174],{"categories":1307},[232],{"categories":1309},[225],{"categories":1311},[216],{"categories":1313},[],{"categories":1315},[177],{"categories":1317},[174],{"categories":1319},[177],{"categories":1321},[177],{"categories":1323},[177],{"categories":1325},[232],{"categories":1327},[177],{"categories":1329},[174],{"categories":1331},[],{"categories":1333},[232],{"categories":1335},[195],{"categories":1337},[177],{"categories":1339},[],{"categories":1341},[],{"categories":1343},[174],{"categories":1345},[177],{"categories":1347},[195],{"categories":1349},[177],{"categories":1351},[],{"categories":1353},[],{"categories":1355},[],{"categories":1357},[177],{"categories":1359},[],{"categories":1361},[],{"categories":1363},[140],{"categories":1365},[174],{"categories":1367},[140],{"categories":1369},[195],{"categories":1371},[174],{"categories":1373},[174],{"categories":1375},[177],{"categories":1377},[174],{"categories":1379},[],{"categories":1381},[],{"categories":1383},[487],{"categories":1385},[],{"categories":1387},[],{"categories":1389},[168],{"categories":1391},[],{"categories":1393},[],{"categories":1395},[],{"categories":1397},[],{"categories":1399},[225],{"categories":1401},[195],{"categories":1403},[232],{"categories":1405},[171],{"categories":1407},[174],{"categories":1409},[174],{"categories":1411},[171],{"categories":1413},[],{"categories":1415},[216],{"categories":1417},[177],{"categories":1419},[171],{"categories":1421},[174],{"categories":1423},[174],{"categories":1425},[168],{"categories":1427},[],{"categories":1429},[168],{"categories":1431},[174],{"categories":1433},[232],{"categories":1435},[177],{"categories":1437},[195],{"categories":1439},[171],{"categories":1441},[174],{"categories":1443},[177],{"categories":1445},[],{"categories":1447},[174],{"categories":1449},[168],{"categories":1451},[174],{"categories":1453},[],{"categories":1455},[195],{"categories":1457},[174],{"categories":1459},[],{"categories":1461},[171],{"categories":1463},[174],{"categories":1465},[],{"categories":1467},[],{"categories":1469},[],{"categories":1471},[174],{"categories":1473},[],{"categories":1475},[487],{"categories":1477},[174],{"categories":1479},[],{"categories":1481},[174],{"categories":1483},[174],{"categories":1485},[174],{"categories":1487},[174,487],{"categories":1489},[174],{"categories":1491},[174],{"categories":1493},[216],{"categories":1495},[177],{"categories":1497},[],{"categories":1499},[177],{"categories":1501},[174],{"categories":1503},[174],{"categories":1505},[174],{"categories":1507},[168],{"categories":1509},[168],{"categories":1511},[225],{"categories":1513},[216],{"categories":1515},[177],{"categories":1517},[],{"categories":1519},[174],{"categories":1521},[195],{"categories":1523},[174],{"categories":1525},[171],{"categories":1527},[],{"categories":1529},[487],{"categories":1531},[216],{"categories":1533},[216],{"categories":1535},[177],{"categories":1537},[195],{"categories":1539},[177],{"categories":1541},[174],{"categories":1543},[],{"categories":1545},[174],{"categories":1547},[],{"categories":1549},[],{"categories":1551},[174],{"categories":1553},[174],{"categories":1555},[174],{"categories":1557},[177],{"categories":1559},[174],{"categories":1561},[],{"categories":1563},[140],{"categories":1565},[177],{"categories":1567},[],{"categories":1569},[174],{"categories":1571},[195],{"categories":1573},[],{"categories":1575},[216],{"categories":1577},[487],{"categories":1579},[195],{"categories":1581},[225],{"categories":1583},[225],{"categories":1585},[195],{"categories":1587},[195],{"categories":1589},[487],{"categories":1591},[],{"categories":1593},[195],{"categories":1595},[174],{"categories":1597},[168],{"categories":1599},[195],{"categories":1601},[],{"categories":1603},[140],{"categories":1605},[195],{"categories":1607},[225],{"categories":1609},[195],{"categories":1611},[487],{"categories":1613},[174],{"categories":1615},[174],{"categories":1617},[],{"categories":1619},[171],{"categories":1621},[],{"categories":1623},[],{"categories":1625},[174],{"categories":1627},[174],{"categories":1629},[174],{"categories":1631},[174],{"categories":1633},[],{"categories":1635},[140],{"categories":1637},[168],{"categories":1639},[],{"categories":1641},[174],{"categories":1643},[174],{"categories":1645},[487],{"categories":1647},[487],{"categories":1649},[],{"categories":1651},[177],{"categories":1653},[195],{"categories":1655},[195],{"categories":1657},[174],{"categories":1659},[177],{"categories":1661},[],{"categories":1663},[216],{"categories":1665},[174],{"categories":1667},[174],{"categories":1669},[],{"categories":1671},[],{"categories":1673},[487],{"categories":1675},[174],{"categories":1677},[225],{"categories":1679},[171],{"categories":1681},[174],{"categories":1683},[],{"categories":1685},[177],{"categories":1687},[168],{"categories":1689},[168],{"categories":1691},[],{"categories":1693},[174],{"categories":1695},[216],{"categories":1697},[177],{"categories":1699},[],{"categories":1701},[174],{"categories":1703},[174],{"categories":1705},[177],{"categories":1707},[],{"categories":1709},[177],{"categories":1711},[225],{"categories":1713},[],{"categories":1715},[174],{"categories":1717},[],{"categories":1719},[174],{"categories":1721},[],{"categories":1723},[174],{"categories":1725},[174],{"categories":1727},[],{"categories":1729},[174],{"categories":1731},[195],{"categories":1733},[174],{"categories":1735},[174],{"categories":1737},[168],{"categories":1739},[174],{"categories":1741},[195],{"categories":1743},[177],{"categories":1745},[],{"categories":1747},[174],{"categories":1749},[232],{"categories":1751},[],{"categories":1753},[],{"categories":1755},[],{"categories":1757},[168],{"categories":1759},[195],{"categories":1761},[177],{"categories":1763},[174],{"categories":1765},[216],{"categories":1767},[177],{"categories":1769},[],{"categories":1771},[177],{"categories":1773},[],{"categories":1775},[174],{"categories":1777},[177],{"categories":1779},[174],{"categories":1781},[],{"categories":1783},[174],{"categories":1785},[174],{"categories":1787},[195],{"categories":1789},[216],{"categories":1791},[177],{"categories":1793},[216],{"categories":1795},[171],{"categories":1797},[],{"categories":1799},[],{"categories":1801},[174],{"categories":1803},[168],{"categories":1805},[195],{"categories":1807},[],{"categories":1809},[],{"categories":1811},[225],{"categories":1813},[216],{"categories":1815},[],{"categories":1817},[174],{"categories":1819},[],{"categories":1821},[232],{"categories":1823},[174],{"categories":1825},[487],{"categories":1827},[225],{"categories":1829},[],{"categories":1831},[177],{"categories":1833},[174],{"categories":1835},[177],{"categories":1837},[177],{"categories":1839},[174],{"categories":1841},[],{"categories":1843},[168],{"categories":1845},[174],{"categories":1847},[171],{"categories":1849},[225],{"categories":1851},[216],{"categories":1853},[],{"categories":1855},[],{"categories":1857},[],{"categories":1859},[177],{"categories":1861},[216],{"categories":1863},[195],{"categories":1865},[174],{"categories":1867},[195],{"categories":1869},[216],{"categories":1871},[],{"categories":1873},[216],{"categories":1875},[195],{"categories":1877},[171],{"categories":1879},[174],{"categories":1881},[195],{"categories":1883},[232],{"categories":1885},[],{"categories":1887},[],{"categories":1889},[140],{"categories":1891},[174,225],{"categories":1893},[195],{"categories":1895},[174],{"categories":1897},[177],{"categories":1899},[177],{"categories":1901},[174],{"categories":1903},[],{"categories":1905},[225],{"categories":1907},[174],{"categories":1909},[140],{"categories":1911},[177],{"categories":1913},[232],{"categories":1915},[487],{"categories":1917},[],{"categories":1919},[168],{"categories":1921},[177],{"categories":1923},[177],{"categories":1925},[225],{"categories":1927},[174],{"categories":1929},[174],{"categories":1931},[],{"categories":1933},[],{"categories":1935},[],{"categories":1937},[487],{"categories":1939},[195],{"categories":1941},[174],{"categories":1943},[174],{"categories":1945},[174],{"categories":1947},[],{"categories":1949},[140],{"categories":1951},[171],{"categories":1953},[],{"categories":1955},[177],{"categories":1957},[487],{"categories":1959},[],{"categories":1961},[216],{"categories":1963},[216],{"categories":1965},[],{"categories":1967},[225],{"categories":1969},[216],{"categories":1971},[174],{"categories":1973},[],{"categories":1975},[195],{"categories":1977},[174],{"categories":1979},[216],{"categories":1981},[177],{"categories":1983},[195],{"categories":1985},[],{"categories":1987},[177],{"categories":1989},[216],{"categories":1991},[174],{"categories":1993},[],{"categories":1995},[174],{"categories":1997},[174],{"categories":1999},[487],{"categories":2001},[195],{"categories":2003},[140],{"categories":2005},[140],{"categories":2007},[],{"categories":2009},[],{"categories":2011},[],{"categories":2013},[177],{"categories":2015},[225],{"categories":2017},[225],{"categories":2019},[],{"categories":2021},[],{"categories":2023},[174],{"categories":2025},[],{"categories":2027},[177],{"categories":2029},[174],{"categories":2031},[],{"categories":2033},[174],{"categories":2035},[171],{"categories":2037},[174],{"categories":2039},[232],{"categories":2041},[177],{"categories":2043},[174],{"categories":2045},[225],{"categories":2047},[195],{"categories":2049},[177],{"categories":2051},[],{"categories":2053},[195],{"categories":2055},[177],{"categories":2057},[177],{"categories":2059},[],{"categories":2061},[171],{"categories":2063},[177],{"categories":2065},[],{"categories":2067},[174],{"categories":2069},[168],{"categories":2071},[195],{"categories":2073},[487],{"categories":2075},[177],{"categories":2077},[177],{"categories":2079},[168],{"categories":2081},[174],{"categories":2083},[],{"categories":2085},[],{"categories":2087},[216],{"categories":2089},[174,171],{"categories":2091},[],{"categories":2093},[168],{"categories":2095},[140],{"categories":2097},[174],{"categories":2099},[225],{"categories":2101},[174],{"categories":2103},[177],{"categories":2105},[174],{"categories":2107},[174],{"categories":2109},[195],{"categories":2111},[177],{"categories":2113},[],{"categories":2115},[],{"categories":2117},[177],{"categories":2119},[174],{"categories":2121},[487],{"categories":2123},[],{"categories":2125},[174],{"categories":2127},[177],{"categories":2129},[],{"categories":2131},[174],{"categories":2133},[232],{"categories":2135},[140],{"categories":2137},[177],{"categories":2139},[174],{"categories":2141},[487],{"categories":2143},[],{"categories":2145},[174],{"categories":2147},[232],{"categories":2149},[216],{"categories":2151},[174],{"categories":2153},[],{"categories":2155},[232],{"categories":2157},[195],{"categories":2159},[174],{"categories":2161},[174],{"categories":2163},[168],{"categories":2165},[],{"categories":2167},[],{"categories":2169},[216],{"categories":2171},[174],{"categories":2173},[140],{"categories":2175},[232],{"categories":2177},[232],{"categories":2179},[195],{"categories":2181},[],{"categories":2183},[],{"categories":2185},[174],{"categories":2187},[],{"categories":2189},[174,225],{"categories":2191},[195],{"categories":2193},[177],{"categories":2195},[225],{"categories":2197},[174],{"categories":2199},[168],{"categories":2201},[],{"categories":2203},[],{"categories":2205},[168],{"categories":2207},[232],{"categories":2209},[174],{"categories":2211},[],{"categories":2213},[216,174],{"categories":2215},[487],{"categories":2217},[168],{"categories":2219},[],{"categories":2221},[171],{"categories":2223},[171],{"categories":2225},[174],{"categories":2227},[225],{"categories":2229},[177],{"categories":2231},[195],{"categories":2233},[232],{"categories":2235},[216],{"categories":2237},[174],{"categories":2239},[174],{"categories":2241},[174],{"categories":2243},[168],{"categories":2245},[174],{"categories":2247},[177],{"categories":2249},[195],{"categories":2251},[],{"categories":2253},[],{"categories":2255},[140],{"categories":2257},[225],{"categories":2259},[174],{"categories":2261},[216],{"categories":2263},[140],{"categories":2265},[174],{"categories":2267},[174],{"categories":2269},[177],{"categories":2271},[177],{"categories":2273},[174,171],{"categories":2275},[],{"categories":2277},[216],{"categories":2279},[],{"categories":2281},[174],{"categories":2283},[195],{"categories":2285},[168],{"categories":2287},[168],{"categories":2289},[177],{"categories":2291},[174],{"categories":2293},[171],{"categories":2295},[225],{"categories":2297},[232],{"categories":2299},[],{"categories":2301},[195],{"categories":2303},[174],{"categories":2305},[174],{"categories":2307},[195],{"categories":2309},[225],{"categories":2311},[174],{"categories":2313},[177],{"categories":2315},[195],{"categories":2317},[174],{"categories":2319},[216],{"categories":2321},[174],{"categories":2323},[174],{"categories":2325},[487],{"categories":2327},[180],{"categories":2329},[177],{"categories":2331},[174],{"categories":2333},[195],{"categories":2335},[177],{"categories":2337},[232],{"categories":2339},[174],{"categories":2341},[],{"categories":2343},[174],{"categories":2345},[],{"categories":2347},[],{"categories":2349},[],{"categories":2351},[171],{"categories":2353},[174],{"categories":2355},[177],{"categories":2357},[195],{"categories":2359},[195],{"categories":2361},[195],{"categories":2363},[195],{"categories":2365},[],{"categories":2367},[168],{"categories":2369},[177],{"categories":2371},[195],{"categories":2373},[168],{"categories":2375},[177],{"categories":2377},[174],{"categories":2379},[174,177],{"categories":2381},[177],{"categories":2383},[487],{"categories":2385},[195],{"categories":2387},[195],{"categories":2389},[177],{"categories":2391},[174],{"categories":2393},[],{"categories":2395},[195],{"categories":2397},[232],{"categories":2399},[168],{"categories":2401},[174],{"categories":2403},[174],{"categories":2405},[],{"categories":2407},[225],{"categories":2409},[],{"categories":2411},[168],{"categories":2413},[177],{"categories":2415},[195],{"categories":2417},[174],{"categories":2419},[195],{"categories":2421},[168],{"categories":2423},[195],{"categories":2425},[195],{"categories":2427},[],{"categories":2429},[171],{"categories":2431},[177],{"categories":2433},[195],{"categories":2435},[195],{"categories":2437},[195],{"categories":2439},[195],{"categories":2441},[195],{"categories":2443},[195],{"categories":2445},[195],{"categories":2447},[195],{"categories":2449},[195],{"categories":2451},[195],{"categories":2453},[140],{"categories":2455},[168],{"categories":2457},[174],{"categories":2459},[174],{"categories":2461},[],{"categories":2463},[174,168],{"categories":2465},[],{"categories":2467},[177],{"categories":2469},[195],{"categories":2471},[177],{"categories":2473},[174],{"categories":2475},[174],{"categories":2477},[174],{"categories":2479},[174],{"categories":2481},[174],{"categories":2483},[177],{"categories":2485},[171],{"categories":2487},[216],{"categories":2489},[195],{"categories":2491},[174],{"categories":2493},[],{"categories":2495},[],{"categories":2497},[177],{"categories":2499},[216],{"categories":2501},[174],{"categories":2503},[],{"categories":2505},[],{"categories":2507},[232],{"categories":2509},[174],{"categories":2511},[],{"categories":2513},[],{"categories":2515},[168],{"categories":2517},[171],{"categories":2519},[174],{"categories":2521},[171],{"categories":2523},[216],{"categories":2525},[],{"categories":2527},[195],{"categories":2529},[],{"categories":2531},[216],{"categories":2533},[174],{"categories":2535},[232],{"categories":2537},[],{"categories":2539},[232],{"categories":2541},[],{"categories":2543},[],{"categories":2545},[177],{"categories":2547},[],{"categories":2549},[171],{"categories":2551},[168],{"categories":2553},[216],{"categories":2555},[225],{"categories":2557},[],{"categories":2559},[],{"categories":2561},[174],{"categories":2563},[168],{"categories":2565},[232],{"categories":2567},[],{"categories":2569},[177],{"categories":2571},[177],{"categories":2573},[195],{"categories":2575},[174],{"categories":2577},[177],{"categories":2579},[174],{"categories":2581},[177],{"categories":2583},[174],{"categories":2585},[180],{"categories":2587},[195],{"categories":2589},[],{"categories":2591},[232],{"categories":2593},[225],{"categories":2595},[177],{"categories":2597},[],{"categories":2599},[174],{"categories":2601},[177],{"categories":2603},[171],{"categories":2605},[168],{"categories":2607},[174],{"categories":2609},[216],{"categories":2611},[225],{"categories":2613},[225],{"categories":2615},[174],{"categories":2617},[140],{"categories":2619},[174],{"categories":2621},[177],{"categories":2623},[171],{"categories":2625},[177],{"categories":2627},[174],{"categories":2629},[174],{"categories":2631},[177],{"categories":2633},[195],{"categories":2635},[],{"categories":2637},[168],{"categories":2639},[174],{"categories":2641},[177],{"categories":2643},[174],{"categories":2645},[174],{"categories":2647},[],{"categories":2649},[216],{"categories":2651},[171],{"categories":2653},[195],{"categories":2655},[174],{"categories":2657},[174],{"categories":2659},[216],{"categories":2661},[232],{"categories":2663},[140],{"categories":2665},[174],{"categories":2667},[195],{"categories":2669},[174],{"categories":2671},[177],{"categories":2673},[487],{"categories":2675},[174],{"categories":2677},[177],{"categories":2679},[140],{"categories":2681},[],{"categories":2683},[177],{"categories":2685},[225],{"categories":2687},[216],{"categories":2689},[174],{"categories":2691},[168],{"categories":2693},[171],{"categories":2695},[225],{"categories":2697},[],{"categories":2699},[177],{"categories":2701},[174],{"categories":2703},[],{"categories":2705},[195],{"categories":2707},[],{"categories":2709},[195],{"categories":2711},[174],{"categories":2713},[177],{"categories":2715},[177],{"categories":2717},[177],{"categories":2719},[],{"categories":2721},[],{"categories":2723},[174],{"categories":2725},[174],{"categories":2727},[],{"categories":2729},[216],{"categories":2731},[177],{"categories":2733},[232],{"categories":2735},[168],{"categories":2737},[],{"categories":2739},[],{"categories":2741},[195],{"categories":2743},[225],{"categories":2745},[174],{"categories":2747},[174],{"categories":2749},[174],{"categories":2751},[225],{"categories":2753},[195],{"categories":2755},[216],{"categories":2757},[174],{"categories":2759},[174],{"categories":2761},[174],{"categories":2763},[195],{"categories":2765},[174],{"categories":2767},[195],{"categories":2769},[177],{"categories":2771},[177],{"categories":2773},[225],{"categories":2775},[177],{"categories":2777},[174],{"categories":2779},[225],{"categories":2781},[216],{"categories":2783},[],{"categories":2785},[177],{"categories":2787},[],{"categories":2789},[],{"categories":2791},[171],{"categories":2793},[174],{"categories":2795},[177],{"categories":2797},[168],{"categories":2799},[177],{"categories":2801},[232],{"categories":2803},[],{"categories":2805},[177],{"categories":2807},[],{"categories":2809},[168],{"categories":2811},[177],{"categories":2813},[],{"categories":2815},[177],{"categories":2817},[174],{"categories":2819},[195],{"categories":2821},[174],{"categories":2823},[177],{"categories":2825},[195],{"categories":2827},[177],{"categories":2829},[225],{"categories":2831},[216],{"categories":2833},[168],{"categories":2835},[],{"categories":2837},[177],{"categories":2839},[216],{"categories":2841},[195],{"categories":2843},[174],{"categories":2845},[216],{"categories":2847},[168],{"categories":2849},[],{"categories":2851},[177],{"categories":2853},[177],{"categories":2855},[174],{"categories":2857},[],{"categories":2859},[177],{"categories":2861},[180],{"categories":2863},[195],{"categories":2865},[177],{"categories":2867},[171],{"categories":2869},[],{"categories":2871},[174],{"categories":2873},[180],{"categories":2875},[174],{"categories":2877},[177],{"categories":2879},[195],{"categories":2881},[168],{"categories":2883},[487],{"categories":2885},[174],{"categories":2887},[174],{"categories":2889},[174],{"categories":2891},[195],{"categories":2893},[171],{"categories":2895},[174],{"categories":2897},[216],{"categories":2899},[195],{"categories":2901},[487],{"categories":2903},[174],{"categories":2905},[],{"categories":2907},[],{"categories":2909},[487],{"categories":2911},[140],{"categories":2913},[177],{"categories":2915},[177],{"categories":2917},[195],{"categories":2919},[174],{"categories":2921},[168],{"categories":2923},[216],{"categories":2925},[177],{"categories":2927},[174],{"categories":2929},[232],{"categories":2931},[174],{"categories":2933},[177],{"categories":2935},[],{"categories":2937},[174],{"categories":2939},[174],{"categories":2941},[195],{"categories":2943},[168],{"categories":2945},[],{"categories":2947},[174],{"categories":2949},[174],{"categories":2951},[225],{"categories":2953},[216],{"categories":2955},[174,177],{"categories":2957},[232,171],{"categories":2959},[174],{"categories":2961},[],{"categories":2963},[177],{"categories":2965},[],{"categories":2967},[225],{"categories":2969},[174],{"categories":2971},[195],{"categories":2973},[],{"categories":2975},[177],{"categories":2977},[],{"categories":2979},[177],{"categories":2981},[168],{"categories":2983},[177],{"categories":2985},[174],{"categories":2987},[487],{"categories":2989},[232],{"categories":2991},[171],{"categories":2993},[171],{"categories":2995},[168],{"categories":2997},[168],{"categories":2999},[174],{"categories":3001},[177],{"categories":3003},[174],{"categories":3005},[174],{"categories":3007},[168],{"categories":3009},[174],{"categories":3011},[232],{"categories":3013},[195],{"categories":3015},[174],{"categories":3017},[177],{"categories":3019},[174],{"categories":3021},[],{"categories":3023},[225],{"categories":3025},[],{"categories":3027},[177],{"categories":3029},[168],{"categories":3031},[],{"categories":3033},[487],{"categories":3035},[174],{"categories":3037},[],{"categories":3039},[195],{"categories":3041},[177],{"categories":3043},[225],{"categories":3045},[174],{"categories":3047},[177],{"categories":3049},[225],{"categories":3051},[177],{"categories":3053},[195],{"categories":3055},[168],{"categories":3057},[195],{"categories":3059},[225],{"categories":3061},[174],{"categories":3063},[216],{"categories":3065},[174],{"categories":3067},[174],{"categories":3069},[174],{"categories":3071},[174],{"categories":3073},[177],{"categories":3075},[174],{"categories":3077},[177],{"categories":3079},[174],{"categories":3081},[168],{"categories":3083},[174],{"categories":3085},[177],{"categories":3087},[216],{"categories":3089},[168],{"categories":3091},[177],{"categories":3093},[216],{"categories":3095},[],{"categories":3097},[174],{"categories":3099},[174],{"categories":3101},[225],{"categories":3103},[],{"categories":3105},[177],{"categories":3107},[232],{"categories":3109},[174],{"categories":3111},[195],{"categories":3113},[232],{"categories":3115},[177],{"categories":3117},[171],{"categories":3119},[171],{"categories":3121},[174],{"categories":3123},[168],{"categories":3125},[],{"categories":3127},[174],{"categories":3129},[],{"categories":3131},[168],{"categories":3133},[174],{"categories":3135},[177],{"categories":3137},[177],{"categories":3139},[],{"categories":3141},[225],{"categories":3143},[225],{"categories":3145},[232],{"categories":3147},[216],{"categories":3149},[],{"categories":3151},[174],{"categories":3153},[168],{"categories":3155},[174],{"categories":3157},[225],{"categories":3159},[168],{"categories":3161},[195],{"categories":3163},[195],{"categories":3165},[],{"categories":3167},[195],{"categories":3169},[177],{"categories":3171},[216],{"categories":3173},[140],{"categories":3175},[174],{"categories":3177},[],{"categories":3179},[195],{"categories":3181},[225],{"categories":3183},[171],{"categories":3185},[174],{"categories":3187},[168],{"categories":3189},[487],{"categories":3191},[168],{"categories":3193},[],{"categories":3195},[],{"categories":3197},[195],{"categories":3199},[],{"categories":3201},[177],{"categories":3203},[177],{"categories":3205},[177],{"categories":3207},[],{"categories":3209},[174],{"categories":3211},[],{"categories":3213},[195],{"categories":3215},[168],{"categories":3217},[216],{"categories":3219},[174],{"categories":3221},[195],{"categories":3223},[195],{"categories":3225},[],{"categories":3227},[195],{"categories":3229},[168],{"categories":3231},[174],{"categories":3233},[],{"categories":3235},[177],{"categories":3237},[177],{"categories":3239},[168],{"categories":3241},[],{"categories":3243},[],{"categories":3245},[],{"categories":3247},[216],{"categories":3249},[177],{"categories":3251},[174],{"categories":3253},[],{"categories":3255},[],{"categories":3257},[],{"categories":3259},[216],{"categories":3261},[],{"categories":3263},[168],{"categories":3265},[],{"categories":3267},[],{"categories":3269},[216],{"categories":3271},[174],{"categories":3273},[195],{"categories":3275},[],{"categories":3277},[232],{"categories":3279},[195],{"categories":3281},[232],{"categories":3283},[174],{"categories":3285},[],{"categories":3287},[],{"categories":3289},[177],{"categories":3291},[],{"categories":3293},[],{"categories":3295},[177],{"categories":3297},[174],{"categories":3299},[],{"categories":3301},[177],{"categories":3303},[195],{"categories":3305},[232],{"categories":3307},[140],{"categories":3309},[177],{"categories":3311},[177],{"categories":3313},[],{"categories":3315},[],{"categories":3317},[],{"categories":3319},[195],{"categories":3321},[],{"categories":3323},[],{"categories":3325},[216],{"categories":3327},[168],{"categories":3329},[],{"categories":3331},[171],{"categories":3333},[232],{"categories":3335},[174],{"categories":3337},[225],{"categories":3339},[168],{"categories":3341},[140],{"categories":3343},[171],{"categories":3345},[225],{"categories":3347},[],{"categories":3349},[],{"categories":3351},[177],{"categories":3353},[168],{"categories":3355},[216],{"categories":3357},[168],{"categories":3359},[177],{"categories":3361},[487],{"categories":3363},[177],{"categories":3365},[],{"categories":3367},[174],{"categories":3369},[195],{"categories":3371},[225],{"categories":3373},[],{"categories":3375},[216],{"categories":3377},[195],{"categories":3379},[168],{"categories":3381},[177],{"categories":3383},[174],{"categories":3385},[171],{"categories":3387},[177,487],{"categories":3389},[177],{"categories":3391},[225],{"categories":3393},[174],{"categories":3395},[140],{"categories":3397},[232],{"categories":3399},[177],{"categories":3401},[],{"categories":3403},[177],{"categories":3405},[174],{"categories":3407},[171],{"categories":3409},[],{"categories":3411},[],{"categories":3413},[174],{"categories":3415},[140],{"categories":3417},[174],{"categories":3419},[],{"categories":3421},[195],{"categories":3423},[],{"categories":3425},[195],{"categories":3427},[225],{"categories":3429},[177],{"categories":3431},[174],{"categories":3433},[232],{"categories":3435},[225],{"categories":3437},[],{"categories":3439},[195],{"categories":3441},[174],{"categories":3443},[],{"categories":3445},[174],{"categories":3447},[177],{"categories":3449},[174],{"categories":3451},[177],{"categories":3453},[174],{"categories":3455},[174],{"categories":3457},[174],{"categories":3459},[174],{"categories":3461},[171],{"categories":3463},[],{"categories":3465},[180],{"categories":3467},[195],{"categories":3469},[174],{"categories":3471},[],{"categories":3473},[225],{"categories":3475},[174],{"categories":3477},[174],{"categories":3479},[177],{"categories":3481},[195],{"categories":3483},[174],{"categories":3485},[174],{"categories":3487},[171],{"categories":3489},[177],{"categories":3491},[216],{"categories":3493},[],{"categories":3495},[140],{"categories":3497},[174],{"categories":3499},[],{"categories":3501},[195],{"categories":3503},[232],{"categories":3505},[],{"categories":3507},[],{"categories":3509},[195],{"categories":3511},[195],{"categories":3513},[232],{"categories":3515},[168],{"categories":3517},[177],{"categories":3519},[177],{"categories":3521},[174],{"categories":3523},[171],{"categories":3525},[],{"categories":3527},[],{"categories":3529},[195],{"categories":3531},[140],{"categories":3533},[225],{"categories":3535},[177],{"categories":3537},[216],{"categories":3539},[140],{"categories":3541},[140],{"categories":3543},[],{"categories":3545},[195],{"categories":3547},[174],{"categories":3549},[174],{"categories":3551},[225],{"categories":3553},[],{"categories":3555},[195],{"categories":3557},[195],{"categories":3559},[195],{"categories":3561},[],{"categories":3563},[177],{"categories":3565},[174],{"categories":3567},[],{"categories":3569},[168],{"categories":3571},[171],{"categories":3573},[],{"categories":3575},[174],{"categories":3577},[174],{"categories":3579},[],{"categories":3581},[225],{"categories":3583},[],{"categories":3585},[],{"categories":3587},[],{"categories":3589},[],{"categories":3591},[174],{"categories":3593},[195],{"categories":3595},[],{"categories":3597},[],{"categories":3599},[174],{"categories":3601},[174],{"categories":3603},[174],{"categories":3605},[140],{"categories":3607},[174],{"categories":3609},[140],{"categories":3611},[],{"categories":3613},[140],{"categories":3615},[140],{"categories":3617},[487],{"categories":3619},[177],{"categories":3621},[225],{"categories":3623},[],{"categories":3625},[],{"categories":3627},[140],{"categories":3629},[225],{"categories":3631},[225],{"categories":3633},[225],{"categories":3635},[],{"categories":3637},[168],{"categories":3639},[225],{"categories":3641},[225],{"categories":3643},[168],{"categories":3645},[225],{"categories":3647},[171],{"categories":3649},[225],{"categories":3651},[225],{"categories":3653},[225],{"categories":3655},[140],{"categories":3657},[195],{"categories":3659},[195],{"categories":3661},[174],{"categories":3663},[225],{"categories":3665},[140],{"categories":3667},[487],{"categories":3669},[140],{"categories":3671},[140],{"categories":3673},[140],{"categories":3675},[],{"categories":3677},[171],{"categories":3679},[],{"categories":3681},[487],{"categories":3683},[225],{"categories":3685},[225],{"categories":3687},[225],{"categories":3689},[177],{"categories":3691},[195,171],{"categories":3693},[140],{"categories":3695},[],{"categories":3697},[],{"categories":3699},[140],{"categories":3701},[],{"categories":3703},[140],{"categories":3705},[195],{"categories":3707},[177],{"categories":3709},[],{"categories":3711},[225],{"categories":3713},[174],{"categories":3715},[216],{"categories":3717},[],{"categories":3719},[174],{"categories":3721},[],{"categories":3723},[195],{"categories":3725},[168],{"categories":3727},[140],{"categories":3729},[],{"categories":3731},[225],{"categories":3733},[195],[3735,3968,4213,4344],{"id":3736,"title":3737,"ai":3738,"body":3743,"categories":3937,"created_at":141,"date_modified":141,"description":130,"extension":143,"faq":141,"featured":144,"kicker_label":141,"meta":3938,"navigation":146,"path":3954,"published_at":3955,"question":141,"scraped_at":3956,"seo":3957,"sitemap":3958,"source_id":3959,"source_name":3960,"source_type":3961,"source_url":3962,"stem":3963,"tags":3964,"thumbnail_url":141,"tldr":3965,"tweet":141,"unknown_tags":3966,"__hash__":3967},"summaries\u002Fsummaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary.md","skfolio: Build & Tune Portfolio Optimizers in Python",{"provider":7,"model":8,"input_tokens":3739,"output_tokens":3740,"processing_time_ms":3741,"cost_usd":3742},9292,2519,30098,0.00309525,{"type":14,"value":3744,"toc":3931},[3745,3749,3780,3784,3833,3837,3902,3906],[17,3746,3748],{"id":3747},"data-prep-and-baseline-benchmarks-deliver-quick-wins","Data Prep and Baseline Benchmarks Deliver Quick Wins",[22,3750,3751,3752,3755,3756,3759,3760,3763,3764,3767,3768,3771,3772,3775,3776,3779],{},"Load S&P 500 prices via ",[55,3753,3754],{},"skfolio.datasets.load_sp500_dataset()",", convert to returns with ",[55,3757,3758],{},"prices_to_returns()",", and split chronologically (",[55,3761,3762],{},"train_test_split(shuffle=False, test_size=0.33)",") to prevent look-ahead bias—training spans ~67% historical days, testing the rest. Baselines like ",[55,3765,3766],{},"EqualWeighted()",", ",[55,3769,3770],{},"InverseVolatility()",", and ",[55,3773,3774],{},"Random()"," fit on train, predict on test, yielding metrics like annualized Sharpe (printed via ",[55,3777,3778],{},"ptf.annualized_sharpe_ratio","), mean return, and volatility. These expose naive strategies' flaws: equal-weight ignores volatility, random adds noise—use them to benchmark any optimizer.",[17,3781,3783],{"id":3782},"mean-variance-risk-measures-and-clustering-beat-baselines","Mean-Variance, Risk Measures, and Clustering Beat Baselines",[22,3785,3786,3789,3790,3793,3794,3797,3798,3801,3802,3767,3805,3808,3809,3812,3813,3816,3817,3820,3821,3824,3825,3828,3829,3832],{},[55,3787,3788],{},"MeanRisk(risk_measure=RiskMeasure.VARIANCE)"," minimizes variance or maximizes Sharpe (",[55,3791,3792],{},"ObjectiveFunction.MAXIMIZE_RATIO","), generating efficient frontiers (",[55,3795,3796],{},"efficient_frontier_size=20",") plotted by risk vs. Sharpe. Swap risks to ",[55,3799,3800],{},"CVaR"," (95%), ",[55,3803,3804],{},"SEMI_VARIANCE",[55,3806,3807],{},"CDAR",", or ",[55,3810,3811],{},"MAX_DRAWDOWN"," for tail-focused portfolios that cut CVaR@95% and max drawdown vs. variance. ",[55,3814,3815],{},"RiskBudgeting()"," equalizes contributions (variance or CVaR). Hierarchical methods shine: ",[55,3818,3819],{},"HierarchicalRiskParity()"," clusters assets via dendrograms for stable weights; ",[55,3822,3823],{},"NestedClustersOptimization()"," nests ",[55,3826,3827],{},"MeanRisk(CVAR)"," inside ",[55,3830,3831],{},"RiskBudgeting(VARIANCE)"," with 5-fold CV, capturing correlations without covariance pitfalls.",[17,3834,3836],{"id":3835},"robust-priors-constraints-and-views-stabilize-real-world-use","Robust Priors, Constraints, and Views Stabilize Real-World Use",[22,3838,3839,3840,3843,3844,3847,3848,3767,3851,3767,3854,3808,3857,3860,3861,3864,3865,3767,3868,3767,3871,3767,3874,3877,3878,3881,3882,3885,3886,3889,3890,3893,3894,3897,3898,3901],{},"Replace ",[55,3841,3842],{},"EmpiricalCovariance()","\u002F",[55,3845,3846],{},"EmpiricalMu()"," with ",[55,3849,3850],{},"DenoiseCovariance()",[55,3852,3853],{},"ShrunkMu()",[55,3855,3856],{},"GerberCovariance()",[55,3858,3859],{},"EWMu(alpha=0.1)"," in ",[55,3862,3863],{},"EmpiricalPrior()"," for max-Sharpe portfolios resilient to estimation error. Add realism via ",[55,3866,3867],{},"min_weights=0.0",[55,3869,3870],{},"max_weights=0.20",[55,3872,3873],{},"transaction_costs=0.0005",[55,3875,3876],{},"groups"," (e.g., GroupA \u003C=0.6, GroupB>=0.2), ",[55,3879,3880],{},"l2_coef=0.01",". ",[55,3883,3884],{},"BlackLitterman(views=[\"AAPL == 0.0008\", \"JPM - BAC == 0.0002\"])"," blends market priors with views. ",[55,3887,3888],{},"FactorModel()"," on ",[55,3891,3892],{},"load_factors_dataset()"," explains returns via external factors, boosting Sharpe. Pipelines like ",[55,3895,3896],{},"SelectKExtremes(k=8)"," + ",[55,3899,3900],{},"MeanRisk()"," prune to top performers.",[17,3903,3905],{"id":3904},"walk-forward-cv-and-tuning-ensure-out-of-sample-performance","Walk-Forward CV and Tuning Ensure Out-of-Sample Performance",[22,3907,3908,3847,3911,3914,3915,3918,3919,3922,3923,3926,3927,3930],{},[55,3909,3910],{},"cross_val_predict()",[55,3912,3913],{},"WalkForward(train_size=252*2, test_size=63)"," simulates rolling 2-year trains\u002F3-month tests, computing portfolio Sharpe\u002FCalmar. ",[55,3916,3917],{},"GridSearchCV()"," tunes ",[55,3920,3921],{},"l2_coef=[0.0,0.01,0.1]"," and ",[55,3924,3925],{},"mu_estimator__alpha=[0.05,0.1,0.2,0.5]"," on max-Sharpe, selecting best CV Sharpe. Final ",[55,3928,3929],{},"Population()"," of 18 strategies compares annualized mean\u002Fvol\u002FSharpe\u002FSortino\u002FCVaR@95%\u002Fdrawdowns (sorted by test Sharpe), with plots for cumulative returns, weights, risk contributions—revealing hierarchical\u002Frisk-parity often top variance-based in stability.",{"title":130,"searchDepth":131,"depth":131,"links":3932},[3933,3934,3935,3936],{"id":3747,"depth":131,"text":3748},{"id":3782,"depth":131,"text":3783},{"id":3835,"depth":131,"text":3836},{"id":3904,"depth":131,"text":3905},[140],{"content_references":3939,"triage":3949},[3940,3945],{"type":3941,"title":3942,"url":3943,"context":3944},"tool","skfolio","https:\u002F\u002Fgithub.com\u002Fskfolio\u002Fskfolio","mentioned",{"type":3946,"title":3947,"url":3948,"context":3944},"other","Full Codes","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002Fportfolio_optimization_with_skfolio_Marktechpost.ipynb",{"relevance":3950,"novelty":3950,"quality":3951,"actionability":3951,"composite":3952,"reasoning":3953},3,4,3.45,"Category: Data Science & Visualization. The article provides a practical guide on using the skfolio library for portfolio optimization, which aligns with the audience's interest in actionable AI and data science tools. It includes specific code examples and methodologies that can be directly applied, making it useful for developers looking to implement AI in financial products.","\u002Fsummaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary","2026-05-12 07:05:02","2026-05-12 15:01:25",{"title":3737,"description":130},{"loc":3954},"ff126f8e0954389e","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Fa-coding-implementation-to-portfolio-optimization-with-skfolio-for-building-testing-tuning-and-comparing-modern-investment-strategies\u002F","summaries\u002Fff126f8e0954389e-skfolio-build-tune-portfolio-optimizers-in-python-summary",[160,158,159],"skfolio's scikit-learn API lets you construct, validate, and compare 18+ portfolio strategies—from baselines to HRP, Black-Litterman, factors, and tuned models—on S&P 500 returns with walk-forward CV and GridSearchCV.",[],"s9QUFNF_HWzNZV61Dh6PEETN3C3-K3FsZalb0rd3HRQ",{"id":3969,"title":3970,"ai":3971,"body":3976,"categories":4183,"created_at":141,"date_modified":141,"description":130,"extension":143,"faq":141,"featured":144,"kicker_label":141,"meta":4184,"navigation":146,"path":4201,"published_at":4202,"question":141,"scraped_at":4203,"seo":4204,"sitemap":4205,"source_id":4206,"source_name":3960,"source_type":3961,"source_url":4207,"stem":4208,"tags":4209,"thumbnail_url":141,"tldr":4210,"tweet":141,"unknown_tags":4211,"__hash__":4212},"summaries\u002Fsummaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary.md","Scanpy Pipeline for PBMC scRNA-seq Clustering & Trajectories",{"provider":7,"model":8,"input_tokens":3972,"output_tokens":3973,"processing_time_ms":3974,"cost_usd":3975},9209,2235,26831,0.0029368,{"type":14,"value":3977,"toc":4177},[3978,3982,4014,4040,4044,4067,4083,4087,4110,4128,4132,4163],[17,3979,3981],{"id":3980},"rigorous-qc-and-filtering-removes-noise-for-reliable-downstream-analysis","Rigorous QC and Filtering Removes Noise for Reliable Downstream Analysis",[22,3983,3984,3985,3988,3989,3992,3993,3996,3997,4000,4001,4004,4005,3767,4008,3767,4011,4013],{},"Load PBMC-3k via ",[55,3986,3987],{},"sc.datasets.pbmc3k()"," (2700 cells, ~2k genes\u002Fcell). Compute QC metrics for mitochondrial (",[55,3990,3991],{},"MT-"," prefix, filter \u003C5% ",[55,3994,3995],{},"pct_counts_mt",") and ribosomal (",[55,3998,3999],{},"RPS\u002FRPL",") genes using ",[55,4002,4003],{},"sc.pp.calculate_qc_metrics",". Visualize with violin plots (",[55,4006,4007],{},"n_genes_by_counts",[55,4009,4010],{},"total_counts",[55,4012,3995],{},") and scatters to spot outliers.",[22,4015,4016,4017,3767,4020,4023,4024,4027,4028,4031,4032,4035,4036,4039],{},"Filter: ",[55,4018,4019],{},"min_genes=200",[55,4021,4022],{},"min_cells=3",", upper ",[55,4025,4026],{},"n_genes_by_counts \u003C2500",". Detect doublets via ",[55,4029,4030],{},"sc.pp.scrublet"," (removes ~sum of ",[55,4033,4034],{},"predicted_doublet","). Preserve raw in ",[55,4037,4038],{},"layers[\"counts\"]",". This yields cleaner data, preventing artifacts in clustering.",[17,4041,4043],{"id":4042},"normalization-hvgs-and-cell-cycle-correction-focus-on-biological-signal","Normalization, HVGs, and Cell-Cycle Correction Focus on Biological Signal",[22,4045,4046,4047,4050,4051,4054,4055,4058,4059,4062,4063,4066],{},"Normalize to 10k counts (",[55,4048,4049],{},"sc.pp.normalize_total(target_sum=1e4)","), log-transform (",[55,4052,4053],{},"sc.pp.log1p","). Identify highly variable genes (",[55,4056,4057],{},"sc.pp.highly_variable_genes(min_mean=0.0125, max_mean=3, min_disp=0.5)","), subset to them (",[55,4060,4061],{},"adata = adata[:, adata.var.highly_variable]","). Store raw in ",[55,4064,4065],{},"adata.raw",".",[22,4068,4069,4070,3767,4072,4074,4075,4078,4079,4082],{},"Score S\u002FG2M phases with 40+ predefined markers (e.g., S: MCM5,PCNA; G2M: HMGB2,CDK1, filter to dataset genes). Regress out ",[55,4071,4010],{},[55,4073,3995],{}," (",[55,4076,4077],{},"sc.pp.regress_out","). Scale (",[55,4080,4081],{},"sc.pp.scale(max_value=10)","). These steps isolate biological variance, regressing technical noise for accurate modeling.",[17,4084,4086],{"id":4085},"dimensionality-reduction-leiden-clustering-and-marker-based-annotation-reveals-cell-types","Dimensionality Reduction, Leiden Clustering, and Marker-Based Annotation Reveals Cell Types",[22,4088,4089,4090,4093,4094,4097,4098,4101,4102,4105,4106,4109],{},"PCA (",[55,4091,4092],{},"sc.tl.pca(svd_solver=\"arpack\")",", check ",[55,4095,4096],{},"n_pcs=50"," variance). Neighbors (",[55,4099,4100],{},"sc.pp.neighbors(n_neighbors=10, n_pcs=40)","). Embeddings: UMAP (",[55,4103,4104],{},"sc.tl.umap","), t-SNE (",[55,4107,4108],{},"sc.tl.tsne(n_pcs=40)",").",[22,4111,4112,4113,4116,4117,4120,4121,3767,4124,4127],{},"Cluster with Leiden (",[55,4114,4115],{},"sc.tl.leiden(resolution=0.5, flavor=\"igraph\", n_iterations=2)","). Rank markers (",[55,4118,4119],{},"sc.tl.rank_genes_groups(method=\"wilcoxon\")",", top 10\u002Fcluster via Wilcoxon). Annotate using PBMC markers: B-cell (CD79A,MS4A1), CD8 T (CD8A,CD8B), CD4 T (IL7R,CD4), NK (GNLY,NKG7), CD14 Mono (CD14,LYZ), FCGR3A Mono (FCGR3A,MS4A7), Dendritic (FCER1A,CST3), Mega (PPBP). Confirm via ",[55,4122,4123],{},"sc.pl.dotplot",[55,4125,4126],{},"sc.pl.stacked_violin(groupby=\"leiden\")",". Visualizes 8-9 clusters matching immune subsets.",[17,4129,4131],{"id":4130},"paga-trajectories-pseudotime-and-custom-scores-enable-developmental-insights","PAGA Trajectories, Pseudotime, and Custom Scores Enable Developmental Insights",[22,4133,4134,4135,4138,4139,4142,4143,4146,4147,4150,4151,4154,4155,4158,4159,4162],{},"Graph-based trajectories: ",[55,4136,4137],{},"sc.tl.paga(groups=\"leiden\")",", threshold=0.1, init UMAP (",[55,4140,4141],{},"sc.tl.umap(init_pos=\"paga\")","). Diffusion maps (",[55,4144,4145],{},"sc.tl.diffmap","), recompute neighbors on ",[55,4148,4149],{},"X_diffmap",", root at cluster 0 (",[55,4152,4153],{},"adata.uns[\"iroot\"]","), pseudotime (",[55,4156,4157],{},"sc.tl.dpt","). Plot ",[55,4160,4161],{},"dpt_pseudotime"," on UMAP.",[22,4164,4165,4166,3767,4169,4172,4173,4176],{},"Custom score: IFN-response genes (ISG15,IFI6,IFIT1,IFIT3,MX1,OAS1,STAT1,IRF7) via ",[55,4167,4168],{},"sc.tl.score_genes(score_name=\"IFN_score\")",[55,4170,4171],{},"cmap=\"viridis\"",". Save full AnnData (",[55,4174,4175],{},"adata.write(\"pbmc3k_analyzed.h5ad\")",") with embeddings, clusters, scores for reuse. Extends basic clustering to infer progression and response states.",{"title":130,"searchDepth":131,"depth":131,"links":4178},[4179,4180,4181,4182],{"id":3980,"depth":131,"text":3981},{"id":4042,"depth":131,"text":4043},{"id":4085,"depth":131,"text":4086},{"id":4130,"depth":131,"text":4131},[140],{"content_references":4185,"triage":4198},[4186,4189,4192,4194],{"type":3941,"title":4187,"url":4188,"context":3944},"Scanpy","https:\u002F\u002Fgithub.com\u002Fscverse\u002Fscanpy",{"type":4190,"title":4191,"context":3944},"dataset","PBMC-3k",{"type":3941,"title":4193,"context":3944},"Scrublet",{"type":3946,"title":4195,"url":4196,"context":4197},"Full Codes with Notebook","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002Fscanpy_pbmc3k_single_cell_rnaseq_analysis_Marktechpost.ipynb","recommended",{"relevance":3950,"novelty":131,"quality":3951,"actionability":3950,"composite":4199,"reasoning":4200},3.05,"Category: Data Science & Visualization. The article provides a detailed overview of building a single-cell RNA-seq analysis pipeline using Scanpy, which is relevant for data scientists working with biological data. However, it primarily focuses on a specific use case without broader implications or insights that could apply to a wider audience.","\u002Fsummaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary","2026-05-08 21:32:12","2026-05-09 15:37:24",{"title":3970,"description":130},{"loc":4201},"a59df2d47dafe018","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F08\u002Fhow-to-build-a-single-cell-rna-seq-analysis-pipeline-with-scanpy-for-pbmc-clustering-annotation-and-trajectory-discovery\u002F","summaries\u002Fa59df2d47dafe018-scanpy-pipeline-for-pbmc-scrna-seq-clustering-traj-summary",[158,159,160],"Process PBMC-3k data with Scanpy: filter cells (min 200 genes, \u003C2500 genes, \u003C5% mt), remove Scrublet doublets, select HVGs (min_mean=0.0125, max_mean=3, min_disp=0.5), Leiden cluster at res=0.5, annotate via markers, infer PAGA\u002FDPT trajectories, score IFN response.",[],"jTCku7xsp8M-LiBcwiNLzHzB68G5RjE-UBMIb_cET-c",{"id":4214,"title":4215,"ai":4216,"body":4221,"categories":4321,"created_at":141,"date_modified":141,"description":130,"extension":143,"faq":141,"featured":144,"kicker_label":141,"meta":4322,"navigation":146,"path":4332,"published_at":4333,"question":141,"scraped_at":4334,"seo":4335,"sitemap":4336,"source_id":4337,"source_name":3960,"source_type":3961,"source_url":4338,"stem":4339,"tags":4340,"thumbnail_url":141,"tldr":4341,"tweet":141,"unknown_tags":4342,"__hash__":4343},"summaries\u002Fsummaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with--summary.md","TabPFN Beats Tree Models on Tabular Accuracy with Zero Training",{"provider":7,"model":8,"input_tokens":4217,"output_tokens":4218,"processing_time_ms":4219,"cost_usd":4220},9215,1914,16447,0.00277735,{"type":14,"value":4222,"toc":4316},[4223,4227,4230,4241,4271,4274,4278,4281,4302,4305,4309,4312],[17,4224,4226],{"id":4225},"tabpfns-pretraining-enables-direct-inference-on-tabular-tasks","TabPFN's Pretraining Enables Direct Inference on Tabular Tasks",[22,4228,4229],{},"TabPFN is a foundation model pretrained on millions of synthetic tabular datasets from causal processes, allowing it to perform supervised classification without dataset-specific training. Provide your training data during the .fit() call, which loads pretrained weights in 0.47 seconds—no hyperparameter tuning or iterative optimization needed. Predictions use in-context learning: the model conditions on your full training set (e.g., 4,000 samples) alongside test inputs at inference time, mimicking LLM prompting but for structured data. TabPFN-2.5 extends this to larger datasets up to millions of rows, outperforming tuned XGBoost, CatBoost, and ensembles like AutoGluon on benchmarks by capturing general tabular patterns.",[22,4231,4232,4233,4236,4237,4240],{},"To implement, install via ",[55,4234,4235],{},"pip install tabpfn-client scikit-learn catboost",", set ",[55,4238,4239],{},"TABPFN_TOKEN"," from priorlabs.ai, then:",[4242,4243,4246],"pre",{"className":4244,"code":4245,"language":160,"meta":130,"style":130},"language-python shiki shiki-themes github-light github-dark","from tabpfn_client import TabPFNClassifier\ntabpfn = TabPFNClassifier()\ntabpfn.fit(X_train, y_train)  # Loads weights\ntabpfn_preds = tabpfn.predict(X_test)\n",[55,4247,4248,4256,4261,4266],{"__ignoreMap":130},[4249,4250,4253],"span",{"class":4251,"line":4252},"line",1,[4249,4254,4255],{},"from tabpfn_client import TabPFNClassifier\n",[4249,4257,4258],{"class":4251,"line":131},[4249,4259,4260],{},"tabpfn = TabPFNClassifier()\n",[4249,4262,4263],{"class":4251,"line":3950},[4249,4264,4265],{},"tabpfn.fit(X_train, y_train)  # Loads weights\n",[4249,4267,4268],{"class":4251,"line":3951},[4249,4269,4270],{},"tabpfn_preds = tabpfn.predict(X_test)\n",[22,4272,4273],{},"This shifts computation from training to inference, ideal for rapid prototyping where setup speed trumps everything.",[17,4275,4277],{"id":4276},"quantified-wins-over-tree-based-baselines","Quantified Wins Over Tree-Based Baselines",[22,4279,4280],{},"Tested on scikit-learn's synthetic binary classification: 5,000 samples, 20 features (10 informative, 5 redundant), 80\u002F20 train\u002Ftest split.",[93,4282,4283,4290,4296],{},[96,4284,4285,4289],{},[4286,4287,4288],"strong",{},"Random Forest"," (200 trees): 95.5% accuracy, 9.56s train, 0.0627s infer. Robust bagging handles noise but plateaus on complex interactions.",[96,4291,4292,4295],{},[4286,4293,4294],{},"CatBoost"," (500 iterations, depth=6, lr=0.1): 96.7% accuracy, 8.15s train, 0.0119s infer. Boosting edges out RF via error correction, excels in low-latency production.",[96,4297,4298,4301],{},[4286,4299,4300],{},"TabPFN",": 98.8% accuracy, 0.47s fit, 2.21s infer. Gains 2.1-3.3% accuracy by leveraging pretrained priors on noisy features.",[22,4303,4304],{},"TabPFN wins on accuracy and setup for small-to-medium data (\u003C10k rows), eliminating tuning that tree models demand.",[17,4306,4308],{"id":4307},"inference-cost-and-distillation-for-production","Inference Cost and Distillation for Production",[22,4310,4311],{},"TabPFN's 2.21s inference (vs \u003C0.1s for trees) arises from joint processing of train+test data—scales with training set size, unsuitable for real-time apps or huge datasets without tweaks. Solution: distillation engine converts predictions to compact neural nets or tree ensembles, preserving ~98% of accuracy while slashing inference to milliseconds. Use for offline analysis, A\u002FB tests, or batch scoring; distill for deployment. Best for dev speed on tabular tasks where trees fall short, like healthcare\u002Ffinance with mixed types—no preprocessing grind required.",[4313,4314,4315],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":130,"searchDepth":131,"depth":131,"links":4317},[4318,4319,4320],{"id":4225,"depth":131,"text":4226},{"id":4276,"depth":131,"text":4277},{"id":4307,"depth":131,"text":4308},[140],{"content_references":4323,"triage":4328},[4324,4326],{"type":3941,"title":4300,"url":4325,"context":3944},"https:\u002F\u002Fux.priorlabs.ai\u002Fhome",{"type":3946,"title":4195,"url":4327,"context":3944},"https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FData%20Science\u002FTabPFN.ipynb",{"relevance":4329,"novelty":3951,"quality":3951,"actionability":3951,"composite":4330,"reasoning":4331},5,4.35,"Category: AI & LLMs. The article provides a detailed comparison of TabPFN with traditional tree models, addressing the audience's need for practical AI applications in product development. It includes specific implementation steps for using TabPFN, making it actionable for developers looking to integrate this model into their workflows.","\u002Fsummaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with-summary","2026-04-19 19:11:03","2026-04-21 15:26:59",{"title":4215,"description":130},{"loc":4332},"a50c8b812151a371","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fhow-tabpfn-leverages-in-context-learning-to-achieve-superior-accuracy-on-tabular-datasets-compared-to-random-forest-and-catboost\u002F","summaries\u002Fa50c8b812151a371-tabpfn-beats-tree-models-on-tabular-accuracy-with--summary",[159,158,160],"On a 5k-sample tabular dataset, TabPFN hits 98.8% accuracy vs CatBoost's 96.7% and Random Forest's 95.5%, with 0.47s setup but 2.21s inference due to in-context learning at predict time.",[],"ib8Gsg5sdpFcbFssj_HpjZUdd84YjROCIhNcU90X7HE",{"id":4345,"title":4346,"ai":4347,"body":4352,"categories":4704,"created_at":141,"date_modified":141,"description":130,"extension":143,"faq":141,"featured":144,"kicker_label":141,"meta":4705,"navigation":146,"path":4706,"published_at":4707,"question":141,"scraped_at":141,"seo":4708,"sitemap":4709,"source_id":4710,"source_name":4711,"source_type":3961,"source_url":4712,"stem":4713,"tags":4714,"thumbnail_url":141,"tldr":4715,"tweet":141,"unknown_tags":4716,"__hash__":4717},"summaries\u002Fsummaries\u002Fsynthetically-label-sparse-bequest-donors-realisti-summary.md","Synthetically Label Sparse Bequest Donors Realistically",{"provider":7,"model":8,"input_tokens":4348,"output_tokens":4349,"processing_time_ms":4350,"cost_usd":4351},9589,2408,16814,0.00309915,{"type":14,"value":4353,"toc":4698},[4354,4358,4365,4368,4372,4383,4429,4483,4514,4523,4527,4530,4663,4673,4677,4696],[17,4355,4357],{"id":4356},"tackle-imbalanced-bequest-data-with-synthetic-targets","Tackle Imbalanced Bequest Data with Synthetic Targets",[22,4359,4360,4361,4364],{},"Charity databases have \u003C1% confirmed bequest donors—those formally notifying intent—despite >50% of gifts coming from lifetime strangers. Build a realistic target ",[55,4362,4363],{},"bequest_status"," ('Confirmed' or NA) using a propensity formula on RFMT (recency\u002Ffrequency\u002Fmonetary\u002Ftenure), age groups, and regular giving (RG) status. Add controlled randomness via Bernoulli sampling on propensity probability to mimic human variability and block model 'cheating'—where deterministic labels let algorithms rediscover the exact formula, creating an echo chamber.",[22,4366,4367],{},"Max propensity normalizes to ~357 (sum of peak scores: r=5,f=10,m=3,t=10,age=10x2=20 * rg=1.2), yielding probs like 0.089 for high scorers. This forces models to extract true signals amid noise, mirroring real sparse data.",[17,4369,4371],{"id":4370},"engineer-rfmt-age-and-rg-features-from-transactions","Engineer RFMT, Age, and RG Features from Transactions",[22,4373,4374,4375,4378,4379,4382],{},"Start with ",[55,4376,4377],{},"df_opps"," (opportunities) and ",[55,4380,4381],{},"df_contacts",":",[93,4384,4385],{},[96,4386,4387,4390,4391,4394,4395,4398,4399,4402,4403,4406,4407,4410,4411,4402,4414,4417,4418,4420,4421,4424,4425,4428],{},[4286,4388,4389],{},"RFMT",": Group by ",[55,4392,4393],{},"contact_id","; compute ",[55,4396,4397],{},"last_gift_date"," (max ",[55,4400,4401],{},"close_date","), ",[55,4404,4405],{},"first_gift_date"," (min), ",[55,4408,4409],{},"frequency"," (count ",[55,4412,4413],{},"amount",[55,4415,4416],{},"monetary_value"," (sum ",[55,4419,4413],{},"). Then ",[55,4422,4423],{},"recency"," = months since end_date (2025-12-31); ",[55,4426,4427],{},"tenure"," = months between first\u002Flast gift.",[4242,4430,4432],{"className":4244,"code":4431,"language":160,"meta":130,"style":130},"def generate_rfmt(data):\n    df = data.groupby('contact_id').agg({\n        'close_date': ['max', 'min'],\n        'amount': ['count', 'sum']\n    })\n    df.columns = ['last_gift_date', 'first_gift_date', 'frequency', 'monetary_value']\n    # Convert to date, compute recency\u002Ftenure with relativedelta\n    # ...\n    return df.reset_index()\n",[55,4433,4434,4439,4444,4449,4454,4459,4465,4471,4477],{"__ignoreMap":130},[4249,4435,4436],{"class":4251,"line":4252},[4249,4437,4438],{},"def generate_rfmt(data):\n",[4249,4440,4441],{"class":4251,"line":131},[4249,4442,4443],{},"    df = data.groupby('contact_id').agg({\n",[4249,4445,4446],{"class":4251,"line":3950},[4249,4447,4448],{},"        'close_date': ['max', 'min'],\n",[4249,4450,4451],{"class":4251,"line":3951},[4249,4452,4453],{},"        'amount': ['count', 'sum']\n",[4249,4455,4456],{"class":4251,"line":4329},[4249,4457,4458],{},"    })\n",[4249,4460,4462],{"class":4251,"line":4461},6,[4249,4463,4464],{},"    df.columns = ['last_gift_date', 'first_gift_date', 'frequency', 'monetary_value']\n",[4249,4466,4468],{"class":4251,"line":4467},7,[4249,4469,4470],{},"    # Convert to date, compute recency\u002Ftenure with relativedelta\n",[4249,4472,4474],{"class":4251,"line":4473},8,[4249,4475,4476],{},"    # ...\n",[4249,4478,4480],{"class":4251,"line":4479},9,[4249,4481,4482],{},"    return df.reset_index()\n",[93,4484,4485,4494],{},[96,4486,4487,4490,4491,4066],{},[4286,4488,4489],{},"Age groups",": ",[55,4492,4493],{},"pd.cut(age, bins=[0,39,49,59,69,90], labels=['under_40','40-49','50-59','60-69','70_or_over'])",[96,4495,4496,4499,4500,4503,4504,3843,4507,4510,4511,4513],{},[4286,4497,4498],{},"RG status",": Filter ",[55,4501,4502],{},"df_opps[type=='Regular']","; get ",[55,4505,4506],{},"first_rg_date",[55,4508,4509],{},"last_rg_date"," per ID. If ",[55,4512,4509],{}," in 2025-12: 'Active'; else 'Cancelled'. No RG → 'No RG' post-merge.",[22,4515,4516,4517,3843,4520,4066],{},"Merge right on RFMT (drop no-history contacts), left on RG; fillna 'No RG'; drop extras like ",[55,4518,4519],{},"name",[55,4521,4522],{},"gender",[17,4524,4526],{"id":4525},"sector-tailored-scores-capture-counterintuitive-patterns","Sector-Tailored Scores Capture Counterintuitive Patterns",[22,4528,4529],{},"Assign 0-10 scores per feature, weighted for legacy giving realities (e.g., retired lapsed donors outscore active; mid-value > high-value):",[4531,4532,4533,4552],"table",{},[4534,4535,4536],"thead",{},[4537,4538,4539,4543,4546,4549],"tr",{},[4540,4541,4542],"th",{},"Feature",[4540,4544,4545],{},"Bins\u002FLogic",[4540,4547,4548],{},"Labels",[4540,4550,4551],{},"Rationale",[4553,4554,4555,4577,4597,4617,4635,4649],"tbody",{},[4537,4556,4557,4561,4566,4571],{},[4558,4559,4560],"td",{},"Recency",[4558,4562,4563],{},[55,4564,4565],{},"[-1,18,42,84,1000]",[4558,4567,4568],{},[4249,4569,4570],{},"4,5,2,1",[4558,4572,4573,4574,4066],{},"18-42mo 'sweet spot' for retired lapsed (highest); recent active lower; long dormant still viable. ",[55,4575,4576],{},"pd.cut",[4537,4578,4579,4582,4587,4592],{},[4558,4580,4581],{},"Frequency",[4558,4583,4584],{},[55,4585,4586],{},"[-1,2,9,49,99,10000]",[4558,4588,4589],{},[4249,4590,4591],{},"0,1,4,7,10",[4558,4593,4594,4595,4066],{},"Frequency > value; 100+ 'Revolutionary'=10. ",[55,4596,4576],{},[4537,4598,4599,4602,4611,4614],{},[4558,4600,4601],{},"Monetary (quintiles)",[4558,4603,4604,4607,4608],{},[55,4605,4606],{},"pd.qcut(q=5, labels=[1,2,3,4,5])"," → map ",[55,4609,4610],{},"{1:0,2:2,3:3,4:3,5:1}",[4558,4612,4613],{},"Peak mid-quintiles",[4558,4615,4616],{},"Mid-value (40-80%) most generous legacies; top 20% less confirmatory.",[4537,4618,4619,4622,4627,4632],{},[4558,4620,4621],{},"Tenure",[4558,4623,4624],{},[55,4625,4626],{},"pd.cut(bins=5)",[4558,4628,4629],{},[4249,4630,4631],{},"0,1,3,6,10",[4558,4633,4634],{},"Long tenure >> short; steep curve for loyalty.",[4537,4636,4637,4640,4643,4646],{},[4558,4638,4639],{},"Age",[4558,4641,4642],{},"Map groups",[4558,4644,4645],{},"{'under_40':0,'40-49':1,'50-59':3,'60-69':7,'70+':10}",[4558,4647,4648],{},"Exponential post-60; doubled in formula, not gated.",[4537,4650,4651,4654,4657,4660],{},[4558,4652,4653],{},"RG Weight (multiplier)",[4558,4655,4656],{},"Map",[4558,4658,4659],{},"{'Cancelled':1.2,'Active':1.0,'No RG':0.5}",[4558,4661,4662],{},"Lapsed RG strong signal of estate shift.",[22,4664,4665,4668,4669,4672],{},[4286,4666,4667],{},"Raw propensity"," = ",[55,4670,4671],{},"(r_score + f_score + m_score + t_score + 2*age_score) * rg_weight",". E.g., high-freq recent-lapsed 70+: ~31.8 (prob 0.089); low everything: ~1 (prob 0.003).",[17,4674,4676],{"id":4675},"stochastic-assignment-mimics-real-donor-behavior","Stochastic Assignment Mimics Real Donor Behavior",[22,4678,4679,4680,4683,4684,4687,4688,4691,4692,4695],{},"Convert ",[55,4681,4682],{},"raw_propensity"," to ",[55,4685,4686],{},"assignment_prob"," (e.g., ",[55,4689,4690],{},"\u002F357"," for 0-1 scale), then ",[55,4693,4694],{},"bequest_status = np.random.binomial(1, prob)"," → 'Confirmed' if 1. This injects noise: perfect scorers sometimes miss, low scorers occasionally confirm—breaking determinism so downstream classifiers learn generalizable patterns, not the formula.",[4313,4697,4315],{},{"title":130,"searchDepth":131,"depth":131,"links":4699},[4700,4701,4702,4703],{"id":4356,"depth":131,"text":4357},{"id":4370,"depth":131,"text":4371},{"id":4525,"depth":131,"text":4526},{"id":4675,"depth":131,"text":4676},[140],{},"\u002Fsummaries\u002Fsynthetically-label-sparse-bequest-donors-realisti-summary","2026-04-08 21:21:18",{"title":4346,"description":130},{"loc":4706},"e0225ec94060d95d","Data and Beyond","https:\u002F\u002Funknown","summaries\u002Fsynthetically-label-sparse-bequest-donors-realisti-summary",[160,158,159],"Engineer RFMT-age-RG propensity scores with sector-specific bins (e.g., recency sweet spot 18-42mo=5pts) and stochastic noise to create 'Confirmed' labels, preventing models from overfitting formulas in \u003C1% positive charity data.",[],"Y2cIR1YxXNmF6nVq7KUQn_Jk5dp8tvzxIL29SZ2yDmA"]