[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary":3,"summaries-facets-categories":121,"summary-related-79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary":3690},{"id":4,"title":5,"ai":6,"body":13,"categories":74,"created_at":76,"date_modified":76,"description":67,"extension":77,"faq":76,"featured":78,"kicker_label":76,"meta":79,"navigation":103,"path":104,"published_at":105,"question":76,"scraped_at":106,"seo":107,"sitemap":108,"source_id":109,"source_name":110,"source_type":111,"source_url":112,"stem":113,"tags":114,"thumbnail_url":76,"tldr":118,"tweet":76,"unknown_tags":119,"__hash__":120},"summaries\u002Fsummaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary.md","TRL Code Guide: SFT to GRPO LLM Alignment on T4 GPU",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",9458,2615,35753,0.00269195,{"type":14,"value":15,"toc":66},"minimark",[16,21,30,34,44,48,54,58],[17,18,20],"h2",{"id":19},"lora-and-trl-setup-enables-post-training-on-limited-hardware","LoRA and TRL Setup Enables Post-Training on Limited Hardware",[22,23,24,25,29],"p",{},"Use LoRA (r=8, alpha=16, dropout=0.05, targets=",[26,27,28],"span",{},"'q_proj','k_proj','v_proj','o_proj'",") with TRL trainers to adapt Qwen\u002FQwen2.5-0.5B-Instruct on T4 GPU (16GB). Common args across stages: num_train_epochs=1, gradient_checkpointing=True, bf16 if supported else fp16, logging_steps=10, report_to=\"none\", save_strategy=\"no\". Install stack: torchao>=0.16, trl>=0.20, transformers>=4.45, peft>=0.13, bitsandbytes. Helpers like chat_generate apply chat template, generate with temp=0.7\u002Ftop_p=0.9. 
Clean up VRAM with gc.collect() + torch.cuda.empty_cache() between stages to fit in Colab.",[17,31,33],{"id":32},"sft-and-rm-build-imitation-and-reward-signals","SFT and RM Build Imitation and Reward Signals",[22,35,36,37,40,41,43],{},"For Supervised Fine-Tuning, load trl-lib\u002FCapybara (train",[26,38,39],{},"[:300]","), use SFTConfig(per_device_train_batch_size=2, gradient_accumulation_steps=4, learning_rate=2e-4, max_length=768). The trainer imitates high-quality chat responses; post-training inference on \"Explain bias-variance tradeoff in two sentences\" yields coherent output. Reward Modeling on trl-lib\u002Fultrafeedback_binarized (train",[26,42,39],{},") uses RewardConfig(batch_size=2, accum_steps=2, lr=1e-4, max_length=512), LoRA task_type=\"SEQ_CLS\". It trains to score chosen vs. rejected pairs, producing a preference-based reward without explicit RL.",[17,45,47],{"id":46},"dpo-skips-rm-for-direct-preference-alignment","DPO Skips RM for Direct Preference Alignment",[22,49,50,51,53],{},"DPOTrainer on the same ultrafeedback_binarized",[26,52,39],{}," simplifies via implicit rewards: DPOConfig(batch_size=1, accum_steps=4, lr=5e-6, beta=0.1, max_length=512, max_prompt_length=256). Beta controls KL divergence from the reference policy, preventing mode collapse. It optimizes the policy to prefer chosen over rejected responses directly, reducing steps vs. traditional RM+PPO.",[17,55,57],{"id":56},"grpo-uses-custom-rewards-to-sharpen-reasoning","GRPO Uses Custom Rewards to Sharpen Reasoning",[22,59,60,61,65],{},"GRPOTrainer generates num_generations=4 completions per prompt (max_prompt_length=128, max_completion_length=96, max_steps=15) and ranks them via reward_funcs. Custom dataset: 200 synthetic math problems (e.g., \"Solve 17 + 28 =\", gold=eval). Rewards: correctness_reward (1.0 if the last extracted number matches gold else 0), brevity_reward (max(0,1-len(c)\u002F200)",[62,63,64],"span",{},"*0.2). GRPOConfig(lr=1e-5, batch=2, accum=2). 
Inference on \"17+28?\", \"9","7?\", \"100-47?\" produces accurate, concise answers like final numbers, improving verifiable task performance over base.",{"title":67,"searchDepth":68,"depth":68,"links":69},"",2,[70,71,72,73],{"id":19,"depth":68,"text":20},{"id":32,"depth":68,"text":33},{"id":46,"depth":68,"text":47},{"id":56,"depth":68,"text":57},[75],"AI & LLMs",null,"md",false,{"content_references":80,"triage":98},[81,86,89,91,93],{"type":82,"title":83,"url":84,"context":85},"tool","TRL","https:\u002F\u002Fgithub.com\u002Fhuggingface\u002Ftrl","mentioned",{"type":87,"title":88,"context":85},"dataset","trl-lib\u002FCapybara",{"type":87,"title":90,"context":85},"trl-lib\u002Fultrafeedback_binarized",{"type":82,"title":92,"context":85},"Qwen\u002FQwen2.5-0.5B-Instruct",{"type":94,"title":95,"url":96,"context":97},"other","trl_llm_post_training_sft_dpo_grpo_marktechpost.py","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FLLM%20Projects\u002Ftrl_llm_post_training_sft_dpo_grpo_marktechpost.py","recommended",{"relevance":99,"novelty":100,"quality":100,"actionability":99,"composite":101,"reasoning":102},5,4,4.55,"Category: AI & LLMs. The article provides a detailed guide on using TRL and LoRA for LLM post-training, addressing practical applications for developers looking to implement AI features. 
It includes specific configurations and techniques that can be directly applied in production, making it highly actionable.",true,"\u002Fsummaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary","2026-05-01 20:52:08","2026-05-03 17:01:49",{"title":5,"description":67},{"loc":104},"79f82c07ea7441fe","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fa-coding-guide-on-llm-post-training-with-trl-from-supervised-fine-tuning-to-dpo-and-grpo-reasoning\u002F","summaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary",[115,116,117],"llm","python","machine-learning","Train Qwen2.5-0.5B via SFT, RM, DPO, GRPO using TRL+LoRA on Colab T4: configs include r=8 LoRA, 300-sample datasets, epochs=1, small batches\u002Faccum for memory efficiency, custom math rewards boost reasoning.",[],"py8Fe1-Noi99CHywKy61Q363dqRBmUxl6tZ9TDJOp3E",[122,125,128,130,133,136,138,140,142,144,146,148,151,153,155,157,159,161,163,165,167,169,172,175,177,179,182,184,186,189,191,193,195,197,199,201,203,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,
662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528
,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328
,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128
,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688],{"categories":123},[124],"Developer Productivity",{"categories":126},[127],"Business & SaaS",{"categories":129},[75],{"categories":131},[132],"AI Automation",{"categories":134},[135],"Product Strategy",{"categories":137},[75],{"categories":139},[124],{"categories":141},[127],{"categories":143},[],{"categories":145},[75],{"categories":147},[],{"categories":149},[150],"AI News & 
Trends",{"categories":152},[132],{"categories":154},[150],{"categories":156},[132],{"categories":158},[132],{"categories":160},[75],{"categories":162},[75],{"categories":164},[150],{"categories":166},[75],{"categories":168},[],{"categories":170},[171],"Design & Frontend",{"categories":173},[174],"Data Science & Visualization",{"categories":176},[150],{"categories":178},[],{"categories":180},[181],"Software Engineering",{"categories":183},[75],{"categories":185},[132],{"categories":187},[188],"Marketing & Growth",{"categories":190},[75],{"categories":192},[132],{"categories":194},[],{"categories":196},[],{"categories":198},[171],{"categories":200},[132],{"categories":202},[124],{"categories":204},[171],{"categories":206},[75],{"categories":208},[132],{"categories":210},[150],{"categories":212},[],{"categories":214},[],{"categories":216},[132],{"categories":218},[181],{"categories":220},[],{"categories":222},[127],{"categories":224},[],{"categories":226},[],{"categories":228},[132],{"categories":230},[132],{"categories":232},[75],{"categories":234},[],{"categories":236},[181],{"categories":238},[],{"categories":240},[],{"categories":242},[],{"categories":244},[75],{"categories":246},[188],{"categories":248},[171],{"categories":250},[171],{"categories":252},[75],{"categories":254},[132],{"categories":256},[75],{"categories":258},[75],{"categories":260},[132],{"categories":262},[132],{"categories":264},[174],{"categories":266},[150],{"categories":268},[132],{"categories":270},[188],{"categories":272},[132],{"categories":274},[135],{"categories":276},[],{"categories":278},[132],{"categories":280},[],{"categories":282},[132],{"categories":284},[181],{"categories":286},[171],{"categories":288},[75],{"categories":290},[],{"categories":292},[],{"categories":294},[132],{"categories":296},[],{"categories":298},[75],{"categories":300},[],{"categories":302},[124],{"categories":304},[181],{"categories":306},[127],{"categories":308},[150],{"categories":310},[75],{"categories":312}
,[],{"categories":314},[75],{"categories":316},[],{"categories":318},[181],{"categories":320},[174],{"categories":322},[],{"categories":324},[75],{"categories":326},[171],{"categories":328},[],{"categories":330},[171],{"categories":332},[132],{"categories":334},[],{"categories":336},[132],{"categories":338},[150],{"categories":340},[75],{"categories":342},[],{"categories":344},[132],{"categories":346},[75],{"categories":348},[135],{"categories":350},[],{"categories":352},[75],{"categories":354},[132],{"categories":356},[132],{"categories":358},[],{"categories":360},[174],{"categories":362},[75],{"categories":364},[],{"categories":366},[124],{"categories":368},[127],{"categories":370},[75],{"categories":372},[132],{"categories":374},[181],{"categories":376},[75],{"categories":378},[],{"categories":380},[],{"categories":382},[75],{"categories":384},[],{"categories":386},[171],{"categories":388},[],{"categories":390},[75],{"categories":392},[],{"categories":394},[132],{"categories":396},[75],{"categories":398},[171],{"categories":400},[],{"categories":402},[75],{"categories":404},[75],{"categories":406},[127],{"categories":408},[132],{"categories":410},[75],{"categories":412},[171],{"categories":414},[132],{"categories":416},[],{"categories":418},[],{"categories":420},[150],{"categories":422},[],{"categories":424},[75],{"categories":426},[127,188],{"categories":428},[],{"categories":430},[75],{"categories":432},[],{"categories":434},[],{"categories":436},[75],{"categories":438},[],{"categories":440},[75],{"categories":442},[443],"DevOps & 
Cloud",{"categories":445},[],{"categories":447},[150],{"categories":449},[171],{"categories":451},[],{"categories":453},[150],{"categories":455},[150],{"categories":457},[75],{"categories":459},[188],{"categories":461},[],{"categories":463},[127],{"categories":465},[],{"categories":467},[75,443],{"categories":469},[75],{"categories":471},[75],{"categories":473},[132],{"categories":475},[75,181],{"categories":477},[174],{"categories":479},[75],{"categories":481},[188],{"categories":483},[132],{"categories":485},[132],{"categories":487},[],{"categories":489},[132],{"categories":491},[75,127],{"categories":493},[],{"categories":495},[171],{"categories":497},[171],{"categories":499},[],{"categories":501},[],{"categories":503},[150],{"categories":505},[],{"categories":507},[124],{"categories":509},[181],{"categories":511},[75],{"categories":513},[171],{"categories":515},[132],{"categories":517},[181],{"categories":519},[150],{"categories":521},[171],{"categories":523},[],{"categories":525},[75],{"categories":527},[75],{"categories":529},[75],{"categories":531},[150],{"categories":533},[124],{"categories":535},[75],{"categories":537},[132],{"categories":539},[443],{"categories":541},[171],{"categories":543},[132],{"categories":545},[],{"categories":547},[],{"categories":549},[171],{"categories":551},[150],{"categories":553},[174],{"categories":555},[],{"categories":557},[75],{"categories":559},[75],{"categories":561},[127],{"categories":563},[75],{"categories":565},[75],{"categories":567},[150],{"categories":569},[],{"categories":571},[132],{"categories":573},[181],{"categories":575},[],{"categories":577},[75],{"categories":579},[75],{"categories":581},[132],{"categories":583},[],{"categories":585},[],{"categories":587},[75],{"categories":589},[],{"categories":591},[127],{"categories":593},[132],{"categories":595},[],{"categories":597},[124],{"categories":599},[75],{"categories":601},[127],{"categories":603},[150],{"categories":605},[],{"categories":607},[],{"categories":
609},[],{"categories":611},[150],{"categories":613},[150],{"categories":615},[],{"categories":617},[],{"categories":619},[127],{"categories":621},[],{"categories":623},[],{"categories":625},[124],{"categories":627},[],{"categories":629},[188],{"categories":631},[132],{"categories":633},[127],{"categories":635},[132],{"categories":637},[],{"categories":639},[135],{"categories":641},[171],{"categories":643},[181],{"categories":645},[75],{"categories":647},[132],{"categories":649},[127],{"categories":651},[75],{"categories":653},[],{"categories":655},[],{"categories":657},[181],{"categories":659},[174],{"categories":661},[135],{"categories":663},[132],{"categories":665},[75],{"categories":667},[],{"categories":669},[443],{"categories":671},[],{"categories":673},[132],{"categories":675},[],{"categories":677},[],{"categories":679},[75],{"categories":681},[171],{"categories":683},[188],{"categories":685},[132],{"categories":687},[],{"categories":689},[124],{"categories":691},[],{"categories":693},[150],{"categories":695},[75,443],{"categories":697},[150],{"categories":699},[75],{"categories":701},[127],{"categories":703},[75],{"categories":705},[],{"categories":707},[127],{"categories":709},[],{"categories":711},[181],{"categories":713},[171],{"categories":715},[150],{"categories":717},[174],{"categories":719},[124],{"categories":721},[75],{"categories":723},[181],{"categories":725},[],{"categories":727},[],{"categories":729},[135],{"categories":731},[],{"categories":733},[75],{"categories":735},[],{"categories":737},[171],{"categories":739},[171],{"categories":741},[171],{"categories":743},[],{"categories":745},[],{"categories":747},[150],{"categories":749},[132],{"categories":751},[75],{"categories":753},[75],{"categories":755},[75],{"categories":757},[127],{"categories":759},[75],{"categories":761},[],{"categories":763},[181],{"categories":765},[181],{"categories":767},[127],{"categories":769},[],{"categories":771},[75],{"categories":773},[75],{"categories":775},[127],
{"categories":777},[150],{"categories":779},[188],{"categories":781},[132],{"categories":783},[],{"categories":785},[171],{"categories":787},[],{"categories":789},[75],{"categories":791},[],{"categories":793},[127],{"categories":795},[132],{"categories":797},[],{"categories":799},[443],{"categories":801},[174],{"categories":803},[181],{"categories":805},[188],{"categories":807},[181],{"categories":809},[132],{"categories":811},[],{"categories":813},[],{"categories":815},[132],{"categories":817},[124],{"categories":819},[132],{"categories":821},[135],{"categories":823},[127],{"categories":825},[],{"categories":827},[75],{"categories":829},[135],{"categories":831},[75],{"categories":833},[75],{"categories":835},[188],{"categories":837},[171],{"categories":839},[132],{"categories":841},[],{"categories":843},[],{"categories":845},[443],{"categories":847},[181],{"categories":849},[],{"categories":851},[132],{"categories":853},[75],{"categories":855},[171,75],{"categories":857},[124],{"categories":859},[],{"categories":861},[75],{"categories":863},[124],{"categories":865},[171],{"categories":867},[132],{"categories":869},[181],{"categories":871},[],{"categories":873},[75],{"categories":875},[],{"categories":877},[124],{"categories":879},[],{"categories":881},[132],{"categories":883},[135],{"categories":885},[75],{"categories":887},[75],{"categories":889},[171],{"categories":891},[132],{"categories":893},[443],{"categories":895},[171],{"categories":897},[132],{"categories":899},[75],{"categories":901},[75],{"categories":903},[75],{"categories":905},[150],{"categories":907},[],{"categories":909},[135],{"categories":911},[132],{"categories":913},[171],{"categories":915},[132],{"categories":917},[181],{"categories":919},[171],{"categories":921},[132],{"categories":923},[150],{"categories":925},[],{"categories":927},[75],{"categories":929},[171],{"categories":931},[75],{"categories":933},[124],{"categories":935},[150],{"categories":937},[75],{"categories":939},[188],{"categori
es":941},[75],{"categories":943},[75],{"categories":945},[132],{"categories":947},[132],{"categories":949},[75],{"categories":951},[132],{"categories":953},[171],{"categories":955},[75],{"categories":957},[],{"categories":959},[],{"categories":961},[181],{"categories":963},[],{"categories":965},[124],{"categories":967},[443],{"categories":969},[],{"categories":971},[124],{"categories":973},[127],{"categories":975},[188],{"categories":977},[],{"categories":979},[127],{"categories":981},[],{"categories":983},[],{"categories":985},[],{"categories":987},[],{"categories":989},[],{"categories":991},[75],{"categories":993},[132],{"categories":995},[443],{"categories":997},[124],{"categories":999},[75],{"categories":1001},[181],{"categories":1003},[135],{"categories":1005},[75],{"categories":1007},[188],{"categories":1009},[75],{"categories":1011},[75],{"categories":1013},[75],{"categories":1015},[75,124],{"categories":1017},[181],{"categories":1019},[181],{"categories":1021},[171],{"categories":1023},[75],{"categories":1025},[],{"categories":1027},[],{"categories":1029},[],{"categories":1031},[181],{"categories":1033},[174],{"categories":1035},[150],{"categories":1037},[171],{"categories":1039},[],{"categories":1041},[75],{"categories":1043},[75],{"categories":1045},[],{"categories":1047},[],{"categories":1049},[132],{"categories":1051},[75],{"categories":1053},[127],{"categories":1055},[],{"categories":1057},[124],{"categories":1059},[75],{"categories":1061},[124],{"categories":1063},[75],{"categories":1065},[181],{"categories":1067},[188],{"categories":1069},[75,171],{"categories":1071},[150],{"categories":1073},[171],{"categories":1075},[],{"categories":1077},[443],{"categories":1079},[171],{"categories":1081},[132],{"categories":1083},[],{"categories":1085},[],{"categories":1087},[],{"categories":1089},[],{"categories":1091},[181],{"categories":1093},[132],{"categories":1095},[132],{"categories":1097},[75],{"categories":1099},[75],{"categories":1101},[],{"categories":1
103},[171],{"categories":1105},[],{"categories":1107},[],{"categories":1109},[132],{"categories":1111},[],{"categories":1113},[],{"categories":1115},[188],{"categories":1117},[188],{"categories":1119},[132],{"categories":1121},[],{"categories":1123},[75],{"categories":1125},[75],{"categories":1127},[181],{"categories":1129},[171],{"categories":1131},[171],{"categories":1133},[132],{"categories":1135},[124],{"categories":1137},[75],{"categories":1139},[171],{"categories":1141},[171],{"categories":1143},[132],{"categories":1145},[132],{"categories":1147},[75],{"categories":1149},[],{"categories":1151},[],{"categories":1153},[75],{"categories":1155},[132],{"categories":1157},[150],{"categories":1159},[181],{"categories":1161},[124],{"categories":1163},[75],{"categories":1165},[],{"categories":1167},[132],{"categories":1169},[132],{"categories":1171},[],{"categories":1173},[124],{"categories":1175},[75],{"categories":1177},[124],{"categories":1179},[124],{"categories":1181},[],{"categories":1183},[],{"categories":1185},[132],{"categories":1187},[132],{"categories":1189},[75],{"categories":1191},[75],{"categories":1193},[150],{"categories":1195},[174],{"categories":1197},[135],{"categories":1199},[150],{"categories":1201},[171],{"categories":1203},[],{"categories":1205},[150],{"categories":1207},[],{"categories":1209},[],{"categories":1211},[],{"categories":1213},[],{"categories":1215},[181],{"categories":1217},[174],{"categories":1219},[],{"categories":1221},[75],{"categories":1223},[75],{"categories":1225},[174],{"categories":1227},[181],{"categories":1229},[],{"categories":1231},[],{"categories":1233},[132],{"categories":1235},[150],{"categories":1237},[150],{"categories":1239},[132],{"categories":1241},[124],{"categories":1243},[75,443],{"categories":1245},[],{"categories":1247},[171],{"categories":1249},[124],{"categories":1251},[132],{"categories":1253},[171],{"categories":1255},[],{"categories":1257},[132],{"categories":1259},[132],{"categories":1261},[75],{"categ
ories":1263},[188],{"categories":1265},[181],{"categories":1267},[171],{"categories":1269},[],{"categories":1271},[132],{"categories":1273},[75],{"categories":1275},[132],{"categories":1277},[132],{"categories":1279},[132],{"categories":1281},[188],{"categories":1283},[132],{"categories":1285},[75],{"categories":1287},[],{"categories":1289},[188],{"categories":1291},[150],{"categories":1293},[132],{"categories":1295},[],{"categories":1297},[],{"categories":1299},[75],{"categories":1301},[132],{"categories":1303},[150],{"categories":1305},[132],{"categories":1307},[],{"categories":1309},[],{"categories":1311},[],{"categories":1313},[132],{"categories":1315},[],{"categories":1317},[],{"categories":1319},[174],{"categories":1321},[75],{"categories":1323},[174],{"categories":1325},[150],{"categories":1327},[75],{"categories":1329},[75],{"categories":1331},[132],{"categories":1333},[75],{"categories":1335},[],{"categories":1337},[],{"categories":1339},[443],{"categories":1341},[],{"categories":1343},[],{"categories":1345},[124],{"categories":1347},[],{"categories":1349},[],{"categories":1351},[],{"categories":1353},[],{"categories":1355},[181],{"categories":1357},[150],{"categories":1359},[188],{"categories":1361},[127],{"categories":1363},[75],{"categories":1365},[75],{"categories":1367},[127],{"categories":1369},[],{"categories":1371},[171],{"categories":1373},[132],{"categories":1375},[127],{"categories":1377},[75],{"categories":1379},[75],{"categories":1381},[124],{"categories":1383},[],{"categories":1385},[124],{"categories":1387},[75],{"categories":1389},[188],{"categories":1391},[132],{"categories":1393},[150],{"categories":1395},[127],{"categories":1397},[75],{"categories":1399},[132],{"categories":1401},[],{"categories":1403},[75],{"categories":1405},[124],{"categories":1407},[75],{"categories":1409},[],{"categories":1411},[150],{"categories":1413},[75],{"categories":1415},[],{"categories":1417},[127],{"categories":1419},[75],{"categories":1421},[],{"categories"
:1423},[],{"categories":1425},[],{"categories":1427},[75],{"categories":1429},[],{"categories":1431},[443],{"categories":1433},[75],{"categories":1435},[],{"categories":1437},[75],{"categories":1439},[75],{"categories":1441},[75],{"categories":1443},[75,443],{"categories":1445},[75],{"categories":1447},[75],{"categories":1449},[171],{"categories":1451},[132],{"categories":1453},[],{"categories":1455},[132],{"categories":1457},[75],{"categories":1459},[75],{"categories":1461},[75],{"categories":1463},[124],{"categories":1465},[124],{"categories":1467},[181],{"categories":1469},[171],{"categories":1471},[132],{"categories":1473},[],{"categories":1475},[75],{"categories":1477},[150],{"categories":1479},[75],{"categories":1481},[127],{"categories":1483},[],{"categories":1485},[443],{"categories":1487},[171],{"categories":1489},[171],{"categories":1491},[132],{"categories":1493},[150],{"categories":1495},[132],{"categories":1497},[75],{"categories":1499},[],{"categories":1501},[75],{"categories":1503},[],{"categories":1505},[],{"categories":1507},[75],{"categories":1509},[75],{"categories":1511},[75],{"categories":1513},[132],{"categories":1515},[75],{"categories":1517},[],{"categories":1519},[174],{"categories":1521},[132],{"categories":1523},[],{"categories":1525},[75],{"categories":1527},[150],{"categories":1529},[],{"categories":1531},[171],{"categories":1533},[443],{"categories":1535},[150],{"categories":1537},[181],{"categories":1539},[181],{"categories":1541},[150],{"categories":1543},[150],{"categories":1545},[443],{"categories":1547},[],{"categories":1549},[150],{"categories":1551},[75],{"categories":1553},[124],{"categories":1555},[150],{"categories":1557},[],{"categories":1559},[174],{"categories":1561},[150],{"categories":1563},[181],{"categories":1565},[150],{"categories":1567},[443],{"categories":1569},[75],{"categories":1571},[75],{"categories":1573},[],{"categories":1575},[127],{"categories":1577},[],{"categories":1579},[],{"categories":1581},[75],{"categ
ories":1583},[75],{"categories":1585},[75],{"categories":1587},[75],{"categories":1589},[],{"categories":1591},[174],{"categories":1593},[124],{"categories":1595},[],{"categories":1597},[75],{"categories":1599},[75],{"categories":1601},[443],{"categories":1603},[443],{"categories":1605},[],{"categories":1607},[132],{"categories":1609},[150],{"categories":1611},[150],{"categories":1613},[75],{"categories":1615},[132],{"categories":1617},[],{"categories":1619},[171],{"categories":1621},[75],{"categories":1623},[75],{"categories":1625},[],{"categories":1627},[],{"categories":1629},[443],{"categories":1631},[75],{"categories":1633},[181],{"categories":1635},[127],{"categories":1637},[75],{"categories":1639},[],{"categories":1641},[132],{"categories":1643},[124],{"categories":1645},[124],{"categories":1647},[],{"categories":1649},[75],{"categories":1651},[171],{"categories":1653},[132],{"categories":1655},[],{"categories":1657},[75],{"categories":1659},[75],{"categories":1661},[132],{"categories":1663},[],{"categories":1665},[132],{"categories":1667},[181],{"categories":1669},[],{"categories":1671},[75],{"categories":1673},[],{"categories":1675},[75],{"categories":1677},[],{"categories":1679},[75],{"categories":1681},[75],{"categories":1683},[],{"categories":1685},[75],{"categories":1687},[150],{"categories":1689},[75],{"categories":1691},[75],{"categories":1693},[124],{"categories":1695},[75],{"categories":1697},[150],{"categories":1699},[132],{"categories":1701},[],{"categories":1703},[75],{"categories":1705},[188],{"categories":1707},[],{"categories":1709},[],{"categories":1711},[],{"categories":1713},[124],{"categories":1715},[150],{"categories":1717},[132],{"categories":1719},[75],{"categories":1721},[171],{"categories":1723},[132],{"categories":1725},[],{"categories":1727},[132],{"categories":1729},[],{"categories":1731},[75],{"categories":1733},[132],{"categories":1735},[75],{"categories":1737},[],{"categories":1739},[75],{"categories":1741},[75],{"categories":174
3},[150],{"categories":1745},[171],{"categories":1747},[132],{"categories":1749},[171],{"categories":1751},[127],{"categories":1753},[],{"categories":1755},[],{"categories":1757},[75],{"categories":1759},[124],{"categories":1761},[150],{"categories":1763},[],{"categories":1765},[],{"categories":1767},[181],{"categories":1769},[171],{"categories":1771},[],{"categories":1773},[75],{"categories":1775},[],{"categories":1777},[188],{"categories":1779},[75],{"categories":1781},[443],{"categories":1783},[181],{"categories":1785},[],{"categories":1787},[132],{"categories":1789},[75],{"categories":1791},[132],{"categories":1793},[132],{"categories":1795},[75],{"categories":1797},[],{"categories":1799},[124],{"categories":1801},[75],{"categories":1803},[127],{"categories":1805},[181],{"categories":1807},[171],{"categories":1809},[],{"categories":1811},[],{"categories":1813},[],{"categories":1815},[132],{"categories":1817},[171],{"categories":1819},[150],{"categories":1821},[75],{"categories":1823},[150],{"categories":1825},[171],{"categories":1827},[],{"categories":1829},[171],{"categories":1831},[150],{"categories":1833},[127],{"categories":1835},[75],{"categories":1837},[150],{"categories":1839},[188],{"categories":1841},[],{"categories":1843},[],{"categories":1845},[174],{"categories":1847},[75,181],{"categories":1849},[150],{"categories":1851},[75],{"categories":1853},[132],{"categories":1855},[132],{"categories":1857},[75],{"categories":1859},[],{"categories":1861},[181],{"categories":1863},[75],{"categories":1865},[174],{"categories":1867},[132],{"categories":1869},[188],{"categories":1871},[443],{"categories":1873},[],{"categories":1875},[124],{"categories":1877},[132],{"categories":1879},[132],{"categories":1881},[181],{"categories":1883},[75],{"categories":1885},[75],{"categories":1887},[],{"categories":1889},[],{"categories":1891},[],{"categories":1893},[443],{"categories":1895},[150],{"categories":1897},[75],{"categories":1899},[75],{"categories":1901},[75],{"categ
ories":1903},[],{"categories":1905},[174],{"categories":1907},[127],{"categories":1909},[],{"categories":1911},[132],{"categories":1913},[443],{"categories":1915},[],{"categories":1917},[171],{"categories":1919},[171],{"categories":1921},[],{"categories":1923},[181],{"categories":1925},[171],{"categories":1927},[75],{"categories":1929},[],{"categories":1931},[150],{"categories":1933},[75],{"categories":1935},[171],{"categories":1937},[132],{"categories":1939},[150],{"categories":1941},[],{"categories":1943},[132],{"categories":1945},[171],{"categories":1947},[75],{"categories":1949},[],{"categories":1951},[75],{"categories":1953},[75],{"categories":1955},[443],{"categories":1957},[150],{"categories":1959},[174],{"categories":1961},[174],{"categories":1963},[],{"categories":1965},[],{"categories":1967},[],{"categories":1969},[132],{"categories":1971},[181],{"categories":1973},[181],{"categories":1975},[],{"categories":1977},[],{"categories":1979},[75],{"categories":1981},[],{"categories":1983},[132],{"categories":1985},[75],{"categories":1987},[],{"categories":1989},[75],{"categories":1991},[127],{"categories":1993},[75],{"categories":1995},[188],{"categories":1997},[132],{"categories":1999},[75],{"categories":2001},[181],{"categories":2003},[150],{"categories":2005},[132],{"categories":2007},[],{"categories":2009},[150],{"categories":2011},[132],{"categories":2013},[132],{"categories":2015},[],{"categories":2017},[127],{"categories":2019},[132],{"categories":2021},[],{"categories":2023},[75],{"categories":2025},[124],{"categories":2027},[150],{"categories":2029},[443],{"categories":2031},[132],{"categories":2033},[132],{"categories":2035},[124],{"categories":2037},[75],{"categories":2039},[],{"categories":2041},[],{"categories":2043},[171],{"categories":2045},[75,127],{"categories":2047},[],{"categories":2049},[124],{"categories":2051},[174],{"categories":2053},[75],{"categories":2055},[181],{"categories":2057},[75],{"categories":2059},[132],{"categories":2061},[75]
,{"categories":2063},[75],{"categories":2065},[150],{"categories":2067},[132],{"categories":2069},[],{"categories":2071},[],{"categories":2073},[132],{"categories":2075},[75],{"categories":2077},[443],{"categories":2079},[],{"categories":2081},[75],{"categories":2083},[132],{"categories":2085},[],{"categories":2087},[75],{"categories":2089},[188],{"categories":2091},[174],{"categories":2093},[132],{"categories":2095},[75],{"categories":2097},[443],{"categories":2099},[],{"categories":2101},[75],{"categories":2103},[188],{"categories":2105},[171],{"categories":2107},[75],{"categories":2109},[],{"categories":2111},[188],{"categories":2113},[150],{"categories":2115},[75],{"categories":2117},[75],{"categories":2119},[124],{"categories":2121},[],{"categories":2123},[],{"categories":2125},[171],{"categories":2127},[75],{"categories":2129},[174],{"categories":2131},[188],{"categories":2133},[188],{"categories":2135},[150],{"categories":2137},[],{"categories":2139},[],{"categories":2141},[75],{"categories":2143},[],{"categories":2145},[75,181],{"categories":2147},[150],{"categories":2149},[132],{"categories":2151},[181],{"categories":2153},[75],{"categories":2155},[124],{"categories":2157},[],{"categories":2159},[],{"categories":2161},[124],{"categories":2163},[188],{"categories":2165},[75],{"categories":2167},[],{"categories":2169},[171,75],{"categories":2171},[443],{"categories":2173},[124],{"categories":2175},[],{"categories":2177},[127],{"categories":2179},[127],{"categories":2181},[75],{"categories":2183},[181],{"categories":2185},[132],{"categories":2187},[150],{"categories":2189},[188],{"categories":2191},[171],{"categories":2193},[75],{"categories":2195},[75],{"categories":2197},[75],{"categories":2199},[124],{"categories":2201},[75],{"categories":2203},[132],{"categories":2205},[150],{"categories":2207},[],{"categories":2209},[],{"categories":2211},[174],{"categories":2213},[181],{"categories":2215},[75],{"categories":2217},[171],{"categories":2219},[174],{"categor
ies":2221},[75],{"categories":2223},[75],{"categories":2225},[132],{"categories":2227},[132],{"categories":2229},[75,127],{"categories":2231},[],{"categories":2233},[171],{"categories":2235},[],{"categories":2237},[75],{"categories":2239},[150],{"categories":2241},[124],{"categories":2243},[124],{"categories":2245},[132],{"categories":2247},[75],{"categories":2249},[127],{"categories":2251},[181],{"categories":2253},[188],{"categories":2255},[],{"categories":2257},[150],{"categories":2259},[75],{"categories":2261},[75],{"categories":2263},[150],{"categories":2265},[181],{"categories":2267},[75],{"categories":2269},[132],{"categories":2271},[150],{"categories":2273},[75],{"categories":2275},[171],{"categories":2277},[75],{"categories":2279},[75],{"categories":2281},[443],{"categories":2283},[135],{"categories":2285},[132],{"categories":2287},[75],{"categories":2289},[150],{"categories":2291},[132],{"categories":2293},[188],{"categories":2295},[75],{"categories":2297},[],{"categories":2299},[75],{"categories":2301},[],{"categories":2303},[],{"categories":2305},[],{"categories":2307},[127],{"categories":2309},[75],{"categories":2311},[132],{"categories":2313},[150],{"categories":2315},[150],{"categories":2317},[150],{"categories":2319},[150],{"categories":2321},[],{"categories":2323},[124],{"categories":2325},[132],{"categories":2327},[150],{"categories":2329},[124],{"categories":2331},[132],{"categories":2333},[75],{"categories":2335},[75,132],{"categories":2337},[132],{"categories":2339},[443],{"categories":2341},[150],{"categories":2343},[150],{"categories":2345},[132],{"categories":2347},[75],{"categories":2349},[],{"categories":2351},[150],{"categories":2353},[188],{"categories":2355},[124],{"categories":2357},[75],{"categories":2359},[75],{"categories":2361},[],{"categories":2363},[181],{"categories":2365},[],{"categories":2367},[124],{"categories":2369},[132],{"categories":2371},[150],{"categories":2373},[75],{"categories":2375},[150],{"categories":2377},[124],{
"categories":2379},[150],{"categories":2381},[150],{"categories":2383},[],{"categories":2385},[127],{"categories":2387},[132],{"categories":2389},[150],{"categories":2391},[150],{"categories":2393},[150],{"categories":2395},[150],{"categories":2397},[150],{"categories":2399},[150],{"categories":2401},[150],{"categories":2403},[150],{"categories":2405},[150],{"categories":2407},[150],{"categories":2409},[174],{"categories":2411},[124],{"categories":2413},[75],{"categories":2415},[75],{"categories":2417},[],{"categories":2419},[75,124],{"categories":2421},[],{"categories":2423},[132],{"categories":2425},[150],{"categories":2427},[132],{"categories":2429},[75],{"categories":2431},[75],{"categories":2433},[75],{"categories":2435},[75],{"categories":2437},[75],{"categories":2439},[132],{"categories":2441},[127],{"categories":2443},[171],{"categories":2445},[150],{"categories":2447},[75],{"categories":2449},[],{"categories":2451},[],{"categories":2453},[132],{"categories":2455},[171],{"categories":2457},[75],{"categories":2459},[],{"categories":2461},[],{"categories":2463},[188],{"categories":2465},[75],{"categories":2467},[],{"categories":2469},[],{"categories":2471},[124],{"categories":2473},[127],{"categories":2475},[75],{"categories":2477},[127],{"categories":2479},[171],{"categories":2481},[],{"categories":2483},[150],{"categories":2485},[],{"categories":2487},[171],{"categories":2489},[75],{"categories":2491},[188],{"categories":2493},[],{"categories":2495},[188],{"categories":2497},[],{"categories":2499},[],{"categories":2501},[132],{"categories":2503},[],{"categories":2505},[127],{"categories":2507},[124],{"categories":2509},[171],{"categories":2511},[181],{"categories":2513},[],{"categories":2515},[],{"categories":2517},[75],{"categories":2519},[124],{"categories":2521},[188],{"categories":2523},[],{"categories":2525},[132],{"categories":2527},[132],{"categories":2529},[150],{"categories":2531},[75],{"categories":2533},[132],{"categories":2535},[75],{"categories"
:2537},[132],{"categories":2539},[75],{"categories":2541},[135],{"categories":2543},[150],{"categories":2545},[],{"categories":2547},[188],{"categories":2549},[181],{"categories":2551},[132],{"categories":2553},[],{"categories":2555},[75],{"categories":2557},[132],{"categories":2559},[127],{"categories":2561},[124],{"categories":2563},[75],{"categories":2565},[171],{"categories":2567},[181],{"categories":2569},[181],{"categories":2571},[75],{"categories":2573},[174],{"categories":2575},[75],{"categories":2577},[132],{"categories":2579},[127],{"categories":2581},[132],{"categories":2583},[75],{"categories":2585},[75],{"categories":2587},[132],{"categories":2589},[150],{"categories":2591},[],{"categories":2593},[124],{"categories":2595},[75],{"categories":2597},[132],{"categories":2599},[75],{"categories":2601},[75],{"categories":2603},[],{"categories":2605},[171],{"categories":2607},[127],{"categories":2609},[150],{"categories":2611},[75],{"categories":2613},[75],{"categories":2615},[171],{"categories":2617},[188],{"categories":2619},[174],{"categories":2621},[75],{"categories":2623},[150],{"categories":2625},[75],{"categories":2627},[132],{"categories":2629},[443],{"categories":2631},[75],{"categories":2633},[132],{"categories":2635},[174],{"categories":2637},[],{"categories":2639},[132],{"categories":2641},[181],{"categories":2643},[171],{"categories":2645},[75],{"categories":2647},[124],{"categories":2649},[127],{"categories":2651},[181],{"categories":2653},[],{"categories":2655},[132],{"categories":2657},[75],{"categories":2659},[],{"categories":2661},[150],{"categories":2663},[],{"categories":2665},[150],{"categories":2667},[75],{"categories":2669},[132],{"categories":2671},[132],{"categories":2673},[132],{"categories":2675},[],{"categories":2677},[],{"categories":2679},[75],{"categories":2681},[75],{"categories":2683},[],{"categories":2685},[171],{"categories":2687},[132],{"categories":2689},[188],{"categories":2691},[124],{"categories":2693},[],{"categories":2
695},[],{"categories":2697},[150],{"categories":2699},[181],{"categories":2701},[75],{"categories":2703},[75],{"categories":2705},[75],{"categories":2707},[181],{"categories":2709},[150],{"categories":2711},[171],{"categories":2713},[75],{"categories":2715},[75],{"categories":2717},[75],{"categories":2719},[150],{"categories":2721},[75],{"categories":2723},[150],{"categories":2725},[132],{"categories":2727},[132],{"categories":2729},[181],{"categories":2731},[132],{"categories":2733},[75],{"categories":2735},[181],{"categories":2737},[171],{"categories":2739},[],{"categories":2741},[132],{"categories":2743},[],{"categories":2745},[],{"categories":2747},[127],{"categories":2749},[75],{"categories":2751},[132],{"categories":2753},[124],{"categories":2755},[132],{"categories":2757},[188],{"categories":2759},[],{"categories":2761},[132],{"categories":2763},[],{"categories":2765},[124],{"categories":2767},[132],{"categories":2769},[],{"categories":2771},[132],{"categories":2773},[75],{"categories":2775},[150],{"categories":2777},[75],{"categories":2779},[132],{"categories":2781},[150],{"categories":2783},[132],{"categories":2785},[181],{"categories":2787},[171],{"categories":2789},[124],{"categories":2791},[],{"categories":2793},[132],{"categories":2795},[171],{"categories":2797},[150],{"categories":2799},[75],{"categories":2801},[171],{"categories":2803},[124],{"categories":2805},[],{"categories":2807},[132],{"categories":2809},[132],{"categories":2811},[75],{"categories":2813},[],{"categories":2815},[132],{"categories":2817},[135],{"categories":2819},[150],{"categories":2821},[132],{"categories":2823},[127],{"categories":2825},[],{"categories":2827},[75],{"categories":2829},[135],{"categories":2831},[75],{"categories":2833},[132],{"categories":2835},[150],{"categories":2837},[124],{"categories":2839},[443],{"categories":2841},[75],{"categories":2843},[75],{"categories":2845},[75],{"categories":2847},[150],{"categories":2849},[127],{"categories":2851},[75],{"categories"
:2853},[171],{"categories":2855},[150],{"categories":2857},[443],{"categories":2859},[75],{"categories":2861},[],{"categories":2863},[],{"categories":2865},[443],{"categories":2867},[174],{"categories":2869},[132],{"categories":2871},[132],{"categories":2873},[150],{"categories":2875},[75],{"categories":2877},[124],{"categories":2879},[171],{"categories":2881},[132],{"categories":2883},[75],{"categories":2885},[188],{"categories":2887},[75],{"categories":2889},[132],{"categories":2891},[],{"categories":2893},[75],{"categories":2895},[75],{"categories":2897},[150],{"categories":2899},[124],{"categories":2901},[],{"categories":2903},[75],{"categories":2905},[75],{"categories":2907},[181],{"categories":2909},[171],{"categories":2911},[75,132],{"categories":2913},[188,127],{"categories":2915},[75],{"categories":2917},[],{"categories":2919},[132],{"categories":2921},[],{"categories":2923},[181],{"categories":2925},[75],{"categories":2927},[150],{"categories":2929},[],{"categories":2931},[132],{"categories":2933},[],{"categories":2935},[132],{"categories":2937},[124],{"categories":2939},[132],{"categories":2941},[75],{"categories":2943},[443],{"categories":2945},[188],{"categories":2947},[127],{"categories":2949},[127],{"categories":2951},[124],{"categories":2953},[124],{"categories":2955},[75],{"categories":2957},[132],{"categories":2959},[75],{"categories":2961},[75],{"categories":2963},[124],{"categories":2965},[75],{"categories":2967},[188],{"categories":2969},[150],{"categories":2971},[75],{"categories":2973},[132],{"categories":2975},[75],{"categories":2977},[],{"categories":2979},[181],{"categories":2981},[],{"categories":2983},[132],{"categories":2985},[124],{"categories":2987},[],{"categories":2989},[443],{"categories":2991},[75],{"categories":2993},[],{"categories":2995},[150],{"categories":2997},[132],{"categories":2999},[181],{"categories":3001},[75],{"categories":3003},[132],{"categories":3005},[181],{"categories":3007},[132],{"categories":3009},[150],{"categ
ories":3011},[124],{"categories":3013},[150],{"categories":3015},[181],{"categories":3017},[75],{"categories":3019},[171],{"categories":3021},[75],{"categories":3023},[75],{"categories":3025},[75],{"categories":3027},[75],{"categories":3029},[132],{"categories":3031},[75],{"categories":3033},[132],{"categories":3035},[75],{"categories":3037},[124],{"categories":3039},[75],{"categories":3041},[132],{"categories":3043},[171],{"categories":3045},[124],{"categories":3047},[132],{"categories":3049},[171],{"categories":3051},[],{"categories":3053},[75],{"categories":3055},[75],{"categories":3057},[181],{"categories":3059},[],{"categories":3061},[132],{"categories":3063},[188],{"categories":3065},[75],{"categories":3067},[150],{"categories":3069},[188],{"categories":3071},[132],{"categories":3073},[127],{"categories":3075},[127],{"categories":3077},[75],{"categories":3079},[124],{"categories":3081},[],{"categories":3083},[75],{"categories":3085},[],{"categories":3087},[124],{"categories":3089},[75],{"categories":3091},[132],{"categories":3093},[132],{"categories":3095},[],{"categories":3097},[181],{"categories":3099},[181],{"categories":3101},[188],{"categories":3103},[171],{"categories":3105},[],{"categories":3107},[75],{"categories":3109},[124],{"categories":3111},[75],{"categories":3113},[181],{"categories":3115},[124],{"categories":3117},[150],{"categories":3119},[150],{"categories":3121},[],{"categories":3123},[150],{"categories":3125},[132],{"categories":3127},[171],{"categories":3129},[174],{"categories":3131},[75],{"categories":3133},[],{"categories":3135},[150],{"categories":3137},[181],{"categories":3139},[127],{"categories":3141},[75],{"categories":3143},[124],{"categories":3145},[443],{"categories":3147},[124],{"categories":3149},[],{"categories":3151},[],{"categories":3153},[150],{"categories":3155},[],{"categories":3157},[132],{"categories":3159},[132],{"categories":3161},[132],{"categories":3163},[],{"categories":3165},[75],{"categories":3167},[],{"categorie
s":3169},[150],{"categories":3171},[124],{"categories":3173},[171],{"categories":3175},[75],{"categories":3177},[150],{"categories":3179},[150],{"categories":3181},[],{"categories":3183},[150],{"categories":3185},[124],{"categories":3187},[75],{"categories":3189},[],{"categories":3191},[132],{"categories":3193},[132],{"categories":3195},[124],{"categories":3197},[],{"categories":3199},[],{"categories":3201},[],{"categories":3203},[171],{"categories":3205},[132],{"categories":3207},[75],{"categories":3209},[],{"categories":3211},[],{"categories":3213},[],{"categories":3215},[171],{"categories":3217},[],{"categories":3219},[124],{"categories":3221},[],{"categories":3223},[],{"categories":3225},[171],{"categories":3227},[75],{"categories":3229},[150],{"categories":3231},[],{"categories":3233},[188],{"categories":3235},[150],{"categories":3237},[188],{"categories":3239},[75],{"categories":3241},[],{"categories":3243},[],{"categories":3245},[132],{"categories":3247},[],{"categories":3249},[],{"categories":3251},[132],{"categories":3253},[75],{"categories":3255},[],{"categories":3257},[132],{"categories":3259},[150],{"categories":3261},[188],{"categories":3263},[174],{"categories":3265},[132],{"categories":3267},[132],{"categories":3269},[],{"categories":3271},[],{"categories":3273},[],{"categories":3275},[150],{"categories":3277},[],{"categories":3279},[],{"categories":3281},[171],{"categories":3283},[124],{"categories":3285},[],{"categories":3287},[127],{"categories":3289},[188],{"categories":3291},[75],{"categories":3293},[181],{"categories":3295},[124],{"categories":3297},[174],{"categories":3299},[127],{"categories":3301},[181],{"categories":3303},[],{"categories":3305},[],{"categories":3307},[132],{"categories":3309},[124],{"categories":3311},[171],{"categories":3313},[124],{"categories":3315},[132],{"categories":3317},[443],{"categories":3319},[132],{"categories":3321},[],{"categories":3323},[75],{"categories":3325},[150],{"categories":3327},[181],{"categories":332
9},[],{"categories":3331},[171],{"categories":3333},[150],{"categories":3335},[124],{"categories":3337},[132],{"categories":3339},[75],{"categories":3341},[127],{"categories":3343},[132,443],{"categories":3345},[132],{"categories":3347},[181],{"categories":3349},[75],{"categories":3351},[174],{"categories":3353},[188],{"categories":3355},[132],{"categories":3357},[],{"categories":3359},[132],{"categories":3361},[75],{"categories":3363},[127],{"categories":3365},[],{"categories":3367},[],{"categories":3369},[75],{"categories":3371},[174],{"categories":3373},[75],{"categories":3375},[],{"categories":3377},[150],{"categories":3379},[],{"categories":3381},[150],{"categories":3383},[181],{"categories":3385},[132],{"categories":3387},[75],{"categories":3389},[188],{"categories":3391},[181],{"categories":3393},[],{"categories":3395},[150],{"categories":3397},[75],{"categories":3399},[],{"categories":3401},[75],{"categories":3403},[132],{"categories":3405},[75],{"categories":3407},[132],{"categories":3409},[75],{"categories":3411},[75],{"categories":3413},[75],{"categories":3415},[75],{"categories":3417},[127],{"categories":3419},[],{"categories":3421},[135],{"categories":3423},[150],{"categories":3425},[75],{"categories":3427},[],{"categories":3429},[181],{"categories":3431},[75],{"categories":3433},[75],{"categories":3435},[132],{"categories":3437},[150],{"categories":3439},[75],{"categories":3441},[75],{"categories":3443},[127],{"categories":3445},[132],{"categories":3447},[171],{"categories":3449},[],{"categories":3451},[174],{"categories":3453},[75],{"categories":3455},[],{"categories":3457},[150],{"categories":3459},[188],{"categories":3461},[],{"categories":3463},[],{"categories":3465},[150],{"categories":3467},[150],{"categories":3469},[188],{"categories":3471},[124],{"categories":3473},[132],{"categories":3475},[132],{"categories":3477},[75],{"categories":3479},[127],{"categories":3481},[],{"categories":3483},[],{"categories":3485},[150],{"categories":3487},[174],{
"categories":3489},[181],{"categories":3491},[132],{"categories":3493},[171],{"categories":3495},[174],{"categories":3497},[174],{"categories":3499},[],{"categories":3501},[150],{"categories":3503},[75],{"categories":3505},[75],{"categories":3507},[181],{"categories":3509},[],{"categories":3511},[150],{"categories":3513},[150],{"categories":3515},[150],{"categories":3517},[],{"categories":3519},[132],{"categories":3521},[75],{"categories":3523},[],{"categories":3525},[124],{"categories":3527},[127],{"categories":3529},[],{"categories":3531},[75],{"categories":3533},[75],{"categories":3535},[],{"categories":3537},[181],{"categories":3539},[],{"categories":3541},[],{"categories":3543},[],{"categories":3545},[],{"categories":3547},[75],{"categories":3549},[150],{"categories":3551},[],{"categories":3553},[],{"categories":3555},[75],{"categories":3557},[75],{"categories":3559},[75],{"categories":3561},[174],{"categories":3563},[75],{"categories":3565},[174],{"categories":3567},[],{"categories":3569},[174],{"categories":3571},[174],{"categories":3573},[443],{"categories":3575},[132],{"categories":3577},[181],{"categories":3579},[],{"categories":3581},[],{"categories":3583},[174],{"categories":3585},[181],{"categories":3587},[181],{"categories":3589},[181],{"categories":3591},[],{"categories":3593},[124],{"categories":3595},[181],{"categories":3597},[181],{"categories":3599},[124],{"categories":3601},[181],{"categories":3603},[127],{"categories":3605},[181],{"categories":3607},[181],{"categories":3609},[181],{"categories":3611},[174],{"categories":3613},[150],{"categories":3615},[150],{"categories":3617},[75],{"categories":3619},[181],{"categories":3621},[174],{"categories":3623},[443],{"categories":3625},[174],{"categories":3627},[174],{"categories":3629},[174],{"categories":3631},[],{"categories":3633},[127],{"categories":3635},[],{"categories":3637},[443],{"categories":3639},[181],{"categories":3641},[181],{"categories":3643},[181],{"categories":3645},[132],{"categories
":3647},[150,127],{"categories":3649},[174],{"categories":3651},[],{"categories":3653},[],{"categories":3655},[174],{"categories":3657},[],{"categories":3659},[174],{"categories":3661},[150],{"categories":3663},[132],{"categories":3665},[],{"categories":3667},[181],{"categories":3669},[75],{"categories":3671},[171],{"categories":3673},[],{"categories":3675},[75],{"categories":3677},[],{"categories":3679},[150],{"categories":3681},[124],{"categories":3683},[174],{"categories":3685},[],{"categories":3687},[181],{"categories":3689},[150],[3691,3776,3827,3899],{"id":3692,"title":3693,"ai":3694,"body":3699,"categories":3750,"created_at":76,"date_modified":76,"description":67,"extension":77,"faq":76,"featured":78,"kicker_label":76,"meta":3751,"navigation":103,"path":3762,"published_at":3763,"question":76,"scraped_at":3764,"seo":3765,"sitemap":3766,"source_id":3767,"source_name":3768,"source_type":111,"source_url":3769,"stem":3770,"tags":3771,"thumbnail_url":76,"tldr":3773,"tweet":76,"unknown_tags":3774,"__hash__":3775},"summaries\u002Fsummaries\u002Fadd9ec06f3d8b78d-decoder-only-transformers-drive-gpt-scaling-summary.md","Decoder-Only Transformers Drive GPT Scaling",{"provider":7,"model":8,"input_tokens":3695,"output_tokens":3696,"processing_time_ms":3697,"cost_usd":3698},8457,1685,17671,0.00202705,{"type":14,"value":3700,"toc":3744},[3701,3705,3708,3711,3715,3718,3721,3725,3728,3731,3735,3738,3741],[17,3702,3704],{"id":3703},"self-attention-enables-parallel-long-range-dependencies","Self-Attention Enables Parallel Long-Range Dependencies",[22,3706,3707],{},"Transformers replace RNNs' sequential processing, which suffers vanishing gradients beyond 50-100 words, with self-attention that computes direct relationships between all token pairs simultaneously. 
For a token like \"it\" in \"The cat sat on the mat and looked at the fishbowl because it was hungry,\" every prior word votes on relevance via query-key dot products scaled by embed_size^{-0.5}, softmax-normalized, and applied to values. This parallelism lets training scale across thousands of GPUs.",[22,3709,3710],{},"GPT's decoder-only design strips away the encoder, applying a causal mask to block future tokens, forcing rich representations solely from predicting the next token. GPT-1 (117M params, 12 layers) showed modest NLP scores, but GPT-2 (1.5B params) gained zero-shot abilities like summarization via prompting. GPT-3 (175B params, 96 layers) added in-context learning from prompt examples without fine-tuning. Deeper layers progress from syntax (early) to reasoning and world models (late). This simplicity scales better than encoder-decoder setups, avoiding cross-attention overhead.",[17,3712,3714],{"id":3713},"moe-and-test-time-compute-scale-beyond-dense-models","MoE and Test-Time Compute Scale Beyond Dense Models",[22,3716,3717],{},"Dense models activate all parameters per token, making trillions unaffordable. Mixture of Experts (MoE) routes each token to 2-8 specialized experts out of 128+, activating ~5% of weights—e.g., DeepSeek-V3 uses 37B active out of 671B total, trained for $5.6M on 2,048 H800 GPUs, matching GPT-4. Multi-Head Latent Attention (MLA) compresses KV cache to cut memory bandwidth. Tradeoffs include expert collapse (router overloads few experts) and full-model memory needs despite sparse activation.",[22,3719,3720],{},"o1 introduced test-time compute: generate internal reasoning chains (30s for hard problems), backtrack dead ends, and refine via RL on verifiable rewards like math solutions. This outperforms larger instant-response models, decoupling ability from size. GPT-5 routes simple queries fast (System 1) and complex ones deeply (System 2). 
Open models like DeepSeek-R1 replicate this.",[17,3722,3724],{"id":3723},"multimodal-fusion-and-real-world-impacts","Multimodal Fusion and Real-World Impacts",[22,3726,3727],{},"Early fusion embeds vision tokens from Vision Transformers (e.g., MetaCLIP) into the same space as text, enabling unified attention across modalities—no separate captioning. Models like LLaMA 4, Qwen-VL handle charts, 3D spatial reasoning (GLM-4.5V's rotated positional encoding). This yields native cross-modal reasoning, e.g., diagnosing X-rays directly.",[22,3729,3730],{},"Applications: Harvey AI (RAG + fine-tuned GPT-4) cuts legal review 40-60%; GPT-4.1 hits 54.6% on SWE-bench (21.4pp over GPT-4o), ingesting 1M-token codebases; 75% medical accuracy accelerates drug discovery. Open weights (LLaMA, DeepSeek) ensure data sovereignty.",[17,3732,3734],{"id":3733},"implement-mini-gpt-from-scratch-in-pytorch","Implement Mini-GPT from Scratch in PyTorch",[22,3736,3737],{},"Build a character-level GPT: Tokenizer maps unique chars to indices (vocab_size ~50). SelfAttention computes QKV projections, scores = (Q @ K.T) * scale, weights = softmax(scores), out = weights @ V. TransformerBlock adds residual attention + FFN (4x expand, ReLU), with LayerNorm after each sublayer.",[22,3739,3740],{},"MiniGPT stacks NUM_LAYERS=2 blocks on token + positional embeddings (BLOCK_SIZE=32), outputs logits via linear to vocab_size. Train on dataset.txt: batch BATCH_SIZE=16 sequences, predict next token with CrossEntropyLoss, Adam at 3e-4, 20 EPOCHS. Generation: sample from last-token softmax via multinomial, append up to 100 tokens from context like \"AI is\".",[22,3742,3743],{},"Project structure: data\u002Fdataset.txt, model\u002F{tokenizer,attention,transformer,gpt}.py, train.py saves model.pth, generate.py loads\u002Finfers. Config: EMBED_SIZE=64, NUM_HEADS=4 (implied in attention). 
This replicates core logic scalably.",{"title":67,"searchDepth":68,"depth":68,"links":3745},[3746,3747,3748,3749],{"id":3703,"depth":68,"text":3704},{"id":3713,"depth":68,"text":3714},{"id":3723,"depth":68,"text":3724},{"id":3733,"depth":68,"text":3734},[75],{"content_references":3752,"triage":3758},[3753],{"type":3754,"title":3755,"author":3756,"context":3757},"paper","Attention Is All You Need","Ashish Vaswani’s team","cited",{"relevance":100,"novelty":3759,"quality":100,"actionability":68,"composite":3760,"reasoning":3761},3,3.4,"Category: AI & LLMs. The article provides a detailed explanation of the architecture behind GPT models, which is relevant for developers looking to integrate AI features. However, while it offers insights into model design, it lacks practical applications or frameworks that the audience can directly implement.","\u002Fsummaries\u002Fadd9ec06f3d8b78d-decoder-only-transformers-drive-gpt-scaling-summary","2026-04-18 19:32:29","2026-04-19 01:22:04",{"title":3693,"description":67},{"loc":3762},"add9ec06f3d8b78d","Python in Plain English","https:\u002F\u002Fpython.plainenglish.io\u002Fthe-architecture-behind-gpt-models-de61992c088a?source=rss----78073def27b8---4","summaries\u002Fadd9ec06f3d8b78d-decoder-only-transformers-drive-gpt-scaling-summary",[115,116,117,3772],"coding","GPT models use decoder-only transformers with causal masking for next-token prediction, enabling emergent zero-shot and in-context learning when scaled massively, now enhanced by MoE for efficiency and reasoning 
chains.",[],"FYS789V3fqVrHXGVHROYyiskRFH6nT84_QPvh_I63p0",{"id":3777,"title":3778,"ai":3779,"body":3784,"categories":3812,"created_at":76,"date_modified":76,"description":67,"extension":77,"faq":76,"featured":78,"kicker_label":76,"meta":3813,"navigation":103,"path":3814,"published_at":3815,"question":76,"scraped_at":76,"seo":3816,"sitemap":3817,"source_id":3818,"source_name":3819,"source_type":111,"source_url":3820,"stem":3821,"tags":3822,"thumbnail_url":76,"tldr":3824,"tweet":76,"unknown_tags":3825,"__hash__":3826},"summaries\u002Fsummaries\u002Fkarpathy-s-pure-python-ai-from-scratch-summary.md","Karpathy's Pure Python AI From Scratch",{"provider":7,"model":8,"input_tokens":3780,"output_tokens":3781,"processing_time_ms":3782,"cost_usd":3783},4820,1448,12742,0.0012176,{"type":14,"value":3785,"toc":3807},[3786,3790,3793,3797,3800,3804],[17,3787,3789],{"id":3788},"minimal-code-for-core-ai-models","Minimal Code for Core AI Models",[22,3791,3792],{},"Train and run a full GPT in just 200 lines of dependency-free Python, covering tokenization, model architecture, training loop, and sampling—proving LLMs are accessible without frameworks. Similarly, implement deep RL to master Atari Pong from raw pixels using policy gradients, weighing pros (sample efficiency) against cons (high variance). Character-level RNNs generate poetry, LaTeX, and code; analyze gradients to spot future directions like better optimization. Fool ImageNet classifiers with tiny perturbations, showing even linear models (not just convnets) break easily, challenging robustness claims.",[17,3794,3796],{"id":3795},"historical-benchmarks-and-progress","Historical Benchmarks and Progress",[22,3798,3799],{},"Revisit LeCun's 1989 backprop-trained neural net—the first real-world end-to-end DL app—then upgrade it with 33 years of advances (e.g., modern optimizers, architectures) to quantify progress; preview how 2022 DL will age by 2055. Humans hit 6.7% error@5 on ImageNet vs. 
top convnets, but manual CIFAR-10 labeling reveals human baselines aren't unbeatable. Early CV state (2012) lags far behind human vision, tempering AI hype.",[17,3801,3803],{"id":3802},"practical-training-and-experiments","Practical Training and Experiments",[22,3805,3806],{},"Follow a battle-tested recipe for neural nets: batch size 0.2-10% of GPU memory, weak regularization first, then strengthen; cosine anneal LR over 1M steps. Scrape 2M selfies to train convnets classifying good\u002Fbad #selfies, visualizing what networks 'think'. Track productivity via window\u002Fkeystroke logging on Ubuntu\u002FOSX, generating HTML viz for insights. Biohacking basics: tweak energy metabolism via experiments. PhD survival: navigate academia with tips on focus, advising.",{"title":67,"searchDepth":68,"depth":68,"links":3808},[3809,3810,3811],{"id":3788,"depth":68,"text":3789},{"id":3795,"depth":68,"text":3796},{"id":3802,"depth":68,"text":3803},[75],{},"\u002Fsummaries\u002Fkarpathy-s-pure-python-ai-from-scratch-summary","2026-04-08 21:21:19",{"title":3778,"description":67},{"loc":3814},"2ff230eac68aac35","Andrej Karpathy Blog","https:\u002F\u002Funknown","summaries\u002Fkarpathy-s-pure-python-ai-from-scratch-summary",[116,115,3823,117],"deep-learning","Andrej Karpathy distills neural nets, LLMs, RL, and Bitcoin into 200-500 line pure Python scripts—no deps needed—to teach core mechanics 
hands-on.",[],"SiA702o4JPFym6Ze2kqREo-Ap1fC_lo4d1oWU5fAQzM",{"id":3828,"title":3829,"ai":3830,"body":3835,"categories":3887,"created_at":76,"date_modified":76,"description":67,"extension":77,"faq":76,"featured":78,"kicker_label":76,"meta":3888,"navigation":103,"path":3889,"published_at":3815,"question":76,"scraped_at":76,"seo":3890,"sitemap":3891,"source_id":3892,"source_name":3893,"source_type":111,"source_url":3820,"stem":3894,"tags":3895,"thumbnail_url":76,"tldr":3896,"tweet":76,"unknown_tags":3897,"__hash__":3898},"summaries\u002Fsummaries\u002Fmicrogpt-py-full-gpt-in-300-lines-of-pure-python-summary.md","microgpt.py: Full GPT in 300 Lines of Pure Python",{"provider":7,"model":8,"input_tokens":3831,"output_tokens":3832,"processing_time_ms":3833,"cost_usd":3834},11786,1242,8684,0.0029557,{"type":14,"value":3836,"toc":3882},[3837,3841,3857,3861,3868,3872],[17,3838,3840],{"id":3839},"custom-autograd-engine-powers-end-to-end-training","Custom Autograd Engine Powers End-to-End Training",[22,3842,3843,3844,3848,3849,3852,3853,3856],{},"Implements automatic differentiation via ",[3845,3846,3847],"code",{},"Value"," class with slots for efficiency. Supports add, mul, pow, log, exp, ReLU, and backward via topological sort on computation graph. Chain rule propagates gradients recursively: ",[3845,3850,3851],{},"child.grad += local_grad * v.grad",". Enables full forward\u002Fbackward without libraries. For a names dataset (32k lines from ",[3845,3854,3855],{},"names.txt","), builds char-level tokenizer: unique chars (vocab_size=~30+1 BOS token). Model params (~10k total): 1 layer, n_embd=16, block_size=16, n_head=4 (head_dim=4). Weights initialized Gaussian std=0.08. Embeddings: wte (vocab x 16), wpe (16 x 16), lm_head (vocab x 16). 
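The described Value class can be sketched in pure Python (add/mul only here; the gist also covers pow, log, exp, ReLU):

```python
class Value:
    """Scalar autograd node: data, grad, a local backward rule,
    and children in the computation graph, with __slots__ for efficiency."""
    __slots__ = ("data", "grad", "_children", "_backward")

    def __init__(self, data, children=()):
        self.data, self.grad = data, 0.0
        self._children, self._backward = children, lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():                  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():                  # chain rule: child.grad += local_grad * out.grad
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        topo, seen = [], set()
        def build(v):                     # topological sort of the graph
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):          # propagate gradients root -> leaves
            v._backward()

a, b = Value(2.0), Value(3.0)
loss = a * b + a   # d(loss)/da = b + 1 = 4, d(loss)/db = a = 2
loss.backward()
```

The `+=` accumulation matters: a node used twice (like `a` here) must sum gradients from both paths.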
Per layer: QKV (4x 16x16), Wo (16x16), MLP fc1 (64x16), fc2 (16x64).",[17,3858,3860],{"id":3859},"gpt-architecture-mirrors-gpt-2-essentials","GPT Architecture Mirrors GPT-2 Essentials",[22,3862,3863,3864,3867],{},"Forward pass: token+pos embeds → RMSNorm → residual blocks. Attention: raw dot-product (scaled by 1\u002Fsqrt(head_dim)), softmax weights → weighted V sum → Wo projection. Causal via key\u002Fvalue history append (no mask). MLP: RMSNorm → fc1 → ReLU → fc2 → residual. Final lm_head logits → softmax probs. Uses RMSNorm (",[3845,3865,3866],{},"scale = (mean(x^2)+eps)^-0.5",") over LayerNorm, ReLU over GeLU, no biases. Keys\u002Fvalues persist across positions for KV cache simulation. Loss: average -log P(next_token) over sequence (BOS-wrapped docs, up to block_size=16).",[17,3869,3871],{"id":3870},"adam-training-inference-in-1000-steps","Adam Training + Inference in 1000 Steps",[22,3873,3874,3875,3878,3879,3881],{},"Shuffles 32k names, cycles through docs. Per step: tokenize ",[26,3876,3877],{},"BOS"," + chars + ",[26,3880,3877],{},", forward all positions (building KV cache), average cross-entropy loss → backward → Adam update (lr=0.01 linear decay to 0, β1=0.85, β2=0.99). Prints loss (drops from ~3 to ~1.5 typically). Inference: start BOS, sample argmax-probs (temp=0.5) until BOS, yields plausible names like 'korsal' after training. Demonstrates: core GPT is simple; libs optimize speed\u002Fscale. 
Trade-off: slow (minutes on CPU), but reveals every op.",{"title":67,"searchDepth":68,"depth":68,"links":3883},[3884,3885,3886],{"id":3839,"depth":68,"text":3840},{"id":3859,"depth":68,"text":3860},{"id":3870,"depth":68,"text":3871},[75],{},"\u002Fsummaries\u002Fmicrogpt-py-full-gpt-in-300-lines-of-pure-python-summary",{"title":3829,"description":67},{"loc":3889},"56d2bdaaa16d5c3b","Andrej Karpathy Gists","summaries\u002Fmicrogpt-py-full-gpt-in-300-lines-of-pure-python-summary",[115,116,117,3772],"Trains a tiny GPT on names dataset using custom autograd—no deps, no PyTorch—to generate realistic names, distilling the core transformer algorithm.",[],"3fO1PHuRnDxVHEXFsDwlj_bugbD79pZ1c6UEJVeKQE8",{"id":3900,"title":3901,"ai":3902,"body":3907,"categories":4004,"created_at":76,"date_modified":76,"description":67,"extension":77,"faq":76,"featured":78,"kicker_label":76,"meta":4005,"navigation":103,"path":4030,"published_at":76,"question":76,"scraped_at":4031,"seo":4032,"sitemap":4033,"source_id":4034,"source_name":4035,"source_type":111,"source_url":4036,"stem":4037,"tags":4038,"thumbnail_url":76,"tldr":4040,"tweet":76,"unknown_tags":4041,"__hash__":4042},"summaries\u002Fsummaries\u002Fbb2ba5cfd07cd36e-flashattention-2-4x-faster-exact-attention-on-gpus-summary.md","FlashAttention: 2-4x Faster Exact Attention on GPUs",{"provider":7,"model":8,"input_tokens":3903,"output_tokens":3904,"processing_time_ms":3905,"cost_usd":3906},9962,2114,53702,0.0025421,{"type":14,"value":3908,"toc":3998},[3909,3913,3916,3919,3923,3934,3945,3949,3968,3987,3991],[17,3910,3912],{"id":3911},"io-aware-kernel-design-cuts-memory-and-boosts-speed","IO-Aware Kernel Design Cuts Memory and Boosts Speed",[22,3914,3915],{},"FlashAttention computes exact attention without storing the full N^2 attention matrix or gradients, using GPU tiling to maximize SRAM usage and minimize HBM reads\u002Fwrites. 
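The no-materialization idea can be sketched in NumPy as an online softmax over K/V tiles (illustrative only; the real kernel fuses this per GPU block in SRAM):

```python
import numpy as np

def naive_attention(Q, K, V, scale):
    S = (Q @ K.T) * scale                          # materializes full N x N matrix
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    return (P / P.sum(axis=-1, keepdims=True)) @ V

def tiled_attention(Q, K, V, scale, block=4):
    """Streams K/V in tiles, keeping a running max and denominator per query
    row, so no N x N score matrix is ever stored (FlashAttention's tiling idea)."""
    N = Q.shape[0]
    out = np.zeros((N, V.shape[1]))
    row_max = np.full((N, 1), -np.inf)
    denom = np.zeros((N, 1))
    for j in range(0, N, block):                   # one K/V tile at a time
        S = (Q @ K[j:j + block].T) * scale         # N x block tile of scores
        new_max = np.maximum(row_max, S.max(axis=-1, keepdims=True))
        correction = np.exp(row_max - new_max)     # rescale earlier partial sums
        P = np.exp(S - new_max)
        out = out * correction + P @ V[j:j + block]
        denom = denom * correction + P.sum(axis=-1, keepdims=True)
        row_max = new_max
    return out / denom

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(8, 4)) for _ in range(3))
assert np.allclose(naive_attention(Q, K, V, 0.5), tiled_attention(Q, K, V, 0.5))
```

Because the running max and denominator are corrected as each tile arrives, the tiled result is exact, not an approximation.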
This yields 2-4x end-to-end speedups in transformer training on A100 GPUs (e.g., 2.4x for GPT-2 style models) and 3-5x memory savings, enabling longer sequences like 64k tokens on single A100 vs. 16k baseline. Backward pass fuses dP computation with dV, avoiding extra softmax. FlashAttention-2 improves parallelism with better work partitioning (50-73% TFLOPS utilization on A100), supports bf16 on Ampere+, head dims to 256, causal masks aligned to bottom-right for decoder use, and sliding window attention (window_size=(left,right)).",[22,3917,3918],{},"Trade-offs: Requires Ampere+ GPUs (A100\u002FRTX30\u002F40\u002FH100); head dim >192 backward needed A100\u002FH100 originally but now works on consumer GPUs without dropout since v2.5.5. Deterministic backward option trades minor speed\u002Fmemory for reproducibility.",[17,3920,3922],{"id":3921},"installation-matches-hardware-for-peak-performance","Installation Matches Hardware for Peak Performance",[22,3924,3925,3926,3929,3930,3933],{},"Install via ",[3845,3927,3928],{},"pip install flash-attn --no-build-isolation"," (3-5 min compile with ninja on 64-core, CUDA 12+). Needs PyTorch 2.2+, packaging\u002Fpsutil\u002Fninja. Limit jobs with ",[3845,3931,3932],{},"MAX_JOBS=4"," on low-RAM machines. ROCm 6.0+ supports MI200+\u002FRDNA3\u002F4 GPUs via composable_kernel (default, fp16\u002Fbf16 fwd\u002Fbwd) or Triton backend (fp16\u002Fbf16\u002Ffp32, causal\u002FMQA\u002FGQA\u002Fpaged\u002FFP8). Use Nvidia\u002FROCm PyTorch containers for deps.",[22,3935,3936,3937,3940,3941,3944],{},"Beta FlashAttention-3 (H100\u002FH800, CUDA 12.3+, FP16\u002FBF16 fwd\u002Fbwd, FP8 fwd) via separate install; FlashAttention-4 (CuTeDSL, H100\u002FB200, ",[3845,3938,3939],{},"pip install flash-attn-4[cu13]",") for Hopper\u002FBlackwell. 
Huggingface kernels offer drop-in via ",[3845,3942,3943],{},"get_kernel('kernels-community\u002Fflash-attn2')",".",[17,3946,3948],{"id":3947},"usage-replaces-standard-attention-with-kv-cache-support","Usage Replaces Standard Attention with KV Cache Support",[22,3950,3951,3952,3955,3956,3959,3960,3963,3964,3967],{},"Core: ",[3845,3953,3954],{},"out = flash_attn_func(q, k, v, softmax_scale=1\u002Fmath.sqrt(d), causal=True, dropout_p=0.0)"," or ",[3845,3957,3958],{},"flash_attn_qkvpacked_func(qkv)"," for packed inputs (faster bwd). Supports MQA\u002FGQA (nheads_Q % nheads_KV == 0), ALiBi (",[3845,3961,3962],{},"alibi_slopes","), softcapping (Gemma\u002FGrok), paged KV cache (",[3845,3965,3966],{},"block_table","), variable seq lens.",[22,3969,3970,3971,3974,3975,3978,3979,3982,3983,3986],{},"Inference: ",[3845,3972,3973],{},"flash_attn_with_kvcache(q, k_cache, v_cache, k=new_k, v=new_v, rotary_cos\u002Fsin, cache_seqlens)"," updates cache inplace, applies RoPE, causal\u002Flocal masks. Example causal mask for seqlen_q=2, seqlen_k=5: attends to last 2+3 positions bottom-right aligned. Integrate in MHA via ",[3845,3976,3977],{},"flash_attn\u002Fmodules\u002Fmha.py",". Set ",[3845,3980,3981],{},"dropout_p=0.0"," eval; ",[3845,3984,3985],{},"deterministic=True"," bwd for reproducibility.",[17,3988,3990],{"id":3989},"evolutions-unlock-new-workloads","Evolutions Unlock New Workloads",[22,3992,3993,3994,3997],{},"v2.0: 2x faster rewrite, ",[3845,3995,3996],{},"flash_attn_varlen_*"," for ragged batches. v2.1+: Causal realignment, inference opts (split KV load for seqlen_q=1). v2.3+: Sliding window (Mistral 7B). v2.4+: ALiBi, deterministic bwd. v2.5+: PagedAttention. v2.6+: Softcap. v2.7+: torch.compile compat. 
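The bottom-right-aligned causal mask quoted earlier (seqlen_q=2, seqlen_k=5) can be sketched in NumPy; the boolean convention and helper name here are mine, not the library's API:

```python
import numpy as np

def bottom_right_causal_mask(seqlen_q, seqlen_k, window=None):
    """True = may attend. Bottom-right alignment: query i attends key j
    iff j <= i + (seqlen_k - seqlen_q); an optional (left, right) sliding
    window further restricts the band around that diagonal."""
    i = np.arange(seqlen_q)[:, None]
    j = np.arange(seqlen_k)[None, :]
    offset = seqlen_k - seqlen_q
    mask = j <= i + offset
    if window is not None:                # window_size=(left, right) semantics
        left, right = window
        mask &= (j >= i + offset - left) & (j <= i + offset + right)
    return mask

print(bottom_right_causal_mask(2, 5).astype(int))
```

With seqlen_q=2, seqlen_k=5 the first query row attends the first 4 keys and the second row all 5, matching the decoder-style alignment described above.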
Widely adopted (usage.md lists integrations).",{"title":67,"searchDepth":68,"depth":68,"links":3999},[4000,4001,4002,4003],{"id":3911,"depth":68,"text":3912},{"id":3921,"depth":68,"text":3922},{"id":3947,"depth":68,"text":3948},{"id":3989,"depth":68,"text":3990},[75],{"content_references":4006,"triage":4027},[4007,4011,4015,4018,4021,4024],{"type":3754,"title":4008,"author":4009,"url":4010,"context":3757},"FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness","Tri Dao, Daniel Y. Fu, Stefano Ermon, Atri Rudra, Christopher Ré","https:\u002F\u002Farxiv.org\u002Fabs\u002F2205.14135",{"type":3754,"title":4012,"author":4013,"url":4014,"context":3757},"FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning","Tri Dao","https:\u002F\u002Ftridao.me\u002Fpublications\u002Fflash2\u002Fflash2.pdf",{"type":3754,"title":4016,"author":4013,"url":4017,"context":3757},"FlashAttention-3","https:\u002F\u002Ftridao.me\u002Fpublications\u002Fflash3\u002Fflash3.pdf",{"type":3754,"title":4019,"url":4020,"context":3757},"PagedAttention","https:\u002F\u002Farxiv.org\u002Fabs\u002F2309.06180",{"type":94,"title":4022,"url":4023,"context":85},"IEEE Spectrum article on MLPerf 2.0","https:\u002F\u002Fspectrum.ieee.org\u002Fmlperf-rankings-2022",{"type":82,"title":4025,"url":4026,"context":97},"huggingface\u002Fkernels","https:\u002F\u002Fgithub.com\u002Fhuggingface\u002Fkernels",{"relevance":99,"novelty":100,"quality":100,"actionability":100,"composite":4028,"reasoning":4029},4.35,"Category: AI & LLMs. The article provides a detailed explanation of how to implement FlashAttention to improve transformer training efficiency, addressing a specific pain point for AI developers looking to optimize performance. 
It includes practical installation instructions and usage examples, making it actionable for the target audience.","\u002Fsummaries\u002Fbb2ba5cfd07cd36e-flashattention-2-4x-faster-exact-attention-on-gpus-summary","2026-04-16 03:01:06",{"title":3901,"description":67},{"loc":4030},"bb2ba5cfd07cd36e","__oneoff__","https:\u002F\u002Fgithub.com\u002FDao-AILab\u002Fflash-attention","summaries\u002Fbb2ba5cfd07cd36e-flashattention-2-4x-faster-exact-attention-on-gpus-summary",[115,117,116,4039],"ai-tools","Replace PyTorch's scaled_dot_product_attention with FlashAttention kernels to cut transformer training memory by 3x+ and speed up by 2-4x via IO-aware tiling that fuses softmax and skips materializing N^2 attention matrix.",[],"p9Q0kYcZPBLc6PM17T6f5gjDVL-qvZ13UNiKPL_ijhY"]