[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary":3,"summaries-facets-categories":107,"summary-related-1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary":3676},{"id":4,"title":5,"ai":6,"body":13,"categories":68,"created_at":70,"date_modified":70,"description":61,"extension":71,"faq":70,"featured":72,"kicker_label":70,"meta":73,"navigation":89,"path":90,"published_at":91,"question":70,"scraped_at":92,"seo":93,"sitemap":94,"source_id":95,"source_name":96,"source_type":97,"source_url":98,"stem":99,"tags":100,"thumbnail_url":70,"tldr":104,"tweet":70,"unknown_tags":105,"__hash__":106},"summaries\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary.md","BLT Cuts Inference Bandwidth 50-92% via Diffusion & Speculation",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8589,2722,30748,0.00305615,{"type":14,"value":15,"toc":60},"minimark",[16,21,25,29,37,40,44,47,50,53,57],[17,18,20],"h2",{"id":19},"blts-memory-bandwidth-bottleneck-in-byte-level-generation","BLT's Memory Bandwidth Bottleneck in Byte-Level Generation",[22,23,24],"p",{},"Byte-level models like BLT avoid tokenization pitfalls—noise sensitivity, poor multilingual support, weak character\u002Fcode handling—by processing raw bytes via entropy-based patches (avg 4 bytes, max 8). Computation uses local encoder, global Transformer, local decoder on latent tokens. Inference slows because autoregressive decoder generates one byte\u002Fstep, vs. tokens covering multiple bytes. This multiplies memory loads for weights\u002FKV caches, the key serving bottleneck. 
BLT needs roughly 4x more decoder passes than token models for equivalent text, inflating bandwidth costs.",[17,26,28],{"id":27},"block-diffusion-enables-multi-byte-decoding-per-pass-blt-d","Block Diffusion Enables Multi-Byte Decoding per Pass (BLT-D)",[22,30,31,32,36],{},"BLT-D replaces byte-by-byte autoregression with discrete diffusion in fixed blocks (B=4\u002F8\u002F16 bytes). Training: corrupt blocks by masking bytes independently with prob t~U(0,1); loss combines next-byte prediction on clean seq + masked prediction on corrupted. Inference: start with ",[33,34,35],"span",{},"MASK"," block, iteratively unmask multiple bytes\u002Fpass via confidence (prob>α) or entropy-bounded (cumulative entropy\u003Cγ) sampling. Encoder\u002Fglobal called once\u002Fblock, not per-patch; supports KV caching.",[22,38,39],{},"At 3B params on BLT-1T (1T tokens), BLT-D-4 matches BLT scores on FLORES-101 translation (French\u002FEnglish, German\u002FEnglish; 4-shot BLEU), nears on HumanEval\u002FMBPP coding (0\u002F3-shot pass@1). BLT-D-16 cuts bandwidth 87-92% but drops coding pass@1. Likelihoods (ARC-Easy\u002FChallenge, PIQA, HellaSwag, MMLU) near baseline via causal-masked decoder. Translation gains most; coding sensitive to block size. Entropy-bounded + top-p boosts diversity (higher type-token ratio) as NFEs rise.",[17,41,43],{"id":42},"no-training-speculation-recycles-existing-decoder-blt-s-blt-dv","No-Training Speculation Recycles Existing Decoder (BLT-S, BLT-DV)",[22,45,46],{},"BLT-S uses lightweight decoder as self-drafter: generate k=8\u002F16 bytes ignoring patch boundaries, conditioning on last latent; verify via full encode\u002Fglobal\u002Fdecode, accept to first mismatch. Greedy decoding guarantees identical output to BLT (no quality loss); reduces encoder\u002Fglobal calls despite more decoder passes. At 3B\u002Fk=16, 77% bandwidth cut.",[22,48,49],{},"BLT-DV (on BLT-D weights): one-step diffusion drafts block, autoregressive verify accepts to mismatch. 
Single-step diffusion alone degrades quality, but verification restores it. At 3B, up to 81% bandwidth reduction.",[22,51,52],{},"All trained 1B:240k steps, 3B:480k on BLT-1T (public + Datacomp-LM subset). Efficiency proxies: decoder\u002Fencoder NFEs, GB bandwidth (16-bit, param\u002Fforward counts). Wall-clock gains need an optimized serving implementation.",[17,54,56],{"id":55},"practical-tradeoffs-for-production-deployment","Practical Tradeoffs for Production Deployment",[22,58,59],{},"BLT-D fastest (esp B=16) but coding tradeoffs; BLT-S zero-loss safest. All preserve autoregressive likelihoods\u002Freasoning. Bandwidth proxies predict real gains in memory-bound serving. Future: optimized inference impl. Byte-level now viable for production-scale speed without tokenizer fragility.",{"title":61,"searchDepth":62,"depth":62,"links":63},"",2,[64,65,66,67],{"id":19,"depth":62,"text":20},{"id":27,"depth":62,"text":28},{"id":42,"depth":62,"text":43},{"id":55,"depth":62,"text":56},[69],"AI & LLMs",null,"md",false,{"content_references":74,"triage":84},[75,80],{"type":76,"title":77,"url":78,"context":79},"paper","Fast Byte Latent Transformer That Reduces Inference Memory Bandwidth by Over 50% Without Tokenization","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2605.08044","recommended",{"type":76,"title":81,"url":82,"context":83},"Byte Latent Transformer (BLT): A Tokenizer-Free Model That Scales Efficiently","https:\u002F\u002Fwww.marktechpost.com\u002F2024\u002F12\u002F13\u002Fmeta-ai-introduces-byte-latent-transformer-blt-a-tokenizer-free-model-that-scales-efficiently\u002F","cited",{"relevance":85,"novelty":86,"quality":86,"actionability":62,"composite":87,"reasoning":88},3,4,3.25,"Category: AI & LLMs. The article discusses a new approach to improving inference bandwidth in AI models, which is relevant to AI engineering. 
However, it lacks practical applications or frameworks that the audience can directly implement, focusing instead on theoretical advancements.",true,"\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary","2026-05-11 17:52:15","2026-05-12 15:01:28",{"title":5,"description":61},{"loc":90},"1dcaa9cf36eee656","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F11\u002Fmeta-and-stanford-researchers-propose-fast-byte-latent-transformer-that-reduces-inference-memory-bandwidth-by-over-50-without-tokenization\u002F","summaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary",[101,102,103],"llm","machine-learning","research","Meta\u002FStanford researchers accelerate Byte Latent Transformer (BLT) inference with BLT-D (diffusion decoding), BLT-S (self-speculation), and BLT-DV (diffusion+verification), reducing memory bandwidth 50-92% at 3B params while nearing baseline performance on translation\u002Fcoding 
tasks.",[],"xMZyx1diuvh2XXZUy_NPhOgWy_XqDJeXjel738dmvjs",[108,111,114,116,119,122,124,126,128,130,132,134,137,139,141,143,145,147,149,151,153,155,158,161,163,165,168,170,172,175,177,179,181,183,185,187,189,191,193,195,197,199,201,203,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,430,432,434,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,107
0,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,187
0,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,267
0,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,347
0,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674],{"categories":109},[110],"Developer Productivity",{"categories":112},[113],"Business & SaaS",{"categories":115},[69],{"categories":117},[118],"AI Automation",{"categories":120},[121],"Product Strategy",{"categories":123},[69],{"categories":125},[110],{"categories":127},[113],{"categories":129},[],{"categories":131},[69],{"categories":133},[],{"categories":135},[136],"AI News & Trends",{"categories":138},[118],{"categories":140},[136],{"categories":142},[118],{"categories":144},[118],{"categories":146},[69],{"categories":148},[69],{"categories":150},[136],{"categories":152},[69],{"categories":154},[],{"categories":156},[157],"Design & Frontend",{"categories":159},[160],"Data Science & Visualization",{"categories":162},[136],{"categories":164},[],{"categories":166},[167],"Software Engineering",{"categories":169},[69],{"categories":171},[118],{"categories":173},[174],"Marketing & 
Growth",{"categories":176},[69],{"categories":178},[118],{"categories":180},[],{"categories":182},[],{"categories":184},[157],{"categories":186},[118],{"categories":188},[110],{"categories":190},[157],{"categories":192},[69],{"categories":194},[118],{"categories":196},[136],{"categories":198},[],{"categories":200},[],{"categories":202},[118],{"categories":204},[167],{"categories":206},[],{"categories":208},[113],{"categories":210},[],{"categories":212},[],{"categories":214},[118],{"categories":216},[118],{"categories":218},[69],{"categories":220},[],{"categories":222},[167],{"categories":224},[],{"categories":226},[],{"categories":228},[],{"categories":230},[69],{"categories":232},[174],{"categories":234},[157],{"categories":236},[157],{"categories":238},[69],{"categories":240},[118],{"categories":242},[69],{"categories":244},[69],{"categories":246},[118],{"categories":248},[118],{"categories":250},[160],{"categories":252},[136],{"categories":254},[118],{"categories":256},[174],{"categories":258},[118],{"categories":260},[121],{"categories":262},[],{"categories":264},[118],{"categories":266},[],{"categories":268},[118],{"categories":270},[167],{"categories":272},[157],{"categories":274},[69],{"categories":276},[],{"categories":278},[],{"categories":280},[118],{"categories":282},[],{"categories":284},[69],{"categories":286},[],{"categories":288},[110],{"categories":290},[167],{"categories":292},[113],{"categories":294},[136],{"categories":296},[69],{"categories":298},[],{"categories":300},[69],{"categories":302},[],{"categories":304},[167],{"categories":306},[160],{"categories":308},[],{"categories":310},[69],{"categories":312},[157],{"categories":314},[],{"categories":316},[157],{"categories":318},[118],{"categories":320},[],{"categories":322},[118],{"categories":324},[136],{"categories":326},[69],{"categories":328},[],{"categories":330},[118],{"categories":332},[69],{"categories":334},[121],{"categories":336},[],{"categories":338},[69],{"categories":340},[118],{"ca
tegories":342},[118],{"categories":344},[],{"categories":346},[160],{"categories":348},[69],{"categories":350},[],{"categories":352},[110],{"categories":354},[113],{"categories":356},[69],{"categories":358},[118],{"categories":360},[167],{"categories":362},[69],{"categories":364},[],{"categories":366},[],{"categories":368},[69],{"categories":370},[],{"categories":372},[157],{"categories":374},[],{"categories":376},[69],{"categories":378},[],{"categories":380},[118],{"categories":382},[69],{"categories":384},[157],{"categories":386},[],{"categories":388},[69],{"categories":390},[69],{"categories":392},[113],{"categories":394},[118],{"categories":396},[69],{"categories":398},[157],{"categories":400},[118],{"categories":402},[],{"categories":404},[],{"categories":406},[136],{"categories":408},[],{"categories":410},[69],{"categories":412},[113,174],{"categories":414},[],{"categories":416},[69],{"categories":418},[],{"categories":420},[],{"categories":422},[69],{"categories":424},[],{"categories":426},[69],{"categories":428},[429],"DevOps & 
Cloud",{"categories":431},[],{"categories":433},[136],{"categories":435},[157],{"categories":437},[],{"categories":439},[136],{"categories":441},[136],{"categories":443},[69],{"categories":445},[174],{"categories":447},[],{"categories":449},[113],{"categories":451},[],{"categories":453},[69,429],{"categories":455},[69],{"categories":457},[69],{"categories":459},[118],{"categories":461},[69,167],{"categories":463},[160],{"categories":465},[69],{"categories":467},[174],{"categories":469},[118],{"categories":471},[118],{"categories":473},[],{"categories":475},[118],{"categories":477},[69,113],{"categories":479},[],{"categories":481},[157],{"categories":483},[157],{"categories":485},[],{"categories":487},[],{"categories":489},[136],{"categories":491},[],{"categories":493},[110],{"categories":495},[167],{"categories":497},[69],{"categories":499},[157],{"categories":501},[118],{"categories":503},[167],{"categories":505},[136],{"categories":507},[157],{"categories":509},[],{"categories":511},[69],{"categories":513},[69],{"categories":515},[69],{"categories":517},[136],{"categories":519},[110],{"categories":521},[69],{"categories":523},[118],{"categories":525},[429],{"categories":527},[157],{"categories":529},[118],{"categories":531},[],{"categories":533},[],{"categories":535},[157],{"categories":537},[136],{"categories":539},[160],{"categories":541},[],{"categories":543},[69],{"categories":545},[69],{"categories":547},[113],{"categories":549},[69],{"categories":551},[69],{"categories":553},[136],{"categories":555},[],{"categories":557},[118],{"categories":559},[167],{"categories":561},[],{"categories":563},[69],{"categories":565},[69],{"categories":567},[118],{"categories":569},[],{"categories":571},[],{"categories":573},[69],{"categories":575},[],{"categories":577},[113],{"categories":579},[118],{"categories":581},[],{"categories":583},[110],{"categories":585},[69],{"categories":587},[113],{"categories":589},[136],{"categories":591},[],{"categories":593},[],{"categories":
595},[],{"categories":597},[136],{"categories":599},[136],{"categories":601},[],{"categories":603},[],{"categories":605},[113],{"categories":607},[],{"categories":609},[],{"categories":611},[110],{"categories":613},[],{"categories":615},[174],{"categories":617},[118],{"categories":619},[113],{"categories":621},[118],{"categories":623},[],{"categories":625},[121],{"categories":627},[157],{"categories":629},[167],{"categories":631},[69],{"categories":633},[118],{"categories":635},[113],{"categories":637},[69],{"categories":639},[],{"categories":641},[],{"categories":643},[167],{"categories":645},[160],{"categories":647},[121],{"categories":649},[118],{"categories":651},[69],{"categories":653},[],{"categories":655},[429],{"categories":657},[],{"categories":659},[118],{"categories":661},[],{"categories":663},[],{"categories":665},[69],{"categories":667},[157],{"categories":669},[174],{"categories":671},[118],{"categories":673},[],{"categories":675},[110],{"categories":677},[],{"categories":679},[136],{"categories":681},[69,429],{"categories":683},[136],{"categories":685},[69],{"categories":687},[113],{"categories":689},[69],{"categories":691},[],{"categories":693},[113],{"categories":695},[],{"categories":697},[167],{"categories":699},[157],{"categories":701},[136],{"categories":703},[160],{"categories":705},[110],{"categories":707},[69],{"categories":709},[167],{"categories":711},[],{"categories":713},[],{"categories":715},[121],{"categories":717},[],{"categories":719},[69],{"categories":721},[],{"categories":723},[157],{"categories":725},[157],{"categories":727},[157],{"categories":729},[],{"categories":731},[],{"categories":733},[136],{"categories":735},[118],{"categories":737},[69],{"categories":739},[69],{"categories":741},[69],{"categories":743},[113],{"categories":745},[69],{"categories":747},[],{"categories":749},[167],{"categories":751},[167],{"categories":753},[113],{"categories":755},[],{"categories":757},[69],{"categories":759},[69],{"categories":761},[113],
{"categories":763},[136],{"categories":765},[174],{"categories":767},[118],{"categories":769},[],{"categories":771},[157],{"categories":773},[],{"categories":775},[69],{"categories":777},[],{"categories":779},[113],{"categories":781},[118],{"categories":783},[],{"categories":785},[429],{"categories":787},[160],{"categories":789},[167],{"categories":791},[174],{"categories":793},[167],{"categories":795},[118],{"categories":797},[],{"categories":799},[],{"categories":801},[118],{"categories":803},[110],{"categories":805},[118],{"categories":807},[121],{"categories":809},[113],{"categories":811},[],{"categories":813},[69],{"categories":815},[121],{"categories":817},[69],{"categories":819},[69],{"categories":821},[174],{"categories":823},[157],{"categories":825},[118],{"categories":827},[],{"categories":829},[],{"categories":831},[429],{"categories":833},[167],{"categories":835},[],{"categories":837},[118],{"categories":839},[69],{"categories":841},[157,69],{"categories":843},[110],{"categories":845},[],{"categories":847},[69],{"categories":849},[110],{"categories":851},[157],{"categories":853},[118],{"categories":855},[167],{"categories":857},[],{"categories":859},[69],{"categories":861},[],{"categories":863},[110],{"categories":865},[],{"categories":867},[118],{"categories":869},[121],{"categories":871},[69],{"categories":873},[69],{"categories":875},[157],{"categories":877},[118],{"categories":879},[429],{"categories":881},[157],{"categories":883},[118],{"categories":885},[69],{"categories":887},[69],{"categories":889},[69],{"categories":891},[136],{"categories":893},[],{"categories":895},[121],{"categories":897},[118],{"categories":899},[157],{"categories":901},[118],{"categories":903},[167],{"categories":905},[157],{"categories":907},[118],{"categories":909},[136],{"categories":911},[],{"categories":913},[69],{"categories":915},[157],{"categories":917},[69],{"categories":919},[110],{"categories":921},[136],{"categories":923},[69],{"categories":925},[174],{"categori
es":927},[69],{"categories":929},[69],{"categories":931},[118],{"categories":933},[118],{"categories":935},[69],{"categories":937},[118],{"categories":939},[157],{"categories":941},[69],{"categories":943},[],{"categories":945},[],{"categories":947},[167],{"categories":949},[],{"categories":951},[110],{"categories":953},[429],{"categories":955},[],{"categories":957},[110],{"categories":959},[113],{"categories":961},[174],{"categories":963},[],{"categories":965},[113],{"categories":967},[],{"categories":969},[],{"categories":971},[],{"categories":973},[],{"categories":975},[],{"categories":977},[69],{"categories":979},[118],{"categories":981},[429],{"categories":983},[110],{"categories":985},[69],{"categories":987},[167],{"categories":989},[121],{"categories":991},[69],{"categories":993},[174],{"categories":995},[69],{"categories":997},[69],{"categories":999},[69],{"categories":1001},[69,110],{"categories":1003},[167],{"categories":1005},[167],{"categories":1007},[157],{"categories":1009},[69],{"categories":1011},[],{"categories":1013},[],{"categories":1015},[],{"categories":1017},[167],{"categories":1019},[160],{"categories":1021},[136],{"categories":1023},[157],{"categories":1025},[],{"categories":1027},[69],{"categories":1029},[69],{"categories":1031},[],{"categories":1033},[],{"categories":1035},[118],{"categories":1037},[69],{"categories":1039},[113],{"categories":1041},[],{"categories":1043},[110],{"categories":1045},[69],{"categories":1047},[110],{"categories":1049},[69],{"categories":1051},[167],{"categories":1053},[174],{"categories":1055},[69,157],{"categories":1057},[136],{"categories":1059},[157],{"categories":1061},[],{"categories":1063},[429],{"categories":1065},[157],{"categories":1067},[118],{"categories":1069},[],{"categories":1071},[],{"categories":1073},[],{"categories":1075},[],{"categories":1077},[167],{"categories":1079},[118],{"categories":1081},[118],{"categories":1083},[69],{"categories":1085},[69],{"categories":1087},[],{"categories":1089},[1
57],{"categories":1091},[],{"categories":1093},[],{"categories":1095},[118],{"categories":1097},[],{"categories":1099},[],{"categories":1101},[174],{"categories":1103},[174],{"categories":1105},[118],{"categories":1107},[],{"categories":1109},[69],{"categories":1111},[69],{"categories":1113},[167],{"categories":1115},[157],{"categories":1117},[157],{"categories":1119},[118],{"categories":1121},[110],{"categories":1123},[69],{"categories":1125},[157],{"categories":1127},[157],{"categories":1129},[118],{"categories":1131},[118],{"categories":1133},[69],{"categories":1135},[],{"categories":1137},[],{"categories":1139},[69],{"categories":1141},[118],{"categories":1143},[136],{"categories":1145},[167],{"categories":1147},[110],{"categories":1149},[69],{"categories":1151},[],{"categories":1153},[118],{"categories":1155},[118],{"categories":1157},[],{"categories":1159},[110],{"categories":1161},[69],{"categories":1163},[110],{"categories":1165},[110],{"categories":1167},[],{"categories":1169},[],{"categories":1171},[118],{"categories":1173},[118],{"categories":1175},[69],{"categories":1177},[69],{"categories":1179},[136],{"categories":1181},[160],{"categories":1183},[121],{"categories":1185},[136],{"categories":1187},[157],{"categories":1189},[],{"categories":1191},[136],{"categories":1193},[],{"categories":1195},[],{"categories":1197},[],{"categories":1199},[],{"categories":1201},[167],{"categories":1203},[160],{"categories":1205},[],{"categories":1207},[69],{"categories":1209},[69],{"categories":1211},[160],{"categories":1213},[167],{"categories":1215},[],{"categories":1217},[],{"categories":1219},[118],{"categories":1221},[136],{"categories":1223},[136],{"categories":1225},[118],{"categories":1227},[110],{"categories":1229},[69,429],{"categories":1231},[],{"categories":1233},[157],{"categories":1235},[110],{"categories":1237},[118],{"categories":1239},[157],{"categories":1241},[],{"categories":1243},[118],{"categories":1245},[118],{"categories":1247},[69],{"categories":
1249},[174],{"categories":1251},[167],{"categories":1253},[157],{"categories":1255},[],{"categories":1257},[118],{"categories":1259},[69],{"categories":1261},[118],{"categories":1263},[118],{"categories":1265},[118],{"categories":1267},[174],{"categories":1269},[118],{"categories":1271},[69],{"categories":1273},[],{"categories":1275},[174],{"categories":1277},[136],{"categories":1279},[118],{"categories":1281},[],{"categories":1283},[],{"categories":1285},[69],{"categories":1287},[118],{"categories":1289},[136],{"categories":1291},[118],{"categories":1293},[],{"categories":1295},[],{"categories":1297},[],{"categories":1299},[118],{"categories":1301},[],{"categories":1303},[],{"categories":1305},[160],{"categories":1307},[69],{"categories":1309},[160],{"categories":1311},[136],{"categories":1313},[69],{"categories":1315},[69],{"categories":1317},[118],{"categories":1319},[69],{"categories":1321},[],{"categories":1323},[],{"categories":1325},[429],{"categories":1327},[],{"categories":1329},[],{"categories":1331},[110],{"categories":1333},[],{"categories":1335},[],{"categories":1337},[],{"categories":1339},[],{"categories":1341},[167],{"categories":1343},[136],{"categories":1345},[174],{"categories":1347},[113],{"categories":1349},[69],{"categories":1351},[69],{"categories":1353},[113],{"categories":1355},[],{"categories":1357},[157],{"categories":1359},[118],{"categories":1361},[113],{"categories":1363},[69],{"categories":1365},[69],{"categories":1367},[110],{"categories":1369},[],{"categories":1371},[110],{"categories":1373},[69],{"categories":1375},[174],{"categories":1377},[118],{"categories":1379},[136],{"categories":1381},[113],{"categories":1383},[69],{"categories":1385},[118],{"categories":1387},[],{"categories":1389},[69],{"categories":1391},[110],{"categories":1393},[69],{"categories":1395},[],{"categories":1397},[136],{"categories":1399},[69],{"categories":1401},[],{"categories":1403},[113],{"categories":1405},[69],{"categories":1407},[],{"categories":1409},
[3677,3747,3827,3897],{"id":3678,"title":3679,"ai":3680,"body":3685,"categories":3716,"created_at":70,"date_modified":70,"description":61,"extension":71,"faq":70,"featured":72,"kicker_label":70,"meta":3717,"navigation":89,"path":3734,"published_at":3735,"question":70,"scraped_at":3736,"seo":3737,"sitemap":3738,"source_id":3739,"source_name":3740,"source_type":97,"source_url":3741,"stem":3742,"tags":3743,"thumbnail_url":70,"tldr":3744,"tweet":70,"unknown_tags":3745,"__hash__":3746},"summaries\u002Fsummaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary.md","LLM Scaling Works via Strong Superposition",{"provider":7,"model":8,"input_tokens":3681,"output_tokens":3682,"processing_time_ms":3683,"cost_usd":3684},4549,1921,23559,0.00136345,{"type":14,"value":3686,"toc":3711},[3687,3691,3694,3697,3701,3704,3708],[17,3688,3690],{"id":3689},"superposition-drives-predictable-error-reduction","Superposition Drives Predictable Error Reduction",[22,3692,3693],{},"Language models represent tens of thousands of tokens in spaces with only thousands of dimensions by using superposition: squeezing multiple concepts into the same dimensions with slight overlaps. In the dominant 'strong superposition' regime, every token gets represented, and error stems from overlap noise, not dropped rare tokens. 
Doubling model width (m) halves error via the geometric 1\u002Fm relationship, yielding power-law scaling (exponent ~1) regardless of data distribution. Weak superposition, where only common tokens are stored cleanly, requires power-law token frequencies for scaling—less reliable for natural language's flatter distributions.",[22,3695,3696],{},"This mechanistic view improves on prior assumptions: real LLMs don't discard rare tokens but overlap everything, matching theory with measured overlap strength shrinking at 1\u002Fm.",[17,3698,3700],{"id":3699},"validation-across-real-models-matches-theory","Validation Across Real Models Matches Theory",[22,3702,3703],{},"Analysis of output layers in OPT, GPT-2, Qwen2.5, and Pythia (100M to 70B parameters) confirms strong superposition: all tokens represented with overlaps scaling at 1\u002Fm. Observed exponent of 0.91 aligns with theory's 1; DeepMind's Chinchilla data hits 0.88. Simplified models toggling overlap regimes show that scaling emerges directly from geometry, not just data power laws ('power law in, power law out').",[17,3705,3707],{"id":3706},"limits-and-optimization-opportunities","Limits and Optimization Opportunities",[22,3709,3710],{},"Scaling halts when width equals vocabulary size—no more overlaps needed, error from superposition vanishes, breaking power laws. Natural language's even frequencies limit speedup, but uneven domains (e.g., specialized vocab) enable steeper curves. Architectures promoting denser packing, like Nvidia's nGPT (vectors on unit sphere), boost performance at fixed size. 
Trade-off: denser overlaps hinder mechanistic interpretability, complicating AI safety.",{"title":61,"searchDepth":62,"depth":62,"links":3712},[3713,3714,3715],{"id":3689,"depth":62,"text":3690},{"id":3699,"depth":62,"text":3700},{"id":3706,"depth":62,"text":3707},[],{"content_references":3718,"triage":3732},[3719,3723,3727],{"type":76,"title":3720,"author":3721,"url":3722,"context":83},"Toy Model of Superposition","Anthropic","https:\u002F\u002Ftransformer-circuits.pub\u002F2022\u002Ftoy_model\u002Findex.html",{"type":76,"title":3724,"author":3725,"url":3726,"context":83},"Chinchilla","DeepMind","https:\u002F\u002Fthe-decoder.com\u002Fdeepmind-artificial-intelligence-is-far-from-being-fed-up\u002F",{"type":76,"title":3728,"author":3729,"url":3730,"context":3731},"nGPT","Nvidia","https:\u002F\u002Farxiv.org\u002Fabs\u002F2410.01131","mentioned",{"relevance":85,"novelty":86,"quality":86,"actionability":62,"composite":87,"reasoning":3733},"Category: AI & LLMs. The article discusses the mechanics of LLM scaling through strong superposition, which is relevant to AI engineering. 
It presents new insights into how model width affects prediction error, but lacks practical applications or frameworks that the audience can directly implement.","\u002Fsummaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary","2026-05-03 08:42:45","2026-05-03 17:01:29",{"title":3679,"description":61},{"loc":3734},"5c8a61f1aa3cea08","The Decoder","https:\u002F\u002Fthe-decoder.com\u002Fmit-study-explains-why-scaling-language-models-works-so-reliably\u002F","summaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary",[101,102,103],"LLMs pack all tokens into limited dimensions via overlapping vectors (strong superposition), causing prediction error to halve when model width doubles—explaining reliable power-law scaling.",[],"TxCrmsO7g860jqMKD8Z7LhJqkiaTNkcDx-Z3AQT2GA0",{"id":3748,"title":3749,"ai":3750,"body":3755,"categories":3797,"created_at":70,"date_modified":70,"description":61,"extension":71,"faq":70,"featured":72,"kicker_label":70,"meta":3798,"navigation":89,"path":3815,"published_at":70,"question":70,"scraped_at":3816,"seo":3817,"sitemap":3818,"source_id":3819,"source_name":3820,"source_type":97,"source_url":3821,"stem":3822,"tags":3823,"thumbnail_url":70,"tldr":3824,"tweet":70,"unknown_tags":3825,"__hash__":3826},"summaries\u002Fsummaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary.md","LLM Pretraining Scaling: FSDP Wins Until Comms Crater",{"provider":7,"model":8,"input_tokens":3751,"output_tokens":3752,"processing_time_ms":3753,"cost_usd":3754},8296,2378,19998,0.00282555,{"type":14,"value":3756,"toc":3791},[3757,3761,3764,3767,3771,3774,3778,3781,3785,3788],[17,3758,3760],{"id":3759},"fsdp-dominates-parallelism-until-scale-forces-pipeline-trade-offs","FSDP Dominates Parallelism Until Scale Forces Pipeline Trade-offs",[22,3762,3763],{},"Pretraining FLOPs = 6ND (2 forward + 4 backward per param-token). 
Data parallel (DP) copies weights across GPUs but hits HBM limits (B300: 288GB). Fully Sharded Data Parallel (FSDP) shards params per layer across GPUs, all-gathering full weights per layer (forward\u002Fbackward) while overlapping comms with compute since weights are layer-independent. FSDP comms: params×3 (all-gather forward\u002Fback + reduce-scatter backward), 50% more than DP's params×2 all-reduce, since each all-gather costs half an all-reduce. Use hierarchical collectives across NVLink domains: reduce-scatter intra-domain, all-reduce shards inter-domain, all-gather intra-domain to saturate IB bandwidth.",[22,3765,3766],{},"Comms time stays flat with GPU count (ring all-reduce chunks scale inversely with participants), but compute drops linearly, cratering MFU at 'crossover' (comms > compute). Delay crossover by larger batches (more compute\u002FGPU) or sparser models; TPUs excel with bigger domains. Batch size caps FSDP at ~1K GPUs (e.g., 10M-token batch, 10K seq len = 1K seqs). Add pipeline parallelism (PP) next, but it introduces bubbles (idle GPUs at batch start\u002Fend) unfillable in training due to per-batch gradient sync. PP constrains architecture (e.g., Kimi's cross-layer attention, mixed attention types cause stage imbalance), slowing research.",[17,3768,3770],{"id":3769},"distillation-remains-cheap-and-evasion-proof","Distillation Remains Cheap and Evasion-Proof",[22,3772,3773],{},"Frontier labs can't halt distillation: 1T tokens from Opus 4.6 costs $25M ($25\u002FMTok), commoditizing open models rapidly (cf. FineWeb 18.5T, OpenWebText 9B). Hiding chain-of-thought (CoT) fails—instruct no-think\u002Fdirect solve or RLVR on reconstructed CoT. Core value in local tool use (file edits, bash) evades cloud hiding; users resist workflow migration. 
Products atop APIs distill better: reward 'gold diffs' (final user-accepted code) over rejected intermediates from 10+ turn sessions.",[17,3775,3777],{"id":3776},"agentic-ai-shifts-cybersecurity-toward-defense","Agentic AI Shifts Cybersecurity Toward Defense",[22,3779,3780],{},"Mythos chains 5+ vulns into exploits (vs. prior single-vuln finds), but software is more secure now despite human probing—sudden AI intelligence influx likely strengthens defense via industry patching (e.g., Glasswing reveals zero-days). AI excels at vuln finding over patching (XKCD: fixes break edge cases\u002Ffeatures). Solutions: LLM-port C to Rust; formal verification (e.g., seL4 proofs); patching mirrors LLM bug-finding in others' repos. Hoarding Mythos risky—build\u002Frelease classifiers rejecting cyberattack intents (Anthropic plans for 4.7). Evade classifiers by subproblems (harmless vulns). Patching own code routine for coding LLMs.",[17,3782,3784],{"id":3783},"pipeline-rl-fixes-stragglers-causalitybias-dooms-runs","Pipeline RL Fixes Stragglers; Causality\u002FBias Dooms Runs",[22,3786,3787],{},"RL responses grow in mean and variance of length, creating stragglers that tank GPU utilization. Pipeline RL does 'in-flight weight updates': swap generating model mid-trajectory post-training step, ensuring recent-model rollouts without full offline RL off-policyness.",[22,3789,3790],{},"Pretraining fails via causality breaks (MoE expert-choice routes token n+k affecting n; token-dropping ignores early for later matches—rumored Llama 4\u002FGemini 2 flops) or bias (FP16 collectives round large sums wrong, e.g., post-1024 granularity skips +1; GPT-4 initial bug). Bias compounds > variance. New scale unveils bespoke issues (numerics, kernels)—not 5 fixable failure modes. RL inference needs training-engine fidelity (numerical drift biases); enforce disciplined compute multipliers to avoid bug stacks. 
Kernel optimization AGI-hard (Nvidia took ages for Blackwell).",{"title":61,"searchDepth":62,"depth":62,"links":3792},[3793,3794,3795,3796],{"id":3759,"depth":62,"text":3760},{"id":3769,"depth":62,"text":3770},{"id":3776,"depth":62,"text":3777},{"id":3783,"depth":62,"text":3784},[],{"content_references":3799,"triage":3812},[3800,3804,3807],{"type":3801,"title":3802,"url":3803,"context":3731},"podcast","Conversation with Michael Nielsen","https:\u002F\u002Fwww.dwarkesh.com\u002Fp\u002Fmichael-nielsen",{"type":76,"title":3805,"url":3806,"context":83},"Pipeline RL","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2509.19128",{"type":3808,"title":3809,"author":3810,"url":3811,"context":3731},"other","Pretraining parallelisms lecture","Horace He","https:\u002F\u002Fhorace.io\u002F",{"relevance":86,"novelty":85,"quality":86,"actionability":62,"composite":3813,"reasoning":3814},3.4,"Category: AI & LLMs. The article discusses the practical application of Fully Sharded Data Parallel (FSDP) for scaling pretraining in LLMs, which addresses a specific pain point for AI developers regarding efficient model training. 
However, while it provides technical insights, it lacks concrete actionable steps that the audience could directly implement.","\u002Fsummaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary","2026-04-19 01:22:25",{"title":3749,"description":61},{"loc":3815},"d445780e74d7b6ed","Dwarkesh Patel","https:\u002F\u002Fwww.dwarkesh.com\u002Fp\u002Fwhat-i-learned-april-15","summaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary",[101,102,103],"Use FSDP as default for scaling pretraining (params×3 comms overhead) until GPU count hits comms crossover; distillation costs $25M\u002FT from frontier models, unstoppable via tool use; training fails from causality breaks and FP16 bias.",[],"UCftWL3lVDs_ij_juNq8mtYfE_yqIH5SLhHL1KTHG3s",{"id":3828,"title":3829,"ai":3830,"body":3835,"categories":3869,"created_at":70,"date_modified":70,"description":61,"extension":71,"faq":70,"featured":72,"kicker_label":70,"meta":3870,"navigation":89,"path":3884,"published_at":3885,"question":70,"scraped_at":3886,"seo":3887,"sitemap":3888,"source_id":3889,"source_name":96,"source_type":97,"source_url":3890,"stem":3891,"tags":3892,"thumbnail_url":70,"tldr":3894,"tweet":70,"unknown_tags":3895,"__hash__":3896},"summaries\u002Fsummaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary.md","Parcae Stabilizes Loops to Match 2x Transformer Quality",{"provider":7,"model":8,"input_tokens":3831,"output_tokens":3832,"processing_time_ms":3833,"cost_usd":3834},8134,2375,17913,0.00230745,{"type":14,"value":3836,"toc":3864},[3837,3841,3844,3847,3851,3854,3857,3861],[17,3838,3840],{"id":3839},"designing-stable-looped-architectures","Designing Stable Looped Architectures",[22,3842,3843],{},"Looped transformers route activations through a fixed block of layers T times, boosting compute without adding parameters—ideal for memory-constrained edge deployment. 
Parcae uses a middle-looped structure: prelude (P) embeds input to latent e; recurrent block (R) updates hidden state h_t for T loops with e injected each iteration; coda (C) outputs from final h_T. Prior looped models like RDMs fail due to residual state explosion and loss spikes from unconstrained dynamics.",[22,3845,3846],{},"Model the loop as a nonlinear dynamical system: h_{t+1} = Ā h_t + B̄ e + R̄(h_t, e). Stability requires spectral norm ρ(Ā) \u003C 1. Parcae discretizes a continuous system using zero-order hold and Euler integration with learned step Δ: Ā = exp(Δ A), B̄ = Δ B. Constrain A as diagonal with negative entries A = Diag(-exp(log A)), ensuring ρ(Ā) \u003C 1 by design—no hyperparameter tuning needed for convergence. This fixes addition-based (ρ(Ā)=1, marginal) and concatenation-projection (ρ(Ā)>1, unstable) flaws in priors.",[17,3848,3850],{"id":3849},"beating-baselines-with-parameter-efficiency","Beating Baselines with Parameter Efficiency",[22,3852,3853],{},"On Huginn, 350M Parcae drops validation perplexity 6.3% vs RDMs (10.76 to 10.09 PPL), 9.1% on WikiText, +1.8 downstream accuracy points. At 100M, 4.5% PPL gain (14.23 to 13.59). On FineWeb-Edu (104B tokens, nanochat setup), 1.3B Parcae scores 2.99 points higher on Core, 1.18 on Core-Extended than parameter-matched Transformers. Critically, 770M Parcae hits 25.07 Core—matching 1.3B Transformer's 25.45—delivering up to 87.5% of twice-sized Transformer's quality.",[22,3855,3856],{},"Looping adds an orthogonal scaling axis: isoFLOP tests at 140M\u002F370M show looped Parcae (optimal mean recurrence μ_rec) beats fixed-depth (μ_rec=1) by 1.2-2.0 Core points under same params\u002FFLOPs.",[17,3858,3860],{"id":3859},"first-scaling-laws-for-recurrence-depth","First Scaling Laws for Recurrence Depth",[22,3862,3863],{},"Optimal μ_rec scales as C^{0.40}, training tokens as C^{0.78} (C= FLOP budget), holding across scales. 
Test-time loop count T beyond training saturates via L(T) = L_∞ + Z e^{-z T}, plateauing near training μ_rec—setting a ceiling on extrapolation. This parametric law predicts held-out loss with 0.85-1.31% error, enabling reliable planning: train deeper loops for compute-optimal quality without memory bloat.",{"title":61,"searchDepth":62,"depth":62,"links":3865},[3866,3867,3868],{"id":3839,"depth":62,"text":3840},{"id":3849,"depth":62,"text":3850},{"id":3859,"depth":62,"text":3860},[69],{"content_references":3871,"triage":3881},[3872,3875,3878],{"type":76,"title":3873,"url":3874,"context":79},"Parcae","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2604.12946",{"type":3808,"title":3876,"url":3877,"context":79},"Parcae Model Weights","https:\u002F\u002Fhuggingface.co\u002Fcollections\u002FSandyResearch\u002Fparcae",{"type":3808,"title":3879,"url":3880,"context":79},"Parcae Technical Details","https:\u002F\u002Fwww.together.ai\u002Fblog\u002Fparcae",{"relevance":85,"novelty":85,"quality":86,"actionability":62,"composite":3882,"reasoning":3883},3.05,"Category: AI & LLMs. The article discusses a new architecture for looped transformers, which is relevant to AI engineering, but it lacks practical applications or frameworks that the audience can directly implement. 
While it presents some new insights into model efficiency, it does not provide actionable steps for product builders.","\u002Fsummaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary","2026-04-16 08:30:30","2026-04-19 01:22:43",{"title":3829,"description":61},{"loc":3884},"c6f1bc88e627db47","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F16\u002Fucsd-and-together-ai-research-introduces-parcae-a-stable-architecture-for-looped-language-models-that-achieves-the-quality-of-a-transformer-twice-the-size\u002F","summaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary",[101,102,3893,103],"deep-learning","Parcae enforces looped transformer stability via negative diagonal matrices in a dynamical system, outperforming baselines and achieving 87.5% of a twice-sized Transformer's quality at half parameters.",[],"w5bUNLMbNnMepMdiskfNW1esyE0__I9nWhPLCEPtMq8",{"id":3898,"title":3899,"ai":3900,"body":3905,"categories":3933,"created_at":70,"date_modified":70,"description":61,"extension":71,"faq":70,"featured":72,"kicker_label":70,"meta":3934,"navigation":89,"path":3938,"published_at":70,"question":70,"scraped_at":3939,"seo":3940,"sitemap":3941,"source_id":3942,"source_name":3943,"source_type":97,"source_url":3944,"stem":3945,"tags":3946,"thumbnail_url":70,"tldr":3948,"tweet":70,"unknown_tags":3949,"__hash__":3950},"summaries\u002Fsummaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary.md","FinanceBench: LLM Eval Dataset for SEC Filing QA",{"provider":7,"model":8,"input_tokens":3901,"output_tokens":3902,"processing_time_ms":3903,"cost_usd":3904},10599,1737,10323,0.00296565,{"type":14,"value":3906,"toc":3928},[3907,3911,3914,3918,3921,3925],[17,3908,3910],{"id":3909},"core-structure-enables-llm-financial-reasoning-benchmarks","Core Structure Enables LLM Financial Reasoning Benchmarks",[22,3912,3913],{},"FinanceBench structures QA pairs from public company SEC filings 
(10K, 10Q, 8K) across sectors like Industrials (3M), IT (Adobe), Utilities (AES). Key columns include financebench_id, company, doc_name (e.g., 3M_2018_10K), question_type (metrics-generated, domain-relevant, novel-generated), question_reasoning (information extraction, numerical\u002Flogical reasoning), question, answer, justification, evidence (text snippets\u002Fpages), gics_sector, doc_type, doc_period (e.g., 2018-2023), doc_link. All subsets labeled OPEN_SOURCE. Enables testing LLMs on production-grade tasks: direct extraction (e.g., 3M FY2018 CAPEX $1577M from 'Purchases of PP&E'), calculated metrics (e.g., Adobe FY2015 operating cash flow ratio 0.66 = cash from ops \u002F current liabilities), multi-year averages (Activision Blizzard FY2017-19 capex\u002Frevenue 1.9%).",[17,3915,3917],{"id":3916},"numerical-reasoning-tasks-build-real-world-ratios","Numerical Reasoning Tasks Build Real-World Ratios",[22,3919,3920],{},"Dataset stresses formula-based computations from balance sheets, income\u002Fcash flow statements. Examples: fixed asset turnover (Activision Blizzard FY2019: 24.26 = revenue \u002F avg PP&E); DPO (Amazon FY2017: 93.86 = 365 * avg payables \u002F (COGS + Δinventory)); inventory turnover (AES FY2022: 9.5 = cost of sales \u002F inventory); ROA (AES FY2022: -0.02 = net income \u002F avg total assets); FCF conversion (Adobe FY2022: improved 143% to 156% = (ops cash - CAPEX) \u002F net income); YoY changes (Amazon revenue FY16-17: 30.8%; Adobe op income FY15-16: 65.4%). 
Justifications detail line items (e.g., 'Net cash provided by operating activities') and math steps, with evidence texts\u002Fpages for verifiability.",[17,3922,3924],{"id":3923},"domain-relevant-and-novel-questions-test-analyst-insights","Domain-Relevant and Novel Questions Test Analyst Insights",[22,3926,3927],{},"Beyond extraction, probes qualitative\u002Fquantitative judgment: capital intensity (3M FY2022: no, via 5.1% CAPEX\u002Frevenue, 20% fixed assets\u002Ftotal assets, 12.4% ROA); liquidity (3M Q2 FY2023 quick ratio 0.96 = (current assets - inventory) \u002F current liabilities, needs improvement); operating margin drivers (3M FY2022 decline 1.7% from litigation\u002FPFAS exit); segment growth (3M consumer -0.9% organic excluding M&A); dividend stability (3M 65 consecutive years increases); debt securities (3M Q2 2023: MMM26\u002F30\u002F31 on NYSE); restructuring costs (AES FY2022: 0, not outlined). Novel tasks like 'segment dragging growth' or 8K agendas (Amcor 2022: debt substitution) mimic analyst workflows, grounding LLMs in evidence-based reasoning over filings.",{"title":61,"searchDepth":62,"depth":62,"links":3929},[3930,3931,3932],{"id":3909,"depth":62,"text":3910},{"id":3916,"depth":62,"text":3917},{"id":3923,"depth":62,"text":3924},[69],{"content_references":3935,"triage":3936},[],{"relevance":85,"novelty":86,"quality":86,"actionability":62,"composite":87,"reasoning":3937},"Category: AI & LLMs. The article provides a dataset for evaluating LLMs on financial QA tasks, which is relevant for AI developers looking to integrate financial reasoning into their products. 
However, while it presents novel insights into the dataset's structure and applications, it lacks actionable steps for implementation.","\u002Fsummaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary","2026-04-16 02:57:08",{"title":3899,"description":61},{"loc":3938},"df29e9b47ffb4ae6","__oneoff__","https:\u002F\u002Fhuggingface.co\u002Fdatasets\u002FPatronusAI\u002Ffinancebench","summaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary",[101,3947,102,103],"data-science","FinanceBench benchmarks LLMs on 10K+ financial QA tasks from real 10K\u002F10Q filings, covering metric extraction, numerical ratios like ROA (-0.02 for AES), and domain reasoning like liquidity via quick ratio (0.96 for 3M).",[],"PVbgs9cbbO3dtOWaj0J_mTAqpv6rBJ4-p_CvQdpOaSc"]