[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-55edf2b2761da126-spec-decoding-accelerates-rl-rollouts-1-8x-at-8b-2-summary":3,"summaries-facets-categories":124,"summary-related-55edf2b2761da126-spec-decoding-accelerates-rl-rollouts-1-8x-at-8b-2-summary":3694},{"id":4,"title":5,"ai":6,"body":13,"categories":84,"created_at":85,"date_modified":85,"description":78,"extension":86,"faq":85,"featured":87,"kicker_label":85,"meta":88,"navigation":105,"path":106,"published_at":107,"question":85,"scraped_at":108,"seo":109,"sitemap":110,"source_id":111,"source_name":112,"source_type":113,"source_url":114,"stem":115,"tags":116,"thumbnail_url":85,"tldr":121,"tweet":85,"unknown_tags":122,"__hash__":123},"summaries\u002Fsummaries\u002F55edf2b2761da126-spec-decoding-accelerates-rl-rollouts-1-8x-at-8b-2-summary.md","Spec Decoding Accelerates RL Rollouts 1.8x at 8B, 2.5x at 235B",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8885,2416,52736,0.00296235,{"type":14,"value":15,"toc":77},"minimark",[16,21,25,28,32,35,38,61,64,67,71,74],[17,18,20],"h2",{"id":19},"target-rollout-generation-to-cut-rl-training-time","Target Rollout Generation to Cut RL Training Time",[22,23,24],"p",{},"In synchronous RL post-training for tasks like math reasoning or code generation, rollout generation dominates 65-72% of step time across RL-Think (continuing reasoning models) and RL-Zero (training base models from scratch) workloads on Qwen3-8B. The five RL stages—data loading, preparation, generation, log-prob recompute (27-33%), and optimization—make generation the sole high-impact target, as other phases remain unchanged by rollout optimizations.",[22,26,27],{},"Speculative decoding addresses this by using a fast draft model to propose multiple tokens, verified by the target model via rejection sampling. 
This guarantees an output distribution identical to autoregressive generation, avoiding the off-policy corrections or fidelity loss common in async, low-precision, or replay methods. The result: faster rollouts with unchanged training signals; KL penalties and GRPO losses are computed solely on target-policy samples.",[17,29,31],{"id":30},"integrate-via-two-path-architecture-in-nemo-rl-v060","Integrate via Two-Path Architecture in NeMo RL v0.6.0",[22,33,34],{},"Embed speculative decoding directly in NeMo RL using the vLLM backend (SGLang is also supported). A two-path system handles policy updates: a general EAGLE-3 path for any pretrained draft (no native MTP needed), and a native path for MTP-equipped models. Online adaptation caches verifier hidden states and log-probs to supervise the draft head without gradients, preventing interference with the policy gradients.",[22,36,37],{},"Three configuration choices drive the speedup:",[39,40,41,49,55],"ul",{},[42,43,44,48],"li",{},[45,46,47],"strong",{},"Draft init",": Domain-aligned initialization (e.g., DAPO post-training data) beats generic (UltraChat\u002FMagpie): 1.77× vs. 1.51× generation speedup on RL-Zero at k=3.",[42,50,51,54],{},[45,52,53],{},"Draft length k",": The optimum is k=3 (1.77× RL-Zero, 1.53× RL-Think); k=5 drops to 1.44×\u002F0.84× and k=7 to 1.21×\u002F0.71×, as verification overhead outweighs the gains on complex reasoning traces.",[42,56,57,60],{},[45,58,59],{},"Online adaptation",": Boosts weak initializations (UltraChat: 1.51× to 1.63×) but adds little for strong ones (DAPO: 1.77× to 1.78×).",[22,62,63],{},"N-gram drafting fails despite accepting more than 2 tokens (0.7×\u002F0.5× speedups), proving that acceptance length alone is insufficient if verification slows net progress.",[22,65,66],{},"Speculative decoding also complements async execution: at 8B RL-Think (policy lag 1, 16 nodes), it cuts exposed generation time from 10.4s to 0.6s per step and end-to-end time from 75s to 60.5s (1.24×).",[17,68,70],{"id":69},"achieve-18-gen-14-step-speedup-at-8b-25-projected-at-235b","Achieve 1.8× Gen, 1.4× Step Speedup at 8B; 2.5× Projected at 235B",[22,72,73],{},"On 32 GB200 GPUs, 
EAGLE-3 drops RL-Zero generation from 100s to 56.6s (1.8×) and RL-Think from 133.6s to 87s (1.54×), yielding 1.41×\u002F1.35× step speedups. AIME-2024 validation accuracy matches the autoregressive baselines, confirming the lossless property.",[22,75,76],{},"A simulator projects for Qwen3-235B-A22B: synchronous 512 GB200s at k=3 (accept=3) gives 2.72× rollout\u002F1.70× end-to-end; async 2048 GPUs (lag 2) hits ~3.5× rollout\u002F2.5× end-to-end. Speculation shrinks the per-rollout cost; async hides the remainder behind compute.",{"title":78,"searchDepth":79,"depth":79,"links":80},"",2,[81,82,83],{"id":19,"depth":79,"text":20},{"id":30,"depth":79,"text":31},{"id":69,"depth":79,"text":70},[],null,"md",false,{"content_references":89,"triage":100},[90,95],{"type":91,"title":92,"url":93,"context":94},"paper","Speculative Decoding in NeMo RL","https:\u002F\u002Farxiv.org\u002Fabs\u002F2604.26779","cited",{"type":96,"title":97,"url":98,"context":99},"tool","NeMo RL","https:\u002F\u002Fgithub.com\u002FNVIDIA-NeMo\u002FRL\u002F","recommended",{"relevance":101,"novelty":102,"quality":102,"actionability":101,"composite":103,"reasoning":104},3,4,3.45,"Category: AI & LLMs. The article discusses a specific optimization technique in reinforcement learning that could be relevant for AI developers looking to improve model training efficiency. 
It provides insights into speculative decoding, which is a novel approach, but lacks detailed actionable steps for implementation.",true,"\u002Fsummaries\u002F55edf2b2761da126-spec-decoding-accelerates-rl-rollouts-1-8x-at-8b-2-summary","2026-05-02 03:47:47","2026-05-03 17:01:46",{"title":5,"description":78},{"loc":106},"55edf2b2761da126","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fa-new-nvidia-research-shows-speculative-decoding-in-nemo-rl-achieves-1-8x-rollout-generation-speedup-at-8b-and-projects-2-5x-end-to-end-speedup-at-235b\u002F","summaries\u002F55edf2b2761da126-spec-decoding-accelerates-rl-rollouts-1-8x-at-8b-2-summary",[117,118,119,120],"llm","machine-learning","research","ai-tools","Integrate speculative decoding into NeMo RL training loops using a draft model verifier setup to cut rollout generation time by 1.8× at 8B scale—65-72% of RL steps—while preserving exact output distribution, projecting 2.5× end-to-end speedup at 235B.",[],"Z_3ZvCH6IEvR0mEHJViwktGkUyhz18XLCmzMOGoC3nQ",[125,128,131,134,137,140,142,144,146,148,150,152,155,157,159,161,163,165,167,169,171,173,176,179,181,183,186,188,190,193,195,197,199,201,203,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606
,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,
1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,
2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,
3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688,3690,3692],{"categories":126},[127],"Developer Productivity",{"categories":129},[130],"Business & SaaS",{"categories":132},[133],"AI & LLMs",{"categories":135},[136],"AI Automation",{"categories":138},[139],"Product Strategy",{"categories":141},[133],{"categories":143},[127],{"categories":145},[130],{"categories":147},[],{"categories":149},[133],{"categories":151},[],{"categories":153},[154],"AI News & 
Trends",{"categories":156},[136],{"categories":158},[154],{"categories":160},[136],{"categories":162},[136],{"categories":164},[133],{"categories":166},[133],{"categories":168},[154],{"categories":170},[133],{"categories":172},[],{"categories":174},[175],"Design & Frontend",{"categories":177},[178],"Data Science & Visualization",{"categories":180},[154],{"categories":182},[],{"categories":184},[185],"Software Engineering",{"categories":187},[133],{"categories":189},[136],{"categories":191},[192],"Marketing & Growth",{"categories":194},[133],{"categories":196},[136],{"categories":198},[],{"categories":200},[],{"categories":202},[175],{"categories":204},[136],{"categories":206},[127],{"categories":208},[175],{"categories":210},[133],{"categories":212},[136],{"categories":214},[154],{"categories":216},[],{"categories":218},[],{"categories":220},[136],{"categories":222},[185],{"categories":224},[],{"categories":226},[130],{"categories":228},[],{"categories":230},[],{"categories":232},[136],{"categories":234},[136],{"categories":236},[133],{"categories":238},[],{"categories":240},[185],{"categories":242},[],{"categories":244},[],{"categories":246},[],{"categories":248},[133],{"categories":250},[192],{"categories":252},[175],{"categories":254},[175],{"categories":256},[133],{"categories":258},[136],{"categories":260},[133],{"categories":262},[133],{"categories":264},[136],{"categories":266},[136],{"categories":268},[178],{"categories":270},[154],{"categories":272},[136],{"categories":274},[192],{"categories":276},[136],{"categories":278},[139],{"categories":280},[],{"categories":282},[136],{"categories":284},[],{"categories":286},[136],{"categories":288},[185],{"categories":290},[175],{"categories":292},[133],{"categories":294},[],{"categories":296},[],{"categories":298},[136],{"categories":300},[],{"categories":302},[133],{"categories":304},[],{"categories":306},[127],{"categories":308},[185],{"categories":310},[130],{"categories":312},[154],{"categories":314},[133],{"ca
tegories":316},[],{"categories":318},[133],{"categories":320},[],{"categories":322},[185],{"categories":324},[178],{"categories":326},[],{"categories":328},[133],{"categories":330},[175],{"categories":332},[],{"categories":334},[175],{"categories":336},[136],{"categories":338},[],{"categories":340},[136],{"categories":342},[154],{"categories":344},[133],{"categories":346},[],{"categories":348},[136],{"categories":350},[133],{"categories":352},[139],{"categories":354},[],{"categories":356},[133],{"categories":358},[136],{"categories":360},[136],{"categories":362},[],{"categories":364},[178],{"categories":366},[133],{"categories":368},[],{"categories":370},[127],{"categories":372},[130],{"categories":374},[133],{"categories":376},[136],{"categories":378},[185],{"categories":380},[133],{"categories":382},[],{"categories":384},[],{"categories":386},[133],{"categories":388},[],{"categories":390},[175],{"categories":392},[],{"categories":394},[133],{"categories":396},[],{"categories":398},[136],{"categories":400},[133],{"categories":402},[175],{"categories":404},[],{"categories":406},[133],{"categories":408},[133],{"categories":410},[130],{"categories":412},[136],{"categories":414},[133],{"categories":416},[175],{"categories":418},[136],{"categories":420},[],{"categories":422},[],{"categories":424},[154],{"categories":426},[],{"categories":428},[133],{"categories":430},[130,192],{"categories":432},[],{"categories":434},[133],{"categories":436},[],{"categories":438},[],{"categories":440},[133],{"categories":442},[],{"categories":444},[133],{"categories":446},[447],"DevOps & 
Cloud",{"categories":449},[],{"categories":451},[154],{"categories":453},[175],{"categories":455},[],{"categories":457},[154],{"categories":459},[154],{"categories":461},[133],{"categories":463},[192],{"categories":465},[],{"categories":467},[130],{"categories":469},[],{"categories":471},[133,447],{"categories":473},[133],{"categories":475},[133],{"categories":477},[136],{"categories":479},[133,185],{"categories":481},[178],{"categories":483},[133],{"categories":485},[192],{"categories":487},[136],{"categories":489},[136],{"categories":491},[],{"categories":493},[136],{"categories":495},[133,130],{"categories":497},[],{"categories":499},[175],{"categories":501},[175],{"categories":503},[],{"categories":505},[],{"categories":507},[154],{"categories":509},[],{"categories":511},[127],{"categories":513},[185],{"categories":515},[133],{"categories":517},[175],{"categories":519},[136],{"categories":521},[185],{"categories":523},[154],{"categories":525},[175],{"categories":527},[],{"categories":529},[133],{"categories":531},[133],{"categories":533},[133],{"categories":535},[154],{"categories":537},[127],{"categories":539},[133],{"categories":541},[136],{"categories":543},[447],{"categories":545},[175],{"categories":547},[136],{"categories":549},[],{"categories":551},[],{"categories":553},[175],{"categories":555},[154],{"categories":557},[178],{"categories":559},[],{"categories":561},[133],{"categories":563},[133],{"categories":565},[130],{"categories":567},[133],{"categories":569},[133],{"categories":571},[154],{"categories":573},[],{"categories":575},[136],{"categories":577},[185],{"categories":579},[],{"categories":581},[133],{"categories":583},[133],{"categories":585},[136],{"categories":587},[],{"categories":589},[],{"categories":591},[133],{"categories":593},[],{"categories":595},[130],{"categories":597},[136],{"categories":599},[],{"categories":601},[127],{"categories":603},[133],{"categories":605},[130],{"categories":607},[154],{"categories":609},[],{"categories":61
1},[],{"categories":613},[],{"categories":615},[154],{"categories":617},[154],{"categories":619},[],{"categories":621},[],{"categories":623},[130],{"categories":625},[],{"categories":627},[],{"categories":629},[127],{"categories":631},[],{"categories":633},[192],{"categories":635},[136],{"categories":637},[130],{"categories":639},[136],{"categories":641},[],{"categories":643},[139],{"categories":645},[175],{"categories":647},[185],{"categories":649},[133],{"categories":651},[136],{"categories":653},[130],{"categories":655},[133],{"categories":657},[],{"categories":659},[],{"categories":661},[185],{"categories":663},[178],{"categories":665},[139],{"categories":667},[136],{"categories":669},[133],{"categories":671},[],{"categories":673},[447],{"categories":675},[],{"categories":677},[136],{"categories":679},[],{"categories":681},[],{"categories":683},[133],{"categories":685},[175],{"categories":687},[192],{"categories":689},[136],{"categories":691},[],{"categories":693},[127],{"categories":695},[],{"categories":697},[154],{"categories":699},[133,447],{"categories":701},[154],{"categories":703},[133],{"categories":705},[130],{"categories":707},[133],{"categories":709},[],{"categories":711},[130],{"categories":713},[],{"categories":715},[185],{"categories":717},[175],{"categories":719},[154],{"categories":721},[178],{"categories":723},[127],{"categories":725},[133],{"categories":727},[185],{"categories":729},[],{"categories":731},[],{"categories":733},[139],{"categories":735},[],{"categories":737},[133],{"categories":739},[],{"categories":741},[175],{"categories":743},[175],{"categories":745},[175],{"categories":747},[],{"categories":749},[],{"categories":751},[154],{"categories":753},[136],{"categories":755},[133],{"categories":757},[133],{"categories":759},[133],{"categories":761},[130],{"categories":763},[133],{"categories":765},[],{"categories":767},[185],{"categories":769},[185],{"categories":771},[130],{"categories":773},[],{"categories":775},[133],{"categories":7
77},[133],{"categories":779},[130],{"categories":781},[154],{"categories":783},[192],{"categories":785},[136],{"categories":787},[],{"categories":789},[175],{"categories":791},[],{"categories":793},[133],{"categories":795},[],{"categories":797},[130],{"categories":799},[136],{"categories":801},[],{"categories":803},[447],{"categories":805},[178],{"categories":807},[185],{"categories":809},[192],{"categories":811},[185],{"categories":813},[136],{"categories":815},[],{"categories":817},[],{"categories":819},[136],{"categories":821},[127],{"categories":823},[136],{"categories":825},[139],{"categories":827},[130],{"categories":829},[],{"categories":831},[133],{"categories":833},[139],{"categories":835},[133],{"categories":837},[133],{"categories":839},[192],{"categories":841},[175],{"categories":843},[136],{"categories":845},[],{"categories":847},[],{"categories":849},[447],{"categories":851},[185],{"categories":853},[],{"categories":855},[136],{"categories":857},[133],{"categories":859},[175,133],{"categories":861},[127],{"categories":863},[],{"categories":865},[133],{"categories":867},[127],{"categories":869},[175],{"categories":871},[136],{"categories":873},[185],{"categories":875},[],{"categories":877},[133],{"categories":879},[],{"categories":881},[127],{"categories":883},[],{"categories":885},[136],{"categories":887},[139],{"categories":889},[133],{"categories":891},[133],{"categories":893},[175],{"categories":895},[136],{"categories":897},[447],{"categories":899},[175],{"categories":901},[136],{"categories":903},[133],{"categories":905},[133],{"categories":907},[133],{"categories":909},[154],{"categories":911},[],{"categories":913},[139],{"categories":915},[136],{"categories":917},[175],{"categories":919},[136],{"categories":921},[185],{"categories":923},[175],{"categories":925},[136],{"categories":927},[154],{"categories":929},[],{"categories":931},[133],{"categories":933},[175],{"categories":935},[133],{"categories":937},[127],{"categories":939},[154],{"categor
ies":941},[133],{"categories":943},[192],{"categories":945},[133],{"categories":947},[133],{"categories":949},[136],{"categories":951},[136],{"categories":953},[133],{"categories":955},[136],{"categories":957},[175],{"categories":959},[133],{"categories":961},[],{"categories":963},[],{"categories":965},[185],{"categories":967},[],{"categories":969},[127],{"categories":971},[447],{"categories":973},[],{"categories":975},[127],{"categories":977},[130],{"categories":979},[192],{"categories":981},[],{"categories":983},[130],{"categories":985},[],{"categories":987},[],{"categories":989},[],{"categories":991},[],{"categories":993},[],{"categories":995},[133],{"categories":997},[136],{"categories":999},[447],{"categories":1001},[127],{"categories":1003},[133],{"categories":1005},[185],{"categories":1007},[139],{"categories":1009},[133],{"categories":1011},[192],{"categories":1013},[133],{"categories":1015},[133],{"categories":1017},[133],{"categories":1019},[133,127],{"categories":1021},[185],{"categories":1023},[185],{"categories":1025},[175],{"categories":1027},[133],{"categories":1029},[],{"categories":1031},[],{"categories":1033},[],{"categories":1035},[185],{"categories":1037},[178],{"categories":1039},[154],{"categories":1041},[175],{"categories":1043},[],{"categories":1045},[133],{"categories":1047},[133],{"categories":1049},[],{"categories":1051},[],{"categories":1053},[136],{"categories":1055},[133],{"categories":1057},[130],{"categories":1059},[],{"categories":1061},[127],{"categories":1063},[133],{"categories":1065},[127],{"categories":1067},[133],{"categories":1069},[185],{"categories":1071},[192],{"categories":1073},[133,175],{"categories":1075},[154],{"categories":1077},[175],{"categories":1079},[],{"categories":1081},[447],{"categories":1083},[175],{"categories":1085},[136],{"categories":1087},[],{"categories":1089},[],{"categories":1091},[],{"categories":1093},[],{"categories":1095},[185],{"categories":1097},[136],{"categories":1099},[136],{"categories":110
1},[133],{"categories":1103},[133],{"categories":1105},[],{"categories":1107},[175],{"categories":1109},[],{"categories":1111},[],{"categories":1113},[136],{"categories":1115},[],{"categories":1117},[],{"categories":1119},[192],{"categories":1121},[192],{"categories":1123},[136],{"categories":1125},[],{"categories":1127},[133],{"categories":1129},[133],{"categories":1131},[185],{"categories":1133},[175],{"categories":1135},[175],{"categories":1137},[136],{"categories":1139},[127],{"categories":1141},[133],{"categories":1143},[175],{"categories":1145},[175],{"categories":1147},[136],{"categories":1149},[136],{"categories":1151},[133],{"categories":1153},[],{"categories":1155},[],{"categories":1157},[133],{"categories":1159},[136],{"categories":1161},[154],{"categories":1163},[185],{"categories":1165},[127],{"categories":1167},[133],{"categories":1169},[],{"categories":1171},[136],{"categories":1173},[136],{"categories":1175},[],{"categories":1177},[127],{"categories":1179},[133],{"categories":1181},[127],{"categories":1183},[127],{"categories":1185},[],{"categories":1187},[],{"categories":1189},[136],{"categories":1191},[136],{"categories":1193},[133],{"categories":1195},[133],{"categories":1197},[154],{"categories":1199},[178],{"categories":1201},[139],{"categories":1203},[154],{"categories":1205},[175],{"categories":1207},[],{"categories":1209},[154],{"categories":1211},[],{"categories":1213},[],{"categories":1215},[],{"categories":1217},[],{"categories":1219},[185],{"categories":1221},[178],{"categories":1223},[],{"categories":1225},[133],{"categories":1227},[133],{"categories":1229},[178],{"categories":1231},[185],{"categories":1233},[],{"categories":1235},[],{"categories":1237},[136],{"categories":1239},[154],{"categories":1241},[154],{"categories":1243},[136],{"categories":1245},[127],{"categories":1247},[133,447],{"categories":1249},[],{"categories":1251},[175],{"categories":1253},[127],{"categories":1255},[136],{"categories":1257},[175],{"categories":1259},[]
,{"categories":1261},[136],{"categories":1263},[136],{"categories":1265},[133],{"categories":1267},[192],{"categories":1269},[185],{"categories":1271},[175],{"categories":1273},[],{"categories":1275},[136],{"categories":1277},[133],{"categories":1279},[136],{"categories":1281},[136],{"categories":1283},[136],{"categories":1285},[192],{"categories":1287},[136],{"categories":1289},[133],{"categories":1291},[],{"categories":1293},[192],{"categories":1295},[154],{"categories":1297},[136],{"categories":1299},[],{"categories":1301},[],{"categories":1303},[133],{"categories":1305},[136],{"categories":1307},[154],{"categories":1309},[136],{"categories":1311},[],{"categories":1313},[],{"categories":1315},[],{"categories":1317},[136],{"categories":1319},[],{"categories":1321},[],{"categories":1323},[178],{"categories":1325},[133],{"categories":1327},[178],{"categories":1329},[154],{"categories":1331},[133],{"categories":1333},[133],{"categories":1335},[136],{"categories":1337},[133],{"categories":1339},[],{"categories":1341},[],{"categories":1343},[447],{"categories":1345},[],{"categories":1347},[],{"categories":1349},[127],{"categories":1351},[],{"categories":1353},[],{"categories":1355},[],{"categories":1357},[],{"categories":1359},[185],{"categories":1361},[154],{"categories":1363},[192],{"categories":1365},[130],{"categories":1367},[133],{"categories":1369},[133],{"categories":1371},[130],{"categories":1373},[],{"categories":1375},[175],{"categories":1377},[136],{"categories":1379},[130],{"categories":1381},[133],{"categories":1383},[133],{"categories":1385},[127],{"categories":1387},[],{"categories":1389},[127],{"categories":1391},[133],{"categories":1393},[192],{"categories":1395},[136],{"categories":1397},[154],{"categories":1399},[130],{"categories":1401},[133],{"categories":1403},[136],{"categories":1405},[],{"categories":1407},[133],{"categories":1409},[127],{"categories":1411},[133],{"categories":1413},[],{"categories":1415},[154],{"categories":1417},[133],{"catego
[3695,3777,3986,4059],{"id":3696,"title":3697,"ai":3698,"body":3703,"categories":3753,"created_at":85,"date_modified":85,"description":78,"extension":86,"faq":85,"featured":87,"kicker_label":85,"meta":3754,"navigation":105,"path":3765,"published_at":3766,"question":85,"scraped_at":3767,"seo":3768,"sitemap":3769,"source_id":3770,"source_name":112,"source_type":113,"source_url":3771,"stem":3772,"tags":3773,"thumbnail_url":85,"tldr":3774,"tweet":85,"unknown_tags":3775,"__hash__":3776},"summaries\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary.md","BLT Cuts Inference Bandwidth 50-92% via Diffusion & Speculation",{"provider":7,"model":8,"input_tokens":3699,"output_tokens":3700,"processing_time_ms":3701,"cost_usd":3702},8589,2722,30748,0.00305615,{"type":14,"value":3704,"toc":3747},[3705,3709,3712,3716,3724,3727,3731,3734,3737,3740,3744],[17,3706,3708],{"id":3707},"blts-memory-bandwidth-bottleneck-in-byte-level-generation","BLT's Memory Bandwidth Bottleneck in Byte-Level Generation",[22,3710,3711],{},"Byte-level models like BLT avoid 
tokenization pitfalls—noise sensitivity, poor multilingual support, weak character\u002Fcode handling—by processing raw bytes via entropy-based patches (avg 4 bytes, max 8). Computation uses local encoder, global Transformer, local decoder on latent tokens. Inference slows because autoregressive decoder generates one byte\u002Fstep, vs. tokens covering multiple bytes. This multiplies memory loads for weights\u002FKV caches, the key serving bottleneck. BLT needs 4x more decoder passes than token models for equivalent text, hiking bandwidth costs.",[17,3713,3715],{"id":3714},"block-diffusion-enables-multi-byte-decoding-per-pass-blt-d","Block Diffusion Enables Multi-Byte Decoding per Pass (BLT-D)",[22,3717,3718,3719,3723],{},"BLT-D replaces byte-by-byte autoregression with discrete diffusion in fixed blocks (B=4\u002F8\u002F16 bytes). Training: corrupt blocks by masking bytes independently with prob t~U(0,1); loss combines next-byte prediction on clean seq + masked prediction on corrupted. Inference: start with ",[3720,3721,3722],"span",{},"MASK"," block, iteratively unmask multiple bytes\u002Fpass via confidence (prob>α) or entropy-bounded (cumulative entropy\u003Cγ) sampling. Encoder\u002Fglobal called once\u002Fblock, not per-patch; supports KV caching.",[22,3725,3726],{},"At 3B params on BLT-1T (1T tokens), BLT-D-4 matches BLT scores on FLORES-101 translation (French\u002FEnglish, German\u002FEnglish; 4-shot BLEU), nears on HumanEval\u002FMBPP coding (0\u002F3-shot pass@1). BLT-D-16 cuts bandwidth 87-92% but drops coding pass@1. Likelihoods (ARC-Easy\u002FChallenge, PIQA, HellaSwag, MMLU) near baseline via causal-masked decoder. Translation gains most; coding sensitive to block size. 
Entropy-bounded + top-p boosts diversity (higher type-token ratio) as NFEs rise.",[17,3728,3730],{"id":3729},"no-training-speculation-recycles-existing-decoder-blt-s-blt-dv","No-Training Speculation Recycles Existing Decoder (BLT-S, BLT-DV)",[22,3732,3733],{},"BLT-S uses lightweight decoder as self-drafter: generate k=8\u002F16 bytes ignoring patch boundaries, conditioning on last latent; verify via full encode\u002Fglobal\u002Fdecode, accept to first mismatch. Greedy decoding guarantees identical output to BLT (no quality loss); reduces encoder\u002Fglobal calls despite more decoder passes. At 3B\u002Fk=16, 77% bandwidth cut.",[22,3735,3736],{},"BLT-DV (on BLT-D weights): one-step diffusion drafts block, autoregressive verify accepts to mismatch. Single-step diffusion degrades alone but verification fixes it. At 3B, up to 81% bandwidth reduction.",[22,3738,3739],{},"All trained 1B:240k steps, 3B:480k on BLT-1T (public + Datacomp-LM subset). Efficiency proxies: decoder\u002Fencoder NFEs, GB bandwidth (16-bit, param\u002Fforward counts). Wall-clock needs optimized serving.",[17,3741,3743],{"id":3742},"practical-tradeoffs-for-production-deployment","Practical Tradeoffs for Production Deployment",[22,3745,3746],{},"BLT-D fastest (esp B=16) but coding tradeoffs; BLT-S zero-loss safest. All preserve autoregressive likelihoods\u002Freasoning. Bandwidth proxies predict real gains in memory-bound serving. Future: optimized inference impl. 
Byte-level now viable for production-scale speed without tokenizer fragility.",{"title":78,"searchDepth":79,"depth":79,"links":3748},[3749,3750,3751,3752],{"id":3707,"depth":79,"text":3708},{"id":3714,"depth":79,"text":3715},{"id":3729,"depth":79,"text":3730},{"id":3742,"depth":79,"text":3743},[133],{"content_references":3755,"triage":3762},[3756,3759],{"type":91,"title":3757,"url":3758,"context":99},"Fast Byte Latent Transformer That Reduces Inference Memory Bandwidth by Over 50% Without Tokenization","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2605.08044",{"type":91,"title":3760,"url":3761,"context":94},"Byte Latent Transformer (BLT): A Tokenizer-Free Model That Scales Efficiently","https:\u002F\u002Fwww.marktechpost.com\u002F2024\u002F12\u002F13\u002Fmeta-ai-introduces-byte-latent-transformer-blt-a-tokenizer-free-model-that-scales-efficiently\u002F",{"relevance":101,"novelty":102,"quality":102,"actionability":79,"composite":3763,"reasoning":3764},3.25,"Category: AI & LLMs. The article discusses a new approach to improving inference bandwidth in AI models, which is relevant to AI engineering. 
However, it lacks practical applications or frameworks that the audience can directly implement, focusing instead on theoretical advancements.","\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary","2026-05-11 17:52:15","2026-05-12 15:01:28",{"title":3697,"description":78},{"loc":3765},"1dcaa9cf36eee656","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F11\u002Fmeta-and-stanford-researchers-propose-fast-byte-latent-transformer-that-reduces-inference-memory-bandwidth-by-over-50-without-tokenization\u002F","summaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary",[117,118,119],"Meta\u002FStanford researchers accelerate Byte Latent Transformer (BLT) inference with BLT-D (diffusion decoding), BLT-S (self-speculation), and BLT-DV (diffusion+verification), reducing memory bandwidth 50-92% at 3B params while nearing baseline performance on translation\u002Fcoding tasks.",[],"xMZyx1diuvh2XXZUy_NPhOgWy_XqDJeXjel738dmvjs",{"id":3778,"title":3779,"ai":3780,"body":3785,"categories":3956,"created_at":85,"date_modified":85,"description":78,"extension":86,"faq":85,"featured":87,"kicker_label":85,"meta":3957,"navigation":105,"path":3974,"published_at":3975,"question":85,"scraped_at":3976,"seo":3977,"sitemap":3978,"source_id":3979,"source_name":112,"source_type":113,"source_url":3980,"stem":3981,"tags":3982,"thumbnail_url":85,"tldr":3983,"tweet":85,"unknown_tags":3984,"__hash__":3985},"summaries\u002Fsummaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary.md","Star Elastic: Pack 30B\u002F23B\u002F12B Models in One 
Checkpoint",{"provider":7,"model":8,"input_tokens":3781,"output_tokens":3782,"processing_time_ms":3783,"cost_usd":3784},9074,2939,32047,0.0032618,{"type":14,"value":3786,"toc":3950},[3787,3791,3794,3797,3801,3808,3811,3815,3818,3886,3890,3893,3896,3939,3946],[17,3788,3790],{"id":3789},"nested-weight-sharing-compresses-multiple-sizes-into-one-checkpoint","Nested Weight-Sharing Compresses Multiple Sizes into One Checkpoint",[22,3792,3793],{},"Train one 30B hybrid Mamba-Transformer-MoE parent model on 160B tokens to embed smaller 23B and 12B submodels as contiguous subsets of its highest-importance components. Rank embedding channels, attention heads, Mamba SSM heads, MoE experts, and FFN channels by contribution to accuracy using Router-Weighted Expert Activation Pruning (REAP), which weighs routing gates and output magnitudes over naive frequency pruning. A learnable end-to-end router takes a target budget (e.g., 2.8B active params) as one-hot input, outputs differentiable masks via Gumbel-Softmax, and trains jointly with knowledge distillation from the parent—penalizing budget deviations while maximizing accuracy. Use a two-stage curriculum: short-context (8K tokens, uniform budgets) then long-context (49K tokens, p(30B)=0.5, p(23B)=0.3, p(12B)=0.2), boosting AIME-2025 scores by up to 19.8% on smaller variants. 
Width compression (reducing dims\u002Fheads\u002Fexperts) recovers 98.1% baseline performance versus 95.2% for depth (layer dropping), so prioritize width for reasoning tasks.",[22,3795,3796],{},"This yields 360x fewer tokens than separate pretraining and 7x over sequential distillation, with all variants zero-shot slicable from one 58.9 GB BF16 checkpoint—versus 126.1 GB for independents.",[17,3798,3800],{"id":3799},"phase-specific-sizing-optimizes-reasoning-accuracy-latency","Phase-Specific Sizing Optimizes Reasoning Accuracy-Latency",[22,3802,3803,3804],{},"Ditch fixed-model token caps in ",[3805,3806,3807],"think",{}," phases: assign smaller nested models (e.g., 23B) to high-volume reasoning traces and larger (30B) to precise final answers in ℳS → ℳL configs. The 23B→30B setup beats Nemotron Nano v3 defaults by 16% accuracy at 1.9x lower latency, as reasoning tolerates capacity cuts but answers demand precision. Elastic-23B hits 85.63 on AIME-2025 (vs. Qwen3-30B-A3B's 80.00), matching or exceeding same-size independents on GPQA, LiveCodeBench v5, MMLU-Pro, IFBench, Tau Bench.",[22,3809,3810],{},"12B runs 2.4x throughput of 30B on H100 at BF16; NVFP4 12B hits 7,426 tokens\u002Fs (3.4x) on RTX Pro 6000.",[17,3812,3814],{"id":3813},"quantization-preserves-nesting-for-edge-deployment","Quantization Preserves Nesting for Edge Deployment",[22,3816,3817],{},"Apply Quantization-Aware Distillation (QAD) on the elastic checkpoint to maintain zero-shot slicing post-quant. FP8 PTQ recovers 98.69% BF16 accuracy on 30B; NVFP4 PTQ drops 4.12% but QAD (~5B tokens, 48K context) hits 97.79%. Single NVFP4 checkpoint: 18.7 GB (30B), enabling 12B\u002F8 GB on RTX 5080 (BF16 OOMs). 
Memory table:",[3819,3820,3821,3840],"table",{},[3822,3823,3824],"thead",{},[3825,3826,3827,3831,3834,3837],"tr",{},[3828,3829,3830],"th",{},"Variant",[3828,3832,3833],{},"30B",[3828,3835,3836],{},"23B",[3828,3838,3839],{},"12B",[3841,3842,3843,3858,3872],"tbody",{},[3825,3844,3845,3849,3852,3855],{},[3846,3847,3848],"td",{},"BF16",[3846,3850,3851],{},"58.9 GB",[3846,3853,3854],{},"44.0 GB",[3846,3856,3857],{},"23.2 GB",[3825,3859,3860,3863,3866,3869],{},[3846,3861,3862],{},"FP8",[3846,3864,3865],{},"31.4 GB",[3846,3867,3868],{},"23.7 GB",[3846,3870,3871],{},"13.0 GB",[3825,3873,3874,3877,3880,3883],{},[3846,3875,3876],{},"NVFP4",[3846,3878,3879],{},"18.7 GB",[3846,3881,3882],{},"14.1 GB",[3846,3884,3885],{},"8.0 GB",[17,3887,3889],{"id":3888},"load-and-serve-with-transformers-or-vllm","Load and Serve with Transformers or vLLM",[22,3891,3892],{},"Grab from HF: nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-{BF16|FP8|NVFP4}. Use trust_remote_code=True for hybrid arch.",[22,3894,3895],{},"Transformers example:",[3897,3898,3902],"pre",{"className":3899,"code":3900,"language":3901,"meta":78,"style":78},"language-python shiki shiki-themes github-light github-dark","from transformers import AutoTokenizer, AutoModelForCausalLM\nimport torch\nmodel_id = \"nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16\"\ntokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)\nmodel = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map=\"auto\")\n# Generate with max_new_tokens=4096 for \u003Cthink> + answer\n","python",[3903,3904,3905,3912,3917,3922,3927,3933],"code",{"__ignoreMap":78},[3720,3906,3909],{"class":3907,"line":3908},"line",1,[3720,3910,3911],{},"from transformers import AutoTokenizer, AutoModelForCausalLM\n",[3720,3913,3914],{"class":3907,"line":79},[3720,3915,3916],{},"import torch\n",[3720,3918,3919],{"class":3907,"line":101},[3720,3920,3921],{},"model_id = 
\"nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16\"\n",[3720,3923,3924],{"class":3907,"line":102},[3720,3925,3926],{},"tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)\n",[3720,3928,3930],{"class":3907,"line":3929},5,[3720,3931,3932],{},"model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map=\"auto\")\n",[3720,3934,3936],{"class":3907,"line":3935},6,[3720,3937,3938],{},"# Generate with max_new_tokens=4096 for \u003Cthink> + answer\n",[22,3940,3941,3942,3945],{},"vLLM for prod: ",[3903,3943,3944],{},"vllm serve \u003Cmodel_id>"," (OpenAI API compat), or Docker\u002FSGLang. Query via curl with max_tokens=4096, temperature=0.6.",[3947,3948,3949],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":78,"searchDepth":79,"depth":79,"links":3951},[3952,3953,3954,3955],{"id":3789,"depth":79,"text":3790},{"id":3799,"depth":79,"text":3800},{"id":3813,"depth":79,"text":3814},{"id":3888,"depth":79,"text":3889},[],{"content_references":3958,"triage":3971},[3959,3962,3965,3968],{"type":91,"title":3960,"url":3961,"context":99},"Star 
Elastic","https:\u002F\u002Fcas-bridge.xethub.hf.co\u002Fxet-bridge-us\u002F69cd91b34a304b3afe4ceaa4\u002Fcedbede2a32a1757cd46b5ce6edbe0934f2c8437f61509d8f63aae86f96b43cb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=cas%2F20260509%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260509T212853Z&X-Amz-Expires=3600&X-Amz-Signature=a776c3adc5cd45d923a82950ea17eefb271caf85b0586ff79855f575381030a7&X-Amz-SignedHeaders=host&X-Xet-Cas-Uid=689a286d51b587fe5035c19f&response-content-disposition=inline%3B+filename*%3DUTF-8%27%27star_elastic_arxiv.pdf%3B+filename%3D%22star_elastic_arxiv.pdf%22%3B&response-content-type=application%2Fpdf&x-amz-checksum-mode=ENABLED&x-id=GetObject&Expires=1778365733&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc3ODM2NTczM319LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2FzLWJyaWRnZS54ZXRodWIuaGYuY28veGV0LWJyaWRnZS11cy82OWNkOTFiMzRhMzA0YjNhZmU0Y2VhYTQvY2VkYmVkZTJhMzJhMTc1N2NkNDZiNWNlNmVkYmUwOTM0ZjJjODQzN2Y2MTUwOWQ4ZjYzYWFlODZmOTZiNDNjYioifV19&Signature=fpq%7EPKyILz2ZDcwgCMn%7EsYfSySqpZ5Fr-A3MXBBG94lfu6bTv6y63ejTUL16B8v03HIJyKwrdGgHoYAQr88iQ05qS%7EoIszdd0eU2dfem3CVxM-t3e8rIo4-i4OTBjP2oPAMjCqmwzcC6uPG3Xqm-3Tiq5IfrsDFSKSUPZavMI6nU%7EBBpxd-i-L3C4-4v80nzJWfkHZiKb0EHr3PN8CRlA6In1X2-tH3dXBm0GM0j83%7EBtcclb-4C18vdpfEuvEaKOf0tMxsf5zI0acMPdCJxnVatq%7EgZwixiF%7E53DxgPc94Pb93zl0TVTcLH4%7ExH8yi7Xj9YYjdMKB634Q1GeapoJA__&Key-Pair-Id=K2L8F4GPSG1IFC",{"type":96,"title":3963,"url":3964,"context":99},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16",{"type":96,"title":3966,"url":3967,"context":99},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-FP8","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-FP8",{"type":96,"title":3969,"url":3970,"context":99},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-NVFP4","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-
A3B-NVFP4",{"relevance":101,"novelty":101,"quality":102,"actionability":79,"composite":3972,"reasoning":3973},3.05,"Category: AI & LLMs. The article discusses a new model architecture from NVIDIA that could be relevant for developers looking to integrate advanced AI models into their products. However, while it provides technical details, it lacks practical steps or frameworks that the audience could directly apply in their work.","\u002Fsummaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary","2026-05-09 22:24:23","2026-05-10 15:26:52",{"title":3779,"description":78},{"loc":3974},"2d4fed29fea91900","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F09\u002Fnvidia-ai-releases-star-elastic-one-checkpoint-that-contains-30b-23b-and-12b-reasoning-models-with-zero-shot-slicing\u002F","summaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary",[117,120,118],"NVIDIA's Star Elastic embeds nested 30B (3.6B active), 23B (2.8B), and 12B (2.0B) reasoning models in a single checkpoint via importance-ranked weight-sharing, slashing training costs 360x and enabling phase-specific sizing for 16% accuracy gains at 1.9x lower latency.",[],"MmEv9MTKlBfvzKFMrwhf1uWOYr3g3Xhj2RLeYFKTfm8",{"id":3987,"title":3988,"ai":3989,"body":3994,"categories":4030,"created_at":85,"date_modified":85,"description":78,"extension":86,"faq":85,"featured":87,"kicker_label":85,"meta":4031,"navigation":105,"path":4046,"published_at":4047,"question":85,"scraped_at":4048,"seo":4049,"sitemap":4050,"source_id":4051,"source_name":4052,"source_type":113,"source_url":4053,"stem":4054,"tags":4055,"thumbnail_url":85,"tldr":4056,"tweet":85,"unknown_tags":4057,"__hash__":4058},"summaries\u002Fsummaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m--summary.md","Sovereign AI Grounds Robotics in Physics for 1.1M 
States\u002FSec",{"provider":7,"model":8,"input_tokens":3990,"output_tokens":3991,"processing_time_ms":3992,"cost_usd":3993},4417,1733,23053,0.0017274,{"type":14,"value":3995,"toc":4024},[3996,4000,4003,4007,4010,4014,4017,4021],[17,3997,3999],{"id":3998},"build-sub-millisecond-robotics-control-with-jax-tpu-v6","Build Sub-Millisecond Robotics Control with JAX + TPU v6",[22,4001,4002],{},"To overcome reinforcement learning's brittleness in real-world chaos, Sovereign AI leverages JAX 0.9.0+ on Google's TPU v6 Trillium for extreme speed: over 1.1 million states per second at 0.894 ms latency. This ensures a 22-DoF humanoid robot processes decisions faster than its actuators move, preventing delays that cause falls. Implement by running the full notebook on GitHub (frank-morales2020\u002FMLxDL), which integrates hardware acceleration for latent space computations without simulation pitfalls.",[17,4004,4006],{"id":4005},"anchor-predictions-to-physics-laws-via-jepa-for-47x-failure-sensitivity","Anchor Predictions to Physics Laws via JEPA for 4.7x Failure Sensitivity",[22,4008,4009],{},"Joint Embedding Predictive Architecture (JEPA) operates in a physics-informed latent space, using a Physics Anchor to monitor energy patterns. Detect anomalies by thresholding: energy loss of 8.5467 signals motor seizure (failure), while expansion of 4.8101 indicates intentional momentum for maneuvers like sideways slides. This delivers 4.7x greater sensitivity over traditional methods, grounding neural predictions in conservation laws so AI distinguishes planned actions from disasters in real time.",[17,4011,4013],{"id":4012},"gain-auditability-and-recovery-with-gemini-31-pro-oversight","Gain Auditability and Recovery with Gemini 3.1 Pro Oversight",[22,4015,4016],{},"Feed JEPA's abstract metrics into Gemini 3.1 Pro's Deep Thinking mode as the executive controller. It translates spikes into human-readable reports, diagnosing joint failures or sensor glitches, then outputs recovery plans. 
This Sovereign Return on Investment (SROI) enables full energy expenditure audits, making decisions transparent and recoverable rather than black-box guesses.",[17,4018,4020],{"id":4019},"slash-bandwidth-797-for-6g-scale-autonomy-with-semantic-compression","Slash Bandwidth 79.7% for 6G-Scale Autonomy with Semantic Compression",[22,4022,4023],{},"Compress data to transmit only semantic meaning, not raw sensors, yielding 79.7% bandwidth savings. For 6G networks, this sustains high-fidelity autonomy in bandwidth-constrained environments, ensuring reliable physical-world deployment without overwhelming infrastructure.",{"title":78,"searchDepth":79,"depth":79,"links":4025},[4026,4027,4028,4029],{"id":3998,"depth":79,"text":3999},{"id":4005,"depth":79,"text":4006},{"id":4012,"depth":79,"text":4013},{"id":4019,"depth":79,"text":4020},[133],{"content_references":4032,"triage":4043},[4033,4037,4039,4041],{"type":96,"title":4034,"url":4035,"context":4036},"MLxDL (GEMINI_TPU.ipynb)","https:\u002F\u002Fgithub.com\u002Ffrank-morales2020\u002FMLxDL\u002Fblob\u002Fmain\u002FGEMINI_TPU.ipynb","mentioned",{"type":96,"title":4038,"context":4036},"JAX 0.9.0+",{"type":96,"title":4040,"context":4036},"TPU v6 Trillium",{"type":96,"title":4042,"context":4036},"Gemini 3.1 Pro",{"relevance":3929,"novelty":102,"quality":102,"actionability":102,"composite":4044,"reasoning":4045},4.35,"Category: AI & LLMs. The article provides in-depth insights into using AI for robotics control, addressing practical applications like real-time decision-making and failure detection, which are crucial for product builders. 
It includes specific frameworks and tools like JAX and JEPA, making it actionable for developers looking to implement these techniques.","\u002Fsummaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m-summary","2026-05-08 15:34:13","2026-05-09 15:36:56",{"title":3988,"description":78},{"loc":4046},"9c05119c3bd0f686","AI Simplified in Plain English","https:\u002F\u002Fmedium.com\u002Fai-simplified-in-plain-english\u002Fsovereign-ai-bridging-the-gap-between-neural-logic-and-physical-reality-27847c54ddbc?source=rss----f37ab7d4e76b---4","summaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m--summary",[117,118,120],"Sovereign AI uses JEPA with physics anchors on JAX\u002FTPU v6 to process 1.1M states\u002Fsec at 0.894ms latency, detecting failures 4.7x better via energy patterns, with Gemini 3.1 Pro generating auditable reports and recovery plans.",[],"S_G2pfMpHvfDy7cXXXBE5nN5ar3Jtvpq5EuPS2bYuY8",{"id":4060,"title":4061,"ai":4062,"body":4067,"categories":4096,"created_at":85,"date_modified":85,"description":78,"extension":86,"faq":85,"featured":87,"kicker_label":85,"meta":4097,"navigation":105,"path":4109,"published_at":4110,"question":85,"scraped_at":4111,"seo":4112,"sitemap":4113,"source_id":4114,"source_name":112,"source_type":113,"source_url":4115,"stem":4116,"tags":4117,"thumbnail_url":85,"tldr":4118,"tweet":85,"unknown_tags":4119,"__hash__":4120},"summaries\u002Fsummaries\u002F4e271633d433ef16-gemma-4-mtp-drafters-3x-faster-inference-no-qualit-summary.md","Gemma 4 MTP Drafters: 3x Faster Inference, No Quality Loss",{"provider":7,"model":8,"input_tokens":4063,"output_tokens":4064,"processing_time_ms":4065,"cost_usd":4066},7596,1980,21477,0.00248655,{"type":14,"value":4068,"toc":4092},[4069,4073,4076,4079,4083,4086,4089],[17,4070,4072],{"id":4071},"speculative-decoding-overcomes-autoregressive-latency","Speculative Decoding Overcomes Autoregressive Latency",[22,4074,4075],{},"Standard LLM inference generates one 
token at a time autoregressively, creating a memory-bandwidth bottleneck: billions of parameters load from VRAM per token, leaving GPUs underutilized as data transfer dominates. Even predictable tokens (e.g., 'words' after 'Actions speak louder than...') require a full forward pass, costing as much as a complex reasoning step.",[22,4077,4078],{},"Speculative decoding fixes this by pairing a small, fast drafter model with the large target (Gemma 4). The drafter proposes a sequence of tokens quickly—faster than the target processes one. The target verifies the entire draft in one parallel forward pass. Matches accept the full sequence plus one extra target-generated token, all in the time of a single standard pass. Verification ensures identical outputs to vanilla autoregressive generation, delivering lossless speedup. Gemma 4 drafters, now past 60M downloads, hit up to 3x overall inference speed.",[17,4080,4082],{"id":4081},"mtp-architecture-shares-resources-for-edge-and-scale","MTP Architecture Shares Resources for Edge and Scale",[22,4084,4085],{},"Gemma 4's Multi-Token Prediction (MTP) drafters enhance speculative decoding by sharing the target's KV cache—storing prior attention computations—avoiding redundant context recompute. This cuts drafter overhead sharply.",[22,4087,4088],{},"For edge variants (E2B, E4B) on mobile, embedder-layer clustering accelerates logit computation (internal reps to vocab probabilities), targeting hardware-limited final steps. On Gemma 4 26B MoE, Apple Silicon sees ~2.2x speedup at batch size 4-8 (vs. 
batch 1 routing issues); NVIDIA A100 shows batch-dependent gains too.",[22,4090,4091],{},"Implement via the Hugging Face Gemma 4 collections; this speeds up production apps without quality or accuracy trade-offs.",{"title":78,"searchDepth":79,"depth":79,"links":4093},[4094,4095],{"id":4071,"depth":79,"text":4072},{"id":4081,"depth":79,"text":4082},[133],{"content_references":4098,"triage":4106},[4099,4102],{"type":96,"title":4100,"url":4101,"context":4036},"Gemma 4 Model Weights","https:\u002F\u002Fhuggingface.co\u002Fcollections\u002Fgoogle\u002Fgemma-4",{"type":4103,"title":4104,"url":4105,"context":99},"other","Multi-Token Prediction for Gemma 4","https:\u002F\u002Fblog.google\u002Finnovation-and-ai\u002Ftechnology\u002Fdevelopers-tools\u002Fmulti-token-prediction-gemma-4\u002F?linkId=61725841",{"relevance":102,"novelty":101,"quality":102,"actionability":102,"composite":4107,"reasoning":4108},3.8,"Category: AI & LLMs. The article discusses the new Multi-Token Prediction (MTP) drafters for Gemma 4, which addresses a specific pain point of inference speed in AI models, making it relevant for developers looking to implement faster AI features. 
It provides actionable insights on how to implement this technology via Hugging Face, which adds to its practical value.","\u002Fsummaries\u002F4e271633d433ef16-gemma-4-mtp-drafters-3x-faster-inference-no-qualit-summary","2026-05-06 08:23:04","2026-05-06 16:14:12",{"title":4061,"description":78},{"loc":4109},"4e271633d433ef16","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F06\u002Fgoogle-ai-releases-multi-token-prediction-mtp-drafters-for-gemma-4-delivering-up-to-3x-faster-inference-without-quality-loss\u002F","summaries\u002F4e271633d433ef16-gemma-4-mtp-drafters-3x-faster-inference-no-qualit-summary",[117,118,120],"Pair Gemma 4 with lightweight MTP drafters using speculative decoding to generate up to 3x more tokens per pass by drafting sequences and verifying in parallel, sharing KV cache for efficiency without altering outputs.",[],"9zKQbGealE55IZRdNqkOAfuDfHteluCgF1trH9nWXc4"]