[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-c6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary":3,"summaries-facets-categories":95,"summary-related-c6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary":3664},{"id":4,"title":5,"ai":6,"body":13,"categories":52,"created_at":54,"date_modified":54,"description":46,"extension":55,"faq":54,"featured":56,"kicker_label":54,"meta":57,"navigation":76,"path":77,"published_at":78,"question":54,"scraped_at":79,"seo":80,"sitemap":81,"source_id":82,"source_name":83,"source_type":84,"source_url":85,"stem":86,"tags":87,"thumbnail_url":54,"tldr":92,"tweet":54,"unknown_tags":93,"__hash__":94},"summaries\u002Fsummaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary.md","Parcae Stabilizes Loops to Match 2x Transformer Quality",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8134,2375,17913,0.00230745,{"type":14,"value":15,"toc":45},"minimark",[16,21,25,28,32,35,38,42],[17,18,20],"h2",{"id":19},"designing-stable-looped-architectures","Designing Stable Looped Architectures",[22,23,24],"p",{},"Looped transformers route activations through a fixed block of layers T times, boosting compute without adding parameters—ideal for memory-constrained edge deployment. Parcae uses a middle-looped structure: prelude (P) embeds input to latent e; recurrent block (R) updates hidden state h_t for T loops with e injected each iteration; coda (C) outputs from final h_T. Prior looped models like RDMs fail due to residual state explosion and loss spikes from unconstrained dynamics.",[22,26,27],{},"Model the loop as a nonlinear dynamical system: h_{t+1} = Ā h_t + B̄ e + R̄(h_t, e). Stability requires spectral norm ρ(Ā) \u003C 1. Parcae discretizes a continuous system using zero-order hold and Euler integration with learned step Δ: Ā = exp(Δ A), B̄ = Δ B. 
Constrain A as diagonal with negative entries A = Diag(-exp(log A)), ensuring ρ(Ā) \u003C 1 by design—no hyperparameter tuning needed for convergence. This fixes addition-based (ρ(Ā)=1, marginal) and concatenation-projection (ρ(Ā)>1, unstable) flaws in priors.",[17,29,31],{"id":30},"beating-baselines-with-parameter-efficiency","Beating Baselines with Parameter Efficiency",[22,33,34],{},"On Huginn, 350M Parcae drops validation perplexity 6.3% vs RDMs (10.76 to 10.09 PPL), 9.1% on WikiText, +1.8 downstream accuracy points. At 100M, 4.5% PPL gain (14.23 to 13.59). On FineWeb-Edu (104B tokens, nanochat setup), 1.3B Parcae scores 2.99 points higher on Core, 1.18 on Core-Extended than parameter-matched Transformers. Critically, 770M Parcae hits 25.07 Core—matching 1.3B Transformer's 25.45—delivering up to 87.5% of twice-sized Transformer's quality.",[22,36,37],{},"Looping adds an orthogonal scaling axis: isoFLOP tests at 140M\u002F370M show looped Parcae (optimal mean recurrence μ_rec) beats fixed-depth (μ_rec=1) by 1.2-2.0 Core points under same params\u002FFLOPs.",[17,39,41],{"id":40},"first-scaling-laws-for-recurrence-depth","First Scaling Laws for Recurrence Depth",[22,43,44],{},"Optimal μ_rec scales as C^{0.40}, training tokens as C^{0.78} (C= FLOP budget), holding across scales. Test-time loop count T beyond training saturates via L(T) = L_∞ + Z e^{-z T}, plateauing near training μ_rec—setting a ceiling on extrapolation. 
This parametric law predicts held-out loss with 0.85-1.31% error, enabling reliable planning: train deeper loops for compute-optimal quality without memory bloat.",{"title":46,"searchDepth":47,"depth":47,"links":48},"",2,[49,50,51],{"id":19,"depth":47,"text":20},{"id":30,"depth":47,"text":31},{"id":40,"depth":47,"text":41},[53],"AI & LLMs",null,"md",false,{"content_references":58,"triage":71},[59,64,68],{"type":60,"title":61,"url":62,"context":63},"paper","Parcae","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2604.12946","recommended",{"type":65,"title":66,"url":67,"context":63},"other","Parcae Model Weights","https:\u002F\u002Fhuggingface.co\u002Fcollections\u002FSandyResearch\u002Fparcae",{"type":65,"title":69,"url":70,"context":63},"Parcae Technical Details","https:\u002F\u002Fwww.together.ai\u002Fblog\u002Fparcae",{"relevance":72,"novelty":72,"quality":73,"actionability":47,"composite":74,"reasoning":75},3,4,3.05,"Category: AI & LLMs. The article discusses a new architecture for looped transformers, which is relevant to AI engineering, but it lacks practical applications or frameworks that the audience can directly implement. 
While it presents some new insights into model efficiency, it does not provide actionable steps for product builders.",true,"\u002Fsummaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary","2026-04-16 08:30:30","2026-04-19 01:22:43",{"title":5,"description":46},{"loc":77},"c6f1bc88e627db47","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F16\u002Fucsd-and-together-ai-research-introduces-parcae-a-stable-architecture-for-looped-language-models-that-achieves-the-quality-of-a-transformer-twice-the-size\u002F","summaries\u002Fc6f1bc88e627db47-parcae-stabilizes-loops-to-match-2x-transformer-qu-summary",[88,89,90,91],"llm","machine-learning","deep-learning","research","Parcae enforces looped transformer stability via negative diagonal matrices in a dynamical system, outperforming baselines and achieving 87.5% of a twice-sized Transformer's quality at half parameters.",[],"w5bUNLMbNnMepMdiskfNW1esyE0__I9nWhPLCEPtMq8",[96,99,102,104,107,110,112,114,116,118,120,122,125,127,129,131,133,135,137,139,141,143,146,149,151,153,156,158,160,163,165,167,169,171,173,175,177,179,181,183,185,187,189,191,193,195,197,199,201,203,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,418,420,422,424,426,428,430,432,434,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,6
08,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,14
86,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,22
86,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,30
86,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662],{"categories":97},[98],"Developer Productivity",{"categories":100},[101],"Business & SaaS",{"categories":103},[53],{"categories":105},[106],"AI Automation",{"categories":108},[109],"Product Strategy",{"categories":111},[53],{"categories":113},[98],{"categories":115},[101],{"categories":117},[],{"categories":119},[53],{"categories":121},[],{"categories":123},[124],"AI News & 
Trends",{"categories":126},[106],{"categories":128},[124],{"categories":130},[106],{"categories":132},[106],{"categories":134},[53],{"categories":136},[53],{"categories":138},[124],{"categories":140},[53],{"categories":142},[],{"categories":144},[145],"Design & Frontend",{"categories":147},[148],"Data Science & Visualization",{"categories":150},[124],{"categories":152},[],{"categories":154},[155],"Software Engineering",{"categories":157},[53],{"categories":159},[106],{"categories":161},[162],"Marketing & Growth",{"categories":164},[53],{"categories":166},[106],{"categories":168},[],{"categories":170},[],{"categories":172},[145],{"categories":174},[106],{"categories":176},[98],{"categories":178},[145],{"categories":180},[53],{"categories":182},[106],{"categories":184},[124],{"categories":186},[],{"categories":188},[],{"categories":190},[106],{"categories":192},[155],{"categories":194},[],{"categories":196},[101],{"categories":198},[],{"categories":200},[],{"categories":202},[106],{"categories":204},[106],{"categories":206},[53],{"categories":208},[],{"categories":210},[155],{"categories":212},[],{"categories":214},[],{"categories":216},[],{"categories":218},[53],{"categories":220},[162],{"categories":222},[145],{"categories":224},[145],{"categories":226},[53],{"categories":228},[106],{"categories":230},[53],{"categories":232},[53],{"categories":234},[106],{"categories":236},[106],{"categories":238},[148],{"categories":240},[124],{"categories":242},[106],{"categories":244},[162],{"categories":246},[106],{"categories":248},[109],{"categories":250},[],{"categories":252},[106],{"categories":254},[],{"categories":256},[106],{"categories":258},[155],{"categories":260},[145],{"categories":262},[53],{"categories":264},[],{"categories":266},[],{"categories":268},[106],{"categories":270},[],{"categories":272},[53],{"categories":274},[],{"categories":276},[98],{"categories":278},[155],{"categories":280},[101],{"categories":282},[124],{"categories":284},[53],{"categories":286},[
],{"categories":288},[53],{"categories":290},[],{"categories":292},[155],{"categories":294},[148],{"categories":296},[],{"categories":298},[53],{"categories":300},[145],{"categories":302},[],{"categories":304},[145],{"categories":306},[106],{"categories":308},[],{"categories":310},[106],{"categories":312},[124],{"categories":314},[53],{"categories":316},[],{"categories":318},[106],{"categories":320},[53],{"categories":322},[109],{"categories":324},[],{"categories":326},[53],{"categories":328},[106],{"categories":330},[106],{"categories":332},[],{"categories":334},[148],{"categories":336},[53],{"categories":338},[],{"categories":340},[98],{"categories":342},[101],{"categories":344},[53],{"categories":346},[106],{"categories":348},[155],{"categories":350},[53],{"categories":352},[],{"categories":354},[],{"categories":356},[53],{"categories":358},[],{"categories":360},[145],{"categories":362},[],{"categories":364},[53],{"categories":366},[],{"categories":368},[106],{"categories":370},[53],{"categories":372},[145],{"categories":374},[],{"categories":376},[53],{"categories":378},[53],{"categories":380},[101],{"categories":382},[106],{"categories":384},[53],{"categories":386},[145],{"categories":388},[106],{"categories":390},[],{"categories":392},[],{"categories":394},[124],{"categories":396},[],{"categories":398},[53],{"categories":400},[101,162],{"categories":402},[],{"categories":404},[53],{"categories":406},[],{"categories":408},[],{"categories":410},[53],{"categories":412},[],{"categories":414},[53],{"categories":416},[417],"DevOps & 
Cloud",{"categories":419},[],{"categories":421},[124],{"categories":423},[145],{"categories":425},[],{"categories":427},[124],{"categories":429},[124],{"categories":431},[53],{"categories":433},[162],{"categories":435},[],{"categories":437},[101],{"categories":439},[],{"categories":441},[53,417],{"categories":443},[53],{"categories":445},[53],{"categories":447},[106],{"categories":449},[53,155],{"categories":451},[148],{"categories":453},[53],{"categories":455},[162],{"categories":457},[106],{"categories":459},[106],{"categories":461},[],{"categories":463},[106],{"categories":465},[53,101],{"categories":467},[],{"categories":469},[145],{"categories":471},[145],{"categories":473},[],{"categories":475},[],{"categories":477},[124],{"categories":479},[],{"categories":481},[98],{"categories":483},[155],{"categories":485},[53],{"categories":487},[145],{"categories":489},[106],{"categories":491},[155],{"categories":493},[124],{"categories":495},[145],{"categories":497},[],{"categories":499},[53],{"categories":501},[53],{"categories":503},[53],{"categories":505},[124],{"categories":507},[98],{"categories":509},[53],{"categories":511},[106],{"categories":513},[417],{"categories":515},[145],{"categories":517},[106],{"categories":519},[],{"categories":521},[],{"categories":523},[145],{"categories":525},[124],{"categories":527},[148],{"categories":529},[],{"categories":531},[53],{"categories":533},[53],{"categories":535},[101],{"categories":537},[53],{"categories":539},[53],{"categories":541},[124],{"categories":543},[],{"categories":545},[106],{"categories":547},[155],{"categories":549},[],{"categories":551},[53],{"categories":553},[53],{"categories":555},[106],{"categories":557},[],{"categories":559},[],{"categories":561},[53],{"categories":563},[],{"categories":565},[101],{"categories":567},[106],{"categories":569},[],{"categories":571},[98],{"categories":573},[53],{"categories":575},[101],{"categories":577},[124],{"categories":579},[],{"categories":581},[],{"categories":583
},[],{"categories":585},[124],{"categories":587},[124],{"categories":589},[],{"categories":591},[],{"categories":593},[101],{"categories":595},[],{"categories":597},[],{"categories":599},[98],{"categories":601},[],{"categories":603},[162],{"categories":605},[106],{"categories":607},[101],{"categories":609},[106],{"categories":611},[],{"categories":613},[109],{"categories":615},[145],{"categories":617},[155],{"categories":619},[53],{"categories":621},[106],{"categories":623},[101],{"categories":625},[53],{"categories":627},[],{"categories":629},[],{"categories":631},[155],{"categories":633},[148],{"categories":635},[109],{"categories":637},[106],{"categories":639},[53],{"categories":641},[],{"categories":643},[417],{"categories":645},[],{"categories":647},[106],{"categories":649},[],{"categories":651},[],{"categories":653},[53],{"categories":655},[145],{"categories":657},[162],{"categories":659},[106],{"categories":661},[],{"categories":663},[98],{"categories":665},[],{"categories":667},[124],{"categories":669},[53,417],{"categories":671},[124],{"categories":673},[53],{"categories":675},[101],{"categories":677},[53],{"categories":679},[],{"categories":681},[101],{"categories":683},[],{"categories":685},[155],{"categories":687},[145],{"categories":689},[124],{"categories":691},[148],{"categories":693},[98],{"categories":695},[53],{"categories":697},[155],{"categories":699},[],{"categories":701},[],{"categories":703},[109],{"categories":705},[],{"categories":707},[53],{"categories":709},[],{"categories":711},[145],{"categories":713},[145],{"categories":715},[145],{"categories":717},[],{"categories":719},[],{"categories":721},[124],{"categories":723},[106],{"categories":725},[53],{"categories":727},[53],{"categories":729},[53],{"categories":731},[101],{"categories":733},[53],{"categories":735},[],{"categories":737},[155],{"categories":739},[155],{"categories":741},[101],{"categories":743},[],{"categories":745},[53],{"categories":747},[53],{"categories":749},[101],{"cate
gories":751},[124],{"categories":753},[162],{"categories":755},[106],{"categories":757},[],{"categories":759},[145],{"categories":761},[],{"categories":763},[53],{"categories":765},[],{"categories":767},[101],{"categories":769},[106],{"categories":771},[],{"categories":773},[417],{"categories":775},[148],{"categories":777},[155],{"categories":779},[162],{"categories":781},[155],{"categories":783},[106],{"categories":785},[],{"categories":787},[],{"categories":789},[106],{"categories":791},[98],{"categories":793},[106],{"categories":795},[109],{"categories":797},[101],{"categories":799},[],{"categories":801},[53],{"categories":803},[109],{"categories":805},[53],{"categories":807},[53],{"categories":809},[162],{"categories":811},[145],{"categories":813},[106],{"categories":815},[],{"categories":817},[],{"categories":819},[417],{"categories":821},[155],{"categories":823},[],{"categories":825},[106],{"categories":827},[53],{"categories":829},[145,53],{"categories":831},[98],{"categories":833},[],{"categories":835},[53],{"categories":837},[98],{"categories":839},[145],{"categories":841},[106],{"categories":843},[155],{"categories":845},[],{"categories":847},[53],{"categories":849},[],{"categories":851},[98],{"categories":853},[],{"categories":855},[106],{"categories":857},[109],{"categories":859},[53],{"categories":861},[53],{"categories":863},[145],{"categories":865},[106],{"categories":867},[417],{"categories":869},[145],{"categories":871},[106],{"categories":873},[53],{"categories":875},[53],{"categories":877},[53],{"categories":879},[124],{"categories":881},[],{"categories":883},[109],{"categories":885},[106],{"categories":887},[145],{"categories":889},[106],{"categories":891},[155],{"categories":893},[145],{"categories":895},[106],{"categories":897},[124],{"categories":899},[],{"categories":901},[53],{"categories":903},[145],{"categories":905},[53],{"categories":907},[98],{"categories":909},[124],{"categories":911},[53],{"categories":913},[162],{"categories":915},[5
3],{"categories":917},[53],{"categories":919},[106],{"categories":921},[106],{"categories":923},[53],{"categories":925},[106],{"categories":927},[145],{"categories":929},[53],{"categories":931},[],{"categories":933},[],{"categories":935},[155],{"categories":937},[],{"categories":939},[98],{"categories":941},[417],{"categories":943},[],{"categories":945},[98],{"categories":947},[101],{"categories":949},[162],{"categories":951},[],{"categories":953},[101],{"categories":955},[],{"categories":957},[],{"categories":959},[],{"categories":961},[],{"categories":963},[],{"categories":965},[53],{"categories":967},[106],{"categories":969},[417],{"categories":971},[98],{"categories":973},[53],{"categories":975},[155],{"categories":977},[109],{"categories":979},[53],{"categories":981},[162],{"categories":983},[53],{"categories":985},[53],{"categories":987},[53],{"categories":989},[53,98],{"categories":991},[155],{"categories":993},[155],{"categories":995},[145],{"categories":997},[53],{"categories":999},[],{"categories":1001},[],{"categories":1003},[],{"categories":1005},[155],{"categories":1007},[148],{"categories":1009},[124],{"categories":1011},[145],{"categories":1013},[],{"categories":1015},[53],{"categories":1017},[53],{"categories":1019},[],{"categories":1021},[],{"categories":1023},[106],{"categories":1025},[53],{"categories":1027},[101],{"categories":1029},[],{"categories":1031},[98],{"categories":1033},[53],{"categories":1035},[98],{"categories":1037},[53],{"categories":1039},[155],{"categories":1041},[162],{"categories":1043},[53,145],{"categories":1045},[124],{"categories":1047},[145],{"categories":1049},[],{"categories":1051},[417],{"categories":1053},[145],{"categories":1055},[106],{"categories":1057},[],{"categories":1059},[],{"categories":1061},[],{"categories":1063},[],{"categories":1065},[155],{"categories":1067},[106],{"categories":1069},[106],{"categories":1071},[53],{"categories":1073},[53],{"categories":1075},[],{"categories":1077},[145],{"categories":1079}
,[],{"categories":1081},[],{"categories":1083},[106],{"categories":1085},[],{"categories":1087},[],{"categories":1089},[162],{"categories":1091},[162],{"categories":1093},[106],{"categories":1095},[],{"categories":1097},[53],{"categories":1099},[53],{"categories":1101},[155],{"categories":1103},[145],{"categories":1105},[145],{"categories":1107},[106],{"categories":1109},[98],{"categories":1111},[53],{"categories":1113},[145],{"categories":1115},[145],{"categories":1117},[106],{"categories":1119},[106],{"categories":1121},[53],{"categories":1123},[],{"categories":1125},[],{"categories":1127},[53],{"categories":1129},[106],{"categories":1131},[124],{"categories":1133},[155],{"categories":1135},[98],{"categories":1137},[53],{"categories":1139},[],{"categories":1141},[106],{"categories":1143},[106],{"categories":1145},[],{"categories":1147},[98],{"categories":1149},[53],{"categories":1151},[98],{"categories":1153},[98],{"categories":1155},[],{"categories":1157},[],{"categories":1159},[106],{"categories":1161},[106],{"categories":1163},[53],{"categories":1165},[53],{"categories":1167},[124],{"categories":1169},[148],{"categories":1171},[109],{"categories":1173},[124],{"categories":1175},[145],{"categories":1177},[],{"categories":1179},[124],{"categories":1181},[],{"categories":1183},[],{"categories":1185},[],{"categories":1187},[],{"categories":1189},[155],{"categories":1191},[148],{"categories":1193},[],{"categories":1195},[53],{"categories":1197},[53],{"categories":1199},[148],{"categories":1201},[155],{"categories":1203},[],{"categories":1205},[],{"categories":1207},[106],{"categories":1209},[124],{"categories":1211},[124],{"categories":1213},[106],{"categories":1215},[98],{"categories":1217},[53,417],{"categories":1219},[],{"categories":1221},[145],{"categories":1223},[98],{"categories":1225},[106],{"categories":1227},[145],{"categories":1229},[],{"categories":1231},[106],{"categories":1233},[106],{"categories":1235},[53],{"categories":1237},[162],{"categories":1239
},[155],{"categories":1241},[145],{"categories":1243},[],{"categories":1245},[106],{"categories":1247},[53],{"categories":1249},[106],{"categories":1251},[106],{"categories":1253},[106],{"categories":1255},[162],{"categories":1257},[106],{"categories":1259},[53],{"categories":1261},[],{"categories":1263},[162],{"categories":1265},[124],{"categories":1267},[106],{"categories":1269},[],{"categories":1271},[],{"categories":1273},[53],{"categories":1275},[106],{"categories":1277},[124],{"categories":1279},[106],{"categories":1281},[],{"categories":1283},[],{"categories":1285},[],{"categories":1287},[106],{"categories":1289},[],{"categories":1291},[],{"categories":1293},[148],{"categories":1295},[53],{"categories":1297},[148],{"categories":1299},[124],{"categories":1301},[53],{"categories":1303},[53],{"categories":1305},[106],{"categories":1307},[53],{"categories":1309},[],{"categories":1311},[],{"categories":1313},[417],{"categories":1315},[],{"categories":1317},[],{"categories":1319},[98],{"categories":1321},[],{"categories":1323},[],{"categories":1325},[],{"categories":1327},[],{"categories":1329},[155],{"categories":1331},[124],{"categories":1333},[162],{"categories":1335},[101],{"categories":1337},[53],{"categories":1339},[53],{"categories":1341},[101],{"categories":1343},[],{"categories":1345},[145],{"categories":1347},[106],{"categories":1349},[101],{"categories":1351},[53],{"categories":1353},[53],{"categories":1355},[98],{"categories":1357},[],{"categories":1359},[98],{"categories":1361},[53],{"categories":1363},[162],{"categories":1365},[106],{"categories":1367},[124],{"categories":1369},[101],{"categories":1371},[53],{"categories":1373},[106],{"categories":1375},[],{"categories":1377},[53],{"categories":1379},[98],{"categories":1381},[53],{"categories":1383},[],{"categories":1385},[124],{"categories":1387},[53],{"categories":1389},[],{"categories":1391},[101],{"categories":1393},[53],{"categories":1395},[],{"categories":1397},[],{"categories":1399},[],{"catego
ries":1401},[53],{"categories":1403},[],{"categories":1405},[417],{"categories":1407},[53],{"categories":1409},[],{"categories":1411},[53],{"categories":1413},[53],{"categories":1415},[53],{"categories":1417},[53,417],{"categories":1419},[53],{"categories":1421},[53],{"categories":1423},[145],{"categories":1425},[106],{"categories":1427},[],{"categories":1429},[106],{"categories":1431},[53],{"categories":1433},[53],{"categories":1435},[53],{"categories":1437},[98],{"categories":1439},[98],{"categories":1441},[155],{"categories":1443},[145],{"categories":1445},[106],{"categories":1447},[],{"categories":1449},[53],{"categories":1451},[124],{"categories":1453},[53],{"categories":1455},[101],{"categories":1457},[],{"categories":1459},[417],{"categories":1461},[145],{"categories":1463},[145],{"categories":1465},[106],{"categories":1467},[124],{"categories":1469},[106],{"categories":1471},[53],{"categories":1473},[],{"categories":1475},[53],{"categories":1477},[],{"categories":1479},[],{"categories":1481},[53],{"categories":1483},[53],{"categories":1485},[53],{"categories":1487},[106],{"categories":1489},[53],{"categories":1491},[],{"categories":1493},[148],{"categories":1495},[106],{"categories":1497},[],{"categories":1499},[53],{"categories":1501},[124],{"categories":1503},[],{"categories":1505},[145],{"categories":1507},[417],{"categories":1509},[124],{"categories":1511},[155],{"categories":1513},[155],{"categories":1515},[124],{"categories":1517},[124],{"categories":1519},[417],{"categories":1521},[],{"categories":1523},[124],{"categories":1525},[53],{"categories":1527},[98],{"categories":1529},[124],{"categories":1531},[],{"categories":1533},[148],{"categories":1535},[124],{"categories":1537},[155],{"categories":1539},[124],{"categories":1541},[417],{"categories":1543},[53],{"categories":1545},[53],{"categories":1547},[],{"categories":1549},[101],{"categories":1551},[],{"categories":1553},[],{"categories":1555},[53],{"categories":1557},[53],{"categories":1559},[53],{
"categories":1561},[53],{"categories":1563},[],{"categories":1565},[148],{"categories":1567},[98],{"categories":1569},[],{"categories":1571},[53],{"categories":1573},[53],{"categories":1575},[417],{"categories":1577},[417],{"categories":1579},[],{"categories":1581},[106],{"categories":1583},[124],{"categories":1585},[124],{"categories":1587},[53],{"categories":1589},[106],{"categories":1591},[],{"categories":1593},[145],{"categories":1595},[53],{"categories":1597},[53],{"categories":1599},[],{"categories":1601},[],{"categories":1603},[417],{"categories":1605},[53],{"categories":1607},[155],{"categories":1609},[101],{"categories":1611},[53],{"categories":1613},[],{"categories":1615},[106],{"categories":1617},[98],{"categories":1619},[98],{"categories":1621},[],{"categories":1623},[53],{"categories":1625},[145],{"categories":1627},[106],{"categories":1629},[],{"categories":1631},[53],{"categories":1633},[53],{"categories":1635},[106],{"categories":1637},[],{"categories":1639},[106],{"categories":1641},[155],{"categories":1643},[],{"categories":1645},[53],{"categories":1647},[],{"categories":1649},[53],{"categories":1651},[],{"categories":1653},[53],{"categories":1655},[53],{"categories":1657},[],{"categories":1659},[53],{"categories":1661},[124],{"categories":1663},[53],{"categories":1665},[53],{"categories":1667},[98],{"categories":1669},[53],{"categories":1671},[124],{"categories":1673},[106],{"categories":1675},[],{"categories":1677},[53],{"categories":1679},[162],{"categories":1681},[],{"categories":1683},[],{"categories":1685},[],{"categories":1687},[98],{"categories":1689},[124],{"categories":1691},[106],{"categories":1693},[53],{"categories":1695},[145],{"categories":1697},[106],{"categories":1699},[],{"categories":1701},[106],{"categories":1703},[],{"categories":1705},[53],{"categories":1707},[106],{"categories":1709},[53],{"categories":1711},[],{"categories":1713},[53],{"categories":1715},[53],{"categories":1717},[124],{"categories":1719},[145],{"categories":
1721},[106],{"categories":1723},[145],{"categories":1725},[101],{"categories":1727},[],{"categories":1729},[],{"categories":1731},[53],{"categories":1733},[98],{"categories":1735},[124],{"categories":1737},[],{"categories":1739},[],{"categories":1741},[155],{"categories":1743},[145],{"categories":1745},[],{"categories":1747},[53],{"categories":1749},[],{"categories":1751},[162],{"categories":1753},[53],{"categories":1755},[417],{"categories":1757},[155],{"categories":1759},[],{"categories":1761},[106],{"categories":1763},[53],{"categories":1765},[106],{"categories":1767},[106],{"categories":1769},[53],{"categories":1771},[],{"categories":1773},[98],{"categories":1775},[53],{"categories":1777},[101],{"categories":1779},[155],{"categories":1781},[145],{"categories":1783},[],{"categories":1785},[],{"categories":1787},[],{"categories":1789},[106],{"categories":1791},[145],{"categories":1793},[124],{"categories":1795},[53],{"categories":1797},[124],{"categories":1799},[145],{"categories":1801},[],{"categories":1803},[145],{"categories":1805},[124],{"categories":1807},[101],{"categories":1809},[53],{"categories":1811},[124],{"categories":1813},[162],{"categories":1815},[],{"categories":1817},[],{"categories":1819},[148],{"categories":1821},[53,155],{"categories":1823},[124],{"categories":1825},[53],{"categories":1827},[106],{"categories":1829},[106],{"categories":1831},[53],{"categories":1833},[],{"categories":1835},[155],{"categories":1837},[53],{"categories":1839},[148],{"categories":1841},[106],{"categories":1843},[162],{"categories":1845},[417],{"categories":1847},[],{"categories":1849},[98],{"categories":1851},[106],{"categories":1853},[106],{"categories":1855},[155],{"categories":1857},[53],{"categories":1859},[53],{"categories":1861},[],{"categories":1863},[],{"categories":1865},[],{"categories":1867},[417],{"categories":1869},[124],{"categories":1871},[53],{"categories":1873},[53],{"categories":1875},[53],{"categories":1877},[],{"categories":1879},[148],{"categori
es":1881},[101],{"categories":1883},[],{"categories":1885},[106],{"categories":1887},[417],{"categories":1889},[],{"categories":1891},[145],{"categories":1893},[145],{"categories":1895},[],{"categories":1897},[155],{"categories":1899},[145],{"categories":1901},[53],{"categories":1903},[],{"categories":1905},[124],{"categories":1907},[53],{"categories":1909},[145],{"categories":1911},[106],{"categories":1913},[124],{"categories":1915},[],{"categories":1917},[106],{"categories":1919},[145],{"categories":1921},[53],{"categories":1923},[],{"categories":1925},[53],{"categories":1927},[53],{"categories":1929},[417],{"categories":1931},[124],{"categories":1933},[148],{"categories":1935},[148],{"categories":1937},[],{"categories":1939},[],{"categories":1941},[],{"categories":1943},[106],{"categories":1945},[155],{"categories":1947},[155],{"categories":1949},[],{"categories":1951},[],{"categories":1953},[53],{"categories":1955},[],{"categories":1957},[106],{"categories":1959},[53],{"categories":1961},[],{"categories":1963},[53],{"categories":1965},[101],{"categories":1967},[53],{"categories":1969},[162],{"categories":1971},[106],{"categories":1973},[53],{"categories":1975},[155],{"categories":1977},[124],{"categories":1979},[106],{"categories":1981},[],{"categories":1983},[124],{"categories":1985},[106],{"categories":1987},[106],{"categories":1989},[],{"categories":1991},[101],{"categories":1993},[106],{"categories":1995},[],{"categories":1997},[53],{"categories":1999},[98],{"categories":2001},[124],{"categories":2003},[417],{"categories":2005},[106],{"categories":2007},[106],{"categories":2009},[98],{"categories":2011},[53],{"categories":2013},[],{"categories":2015},[],{"categories":2017},[145],{"categories":2019},[53,101],{"categories":2021},[],{"categories":2023},[98],{"categories":2025},[148],{"categories":2027},[53],{"categories":2029},[155],{"categories":2031},[53],{"categories":2033},[106],{"categories":2035},[53],{"categories":2037},[53],{"categories":2039},[124],{"c
ategories":2041},[106],{"categories":2043},[],{"categories":2045},[],{"categories":2047},[106],{"categories":2049},[53],{"categories":2051},[417],{"categories":2053},[],{"categories":2055},[53],{"categories":2057},[106],{"categories":2059},[],{"categories":2061},[53],{"categories":2063},[162],{"categories":2065},[148],{"categories":2067},[106],{"categories":2069},[53],{"categories":2071},[417],{"categories":2073},[],{"categories":2075},[53],{"categories":2077},[162],{"categories":2079},[145],{"categories":2081},[53],{"categories":2083},[],{"categories":2085},[162],{"categories":2087},[124],{"categories":2089},[53],{"categories":2091},[53],{"categories":2093},[98],{"categories":2095},[],{"categories":2097},[],{"categories":2099},[145],{"categories":2101},[53],{"categories":2103},[148],{"categories":2105},[162],{"categories":2107},[162],{"categories":2109},[124],{"categories":2111},[],{"categories":2113},[],{"categories":2115},[53],{"categories":2117},[],{"categories":2119},[53,155],{"categories":2121},[124],{"categories":2123},[106],{"categories":2125},[155],{"categories":2127},[53],{"categories":2129},[98],{"categories":2131},[],{"categories":2133},[],{"categories":2135},[98],{"categories":2137},[162],{"categories":2139},[53],{"categories":2141},[],{"categories":2143},[145,53],{"categories":2145},[417],{"categories":2147},[98],{"categories":2149},[],{"categories":2151},[101],{"categories":2153},[101],{"categories":2155},[53],{"categories":2157},[155],{"categories":2159},[106],{"categories":2161},[124],{"categories":2163},[162],{"categories":2165},[145],{"categories":2167},[53],{"categories":2169},[53],{"categories":2171},[53],{"categories":2173},[98],{"categories":2175},[53],{"categories":2177},[106],{"categories":2179},[124],{"categories":2181},[],{"categories":2183},[],{"categories":2185},[148],{"categories":2187},[155],{"categories":2189},[53],{"categories":2191},[145],{"categories":2193},[148],{"categories":2195},[53],{"categories":2197},[53],{"categories":2199}
,[106],{"categories":2201},[106],{"categories":2203},[53,101],{"categories":2205},[],{"categories":2207},[145],{"categories":2209},[],{"categories":2211},[53],{"categories":2213},[124],{"categories":2215},[98],{"categories":2217},[98],{"categories":2219},[106],{"categories":2221},[53],{"categories":2223},[101],{"categories":2225},[155],{"categories":2227},[162],{"categories":2229},[],{"categories":2231},[124],{"categories":2233},[53],{"categories":2235},[53],{"categories":2237},[124],{"categories":2239},[155],{"categories":2241},[53],{"categories":2243},[106],{"categories":2245},[124],{"categories":2247},[53],{"categories":2249},[145],{"categories":2251},[53],{"categories":2253},[53],{"categories":2255},[417],{"categories":2257},[109],{"categories":2259},[106],{"categories":2261},[53],{"categories":2263},[124],{"categories":2265},[106],{"categories":2267},[162],{"categories":2269},[53],{"categories":2271},[],{"categories":2273},[53],{"categories":2275},[],{"categories":2277},[],{"categories":2279},[],{"categories":2281},[101],{"categories":2283},[53],{"categories":2285},[106],{"categories":2287},[124],{"categories":2289},[124],{"categories":2291},[124],{"categories":2293},[124],{"categories":2295},[],{"categories":2297},[98],{"categories":2299},[106],{"categories":2301},[124],{"categories":2303},[98],{"categories":2305},[106],{"categories":2307},[53],{"categories":2309},[53,106],{"categories":2311},[106],{"categories":2313},[417],{"categories":2315},[124],{"categories":2317},[124],{"categories":2319},[106],{"categories":2321},[53],{"categories":2323},[],{"categories":2325},[124],{"categories":2327},[162],{"categories":2329},[98],{"categories":2331},[53],{"categories":2333},[53],{"categories":2335},[],{"categories":2337},[155],{"categories":2339},[],{"categories":2341},[98],{"categories":2343},[106],{"categories":2345},[124],{"categories":2347},[53],{"categories":2349},[124],{"categories":2351},[98],{"categories":2353},[124],{"categories":2355},[124],{"categories":23
57},[],{"categories":2359},[101],{"categories":2361},[106],{"categories":2363},[124],{"categories":2365},[124],{"categories":2367},[124],{"categories":2369},[124],{"categories":2371},[124],{"categories":2373},[124],{"categories":2375},[124],{"categories":2377},[124],{"categories":2379},[124],{"categories":2381},[124],{"categories":2383},[148],{"categories":2385},[98],{"categories":2387},[53],{"categories":2389},[53],{"categories":2391},[],{"categories":2393},[53,98],{"categories":2395},[],{"categories":2397},[106],{"categories":2399},[124],{"categories":2401},[106],{"categories":2403},[53],{"categories":2405},[53],{"categories":2407},[53],{"categories":2409},[53],{"categories":2411},[53],{"categories":2413},[106],{"categories":2415},[101],{"categories":2417},[145],{"categories":2419},[124],{"categories":2421},[53],{"categories":2423},[],{"categories":2425},[],{"categories":2427},[106],{"categories":2429},[145],{"categories":2431},[53],{"categories":2433},[],{"categories":2435},[],{"categories":2437},[162],{"categories":2439},[53],{"categories":2441},[],{"categories":2443},[],{"categories":2445},[98],{"categories":2447},[101],{"categories":2449},[53],{"categories":2451},[101],{"categories":2453},[145],{"categories":2455},[],{"categories":2457},[124],{"categories":2459},[],{"categories":2461},[145],{"categories":2463},[53],{"categories":2465},[162],{"categories":2467},[],{"categories":2469},[162],{"categories":2471},[],{"categories":2473},[],{"categories":2475},[106],{"categories":2477},[],{"categories":2479},[101],{"categories":2481},[98],{"categories":2483},[145],{"categories":2485},[155],{"categories":2487},[],{"categories":2489},[],{"categories":2491},[53],{"categories":2493},[98],{"categories":2495},[162],{"categories":2497},[],{"categories":2499},[106],{"categories":2501},[106],{"categories":2503},[124],{"categories":2505},[53],{"categories":2507},[106],{"categories":2509},[53],{"categories":2511},[106],{"categories":2513},[53],{"categories":2515},[109],{"catego
ries":2517},[124],{"categories":2519},[],{"categories":2521},[162],{"categories":2523},[155],{"categories":2525},[106],{"categories":2527},[],{"categories":2529},[53],{"categories":2531},[106],{"categories":2533},[101],{"categories":2535},[98],{"categories":2537},[53],{"categories":2539},[145],{"categories":2541},[155],{"categories":2543},[155],{"categories":2545},[53],{"categories":2547},[148],{"categories":2549},[53],{"categories":2551},[106],{"categories":2553},[101],{"categories":2555},[106],{"categories":2557},[53],{"categories":2559},[53],{"categories":2561},[106],{"categories":2563},[124],{"categories":2565},[],{"categories":2567},[98],{"categories":2569},[53],{"categories":2571},[106],{"categories":2573},[53],{"categories":2575},[53],{"categories":2577},[],{"categories":2579},[145],{"categories":2581},[101],{"categories":2583},[124],{"categories":2585},[53],{"categories":2587},[53],{"categories":2589},[145],{"categories":2591},[162],{"categories":2593},[148],{"categories":2595},[53],{"categories":2597},[124],{"categories":2599},[53],{"categories":2601},[106],{"categories":2603},[417],{"categories":2605},[53],{"categories":2607},[106],{"categories":2609},[148],{"categories":2611},[],{"categories":2613},[106],{"categories":2615},[155],{"categories":2617},[145],{"categories":2619},[53],{"categories":2621},[98],{"categories":2623},[101],{"categories":2625},[155],{"categories":2627},[],{"categories":2629},[106],{"categories":2631},[53],{"categories":2633},[],{"categories":2635},[124],{"categories":2637},[],{"categories":2639},[124],{"categories":2641},[53],{"categories":2643},[106],{"categories":2645},[106],{"categories":2647},[106],{"categories":2649},[],{"categories":2651},[],{"categories":2653},[53],{"categories":2655},[53],{"categories":2657},[],{"categories":2659},[145],{"categories":2661},[106],{"categories":2663},[162],{"categories":2665},[98],{"categories":2667},[],{"categories":2669},[],{"categories":2671},[124],{"categories":2673},[155],{"categories":26
75},[53],{"categories":2677},[53],{"categories":2679},[53],{"categories":2681},[155],{"categories":2683},[124],{"categories":2685},[145],{"categories":2687},[53],{"categories":2689},[53],{"categories":2691},[53],{"categories":2693},[124],{"categories":2695},[53],{"categories":2697},[124],{"categories":2699},[106],{"categories":2701},[106],{"categories":2703},[155],{"categories":2705},[106],{"categories":2707},[53],{"categories":2709},[155],{"categories":2711},[145],{"categories":2713},[],{"categories":2715},[106],{"categories":2717},[],{"categories":2719},[],{"categories":2721},[101],{"categories":2723},[53],{"categories":2725},[106],{"categories":2727},[98],{"categories":2729},[106],{"categories":2731},[162],{"categories":2733},[],{"categories":2735},[106],{"categories":2737},[],{"categories":2739},[98],{"categories":2741},[106],{"categories":2743},[],{"categories":2745},[106],{"categories":2747},[53],{"categories":2749},[124],{"categories":2751},[53],{"categories":2753},[106],{"categories":2755},[124],{"categories":2757},[106],{"categories":2759},[155],{"categories":2761},[145],{"categories":2763},[98],{"categories":2765},[],{"categories":2767},[106],{"categories":2769},[145],{"categories":2771},[124],{"categories":2773},[53],{"categories":2775},[145],{"categories":2777},[98],{"categories":2779},[],{"categories":2781},[106],{"categories":2783},[106],{"categories":2785},[53],{"categories":2787},[],{"categories":2789},[106],{"categories":2791},[109],{"categories":2793},[124],{"categories":2795},[106],{"categories":2797},[101],{"categories":2799},[],{"categories":2801},[53],{"categories":2803},[109],{"categories":2805},[53],{"categories":2807},[106],{"categories":2809},[124],{"categories":2811},[98],{"categories":2813},[417],{"categories":2815},[53],{"categories":2817},[53],{"categories":2819},[53],{"categories":2821},[124],{"categories":2823},[101],{"categories":2825},[53],{"categories":2827},[145],{"categories":2829},[124],{"categories":2831},[417],{"categories":28
33},[53],{"categories":2835},[],{"categories":2837},[],{"categories":2839},[417],{"categories":2841},[148],{"categories":2843},[106],{"categories":2845},[106],{"categories":2847},[124],{"categories":2849},[53],{"categories":2851},[98],{"categories":2853},[145],{"categories":2855},[106],{"categories":2857},[53],{"categories":2859},[162],{"categories":2861},[53],{"categories":2863},[106],{"categories":2865},[],{"categories":2867},[53],{"categories":2869},[53],{"categories":2871},[124],{"categories":2873},[98],{"categories":2875},[],{"categories":2877},[53],{"categories":2879},[53],{"categories":2881},[155],{"categories":2883},[145],{"categories":2885},[53,106],{"categories":2887},[162,101],{"categories":2889},[53],{"categories":2891},[],{"categories":2893},[106],{"categories":2895},[],{"categories":2897},[155],{"categories":2899},[53],{"categories":2901},[124],{"categories":2903},[],{"categories":2905},[106],{"categories":2907},[],{"categories":2909},[106],{"categories":2911},[98],{"categories":2913},[106],{"categories":2915},[53],{"categories":2917},[417],{"categories":2919},[162],{"categories":2921},[101],{"categories":2923},[101],{"categories":2925},[98],{"categories":2927},[98],{"categories":2929},[53],{"categories":2931},[106],{"categories":2933},[53],{"categories":2935},[53],{"categories":2937},[98],{"categories":2939},[53],{"categories":2941},[162],{"categories":2943},[124],{"categories":2945},[53],{"categories":2947},[106],{"categories":2949},[53],{"categories":2951},[],{"categories":2953},[155],{"categories":2955},[],{"categories":2957},[106],{"categories":2959},[98],{"categories":2961},[],{"categories":2963},[417],{"categories":2965},[53],{"categories":2967},[],{"categories":2969},[124],{"categories":2971},[106],{"categories":2973},[155],{"categories":2975},[53],{"categories":2977},[106],{"categories":2979},[155],{"categories":2981},[106],{"categories":2983},[124],{"categories":2985},[98],{"categories":2987},[124],{"categories":2989},[155],{"categories":2991
},[53],{"categories":2993},[145],{"categories":2995},[53],{"categories":2997},[53],{"categories":2999},[53],{"categories":3001},[53],{"categories":3003},[106],{"categories":3005},[53],{"categories":3007},[106],{"categories":3009},[53],{"categories":3011},[98],{"categories":3013},[53],{"categories":3015},[106],{"categories":3017},[145],{"categories":3019},[98],{"categories":3021},[106],{"categories":3023},[145],{"categories":3025},[],{"categories":3027},[53],{"categories":3029},[53],{"categories":3031},[155],{"categories":3033},[],{"categories":3035},[106],{"categories":3037},[162],{"categories":3039},[53],{"categories":3041},[124],{"categories":3043},[162],{"categories":3045},[106],{"categories":3047},[101],{"categories":3049},[101],{"categories":3051},[53],{"categories":3053},[98],{"categories":3055},[],{"categories":3057},[53],{"categories":3059},[],{"categories":3061},[98],{"categories":3063},[53],{"categories":3065},[106],{"categories":3067},[106],{"categories":3069},[],{"categories":3071},[155],{"categories":3073},[155],{"categories":3075},[162],{"categories":3077},[145],{"categories":3079},[],{"categories":3081},[53],{"categories":3083},[98],{"categories":3085},[53],{"categories":3087},[155],{"categories":3089},[98],{"categories":3091},[124],{"categories":3093},[124],{"categories":3095},[],{"categories":3097},[124],{"categories":3099},[106],{"categories":3101},[145],{"categories":3103},[148],{"categories":3105},[53],{"categories":3107},[],{"categories":3109},[124],{"categories":3111},[155],{"categories":3113},[101],{"categories":3115},[53],{"categories":3117},[98],{"categories":3119},[417],{"categories":3121},[98],{"categories":3123},[],{"categories":3125},[],{"categories":3127},[124],{"categories":3129},[],{"categories":3131},[106],{"categories":3133},[106],{"categories":3135},[106],{"categories":3137},[],{"categories":3139},[53],{"categories":3141},[],{"categories":3143},[124],{"categories":3145},[98],{"categories":3147},[145],{"categories":3149},[53],{"cate
gories":3151},[124],{"categories":3153},[124],{"categories":3155},[],{"categories":3157},[124],{"categories":3159},[98],{"categories":3161},[53],{"categories":3163},[],{"categories":3165},[106],{"categories":3167},[106],{"categories":3169},[98],{"categories":3171},[],{"categories":3173},[],{"categories":3175},[],{"categories":3177},[145],{"categories":3179},[106],{"categories":3181},[53],{"categories":3183},[],{"categories":3185},[],{"categories":3187},[],{"categories":3189},[145],{"categories":3191},[],{"categories":3193},[98],{"categories":3195},[],{"categories":3197},[],{"categories":3199},[145],{"categories":3201},[53],{"categories":3203},[124],{"categories":3205},[],{"categories":3207},[162],{"categories":3209},[124],{"categories":3211},[162],{"categories":3213},[53],{"categories":3215},[],{"categories":3217},[],{"categories":3219},[106],{"categories":3221},[],{"categories":3223},[],{"categories":3225},[106],{"categories":3227},[53],{"categories":3229},[],{"categories":3231},[106],{"categories":3233},[124],{"categories":3235},[162],{"categories":3237},[148],{"categories":3239},[106],{"categories":3241},[106],{"categories":3243},[],{"categories":3245},[],{"categories":3247},[],{"categories":3249},[124],{"categories":3251},[],{"categories":3253},[],{"categories":3255},[145],{"categories":3257},[98],{"categories":3259},[],{"categories":3261},[101],{"categories":3263},[162],{"categories":3265},[53],{"categories":3267},[155],{"categories":3269},[98],{"categories":3271},[148],{"categories":3273},[101],{"categories":3275},[155],{"categories":3277},[],{"categories":3279},[],{"categories":3281},[106],{"categories":3283},[98],{"categories":3285},[145],{"categories":3287},[98],{"categories":3289},[106],{"categories":3291},[417],{"categories":3293},[106],{"categories":3295},[],{"categories":3297},[53],{"categories":3299},[124],{"categories":3301},[155],{"categories":3303},[],{"categories":3305},[145],{"categories":3307},[124],{"categories":3309},[98],{"categories":3311},[1
06],{"categories":3313},[53],{"categories":3315},[101],{"categories":3317},[106,417],{"categories":3319},[106],{"categories":3321},[155],{"categories":3323},[53],{"categories":3325},[148],{"categories":3327},[162],{"categories":3329},[106],{"categories":3331},[],{"categories":3333},[106],{"categories":3335},[53],{"categories":3337},[101],{"categories":3339},[],{"categories":3341},[],{"categories":3343},[53],{"categories":3345},[148],{"categories":3347},[53],{"categories":3349},[],{"categories":3351},[124],{"categories":3353},[],{"categories":3355},[124],{"categories":3357},[155],{"categories":3359},[106],{"categories":3361},[53],{"categories":3363},[162],{"categories":3365},[155],{"categories":3367},[],{"categories":3369},[124],{"categories":3371},[53],{"categories":3373},[],{"categories":3375},[53],{"categories":3377},[106],{"categories":3379},[53],{"categories":3381},[106],{"categories":3383},[53],{"categories":3385},[53],{"categories":3387},[53],{"categories":3389},[53],{"categories":3391},[101],{"categories":3393},[],{"categories":3395},[109],{"categories":3397},[124],{"categories":3399},[53],{"categories":3401},[],{"categories":3403},[155],{"categories":3405},[53],{"categories":3407},[53],{"categories":3409},[106],{"categories":3411},[124],{"categories":3413},[53],{"categories":3415},[53],{"categories":3417},[101],{"categories":3419},[106],{"categories":3421},[145],{"categories":3423},[],{"categories":3425},[148],{"categories":3427},[53],{"categories":3429},[],{"categories":3431},[124],{"categories":3433},[162],{"categories":3435},[],{"categories":3437},[],{"categories":3439},[124],{"categories":3441},[124],{"categories":3443},[162],{"categories":3445},[98],{"categories":3447},[106],{"categories":3449},[106],{"categories":3451},[53],{"categories":3453},[101],{"categories":3455},[],{"categories":3457},[],{"categories":3459},[124],{"categories":3461},[148],{"categories":3463},[155],{"categories":3465},[106],{"categories":3467},[145],{"categories":3469},[148],{"ca
tegories":3471},[148],{"categories":3473},[],{"categories":3475},[124],{"categories":3477},[53],{"categories":3479},[53],{"categories":3481},[155],{"categories":3483},[],{"categories":3485},[124],{"categories":3487},[124],{"categories":3489},[124],{"categories":3491},[],{"categories":3493},[106],{"categories":3495},[53],{"categories":3497},[],{"categories":3499},[98],{"categories":3501},[101],{"categories":3503},[],{"categories":3505},[53],{"categories":3507},[53],{"categories":3509},[],{"categories":3511},[155],{"categories":3513},[],{"categories":3515},[],{"categories":3517},[],{"categories":3519},[],{"categories":3521},[53],{"categories":3523},[124],{"categories":3525},[],{"categories":3527},[],{"categories":3529},[53],{"categories":3531},[53],{"categories":3533},[53],{"categories":3535},[148],{"categories":3537},[53],{"categories":3539},[148],{"categories":3541},[],{"categories":3543},[148],{"categories":3545},[148],{"categories":3547},[417],{"categories":3549},[106],{"categories":3551},[155],{"categories":3553},[],{"categories":3555},[],{"categories":3557},[148],{"categories":3559},[155],{"categories":3561},[155],{"categories":3563},[155],{"categories":3565},[],{"categories":3567},[98],{"categories":3569},[155],{"categories":3571},[155],{"categories":3573},[98],{"categories":3575},[155],{"categories":3577},[101],{"categories":3579},[155],{"categories":3581},[155],{"categories":3583},[155],{"categories":3585},[148],{"categories":3587},[124],{"categories":3589},[124],{"categories":3591},[53],{"categories":3593},[155],{"categories":3595},[148],{"categories":3597},[417],{"categories":3599},[148],{"categories":3601},[148],{"categories":3603},[148],{"categories":3605},[],{"categories":3607},[101],{"categories":3609},[],{"categories":3611},[417],{"categories":3613},[155],{"categories":3615},[155],{"categories":3617},[155],{"categories":3619},[106],{"categories":3621},[124,101],{"categories":3623},[148],{"categories":3625},[],{"categories":3627},[],{"categories":3629},
[148],{"categories":3631},[],{"categories":3633},[148],{"categories":3635},[124],{"categories":3637},[106],{"categories":3639},[],{"categories":3641},[155],{"categories":3643},[53],{"categories":3645},[145],{"categories":3647},[],{"categories":3649},[53],{"categories":3651},[],{"categories":3653},[124],{"categories":3655},[98],{"categories":3657},[148],{"categories":3659},[],{"categories":3661},[155],{"categories":3663},[124],[3665,3733,3815,3876],{"id":3666,"title":3667,"ai":3668,"body":3673,"categories":3707,"created_at":54,"date_modified":54,"description":46,"extension":55,"faq":54,"featured":56,"kicker_label":54,"meta":3708,"navigation":76,"path":3721,"published_at":3722,"question":54,"scraped_at":3723,"seo":3724,"sitemap":3725,"source_id":3726,"source_name":83,"source_type":84,"source_url":3727,"stem":3728,"tags":3729,"thumbnail_url":54,"tldr":3730,"tweet":54,"unknown_tags":3731,"__hash__":3732},"summaries\u002Fsummaries\u002Fdcb9afa6c7f04fd4-aurora-fixes-muon-s-neuron-death-in-tall-mlps-summary.md","Aurora Fixes Muon's Neuron Death in Tall MLPs",{"provider":7,"model":8,"input_tokens":3669,"output_tokens":3670,"processing_time_ms":3671,"cost_usd":3672},7761,2013,23604,0.00253605,{"type":14,"value":3674,"toc":3702},[3675,3679,3682,3685,3689,3692,3695,3699],[17,3676,3678],{"id":3677},"muons-orthogonal-updates-cause-neuron-death-in-tall-matrices","Muon's Orthogonal Updates Cause Neuron Death in Tall Matrices",[22,3680,3681],{},"Muon computes the polar factor UVᵀ of gradient matrix G (via thin SVD) for semi-orthogonal weight updates W ← W - η UVᵀ, enabling fast convergence on nanoGPT speedrun benchmarks over AdamW. In tall matrices like SwiGLU MLP up-projections (more rows n than columns m), row-norm anisotropy emerges: impossible for perfectly orthogonal matrices to have uniform row norms of 1, so some rows get massive updates while others starve. By training step 500, >1\u002F4 neurons die permanently, starving downstream layers and compounding inefficiency. 
Leverage scores (squared row norms of U) become highly anisotropic, amplifying the death spiral.",[22,3683,3684],{},"NorMuon patches this with inverse RMS row normalization to unit norm, boosting performance but sacrificing polar factor precision. U-NorMuon refines this to a target norm of √(n\u002Fm) for column-orthogonal tall matrices, eliminating death and stabilizing gradients even in untouched layers like down-projections; at 340M scale, it outperforms Muon\u002FNorMuon with isotropic leverage.",[17,3686,3688],{"id":3687},"Aurora Solves Joint Constraints for Precise, Uniform Updates",[22,3690,3691],{},"Aurora reformulates the update as steepest descent maximizing Tr(GᵀU) under dual constraints: UᵀU = Iₘ (left semi-orthogonality) and ||Uᵢ||₂ = √(m\u002Fn) ∀i (uniform row leverage). This forces all singular values of U to 1, achieving perfect orthogonality without trade-offs, unlike NorMuon's post-hoc normalization.",[22,3693,3694],{},"Implement as a drop-in Muon replacement: Riemannian Aurora (gradient projection on the Stiefel\u002Fequal-leverage manifold) or vanilla Aurora (simpler). For wide\u002Fsquare matrices, orthogonality already implies uniformity, so the update is unchanged. Open-source code supports scale; adds only 6% compute vs. Muon.",[17,3696,3698],{"id":3697},"SOTA Results Scale with MLP Width",[22,3700,3701],{},"At 1.1B parameters, Aurora trains a 100x more data-efficient model on open internet data, beating larger models on HellaSwag. Tops the modded-nanoGPT speedrun (prior SOTA: NorMuon). Gains grow with MLP expansion (wider = taller matrices = more anisotropy risk), confirming the hypothesis. 
Use for GPT-style training to avoid silent capacity loss.",{"title":46,"searchDepth":47,"depth":47,"links":3703},[3704,3705,3706],{"id":3677,"depth":47,"text":3678},{"id":3687,"depth":47,"text":3688},{"id":3697,"depth":47,"text":3698},[53],{"content_references":3709,"triage":3718},[3710,3714],{"type":60,"title":3711,"author":3712,"url":3713,"context":63},"Aurora","Tilde Research","https:\u002F\u002Fblog.tilderesearch.com\u002Fblog\u002Faurora",{"type":3715,"title":3716,"url":3717,"context":63},"tool","aurora-release","https:\u002F\u002Fgithub.com\u002Ftilde-research\u002Faurora-release",{"relevance":72,"novelty":73,"quality":73,"actionability":47,"composite":3719,"reasoning":3720},3.25,"Category: AI & LLMs. The article discusses a new optimizer, Aurora, that addresses a specific technical problem in deep learning models, which is relevant to AI engineering. However, while it presents novel insights into the optimizer's mechanics and performance, it lacks practical guidance for implementation that the target audience could directly act upon.","\u002Fsummaries\u002Fdcb9afa6c7f04fd4-aurora-fixes-muon-s-neuron-death-in-tall-mlps-summary","2026-05-12 08:07:28","2026-05-12 15:01:25",{"title":3667,"description":46},{"loc":3721},"dcb9afa6c7f04fd4","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Ftilde-research-introduces-aurora-a-leverage-aware-optimizer-that-fixes-a-hidden-neuron-death-problem-in-muon\u002F","summaries\u002Fdcb9afa6c7f04fd4-aurora-fixes-muon-s-neuron-death-in-tall-mlps-summary",[89,88,90],"Aurora optimizer eliminates >25% neuron death in Muon's tall matrices by jointly enforcing left semi-orthogonality and uniform row norms √(n\u002Fm), delivering SOTA on nanoGPT speedrun with 6% compute 
overhead.",[],"LbY7EBmj0SNTdCqYLDJeH1MTGWukIbA19aMUaOvqp7Y",{"id":3734,"title":3735,"ai":3736,"body":3741,"categories":3791,"created_at":54,"date_modified":54,"description":46,"extension":55,"faq":54,"featured":56,"kicker_label":54,"meta":3792,"navigation":76,"path":3803,"published_at":3804,"question":54,"scraped_at":3805,"seo":3806,"sitemap":3807,"source_id":3808,"source_name":83,"source_type":84,"source_url":3809,"stem":3810,"tags":3811,"thumbnail_url":54,"tldr":3812,"tweet":54,"unknown_tags":3813,"__hash__":3814},"summaries\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary.md","BLT Cuts Inference Bandwidth 50-92% via Diffusion & Speculation",{"provider":7,"model":8,"input_tokens":3737,"output_tokens":3738,"processing_time_ms":3739,"cost_usd":3740},8589,2722,30748,0.00305615,{"type":14,"value":3742,"toc":3785},[3743,3747,3750,3754,3762,3765,3769,3772,3775,3778,3782],[17,3744,3746],{"id":3745},"blts-memory-bandwidth-bottleneck-in-byte-level-generation","BLT's Memory Bandwidth Bottleneck in Byte-Level Generation",[22,3748,3749],{},"Byte-level models like BLT avoid tokenization pitfalls—noise sensitivity, poor multilingual support, weak character\u002Fcode handling—by processing raw bytes via entropy-based patches (avg 4 bytes, max 8). Computation uses local encoder, global Transformer, local decoder on latent tokens. Inference slows because autoregressive decoder generates one byte\u002Fstep, vs. tokens covering multiple bytes. This multiplies memory loads for weights\u002FKV caches, the key serving bottleneck. BLT needs 4x more decoder passes than token models for equivalent text, hiking bandwidth costs.",[17,3751,3753],{"id":3752},"block-diffusion-enables-multi-byte-decoding-per-pass-blt-d","Block Diffusion Enables Multi-Byte Decoding per Pass (BLT-D)",[22,3755,3756,3757,3761],{},"BLT-D replaces byte-by-byte autoregression with discrete diffusion in fixed blocks (B=4\u002F8\u002F16 bytes). 
Training: corrupt blocks by masking bytes independently with prob t~U(0,1); loss combines next-byte prediction on clean seq + masked prediction on corrupted. Inference: start with ",[3758,3759,3760],"span",{},"MASK"," block, iteratively unmask multiple bytes\u002Fpass via confidence (prob>α) or entropy-bounded (cumulative entropy\u003Cγ) sampling. Encoder\u002Fglobal called once\u002Fblock, not per-patch; supports KV caching.",[22,3763,3764],{},"At 3B params on BLT-1T (1T tokens), BLT-D-4 matches BLT scores on FLORES-101 translation (French\u002FEnglish, German\u002FEnglish; 4-shot BLEU), nears on HumanEval\u002FMBPP coding (0\u002F3-shot pass@1). BLT-D-16 cuts bandwidth 87-92% but drops coding pass@1. Likelihoods (ARC-Easy\u002FChallenge, PIQA, HellaSwag, MMLU) near baseline via causal-masked decoder. Translation gains most; coding sensitive to block size. Entropy-bounded + top-p boosts diversity (higher type-token ratio) as NFEs rise.",[17,3766,3768],{"id":3767},"no-training-speculation-recycles-existing-decoder-blt-s-blt-dv","No-Training Speculation Recycles Existing Decoder (BLT-S, BLT-DV)",[22,3770,3771],{},"BLT-S uses lightweight decoder as self-drafter: generate k=8\u002F16 bytes ignoring patch boundaries, conditioning on last latent; verify via full encode\u002Fglobal\u002Fdecode, accept to first mismatch. Greedy decoding guarantees identical output to BLT (no quality loss); reduces encoder\u002Fglobal calls despite more decoder passes. At 3B\u002Fk=16, 77% bandwidth cut.",[22,3773,3774],{},"BLT-DV (on BLT-D weights): one-step diffusion drafts block, autoregressive verify accepts to mismatch. Single-step diffusion degrades alone but verification fixes it. At 3B, up to 81% bandwidth reduction.",[22,3776,3777],{},"All trained 1B:240k steps, 3B:480k on BLT-1T (public + Datacomp-LM subset). Efficiency proxies: decoder\u002Fencoder NFEs, GB bandwidth (16-bit, param\u002Fforward counts). 
Wall-clock needs optimized serving.",[17,3779,3781],{"id":3780},"practical-tradeoffs-for-production-deployment","Practical Tradeoffs for Production Deployment",[22,3783,3784],{},"BLT-D fastest (esp B=16) but coding tradeoffs; BLT-S zero-loss safest. All preserve autoregressive likelihoods\u002Freasoning. Bandwidth proxies predict real gains in memory-bound serving. Future: optimized inference impl. Byte-level now viable for production-scale speed without tokenizer fragility.",{"title":46,"searchDepth":47,"depth":47,"links":3786},[3787,3788,3789,3790],{"id":3745,"depth":47,"text":3746},{"id":3752,"depth":47,"text":3753},{"id":3767,"depth":47,"text":3768},{"id":3780,"depth":47,"text":3781},[53],{"content_references":3793,"triage":3801},[3794,3797],{"type":60,"title":3795,"url":3796,"context":63},"Fast Byte Latent Transformer That Reduces Inference Memory Bandwidth by Over 50% Without Tokenization","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2605.08044",{"type":60,"title":3798,"url":3799,"context":3800},"Byte Latent Transformer (BLT): A Tokenizer-Free Model That Scales Efficiently","https:\u002F\u002Fwww.marktechpost.com\u002F2024\u002F12\u002F13\u002Fmeta-ai-introduces-byte-latent-transformer-blt-a-tokenizer-free-model-that-scales-efficiently\u002F","cited",{"relevance":72,"novelty":73,"quality":73,"actionability":47,"composite":3719,"reasoning":3802},"Category: AI & LLMs. The article discusses a new approach to improving inference bandwidth in AI models, which is relevant to AI engineering. 
However, it lacks practical applications or frameworks that the audience can directly implement, focusing instead on theoretical advancements.","\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary","2026-05-11 17:52:15","2026-05-12 15:01:28",{"title":3735,"description":46},{"loc":3803},"1dcaa9cf36eee656","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F11\u002Fmeta-and-stanford-researchers-propose-fast-byte-latent-transformer-that-reduces-inference-memory-bandwidth-by-over-50-without-tokenization\u002F","summaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary",[88,89,91],"Meta\u002FStanford researchers accelerate Byte Latent Transformer (BLT) inference with BLT-D (diffusion decoding), BLT-S (self-speculation), and BLT-DV (diffusion+verification), reducing memory bandwidth 50-92% at 3B params while nearing baseline performance on translation\u002Fcoding tasks.",[],"xMZyx1diuvh2XXZUy_NPhOgWy_XqDJeXjel738dmvjs",{"id":3816,"title":3817,"ai":3818,"body":3823,"categories":3851,"created_at":54,"date_modified":54,"description":46,"extension":55,"faq":54,"featured":56,"kicker_label":54,"meta":3852,"navigation":76,"path":3863,"published_at":3864,"question":54,"scraped_at":3865,"seo":3866,"sitemap":3867,"source_id":3868,"source_name":3869,"source_type":84,"source_url":3870,"stem":3871,"tags":3872,"thumbnail_url":54,"tldr":3873,"tweet":54,"unknown_tags":3874,"__hash__":3875},"summaries\u002Fsummaries\u002F36eeccb45fcfb891-sentences-define-word-meanings-via-self-attention-summary.md","Sentences Define Word Meanings via Self-Attention",{"provider":7,"model":8,"input_tokens":3819,"output_tokens":3820,"processing_time_ms":3821,"cost_usd":3822},6053,1614,12893,0.00199495,{"type":14,"value":3824,"toc":3846},[3825,3829,3832,3836,3839,3843],[17,3826,3828],{"id":3827},"sequential-architectures-failed-to-capture-full-context","Sequential Architectures Failed to Capture Full 
Context",[22,3830,3831],{},"Pre-Transformer models processed language word-by-word, causing inevitable information loss. RNNs from the late 1980s suffered vanishing gradients, where early words faded by sentence end—like a goldfish memory in long sequences. LSTMs (1997) added forget, input, and output gates to selectively retain info, powering Google Translate and Gmail Smart Reply, but tripled parameters and computation costs. GRUs (2014) merged gates for half the compute with similar performance. Seq2Seq models also compressed entire inputs into fixed-size vectors for tasks like translation, creating bottlenecks where long inputs lost early details—short sentences worked, but nuance blurred in longer ones. All shared a core limit: sequential processing prevented parallel handling, capping scalability for documents beyond hundreds of words.",[17,3833,3835],{"id":3834},"self-attention-enables-sentence-level-meaning-resolution","Self-Attention Enables Sentence-Level Meaning Resolution",[22,3837,3838],{},"The 2017 'Attention Is All You Need' paper by eight Google engineers introduced Transformers, ditching RNNs\u002FLSTMs\u002FGRUs for parallel processing via self-attention. Every word simultaneously queries every other: 'How relevant are you to me?' This dynamically adjusts representations based on full context. For 'I bought apple to eat,' 'apple' weights 'eat' and 'bought' toward fruit; in 'I bought Apple stock to sell,' it shifts to company. Ambiguous pronouns resolve naturally, as in 'The trophy did not fit in the suitcase because it was too big'—full sentence clarifies 'it' as the trophy. Mimicking human reading (whole-sentence intake), this eliminates fixed meanings for words like 'bank' (river\u002Fmoney) or 'apple' (fruit\u002Fcompany), deriving them from sentence signals. 
Original Transformer trained in 3.5 days on eight GPUs, beating benchmarks.",[17,3840,3842],{"id":3841},"transformers-scale-to-power-all-modern-llms","Transformers Scale to Power All Modern LLMs",[22,3844,3845],{},"OpenAI's GPT series built directly on this: GPT-1 (117M parameters) to GPT-4 (>1T estimated), all using self-attention for billions of relevance computations per second. Every chatbot (ChatGPT, Claude), autocomplete, and LLM since runs this core operation, replacing fading memories and bottlenecks. Words lack inherent meaning—sentences solve them as variables, a truth machines grasped only after 30 years and one six-page paper.",{"title":46,"searchDepth":47,"depth":47,"links":3847},[3848,3849,3850],{"id":3827,"depth":47,"text":3828},{"id":3834,"depth":47,"text":3835},{"id":3841,"depth":47,"text":3842},[53],{"content_references":3853,"triage":3861},[3854,3858],{"type":60,"title":3855,"author":3856,"publisher":3857,"context":3800},"Attention Is All You Need","Eight engineers at Google","Google",{"type":3715,"title":3859,"url":3860,"context":63},"Self-Attention Interactive Walkthrough","https:\u002F\u002Fnursnaaz.github.io",{"relevance":72,"novelty":72,"quality":73,"actionability":47,"composite":74,"reasoning":3862},"Category: AI & LLMs. The article discusses the evolution of language models and the significance of self-attention in Transformers, which is relevant to AI-powered product builders. 
However, it lacks practical applications or frameworks that the audience could directly implement.","\u002Fsummaries\u002F36eeccb45fcfb891-sentences-define-word-meanings-via-self-attention-summary","2026-04-21 00:30:43","2026-04-21 15:26:03",{"title":3817,"description":46},{"loc":3863},"36eeccb45fcfb891","Generative AI","https:\u002F\u002Fgenerativeai.pub\u002Fwords-dont-have-meaning-sentences-do-ef5b7745eac2?source=rss----440100e76000---4","summaries\u002F36eeccb45fcfb891-sentences-define-word-meanings-via-self-attention-summary",[88,89,90],"Transformers ended 30 years of sequential processing flaws by using self-attention, where every word weighs relevance from the entire sentence context, powering GPT and all modern LLMs.",[],"oCj4Ws9wcBmSiLHpHgFwkn32mNINxj5NzpYDjicxhYg",{"id":3877,"title":3878,"ai":3879,"body":3884,"categories":3912,"created_at":54,"date_modified":54,"description":46,"extension":55,"faq":54,"featured":56,"kicker_label":54,"meta":3913,"navigation":76,"path":3921,"published_at":3922,"question":54,"scraped_at":3923,"seo":3924,"sitemap":3925,"source_id":3926,"source_name":3927,"source_type":84,"source_url":3928,"stem":3929,"tags":3930,"thumbnail_url":54,"tldr":3931,"tweet":54,"unknown_tags":3932,"__hash__":3933},"summaries\u002Fsummaries\u002Fd184bc13d59ed16f-53x-ai-efficiency-via-model-distillation-by-2025-summary.md","53x AI Efficiency via Model Distillation by 2025",{"provider":7,"model":8,"input_tokens":3880,"output_tokens":3881,"processing_time_ms":3882,"cost_usd":3883},3863,1216,6667,0.00135795,{"type":14,"value":3885,"toc":3907},[3886,3890,3893,3897,3900,3904],[17,3887,3889],{"id":3888},"core-technique-student-mimics-teachers-nuances","Core Technique: Student Mimics Teacher's Nuances",[22,3891,3892],{},"Model distillation compresses large AI models into smaller ones by having a 'student' model learn directly from a 'teacher' model's soft outputs—probability distributions over answers—rather than hard final labels. 
This captures subtle knowledge like confidence levels that label-only training misses, enabling deployment on limited hardware. In practice, apply it when large models are accurate but too slow or resource-heavy: the student slashes model size and boosts inference speed dramatically without major accuracy drops.",[17,3894,3896],{"id":3895},"proven-efficiency-gains-and-real-world-impact","Proven Efficiency Gains and Real-World Impact",[22,3898,3899],{},"Distillation delivers 53x overall efficiency improvements by 2025 across speed, cost, size, and energy use, making AI greener and cheaper for production. For instance, it turns impossible edge deployments into reality, as the author experienced in a project where mimicking a large model's behavior overcame hardware constraints. Smaller models run faster and cheaper while retaining complex capabilities, ideal for real-world apps over bulky originals.",[17,3901,3903],{"id":3902},"evolution-from-2015-pioneer-to-modern-power","Evolution from 2015 Pioneer to Modern Power",[22,3905,3906],{},"Geoffrey Hinton introduced distillation in his 2015 paper, starting with basic mimicry. It has since advanced to embed reasoning and instruction-following into compact models. By 2025, expect widespread adoption for massive gains, evolving beyond simple compression to transfer advanced AI behaviors efficiently. This thin intro highlights the method's maturity but cuts off before deeper 2025 specifics or code examples.",{"title":46,"searchDepth":47,"depth":47,"links":3908},[3909,3910,3911],{"id":3888,"depth":47,"text":3889},{"id":3895,"depth":47,"text":3896},{"id":3902,"depth":47,"text":3903},[53],{"content_references":3914,"triage":3918},[3915],{"type":60,"title":3916,"author":3917,"context":3800},"Geoffrey Hinton’s pioneering 2015 paper","Geoffrey Hinton",{"relevance":73,"novelty":72,"quality":73,"actionability":72,"composite":3919,"reasoning":3920},3.6,"Category: AI & LLMs. 
The article discusses model distillation, a relevant technique for improving AI efficiency, which addresses the audience's pain point of deploying AI models in resource-constrained environments. It provides a concrete example of efficiency gains but lacks detailed actionable steps for implementation.","\u002Fsummaries\u002Fd184bc13d59ed16f-53x-ai-efficiency-via-model-distillation-by-2025-summary","2026-04-17 03:31:01","2026-04-19 01:22:22",{"title":3878,"description":46},{"loc":3921},"d184bc13d59ed16f","AI Simplified in Plain English","https:\u002F\u002Fmedium.com\u002Fai-simplified-in-plain-english\u002Fdiscover-the-hidden-power-of-model-distillation-38f40d343c85?source=rss----f37ab7d4e76b---4","summaries\u002Fd184bc13d59ed16f-53x-ai-efficiency-via-model-distillation-by-2025-summary",[89,90,88],"Train small 'student' models on large 'teacher' models' soft probabilities—not just labels—to match performance while slashing size, speed, and costs by 53x by 2025.",[],"ECAF7XlAWzBa4F6NguTIJJt3uE7W33V32wLDIbjUNnU"]