[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-3555a47e3851a952-gliguard-300m-safety-model-beats-90x-larger-rivals-summary":3,"summaries-facets-categories":108,"summary-related-3555a47e3851a952-gliguard-300m-safety-model-beats-90x-larger-rivals-summary":3704},{"id":4,"title":5,"ai":6,"body":13,"categories":58,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":62,"navigation":89,"path":90,"published_at":91,"question":59,"scraped_at":92,"seo":93,"sitemap":94,"source_id":95,"source_name":96,"source_type":97,"source_url":98,"stem":99,"tags":100,"thumbnail_url":59,"tldr":105,"tweet":59,"unknown_tags":106,"__hash__":107},"summaries\u002Fsummaries\u002F3555a47e3851a952-gliguard-300m-safety-model-beats-90x-larger-rivals-summary.md","GLiGuard: 300M Safety Model Beats 90x Larger Rivals",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",7840,1946,22871,0.00251835,{"type":14,"value":15,"toc":51},"minimark",[16,21,25,28,32,44,48],[17,18,20],"h2",{"id":19},"encoder-models-fix-latency-bottlenecks-in-production-guardrails","Encoder Models Fix Latency Bottlenecks in Production Guardrails",[22,23,24],"p",{},"Safety moderation for LLM apps requires checking every prompt and response, but decoder-only models like LlamaGuard4 (12B), WildGuard (7B), ShieldGemma (27B), and NemoGuard (8B) generate verdicts autoregressively—one token at a time—causing compounded latency and costs in multi-turn conversations. These architectures suit flexible, natural-language policies but treat classification as generation, adding sequential overhead for multi-dimension checks (e.g., harm type, jailbreaks, refusals). 
Switch to encoder models like GLiGuard, which process full inputs in parallel and output fixed labels instantly, reframing moderation as efficient classification.",[22,26,27],{},"GLiGuard, fine-tuned from Fastino's 300M GLiNER2-base-v1 checkpoint, encodes input text alongside task definitions and candidate labels, scoring all options in one forward pass. Adding safety dimensions incurs zero extra latency—just more input labels. This yields 26ms latency on A100 GPU (vs. 426ms for baselines) and 16x higher throughput, scaling seamlessly for real-time apps.",[17,29,31],{"id":30},"simultaneous-multi-task-moderation-without-overhead","Simultaneous Multi-Task Moderation Without Overhead",[22,33,34,35,39,40,43],{},"Run four tasks concurrently: (1) prompt safety (safe\u002Funsafe), (2) response safety (safe\u002Funsafe), (3) harm category (e.g., toxic speech, violence), (4) jailbreak strategy detection. Input format bundles text with labels like \"",[36,37,38],"span",{},"HARM_VIOLENCE","\" or \"",[36,41,42],{},"JAILBREAK_REFUSAL","\", letting the model score and select top matches instantly. Early training exposed confusion between similar harms (toxic vs. violence), fixed by Pioneer-generated synthetic edge cases atop 87k human-annotated WildGuardTrain examples (for prompts, responses, refusals) and GPT-4.1 labels for harms\u002Fjailbreaks. Full fine-tuning over 20 epochs with AdamW produced robust distinctions.",[17,45,47],{"id":46},"benchmark-beating-accuracy-validates-small-model-efficiency","Benchmark-Beating Accuracy Validates Small-Model Efficiency",[22,49,50],{},"Across 9 safety benchmarks (prompt\u002Fresponse classification, adversarial robustness, harm differentiation, low false positives), GLiGuard's macro-F1 scores match or exceed giants: beats ShieldGemma2-27B by up to 5 points on some benchmarks, ties LlamaGuard4-12B overall. Examples: 88.5% F1 on WildGuard (vs. 87.2% ShieldGemma), 92.1% on HarmBench-Red-Team (vs. 90.5%). 
No accuracy is sacrificed despite 23-90x fewer parameters, showing that encoder-based classification extracts maximum value from small models on fixed-label tasks. The model is open-sourced on Hugging Face (fastino\u002Fgliguard-LLMGuardrails-300M) and GitHub (fastino-ai\u002FGLiGuard), with GLiNER details at gliner.ai.",{"title":52,"searchDepth":53,"depth":53,"links":54},"",2,[55,56,57],{"id":19,"depth":53,"text":20},{"id":30,"depth":53,"text":31},{"id":46,"depth":53,"text":47},[],null,"md",false,{"content_references":63,"triage":84},[64,69,73,77,80],{"type":65,"title":66,"url":67,"context":68},"paper","GLiGuard: A 300M Parameter Safety Moderation Model","https:\u002F\u002Farxiv.org\u002Fabs\u002F2605.07982","recommended",{"type":70,"title":71,"url":72,"context":68},"tool","GLiGuard Model Weights","https:\u002F\u002Fhuggingface.co\u002Ffastino\u002Fgliguard-LLMGuardrails-300M",{"type":74,"title":75,"url":76,"context":68},"other","GLiGuard GitHub Repo","https:\u002F\u002Fgithub.com\u002Ffastino-ai\u002FGLiGuard",{"type":70,"title":78,"url":79,"context":68},"GLiNER Technical Details","https:\u002F\u002Fgliner.ai\u002F",{"type":81,"title":82,"context":83},"dataset","WildGuardTrain","cited",{"relevance":85,"novelty":86,"quality":85,"actionability":86,"composite":87,"reasoning":88},4,3,3.6,"Category: AI & LLMs. The article discusses a new safety moderation model, GLiGuard, which presents a practical application for AI-powered products by addressing latency issues in LLM safety. 
It provides insights into model architecture and performance metrics, but lacks detailed implementation guidance for developers.",true,"\u002Fsummaries\u002F3555a47e3851a952-gliguard-300m-safety-model-beats-90x-larger-rivals-summary","2026-05-13 20:41:13","2026-05-13 23:00:26",{"title":5,"description":52},{"loc":90},"3555a47e3851a952","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F13\u002Ffastino-labs-open-sources-gliguard-a-300m-parameter-safety-moderation-model-that-matches-or-exceeds-accuracy-of-models-23-90x-its-size\u002F","summaries\u002F3555a47e3851a952-gliguard-300m-safety-model-beats-90x-larger-rivals-summary",[101,102,103,104],"llm","open-source","ai-tools","machine-learning","Deploy GLiGuard, a 300M encoder model, for LLM safety moderation: matches accuracy of 23-90x larger models across 9 benchmarks while running 16x faster at 26ms per request.",[],"V_LcHY6WiXIBXCqegT-b3AbhX-6D9z5mX9yDCaqs7TY",[109,112,115,118,121,124,126,128,130,132,134,136,139,141,143,145,147,149,151,153,155,157,160,163,165,167,170,172,174,177,179,181,183,185,187,189,191,193,195,197,199,201,203,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,416,418,420,422,424,426,428,430,432,434,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638
,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510
,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310
,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110
,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688,3690,3692,3694,3696,3698,3700,3702],{"categories":110},[111],"Developer Productivity",{"categories":113},[114],"Business & SaaS",{"categories":116},[117],"AI & LLMs",{"categories":119},[120],"AI Automation",{"categories":122},[123],"Product Strategy",{"categories":125},[117],{"categories":127},[111],{"categories":129},[114],{"categories":131},[],{"categories":133},[117],{"categories":135},[],{"categories":137},[138],"AI News & 
Trends",{"categories":140},[120],{"categories":142},[138],{"categories":144},[120],{"categories":146},[120],{"categories":148},[117],{"categories":150},[117],{"categories":152},[138],{"categories":154},[117],{"categories":156},[],{"categories":158},[159],"Design & Frontend",{"categories":161},[162],"Data Science & Visualization",{"categories":164},[138],{"categories":166},[],{"categories":168},[169],"Software Engineering",{"categories":171},[117],{"categories":173},[120],{"categories":175},[176],"Marketing & Growth",{"categories":178},[117],{"categories":180},[120],{"categories":182},[],{"categories":184},[],{"categories":186},[159],{"categories":188},[120],{"categories":190},[111],{"categories":192},[159],{"categories":194},[117],{"categories":196},[120],{"categories":198},[138],{"categories":200},[],{"categories":202},[],{"categories":204},[120],{"categories":206},[169],{"categories":208},[],{"categories":210},[114],{"categories":212},[],{"categories":214},[],{"categories":216},[120],{"categories":218},[120],{"categories":220},[117],{"categories":222},[],{"categories":224},[169],{"categories":226},[],{"categories":228},[],{"categories":230},[],{"categories":232},[117],{"categories":234},[176],{"categories":236},[159],{"categories":238},[159],{"categories":240},[117],{"categories":242},[120],{"categories":244},[117],{"categories":246},[117],{"categories":248},[120],{"categories":250},[120],{"categories":252},[162],{"categories":254},[138],{"categories":256},[120],{"categories":258},[176],{"categories":260},[120],{"categories":262},[123],{"categories":264},[],{"categories":266},[120],{"categories":268},[],{"categories":270},[120],{"categories":272},[169],{"categories":274},[275],"DevOps & 
Cloud",{"categories":277},[159],{"categories":279},[117],{"categories":281},[],{"categories":283},[],{"categories":285},[120],{"categories":287},[],{"categories":289},[117],{"categories":291},[],{"categories":293},[111],{"categories":295},[169],{"categories":297},[114],{"categories":299},[138],{"categories":301},[117],{"categories":303},[],{"categories":305},[117],{"categories":307},[],{"categories":309},[169],{"categories":311},[162],{"categories":313},[],{"categories":315},[117],{"categories":317},[159],{"categories":319},[],{"categories":321},[159],{"categories":323},[120],{"categories":325},[],{"categories":327},[120],{"categories":329},[138],{"categories":331},[114],{"categories":333},[117],{"categories":335},[],{"categories":337},[120],{"categories":339},[117],{"categories":341},[123],{"categories":343},[],{"categories":345},[117],{"categories":347},[120],{"categories":349},[120],{"categories":351},[],{"categories":353},[162],{"categories":355},[117],{"categories":357},[],{"categories":359},[111],{"categories":361},[114],{"categories":363},[117],{"categories":365},[120],{"categories":367},[169],{"categories":369},[117],{"categories":371},[],{"categories":373},[],{"categories":375},[117],{"categories":377},[],{"categories":379},[159],{"categories":381},[],{"categories":383},[117],{"categories":385},[],{"categories":387},[120],{"categories":389},[117],{"categories":391},[159],{"categories":393},[],{"categories":395},[117],{"categories":397},[117],{"categories":399},[114],{"categories":401},[120],{"categories":403},[117],{"categories":405},[159],{"categories":407},[120],{"categories":409},[],{"categories":411},[],{"categories":413},[138],{"categories":415},[],{"categories":417},[117],{"categories":419},[114,176],{"categories":421},[],{"categories":423},[117],{"categories":425},[],{"categories":427},[],{"categories":429},[117],{"categories":431},[],{"categories":433},[117],{"categories":435},[275],{"categories":437},[],{"categories":439},[138],{"categories":441},[
159],{"categories":443},[],{"categories":445},[138],{"categories":447},[138],{"categories":449},[117],{"categories":451},[176],{"categories":453},[],{"categories":455},[114],{"categories":457},[],{"categories":459},[117,275],{"categories":461},[117],{"categories":463},[117],{"categories":465},[120],{"categories":467},[117,169],{"categories":469},[162],{"categories":471},[117],{"categories":473},[176],{"categories":475},[120],{"categories":477},[120],{"categories":479},[],{"categories":481},[120],{"categories":483},[117,114],{"categories":485},[],{"categories":487},[159],{"categories":489},[159],{"categories":491},[],{"categories":493},[],{"categories":495},[138],{"categories":497},[],{"categories":499},[111],{"categories":501},[169],{"categories":503},[117],{"categories":505},[159],{"categories":507},[120],{"categories":509},[169],{"categories":511},[138],{"categories":513},[159],{"categories":515},[],{"categories":517},[117],{"categories":519},[117],{"categories":521},[117],{"categories":523},[138],{"categories":525},[111],{"categories":527},[117],{"categories":529},[120],{"categories":531},[275],{"categories":533},[159],{"categories":535},[120],{"categories":537},[],{"categories":539},[],{"categories":541},[159],{"categories":543},[138],{"categories":545},[162],{"categories":547},[],{"categories":549},[117],{"categories":551},[117],{"categories":553},[114],{"categories":555},[117],{"categories":557},[117],{"categories":559},[138],{"categories":561},[],{"categories":563},[120],{"categories":565},[169],{"categories":567},[],{"categories":569},[117],{"categories":571},[117],{"categories":573},[120],{"categories":575},[],{"categories":577},[],{"categories":579},[117],{"categories":581},[],{"categories":583},[114],{"categories":585},[120],{"categories":587},[],{"categories":589},[111],{"categories":591},[117],{"categories":593},[114],{"categories":595},[138],{"categories":597},[],{"categories":599},[],{"categories":601},[],{"categories":603},[138],{"categories":605},[1
38],{"categories":607},[],{"categories":609},[],{"categories":611},[114],{"categories":613},[],{"categories":615},[],{"categories":617},[111],{"categories":619},[],{"categories":621},[176],{"categories":623},[120],{"categories":625},[114],{"categories":627},[120],{"categories":629},[169],{"categories":631},[],{"categories":633},[123],{"categories":635},[159],{"categories":637},[169],{"categories":639},[117],{"categories":641},[120],{"categories":643},[114],{"categories":645},[117],{"categories":647},[],{"categories":649},[],{"categories":651},[169],{"categories":653},[162],{"categories":655},[123],{"categories":657},[120],{"categories":659},[117],{"categories":661},[],{"categories":663},[275],{"categories":665},[],{"categories":667},[120],{"categories":669},[],{"categories":671},[],{"categories":673},[117],{"categories":675},[159],{"categories":677},[176],{"categories":679},[120],{"categories":681},[],{"categories":683},[111],{"categories":685},[],{"categories":687},[138],{"categories":689},[117,275],{"categories":691},[138],{"categories":693},[117],{"categories":695},[114],{"categories":697},[117],{"categories":699},[],{"categories":701},[114],{"categories":703},[],{"categories":705},[169],{"categories":707},[159],{"categories":709},[138],{"categories":711},[162],{"categories":713},[111],{"categories":715},[117],{"categories":717},[169],{"categories":719},[],{"categories":721},[],{"categories":723},[123],{"categories":725},[],{"categories":727},[117],{"categories":729},[],{"categories":731},[159],{"categories":733},[159],{"categories":735},[159],{"categories":737},[],{"categories":739},[],{"categories":741},[138],{"categories":743},[120],{"categories":745},[117],{"categories":747},[117],{"categories":749},[117],{"categories":751},[114],{"categories":753},[117],{"categories":755},[],{"categories":757},[169],{"categories":759},[169],{"categories":761},[114],{"categories":763},[],{"categories":765},[117],{"categories":767},[117],{"categories":769},[114],{"categories":
771},[138],{"categories":773},[176],{"categories":775},[120],{"categories":777},[],{"categories":779},[159],{"categories":781},[],{"categories":783},[117],{"categories":785},[],{"categories":787},[114],{"categories":789},[120],{"categories":791},[],{"categories":793},[275],{"categories":795},[162],{"categories":797},[169],{"categories":799},[176],{"categories":801},[169],{"categories":803},[120],{"categories":805},[],{"categories":807},[],{"categories":809},[120],{"categories":811},[111],{"categories":813},[120],{"categories":815},[123],{"categories":817},[114],{"categories":819},[],{"categories":821},[117],{"categories":823},[123],{"categories":825},[117],{"categories":827},[117],{"categories":829},[176],{"categories":831},[159],{"categories":833},[120],{"categories":835},[],{"categories":837},[],{"categories":839},[275],{"categories":841},[169],{"categories":843},[],{"categories":845},[120],{"categories":847},[117],{"categories":849},[159,117],{"categories":851},[111],{"categories":853},[],{"categories":855},[117],{"categories":857},[111],{"categories":859},[159],{"categories":861},[120],{"categories":863},[169],{"categories":865},[],{"categories":867},[117],{"categories":869},[],{"categories":871},[],{"categories":873},[111],{"categories":875},[],{"categories":877},[120],{"categories":879},[123],{"categories":881},[117],{"categories":883},[117],{"categories":885},[159],{"categories":887},[120],{"categories":889},[275],{"categories":891},[159],{"categories":893},[120],{"categories":895},[117],{"categories":897},[117],{"categories":899},[117],{"categories":901},[138],{"categories":903},[],{"categories":905},[123],{"categories":907},[120],{"categories":909},[159],{"categories":911},[120],{"categories":913},[169],{"categories":915},[159],{"categories":917},[120],{"categories":919},[138],{"categories":921},[],{"categories":923},[117],{"categories":925},[159],{"categories":927},[117],{"categories":929},[111],{"categories":931},[138],{"categories":933},[117],{"categorie
s":935},[176],{"categories":937},[117],{"categories":939},[117],{"categories":941},[120],{"categories":943},[120],{"categories":945},[117],{"categories":947},[120],{"categories":949},[159],{"categories":951},[117],{"categories":953},[],{"categories":955},[],{"categories":957},[169],{"categories":959},[],{"categories":961},[111],{"categories":963},[275],{"categories":965},[],{"categories":967},[111],{"categories":969},[114],{"categories":971},[176],{"categories":973},[],{"categories":975},[114],{"categories":977},[],{"categories":979},[],{"categories":981},[],{"categories":983},[],{"categories":985},[],{"categories":987},[117],{"categories":989},[120],{"categories":991},[275],{"categories":993},[111],{"categories":995},[117],{"categories":997},[169],{"categories":999},[123],{"categories":1001},[117],{"categories":1003},[176],{"categories":1005},[117],{"categories":1007},[117],{"categories":1009},[117],{"categories":1011},[117,111],{"categories":1013},[169],{"categories":1015},[169],{"categories":1017},[159],{"categories":1019},[117],{"categories":1021},[],{"categories":1023},[],{"categories":1025},[],{"categories":1027},[169],{"categories":1029},[162],{"categories":1031},[138],{"categories":1033},[159],{"categories":1035},[],{"categories":1037},[117],{"categories":1039},[117],{"categories":1041},[],{"categories":1043},[],{"categories":1045},[120],{"categories":1047},[117],{"categories":1049},[114],{"categories":1051},[],{"categories":1053},[111],{"categories":1055},[117],{"categories":1057},[111],{"categories":1059},[117],{"categories":1061},[169],{"categories":1063},[176],{"categories":1065},[117,159],{"categories":1067},[138],{"categories":1069},[159],{"categories":1071},[],{"categories":1073},[275],{"categories":1075},[159],{"categories":1077},[120],{"categories":1079},[],{"categories":1081},[],{"categories":1083},[],{"categories":1085},[],{"categories":1087},[169],{"categories":1089},[120],{"categories":1091},[120],{"categories":1093},[275],{"categories":1095},[1
17],{"categories":1097},[117],{"categories":1099},[117],{"categories":1101},[],{"categories":1103},[159],{"categories":1105},[],{"categories":1107},[],{"categories":1109},[120],{"categories":1111},[],{"categories":1113},[],{"categories":1115},[176],{"categories":1117},[176],{"categories":1119},[120],{"categories":1121},[],{"categories":1123},[117],{"categories":1125},[117],{"categories":1127},[169],{"categories":1129},[159],{"categories":1131},[159],{"categories":1133},[120],{"categories":1135},[111],{"categories":1137},[117],{"categories":1139},[159],{"categories":1141},[159],{"categories":1143},[120],{"categories":1145},[120],{"categories":1147},[117],{"categories":1149},[],{"categories":1151},[],{"categories":1153},[117],{"categories":1155},[120],{"categories":1157},[138],{"categories":1159},[169],{"categories":1161},[111],{"categories":1163},[117],{"categories":1165},[],{"categories":1167},[120],{"categories":1169},[120],{"categories":1171},[],{"categories":1173},[111],{"categories":1175},[117],{"categories":1177},[111],{"categories":1179},[111],{"categories":1181},[],{"categories":1183},[],{"categories":1185},[120],{"categories":1187},[120],{"categories":1189},[117],{"categories":1191},[117],{"categories":1193},[138],{"categories":1195},[162],{"categories":1197},[123],{"categories":1199},[138],{"categories":1201},[159],{"categories":1203},[],{"categories":1205},[138],{"categories":1207},[],{"categories":1209},[],{"categories":1211},[],{"categories":1213},[],{"categories":1215},[169],{"categories":1217},[162],{"categories":1219},[],{"categories":1221},[117],{"categories":1223},[117],{"categories":1225},[162],{"categories":1227},[169],{"categories":1229},[],{"categories":1231},[],{"categories":1233},[120],{"categories":1235},[138],{"categories":1237},[138],{"categories":1239},[120],{"categories":1241},[111],{"categories":1243},[117,275],{"categories":1245},[],{"categories":1247},[159],{"categories":1249},[111],{"categories":1251},[120],{"categories":1253},[159],{
"categories":1255},[],{"categories":1257},[120],{"categories":1259},[120],{"categories":1261},[117],{"categories":1263},[176],{"categories":1265},[169],{"categories":1267},[159],{"categories":1269},[],{"categories":1271},[120],{"categories":1273},[117],{"categories":1275},[120],{"categories":1277},[120],{"categories":1279},[120],{"categories":1281},[176],{"categories":1283},[120],{"categories":1285},[117],{"categories":1287},[],{"categories":1289},[176],{"categories":1291},[138],{"categories":1293},[120],{"categories":1295},[],{"categories":1297},[],{"categories":1299},[117],{"categories":1301},[120],{"categories":1303},[138],{"categories":1305},[120],{"categories":1307},[],{"categories":1309},[],{"categories":1311},[],{"categories":1313},[120],{"categories":1315},[],{"categories":1317},[],{"categories":1319},[162],{"categories":1321},[117],{"categories":1323},[162],{"categories":1325},[138],{"categories":1327},[117],{"categories":1329},[117],{"categories":1331},[120],{"categories":1333},[117],{"categories":1335},[],{"categories":1337},[],{"categories":1339},[275],{"categories":1341},[],{"categories":1343},[],{"categories":1345},[111],{"categories":1347},[],{"categories":1349},[],{"categories":1351},[],{"categories":1353},[],{"categories":1355},[169],{"categories":1357},[138],{"categories":1359},[176],{"categories":1361},[114],{"categories":1363},[117],{"categories":1365},[117],{"categories":1367},[114],{"categories":1369},[],{"categories":1371},[159],{"categories":1373},[120],{"categories":1375},[114],{"categories":1377},[117],{"categories":1379},[117],{"categories":1381},[111],{"categories":1383},[],{"categories":1385},[111],{"categories":1387},[117],{"categories":1389},[176],{"categories":1391},[120],{"categories":1393},[138],{"categories":1395},[114],{"categories":1397},[117],{"categories":1399},[120],{"categories":1401},[],{"categories":1403},[117],{"categories":1405},[111],{"categories":1407},[117],{"categories":1409},[],{"categories":1411},[138],{"categories"
:1413},[117],{"categories":1415},[],{"categories":1417},[114],{"categories":1419},[117],{"categories":1421},[],{"categories":1423},[],{"categories":1425},[],{"categories":1427},[117],{"categories":1429},[],{"categories":1431},[275],{"categories":1433},[117],{"categories":1435},[],{"categories":1437},[117],{"categories":1439},[117],{"categories":1441},[117],{"categories":1443},[117,275],{"categories":1445},[117],{"categories":1447},[117],{"categories":1449},[159],{"categories":1451},[120],{"categories":1453},[],{"categories":1455},[120],{"categories":1457},[117],{"categories":1459},[117],{"categories":1461},[117],{"categories":1463},[111],{"categories":1465},[111],{"categories":1467},[169],{"categories":1469},[159],{"categories":1471},[120],{"categories":1473},[],{"categories":1475},[117],{"categories":1477},[138],{"categories":1479},[117],{"categories":1481},[114],{"categories":1483},[],{"categories":1485},[275],{"categories":1487},[159],{"categories":1489},[159],{"categories":1491},[120],{"categories":1493},[138],{"categories":1495},[120],{"categories":1497},[117],{"categories":1499},[],{"categories":1501},[117],{"categories":1503},[],{"categories":1505},[],{"categories":1507},[117],{"categories":1509},[117],{"categories":1511},[117],{"categories":1513},[120],{"categories":1515},[117],{"categories":1517},[],{"categories":1519},[162],{"categories":1521},[120],{"categories":1523},[],{"categories":1525},[],{"categories":1527},[117],{"categories":1529},[138],{"categories":1531},[],{"categories":1533},[159],{"categories":1535},[275],{"categories":1537},[138],{"categories":1539},[169],{"categories":1541},[169],{"categories":1543},[138],{"categories":1545},[138],{"categories":1547},[275],{"categories":1549},[],{"categories":1551},[138],{"categories":1553},[117],{"categories":1555},[111],{"categories":1557},[138],{"categories":1559},[],{"categories":1561},[162],{"categories":1563},[138],{"categories":1565},[169],{"categories":1567},[138],{"categories":1569},[275],{"categor
ies":1571},[117],{"categories":1573},[117],{"categories":1575},[],{"categories":1577},[114],{"categories":1579},[],{"categories":1581},[],{"categories":1583},[117],{"categories":1585},[117],{"categories":1587},[117],{"categories":1589},[117],{"categories":1591},[],{"categories":1593},[162],{"categories":1595},[111],{"categories":1597},[],{"categories":1599},[117],{"categories":1601},[117],{"categories":1603},[275],{"categories":1605},[275],{"categories":1607},[],{"categories":1609},[120],{"categories":1611},[138],{"categories":1613},[138],{"categories":1615},[117],{"categories":1617},[120],{"categories":1619},[],{"categories":1621},[159],{"categories":1623},[117],{"categories":1625},[117],{"categories":1627},[],{"categories":1629},[],{"categories":1631},[275],{"categories":1633},[117],{"categories":1635},[169],{"categories":1637},[114],{"categories":1639},[117],{"categories":1641},[],{"categories":1643},[120],{"categories":1645},[111],{"categories":1647},[111],{"categories":1649},[],{"categories":1651},[117],{"categories":1653},[159],{"categories":1655},[120],{"categories":1657},[],{"categories":1659},[117],{"categories":1661},[117],{"categories":1663},[120],{"categories":1665},[],{"categories":1667},[120],{"categories":1669},[169],{"categories":1671},[],{"categories":1673},[117],{"categories":1675},[],{"categories":1677},[117],{"categories":1679},[],{"categories":1681},[117],{"categories":1683},[117],{"categories":1685},[],{"categories":1687},[117],{"categories":1689},[138],{"categories":1691},[117],{"categories":1693},[117],{"categories":1695},[111],{"categories":1697},[117],{"categories":1699},[138],{"categories":1701},[120],{"categories":1703},[],{"categories":1705},[117],{"categories":1707},[176],{"categories":1709},[],{"categories":1711},[],{"categories":1713},[],{"categories":1715},[111],{"categories":1717},[138],{"categories":1719},[120],{"categories":1721},[117],{"categories":1723},[159],{"categories":1725},[120],{"categories":1727},[],{"categories":1729},[
120],{"categories":1731},[],{"categories":1733},[117],{"categories":1735},[120],{"categories":1737},[117],{"categories":1739},[],{"categories":1741},[117],{"categories":1743},[117],{"categories":1745},[138],{"categories":1747},[159],{"categories":1749},[120],{"categories":1751},[159],{"categories":1753},[114],{"categories":1755},[],{"categories":1757},[],{"categories":1759},[117],{"categories":1761},[111],{"categories":1763},[138],{"categories":1765},[],{"categories":1767},[],{"categories":1769},[169],{"categories":1771},[159],{"categories":1773},[],{"categories":1775},[117],{"categories":1777},[],{"categories":1779},[176],{"categories":1781},[117],{"categories":1783},[275],{"categories":1785},[169],{"categories":1787},[],{"categories":1789},[120],{"categories":1791},[117],{"categories":1793},[120],{"categories":1795},[120],{"categories":1797},[117],{"categories":1799},[],{"categories":1801},[111],{"categories":1803},[117],{"categories":1805},[114],{"categories":1807},[169],{"categories":1809},[159],{"categories":1811},[],{"categories":1813},[],{"categories":1815},[],{"categories":1817},[120],{"categories":1819},[159],{"categories":1821},[138],{"categories":1823},[117],{"categories":1825},[138],{"categories":1827},[159],{"categories":1829},[],{"categories":1831},[159],{"categories":1833},[138],{"categories":1835},[114],{"categories":1837},[117],{"categories":1839},[138],{"categories":1841},[176],{"categories":1843},[],{"categories":1845},[],{"categories":1847},[162],{"categories":1849},[117,169],{"categories":1851},[138],{"categories":1853},[117],{"categories":1855},[120],{"categories":1857},[120],{"categories":1859},[117],{"categories":1861},[],{"categories":1863},[169],{"categories":1865},[117],{"categories":1867},[162],{"categories":1869},[120],{"categories":1871},[176],{"categories":1873},[275],{"categories":1875},[],{"categories":1877},[111],{"categories":1879},[120],{"categories":1881},[120],{"categories":1883},[169],{"categories":1885},[117],{"categories":188
7},[117],{"categories":1889},[],{"categories":1891},[],{"categories":1893},[],{"categories":1895},[275],{"categories":1897},[138],{"categories":1899},[117],{"categories":1901},[117],{"categories":1903},[117],{"categories":1905},[],{"categories":1907},[162],{"categories":1909},[114],{"categories":1911},[],{"categories":1913},[120],{"categories":1915},[275],{"categories":1917},[],{"categories":1919},[159],{"categories":1921},[159],{"categories":1923},[],{"categories":1925},[169],{"categories":1927},[159],{"categories":1929},[117],{"categories":1931},[],{"categories":1933},[138],{"categories":1935},[117],{"categories":1937},[159],{"categories":1939},[120],{"categories":1941},[138],{"categories":1943},[],{"categories":1945},[120],{"categories":1947},[159],{"categories":1949},[117],{"categories":1951},[],{"categories":1953},[117],{"categories":1955},[117],{"categories":1957},[275],{"categories":1959},[138],{"categories":1961},[162],{"categories":1963},[162],{"categories":1965},[],{"categories":1967},[],{"categories":1969},[],{"categories":1971},[120],{"categories":1973},[169],{"categories":1975},[169],{"categories":1977},[],{"categories":1979},[],{"categories":1981},[117],{"categories":1983},[],{"categories":1985},[120],{"categories":1987},[117],{"categories":1989},[],{"categories":1991},[117],{"categories":1993},[114],{"categories":1995},[117],{"categories":1997},[176],{"categories":1999},[120],{"categories":2001},[117],{"categories":2003},[169],{"categories":2005},[],{"categories":2007},[138],{"categories":2009},[120],{"categories":2011},[],{"categories":2013},[138],{"categories":2015},[120],{"categories":2017},[120],{"categories":2019},[],{"categories":2021},[114],{"categories":2023},[120],{"categories":2025},[],{"categories":2027},[117],{"categories":2029},[111],{"categories":2031},[138],{"categories":2033},[275],{"categories":2035},[120],{"categories":2037},[120],{"categories":2039},[111],{"categories":2041},[117],{"categories":2043},[],{"categories":2045},[],{"cate
gories":2047},[159],{"categories":2049},[117,114],{"categories":2051},[],{"categories":2053},[111],{"categories":2055},[162],{"categories":2057},[117],{"categories":2059},[169],{"categories":2061},[117],{"categories":2063},[120],{"categories":2065},[117],{"categories":2067},[117],{"categories":2069},[138],{"categories":2071},[120],{"categories":2073},[],{"categories":2075},[],{"categories":2077},[120],{"categories":2079},[117],{"categories":2081},[275],{"categories":2083},[],{"categories":2085},[117],{"categories":2087},[120],{"categories":2089},[],{"categories":2091},[117],{"categories":2093},[176],{"categories":2095},[162],{"categories":2097},[120],{"categories":2099},[117],{"categories":2101},[275],{"categories":2103},[],{"categories":2105},[117],{"categories":2107},[176],{"categories":2109},[159],{"categories":2111},[117],{"categories":2113},[],{"categories":2115},[176],{"categories":2117},[138],{"categories":2119},[117],{"categories":2121},[117],{"categories":2123},[111],{"categories":2125},[],{"categories":2127},[],{"categories":2129},[159],{"categories":2131},[117],{"categories":2133},[162],{"categories":2135},[176],{"categories":2137},[176],{"categories":2139},[138],{"categories":2141},[],{"categories":2143},[],{"categories":2145},[117],{"categories":2147},[],{"categories":2149},[117,169],{"categories":2151},[138],{"categories":2153},[120],{"categories":2155},[169],{"categories":2157},[117],{"categories":2159},[111],{"categories":2161},[],{"categories":2163},[],{"categories":2165},[111],{"categories":2167},[176],{"categories":2169},[117],{"categories":2171},[],{"categories":2173},[159,117],{"categories":2175},[275],{"categories":2177},[111],{"categories":2179},[],{"categories":2181},[114],{"categories":2183},[114],{"categories":2185},[117],{"categories":2187},[169],{"categories":2189},[120],{"categories":2191},[138],{"categories":2193},[176],{"categories":2195},[159],{"categories":2197},[117],{"categories":2199},[117],{"categories":2201},[117],{"categories":
2203},[111],{"categories":2205},[117],{"categories":2207},[120],{"categories":2209},[138],{"categories":2211},[],{"categories":2213},[],{"categories":2215},[162],{"categories":2217},[169],{"categories":2219},[117],{"categories":2221},[159],{"categories":2223},[162],{"categories":2225},[117],{"categories":2227},[117],{"categories":2229},[120],{"categories":2231},[120],{"categories":2233},[117,114],{"categories":2235},[],{"categories":2237},[159],{"categories":2239},[],{"categories":2241},[117],{"categories":2243},[138],{"categories":2245},[111],{"categories":2247},[111],{"categories":2249},[120],{"categories":2251},[117],{"categories":2253},[114],{"categories":2255},[169],{"categories":2257},[176],{"categories":2259},[],{"categories":2261},[138],{"categories":2263},[117],{"categories":2265},[117],{"categories":2267},[138],{"categories":2269},[169],{"categories":2271},[117],{"categories":2273},[120],{"categories":2275},[138],{"categories":2277},[117],{"categories":2279},[159],{"categories":2281},[117],{"categories":2283},[117],{"categories":2285},[275],{"categories":2287},[123],{"categories":2289},[120],{"categories":2291},[117],{"categories":2293},[138],{"categories":2295},[120],{"categories":2297},[176],{"categories":2299},[117],{"categories":2301},[],{"categories":2303},[117],{"categories":2305},[],{"categories":2307},[],{"categories":2309},[],{"categories":2311},[114],{"categories":2313},[117],{"categories":2315},[120],{"categories":2317},[138],{"categories":2319},[138],{"categories":2321},[138],{"categories":2323},[138],{"categories":2325},[],{"categories":2327},[111],{"categories":2329},[120],{"categories":2331},[138],{"categories":2333},[111],{"categories":2335},[120],{"categories":2337},[117],{"categories":2339},[117,120],{"categories":2341},[120],{"categories":2343},[275],{"categories":2345},[138],{"categories":2347},[138],{"categories":2349},[120],{"categories":2351},[117],{"categories":2353},[],{"categories":2355},[138],{"categories":2357},[176],{"categorie
s":2359},[111],{"categories":2361},[117],{"categories":2363},[117],{"categories":2365},[],{"categories":2367},[169],{"categories":2369},[],{"categories":2371},[111],{"categories":2373},[120],{"categories":2375},[138],{"categories":2377},[117],{"categories":2379},[138],{"categories":2381},[111],{"categories":2383},[138],{"categories":2385},[138],{"categories":2387},[],{"categories":2389},[114],{"categories":2391},[120],{"categories":2393},[138],{"categories":2395},[138],{"categories":2397},[138],{"categories":2399},[138],{"categories":2401},[138],{"categories":2403},[138],{"categories":2405},[138],{"categories":2407},[138],{"categories":2409},[138],{"categories":2411},[138],{"categories":2413},[162],{"categories":2415},[111],{"categories":2417},[117],{"categories":2419},[117],{"categories":2421},[],{"categories":2423},[117,111],{"categories":2425},[],{"categories":2427},[120],{"categories":2429},[138],{"categories":2431},[120],{"categories":2433},[117],{"categories":2435},[117],{"categories":2437},[117],{"categories":2439},[117],{"categories":2441},[117],{"categories":2443},[120],{"categories":2445},[114],{"categories":2447},[159],{"categories":2449},[138],{"categories":2451},[117],{"categories":2453},[],{"categories":2455},[],{"categories":2457},[120],{"categories":2459},[159],{"categories":2461},[117],{"categories":2463},[],{"categories":2465},[],{"categories":2467},[176],{"categories":2469},[117],{"categories":2471},[],{"categories":2473},[],{"categories":2475},[111],{"categories":2477},[114],{"categories":2479},[117],{"categories":2481},[114],{"categories":2483},[159],{"categories":2485},[],{"categories":2487},[138],{"categories":2489},[],{"categories":2491},[159],{"categories":2493},[117],{"categories":2495},[176],{"categories":2497},[],{"categories":2499},[176],{"categories":2501},[],{"categories":2503},[],{"categories":2505},[120],{"categories":2507},[],{"categories":2509},[114],{"categories":2511},[111],{"categories":2513},[159],{"categories":2515},[169],{"ca
tegories":2517},[],{"categories":2519},[],{"categories":2521},[117],{"categories":2523},[111],{"categories":2525},[176],{"categories":2527},[],{"categories":2529},[120],{"categories":2531},[120],{"categories":2533},[138],{"categories":2535},[117],{"categories":2537},[120],{"categories":2539},[117],{"categories":2541},[120],{"categories":2543},[117],{"categories":2545},[123],{"categories":2547},[138],{"categories":2549},[],{"categories":2551},[176],{"categories":2553},[169],{"categories":2555},[120],{"categories":2557},[],{"categories":2559},[117],{"categories":2561},[120],{"categories":2563},[114],{"categories":2565},[111],{"categories":2567},[117],{"categories":2569},[159],{"categories":2571},[169],{"categories":2573},[169],{"categories":2575},[117],{"categories":2577},[162],{"categories":2579},[117],{"categories":2581},[120],{"categories":2583},[114],{"categories":2585},[120],{"categories":2587},[117],{"categories":2589},[117],{"categories":2591},[120],{"categories":2593},[138],{"categories":2595},[],{"categories":2597},[111],{"categories":2599},[117],{"categories":2601},[120],{"categories":2603},[117],{"categories":2605},[117],{"categories":2607},[],{"categories":2609},[159],{"categories":2611},[114],{"categories":2613},[138],{"categories":2615},[117],{"categories":2617},[117],{"categories":2619},[159],{"categories":2621},[176],{"categories":2623},[162],{"categories":2625},[117],{"categories":2627},[138],{"categories":2629},[117],{"categories":2631},[120],{"categories":2633},[275],{"categories":2635},[117],{"categories":2637},[120],{"categories":2639},[162],{"categories":2641},[],{"categories":2643},[120],{"categories":2645},[169],{"categories":2647},[159],{"categories":2649},[117],{"categories":2651},[111],{"categories":2653},[114],{"categories":2655},[169],{"categories":2657},[],{"categories":2659},[120],{"categories":2661},[117],{"categories":2663},[],{"categories":2665},[138],{"categories":2667},[],{"categories":2669},[138],{"categories":2671},[117],{"categor
ies":2673},[120],{"categories":2675},[120],{"categories":2677},[120],{"categories":2679},[],{"categories":2681},[],{"categories":2683},[117],{"categories":2685},[117],{"categories":2687},[],{"categories":2689},[159],{"categories":2691},[120],{"categories":2693},[176],{"categories":2695},[111],{"categories":2697},[],{"categories":2699},[],{"categories":2701},[138],{"categories":2703},[169],{"categories":2705},[117],{"categories":2707},[117],{"categories":2709},[117],{"categories":2711},[169],{"categories":2713},[138],{"categories":2715},[159],{"categories":2717},[117],{"categories":2719},[117],{"categories":2721},[117],{"categories":2723},[138],{"categories":2725},[117],{"categories":2727},[138],{"categories":2729},[138],{"categories":2731},[120],{"categories":2733},[120],{"categories":2735},[169],{"categories":2737},[120],{"categories":2739},[117],{"categories":2741},[169],{"categories":2743},[159],{"categories":2745},[],{"categories":2747},[120],{"categories":2749},[],{"categories":2751},[],{"categories":2753},[],{"categories":2755},[114],{"categories":2757},[117],{"categories":2759},[120],{"categories":2761},[111],{"categories":2763},[120],{"categories":2765},[176],{"categories":2767},[],{"categories":2769},[120],{"categories":2771},[],{"categories":2773},[111],{"categories":2775},[120],{"categories":2777},[],{"categories":2779},[120],{"categories":2781},[117],{"categories":2783},[138],{"categories":2785},[117],{"categories":2787},[120],{"categories":2789},[138],{"categories":2791},[120],{"categories":2793},[169],{"categories":2795},[159],{"categories":2797},[111],{"categories":2799},[],{"categories":2801},[120],{"categories":2803},[159],{"categories":2805},[275],{"categories":2807},[138],{"categories":2809},[117],{"categories":2811},[159],{"categories":2813},[111],{"categories":2815},[],{"categories":2817},[120],{"categories":2819},[120],{"categories":2821},[117],{"categories":2823},[],{"categories":2825},[120],{"categories":2827},[123],{"categories":2829},[138],
{"categories":2831},[120],{"categories":2833},[114],{"categories":2835},[],{"categories":2837},[117],{"categories":2839},[123],{"categories":2841},[117],{"categories":2843},[120],{"categories":2845},[138],{"categories":2847},[111],{"categories":2849},[275],{"categories":2851},[117],{"categories":2853},[117],{"categories":2855},[117],{"categories":2857},[138],{"categories":2859},[114],{"categories":2861},[117],{"categories":2863},[159],{"categories":2865},[138],{"categories":2867},[275],{"categories":2869},[117],{"categories":2871},[],{"categories":2873},[],{"categories":2875},[275],{"categories":2877},[162],{"categories":2879},[120],{"categories":2881},[120],{"categories":2883},[138],{"categories":2885},[117],{"categories":2887},[111],{"categories":2889},[159],{"categories":2891},[120],{"categories":2893},[117],{"categories":2895},[176],{"categories":2897},[117],{"categories":2899},[120],{"categories":2901},[],{"categories":2903},[117],{"categories":2905},[117],{"categories":2907},[138],{"categories":2909},[111],{"categories":2911},[],{"categories":2913},[117],{"categories":2915},[117],{"categories":2917},[169],{"categories":2919},[159],{"categories":2921},[117,120],{"categories":2923},[176,114],{"categories":2925},[117],{"categories":2927},[],{"categories":2929},[120],{"categories":2931},[],{"categories":2933},[169],{"categories":2935},[],{"categories":2937},[117],{"categories":2939},[138],{"categories":2941},[],{"categories":2943},[120],{"categories":2945},[],{"categories":2947},[159],{"categories":2949},[120],{"categories":2951},[111],{"categories":2953},[120],{"categories":2955},[117],{"categories":2957},[275],{"categories":2959},[176],{"categories":2961},[114],{"categories":2963},[114],{"categories":2965},[111],{"categories":2967},[111],{"categories":2969},[117],{"categories":2971},[120],{"categories":2973},[117],{"categories":2975},[117],{"categories":2977},[111],{"categories":2979},[117],{"categories":2981},[176],{"categories":2983},[138],{"categories":2985},
[117],{"categories":2987},[120],{"categories":2989},[117],{"categories":2991},[],{"categories":2993},[169],{"categories":2995},[],{"categories":2997},[120],{"categories":2999},[111],{"categories":3001},[],{"categories":3003},[275],{"categories":3005},[117],{"categories":3007},[],{"categories":3009},[138],{"categories":3011},[120],{"categories":3013},[169],{"categories":3015},[117],{"categories":3017},[120],{"categories":3019},[169],{"categories":3021},[120],{"categories":3023},[138],{"categories":3025},[111],{"categories":3027},[138],{"categories":3029},[169],{"categories":3031},[117],{"categories":3033},[159],{"categories":3035},[117],{"categories":3037},[117],{"categories":3039},[117],{"categories":3041},[117],{"categories":3043},[120],{"categories":3045},[117],{"categories":3047},[120],{"categories":3049},[117],{"categories":3051},[111],{"categories":3053},[117],{"categories":3055},[120],{"categories":3057},[159],{"categories":3059},[111],{"categories":3061},[120],{"categories":3063},[159],{"categories":3065},[],{"categories":3067},[117],{"categories":3069},[117],{"categories":3071},[169],{"categories":3073},[],{"categories":3075},[120],{"categories":3077},[176],{"categories":3079},[117],{"categories":3081},[138],{"categories":3083},[176],{"categories":3085},[120],{"categories":3087},[114],{"categories":3089},[114],{"categories":3091},[117],{"categories":3093},[111],{"categories":3095},[],{"categories":3097},[117],{"categories":3099},[],{"categories":3101},[111],{"categories":3103},[117],{"categories":3105},[120],{"categories":3107},[120],{"categories":3109},[],{"categories":3111},[169],{"categories":3113},[169],{"categories":3115},[176],{"categories":3117},[159],{"categories":3119},[],{"categories":3121},[117],{"categories":3123},[111],{"categories":3125},[117],{"categories":3127},[169],{"categories":3129},[111],{"categories":3131},[138],{"categories":3133},[138],{"categories":3135},[],{"categories":3137},[138],{"categories":3139},[120],{"categories":3141},[159]
,{"categories":3143},[162],{"categories":3145},[117],{"categories":3147},[],{"categories":3149},[138],{"categories":3151},[169],{"categories":3153},[114],{"categories":3155},[117],{"categories":3157},[111],{"categories":3159},[275],{"categories":3161},[111],{"categories":3163},[],{"categories":3165},[],{"categories":3167},[138],{"categories":3169},[],{"categories":3171},[120],{"categories":3173},[120],{"categories":3175},[120],{"categories":3177},[],{"categories":3179},[117],{"categories":3181},[],{"categories":3183},[138],{"categories":3185},[111],{"categories":3187},[159],{"categories":3189},[117],{"categories":3191},[138],{"categories":3193},[138],{"categories":3195},[],{"categories":3197},[138],{"categories":3199},[111],{"categories":3201},[117],{"categories":3203},[],{"categories":3205},[120],{"categories":3207},[120],{"categories":3209},[111],{"categories":3211},[],{"categories":3213},[],{"categories":3215},[],{"categories":3217},[159],{"categories":3219},[120],{"categories":3221},[117],{"categories":3223},[],{"categories":3225},[],{"categories":3227},[],{"categories":3229},[159],{"categories":3231},[],{"categories":3233},[111],{"categories":3235},[],{"categories":3237},[],{"categories":3239},[159],{"categories":3241},[117],{"categories":3243},[138],{"categories":3245},[],{"categories":3247},[176],{"categories":3249},[138],{"categories":3251},[176],{"categories":3253},[117],{"categories":3255},[],{"categories":3257},[],{"categories":3259},[120],{"categories":3261},[],{"categories":3263},[],{"categories":3265},[120],{"categories":3267},[117],{"categories":3269},[],{"categories":3271},[120],{"categories":3273},[138],{"categories":3275},[176],{"categories":3277},[162],{"categories":3279},[120],{"categories":3281},[120],{"categories":3283},[],{"categories":3285},[],{"categories":3287},[],{"categories":3289},[138],{"categories":3291},[],{"categories":3293},[],{"categories":3295},[159],{"categories":3297},[111],{"categories":3299},[],{"categories":3301},[114],{"cate
gories":3303},[176],{"categories":3305},[117],{"categories":3307},[169],{"categories":3309},[111],{"categories":3311},[162],{"categories":3313},[114],{"categories":3315},[169],{"categories":3317},[],{"categories":3319},[],{"categories":3321},[120],{"categories":3323},[111],{"categories":3325},[159],{"categories":3327},[111],{"categories":3329},[120],{"categories":3331},[275],{"categories":3333},[120],{"categories":3335},[],{"categories":3337},[117],{"categories":3339},[138],{"categories":3341},[169],{"categories":3343},[],{"categories":3345},[159],{"categories":3347},[138],{"categories":3349},[111],{"categories":3351},[120],{"categories":3353},[117],{"categories":3355},[114],{"categories":3357},[120,275],{"categories":3359},[120],{"categories":3361},[169],{"categories":3363},[117],{"categories":3365},[162],{"categories":3367},[176],{"categories":3369},[120],{"categories":3371},[],{"categories":3373},[120],{"categories":3375},[117],{"categories":3377},[114],{"categories":3379},[],{"categories":3381},[],{"categories":3383},[117],{"categories":3385},[162],{"categories":3387},[117],{"categories":3389},[],{"categories":3391},[138],{"categories":3393},[],{"categories":3395},[138],{"categories":3397},[169],{"categories":3399},[120],{"categories":3401},[117],{"categories":3403},[176],{"categories":3405},[169],{"categories":3407},[],{"categories":3409},[138],{"categories":3411},[117],{"categories":3413},[],{"categories":3415},[117],{"categories":3417},[120],{"categories":3419},[117],{"categories":3421},[120],{"categories":3423},[117],{"categories":3425},[117],{"categories":3427},[117],{"categories":3429},[117],{"categories":3431},[114],{"categories":3433},[],{"categories":3435},[123],{"categories":3437},[138],{"categories":3439},[117],{"categories":3441},[],{"categories":3443},[169],{"categories":3445},[117],{"categories":3447},[117],{"categories":3449},[120],{"categories":3451},[138],{"categories":3453},[117],{"categories":3455},[117],{"categories":3457},[114],{"categories"
:3459},[120],{"categories":3461},[159],{"categories":3463},[],{"categories":3465},[162],{"categories":3467},[117],{"categories":3469},[],{"categories":3471},[138],{"categories":3473},[176],{"categories":3475},[],{"categories":3477},[],{"categories":3479},[138],{"categories":3481},[138],{"categories":3483},[176],{"categories":3485},[111],{"categories":3487},[120],{"categories":3489},[120],{"categories":3491},[117],{"categories":3493},[114],{"categories":3495},[],{"categories":3497},[],{"categories":3499},[138],{"categories":3501},[162],{"categories":3503},[169],{"categories":3505},[120],{"categories":3507},[159],{"categories":3509},[162],{"categories":3511},[162],{"categories":3513},[],{"categories":3515},[138],{"categories":3517},[117],{"categories":3519},[117],{"categories":3521},[169],{"categories":3523},[],{"categories":3525},[138],{"categories":3527},[138],{"categories":3529},[138],{"categories":3531},[],{"categories":3533},[120],{"categories":3535},[117],{"categories":3537},[],{"categories":3539},[111],{"categories":3541},[114],{"categories":3543},[],{"categories":3545},[117],{"categories":3547},[117],{"categories":3549},[],{"categories":3551},[169],{"categories":3553},[],{"categories":3555},[],{"categories":3557},[],{"categories":3559},[],{"categories":3561},[117],{"categories":3563},[138],{"categories":3565},[],{"categories":3567},[],{"categories":3569},[117],{"categories":3571},[117],{"categories":3573},[117],{"categories":3575},[162],{"categories":3577},[117],{"categories":3579},[162],{"categories":3581},[],{"categories":3583},[162],{"categories":3585},[162],{"categories":3587},[275],{"categories":3589},[120],{"categories":3591},[169],{"categories":3593},[],{"categories":3595},[],{"categories":3597},[162],{"categories":3599},[169],{"categories":3601},[169],{"categories":3603},[169],{"categories":3605},[],{"categories":3607},[111],{"categories":3609},[169],{"categories":3611},[169],{"categories":3613},[111],{"categories":3615},[169],{"categories":3617},[114]
,{"categories":3619},[169],{"categories":3621},[169],{"categories":3623},[169],{"categories":3625},[162],{"categories":3627},[138],{"categories":3629},[138],{"categories":3631},[117],{"categories":3633},[169],{"categories":3635},[162],{"categories":3637},[275],{"categories":3639},[162],{"categories":3641},[162],{"categories":3643},[162],{"categories":3645},[],{"categories":3647},[114],{"categories":3649},[],{"categories":3651},[275],{"categories":3653},[169],{"categories":3655},[169],{"categories":3657},[169],{"categories":3659},[120],{"categories":3661},[138,114],{"categories":3663},[162],{"categories":3665},[],{"categories":3667},[],{"categories":3669},[162],{"categories":3671},[],{"categories":3673},[162],{"categories":3675},[138],{"categories":3677},[120],{"categories":3679},[],{"categories":3681},[169],{"categories":3683},[117],{"categories":3685},[159],{"categories":3687},[],{"categories":3689},[117],{"categories":3691},[],{"categories":3693},[138],{"categories":3695},[111],{"categories":3697},[162],{"categories":3699},[],{"categories":3701},[169],{"categories":3703},[138],[3705,3777,3854,4061],{"id":3706,"title":3707,"ai":3708,"body":3713,"categories":3749,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":3750,"navigation":89,"path":3765,"published_at":3766,"question":59,"scraped_at":3767,"seo":3768,"sitemap":3769,"source_id":3770,"source_name":96,"source_type":97,"source_url":3771,"stem":3772,"tags":3773,"thumbnail_url":59,"tldr":3774,"tweet":59,"unknown_tags":3775,"__hash__":3776},"summaries\u002Fsummaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary.md","Qwen-Scope SAEs Unlock Actionable LLM 
Internals",{"provider":7,"model":8,"input_tokens":3709,"output_tokens":3710,"processing_time_ms":3711,"cost_usd":3712},8913,1989,15174,0.0027546,{"type":14,"value":3714,"toc":3743},[3715,3719,3722,3726,3729,3733,3736,3740],[17,3716,3718],{"id":3717},"sae-decomposition-reveals-interpretable-llm-features","SAE Decomposition Reveals Interpretable LLM Features",[22,3720,3721],{},"Sparse autoencoders (SAEs) translate high-dimensional LLM activations into sparse latent features, each corresponding to concepts like languages or behaviors. For Qwen3 and Qwen3.5 models, Qwen-Scope releases 14 SAE groups across 7 variants: dense models (1.7B, 8B, 2B, 9B, 27B) and MoE (30B-A3B, 35B-A3B). SAEs train per layer on residual streams, using top-k (k=50 or 100) activations; dense models use a 16x hidden-size expansion, while MoE models use 32K (16x) or 128K (64x) widths. Except Qwen3.5-27B (instruct), all use base checkpoints. This layer-wise dictionary enables diagnosis of issues like language mixing or repetition without weight changes.",[17,3723,3725],{"id":3724},"steer-outputs-and-classify-via-feature-interventions","Steer Outputs and Classify via Feature Interventions",[22,3727,3728],{},"Apply steering with h' = h + αd to amplify\u002Fsuppress features: suppress the Chinese feature (ID 6159) to stop English prompts from mixing languages; activate the classical-Chinese feature (ID 36398) for stylistic shifts. For toxicity, build classifiers from features that fire more on toxic data—an OR-rule yields F1>0.90 on English for the 1.7B\u002F8B models; English features transfer cross-lingually (stronger to Russian\u002FFrench, weaker to Arabic\u002FChinese), retaining 99% performance with 10% discovery data.
 These zero-shot methods cut compute needs versus full evals or training classifier heads.",[17,3730,3732],{"id":3731},"proxy-benchmark-analysis-without-model-runs","Proxy Benchmark Analysis Without Model Runs",[22,3734,3735],{},"SAE features act as micro-capabilities for evaluation: a redundancy metric computed from activation overlap correlates (ρ≈0.85) with performance-based redundancy on 17 benchmarks (MMLU, GSM8K, MATH, etc.); GSM8K shares 63% of its features with MATH, allowing safe omission. Pairwise overlap, partialed by MMLU, correlates 75.5% with capability similarity—retain low-overlap benchmarks and consolidate high-overlap ones to streamline suites without forward passes.",[17,3737,3739],{"id":3738},"augment-training-with-feature-driven-signals","Augment Training with Feature-Driven Signals",[22,3741,3742],{},"For SFT, Sparse Autoencoder-guided SFT (SASFT) suppresses non-target-language features via an auxiliary loss, cutting code-switching by >50% across Gemma-2\u002FLlama-3.1\u002FQwen3 on Chinese\u002FRussian\u002FKorean (full elimination in cases like Qwen3-1.7B Korean) while preserving multilingual benchmarks. For RL, synthetically generate repetitive outputs via feature steering and use them as rare negatives in DAPO, sharply reducing repetition in the 1.7B\u002F8B\u002F30B-A3B models. Safety synthesis targets missing features: 4k pairs cover 99.74% of features (vs. 
lower for random), boosting accuracy to 77.75% when mixed 1:1 with real data—matching 120k real-only under budget.",{"title":52,"searchDepth":53,"depth":53,"links":3744},[3745,3746,3747,3748],{"id":3717,"depth":53,"text":3718},{"id":3724,"depth":53,"text":3725},{"id":3731,"depth":53,"text":3732},{"id":3738,"depth":53,"text":3739},[117],{"content_references":3751,"triage":3761},[3752,3755,3758],{"type":65,"title":3753,"url":3754,"context":68},"Qwen Scope","https:\u002F\u002Fqianwen-res.oss-accelerate.aliyuncs.com\u002Fqwen-scope\u002FQwen_Scope.pdf",{"type":81,"title":3756,"url":3757,"context":68},"Qwen-Scope Weights","https:\u002F\u002Fhuggingface.co\u002Fcollections\u002FQwen\u002Fqwen-scope",{"type":74,"title":3759,"url":3760,"context":68},"Qwen-Scope Technical Details","https:\u002F\u002Fqwen.ai\u002Fblog?id=qwen-scope",{"relevance":3762,"novelty":85,"quality":85,"actionability":85,"composite":3763,"reasoning":3764},5,4.35,"Category: AI & LLMs. The article provides in-depth insights into Qwen-Scope's sparse autoencoders, which are practical tools for developers working with LLMs, addressing specific pain points like feature interpretation and output steering. 
It offers actionable techniques for applying these features in real-world scenarios, such as toxicity classification and training optimizations.","\u002Fsummaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary","2026-05-01 08:25:21","2026-05-03 17:01:52",{"title":3707,"description":52},{"loc":3765},"dda195cde5fb0456","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fqwen-ai-releases-qwen-scope-an-open-source-sparse-autoencoders-sae-suite-that-turns-llm-internal-features-into-practical-development-tools\u002F","summaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary",[101,104,102,103],"Qwen-Scope's open SAEs on 7 Qwen models decompose activations into interpretable features for steering outputs, proxy benchmark analysis (ρ=0.85 correlation), toxicity classification (F1>0.90), and training fixes like 50% code-switching reduction.",[],"zbictEOZXC-EHp6nI5NAS1Np-cHfHzWO9BF_YlaGEmc",{"id":3778,"title":3779,"ai":3780,"body":3785,"categories":3818,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":3819,"navigation":89,"path":3842,"published_at":3843,"question":59,"scraped_at":3844,"seo":3845,"sitemap":3846,"source_id":3847,"source_name":96,"source_type":97,"source_url":3848,"stem":3849,"tags":3850,"thumbnail_url":59,"tldr":3851,"tweet":59,"unknown_tags":3852,"__hash__":3853},"summaries\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary.md","AntAngelMed: 103B MoE Medical LLM Matches 40B Dense at 7x Speed",{"provider":7,"model":8,"input_tokens":3781,"output_tokens":3782,"processing_time_ms":3783,"cost_usd":3784},8023,3168,43093,0.00316595,{"type":14,"value":3786,"toc":3813},[3787,3791,3794,3798,3801,3805],[17,3788,3790],{"id":3789},"sparse-moe-delivers-massive-capacity-at-low-compute","Sparse MoE Delivers Massive Capacity at Low Compute",[22,3792,3793],{},"AntAngelMed packs 103B total 
parameters into a 1\u002F32 activation-ratio Mixture-of-Experts (MoE) architecture, activating just 6.1B params per inference to match performance of ~40B dense models while achieving up to 7x efficiency over equivalently sized dense setups—speed advantages grow further with longer outputs. MoE works by routing inputs to a subset of 'expert' sub-networks instead of using all params per token, scaling knowledge without proportional compute hikes. Builds on Ling-flash-2.0 base via Ling Scaling Laws, with refinements like finer expert granularity, optimized shared expert ratio, attention balancing, auxiliary-loss-free sigmoid routing, Multi-Token Prediction (MTP) layer, QK-Norm, and Partial-RoPE (subset of attention heads). On H20 GPUs, hits >200 tokens\u002Fsecond (3x a 36B dense model), extends to 128K context via YaRN for full clinical docs or multi-turn dialogues. FP8 quantization + EAGLE3 speculative decoding yields 71% HumanEval uplift, 45% GSM8K, 94% Math-500 at 32 concurrency, stabilizing throughput for coding\u002Fmath proxies.",[17,3795,3797],{"id":3796},"three-stage-training-infuses-medical-depth","Three-Stage Training Infuses Medical Depth",[22,3799,3800],{},"Layer general reasoning atop medical specialization through: (1) Continual pre-training on vast medical corpora—encyclopedias, web text, papers—from Ling-flash-2.0 checkpoint; (2) Supervised Fine-Tuning (SFT) on mixed instructions preserving chain-of-thought via math\u002Fcoding\u002Flogic tasks alongside doctor-patient Q&A, diagnostics, ethics\u002Fsafety; (3) GRPO Reinforcement Learning (lighter PPO variant estimating baselines from group scores, per DeepSeekMath paper) with rewards targeting empathy, structured clinical outputs, safety, evidence-based reasoning to slash hallucinations. 
This progression embeds domain expertise without eroding broad capabilities.",[17,3802,3804],{"id":3803},"leads-benchmarks-deploys-easily-open-source","Leads Benchmarks, Deploys Easily Open-Source",[22,3806,3807,3808,3812],{},"Tops HealthBench (OpenAI's multi-turn clinical dialogues): #1 open-source, beats proprietary models, widest margin on HealthBench-Hard. Dominates MedAIBench (China Nat’l AI Medical Facility): elite in knowledge Q&A\u002Fethics-safety. #1 overall MedBench (36 datasets, ~700K samples across knowledge QA, understanding, generation, complex reasoning, safety\u002Fethics). Apache 2.0 weights (HuggingFace: MedAIBase\u002FAntAngelMed), MIT code (GitHub: MedAIBase\u002FAntAngelMed). Transformers load: ",[3809,3810,3811],"code",{},"AutoModelForCausalLM.from_pretrained(\"MedAIBase\u002FAntAngelMed\", device_map=\"auto\", trust_remote_code=True)",". Runs on vLLM v0.11.0 (4-GPU tensor parallel), SGLang+FlashAttention-3, vLLM-Ascend (Huawei 910B NPUs). From Health Information Center of Zhejiang Province, Ant Healthcare, Zhejiang Anzhen’er Medical AI Technology Co., Ltd.",{"title":52,"searchDepth":53,"depth":53,"links":3814},[3815,3816,3817],{"id":3789,"depth":53,"text":3790},{"id":3796,"depth":53,"text":3797},{"id":3803,"depth":53,"text":3804},[],{"content_references":3820,"triage":3839},[3821,3824,3827,3830,3834,3837],{"type":65,"title":3822,"url":3823,"context":83},"DeepSeekMath","https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.03300",{"type":70,"title":3825,"url":3826,"context":68},"AntAngelMed","https:\u002F\u002Fhuggingface.co\u002FMedAIBase\u002FAntAngelMed",{"type":70,"title":3828,"url":3829,"context":68},"AntAngelMed GitHub 
Repo","https:\u002F\u002Fgithub.com\u002FMedAIBase\u002FAntAngelMed",{"type":74,"title":3831,"author":3832,"context":3833},"Ling-flash-2.0","inclusionAI","mentioned",{"type":81,"title":3835,"author":3836,"context":83},"HealthBench","OpenAI",{"type":81,"title":3838,"context":83},"MedBench",{"relevance":86,"novelty":85,"quality":85,"actionability":53,"composite":3840,"reasoning":3841},3.25,"Category: AI & LLMs. The article discusses a new medical LLM that showcases innovative architecture and efficiency, which is relevant to AI product builders. However, it lacks specific actionable insights or frameworks that the audience could directly implement in their projects.","\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary","2026-05-12 21:21:47","2026-05-13 12:00:59",{"title":3779,"description":52},{"loc":3842},"07f85059ce2b1c55","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Fmeet-antangelmed-a-103b-parameter-open-source-medical-language-model-built-on-a-1-32-activation-ratio-moe-architecture\u002F","summaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary",[101,102,104],"103B-param open-source medical LLM activates only 6.1B params via 1\u002F32 MoE, rivals 40B dense models with 7x efficiency, tops HealthBench\u002FMedBench, runs 200+ tps on H20.",[],"BMkdtRqd6qJuSshJwJCoVJVxaHNukE4u3QyIRxxvstU",{"id":3855,"title":3856,"ai":3857,"body":3862,"categories":4031,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4032,"navigation":89,"path":4049,"published_at":4050,"question":59,"scraped_at":4051,"seo":4052,"sitemap":4053,"source_id":4054,"source_name":96,"source_type":97,"source_url":4055,"stem":4056,"tags":4057,"thumbnail_url":59,"tldr":4058,"tweet":59,"unknown_tags":4059,"__hash__":4060},"summaries\u002Fsummaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary.md","Star 
Elastic: Pack 30B\u002F23B\u002F12B Models in One Checkpoint",{"provider":7,"model":8,"input_tokens":3858,"output_tokens":3859,"processing_time_ms":3860,"cost_usd":3861},9074,2939,32047,0.0032618,{"type":14,"value":3863,"toc":4025},[3864,3868,3871,3874,3878,3885,3888,3892,3895,3963,3967,3970,3973,4014,4021],[17,3865,3867],{"id":3866},"nested-weight-sharing-compresses-multiple-sizes-into-one-checkpoint","Nested Weight-Sharing Compresses Multiple Sizes into One Checkpoint",[22,3869,3870],{},"Train one 30B hybrid Mamba-Transformer-MoE parent model on 160B tokens to embed smaller 23B and 12B submodels as contiguous subsets of its highest-importance components. Rank embedding channels, attention heads, Mamba SSM heads, MoE experts, and FFN channels by contribution to accuracy using Router-Weighted Expert Activation Pruning (REAP), which weighs routing gates and output magnitudes over naive frequency pruning. A learnable end-to-end router takes a target budget (e.g., 2.8B active params) as one-hot input, outputs differentiable masks via Gumbel-Softmax, and trains jointly with knowledge distillation from the parent—penalizing budget deviations while maximizing accuracy. Use a two-stage curriculum: short-context (8K tokens, uniform budgets) then long-context (49K tokens, p(30B)=0.5, p(23B)=0.3, p(12B)=0.2), boosting AIME-2025 scores by up to 19.8% on smaller variants. 
Width compression (reducing dims\u002Fheads\u002Fexperts) recovers 98.1% baseline performance versus 95.2% for depth (layer dropping), so prioritize width for reasoning tasks.",[22,3872,3873],{},"This yields 360x fewer tokens than separate pretraining and 7x over sequential distillation, with all variants zero-shot slicable from one 58.9 GB BF16 checkpoint—versus 126.1 GB for independents.",[17,3875,3877],{"id":3876},"phase-specific-sizing-optimizes-reasoning-accuracy-latency","Phase-Specific Sizing Optimizes Reasoning Accuracy-Latency",[22,3879,3880,3881],{},"Ditch fixed-model token caps in ",[3882,3883,3884],"think",{}," phases: assign smaller nested models (e.g., 23B) to high-volume reasoning traces and larger (30B) to precise final answers in ℳS → ℳL configs. The 23B→30B setup beats Nemotron Nano v3 defaults by 16% accuracy at 1.9x lower latency, as reasoning tolerates capacity cuts but answers demand precision. Elastic-23B hits 85.63 on AIME-2025 (vs. Qwen3-30B-A3B's 80.00), matching or exceeding same-size independents on GPQA, LiveCodeBench v5, MMLU-Pro, IFBench, Tau Bench.",[22,3886,3887],{},"12B runs 2.4x throughput of 30B on H100 at BF16; NVFP4 12B hits 7,426 tokens\u002Fs (3.4x) on RTX Pro 6000.",[17,3889,3891],{"id":3890},"quantization-preserves-nesting-for-edge-deployment","Quantization Preserves Nesting for Edge Deployment",[22,3893,3894],{},"Apply Quantization-Aware Distillation (QAD) on the elastic checkpoint to maintain zero-shot slicing post-quant. FP8 PTQ recovers 98.69% BF16 accuracy on 30B; NVFP4 PTQ drops 4.12% but QAD (~5B tokens, 48K context) hits 97.79%. Single NVFP4 checkpoint: 18.7 GB (30B), enabling 12B\u002F8 GB on RTX 5080 (BF16 OOMs). 
Memory table:",[3896,3897,3898,3917],"table",{},[3899,3900,3901],"thead",{},[3902,3903,3904,3908,3911,3914],"tr",{},[3905,3906,3907],"th",{},"Variant",[3905,3909,3910],{},"30B",[3905,3912,3913],{},"23B",[3905,3915,3916],{},"12B",[3918,3919,3920,3935,3949],"tbody",{},[3902,3921,3922,3926,3929,3932],{},[3923,3924,3925],"td",{},"BF16",[3923,3927,3928],{},"58.9 GB",[3923,3930,3931],{},"44.0 GB",[3923,3933,3934],{},"23.2 GB",[3902,3936,3937,3940,3943,3946],{},[3923,3938,3939],{},"FP8",[3923,3941,3942],{},"31.4 GB",[3923,3944,3945],{},"23.7 GB",[3923,3947,3948],{},"13.0 GB",[3902,3950,3951,3954,3957,3960],{},[3923,3952,3953],{},"NVFP4",[3923,3955,3956],{},"18.7 GB",[3923,3958,3959],{},"14.1 GB",[3923,3961,3962],{},"8.0 GB",[17,3964,3966],{"id":3965},"load-and-serve-with-transformers-or-vllm","Load and Serve with Transformers or vLLM",[22,3968,3969],{},"Grab from HF: nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-{BF16|FP8|NVFP4}. Use trust_remote_code=True for hybrid arch.",[22,3971,3972],{},"Transformers example:",[3974,3975,3979],"pre",{"className":3976,"code":3977,"language":3978,"meta":52,"style":52},"language-python shiki shiki-themes github-light github-dark","from transformers import AutoTokenizer, AutoModelForCausalLM\nimport torch\nmodel_id = \"nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16\"\ntokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)\nmodel = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map=\"auto\")\n# Generate with max_new_tokens=4096 for \u003Cthink> + answer\n","python",[3809,3980,3981,3988,3993,3998,4003,4008],{"__ignoreMap":52},[36,3982,3985],{"class":3983,"line":3984},"line",1,[36,3986,3987],{},"from transformers import AutoTokenizer, AutoModelForCausalLM\n",[36,3989,3990],{"class":3983,"line":53},[36,3991,3992],{},"import torch\n",[36,3994,3995],{"class":3983,"line":86},[36,3996,3997],{},"model_id = 
\"nvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16\"\n",[36,3999,4000],{"class":3983,"line":85},[36,4001,4002],{},"tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)\n",[36,4004,4005],{"class":3983,"line":3762},[36,4006,4007],{},"model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map=\"auto\")\n",[36,4009,4011],{"class":3983,"line":4010},6,[36,4012,4013],{},"# Generate with max_new_tokens=4096 for \u003Cthink> + answer\n",[22,4015,4016,4017,4020],{},"vLLM for prod: ",[3809,4018,4019],{},"vllm serve \u003Cmodel_id>"," (OpenAI API compat), or Docker\u002FSGLang. Query via curl with max_tokens=4096, temperature=0.6.",[4022,4023,4024],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":52,"searchDepth":53,"depth":53,"links":4026},[4027,4028,4029,4030],{"id":3866,"depth":53,"text":3867},{"id":3876,"depth":53,"text":3877},{"id":3890,"depth":53,"text":3891},{"id":3965,"depth":53,"text":3966},[],{"content_references":4033,"triage":4046},[4034,4037,4040,4043],{"type":65,"title":4035,"url":4036,"context":68},"Star 
Elastic","https:\u002F\u002Fcas-bridge.xethub.hf.co\u002Fxet-bridge-us\u002F69cd91b34a304b3afe4ceaa4\u002Fcedbede2a32a1757cd46b5ce6edbe0934f2c8437f61509d8f63aae86f96b43cb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=cas%2F20260509%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260509T212853Z&X-Amz-Expires=3600&X-Amz-Signature=a776c3adc5cd45d923a82950ea17eefb271caf85b0586ff79855f575381030a7&X-Amz-SignedHeaders=host&X-Xet-Cas-Uid=689a286d51b587fe5035c19f&response-content-disposition=inline%3B+filename*%3DUTF-8%27%27star_elastic_arxiv.pdf%3B+filename%3D%22star_elastic_arxiv.pdf%22%3B&response-content-type=application%2Fpdf&x-amz-checksum-mode=ENABLED&x-id=GetObject&Expires=1778365733&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc3ODM2NTczM319LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2FzLWJyaWRnZS54ZXRodWIuaGYuY28veGV0LWJyaWRnZS11cy82OWNkOTFiMzRhMzA0YjNhZmU0Y2VhYTQvY2VkYmVkZTJhMzJhMTc1N2NkNDZiNWNlNmVkYmUwOTM0ZjJjODQzN2Y2MTUwOWQ4ZjYzYWFlODZmOTZiNDNjYioifV19&Signature=fpq%7EPKyILz2ZDcwgCMn%7EsYfSySqpZ5Fr-A3MXBBG94lfu6bTv6y63ejTUL16B8v03HIJyKwrdGgHoYAQr88iQ05qS%7EoIszdd0eU2dfem3CVxM-t3e8rIo4-i4OTBjP2oPAMjCqmwzcC6uPG3Xqm-3Tiq5IfrsDFSKSUPZavMI6nU%7EBBpxd-i-L3C4-4v80nzJWfkHZiKb0EHr3PN8CRlA6In1X2-tH3dXBm0GM0j83%7EBtcclb-4C18vdpfEuvEaKOf0tMxsf5zI0acMPdCJxnVatq%7EgZwixiF%7E53DxgPc94Pb93zl0TVTcLH4%7ExH8yi7Xj9YYjdMKB634Q1GeapoJA__&Key-Pair-Id=K2L8F4GPSG1IFC",{"type":70,"title":4038,"url":4039,"context":68},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-BF16",{"type":70,"title":4041,"url":4042,"context":68},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-FP8","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-FP8",{"type":70,"title":4044,"url":4045,"context":68},"NVIDIA-Nemotron-Labs-3-Elastic-30B-A3B-NVFP4","https:\u002F\u002Fhuggingface.co\u002Fnvidia\u002FNVIDIA-Nemotron-Labs-3-Elastic-30B-
A3B-NVFP4",{"relevance":86,"novelty":86,"quality":85,"actionability":53,"composite":4047,"reasoning":4048},3.05,"Category: AI & LLMs. The article discusses a new model architecture from NVIDIA that could be relevant for developers looking to integrate advanced AI models into their products. However, while it provides technical details, it lacks practical steps or frameworks that the audience could directly apply in their work.","\u002Fsummaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary","2026-05-09 22:24:23","2026-05-10 15:26:52",{"title":3856,"description":52},{"loc":4049},"2d4fed29fea91900","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F09\u002Fnvidia-ai-releases-star-elastic-one-checkpoint-that-contains-30b-23b-and-12b-reasoning-models-with-zero-shot-slicing\u002F","summaries\u002F2d4fed29fea91900-star-elastic-pack-30b-23b-12b-models-in-one-checkp-summary",[101,103,104],"NVIDIA's Star Elastic embeds nested 30B (3.6B active), 23B (2.8B), and 12B (2.0B) reasoning models in a single checkpoint via importance-ranked weight-sharing, slashing training costs 360x and enabling phase-specific sizing for 16% accuracy gains at 1.9x lower latency.",[],"MmEv9MTKlBfvzKFMrwhf1uWOYr3g3Xhj2RLeYFKTfm8",{"id":4062,"title":4063,"ai":4064,"body":4069,"categories":4105,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4106,"navigation":89,"path":4119,"published_at":4120,"question":59,"scraped_at":4121,"seo":4122,"sitemap":4123,"source_id":4124,"source_name":4125,"source_type":97,"source_url":4126,"stem":4127,"tags":4128,"thumbnail_url":59,"tldr":4129,"tweet":59,"unknown_tags":4130,"__hash__":4131},"summaries\u002Fsummaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m--summary.md","Sovereign AI Grounds Robotics in Physics for 1.1M 
States\u002FSec",{"provider":7,"model":8,"input_tokens":4065,"output_tokens":4066,"processing_time_ms":4067,"cost_usd":4068},4417,1733,23053,0.0017274,{"type":14,"value":4070,"toc":4099},[4071,4075,4078,4082,4085,4089,4092,4096],[17,4072,4074],{"id":4073},"build-sub-millisecond-robotics-control-with-jax-tpu-v6","Build Sub-Millisecond Robotics Control with JAX + TPU v6",[22,4076,4077],{},"To overcome reinforcement learning's brittleness in real-world chaos, Sovereign AI leverages JAX 0.9.0+ on Google's TPU v6 Trillium for extreme speed: over 1.1 million states per second at 0.894 ms latency. This ensures a 22-DoF humanoid robot processes decisions faster than its actuators move, preventing delays that cause falls. Implement by running the full notebook on GitHub (frank-morales2020\u002FMLxDL), which integrates hardware acceleration for latent space computations without simulation pitfalls.",[17,4079,4081],{"id":4080},"anchor-predictions-to-physics-laws-via-jepa-for-47x-failure-sensitivity","Anchor Predictions to Physics Laws via JEPA for 4.7x Failure Sensitivity",[22,4083,4084],{},"Joint Embedding Predictive Architecture (JEPA) operates in a physics-informed latent space, using a Physics Anchor to monitor energy patterns. Detect anomalies by thresholding: energy loss of 8.5467 signals motor seizure (failure), while expansion of 4.8101 indicates intentional momentum for maneuvers like sideways slides. This delivers 4.7x greater sensitivity over traditional methods, grounding neural predictions in conservation laws so AI distinguishes planned actions from disasters in real time.",[17,4086,4088],{"id":4087},"gain-auditability-and-recovery-with-gemini-31-pro-oversight","Gain Auditability and Recovery with Gemini 3.1 Pro Oversight",[22,4090,4091],{},"Feed JEPA's abstract metrics into Gemini 3.1 Pro's Deep Thinking mode as the executive controller. It translates spikes into human-readable reports, diagnosing joint failures or sensor glitches, then outputs recovery plans. 
This Sovereign Return on Investment (SROI) enables full energy expenditure audits, making decisions transparent and recoverable rather than black-box guesses.",[17,4093,4095],{"id":4094},"slash-bandwidth-797-for-6g-scale-autonomy-with-semantic-compression","Slash Bandwidth 79.7% for 6G-Scale Autonomy with Semantic Compression",[22,4097,4098],{},"Compress data to transmit only semantic meaning, not raw sensors, yielding 79.7% bandwidth savings. For 6G networks, this sustains high-fidelity autonomy in bandwidth-constrained environments, ensuring reliable physical-world deployment without overwhelming infrastructure.",{"title":52,"searchDepth":53,"depth":53,"links":4100},[4101,4102,4103,4104],{"id":4073,"depth":53,"text":4074},{"id":4080,"depth":53,"text":4081},{"id":4087,"depth":53,"text":4088},{"id":4094,"depth":53,"text":4095},[117],{"content_references":4107,"triage":4117},[4108,4111,4113,4115],{"type":70,"title":4109,"url":4110,"context":3833},"MLxDL (GEMINI_TPU.ipynb)","https:\u002F\u002Fgithub.com\u002Ffrank-morales2020\u002FMLxDL\u002Fblob\u002Fmain\u002FGEMINI_TPU.ipynb",{"type":70,"title":4112,"context":3833},"JAX 0.9.0+",{"type":70,"title":4114,"context":3833},"TPU v6 Trillium",{"type":70,"title":4116,"context":3833},"Gemini 3.1 Pro",{"relevance":3762,"novelty":85,"quality":85,"actionability":85,"composite":3763,"reasoning":4118},"Category: AI & LLMs. The article provides in-depth insights into using AI for robotics control, addressing practical applications like real-time decision-making and failure detection, which are crucial for product builders. 
It includes specific frameworks and tools like JAX and JEPA, making it actionable for developers looking to implement these techniques.","\u002Fsummaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m-summary","2026-05-08 15:34:13","2026-05-09 15:36:56",{"title":4063,"description":52},{"loc":4119},"9c05119c3bd0f686","AI Simplified in Plain English","https:\u002F\u002Fmedium.com\u002Fai-simplified-in-plain-english\u002Fsovereign-ai-bridging-the-gap-between-neural-logic-and-physical-reality-27847c54ddbc?source=rss----f37ab7d4e76b---4","summaries\u002F9c05119c3bd0f686-sovereign-ai-grounds-robotics-in-physics-for-1-1m--summary",[101,104,103],"Sovereign AI uses JEPA with physics anchors on JAX\u002FTPU v6 to process 1.1M states\u002Fsec at 0.894ms latency, detecting failures 4.7x better via energy patterns, with Gemini 3.1 Pro generating auditable reports and recovery plans.",[],"S_G2pfMpHvfDy7cXXXBE5nN5ar3Jtvpq5EuPS2bYuY8"]