[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-9b169e39b5c1f580-twell-delivers-20-llm-speedups-via-gpu-optimized-s-summary":3,"summaries-facets-categories":227,"summary-related-9b169e39b5c1f580-twell-delivers-20-llm-speedups-via-gpu-optimized-s-summary":3797},{"id":4,"title":5,"ai":6,"body":13,"categories":182,"created_at":183,"date_modified":183,"description":176,"extension":184,"faq":183,"featured":185,"kicker_label":183,"meta":186,"navigation":208,"path":209,"published_at":210,"question":183,"scraped_at":211,"seo":212,"sitemap":213,"source_id":214,"source_name":215,"source_type":216,"source_url":217,"stem":218,"tags":219,"thumbnail_url":183,"tldr":224,"tweet":183,"unknown_tags":225,"__hash__":226},"summaries\u002Fsummaries\u002F9b169e39b5c1f580-twell-delivers-20-llm-speedups-via-gpu-optimized-s-summary.md","TwELL Delivers 20% LLM Speedups via GPU-Optimized Sparsity",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",9210,2209,23408,0.00243965,{"type":14,"value":15,"toc":175},"minimark",[16,21,25,28,31,34,38,41,44,47,51,54,166,169,172],[17,18,20],"h2",{"id":19},"twell-format-enables-zero-overhead-sparsity-on-gpus","TwELL Format Enables Zero-Overhead Sparsity on GPUs",[22,23,24],"p",{},"Feedforward layers consume ⅔ of LLM parameters and 80%+ of FLOPs, but activation sparsity leaves 99%+ of hidden neurons at zero post-ReLU. Standard ELLPACK sparsity fails on batched GEMM (training\u002Fhigh-throughput inference) due to dense-to-sparse conversion overheads that match or exceed savings on Tensor Core-optimized GPUs.",[22,26,27],{},"TwELL fixes this by tile-wise packing: partition gate activation columns into horizontal tiles matching matmul kernel tile size T_n (e.g., CTA dimensions). Pack non-zeros + indices locally per tile in ELL-style within the gate projection epilogue—no extra kernel, global read, or sync. Compression factor C ensures T\u002FC > max non-zeros\u002Ftile; store as single 32-bit matrix for locality.",[22,29,30],{},"Inference fuses up\u002Fdown projections in one kernel per input row: CTAs iterate tile non-zeros, loading W_u columns and W_d rows for dot products. Hidden state h_u stays in registers, slashing DRAM. Training uses hybrid format: route low-nz rows (\u003Cthreshold) to compact ELL, overflow to dense backup, handling non-uniform sparsity (max nz\u002Frow >> average).",[22,32,33],{},"Supports gated MLPs (Llama\u002FQwen) and non-gated Transformers (11.2% inference speedup at L1=2e-5).",[17,35,37],{"id":36},"induce-sparsity-with-relu-l1no-hyperparam-tweaks","Induce Sparsity with ReLU + L1—No Hyperparam Tweaks",[22,39,40],{},"Replace SiLU with ReLU in gates for exact zeros on negatives. Add L1 loss on hidden activations (post-up projection, pre-down): L1 = 2×10⁻⁵ × mean(|h| over tokens\u002Fdims\u002Flayers), summed to CE loss.",[22,42,43],{},"Sparsity stabilizes in ~1000 steps (~1B tokens). At L1=2e-5, nz activations drop from 911 to 29\u002Flayer (99.5% sparse) in 1.5B model (d_ff=5632); 30%+ neurons die permanently but accuracy holds (46.4% → 46.2% tasks). Test 8 L1 values: up to 3e-5, \u003C2% relative CE rise, no task accuracy drop (ARC\u002FHellaSwag\u002Fetc.).",[22,45,46],{},"Mitigate dead neurons via gate weight reinitialization: +19.1% speedup vs +17.9% baseline, same sparsity\u002Faccuracy. Train on fineweb-edu (10-40B tokens, chinchilla-optimal), ctx=2048, batch=1M—no LR\u002Foptimizer\u002Fweight decay changes.",[17,48,50],{"id":49},"speedups-grow-with-scale-patterns-favor-early-layers","Speedups Grow with Scale; Patterns Favor Early Layers",[22,52,53],{},"On 8x H100 PCIe (seq=2048):",[55,56,57,82],"table",{},[58,59,60],"thead",{},[61,62,63,67,70,73,76,79],"tr",{},[64,65,66],"th",{},"Model",[64,68,69],{},"Inf Speedup",[64,71,72],{},"Train Throughput",[64,74,75],{},"Peak Mem Δ",[64,77,78],{},"Energy\u002Ftok Δ",[64,80,81],{},"Accuracy Δ",[83,84,85,106,126,146],"tbody",{},[61,86,87,91,94,97,100,103],{},[88,89,90],"td",{},"0.5B",[88,92,93],{},"+17.0%",[88,95,96],{},"-1.5%",[88,98,99],{},"-19.2%",[88,101,102],{},"-11.8%",[88,104,105],{},"40.4→40.4%",[61,107,108,111,114,117,120,123],{},[88,109,110],{},"1B",[88,112,113],{},"+18.1%",[88,115,116],{},"+7.1%",[88,118,119],{},"-25.5%",[88,121,122],{},"-14.6%",[88,124,125],{},"44.6→44.7%",[61,127,128,131,134,137,140,143],{},[88,129,130],{},"1.5B",[88,132,133],{},"+18.8%",[88,135,136],{},"+11.6%",[88,138,139],{},"-28.1%",[88,141,142],{},"-15.0%",[88,144,145],{},"46.4→46.2%",[61,147,148,151,154,157,160,163],{},[88,149,150],{},"2B",[88,152,153],{},"+20.5%",[88,155,156],{},"+21.9%",[88,158,159],{},"+22.3%*",[88,161,162],{},"-17.0%",[88,164,165],{},"49.1→48.8%",[22,167,168],{},"*2B uses larger micro-batch via mem savings (46.7→57.1GB peak). Nz\u002Flayer falls 39→24 (0.5B→2B), amplifying skips. -0.996 Pearson corr: sparser layers = bigger gains.",[22,170,171],{},"Patterns: Layer 1-2 least active in 28L 1.5B; peak early-middle (reasoning\u002Fknowledge). Sequence pos 1 fires exponentially more neurons than later. Larger gains on RTX PRO 6000 (188 SMs): sparse thrives where dense GEMM lags.",[22,173,174],{},"Open-source kernels (H100 TMA\u002Fpersistent CTAs; RTX verified), code for Llama\u002Fetc. Future: fine-tune dense models.",{"title":176,"searchDepth":177,"depth":177,"links":178},"",2,[179,180,181],{"id":19,"depth":177,"text":20},{"id":36,"depth":177,"text":37},{"id":49,"depth":177,"text":50},[],null,"md",false,{"content_references":187,"triage":203},[188,194,199],{"type":189,"title":190,"author":191,"url":192,"context":193},"paper","Sparser, Faster, Lighter LLMs — TwELL & Sparse CUDA Kernels","Sakana AI and NVIDIA","https:\u002F\u002Farxiv.org\u002Fabs\u002F2603.23198","cited",{"type":195,"title":196,"url":197,"context":198},"other","Code & Kernels","https:\u002F\u002Fgithub.com\u002FSakanaAI\u002Fsparser-faster-llms","recommended",{"type":195,"title":200,"url":201,"context":202},"Project Page","https:\u002F\u002Fpub.sakana.ai\u002Fsparser-faster-llms\u002F","mentioned",{"relevance":204,"novelty":205,"quality":205,"actionability":205,"composite":206,"reasoning":207},5,4,4.35,"Category: AI & LLMs. The article provides a detailed explanation of how to achieve significant speedups in LLMs through GPU-optimized sparsity techniques, addressing a core topic of AI engineering that product builders would prioritize. It includes specific techniques like using ReLU and L1 loss to induce sparsity, which are actionable for developers looking to optimize their AI models.",true,"\u002Fsummaries\u002F9b169e39b5c1f580-twell-delivers-20-llm-speedups-via-gpu-optimized-s-summary","2026-05-11 08:36:00","2026-05-11 15:04:14",{"title":5,"description":176},{"loc":209},"9b169e39b5c1f580","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F11\u002Fsakana-ai-and-nvidia-introduce-twell-with-cuda-kernels-for-20-5-inference-and-21-9-training-speedup-in-llms\u002F","summaries\u002F9b169e39b5c1f580-twell-delivers-20-llm-speedups-via-gpu-optimized-s-summary",[220,221,222,223],"llm","machine-learning","open-source","software-engineering","Use ReLU gate activation + L1=2e-5 on hidden activations to induce 99.5% sparsity in feedforward layers, then TwELL CUDA kernels yield 20.5% inference and 21.9% training speedups on H100s with no accuracy loss.",[223],"Y9zPbz1evh-vqSU1FsJragaDNpxe6l3-Y9uUMk_yJT4",[228,231,234,237,240,243,245,247,249,251,253,255,258,260,262,264,266,268,270,272,274,276,279,282,284,286,289,291,293,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,416,418,420,422,424,426,428,430,432,434,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,551,553,555,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1051,1053,1055,1057,1059,1061,1063,1065,1067,1069,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1851,1853,1855,1857,1859,1861,1863,1865,1867,1869,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2651,2653,2655,2657,2659,2661,2663,2665,2667,2669,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3451,3453,3455,3457,3459,3461,3463,3465,3467,3469,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659,3661,3663,3665,3667,3669,3671,3673,3675,3677,3679,3681,3683,3685,3687,3689,3691,3693,3695,3697,3699,3701,3703,3705,3707,3709,3711,3713,3715,3717,3719,3721,3723,3725,3727,3729,3731,3733,3735,3737,3739,3741,3743,3745,3747,3749,3751,3753,3755,3757,3759,3761,3763,3765,3767,3769,3771,3773,3775,3777,3779,3781,3783,3785,3787,3789,3791,3793,3795],{"categories":229},[230],"Developer Productivity",{"categories":232},[233],"Business & SaaS",{"categories":235},[236],"AI & LLMs",{"categories":238},[239],"AI Automation",{"categories":241},[242],"Product Strategy",{"categories":244},[236],{"categories":246},[230],{"categories":248},[233],{"categories":250},[],{"categories":252},[236],{"categories":254},[],{"categories":256},[257],"AI News & Trends",{"categories":259},[239],{"categories":261},[257],{"categories":263},[239],{"categories":265},[239],{"categories":267},[236],{"categories":269},[236],{"categories":271},[257],{"categories":273},[236],{"categories":275},[],{"categories":277},[278],"Design & Frontend",{"categories":280},[281],"Data Science & Visualization",{"categories":283},[257],{"categories":285},[],{"categories":287},[288],"Software Engineering",{"categories":290},[236],{"categories":292},[239],{"categories":294},[295],"Marketing & Growth",{"categories":297},[236],{"categories":299},[239],{"categories":301},[],{"categories":303},[],{"categories":305},[278],{"categories":307},[239],{"categories":309},[230],{"categories":311},[278],{"categories":313},[236],{"categories":315},[239],{"categories":317},[257],{"categories":319},[],{"categories":321},[],{"categories":323},[239],{"categories":325},[288],{"categories":327},[],{"categories":329},[233],{"categories":331},[],{"categories":333},[],{"categories":335},[239],{"categories":337},[239],{"categories":339},[236],{"categories":341},[],{"categories":343},[288],{"categories":345},[],{"categories":347},[],{"categories":349},[],{"categories":351},[236],{"categories":353},[295],{"categories":355},[278],{"categories":357},[278],{"categories":359},[236],{"categories":361},[239],{"categories":363},[236],{"categories":365},[236],{"categories":367},[239],{"categories":369},[239],{"categories":371},[281],{"categories":373},[257],{"categories":375},[239],{"categories":377},[295],{"categories":379},[239],{"categories":381},[242],{"categories":383},[],{"categories":385},[239],{"categories":387},[],{"categories":389},[239],{"categories":391},[288],{"categories":393},[278],{"categories":395},[236],{"categories":397},[],{"categories":399},[],{"categories":401},[239],{"categories":403},[],{"categories":405},[236],{"categories":407},[],{"categories":409},[230],{"categories":411},[288],{"categories":413},[233],{"categories":415},[257],{"categories":417},[236],{"categories":419},[],{"categories":421},[236],{"categories":423},[],{"categories":425},[288],{"categories":427},[281],{"categories":429},[],{"categories":431},[236],{"categories":433},[278],{"categories":435},[],{"categories":437},[278],{"categories":439},[239],{"categories":441},[],{"categories":443},[239],{"categories":445},[257],{"categories":447},[236],{"categories":449},[],{"categories":451},[239],{"categories":453},[236],{"categories":455},[242],{"categories":457},[],{"categories":459},[236],{"categories":461},[239],{"categories":463},[239],{"categories":465},[],{"categories":467},[281],{"categories":469},[236],{"categories":471},[],{"categories":473},[230],{"categories":475},[233],{"categories":477},[236],{"categories":479},[239],{"categories":481},[288],{"categories":483},[236],{"categories":485},[],{"categories":487},[],{"categories":489},[236],{"categories":491},[],{"categories":493},[278],{"categories":495},[],{"categories":497},[236],{"categories":499},[],{"categories":501},[239],{"categories":503},[236],{"categories":505},[278],{"categories":507},[],{"categories":509},[236],{"categories":511},[236],{"categories":513},[233],{"categories":515},[239],{"categories":517},[236],{"categories":519},[278],{"categories":521},[239],{"categories":523},[],{"categories":525},[],{"categories":527},[257],{"categories":529},[],{"categories":531},[236],{"categories":533},[233,295],{"categories":535},[],{"categories":537},[236],{"categories":539},[],{"categories":541},[],{"categories":543},[236],{"categories":545},[],{"categories":547},[236],{"categories":549},[550],"DevOps & Cloud",{"categories":552},[],{"categories":554},[257],{"categories":556},[278],{"categories":558},[],{"categories":560},[257],{"categories":562},[257],{"categories":564},[236],{"categories":566},[295],{"categories":568},[],{"categories":570},[233],{"categories":572},[],{"categories":574},[236,550],{"categories":576},[236],{"categories":578},[236],{"categories":580},[239],{"categories":582},[236,288],{"categories":584},[281],{"categories":586},[236],{"categories":588},[295],{"categories":590},[239],{"categories":592},[239],{"categories":594},[],{"categories":596},[239],{"categories":598},[236,233],{"categories":600},[],{"categories":602},[278],{"categories":604},[278],{"categories":606},[],{"categories":608},[],{"categories":610},[257],{"categories":612},[],{"categories":614},[230],{"categories":616},[288],{"categories":618},[236],{"categories":620},[278],{"categories":622},[239],{"categories":624},[288],{"categories":626},[257],{"categories":628},[278],{"categories":630},[],{"categories":632},[236],{"categories":634},[236],{"categories":636},[236],{"categories":638},[257],{"categories":640},[230],{"categories":642},[236],{"categories":644},[239],{"categories":646},[550],{"categories":648},[278],{"categories":650},[239],{"categories":652},[],{"categories":654},[],{"categories":656},[278],{"categories":658},[257],{"categories":660},[281],{"categories":662},[],{"categories":664},[236],{"categories":666},[236],{"categories":668},[233],{"categories":670},[236],{"categories":672},[236],{"categories":674},[257],{"categories":676},[],{"categories":678},[239],{"categories":680},[288],{"categories":682},[],{"categories":684},[236],{"categories":686},[236],{"categories":688},[239],{"categories":690},[],{"categories":692},[],{"categories":694},[236],{"categories":696},[],{"categories":698},[233],{"categories":700},[239],{"categories":702},[],{"categories":704},[230],{"categories":706},[236],{"categories":708},[233],{"categories":710},[257],{"categories":712},[],{"categories":714},[],{"categories":716},[],{"categories":718},[257],{"categories":720},[257],{"categories":722},[],{"categories":724},[],{"categories":726},[233],{"categories":728},[],{"categories":730},[],{"categories":732},[230],{"categories":734},[],{"categories":736},[295],{"categories":738},[239],{"categories":740},[233],{"categories":742},[239],{"categories":744},[],{"categories":746},[242],{"categories":748},[278],{"categories":750},[288],{"categories":752},[236],{"categories":754},[239],{"categories":756},[233],{"categories":758},[236],{"categories":760},[],{"categories":762},[],{"categories":764},[288],{"categories":766},[281],{"categories":768},[242],{"categories":770},[239],{"categories":772},[236],{"categories":774},[],{"categories":776},[550],{"categories":778},[],{"categories":780},[239],{"categories":782},[],{"categories":784},[],{"categories":786},[236],{"categories":788},[278],{"categories":790},[295],{"categories":792},[239],{"categories":794},[],{"categories":796},[230],{"categories":798},[],{"categories":800},[257],{"categories":802},[236,550],{"categories":804},[257],{"categories":806},[236],{"categories":808},[233],{"categories":810},[236],{"categories":812},[],{"categories":814},[233],{"categories":816},[],{"categories":818},[288],{"categories":820},[278],{"categories":822},[257],{"categories":824},[281],{"categories":826},[230],{"categories":828},[236],{"categories":830},[288],{"categories":832},[],{"categories":834},[],{"categories":836},[242],{"categories":838},[],{"categories":840},[236],{"categories":842},[],{"categories":844},[278],{"categories":846},[278],{"categories":848},[278],{"categories":850},[],{"categories":852},[],{"categories":854},[257],{"categories":856},[239],{"categories":858},[236],{"categories":860},[236],{"categories":862},[236],{"categories":864},[233],{"categories":866},[236],{"categories":868},[],{"categories":870},[288],{"categories":872},[288],{"categories":874},[233],{"categories":876},[],{"categories":878},[236],{"categories":880},[236],{"categories":882},[233],{"categories":884},[257],{"categories":886},[295],{"categories":888},[239],{"categories":890},[],{"categories":892},[278],{"categories":894},[],{"categories":896},[236],{"categories":898},[],{"categories":900},[233],{"categories":902},[239],{"categories":904},[],{"categories":906},[550],{"categories":908},[281],{"categories":910},[288],{"categories":912},[295],{"categories":914},[288],{"categories":916},[239],{"categories":918},[],{"categories":920},[],{"categories":922},[239],{"categories":924},[230],{"categories":926},[239],{"categories":928},[242],{"categories":930},[233],{"categories":932},[],{"categories":934},[236],{"categories":936},[242],{"categories":938},[236],{"categories":940},[236],{"categories":942},[295],{"categories":944},[278],{"categories":946},[239],{"categories":948},[],{"categories":950},[],{"categories":952},[550],{"categories":954},[288],{"categories":956},[],{"categories":958},[239],{"categories":960},[236],{"categories":962},[278,236],{"categories":964},[230],{"categories":966},[],{"categories":968},[236],{"categories":970},[230],{"categories":972},[278],{"categories":974},[239],{"categories":976},[288],{"categories":978},[],{"categories":980},[236],{"categories":982},[],{"categories":984},[230],{"categories":986},[],{"categories":988},[239],{"categories":990},[242],{"categories":992},[236],{"categories":994},[236],{"categories":996},[278],{"categories":998},[239],{"categories":1000},[550],{"categories":1002},[278],{"categories":1004},[239],{"categories":1006},[236],{"categories":1008},[236],{"categories":1010},[236],{"categories":1012},[257],{"categories":1014},[],{"categories":1016},[242],{"categories":1018},[239],{"categories":1020},[278],{"categories":1022},[239],{"categories":1024},[288],{"categories":1026},[278],{"categories":1028},[239],{"categories":1030},[257],{"categories":1032},[],{"categories":1034},[236],{"categories":1036},[278],{"categories":1038},[236],{"categories":1040},[230],{"categories":1042},[257],{"categories":1044},[236],{"categories":1046},[295],{"categories":1048},[236],{"categories":1050},[236],{"categories":1052},[239],{"categories":1054},[239],{"categories":1056},[236],{"categories":1058},[239],{"categories":1060},[278],{"categories":1062},[236],{"categories":1064},[],{"categories":1066},[],{"categories":1068},[288],{"categories":1070},[],{"categories":1072},[230],{"categories":1074},[550],{"categories":1076},[],{"categories":1078},[230],{"categories":1080},[233],{"categories":1082},[295],{"categories":1084},[],{"categories":1086},[233],{"categories":1088},[],{"categories":1090},[],{"categories":1092},[],{"categories":1094},[],{"categories":1096},[],{"categories":1098},[236],{"categories":1100},[239],{"categories":1102},[550],{"categories":1104},[230],{"categories":1106},[236],{"categories":1108},[288],{"categories":1110},[242],{"categories":1112},[236],{"categories":1114},[295],{"categories":1116},[236],{"categories":1118},[236],{"categories":1120},[236],{"categories":1122},[236,230],{"categories":1124},[288],{"categories":1126},[288],{"categories":1128},[278],{"categories":1130},[236],{"categories":1132},[],{"categories":1134},[],{"categories":1136},[],{"categories":1138},[288],{"categories":1140},[281],{"categories":1142},[257],{"categories":1144},[278],{"categories":1146},[],{"categories":1148},[236],{"categories":1150},[236],{"categories":1152},[],{"categories":1154},[],{"categories":1156},[239],{"categories":1158},[236],{"categories":1160},[233],{"categories":1162},[],{"categories":1164},[230],{"categories":1166},[236],{"categories":1168},[230],{"categories":1170},[236],{"categories":1172},[288],{"categories":1174},[295],{"categories":1176},[236,278],{"categories":1178},[257],{"categories":1180},[278],{"categories":1182},[],{"categories":1184},[550],{"categories":1186},[278],{"categories":1188},[239],{"categories":1190},[],{"categories":1192},[],{"categories":1194},[],{"categories":1196},[],{"categories":1198},[288],{"categories":1200},[239],{"categories":1202},[239],{"categories":1204},[236],{"categories":1206},[236],{"categories":1208},[],{"categories":1210},[278],{"categories":1212},[],{"categories":1214},[],{"categories":1216},[239],{"categories":1218},[],{"categories":1220},[],{"categories":1222},[295],{"categories":1224},[295],{"categories":1226},[239],{"categories":1228},[],{"categories":1230},[236],{"categories":1232},[236],{"categories":1234},[288],{"categories":1236},[278],{"categories":1238},[278],{"categories":1240},[239],{"categories":1242},[230],{"categories":1244},[236],{"categories":1246},[278],{"categories":1248},[278],{"categories":1250},[239],{"categories":1252},[239],{"categories":1254},[236],{"categories":1256},[],{"categories":1258},[],{"categories":1260},[236],{"categories":1262},[239],{"categories":1264},[257],{"categories":1266},[288],{"categories":1268},[230],{"categories":1270},[236],{"categories":1272},[],{"categories":1274},[239],{"categories":1276},[239],{"categories":1278},[],{"categories":1280},[230],{"categories":1282},[236],{"categories":1284},[230],{"categories":1286},[230],{"categories":1288},[],{"categories":1290},[],{"categories":1292},[239],{"categories":1294},[239],{"categories":1296},[236],{"categories":1298},[236],{"categories":1300},[257],{"categories":1302},[281],{"categories":1304},[242],{"categories":1306},[257],{"categories":1308},[278],{"categories":1310},[],{"categories":1312},[257],{"categories":1314},[],{"categories":1316},[],{"categories":1318},[],{"categories":1320},[],{"categories":1322},[288],{"categories":1324},[281],{"categories":1326},[],{"categories":1328},[236],{"categories":1330},[236],{"categories":1332},[281],{"categories":1334},[288],{"categories":1336},[],{"categories":1338},[],{"categories":1340},[239],{"categories":1342},[257],{"categories":1344},[257],{"categories":1346},[239],{"categories":1348},[230],{"categories":1350},[236,550],{"categories":1352},[],{"categories":1354},[278],{"categories":1356},[230],{"categories":1358},[239],{"categories":1360},[278],{"categories":1362},[],{"categories":1364},[239],{"categories":1366},[239],{"categories":1368},[236],{"categories":1370},[295],{"categories":1372},[288],{"categories":1374},[278],{"categories":1376},[],{"categories":1378},[239],{"categories":1380},[236],{"categories":1382},[239],{"categories":1384},[239],{"categories":1386},[239],{"categories":1388},[295],{"categories":1390},[239],{"categories":1392},[236],{"categories":1394},[],{"categories":1396},[295],{"categories":1398},[257],{"categories":1400},[239],{"categories":1402},[],{"categories":1404},[],{"categories":1406},[236],{"categories":1408},[239],{"categories":1410},[257],{"categories":1412},[239],{"categories":1414},[],{"categories":1416},[],{"categories":1418},[],{"categories":1420},[239],{"categories":1422},[],{"categories":1424},[],{"categories":1426},[281],{"categories":1428},[236],{"categories":1430},[281],{"categories":1432},[257],{"categories":1434},[236],{"categories":1436},[236],{"categories":1438},[239],{"categories":1440},[236],{"categories":1442},[],{"categories":1444},[],{"categories":1446},[550],{"categories":1448},[],{"categories":1450},[],{"categories":1452},[230],{"categories":1454},[],{"categories":1456},[],{"categories":1458},[],{"categories":1460},[],{"categories":1462},[288],{"categories":1464},[257],{"categories":1466},[295],{"categories":1468},[233],{"categories":1470},[236],{"categories":1472},[236],{"categories":1474},[233],{"categories":1476},[],{"categories":1478},[278],{"categories":1480},[239],{"categories":1482},[233],{"categories":1484},[236],{"categories":1486},[236],{"categories":1488},[230],{"categories":1490},[],{"categories":1492},[230],{"categories":1494},[236],{"categories":1496},[295],{"categories":1498},[239],{"categories":1500},[257],{"categories":1502},[233],{"categories":1504},[236],{"categories":1506},[239],{"categories":1508},[],{"categories":1510},[236],{"categories":1512},[230],{"categories":1514},[236],{"categories":1516},[],{"categories":1518},[257],{"categories":1520},[236],{"categories":1522},[],{"categories":1524},[233],{"categories":1526},[236],{"categories":1528},[],{"categories":1530},[],{"categories":1532},[],{"categories":1534},[236],{"categories":1536},[],{"categories":1538},[550],{"categories":1540},[236],{"categories":1542},[],{"categories":1544},[236],{"categories":1546},[236],{"categories":1548},[236],{"categories":1550},[236,550],{"categories":1552},[236],{"categories":1554},[236],{"categories":1556},[278],{"categories":1558},[239],{"categories":1560},[],{"categories":1562},[239],{"categories":1564},[236],{"categories":1566},[236],{"categories":1568},[236],{"categories":1570},[230],{"categories":1572},[230],{"categories":1574},[288],{"categories":1576},[278],{"categories":1578},[239],{"categories":1580},[],{"categories":1582},[236],{"categories":1584},[257],{"categories":1586},[236],{"categories":1588},[233],{"categories":1590},[],{"categories":1592},[550],{"categories":1594},[278],{"categories":1596},[278],{"categories":1598},[239],{"categories":1600},[257],{"categories":1602},[239],{"categories":1604},[236],{"categories":1606},[],{"categories":1608},[236],{"categories":1610},[],{"categories":1612},[],{"categories":1614},[236],{"categories":1616},[236],{"categories":1618},[236],{"categories":1620},[239],{"categories":1622},[236],{"categories":1624},[],{"categories":1626},[281],{"categories":1628},[239],{"categories":1630},[],{"categories":1632},[236],{"categories":1634},[257],{"categories":1636},[],{"categories":1638},[278],{"categories":1640},[550],{"categories":1642},[257],{"categories":1644},[288],{"categories":1646},[288],{"categories":1648},[257],{"categories":1650},[257],{"categories":1652},[550],{"categories":1654},[],{"categories":1656},[257],{"categories":1658},[236],{"categories":1660},[230],{"categories":1662},[257],{"categories":1664},[],{"categories":1666},[281],{"categories":1668},[257],{"categories":1670},[288],{"categories":1672},[257],{"categories":1674},[550],{"categories":1676},[236],{"categories":1678},[236],{"categories":1680},[],{"categories":1682},[233],{"categories":1684},[],{"categories":1686},[],{"categories":1688},[236],{"categories":1690},[236],{"categories":1692},[236],{"categories":1694},[236],{"categories":1696},[],{"categories":1698},[281],{"categories":1700},[230],{"categories":1702},[],{"categories":1704},[236],{"categories":1706},[236],{"categories":1708},[550],{"categories":1710},[550],{"categories":1712},[],{"categories":1714},[239],{"categories":1716},[257],{"categories":1718},[257],{"categories":1720},[236],{"categories":1722},[239],{"categories":1724},[],{"categories":1726},[278],{"categories":1728},[236],{"categories":1730},[236],{"categories":1732},[],{"categories":1734},[],{"categories":1736},[550],{"categories":1738},[236],{"categories":1740},[288],{"categories":1742},[233],{"categories":1744},[236],{"categories":1746},[],{"categories":1748},[239],{"categories":1750},[230],{"categories":1752},[230],{"categories":1754},[],{"categories":1756},[236],{"categories":1758},[278],{"categories":1760},[239],{"categories":1762},[],{"categories":1764},[236],{"categories":1766},[236],{"categories":1768},[239],{"categories":1770},[],{"categories":1772},[239],{"categories":1774},[288],{"categories":1776},[],{"categories":1778},[236],{"categories":1780},[],{"categories":1782},[236],{"categories":1784},[],{"categories":1786},[236],{"categories":1788},[236],{"categories":1790},[],{"categories":1792},[236],{"categories":1794},[257],{"categories":1796},[236],{"categories":1798},[236],{"categories":1800},[230],{"categories":1802},[236],{"categories":1804},[257],{"categories":1806},[239],{"categories":1808},[],{"categories":1810},[236],{"categories":1812},[295],{"categories":1814},[],{"categories":1816},[],{"categories":1818},[],{"categories":1820},[230],{"categories":1822},[257],{"categories":1824},[239],{"categories":1826},[236],{"categories":1828},[278],{"categories":1830},[239],{"categories":1832},[],{"categories":1834},[239],{"categories":1836},[],{"categories":1838},[236],{"categories":1840},[239],{"categories":1842},[236],{"categories":1844},[],{"categories":1846},[236],{"categories":1848},[236],{"categories":1850},[257],{"categories":1852},[278],{"categories":1854},[239],{"categories":1856},[278],{"categories":1858},[233],{"categories":1860},[],{"categories":1862},[],{"categories":1864},[236],{"categories":1866},[230],{"categories":1868},[257],{"categories":1870},[],{"categories":1872},[],{"categories":1874},[288],{"categories":1876},[278],{"categories":1878},[],{"categories":1880},[236],{"categories":1882},[],{"categories":1884},[295],{"categories":1886},[236],{"categories":1888},[550],{"categories":1890},[288],{"categories":1892},[],{"categories":1894},[239],{"categories":1896},[236],{"categories":1898},[239],{"categories":1900},[239],{"categories":1902},[236],{"categories":1904},[],{"categories":1906},[230],{"categories":1908},[236],{"categories":1910},[233],{"categories":1912},[288],{"categories":1914},[278],{"categories":1916},[],{"categories":1918},[],{"categories":1920},[],{"categories":1922},[239],{"categories":1924},[278],{"categories":1926},[257],{"categories":1928},[236],{"categories":1930},[257],{"categories":1932},[278],{"categories":1934},[],{"categories":1936},[278],{"categories":1938},[257],{"categories":1940},[233],{"categories":1942},[236],{"categories":1944},[257],{"categories":1946},[295],{"categories":1948},[],{"categories":1950},[],{"categories":1952},[281],{"categories":1954},[236,288],{"categories":1956},[257],{"categories":1958},[236],{"categories":1960},[239],{"categories":1962},[239],{"categories":1964},[236],{"categories":1966},[],{"categories":1968},[288],{"categories":1970},[236],{"categories":1972},[281],{"categories":1974},[239],{"categories":1976},[295],{"categories":1978},[550],{"categories":1980},[],{"categories":1982},[230],{"categories":1984},[239],{"categories":1986},[239],{"categories":1988},[288],{"categories":1990},[236],{"categories":1992},[236],{"categories":1994},[],{"categories":1996},[],{"categories":1998},[],{"categories":2000},[550],{"categories":2002},[257],{"categories":2004},[236],{"categories":2006},[236],{"categories":2008},[236],{"categories":2010},[],{"categories":2012},[281],{"categories":2014},[233],{"categories":2016},[],{"categories":2018},[239],{"categories":2020},[550],{"categories":2022},[],{"categories":2024},[278],{"categories":2026},[278],{"categories":2028},[],{"categories":2030},[288],{"categories":2032},[278],{"categories":2034},[236],{"categories":2036},[],{"categories":2038},[257],{"categories":2040},[236],{"categories":2042},[278],{"categories":2044},[239],{"categories":2046},[257],{"categories":2048},[],{"categories":2050},[239],{"categories":2052},[278],{"categories":2054},[236],{"categories":2056},[],{"categories":2058},[236],{"categories":2060},[236],{"categories":2062},[550],{"categories":2064},[257],{"categories":2066},[281],{"categories":2068},[281],{"categories":2070},[],{"categories":2072},[],{"categories":2074},[],{"categories":2076},[239],{"categories":2078},[288],{"categories":2080},[288],{"categories":2082},[],{"categories":2084},[],{"categories":2086},[236],{"categories":2088},[],{"categories":2090},[239],{"categories":2092},[236],{"categories":2094},[],{"categories":2096},[236],{"categories":2098},[233],{"categories":2100},[236],{"categories":2102},[295],{"categories":2104},[239],{"categories":2106},[236],{"categories":2108},[288],{"categories":2110},[257],{"categories":2112},[239],{"categories":2114},[],{"categories":2116},[257],{"categories":2118},[239],{"categories":2120},[239],{"categories":2122},[],{"categories":2124},[233],{"categories":2126},[239],{"categories":2128},[],{"categories":2130},[236],{"categories":2132},[230],{"categories":2134},[257],{"categories":2136},[550],{"categories":2138},[239],{"categories":2140},[239],{"categories":2142},[230],{"categories":2144},[236],{"categories":2146},[],{"categories":2148},[],{"categories":2150},[278],{"categories":2152},[236,233],{"categories":2154},[],{"categories":2156},[230],{"categories":2158},[281],{"categories":2160},[236],{"categories":2162},[288],{"categories":2164},[236],{"categories":2166},[239],{"categories":2168},[236],{"categories":2170},[236],{"categories":2172},[257],{"categories":2174},[239],{"categories":2176},[],{"categories":2178},[],{"categories":2180},[239],{"categories":2182},[236],{"categories":2184},[550],{"categories":2186},[],{"categories":2188},[236],{"categories":2190},[239],{"categories":2192},[],{"categories":2194},[236],{"categories":2196},[295],{"categories":2198},[281],{"categories":2200},[239],{"categories":2202},[236],{"categories":2204},[550],{"categories":2206},[],{"categories":2208},[236],{"categories":2210},[295],{"categories":2212},[278],{"categories":2214},[236],{"categories":2216},[],{"categories":2218},[295],{"categories":2220},[257],{"categories":2222},[236],{"categories":2224},[236],{"categories":2226},[230],{"categories":2228},[],{"categories":2230},[],{"categories":2232},[278],{"categories":2234},[236],{"categories":2236},[281],{"categories":2238},[295],{"categories":2240},[295],{"categories":2242},[257],{"categories":2244},[],{"categories":2246},[],{"categories":2248},[236],{"categories":2250},[],{"categories":2252},[236,288],{"categories":2254},[257],{"categories":2256},[239],{"categories":2258},[288],{"categories":2260},[236],{"categories":2262},[230],{"categories":2264},[],{"categories":2266},[],{"categories":2268},[230],{"categories":2270},[295],{"categories":2272},[236],{"categories":2274},[],{"categories":2276},[278,236],{"categories":2278},[550],{"categories":2280},[230],{"categories":2282},[],{"categories":2284},[233],{"categories":2286},[233],{"categories":2288},[236],{"categories":2290},[288],{"categories":2292},[239],{"categories":2294},[257],{"categories":2296},[295],{"categories":2298},[278],{"categories":2300},[236],{"categories":2302},[236],{"categories":2304},[236],{"categories":2306},[230],{"categories":2308},[236],{"categories":2310},[239],{"categories":2312},[257],{"categories":2314},[],{"categories":2316},[],{"categories":2318},[281],{"categories":2320},[288],{"categories":2322},[236],{"categories":2324},[278],{"categories":2326},[281],{"categories":2328},[236],{"categories":2330},[236],{"categories":2332},[239],{"categories":2334},[239],{"categories":2336},[236,233],{"categories":2338},[],{"categories":2340},[278],{"categories":2342},[],{"categories":2344},[236],{"categories":2346},[257],{"categories":2348},[230],{"categories":2350},[230],{"categories":2352},[239],{"categories":2354},[236],{"categories":2356},[233],{"categories":2358},[288],{"categories":2360},[295],{"categories":2362},[],{"categories":2364},[257],{"categories":2366},[236],{"categories":2368},[236],{"categories":2370},[257],{"categories":2372},[288],{"categories":2374},[236],{"categories":2376},[239],{"categories":2378},[257],{"categories":2380},[236],{"categories":2382},[278],{"categories":2384},[236],{"categories":2386},[236],{"categories":2388},[550],{"categories":2390},[242],{"categories":2392},[239],{"categories":2394},[236],{"categories":2396},[257],{"categories":2398},[239],{"categories":2400},[295],{"categories":2402},[236],{"categories":2404},[],{"categories":2406},[236],{"categories":2408},[],{"categories":2410},[],{"categories":2412},[],{"categories":2414},[233],{"categories":2416},[236],{"categories":2418},[239],{"categories":2420},[257],{"categories":2422},[257],{"categories":2424},[257],{"categories":2426},[257],{"categories":2428},[],{"categories":2430},[230],{"categories":2432},[239],{"categories":2434},[257],{"categories":2436},[230],{"categories":2438},[239],{"categories":2440},[236],{"categories":2442},[236,239],{"categories":2444},[239],{"categories":2446},[550],{"categories":2448},[257],{"categories":2450},[257],{"categories":2452},[239],{"categories":2454},[236],{"categories":2456},[],{"categories":2458},[257],{"categories":2460},[295],{"categories":2462},[230],{"categories":2464},[236],{"categories":2466},[236],{"categories":2468},[],{"categories":2470},[288],{"categories":2472},[],{"categories":2474},[230],{"categories":2476},[239],{"categories":2478},[257],{"categories":2480},[236],{"categories":2482},[257],{"categories":2484},[230],{"categories":2486},[257],{"categories":2488},[257],{"categories":2490},[],{"categories":2492},[233],{"categories":2494},[239],{"categories":2496},[257],{"categories":2498},[257],{"categories":2500},[257],{"categories":2502},[257],{"categories":2504},[257],{"categories":2506},[257],{"categories":2508},[257],{"categories":2510},[257],{"categories":2512},[257],{"categories":2514},[257],{"categories":2516},[281],{"categories":2518},[230],{"categories":2520},[236],{"categories":2522},[236],{"categories":2524},[],{"categories":2526},[236,230],{"categories":2528},[],{"categories":2530},[239],{"categories":2532},[257],{"categories":2534},[239],{"categories":2536},[236],{"categories":2538},[236],{"categories":2540},[236],{"categories":2542},[236],{"categories":2544},[236],{"categories":2546},[239],{"categories":2548},[233],{"categories":2550},[278],{"categories":2552},[257],{"categories":2554},[236],{"categories":2556},[],{"categories":2558},[],{"categories":2560},[239],{"categories":2562},[278],{"categories":2564},[236],{"categories":2566},[],{"categories":2568},[],{"categories":2570},[295],{"categories":2572},[236],{"categories":2574},[],{"categories":2576},[],{"categories":2578},[230],{"categories":2580},[233],{"categories":2582},[236],{"categories":2584},[233],{"categories":2586},[278],{"categories":2588},[],{"categories":2590},[257],{"categories":2592},[],{"categories":2594},[278],{"categories":2596},[236],{"categories":2598},[295],{"categories":2600},[],{"categories":2602},[295],{"categories":2604},[],{"categories":2606},[],{"categories":2608},[239],{"categories":2610},[],{"categories":2612},[233],{"categories":2614},[230],{"categories":2616},[278],{"categories":2618},[288],{"categories":2620},[],{"categories":2622},[],{"categories":2624},[236],{"categories":2626},[230],{"categories":2628},[295],{"categories":2630},[],{"categories":2632},[239],{"categories":2634},[239],{"categories":2636},[257],{"categories":2638},[236],{"categories":2640},[239],{"categories":2642},[236],{"categories":2644},[239],{"categories":2646},[236],{"categories":2648},[242],{"categories":2650},[257],{"categories":2652},[],{"categories":2654},[295],{"categories":2656},[288],{"categories":2658},[239],{"categories":2660},[],{"categories":2662},[236],{"categories":2664},[239],{"categories":2666},[233],{"categories":2668},[230],{"categories":2670},[236],{"categories":2672},[278],{"categories":2674},[288],{"categories":2676},[288],{"categories":2678},[236],{"categories":2680},[281],{"categories":2682},[236],{"categories":2684},[239],{"categories":2686},[233],{"categories":2688},[239],{"categories":2690},[236],{"categories":2692},[236],{"categories":2694},[239],{"categories":2696},[257],{"categories":2698},[],{"categories":2700},[230],{"categories":2702},[236],{"categories":2704},[239],{"categories":2706},[236],{"categories":2708},[236],{"categories":2710},[],{"categories":2712},[278],{"categories":2714},[233],{"categories":2716},[257],{"categories":2718},[236],{"categories":2720},[236],{"categories":2722},[278],{"categories":2724},[295],{"categories":2726},[281],{"categories":2728},[236],{"categories":2730},[257],{"categories":2732},[236],{"categories":2734},[239],{"categories":2736},[550],{"categories":2738},[236],{"categories":2740},[239],{"categories":2742},[281],{"categories":2744},[],{"categories":2746},[239],{"categories":2748},[288],{"categories":2750},[278],{"categories":2752},[236],{"categories":2754},[230],{"categories":2756},[233],{"categories":2758},[288],{"categories":2760},[],{"categories":2762},[239],{"categories":2764},[236],{"categories":2766},[],{"categories":2768},[257],{"categories":2770},[],{"categories":2772},[257],{"categories":2774},[236],{"categories":2776},[239],{"categories":2778},[239],{"categories":2780},[239],{"categories":2782},[],{"categories":2784},[],{"categories":2786},[236],{"categories":2788},[236],{"categories":2790},[],{"categories":2792},[278],{"categories":2794},[239],{"categories":2796},[295],{"categories":2798},[230],{"categories":2800},[],{"categories":2802},[],{"categories":2804},[257],{"categories":2806},[288],{"categories":2808},[236],{"categories":2810},[236],{"categories":2812},[236],{"categories":2814},[288],{"categories":2816},[257],{"categories":2818},[278],{"categories":2820},[236],{"categories":2822},[236],{"categories":2824},[236],{"categories":2826},[257],{"categories":2828},[236],{"categories":2830},[257],{"categories":2832},[239],{"categories":2834},[239],{"categories":2836},[288],{"categories":2838},[239],{"categories":2840},[236],{"categories":2842},[288],{"categories":2844},[278],{"categories":2846},[],{"categories":2848},[239],{"categories":2850},[],{"categories":2852},[],{"categories":2854},[233],{"categories":2856},[236],{"categories":2858},[239],{"categories":2860},[230],{"categories":2862},[239],{"categories":2864},[295],{"categories":2866},[],{"categories":2868},[239],{"categories":2870},[],{"categories":2872},[230],{"categories":2874},[239],{"categories":2876},[],{"categories":2878},[239],{"categories":2880},[236],{"categories":2882},[257],{"categories":2884},[236],{"categories":2886},[239],{"categories":2888},[257],{"categories":2890},[239],{"categories":2892},[288],{"categories":2894},[278],{"categories":2896},[230],{"categories":2898},[],{"categories":2900},[239],{"categories":2902},[278],{"categories":2904},[257],{"categories":2906},[236],{"categories":2908},[278],{"categories":2910},[230],{"categories":2912},[],{"categories":2914},[239],{"categories":2916},[239],{"categories":2918},[236],{"categories":2920},[],{"categories":2922},[239],{"categories":2924},[242],{"categories":2926},[257],{"categories":2928},[239],{"categories":2930},[233],{"categories":2932},[],{"categories":2934},[236],{"categories":2936},[242],{"categories":2938},[236],{"categories":2940},[239],{"categories":2942},[257],{"categories":2944},[230],{"categories":2946},[550],{"categories":2948},[236],{"categories":2950},[236],{"categories":2952},[236],{"categories":2954},[257],{"categories":2956},[233],{"categories":2958},[236],{"categories":2960},[278],{"categories":2962},[257],{"categories":2964},[550],{"categories":2966},[236],{"categories":2968},[],{"categories":2970},[],{"categories":2972},[550],{"categories":2974},[281],{"categories":2976},[239],{"categories":2978},[239],{"categories":2980},[257],{"categories":2982},[236],{"categories":2984},[230],{"categories":2986},[278],{"categories":2988},[239],{"categories":2990},[236],{"categories":2992},[295],{"categories":2994},[236],{"categories":2996},[239],{"categories":2998},[],{"categories":3000},[236],{"categories":3002},[236],{"categories":3004},[257],{"categories":3006},[230],{"categories":3008},[],{"categories":3010},[236],{"categories":3012},[236],{"categories":3014},[288],{"categories":3016},[278],{"categories":3018},[236,239],{"categories":3020},[295,233],{"categories":3022},[236],{"categories":3024},[],{"categories":3026},[239],{"categories":3028},[],{"categories":3030},[288],{"categories":3032},[236],{"categories":3034},[257],{"categories":3036},[],{"categories":3038},[239],{"categories":3040},[],{"categories":3042},[239],{"categories":3044},[230],{"categories":3046},[239],{"categories":3048},[236],{"categories":3050},[550],{"categories":3052},[295],{"categories":3054},[233],{"categories":3056},[233],{"categories":3058},[230],{"categories":3060},[230],{"categories":3062},[236],{"categories":3064},[239],{"categories":3066},[236],{"categories":3068},[236],{"categories":3070},[230],{"categories":3072},[236],{"categories":3074},[295],{"categories":3076},[257],{"categories":3078},[236],{"categories":3080},[239],{"categories":3082},[236],{"categories":3084},[],{"categories":3086},[288],{"categories":3088},[],{"categories":3090},[239],{"categories":3092},[230],{"categories":3094},[],{"categories":3096},[550],{"categories":3098},[236],{"categories":3100},[],{"categories":3102},[257],{"categories":3104},[239],{"categories":3106},[288],{"categories":3108},[236],{"categories":3110},[239],{"categories":3112},[288],{"categories":3114},[239],{"categories":3116},[257],{"categories":3118},[230],{"categories":3120},[257],{"categories":3122},[288],{"categories":3124},[236],{"categories":3126},[278],{"categories":3128},[236],{"categories":3130},[236],{"categories":3132},[236],{"categories":3134},[236],{"categories":3136},[239],{"categories":3138},[236],{"categories":3140},[239],{"categories":3142},[236],{"categories":3144},[230],{"categories":3146},[236],{"categories":3148},[239],{"categories":3150},[278],{"categories":3152},[230],{"categories":3154},[239],{"categories":3156},[278],{"categories":3158},[],{"categories":3160},[236],{"categories":3162},[236],{"categories":3164},[288],{"categories":3166},[],{"categories":3168},[239],{"categories":3170},[295],{"categories":3172},[236],{"categories":3174},[257],{"categories":3176},[295],{"categories":3178},[239],{"categories":3180},[233],{"categories":3182},[233],{"categories":3184},[236],{"categories":3186},[230],{"categories":3188},[],{"categories":3190},[236],{"categories":3192},[],{"categories":3194},[230],{"categories":3196},[236],{"categories":3198},[239],{"categories":3200},[239],{"categories":3202},[],{"categories":3204},[288],{"categories":3206},[288],{"categories":3208},[295],{"categories":3210},[278],{"categories":3212},[],{"categories":3214},[236],{"categories":3216},[230],{"categories":3218},[236],{"categories":3220},[288],{"categories":3222},[230],{"categories":3224},[257],{"categories":3226},[257],{"categories":3228},[],{"categories":3230},[257],{"categories":3232},[239],{"categories":3234},[278],{"categories":3236},[281],{"categories":3238},[236],{"categories":3240},[],{"categories":3242},[257],{"categories":3244},[288],{"categories":3246},[233],{"categories":3248},[236],{"categories":3250},[230],{"categories":3252},[550],{"categories":3254},[230],{"categories":3256},[],{"categories":3258},[],{"categories":3260},[257],{"categories":3262},[],{"categories":3264},[239],{"categories":3266},[239],{"categories":3268},[239],{"categories":3270},[],{"categories":3272},[236],{"categories":3274},[],{"categories":3276},[257],{"categories":3278},[230],{"categories":3280},[278],{"categories":3282},[236],{"categories":3284},[257],{"categories":3286},[257],{"categories":3288},[],{"categories":3290},[257],{"categories":3292},[230],{"categories":3294},[236],{"categories":3296},[],{"categories":3298},[239],{"categories":3300},[239],{"categories":3302},[230],{"categories":3304},[],{"categories":3306},[],{"categories":3308},[],{"categories":3310},[278],{"categories":3312},[239],{"categories":3314},[236],{"categories":3316},[],{"categories":3318},[],{"categories":3320},[],{"categories":3322},[278],{"categories":3324},[],{"categories":3326},[230],{"categories":3328},[],{"categories":3330},[],{"categories":3332},[278],{"categories":3334},[236],{"categories":3336},[257],{"categories":3338},[],{"categories":3340},[295],{"categories":3342},[257],{"categories":3344},[295],{"categories":3346},[236],{"categories":3348},[],{"categories":3350},[],{"categories":3352},[239],{"categories":3354},[],{"categories":3356},[],{"categories":3358},[239],{"categories":3360},[236],{"categories":3362},[],{"categories":3364},[239],{"categories":3366},[257],{"categories":3368},[295],{"categories":3370},[281],{"categories":3372},[239],{"categories":3374},[239],{"categories":3376},[],{"categories":3378},[],{"categories":3380},[],{"categories":3382},[257],{"categories":3384},[],{"categories":3386},[],{"categories":3388},[278],{"categories":3390},[230],{"categories":3392},[],{"categories":3394},[233],{"categories":3396},[295],{"categories":3398},[236],{"categories":3400},[288],{"categories":3402},[230],{"categories":3404},[281],{"categories":3406},[233],{"categories":3408},[288],{"categories":3410},[],{"categories":3412},[],{"categories":3414},[239],{"categories":3416},[230],{"categories":3418},[278],{"categories":3420},[230],{"categories":3422},[239],{"categories":3424},[550],{"categories":3426},[239],{"categories":3428},[],{"categories":3430},[236],{"categories":3432},[257],{"categories":3434},[288],{"categories":3436},[],{"categories":3438},[278],{"categories":3440},[257],{"categories":3442},[230],{"categories":3444},[239],{"categories":3446},[236],{"categories":3448},[233],{"categories":3450},[239,550],{"categories":3452},[239],{"categories":3454},[288],{"categories":3456},[236],{"categories":3458},[281],{"categories":3460},[295],{"categories":3462},[239],{"categories":3464},[],{"categories":3466},[239],{"categories":3468},[236],{"categories":3470},[233],{"categories":3472},[],{"categories":3474},[],{"categories":3476},[236],{"categories":3478},[281],{"categories":3480},[236],{"categories":3482},[],{"categories":3484},[257],{"categories":3486},[],{"categories":3488},[257],{"categories":3490},[288],{"categories":3492},[239],{"categories":3494},[236],{"categories":3496},[295],{"categories":3498},[288],{"categories":3500},[],{"categories":3502},[257],{"categories":3504},[236],{"categories":3506},[],{"categories":3508},[236],{"categories":3510},[239],{"categories":3512},[236],{"categories":3514},[239],{"categories":3516},[236],{"categories":3518},[236],{"categories":3520},[236],{"categories":3522},[236],{"categories":3524},[233],{"categories":3526},[],{"categories":3528},[242],{"categories":3530},[257],{"categories":3532},[236],{"categories":3534},[],{"categories":3536},[288],{"categories":3538},[236],{"categories":3540},[236],{"categories":3542},[239],{"categories":3544},[257],{"categories":3546},[236],{"categories":3548},[236],{"categories":3550},[233],{"categories":3552},[239],{"categories":3554},[278],{"categories":3556},[],{"categories":3558},[281],{"categories":3560},[236],{"categories":3562},[],{"categories":3564},[257],{"categories":3566},[295],{"categories":3568},[],{"categories":3570},[],{"categories":3572},[257],{"categories":3574},[257],{"categories":3576},[295],{"categories":3578},[230],{"categories":3580},[239],{"categories":3582},[239],{"categories":3584},[236],{"categories":3586},[233],{"categories":3588},[],{"categories":3590},[],{"categories":3592},[257],{"categories":3594},[281],{"categories":3596},[288],{"categories":3598},[239],{"categories":3600},[278],{"categories":3602},[281],{"categories":3604},[281],{"categories":3606},[],{"categories":3608},[257],{"categories":3610},[236],{"categories":3612},[236],{"categories":3614},[288],{"categories":3616},[],{"categories":3618},[257],{"categories":3620},[257],{"categories":3622},[257],{"categories":3624},[],{"categories":3626},[239],{"categories":3628},[236],{"categories":3630},[],{"categories":3632},[230],{"categories":3634},[233],{"categories":3636},[],{"categories":3638},[236],{"categories":3640},[236],{"categories":3642},[],{"categories":3644},[288],{"categories":3646},[],{"categories":3648},[],{"categories":3650},[],{"categories":3652},[],{"categories":3654},[236],{"categories":3656},[257],{"categories":3658},[],{"categories":3660},[],{"categories":3662},[236],{"categories":3664},[236],{"categories":3666},[236],{"categories":3668},[281],{"categories":3670},[236],{"categories":3672},[281],{"categories":3674},[],{"categories":3676},[281],{"categories":3678},[281],{"categories":3680},[550],{"categories":3682},[239],{"categories":3684},[288],{"categories":3686},[],{"categories":3688},[],{"categories":3690},[281],{"categories":3692},[288],{"categories":3694},[288],{"categories":3696},[288],{"categories":3698},[],{"categories":3700},[230],{"categories":3702},[288],{"categories":3704},[288],{"categories":3706},[230],{"categories":3708},[288],{"categories":3710},[233],{"categories":3712},[288],{"categories":3714},[288],{"categories":3716},[288],{"categories":3718},[281],{"categories":3720},[257],{"categories":3722},[257],{"categories":3724},[236],{"categories":3726},[288],{"categories":3728},[281],{"categories":3730},[550],{"categories":3732},[281],{"categories":3734},[281],{"categories":3736},[281],{"categories":3738},[],{"categories":3740},[233],{"categories":3742},[],{"categories":3744},[550],{"categories":3746},[288],{"categories":3748},[288],{"categories":3750},[288],{"categories":3752},[239],{"categories":3754},[257,233],{"categories":3756},[281],{"categories":3758},[],{"categories":3760},[],{"categories":3762},[281],{"categories":3764},[],{"categories":3766},[281],{"categories":3768},[257],{"categories":3770},[239],{"categories":3772},[],{"categories":3774},[288],{"categories":3776},[236],{"categories":3778},[278],{"categories":3780},[],{"categories":3782},[236],{"categories":3784},[],{"categories":3786},[257],{"categories":3788},[230],{"categories":3790},[281],{"categories":3792},[],{"categories":3794},[288],{"categories":3796},[257],[3798,3877,4077,4148],{"id":3799,"title":3800,"ai":3801,"body":3806,"categories":3839,"created_at":183,"date_modified":183,"description":176,"extension":184,"faq":183,"featured":185,"kicker_label":183,"meta":3840,"navigation":208,"path":3865,"published_at":3866,"question":183,"scraped_at":3867,"seo":3868,"sitemap":3869,"source_id":3870,"source_name":215,"source_type":216,"source_url":3871,"stem":3872,"tags":3873,"thumbnail_url":183,"tldr":3874,"tweet":183,"unknown_tags":3875,"__hash__":3876},"summaries\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary.md","AntAngelMed: 103B MoE Medical LLM Matches 40B Dense at 7x Speed",{"provider":7,"model":8,"input_tokens":3802,"output_tokens":3803,"processing_time_ms":3804,"cost_usd":3805},8023,3168,43093,0.00316595,{"type":14,"value":3807,"toc":3834},[3808,3812,3815,3819,3822,3826],[17,3809,3811],{"id":3810},"sparse-moe-delivers-massive-capacity-at-low-compute","Sparse MoE Delivers Massive Capacity at Low Compute",[22,3813,3814],{},"AntAngelMed packs 103B total parameters into a 1\u002F32 activation-ratio Mixture-of-Experts (MoE) architecture, activating just 6.1B params per inference to match performance of ~40B dense models while achieving up to 7x efficiency over equivalently sized dense setups—speed advantages grow further with longer outputs. MoE works by routing inputs to a subset of 'expert' sub-networks instead of using all params per token, scaling knowledge without proportional compute hikes. Builds on Ling-flash-2.0 base via Ling Scaling Laws, with refinements like finer expert granularity, optimized shared expert ratio, attention balancing, auxiliary-loss-free sigmoid routing, Multi-Token Prediction (MTP) layer, QK-Norm, and Partial-RoPE (subset of attention heads). On H20 GPUs, hits >200 tokens\u002Fsecond (3x a 36B dense model), extends to 128K context via YaRN for full clinical docs or multi-turn dialogues. FP8 quantization + EAGLE3 speculative decoding yields 71% HumanEval uplift, 45% GSM8K, 94% Math-500 at 32 concurrency, stabilizing throughput for coding\u002Fmath proxies.",[17,3816,3818],{"id":3817},"three-stage-training-infuses-medical-depth","Three-Stage Training Infuses Medical Depth",[22,3820,3821],{},"Layer general reasoning atop medical specialization through: (1) Continual pre-training on vast medical corpora—encyclopedias, web text, papers—from Ling-flash-2.0 checkpoint; (2) Supervised Fine-Tuning (SFT) on mixed instructions preserving chain-of-thought via math\u002Fcoding\u002Flogic tasks alongside doctor-patient Q&A, diagnostics, ethics\u002Fsafety; (3) GRPO Reinforcement Learning (lighter PPO variant estimating baselines from group scores, per DeepSeekMath paper) with rewards targeting empathy, structured clinical outputs, safety, evidence-based reasoning to slash hallucinations. This progression embeds domain expertise without eroding broad capabilities.",[17,3823,3825],{"id":3824},"leads-benchmarks-deploys-easily-open-source","Leads Benchmarks, Deploys Easily Open-Source",[22,3827,3828,3829,3833],{},"Tops HealthBench (OpenAI's multi-turn clinical dialogues): #1 open-source, beats proprietary models, widest margin on HealthBench-Hard. Dominates MedAIBench (China Nat’l AI Medical Facility): elite in knowledge Q&A\u002Fethics-safety. #1 overall MedBench (36 datasets, ~700K samples across knowledge QA, understanding, generation, complex reasoning, safety\u002Fethics). Apache 2.0 weights (HuggingFace: MedAIBase\u002FAntAngelMed), MIT code (GitHub: MedAIBase\u002FAntAngelMed). Transformers load: ",[3830,3831,3832],"code",{},"AutoModelForCausalLM.from_pretrained(\"MedAIBase\u002FAntAngelMed\", device_map=\"auto\", trust_remote_code=True)",". Runs on vLLM v0.11.0 (4-GPU tensor parallel), SGLang+FlashAttention-3, vLLM-Ascend (Huawei 910B NPUs). From Health Information Center of Zhejiang Province, Ant Healthcare, Zhejiang Anzhen’er Medical AI Technology Co., Ltd.",{"title":176,"searchDepth":177,"depth":177,"links":3835},[3836,3837,3838],{"id":3810,"depth":177,"text":3811},{"id":3817,"depth":177,"text":3818},{"id":3824,"depth":177,"text":3825},[],{"content_references":3841,"triage":3861},[3842,3845,3849,3852,3855,3859],{"type":189,"title":3843,"url":3844,"context":193},"DeepSeekMath","https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.03300",{"type":3846,"title":3847,"url":3848,"context":198},"tool","AntAngelMed","https:\u002F\u002Fhuggingface.co\u002FMedAIBase\u002FAntAngelMed",{"type":3846,"title":3850,"url":3851,"context":198},"AntAngelMed GitHub Repo","https:\u002F\u002Fgithub.com\u002FMedAIBase\u002FAntAngelMed",{"type":195,"title":3853,"author":3854,"context":202},"Ling-flash-2.0","inclusionAI",{"type":3856,"title":3857,"author":3858,"context":193},"dataset","HealthBench","OpenAI",{"type":3856,"title":3860,"context":193},"MedBench",{"relevance":3862,"novelty":205,"quality":205,"actionability":177,"composite":3863,"reasoning":3864},3,3.25,"Category: AI & LLMs. The article discusses a new medical LLM that showcases innovative architecture and efficiency, which is relevant to AI product builders. However, it lacks specific actionable insights or frameworks that the audience could directly implement in their projects.","\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary","2026-05-12 21:21:47","2026-05-13 12:00:59",{"title":3800,"description":176},{"loc":3865},"07f85059ce2b1c55","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Fmeet-antangelmed-a-103b-parameter-open-source-medical-language-model-built-on-a-1-32-activation-ratio-moe-architecture\u002F","summaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary",[220,222,221],"103B-param open-source medical LLM activates only 6.1B params via 1\u002F32 MoE, rivals 40B dense models with 7x efficiency, tops HealthBench\u002FMedBench, runs 200+ tps on H20.",[],"BMkdtRqd6qJuSshJwJCoVJVxaHNukE4u3QyIRxxvstU",{"id":3878,"title":3879,"ai":3880,"body":3885,"categories":4035,"created_at":183,"date_modified":183,"description":176,"extension":184,"faq":183,"featured":185,"kicker_label":183,"meta":4036,"navigation":208,"path":4065,"published_at":183,"question":183,"scraped_at":4066,"seo":4067,"sitemap":4068,"source_id":4069,"source_name":4070,"source_type":216,"source_url":4071,"stem":4072,"tags":4073,"thumbnail_url":183,"tldr":4074,"tweet":183,"unknown_tags":4075,"__hash__":4076},"summaries\u002Fsummaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary.md","Gemma 2: Open LLMs Trained on 13T Tokens, Top Benchmarks",{"provider":7,"model":8,"input_tokens":3881,"output_tokens":3882,"processing_time_ms":3883,"cost_usd":3884},6087,2342,14579,0.0023659,{"type":14,"value":3886,"toc":4030},[3887,3891,3894,3897,3959,3963,3966,3969,3973,3976,3979,4027],[17,3888,3890],{"id":3889},"deploy-high-performance-llms-on-limited-hardware","Deploy High-Performance LLMs on Limited Hardware",[22,3892,3893],{},"Gemma 2 models (2B, 9B, 27B parameters) are text-to-text, decoder-only LLMs optimized for question answering, summarization, and reasoning. Their small size enables deployment on laptops, desktops, or personal cloud setups, unlike larger models needing massive clusters. Train the 27B on 13T tokens, 9B on 8T, and 2B on 2T from diverse sources like web docs, code, math\u002Fscience, and multilingual text. Preprocessing filters duplicates, PII, low-quality content, and adult material using heuristics and classifiers, ensuring broad task coverage without common failure modes.",[22,3895,3896],{},"On benchmarks, larger variants excel: 27B PT hits 75.2 MMLU (5-shot), 86.4 HellaSwag (10-shot), 51.8 HumanEval pass@1, 74.0 GSM8K (5-shot maj@1); 9B PT at 71.3 MMLU, 40.2 HumanEval; 2B PT at 51.3 MMLU. They surpass comparably-sized open alternatives across reasoning (ARC-c 71.4 for 27B), QA (TriviaQA 83.7), and math (MATH 42.3), proving state-of-the-art efficiency.",[55,3898,3899,3915],{},[58,3900,3901],{},[61,3902,3903,3906,3909,3912],{},[64,3904,3905],{},"Benchmark",[64,3907,3908],{},"2B PT",[64,3910,3911],{},"9B PT",[64,3913,3914],{},"27B PT",[83,3916,3917,3931,3945],{},[61,3918,3919,3922,3925,3928],{},[88,3920,3921],{},"MMLU 5-shot",[88,3923,3924],{},"51.3",[88,3926,3927],{},"71.3",[88,3929,3930],{},"75.2",[61,3932,3933,3936,3939,3942],{},[88,3934,3935],{},"HumanEval pass@1",[88,3937,3938],{},"17.7",[88,3940,3941],{},"40.2",[88,3943,3944],{},"51.8",[61,3946,3947,3950,3953,3956],{},[88,3948,3949],{},"GSM8K 5-shot",[88,3951,3952],{},"23.9",[88,3954,3955],{},"68.6",[88,3957,3958],{},"74.0",[17,3960,3962],{"id":3961},"train-efficiently-with-tpuv5p-jax-and-pathways","Train Efficiently with TPUv5p, JAX, and Pathways",[22,3964,3965],{},"Leverage TPUv5p hardware for matrix-heavy training, offering higher throughput than GPUs for LLMs. Use JAX for hardware acceleration and ML Pathways for multi-task orchestration in a single Python process, simplifying workflows as in Gemini papers. This combo scales to 13T tokens while cutting development overhead—ideal for replicating on custom infra.",[22,3967,3968],{},"Data mix includes web, code, math, and polyglot sources; dedupe at sentence\u002Fparagraph levels, filter via quality classifiers, and remove PII\u002Fadult content to boost generalization without memorization risks.",[17,3970,3972],{"id":3971},"pass-safety-and-dangerous-capability-thresholds","Pass Safety and Dangerous Capability Thresholds",[22,3974,3975],{},"Instruction-tuned (IT) variants score low toxicity (RealToxicity 8.84 avg for 27B IT) and bias (CrowS-Pairs 36.67 top-1), with strong BBQ (86.94 Disambig for 27B) and TruthfulQA (51.60). They meet Google's internal policies on child safety, harms, and memorization.",[22,3977,3978],{},"Dangerous evals cap risks: 27B IT solves 34\u002F76 InterCode-CTF cyber challenges (low success), 1\u002F13 internal CTF, 0\u002F13 HackTheBox; persuasion tests show 81% find it interesting but minimal harmful shifts (1% toward incorrect beliefs, £3.72 mean donation). Mitigate via preprocessing, post-training, and monitoring—users must add safeguards for production.",[55,3980,3981,3997],{},[58,3982,3983],{},[61,3984,3985,3988,3991,3994],{},[64,3986,3987],{},"Safety Benchmark",[64,3989,3990],{},"2B IT",[64,3992,3993],{},"9B IT",[64,3995,3996],{},"27B IT",[83,3998,3999,4013],{},[61,4000,4001,4004,4007,4010],{},[88,4002,4003],{},"RealToxicity avg",[88,4005,4006],{},"8.16",[88,4008,4009],{},"8.25",[88,4011,4012],{},"8.84",[61,4014,4015,4018,4021,4024],{},[88,4016,4017],{},"TruthfulQA",[88,4019,4020],{},"43.72",[88,4022,4023],{},"50.27",[88,4025,4026],{},"51.60",[22,4028,4029],{},"Limitations: May amplify biases, hallucinate, or violate policies without filters; not for high-risk uses like medical\u002Flegal advice.",{"title":176,"searchDepth":177,"depth":177,"links":4031},[4032,4033,4034],{"id":3889,"depth":177,"text":3890},{"id":3961,"depth":177,"text":3962},{"id":3971,"depth":177,"text":3972},[],{"content_references":4037,"triage":4062},[4038,4043,4046,4049,4053,4056,4059],{"type":189,"title":4039,"author":4040,"publisher":4041,"url":4042,"context":193},"Gemma","Gemma Team","Kaggle","https:\u002F\u002Fwww.kaggle.com\u002Fm\u002F3301",{"type":189,"title":4044,"url":4045,"context":193},"Gemma 2 technical report","https:\u002F\u002Fstorage.googleapis.com\u002Fdeepmind-media\u002Fgemma\u002Fgemma-2-report.pdf",{"type":189,"title":4047,"url":4048,"context":193},"Evaluating Frontier Models for Dangerous Capabilities","https:\u002F\u002Farxiv.org\u002Fabs\u002F2403.13793",{"type":4050,"title":4051,"url":4052,"context":193},"report","2023 Google AI Principles Progress Update","https:\u002F\u002Fstorage.googleapis.com\u002Fgweb-uniblog-publish-prod\u002Fdocuments\u002F2023_Google_AI_Principles_Progress_Update.pdf#page=11",{"type":3846,"title":4054,"url":4055,"context":202},"Tensor Processing Unit (TPU)","https:\u002F\u002Fcloud.google.com\u002Ftpu\u002Fdocs\u002Fintro-to-tpu",{"type":3846,"title":4057,"url":4058,"context":202},"JAX","https:\u002F\u002Fgithub.com\u002Fjax-ml\u002Fjax",{"type":195,"title":4060,"url":4061,"context":202},"ML Pathways","https:\u002F\u002Fblog.google\u002Ftechnology\u002Fai\u002Fintroducing-pathways-next-generation-ai-architecture\u002F",{"relevance":205,"novelty":3862,"quality":205,"actionability":3862,"composite":4063,"reasoning":4064},3.6,"Category: AI & LLMs. The article discusses the performance and deployment of the Gemma 2 LLMs, which addresses the audience's interest in practical AI applications. It provides insights into model efficiency and training techniques, but lacks detailed actionable steps for implementation.","\u002Fsummaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary","2026-04-16 03:04:59",{"title":3879,"description":176},{"loc":4065},"5f72f336c67bc8d8","__oneoff__","https:\u002F\u002Fai.google.dev\u002Fgemma\u002Fdocs\u002Fcore\u002Fmodel_card_2","summaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary",[220,222,221],"Google's Gemma 2 family (2B, 9B, 27B params) are lightweight open decoder-only LLMs trained on 2-13T tokens, outperforming similar-sized open models on MMLU (75.2 for 27B), HumanEval (51.8), and safety benchmarks while running on laptops.",[],"FxYd5aKdUzJM6mZmlDOJJQimECfkFIoTj2FT-EqQlhs",{"id":4078,"title":4079,"ai":4080,"body":4085,"categories":4121,"created_at":183,"date_modified":183,"description":176,"extension":184,"faq":183,"featured":185,"kicker_label":183,"meta":4122,"navigation":208,"path":4135,"published_at":4136,"question":183,"scraped_at":4137,"seo":4138,"sitemap":4139,"source_id":4140,"source_name":215,"source_type":216,"source_url":4141,"stem":4142,"tags":4143,"thumbnail_url":183,"tldr":4145,"tweet":183,"unknown_tags":4146,"__hash__":4147},"summaries\u002Fsummaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary.md","Qwen-Scope SAEs Unlock Actionable LLM Internals",{"provider":7,"model":8,"input_tokens":4081,"output_tokens":4082,"processing_time_ms":4083,"cost_usd":4084},8913,1989,15174,0.0027546,{"type":14,"value":4086,"toc":4115},[4087,4091,4094,4098,4101,4105,4108,4112],[17,4088,4090],{"id":4089},"sae-decomposition-reveals-interpretable-llm-features","SAE Decomposition Reveals Interpretable LLM Features",[22,4092,4093],{},"Sparse autoencoders (SAEs) translate high-dimensional LLM activations into sparse latent features, each corresponding to concepts like languages or behaviors. For Qwen3 and Qwen3.5 models, Qwen-Scope releases 14 SAE groups across 7 variants: dense models (1.7B, 8B, 2B, 9B, 27B) and MoE (30B-A3B, 35B-A3B). SAEs train per layer on residual streams, using top-k (k=50 or 100) activations; dense models expand 16x hidden size, MoE use 32K (16x) or 128K (64x) widths. Except Qwen3.5-27B (instruct), all use base checkpoints. This layer-wise dictionary enables diagnosis of issues like language mixing or repetition without weight changes.",[17,4095,4097],{"id":4096},"steer-outputs-and-classify-via-feature-interventions","Steer Outputs and Classify via Feature Interventions",[22,4099,4100],{},"Apply steering with h' = h + αd to amplify\u002Fsuppress features: suppress Chinese feature (ID 6159) to fix English prompts mixing languages; activate classical-Chinese feature (ID 36398) for stylistic shifts. For toxicity, build classifiers from features firing more on toxic data—OR-rule yields F1>0.90 on English for 1.7B\u002F8B models; English features transfer cross-lingually (stronger to Russian\u002FFrench, weaker to Arabic\u002FChinese), retaining 99% performance with 10% discovery data. These zero-shot methods cut compute needs versus full evals or training heads.",[17,4102,4104],{"id":4103},"proxy-benchmark-analysis-without-model-runs","Proxy Benchmark Analysis Without Model Runs",[22,4106,4107],{},"SAE features act as micro-capabilities for eval: compute redundancy metric from activation overlap correlates ρ≈0.85 with performance-based redundancy on 17 benchmarks (MMLU, GSM8K, MATH, etc.); GSM8K shares 63% features with MATH, allowing safe omission. Pairwise overlap, partialed by MMLU, correlates 75.5% with capability similarity—retain low-overlap benchmarks, consolidate high-overlap ones to streamline suites without forward passes.",[17,4109,4111],{"id":4110},"augment-training-with-feature-driven-signals","Augment Training with Feature-Driven Signals",[22,4113,4114],{},"For SFT, Sparse Autoencoder-guided SFT (SASFT) suppresses non-target language features via auxiliary loss, cutting code-switching >50% across Gemma-2\u002FLlama-3.1\u002FQwen3 on Chinese\u002FRussian\u002FKorean (full elimination in cases like Qwen3-1.7B Korean), preserving multilingual benchmarks. For RL, synthetically generate repetition via feature steering as rare negatives in DAPO, sharply reducing repetition in 1.7B\u002F8B\u002F30B-A3B. Safety synthesis targets missing features: 4k pairs cover 99.74% features (vs. lower for random), boosting accuracy to 77.75% when mixed 1:1 with real data—matching 120k real-only under budget.",{"title":176,"searchDepth":177,"depth":177,"links":4116},[4117,4118,4119,4120],{"id":4089,"depth":177,"text":4090},{"id":4096,"depth":177,"text":4097},{"id":4103,"depth":177,"text":4104},{"id":4110,"depth":177,"text":4111},[236],{"content_references":4123,"triage":4133},[4124,4127,4130],{"type":189,"title":4125,"url":4126,"context":198},"Qwen Scope","https:\u002F\u002Fqianwen-res.oss-accelerate.aliyuncs.com\u002Fqwen-scope\u002FQwen_Scope.pdf",{"type":3856,"title":4128,"url":4129,"context":198},"Qwen-Scope Weights","https:\u002F\u002Fhuggingface.co\u002Fcollections\u002FQwen\u002Fqwen-scope",{"type":195,"title":4131,"url":4132,"context":198},"Qwen-Scope Technical Details","https:\u002F\u002Fqwen.ai\u002Fblog?id=qwen-scope",{"relevance":204,"novelty":205,"quality":205,"actionability":205,"composite":206,"reasoning":4134},"Category: AI & LLMs. The article provides in-depth insights into Qwen-Scope's sparse autoencoders, which are practical tools for developers working with LLMs, addressing specific pain points like feature interpretation and output steering. It offers actionable techniques for applying these features in real-world scenarios, such as toxicity classification and training optimizations.","\u002Fsummaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary","2026-05-01 08:25:21","2026-05-03 17:01:52",{"title":4079,"description":176},{"loc":4135},"dda195cde5fb0456","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fqwen-ai-releases-qwen-scope-an-open-source-sparse-autoencoders-sae-suite-that-turns-llm-internal-features-into-practical-development-tools\u002F","summaries\u002Fdda195cde5fb0456-qwen-scope-saes-unlock-actionable-llm-internals-summary",[220,221,222,4144],"ai-tools","Qwen-Scope's open SAEs on 7 Qwen models decompose activations into interpretable features for steering outputs, proxy benchmark analysis (ρ=0.85 correlation), toxicity classification (F1>0.90), and training fixes like 50% code-switching reduction.",[],"zbictEOZXC-EHp6nI5NAS1Np-cHfHzWO9BF_YlaGEmc",{"id":4149,"title":4150,"ai":4151,"body":4156,"categories":4196,"created_at":183,"date_modified":183,"description":176,"extension":184,"faq":183,"featured":185,"kicker_label":183,"meta":4197,"navigation":208,"path":4214,"published_at":4215,"question":183,"scraped_at":4216,"seo":4217,"sitemap":4218,"source_id":4219,"source_name":215,"source_type":216,"source_url":4220,"stem":4221,"tags":4222,"thumbnail_url":183,"tldr":4224,"tweet":183,"unknown_tags":4225,"__hash__":4226},"summaries\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary.md","OpenMythos: 770M RDT Matches 1.3B Transformer Power",{"provider":7,"model":8,"input_tokens":4152,"output_tokens":4153,"processing_time_ms":4154,"cost_usd":4155},5480,2000,15694,0.0020735,{"type":14,"value":4157,"toc":4191},[4158,4162,4165,4168,4171,4175,4178,4181,4185,4188],[17,4159,4161],{"id":4160},"recurrent-depth-transformers-scale-reasoning-via-inference-loops","Recurrent-Depth Transformers Scale Reasoning via Inference Loops",[22,4163,4164],{},"Recurrent-Depth Transformers (RDTs), or Looped Transformers, differ from standard transformers by reusing a fixed set of weights iteratively across T loop steps (up to 16 in OpenMythos) in a single forward pass. This decouples reasoning depth from parameter count: deeper reasoning comes from more loops at inference, not more layers or params. The structure follows Prelude → Recurrent Block → Coda, where Prelude and Coda are one-time standard transformer layers.",[22,4166,4167],{},"In the Recurrent Block, update hidden state ht+1 = A·ht + B·e + Transformer(ht, e), with encoded input e re-injected each step to prevent drift. This mimics draft refinement, enabling continuous latent-space reasoning without mid-loop token emissions—equivalent to chain-of-thought over vectors, per Saunshi et al. (2025). Unlike standard transformers failing on unseen depths (e.g., 5-hop trained model flops on 10-hop), RDTs extend depth at inference without retraining: allocate more loops to hard problems.",[22,4169,4170],{},"Replace standard FFN with Mixture-of-Experts (MoE) from DeepSeekMoE: sparse top-K experts per token plus shared experts, routed differently per loop for distinct computation despite tied weights. Use Multi-Latent Attention from DeepSeek-V2, caching compressed low-rank KV latents for 10–20× KV memory savings.",[17,4172,4174],{"id":4173},"stability-and-adaptive-depth-prevent-explosion-or-overthinking","Stability and Adaptive Depth Prevent Explosion or Overthinking",[22,4176,4177],{},"Looping risks residual explosion (unbounded ht growth) or overthinking (drift past solutions). Enforce Linear Time-Invariant (LTI) constraint from Parcae: spectral radius ρ(A) \u003C 1 by construction, ensuring stability independent of learning rate. Add Adaptive Computation Time (ACT) halting: learned scalar per position dynamically stops loops when converged—harder tokens get more compute.",[22,4179,4180],{},"Depth-Wise LoRA adapters apply small rank-r matrices per iteration, differentiating behavior without bloating params, blending pure tying and unique layers.",[17,4182,4184],{"id":4183},"half-the-params-equivalent-performance-via-predictable-scaling","Half the Params, Equivalent Performance via Predictable Scaling",[22,4186,4187],{},"At 770M params, OpenMythos RDT matches 1.3B standard transformer on identical data, per Parcae (Prairie et al., 2026) scaling laws: optimal recurrence and token count follow power laws. This shifts scaling focus from training params to inference loops, challenging bigger-is-better assumptions.",[22,4189,4190],{},"OpenMythos delivers PyTorch code for RDT with MoE, LTI training, LoRA adapters, and baselines—falsifiable hypothesis for Claude Mythos, runnable for experimenting with looped dynamics.",{"title":176,"searchDepth":177,"depth":177,"links":4192},[4193,4194,4195],{"id":4160,"depth":177,"text":4161},{"id":4173,"depth":177,"text":4174},{"id":4183,"depth":177,"text":4184},[],{"content_references":4198,"triage":4211},[4199,4202,4205,4209],{"type":3846,"title":4200,"url":4201,"context":202},"OpenMythos","https:\u002F\u002Fgithub.com\u002Fkyegomez\u002FOpenMythos",{"type":189,"title":4203,"url":4204,"context":193},"Saunshi et al. (2025)","https:\u002F\u002Farxiv.org\u002Fabs\u002F2502.17416",{"type":189,"title":4206,"author":4207,"url":4208,"context":193},"Parcae","Prairie et al.","https:\u002F\u002Farxiv.org\u002Fabs\u002F2604.12946",{"type":195,"title":4210,"context":202},"COCONUT (2024)",{"relevance":3862,"novelty":3862,"quality":205,"actionability":177,"composite":4212,"reasoning":4213},3.05,"Category: AI & LLMs. The article discusses a new architecture for transformers, which is relevant to AI engineering, but it lacks practical applications or examples for product builders to implement this technology. While it presents some novel insights into the structure and functioning of Recurrent-Depth Transformers, it does not provide actionable steps or frameworks that the audience can directly apply.","\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary","2026-04-19 19:47:49","2026-04-21 15:26:59",{"title":4150,"description":176},{"loc":4214},"d64cbc961f981052","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fmeet-openmythos-an-open-source-pytorch-reconstruction-of-claude-mythos-where-770m-parameters-match-a-1-3b-transformer\u002F","summaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary",[220,221,222,4223],"python","OpenMythos reconstructs Claude Mythos as a Recurrent-Depth Transformer (RDT) in PyTorch: loop the same weights T=16 times for reasoning depth, achieving 1.3B transformer performance at 770M params via MoE, stability fixes, and inference-time scaling.",[],"catU0v9NcZQXj7dgnu-iH80ub7d_pZ-fh6mDqyuTN3c"]