[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-openmythos-770m-rdt-matches-1-3b-transformer-power-summary":3,"summaries-facets-categories":105,"summary-related-openmythos-770m-rdt-matches-1-3b-transformer-power-summary":4511},{"id":4,"title":5,"ai":6,"body":13,"categories":58,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":62,"navigation":86,"path":87,"published_at":88,"question":59,"scraped_at":89,"seo":90,"sitemap":91,"source_id":92,"source_name":93,"source_type":94,"source_url":95,"stem":96,"tags":97,"thumbnail_url":59,"tldr":102,"tweet":59,"unknown_tags":103,"__hash__":104},"summaries\u002Fsummaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-power-summary.md","OpenMythos: 770M RDT Matches 1.3B Transformer Power",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",5480,2000,15694,0.0020735,{"type":14,"value":15,"toc":51},"minimark",[16,21,25,28,31,35,38,41,45,48],[17,18,20],"h2",{"id":19},"recurrent-depth-transformers-scale-reasoning-via-inference-loops","Recurrent-Depth Transformers Scale Reasoning via Inference Loops",[22,23,24],"p",{},"Recurrent-Depth Transformers (RDTs), or Looped Transformers, differ from standard transformers by reusing a fixed set of weights iteratively across T loop steps (up to 16 in OpenMythos) in a single forward pass. This decouples reasoning depth from parameter count: deeper reasoning comes from more loops at inference, not more layers or params. The structure follows Prelude → Recurrent Block → Coda, where Prelude and Coda are one-time standard transformer layers.",[22,26,27],{},"In the Recurrent Block, update hidden state ht+1 = A·ht + B·e + Transformer(ht, e), with encoded input e re-injected each step to prevent drift. This mimics draft refinement, enabling continuous latent-space reasoning without mid-loop token emissions—equivalent to chain-of-thought over vectors, per Saunshi et al. (2025). Unlike standard transformers failing on unseen depths (e.g., 5-hop trained model flops on 10-hop), RDTs extend depth at inference without retraining: allocate more loops to hard problems.",[22,29,30],{},"Replace standard FFN with Mixture-of-Experts (MoE) from DeepSeekMoE: sparse top-K experts per token plus shared experts, routed differently per loop for distinct computation despite tied weights. Use Multi-Latent Attention from DeepSeek-V2, caching compressed low-rank KV latents for 10–20× KV memory savings.",[17,32,34],{"id":33},"stability-and-adaptive-depth-prevent-explosion-or-overthinking","Stability and Adaptive Depth Prevent Explosion or Overthinking",[22,36,37],{},"Looping risks residual explosion (unbounded ht growth) or overthinking (drift past solutions). Enforce Linear Time-Invariant (LTI) constraint from Parcae: spectral radius ρ(A) \u003C 1 by construction, ensuring stability independent of learning rate. Add Adaptive Computation Time (ACT) halting: learned scalar per position dynamically stops loops when converged—harder tokens get more compute.",[22,39,40],{},"Depth-Wise LoRA adapters apply small rank-r matrices per iteration, differentiating behavior without bloating params, blending pure tying and unique layers.",[17,42,44],{"id":43},"half-the-params-equivalent-performance-via-predictable-scaling","Half the Params, Equivalent Performance via Predictable Scaling",[22,46,47],{},"At 770M params, OpenMythos RDT matches 1.3B standard transformer on identical data, per Parcae (Prairie et al., 2026) scaling laws: optimal recurrence and token count follow power laws. This shifts scaling focus from training params to inference loops, challenging bigger-is-better assumptions.",[22,49,50],{},"OpenMythos delivers PyTorch code for RDT with MoE, LTI training, LoRA adapters, and baselines—falsifiable hypothesis for Claude Mythos, runnable for experimenting with looped dynamics.",{"title":52,"searchDepth":53,"depth":53,"links":54},"",2,[55,56,57],{"id":19,"depth":53,"text":20},{"id":33,"depth":53,"text":34},{"id":43,"depth":53,"text":44},[],null,"md",false,{"content_references":63,"triage":81},[64,69,74,78],{"type":65,"title":66,"url":67,"context":68},"tool","OpenMythos","https:\u002F\u002Fgithub.com\u002Fkyegomez\u002FOpenMythos","mentioned",{"type":70,"title":71,"url":72,"context":73},"paper","Saunshi et al. (2025)","https:\u002F\u002Farxiv.org\u002Fabs\u002F2502.17416","cited",{"type":70,"title":75,"author":76,"url":77,"context":73},"Parcae","Prairie et al.","https:\u002F\u002Farxiv.org\u002Fabs\u002F2604.12946",{"type":79,"title":80,"context":68},"other","COCONUT (2024)",{"relevance":82,"novelty":82,"quality":83,"actionability":53,"composite":84,"reasoning":85},3,4,3.05,"Category: AI & LLMs. The article discusses a new architecture for transformers, which is relevant to AI engineering, but it lacks practical applications or examples for product builders to implement this technology. While it presents some novel insights into the structure and functioning of Recurrent-Depth Transformers, it does not provide actionable steps or frameworks that the audience can directly apply.",true,"\u002Fsummaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-power-summary","2026-04-19 19:47:49","2026-04-21 15:26:59",{"title":5,"description":52},{"loc":87},"d64cbc961f981052","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fmeet-openmythos-an-open-source-pytorch-reconstruction-of-claude-mythos-where-770m-parameters-match-a-1-3b-transformer\u002F","summaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-power-summary",[98,99,100,101],"llm","machine-learning","open-source","python","OpenMythos reconstructs Claude Mythos as a Recurrent-Depth Transformer (RDT) in PyTorch: loop the same weights T=16 times for reasoning depth, achieving 1.3B transformer performance at 770M params via MoE, stability fixes, and inference-time scaling.",[],"fwcGvrplxzzV-ClPwppvCpAyd3ndycdr9bj9bm6OOSU",[106,109,111,114,116,119,122,125,128,130,132,134,136,138,140,142,145,147,149,151,153,155,157,160,162,164,166,168,170,172,174,176,178,180,182,184,186,188,190,192,194,196,198,200,202,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273,275,277,279,281,283,285,287,289,291,293,295,297,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1051,1053,1055,1057,1059,1061,1063,1065,1067,1069,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1851,1853,1855,1857,1859,1861,1863,1865,1867,1869,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2651,2653,2655,2657,2659,2661,2663,2665,2667,2669,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3451,3453,3455,3457,3459,3461,3463,3465,3467,3469,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659,3661,3663,3665,3667,3669,3671,3673,3675,3677,3679,3681,3683,3685,3687,3689,3691,3693,3695,3697,3699,3701,3703,3705,3707,3709,3711,3713,3715,3717,3719,3721,3723,3725,3727,3729,3731,3733,3735,3737,3739,3741,3743,3745,3747,3749,3751,3753,3755,3757,3759,3761,3763,3765,3767,3769,3771,3773,3775,3777,3779,3781,3783,3785,3787,3789,3791,3793,3795,3797,3799,3801,3803,3805,3807,3809,3811,3813,3815,3817,3819,3821,3823,3825,3827,3829,3831,3833,3835,3837,3839,3841,3843,3845,3847,3849,3851,3853,3855,3857,3859,3861,3863,3865,3867,3869,3871,3873,3875,3877,3879,3881,3883,3885,3887,3889,3891,3893,3895,3897,3899,3901,3903,3905,3907,3909,3911,3913,3915,3917,3919,3921,3923,3925,3927,3929,3931,3933,3935,3937,3939,3941,3943,3945,3947,3949,3951,3953,3955,3957,3959,3961,3963,3965,3967,3969,3971,3973,3975,3977,3979,3981,3983,3985,3987,3989,3991,3993,3995,3997,3999,4001,4003,4005,4007,4009,4011,4013,4015,4017,4019,4021,4023,4025,4027,4029,4031,4033,4035,4037,4039,4041,4043,4045,4047,4049,4051,4053,4055,4057,4059,4061,4063,4065,4067,4069,4071,4073,4075,4077,4079,4081,4083,4085,4087,4089,4091,4093,4095,4097,4099,4101,4103,4105,4107,4109,4111,4113,4115,4117,4119,4121,4123,4125,4127,4129,4131,4133,4135,4137,4139,4141,4143,4145,4147,4149,4151,4153,4155,4157,4159,4161,4163,4165,4167,4169,4171,4173,4175,4177,4179,4181,4183,4185,4187,4189,4191,4193,4195,4197,4199,4201,4203,4205,4207,4209,4211,4213,4215,4217,4219,4221,4223,4225,4227,4229,4231,4233,4235,4237,4239,4241,4243,4245,4247,4249,4251,4253,4255,4257,4259,4261,4263,4265,4267,4269,4271,4273,4275,4277,4279,4281,4283,4285,4287,4289,4291,4293,4295,4297,4299,4301,4303,4305,4307,4309,4311,4313,4315,4317,4319,4321,4323,4325,4327,4329,4331,4333,4335,4337,4339,4341,4343,4345,4347,4349,4351,4353,4355,4357,4359,4361,4363,4365,4367,4369,4371,4373,4375,4377,4379,4381,4383,4385,4387,4389,4391,4393,4395,4397,4399,4401,4403,4405,4407,4409,4411,4413,4415,4417,4419,4421,4423,4425,4427,4429,4431,4433,4435,4437,4439,4441,4443,4445,4447,4449,4451,4453,4455,4457,4459,4461,4463,4465,4467,4469,4471,4473,4475,4477,4479,4481,4483,4485,4487,4489,4491,4493,4495,4497,4499,4501,4503,4505,4507,4509],{"categories":107},[108],"Business & SaaS",{"categories":110},[108],{"categories":112},[113],"AI News & Trends",{"categories":115},[],{"categories":117},[118],"AI Automation",{"categories":120},[121],"Marketing & Growth",{"categories":123},[124],"Design & Frontend",{"categories":126},[127],"Software Engineering",{"categories":129},[118],{"categories":131},[],{"categories":133},[124],{"categories":135},[124],{"categories":137},[118],{"categories":139},[124],{"categories":141},[124],{"categories":143},[144],"AI & LLMs",{"categories":146},[124],{"categories":148},[124],{"categories":150},[],{"categories":152},[124],{"categories":154},[124],{"categories":156},[144],{"categories":158},[159],"Developer Productivity",{"categories":161},[144],{"categories":163},[144],{"categories":165},[144],{"categories":167},[113],{"categories":169},[144],{"categories":171},[118],{"categories":173},[108],{"categories":175},[113],{"categories":177},[121],{"categories":179},[],{"categories":181},[],{"categories":183},[118],{"categories":185},[118],{"categories":187},[118],{"categories":189},[121],{"categories":191},[144],{"categories":193},[159],{"categories":195},[113],{"categories":197},[],{"categories":199},[],{"categories":201},[],{"categories":203},[204],"Data Science & Visualization",{"categories":206},[],{"categories":208},[118],{"categories":210},[127],{"categories":212},[118],{"categories":214},[118],{"categories":216},[144],{"categories":218},[121],{"categories":220},[118],{"categories":222},[],{"categories":224},[],{"categories":226},[],{"categories":228},[124],{"categories":230},[124],{"categories":232},[118],{"categories":234},[121],{"categories":236},[159],{"categories":238},[124],{"categories":240},[144],{"categories":242},[127],{"categories":244},[144],{"categories":246},[],{"categories":248},[118],{"categories":250},[144],{"categories":252},[159],{"categories":254},[159],{"categories":256},[],{"categories":258},[121],{"categories":260},[108],{"categories":262},[144],{"categories":264},[108],{"categories":266},[108],{"categories":268},[118],{"categories":270},[121],{"categories":272},[118],{"categories":274},[108],{"categories":276},[118],{"categories":278},[124],{"categories":280},[144],{"categories":282},[124],{"categories":284},[144],{"categories":286},[108],{"categories":288},[144],{"categories":290},[121],{"categories":292},[],{"categories":294},[144],{"categories":296},[108],{"categories":298},[],{"categories":300},[113],{"categories":302},[127],{"categories":304},[],{"categories":306},[144],{"categories":308},[124],{"categories":310},[144],{"categories":312},[124],{"categories":314},[],{"categories":316},[118],{"categories":318},[],{"categories":320},[],{"categories":322},[],{"categories":324},[144],{"categories":326},[],{"categories":328},[144],{"categories":330},[144],{"categories":332},[124],{"categories":334},[144],{"categories":336},[159],{"categories":338},[118],{"categories":340},[121],{"categories":342},[159],{"categories":344},[159],{"categories":346},[159],{"categories":348},[121],{"categories":350},[121],{"categories":352},[144],{"categories":354},[144],{"categories":356},[124],{"categories":358},[108],{"categories":360},[124],{"categories":362},[127],{"categories":364},[108],{"categories":366},[108],{"categories":368},[108],{"categories":370},[124],{"categories":372},[],{"categories":374},[],{"categories":376},[144],{"categories":378},[144],{"categories":380},[127],{"categories":382},[144],{"categories":384},[144],{"categories":386},[],{"categories":388},[144],{"categories":390},[144],{"categories":392},[],{"categories":394},[144],{"categories":396},[113],{"categories":398},[113],{"categories":400},[],{"categories":402},[],{"categories":404},[121],{"categories":406},[121],{"categories":408},[127],{"categories":410},[144],{"categories":412},[],{"categories":414},[],{"categories":416},[118],{"categories":418},[144],{"categories":420},[144],{"categories":422},[],{"categories":424},[144,108],{"categories":426},[144],{"categories":428},[],{"categories":430},[144],{"categories":432},[144],{"categories":434},[],{"categories":436},[],{"categories":438},[118],{"categories":440},[144],{"categories":442},[144],{"categories":444},[118],{"categories":446},[144],{"categories":448},[],{"categories":450},[],{"categories":452},[144],{"categories":454},[],{"categories":456},[144],{"categories":458},[144],{"categories":460},[],{"categories":462},[118],{"categories":464},[124],{"categories":466},[],{"categories":468},[118,469],"DevOps & Cloud",{"categories":471},[144],{"categories":473},[118],{"categories":475},[144],{"categories":477},[],{"categories":479},[],{"categories":481},[],{"categories":483},[],{"categories":485},[144],{"categories":487},[118],{"categories":489},[],{"categories":491},[118],{"categories":493},[],{"categories":495},[144],{"categories":497},[],{"categories":499},[],{"categories":501},[],{"categories":503},[],{"categories":505},[118],{"categories":507},[124],{"categories":509},[144],{"categories":511},[121],{"categories":513},[113],{"categories":515},[108],{"categories":517},[159],{"categories":519},[],{"categories":521},[118],{"categories":523},[118],{"categories":525},[144],{"categories":527},[],{"categories":529},[],{"categories":531},[],{"categories":533},[118],{"categories":535},[],{"categories":537},[118],{"categories":539},[118],{"categories":541},[113],{"categories":543},[118],{"categories":545},[144],{"categories":547},[],{"categories":549},[144],{"categories":551},[],{"categories":553},[113],{"categories":555},[118,556],"Product Strategy",{"categories":558},[127],{"categories":560},[469],{"categories":562},[556],{"categories":564},[144],{"categories":566},[118],{"categories":568},[],{"categories":570},[113],{"categories":572},[113],{"categories":574},[118],{"categories":576},[],{"categories":578},[118],{"categories":580},[144],{"categories":582},[144],{"categories":584},[159],{"categories":586},[144],{"categories":588},[],{"categories":590},[144,127],{"categories":592},[113],{"categories":594},[144],{"categories":596},[113],{"categories":598},[118],{"categories":600},[113],{"categories":602},[],{"categories":604},[127],{"categories":606},[108],{"categories":608},[],{"categories":610},[118],{"categories":612},[118],{"categories":614},[118],{"categories":616},[118],{"categories":618},[108],{"categories":620},[124],{"categories":622},[121],{"categories":624},[],{"categories":626},[118],{"categories":628},[],{"categories":630},[113],{"categories":632},[113],{"categories":634},[113],{"categories":636},[118],{"categories":638},[113],{"categories":640},[144],{"categories":642},[159],{"categories":644},[144],{"categories":646},[127],{"categories":648},[144,159],{"categories":650},[159],{"categories":652},[159],{"categories":654},[159],{"categories":656},[159],{"categories":658},[144],{"categories":660},[],{"categories":662},[],{"categories":664},[121],{"categories":666},[],{"categories":668},[144],{"categories":670},[159],{"categories":672},[144],{"categories":674},[124],{"categories":676},[127],{"categories":678},[],{"categories":680},[144],{"categories":682},[159],{"categories":684},[121],{"categories":686},[113],{"categories":688},[127],{"categories":690},[144],{"categories":692},[],{"categories":694},[127],{"categories":696},[124],{"categories":698},[108],{"categories":700},[108],{"categories":702},[],{"categories":704},[124],{"categories":706},[108],{"categories":708},[113],{"categories":710},[159],{"categories":712},[118],{"categories":714},[118],{"categories":716},[144],{"categories":718},[144],{"categories":720},[113],{"categories":722},[113],{"categories":724},[159],{"categories":726},[113],{"categories":728},[],{"categories":730},[556],{"categories":732},[118],{"categories":734},[113],{"categories":736},[113],{"categories":738},[113],{"categories":740},[144],{"categories":742},[118],{"categories":744},[118],{"categories":746},[108],{"categories":748},[108],{"categories":750},[144],{"categories":752},[113],{"categories":754},[],{"categories":756},[144],{"categories":758},[108],{"categories":760},[118],{"categories":762},[118],{"categories":764},[118],{"categories":766},[124],{"categories":768},[118],{"categories":770},[159],{"categories":772},[113],{"categories":774},[113],{"categories":776},[113],{"categories":778},[113],{"categories":780},[113],{"categories":782},[],{"categories":784},[],{"categories":786},[159],{"categories":788},[113],{"categories":790},[113],{"categories":792},[113],{"categories":794},[],{"categories":796},[144],{"categories":798},[],{"categories":800},[],{"categories":802},[124],{"categories":804},[108],{"categories":806},[],{"categories":808},[113],{"categories":810},[118],{"categories":812},[118],{"categories":814},[118],{"categories":816},[121],{"categories":818},[118],{"categories":820},[],{"categories":822},[113],{"categories":824},[113],{"categories":826},[144],{"categories":828},[],{"categories":830},[121],{"categories":832},[121],{"categories":834},[144],{"categories":836},[113],{"categories":838},[108],{"categories":840},[127],{"categories":842},[144],{"categories":844},[],{"categories":846},[144],{"categories":848},[144],{"categories":850},[127],{"categories":852},[144],{"categories":854},[144],{"categories":856},[144],{"categories":858},[121],{"categories":860},[113],{"categories":862},[144],{"categories":864},[144],{"categories":866},[113],{"categories":868},[118],{"categories":870},[159],{"categories":872},[108],{"categories":874},[144],{"categories":876},[159],{"categories":878},[159],{"categories":880},[],{"categories":882},[121],{"categories":884},[113],{"categories":886},[113],{"categories":888},[159],{"categories":890},[118],{"categories":892},[118],{"categories":894},[118],{"categories":896},[118],{"categories":898},[124],{"categories":900},[144],{"categories":902},[144],{"categories":904},[556],{"categories":906},[144],{"categories":908},[144],{"categories":910},[118],{"categories":912},[108],{"categories":914},[121],{"categories":916},[],{"categories":918},[108],{"categories":920},[108],{"categories":922},[],{"categories":924},[124],{"categories":926},[144],{"categories":928},[],{"categories":930},[],{"categories":932},[113],{"categories":934},[113],{"categories":936},[113],{"categories":938},[113],{"categories":940},[],{"categories":942},[113],{"categories":944},[144],{"categories":946},[144],{"categories":948},[],{"categories":950},[113],{"categories":952},[113],{"categories":954},[108],{"categories":956},[144],{"categories":958},[],{"categories":960},[],{"categories":962},[113],{"categories":964},[113],{"categories":966},[113],{"categories":968},[144],{"categories":970},[113],{"categories":972},[113],{"categories":974},[113],{"categories":976},[113],{"categories":978},[113],{"categories":980},[],{"categories":982},[118],{"categories":984},[144],{"categories":986},[121],{"categories":988},[108],{"categories":990},[118],{"categories":992},[144],{"categories":994},[],{"categories":996},[121],{"categories":998},[113],{"categories":1000},[113],{"categories":1002},[113],{"categories":1004},[113],{"categories":1006},[159],{"categories":1008},[127],{"categories":1010},[],{"categories":1012},[144],{"categories":1014},[118],{"categories":1016},[118],{"categories":1018},[118],{"categories":1020},[469],{"categories":1022},[118],{"categories":1024},[144],{"categories":1026},[144],{"categories":1028},[127],{"categories":1030},[469],{"categories":1032},[204],{"categories":1034},[144],{"categories":1036},[204],{"categories":1038},[],{"categories":1040},[121],{"categories":1042},[121],{"categories":1044},[124],{"categories":1046},[469],{"categories":1048},[118],{"categories":1050},[144],{"categories":1052},[144],{"categories":1054},[118],{"categories":1056},[118],{"categories":1058},[118],{"categories":1060},[159],{"categories":1062},[159],{"categories":1064},[118],{"categories":1066},[118],{"categories":1068},[],{"categories":1070},[118],{"categories":1072},[118],{"categories":1074},[144],{"categories":1076},[204],{"categories":1078},[118],{"categories":1080},[118],{"categories":1082},[118],{"categories":1084},[118],{"categories":1086},[108],{"categories":1088},[124],{"categories":1090},[113],{"categories":1092},[127],{"categories":1094},[469],{"categories":1096},[127],{"categories":1098},[204],{"categories":1100},[],{"categories":1102},[127],{"categories":1104},[],{"categories":1106},[],{"categories":1108},[127],{"categories":1110},[144],{"categories":1112},[],{"categories":1114},[],{"categories":1116},[],{"categories":1118},[108],{"categories":1120},[],{"categories":1122},[],{"categories":1124},[204],{"categories":1126},[144],{"categories":1128},[469],{"categories":1130},[144],{"categories":1132},[],{"categories":1134},[118],{"categories":1136},[159],{"categories":1138},[159],{"categories":1140},[121],{"categories":1142},[121],{"categories":1144},[121],{"categories":1146},[469],{"categories":1148},[127],{"categories":1150},[118],{"categories":1152},[108],{"categories":1154},[108],{"categories":1156},[127],{"categories":1158},[124],{"categories":1160},[204],{"categories":1162},[124],{"categories":1164},[],{"categories":1166},[144],{"categories":1168},[118],{"categories":1170},[118],{"categories":1172},[159],{"categories":1174},[118],{"categories":1176},[118],{"categories":1178},[124],{"categories":1180},[124],{"categories":1182},[118],{"categories":1184},[469],{"categories":1186},[144],{"categories":1188},[],{"categories":1190},[121],{"categories":1192},[118],{"categories":1194},[108],{"categories":1196},[118],{"categories":1198},[118],{"categories":1200},[],{"categories":1202},[144],{"categories":1204},[118],{"categories":1206},[118],{"categories":1208},[159],{"categories":1210},[118],{"categories":1212},[144],{"categories":1214},[],{"categories":1216},[118],{"categories":1218},[],{"categories":1220},[124],{"categories":1222},[159],{"categories":1224},[144],{"categories":1226},[127],{"categories":1228},[124],{"categories":1230},[159],{"categories":1232},[204],{"categories":1234},[159],{"categories":1236},[],{"categories":1238},[144],{"categories":1240},[144],{"categories":1242},[556],{"categories":1244},[127],{"categories":1246},[144,118],{"categories":1248},[118],{"categories":1250},[144],{"categories":1252},[118],{"categories":1254},[118,127],{"categories":1256},[118],{"categories":1258},[144],{"categories":1260},[],{"categories":1262},[159],{"categories":1264},[144],{"categories":1266},[118],{"categories":1268},[144],{"categories":1270},[],{"categories":1272},[127],{"categories":1274},[108],{"categories":1276},[118],{"categories":1278},[],{"categories":1280},[204],{"categories":1282},[127],{"categories":1284},[118],{"categories":1286},[127],{"categories":1288},[],{"categories":1290},[118],{"categories":1292},[],{"categories":1294},[118],{"categories":1296},[],{"categories":1298},[],{"categories":1300},[124],{"categories":1302},[159],{"categories":1304},[144],{"categories":1306},[118],{"categories":1308},[],{"categories":1310},[118],{"categories":1312},[127],{"categories":1314},[144],{"categories":1316},[144],{"categories":1318},[127],{"categories":1320},[127],{"categories":1322},[159],{"categories":1324},[108],{"categories":1326},[],{"categories":1328},[144],{"categories":1330},[144],{"categories":1332},[144],{"categories":1334},[118],{"categories":1336},[144],{"categories":1338},[],{"categories":1340},[124],{"categories":1342},[144],{"categories":1344},[118],{"categories":1346},[],{"categories":1348},[144],{"categories":1350},[],{"categories":1352},[144],{"categories":1354},[],{"categories":1356},[],{"categories":1358},[],{"categories":1360},[144],{"categories":1362},[144],{"categories":1364},[144],{"categories":1366},[144],{"categories":1368},[],{"categories":1370},[144],{"categories":1372},[144],{"categories":1374},[144],{"categories":1376},[],{"categories":1378},[144],{"categories":1380},[],{"categories":1382},[121],{"categories":1384},[144],{"categories":1386},[],{"categories":1388},[],{"categories":1390},[],{"categories":1392},[144],{"categories":1394},[113],{"categories":1396},[113],{"categories":1398},[],{"categories":1400},[118],{"categories":1402},[144],{"categories":1404},[],{"categories":1406},[144],{"categories":1408},[144],{"categories":1410},[113],{"categories":1412},[],{"categories":1414},[144],{"categories":1416},[113],{"categories":1418},[118],{"categories":1420},[144],{"categories":1422},[],{"categories":1424},[],{"categories":1426},[],{"categories":1428},[118],{"categories":1430},[118],{"categories":1432},[118],{"categories":1434},[118],{"categories":1436},[144],{"categories":1438},[124],{"categories":1440},[124],{"categories":1442},[118],{"categories":1444},[118],{"categories":1446},[159],{"categories":1448},[556],{"categories":1450},[159],{"categories":1452},[159],{"categories":1454},[144],{"categories":1456},[118],{"categories":1458},[144],{"categories":1460},[159],{"categories":1462},[144],{"categories":1464},[118],{"categories":1466},[118],{"categories":1468},[118],{"categories":1470},[118],{"categories":1472},[118],{"categories":1474},[144],{"categories":1476},[159],{"categories":1478},[159],{"categories":1480},[121],{"categories":1482},[118],{"categories":1484},[],{"categories":1486},[118],{"categories":1488},[],{"categories":1490},[113],{"categories":1492},[144],{"categories":1494},[],{"categories":1496},[108],{"categories":1498},[124],{"categories":1500},[124],{"categories":1502},[118],{"categories":1504},[118],{"categories":1506},[144],{"categories":1508},[144],{"categories":1510},[113],{"categories":1512},[113],{"categories":1514},[469],{"categories":1516},[118],{"categories":1518},[113],{"categories":1520},[],{"categories":1522},[144],{"categories":1524},[118],{"categories":1526},[118],{"categories":1528},[118],{"categories":1530},[118],{"categories":1532},[144],{"categories":1534},[144],{"categories":1536},[144],{"categories":1538},[144],{"categories":1540},[118],{"categories":1542},[118],{"categories":1544},[118],{"categories":1546},[118],{"categories":1548},[],{"categories":1550},[124],{"categories":1552},[144],{"categories":1554},[144],{"categories":1556},[144],{"categories":1558},[],{"categories":1560},[121],{"categories":1562},[],{"categories":1564},[159],{"categories":1566},[],{"categories":1568},[118],{"categories":1570},[159],{"categories":1572},[124],{"categories":1574},[159],{"categories":1576},[],{"categories":1578},[159],{"categories":1580},[159],{"categories":1582},[],{"categories":1584},[124],{"categories":1586},[118],{"categories":1588},[118],{"categories":1590},[159],{"categories":1592},[144],{"categories":1594},[144],{"categories":1596},[],{"categories":1598},[113],{"categories":1600},[],{"categories":1602},[121],{"categories":1604},[],{"categories":1606},[124],{"categories":1608},[113],{"categories":1610},[124],{"categories":1612},[124],{"categories":1614},[124],{"categories":1616},[124],{"categories":1618},[124],{"categories":1620},[124],{"categories":1622},[124],{"categories":1624},[124],{"categories":1626},[124],{"categories":1628},[124],{"categories":1630},[],{"categories":1632},[118],{"categories":1634},[124],{"categories":1636},[144],{"categories":1638},[144],{"categories":1640},[124],{"categories":1642},[124],{"categories":1644},[124],{"categories":1646},[124],{"categories":1648},[124],{"categories":1650},[124],{"categories":1652},[124],{"categories":1654},[144,124],{"categories":1656},[124],{"categories":1658},[124],{"categories":1660},[124],{"categories":1662},[124],{"categories":1664},[],{"categories":1666},[124],{"categories":1668},[124],{"categories":1670},[124],{"categories":1672},[124],{"categories":1674},[124],{"categories":1676},[124],{"categories":1678},[124],{"categories":1680},[124],{"categories":1682},[124],{"categories":1684},[124,144],{"categories":1686},[124],{"categories":1688},[124],{"categories":1690},[],{"categories":1692},[113],{"categories":1694},[],{"categories":1696},[144],{"categories":1698},[],{"categories":1700},[118],{"categories":1702},[469],{"categories":1704},[556],{"categories":1706},[118],{"categories":1708},[118],{"categories":1710},[],{"categories":1712},[118],{"categories":1714},[],{"categories":1716},[118],{"categories":1718},[],{"categories":1720},[],{"categories":1722},[144],{"categories":1724},[144],{"categories":1726},[144],{"categories":1728},[113],{"categories":1730},[113],{"categories":1732},[113],{"categories":1734},[113],{"categories":1736},[],{"categories":1738},[113],{"categories":1740},[],{"categories":1742},[113],{"categories":1744},[144],{"categories":1746},[113],{"categories":1748},[113],{"categories":1750},[113],{"categories":1752},[113],{"categories":1754},[144],{"categories":1756},[113],{"categories":1758},[118],{"categories":1760},[],{"categories":1762},[118],{"categories":1764},[113],{"categories":1766},[144],{"categories":1768},[113],{"categories":1770},[113],{"categories":1772},[113],{"categories":1774},[144],{"categories":1776},[144],{"categories":1778},[144],{"categories":1780},[],{"categories":1782},[],{"categories":1784},[144],{"categories":1786},[113],{"categories":1788},[],{"categories":1790},[144],{"categories":1792},[118],{"categories":1794},[144],{"categories":1796},[118],{"categories":1798},[118],{"categories":1800},[144],{"categories":1802},[],{"categories":1804},[],{"categories":1806},[118],{"categories":1808},[118],{"categories":1810},[118],{"categories":1812},[118],{"categories":1814},[118],{"categories":1816},[118],{"categories":1818},[118],{"categories":1820},[118],{"categories":1822},[],{"categories":1824},[118],{"categories":1826},[118],{"categories":1828},[118],{"categories":1830},[144],{"categories":1832},[144],{"categories":1834},[144],{"categories":1836},[113],{"categories":1838},[144],{"categories":1840},[144],{"categories":1842},[144],{"categories":1844},[118],{"categories":1846},[121],{"categories":1848},[121],{"categories":1850},[121],{"categories":1852},[118],{"categories":1854},[],{"categories":1856},[144],{"categories":1858},[],{"categories":1860},[],{"categories":1862},[144],{"categories":1864},[],{"categories":1866},[118],{"categories":1868},[124],{"categories":1870},[159],{"categories":1872},[204],{"categories":1874},[144],{"categories":1876},[118],{"categories":1878},[124],{"categories":1880},[],{"categories":1882},[118],{"categories":1884},[121,108],{"categories":1886},[118],{"categories":1888},[118],{"categories":1890},[469],{"categories":1892},[127],{"categories":1894},[121],{"categories":1896},[159],{"categories":1898},[144],{"categories":1900},[],{"categories":1902},[144],{"categories":1904},[],{"categories":1906},[144],{"categories":1908},[144],{"categories":1910},[118],{"categories":1912},[],{"categories":1914},[144],{"categories":1916},[118],{"categories":1918},[144],{"categories":1920},[159],{"categories":1922},[118],{"categories":1924},[144],{"categories":1926},[144,159],{"categories":1928},[159],{"categories":1930},[],{"categories":1932},[144],{"categories":1934},[144],{"categories":1936},[144],{"categories":1938},[],{"categories":1940},[],{"categories":1942},[118],{"categories":1944},[121],{"categories":1946},[113],{"categories":1948},[118],{"categories":1950},[144],{"categories":1952},[113],{"categories":1954},[],{"categories":1956},[159],{"categories":1958},[113],{"categories":1960},[],{"categories":1962},[204],{"categories":1964},[121],{"categories":1966},[108],{"categories":1968},[113],{"categories":1970},[144],{"categories":1972},[118],{"categories":1974},[144],{"categories":1976},[118],{"categories":1978},[118],{"categories":1980},[113],{"categories":1982},[159],{"categories":1984},[124],{"categories":1986},[108],{"categories":1988},[144],{"categories":1990},[144],{"categories":1992},[],{"categories":1994},[],{"categories":1996},[144],{"categories":1998},[],{"categories":2000},[144],{"categories":2002},[113],{"categories":2004},[],{"categories":2006},[118],{"categories":2008},[159],{"categories":2010},[113],{"categories":2012},[159],{"categories":2014},[118],{"categories":2016},[144],{"categories":2018},[],{"categories":2020},[118],{"categories":2022},[118],{"categories":2024},[124],{"categories":2026},[118],{"categories":2028},[124],{"categories":2030},[118],{"categories":2032},[118],{"categories":2034},[124],{"categories":2036},[],{"categories":2038},[],{"categories":2040},[124],{"categories":2042},[124],{"categories":2044},[124],{"categories":2046},[127],{"categories":2048},[159],{"categories":2050},[159],{"categories":2052},[118],{"categories":2054},[113],{"categories":2056},[159],{"categories":2058},[159],{"categories":2060},[121],{"categories":2062},[124],{"categories":2064},[118],{"categories":2066},[118],{"categories":2068},[144],{"categories":2070},[159],{"categories":2072},[144],{"categories":2074},[],{"categories":2076},[469],{"categories":2078},[556],{"categories":2080},[],{"categories":2082},[],{"categories":2084},[118],{"categories":2086},[113],{"categories":2088},[121],{"categories":2090},[121],{"categories":2092},[204],{"categories":2094},[124],{"categories":2096},[204],{"categories":2098},[204],{"categories":2100},[118],{"categories":2102},[],{"categories":2104},[],{"categories":2106},[204],{"categories":2108},[127],{"categories":2110},[144],{"categories":2112},[127],{"categories":2114},[204],{"categories":2116},[127],{"categories":2118},[204],{"categories":2120},[108],{"categories":2122},[127],{"categories":2124},[159],{"categories":2126},[144],{"categories":2128},[],{"categories":2130},[204],{"categories":2132},[469],{"categories":2134},[],{"categories":2136},[144],{"categories":2138},[144],{"categories":2140},[],{"categories":2142},[],{"categories":2144},[144],{"categories":2146},[144],{"categories":2148},[113],{"categories":2150},[144],{"categories":2152},[],{"categories":2154},[113],{"categories":2156},[],{"categories":2158},[],{"categories":2160},[113],{"categories":2162},[113],{"categories":2164},[144],{"categories":2166},[144],{"categories":2168},[144],{"categories":2170},[144],{"categories":2172},[144],{"categories":2174},[144],{"categories":2176},[121],{"categories":2178},[],{"categories":2180},[144],{"categories":2182},[],{"categories":2184},[],{"categories":2186},[118],{"categories":2188},[159],{"categories":2190},[],{"categories":2192},[469],{"categories":2194},[144,469],{"categories":2196},[144],{"categories":2198},[],{"categories":2200},[124],{"categories":2202},[124],{"categories":2204},[124],{"categories":2206},[124],{"categories":2208},[124],{"categories":2210},[],{"categories":2212},[],{"categories":2214},[],{"categories":2216},[127],{"categories":2218},[118],{"categories":2220},[108],{"categories":2222},[127],{"categories":2224},[159],{"categories":2226},[124],{"categories":2228},[],{"categories":2230},[121],{"categories":2232},[556],{"categories":2234},[204],{"categories":2236},[204],{"categories":2238},[204],{"categories":2240},[159],{"categories":2242},[556],{"categories":2244},[159],{"categories":2246},[],{"categories":2248},[108],{"categories":2250},[127],{"categories":2252},[144],{"categories":2254},[124],{"categories":2256},[121],{"categories":2258},[127],{"categories":2260},[121],{"categories":2262},[144],{"categories":2264},[124],{"categories":2266},[127],{"categories":2268},[469],{"categories":2270},[144],{"categories":2272},[113],{"categories":2274},[127],{"categories":2276},[],{"categories":2278},[144],{"categories":2280},[127],{"categories":2282},[127],{"categories":2284},[118],{"categories":2286},[],{"categories":2288},[121],{"categories":2290},[121],{"categories":2292},[121],{"categories":2294},[118],{"categories":2296},[144],{"categories":2298},[],{"categories":2300},[108],{"categories":2302},[159],{"categories":2304},[159],{"categories":2306},[204],{"categories":2308},[108],{"categories":2310},[113],{"categories":2312},[204],{"categories":2314},[],{"categories":2316},[113],{"categories":2318},[113],{"categories":2320},[113],{"categories":2322},[144],{"categories":2324},[108],{"categories":2326},[144],{"categories":2328},[],{"categories":2330},[],{"categories":2332},[],{"categories":2334},[127],{"categories":2336},[118],{"categories":2338},[],{"categories":2340},[159],{"categories":2342},[124],{"categories":2344},[],{"categories":2346},[121],{"categories":2348},[],{"categories":2350},[124],{"categories":2352},[144],{"categories":2354},[159],{"categories":2356},[108],{"categories":2358},[],{"categories":2360},[124],{"categories":2362},[124],{"categories":2364},[144],{"categories":2366},[],{"categories":2368},[],{"categories":2370},[127],{"categories":2372},[144],{"categories":2374},[],{"categories":2376},[118],{"categories":2378},[144],{"categories":2380},[],{"categories":2382},[127],{"categories":2384},[118],{"categories":2386},[144],{"categories":2388},[204],{"categories":2390},[144],{"categories":2392},[],{"categories":2394},[204],{"categories":2396},[144],{"categories":2398},[127],{"categories":2400},[144],{"categories":2402},[204],{"categories":2404},[118],{"categories":2406},[144],{"categories":2408},[144],{"categories":2410},[144,118],{"categories":2412},[118],{"categories":2414},[118],{"categories":2416},[118],{"categories":2418},[124],{"categories":2420},[159],{"categories":2422},[144],{"categories":2424},[159],{"categories":2426},[124],{"categories":2428},[144],{"categories":2430},[],{"categories":2432},[],{"categories":2434},[144],{"categories":2436},[144],{"categories":2438},[144],{"categories":2440},[118],{"categories":2442},[144],{"categories":2444},[],{"categories":2446},[144],{"categories":2448},[144],{"categories":2450},[118],{"categories":2452},[118],{"categories":2454},[144],{"categories":2456},[144],{"categories":2458},[],{"categories":2460},[144],{"categories":2462},[],{"categories":2464},[144],{"categories":2466},[144],{"categories":2468},[144],{"categories":2470},[144],{"categories":2472},[144],{"categories":2474},[144],{"categories":2476},[144],{"categories":2478},[],{"categories":2480},[144],{"categories":2482},[113],{"categories":2484},[113],{"categories":2486},[],{"categories":2488},[],{"categories":2490},[144],{"categories":2492},[],{"categories":2494},[144],{"categories":2496},[144,469],{"categories":2498},[],{"categories":2500},[113],{"categories":2502},[],{"categories":2504},[144],{"categories":2506},[],{"categories":2508},[],{"categories":2510},[],{"categories":2512},[144],{"categories":2514},[],{"categories":2516},[144],{"categories":2518},[],{"categories":2520},[144],{"categories":2522},[144],{"categories":2524},[],{"categories":2526},[],{"categories":2528},[144,469],{"categories":2530},[469,144],{"categories":2532},[113],{"categories":2534},[],{"categories":2536},[144],{"categories":2538},[],{"categories":2540},[144],{"categories":2542},[144],{"categories":2544},[],{"categories":2546},[113],{"categories":2548},[144,108],{"categories":2550},[113],{"categories":2552},[127],{"categories":2554},[],{"categories":2556},[118],{"categories":2558},[144],{"categories":2560},[121],{"categories":2562},[144],{"categories":2564},[159],{"categories":2566},[159],{"categories":2568},[469],{"categories":2570},[113],{"categories":2572},[144],{"categories":2574},[469],{"categories":2576},[127],{"categories":2578},[144],{"categories":2580},[159],{"categories":2582},[],{"categories":2584},[144],{"categories":2586},[],{"categories":2588},[],{"categories":2590},[144],{"categories":2592},[],{"categories":2594},[144],{"categories":2596},[127],{"categories":2598},[108],{"categories":2600},[159],{"categories":2602},[121],{"categories":2604},[118],{"categories":2606},[159],{"categories":2608},[],{"categories":2610},[121],{"categories":2612},[],{"categories":2614},[],{"categories":2616},[144],{"categories":2618},[113],{"categories":2620},[121],{"categories":2622},[],{"categories":2624},[144],{"categories":2626},[113],{"categories":2628},[113],{"categories":2630},[121],{"categories":2632},[113],{"categories":2634},[144],{"categories":2636},[113],{"categories":2638},[144],{"categories":2640},[],{"categories":2642},[144],{"categories":2644},[144],{"categories":2646},[144],{"categories":2648},[113],{"categories":2650},[],{"categories":2652},[],{"categories":2654},[124],{"categories":2656},[113],{"categories":2658},[],{"categories":2660},[144],{"categories":2662},[144],{"categories":2664},[144],{"categories":2666},[144],{"categories":2668},[144],{"categories":2670},[144],{"categories":2672},[144],{"categories":2674},[144],{"categories":2676},[144],{"categories":2678},[121],{"categories":2680},[144,124],{"categories":2682},[113],{"categories":2684},[113],{"categories":2686},[144],{"categories":2688},[127],{"categories":2690},[204],{"categories":2692},[144],{"categories":2694},[144],{"categories":2696},[],{"categories":2698},[],{"categories":2700},[144],{"categories":2702},[144],{"categories":2704},[],{"categories":2706},[124],{"categories":2708},[124],{"categories":2710},[159],{"categories":2712},[144],{"categories":2714},[159],{"categories":2716},[144],{"categories":2718},[144],{"categories":2720},[],{"categories":2722},[144],{"categories":2724},[],{"categories":2726},[],{"categories":2728},[144],{"categories":2730},[],{"categories":2732},[],{"categories":2734},[113],{"categories":2736},[],{"categories":2738},[144],{"categories":2740},[144],{"categories":2742},[144],{"categories":2744},[],{"categories":2746},[144],{"categories":2748},[113],{"categories":2750},[556],{"categories":2752},[118],{"categories":2754},[144],{"categories":2756},[],{"categories":2758},[118],{"categories":2760},[144],{"categories":2762},[],{"categories":2764},[144],{"categories":2766},[],{"categories":2768},[118],{"categories":2770},[],{"categories":2772},[],{"categories":2774},[118],{"categories":2776},[118],{"categories":2778},[118],{"categories":2780},[144],{"categories":2782},[],{"categories":2784},[118],{"categories":2786},[118],{"categories":2788},[],{"categories":2790},[],{"categories":2792},[118],{"categories":2794},[144],{"categories":2796},[113],{"categories":2798},[556],{"categories":2800},[121],{"categories":2802},[],{"categories":2804},[124],{"categories":2806},[144],{"categories":2808},[144],{"categories":2810},[108],{"categories":2812},[113],{"categories":2814},[113],{"categories":2816},[113],{"categories":2818},[113],{"categories":2820},[],{"categories":2822},[118],{"categories":2824},[118],{"categories":2826},[118],{"categories":2828},[118],{"categories":2830},[159],{"categories":2832},[144],{"categories":2834},[108],{"categories":2836},[],{"categories":2838},[159],{"categories":2840},[118],{"categories":2842},[124],{"categories":2844},[124],{"categories":2846},[124],{"categories":2848},[124],{"categories":2850},[124],{"categories":2852},[124],{"categories":2854},[144,108],{"categories":2856},[118],{"categories":2858},[108],{"categories":2860},[113],{"categories":2862},[113],{"categories":2864},[159],{"categories":2866},[],{"categories":2868},[],{"categories":2870},[121],{"categories":2872},[],{"categories":2874},[144],{"categories":2876},[121],{"categories":2878},[144],{"categories":2880},[127],{"categories":2882},[118],{"categories":2884},[108],{"categories":2886},[118],{"categories":2888},[127],{"categories":2890},[159],{"categories":2892},[118],{"categories":2894},[],{"categories":2896},[159],{"categories":2898},[],{"categories":2900},[],{"categories":2902},[118],{"categories":2904},[118],{"categories":2906},[118],{"categories":2908},[144],{"categories":2910},[144],{"categories":2912},[144],{"categories":2914},[144],{"categories":2916},[144],{"categories":2918},[],{"categories":2920},[469],{"categories":2922},[144],{"categories":2924},[],{"categories":2926},[],{"categories":2928},[],{"categories":2930},[159],{"categories":2932},[],{"categories":2934},[144],{"categories":2936},[],{"categories":2938},[113],{"categories":2940},[144],{"categories":2942},[113],{"categories":2944},[144],{"categories":2946},[118],{"categories":2948},[],{"categories":2950},[144],{"categories":2952},[144],{"categories":2954},[],{"categories":2956},[204],{"categories":2958},[204],{"categories":2960},[127],{"categories":2962},[124],{"categories":2964},[],{"categories":2966},[144],{"categories":2968},[118],{"categories":2970},[],{"categories":2972},[],{"categories":2974},[144],{"categories":2976},[127],{"categories":2978},[118],{"categories":2980},[108],{"categories":2982},[159,127],{"categories":2984},[127],{"categories":2986},[144],{"categories":2988},[118],{"categories":2990},[],{"categories":2992},[],{"categories":2994},[],{"categories":2996},[],{"categories":2998},[],{"categories":3000},[],{"categories":3002},[144],{"categories":3004},[],{"categories":3006},[],{"categories":3008},[144],{"categories":3010},[],{"categories":3012},[],{"categories":3014},[],{"categories":3016},[144],{"categories":3018},[113],{"categories":3020},[],{"categories":3022},[],{"categories":3024},[],{"categories":3026},[144],{"categories":3028},[],{"categories":3030},[144],{"categories":3032},[144],{"categories":3034},[],{"categories":3036},[144],{"categories":3038},[127],{"categories":3040},[],{"categories":3042},[159],{"categories":3044},[159],{"categories":3046},[],{"categories":3048},[121],{"categories":3050},[],{"categories":3052},[],{"categories":3054},[],{"categories":3056},[124],{"categories":3058},[113],{"categories":3060},[118],{"categories":3062},[144],{"categories":3064},[108],{"categories":3066},[144],{"categories":3068},[],{"categories":3070},[],{"categories":3072},[108],{"categories":3074},[121],{"categories":3076},[118],{"categories":3078},[],{"categories":3080},[469],{"categories":3082},[],{"categories":3084},[121],{"categories":3086},[144],{"categories":3088},[144],{"categories":3090},[121],{"categories":3092},[144],{"categories":3094},[124],{"categories":3096},[118],{"categories":3098},[144],{"categories":3100},[118],{"categories":3102},[144],{"categories":3104},[118],{"categories":3106},[159],{"categories":3108},[159],{"categories":3110},[124],{"categories":3112},[],{"categories":3114},[144],{"categories":3116},[144],{"categories":3118},[121],{"categories":3120},[556],{"categories":3122},[159],{"categories":3124},[113],{"categories":3126},[144],{"categories":3128},[113],{"categories":3130},[144],{"categories":3132},[144],{"categories":3134},[],{"categories":3136},[144],{"categories":3138},[],{"categories":3140},[144],{"categories":3142},[121],{"categories":3144},[144],{"categories":3146},[144],{"categories":3148},[144],{"categories":3150},[],{"categories":3152},[144],{"categories":3154},[144],{"categories":3156},[556],{"categories":3158},[],{"categories":3160},[113],{"categories":3162},[469],{"categories":3164},[127],{"categories":3166},[],{"categories":3168},[204],{"categories":3170},[],{"categories":3172},[],{"categories":3174},[113],{"categories":3176},[144],{"categories":3178},[],{"categories":3180},[144],{"categories":3182},[144],{"categories":3184},[118],{"categories":3186},[144],{"categories":3188},[113],{"categories":3190},[113],{"categories":3192},[124],{"categories":3194},[124],{"categories":3196},[124],{"categories":3198},[144],{"categories":3200},[204],{"categories":3202},[113],{"categories":3204},[159],{"categories":3206},[],{"categories":3208},[124],{"categories":3210},[124],{"categories":3212},[469],{"categories":3214},[124],{"categories":3216},[124],{"categories":3218},[118],{"categories":3220},[113],{"categories":3222},[469],{"categories":3224},[144],{"categories":3226},[144],{"categories":3228},[144],{"categories":3230},[144],{"categories":3232},[],{"categories":3234},[118],{"categories":3236},[144],{"categories":3238},[124],{"categories":3240},[],{"categories":3242},[],{"categories":3244},[113],{"categories":3246},[],{"categories":3248},[118],{"categories":3250},[118],{"categories":3252},[118],{"categories":3254},[118],{"categories":3256},[118],{"categories":3258},[118],{"categories":3260},[118],{"categories":3262},[118],{"categories":3264},[],{"categories":3266},[],{"categories":3268},[144],{"categories":3270},[],{"categories":3272},[118],{"categories":3274},[159],{"categories":3276},[159],{"categories":3278},[204],{"categories":3280},[108],{"categories":3282},[],{"categories":3284},[],{"categories":3286},[],{"categories":3288},[124],{"categories":3290},[144],{"categories":3292},[],{"categories":3294},[108],{"categories":3296},[108],{"categories":3298},[124],{"categories":3300},[159],{"categories":3302},[204],{"categories":3304},[124],{"categories":3306},[124],{"categories":3308},[],{"categories":3310},[118],{"categories":3312},[108],{"categories":3314},[108],{"categories":3316},[144],{"categories":3318},[118],{"categories":3320},[127],{"categories":3322},[124],{"categories":3324},[],{"categories":3326},[121],{"categories":3328},[204],{"categories":3330},[113],{"categories":3332},[113],{"categories":3334},[113],{"categories":3336},[469],{"categories":3338},[],{"categories":3340},[118],{"categories":3342},[],{"categories":3344},[118],{"categories":3346},[118],{"categories":3348},[144],{"categories":3350},[144],{"categories":3352},[127],{"categories":3354},[118],{"categories":3356},[127],{"categories":3358},[],{"categories":3360},[118],{"categories":3362},[124],{"categories":3364},[124],{"categories":3366},[124],{"categories":3368},[144],{"categories":3370},[118],{"categories":3372},[144],{"categories":3374},[108],{"categories":3376},[113],{"categories":3378},[124],{"categories":3380},[113],{"categories":3382},[144],{"categories":3384},[],{"categories":3386},[113],{"categories":3388},[118],{"categories":3390},[113],{"categories":3392},[113],{"categories":3394},[113],{"categories":3396},[113],{"categories":3398},[],{"categories":3400},[],{"categories":3402},[113],{"categories":3404},[113],{"categories":3406},[],{"categories":3408},[113],{"categories":3410},[113],{"categories":3412},[144],{"categories":3414},[144],{"categories":3416},[113],{"categories":3418},[113],{"categories":3420},[144],{"categories":3422},[],{"categories":3424},[144],{"categories":3426},[118],{"categories":3428},[144],{"categories":3430},[144],{"categories":3432},[],{"categories":3434},[144],{"categories":3436},[144],{"categories":3438},[144],{"categories":3440},[113],{"categories":3442},[],{"categories":3444},[],{"categories":3446},[],{"categories":3448},[],{"categories":3450},[144],{"categories":3452},[144],{"categories":3454},[],{"categories":3456},[121],{"categories":3458},[113],{"categories":3460},[],{"categories":3462},[],{"categories":3464},[],{"categories":3466},[],{"categories":3468},[],{"categories":3470},[144],{"categories":3472},[],{"categories":3474},[],{"categories":3476},[144],{"categories":3478},[],{"categories":3480},[118],{"categories":3482},[118],{"categories":3484},[118],{"categories":3486},[108],{"categories":3488},[],{"categories":3490},[121],{"categories":3492},[127],{"categories":3494},[127],{"categories":3496},[469],{"categories":3498},[113],{"categories":3500},[],{"categories":3502},[144],{"categories":3504},[144],{"categories":3506},[108],{"categories":3508},[],{"categories":3510},[108],{"categories":3512},[],{"categories":3514},[],{"categories":3516},[],{"categories":3518},[127],{"categories":3520},[118],{"categories":3522},[118],{"categories":3524},[118],{"categories":3526},[118],{"categories":3528},[118],{"categories":3530},[],{"categories":3532},[113],{"categories":3534},[144],{"categories":3536},[144],{"categories":3538},[144],{"categories":3540},[],{"categories":3542},[108],{"categories":3544},[],{"categories":3546},[124],{"categories":3548},[204],{"categories":3550},[124],{"categories":3552},[],{"categories":3554},[],{"categories":3556},[144],{"categories":3558},[118],{"categories":3560},[],{"categories":3562},[144],{"categories":3564},[144],{"categories":3566},[144],{"categories":3568},[118],{"categories":3570},[118],{"categories":3572},[144],{"categories":3574},[204],{"categories":3576},[118],{"categories":3578},[],{"categories":3580},[144],{"categories":3582},[],{"categories":3584},[556],{"categories":3586},[127],{"categories":3588},[204],{"categories":3590},[127],{"categories":3592},[469],{"categories":3594},[144],{"categories":3596},[127],{"categories":3598},[113],{"categories":3600},[469],{"categories":3602},[127],{"categories":3604},[124],{"categories":3606},[124],{"categories":3608},[],{"categories":3610},[127],{"categories":3612},[],{"categories":3614},[159],{"categories":3616},[127],{"categories":3618},[],{"categories":3620},[204],{"categories":3622},[204],{"categories":3624},[556],{"categories":3626},[],{"categories":3628},[144],{"categories":3630},[127],{"categories":3632},[469],{"categories":3634},[118],{"categories":3636},[118],{"categories":3638},[204],{"categories":3640},[144],{"categories":3642},[159],{"categories":3644},[144],{"categories":3646},[],{"categories":3648},[],{"categories":3650},[],{"categories":3652},[121],{"categories":3654},[144],{"categories":3656},[124],{"categories":3658},[127],{"categories":3660},[127],{"categories":3662},[144],{"categories":3664},[121],{"categories":3666},[159],{"categories":3668},[144],{"categories":3670},[127],{"categories":3672},[144],{"categories":3674},[127],{"categories":3676},[159],{"categories":3678},[159],{"categories":3680},[118],{"categories":3682},[159],{"categories":3684},[127],{"categories":3686},[108],{"categories":3688},[127],{"categories":3690},[127],{"categories":3692},[127],{"categories":3694},[127],{"categories":3696},[],{"categories":3698},[113],{"categories":3700},[],{"categories":3702},[204],{"categories":3704},[144],{"categories":3706},[144],{"categories":3708},[],{"categories":3710},[],{"categories":3712},[],{"categories":3714},[144],{"categories":3716},[113],{"categories":3718},[144],{"categories":3720},[144],{"categories":3722},[],{"categories":3724},[144],{"categories":3726},[124],{"categories":3728},[144],{"categories":3730},[144],{"categories":3732},[144],{"categories":3734},[],{"categories":3736},[],{"categories":3738},[],{"categories":3740},[469],{"categories":3742},[469],{"categories":3744},[108],{"categories":3746},[118],{"categories":3748},[108,121],{"categories":3750},[144],{"categories":3752},[113],{"categories":3754},[],{"categories":3756},[124],{"categories":3758},[204],{"categories":3760},[144],{"categories":3762},[127],{"categories":3764},[144],{"categories":3766},[],{"categories":3768},[204],{"categories":3770},[469],{"categories":3772},[118],{"categories":3774},[108],{"categories":3776},[469],{"categories":3778},[118],{"categories":3780},[159],{"categories":3782},[118],{"categories":3784},[159],{"categories":3786},[144],{"categories":3788},[159],{"categories":3790},[159],{"categories":3792},[127],{"categories":3794},[204],{"categories":3796},[144],{"categories":3798},[121],{"categories":3800},[],{"categories":3802},[144],{"categories":3804},[124],{"categories":3806},[204],{"categories":3808},[108],{"categories":3810},[144],{"categories":3812},[204],{"categories":3814},[159],{"categories":3816},[144],{"categories":3818},[144],{"categories":3820},[204],{"categories":3822},[144],{"categories":3824},[159],{"categories":3826},[144],{"categories":3828},[],{"categories":3830},[144],{"categories":3832},[144],{"categories":3834},[144],{"categories":3836},[144],{"categories":3838},[],{"categories":3840},[118],{"categories":3842},[469],{"categories":3844},[],{"categories":3846},[],{"categories":3848},[144],{"categories":3850},[108],{"categories":3852},[121],{"categories":3854},[108],{"categories":3856},[108],{"categories":3858},[118],{"categories":3860},[],{"categories":3862},[144],{"categories":3864},[113],{"categories":3866},[144],{"categories":3868},[144],{"categories":3870},[],{"categories":3872},[118],{"categories":3874},[113],{"categories":3876},[144,469],{"categories":3878},[118,469],{"categories":3880},[469],{"categories":3882},[144],{"categories":3884},[118],{"categories":3886},[118],{"categories":3888},[127],{"categories":3890},[127],{"categories":3892},[127],{"categories":3894},[144],{"categories":3896},[124],{"categories":3898},[118],{"categories":3900},[],{"categories":3902},[469],{"categories":3904},[],{"categories":3906},[469],{"categories":3908},[469],{"categories":3910},[108],{"categories":3912},[118],{"categories":3914},[],{"categories":3916},[469],{"categories":3918},[144],{"categories":3920},[113],{"categories":3922},[144],{"categories":3924},[124],{"categories":3926},[127],{"categories":3928},[127],{"categories":3930},[127],{"categories":3932},[469],{"categories":3934},[],{"categories":3936},[],{"categories":3938},[],{"categories":3940},[144],{"categories":3942},[127],{"categories":3944},[144],{"categories":3946},[127],{"categories":3948},[469],{"categories":3950},[469],{"categories":3952},[144],{"categories":3954},[118],{"categories":3956},[],{"categories":3958},[144],{"categories":3960},[144],{"categories":3962},[144],{"categories":3964},[],{"categories":3966},[],{"categories":3968},[469],{"categories":3970},[469],{"categories":3972},[144,469],{"categories":3974},[118],{"categories":3976},[118],{"categories":3978},[118],{"categories":3980},[118],{"categories":3982},[118],{"categories":3984},[118],{"categories":3986},[],{"categories":3988},[127],{"categories":3990},[144],{"categories":3992},[127],{"categories":3994},[121],{"categories":3996},[144],{"categories":3998},[556],{"categories":4000},[556],{"categories":4002},[118],{"categories":4004},[127],{"categories":4006},[],{"categories":4008},[118],{"categories":4010},[144],{"categories":4012},[],{"categories":4014},[124],{"categories":4016},[],{"categories":4018},[144],{"categories":4020},[118],{"categories":4022},[113],{"categories":4024},[144],{"categories":4026},[],{"categories":4028},[],{"categories":4030},[124],{"categories":4032},[124],{"categories":4034},[159],{"categories":4036},[124],{"categories":4038},[118],{"categories":4040},[],{"categories":4042},[118],{"categories":4044},[113],{"categories":4046},[144],{"categories":4048},[144],{"categories":4050},[],{"categories":4052},[144],{"categories":4054},[159],{"categories":4056},[144],{"categories":4058},[],{"categories":4060},[204],{"categories":4062},[127],{"categories":4064},[127],{"categories":4066},[108],{"categories":4068},[108],{"categories":4070},[108],{"categories":4072},[118],{"categories":4074},[108],{"categories":4076},[118],{"categories":4078},[469],{"categories":4080},[556],{"categories":4082},[113],{"categories":4084},[113],{"categories":4086},[113],{"categories":4088},[469],{"categories":4090},[113,108],{"categories":4092},[204],{"categories":4094},[118],{"categories":4096},[],{"categories":4098},[144],{"categories":4100},[],{"categories":4102},[127],{"categories":4104},[204],{"categories":4106},[124],{"categories":4108},[127],{"categories":4110},[159],{"categories":4112},[],{"categories":4114},[118],{"categories":4116},[],{"categories":4118},[556],{"categories":4120},[],{"categories":4122},[124],{"categories":4124},[124],{"categories":4126},[204],{"categories":4128},[],{"categories":4130},[144],{"categories":4132},[204],{"categories":4134},[],{"categories":4136},[144],{"categories":4138},[144],{"categories":4140},[],{"categories":4142},[159],{"categories":4144},[144],{"categories":4146},[],{"categories":4148},[144],{"categories":4150},[],{"categories":4152},[],{"categories":4154},[118],{"categories":4156},[118],{"categories":4158},[],{"categories":4160},[127],{"categories":4162},[127],{"categories":4164},[127],{"categories":4166},[144,118],{"categories":4168},[118],{"categories":4170},[118],{"categories":4172},[118],{"categories":4174},[204],{"categories":4176},[204],{"categories":4178},[],{"categories":4180},[113],{"categories":4182},[144],{"categories":4184},[204],{"categories":4186},[204],{"categories":4188},[113],{"categories":4190},[108],{"categories":4192},[118],{"categories":4194},[127],{"categories":4196},[144],{"categories":4198},[144],{"categories":4200},[118],{"categories":4202},[127],{"categories":4204},[118],{"categories":4206},[144],{"categories":4208},[121],{"categories":4210},[],{"categories":4212},[144],{"categories":4214},[],{"categories":4216},[144],{"categories":4218},[144],{"categories":4220},[127],{"categories":4222},[],{"categories":4224},[204],{"categories":4226},[144],{"categories":4228},[118],{"categories":4230},[118],{"categories":4232},[127],{"categories":4234},[159],{"categories":4236},[159],{"categories":4238},[113],{"categories":4240},[144],{"categories":4242},[118],{"categories":4244},[],{"categories":4246},[118],{"categories":4248},[144],{"categories":4250},[113],{"categories":4252},[144],{"categories":4254},[144],{"categories":4256},[144],{"categories":4258},[118],{"categories":4260},[204],{"categories":4262},[144],{"categories":4264},[124],{"categories":4266},[144],{"categories":4268},[144],{"categories":4270},[144],{"categories":4272},[144],{"categories":4274},[],{"categories":4276},[144],{"categories":4278},[204],{"categories":4280},[124],{"categories":4282},[144],{"categories":4284},[124],{"categories":4286},[],{"categories":4288},[],{"categories":4290},[],{"categories":4292},[144],{"categories":4294},[],{"categories":4296},[],{"categories":4298},[],{"categories":4300},[],{"categories":4302},[118],{"categories":4304},[159],{"categories":4306},[118],{"categories":4308},[118],{"categories":4310},[127],{"categories":4312},[108],{"categories":4314},[144],{"categories":4316},[144],{"categories":4318},[144],{"categories":4320},[108],{"categories":4322},[159],{"categories":4324},[],{"categories":4326},[204],{"categories":4328},[121],{"categories":4330},[144],{"categories":4332},[124],{"categories":4334},[159],{"categories":4336},[159],{"categories":4338},[556],{"categories":4340},[118],{"categories":4342},[144],{"categories":4344},[144],{"categories":4346},[159],{"categories":4348},[144],{"categories":4350},[],{"categories":4352},[],{"categories":4354},[469],{"categories":4356},[124],{"categories":4358},[159],{"categories":4360},[144],{"categories":4362},[113],{"categories":4364},[159],{"categories":4366},[108],{"categories":4368},[118],{"categories":4370},[118],{"categories":4372},[113],{"categories":4374},[144],{"categories":4376},[],{"categories":4378},[],{"categories":4380},[],{"categories":4382},[144],{"categories":4384},[],{"categories":4386},[113],{"categories":4388},[],{"categories":4390},[144],{"categories":4392},[],{"categories":4394},[113],{"categories":4396},[118],{"categories":4398},[144],{"categories":4400},[469],{"categories":4402},[144],{"categories":4404},[159],{"categories":4406},[144],{"categories":4408},[159],{"categories":4410},[159],{"categories":4412},[],{"categories":4414},[],{"categories":4416},[159],{"categories":4418},[159],{"categories":4420},[159],{"categories":4422},[],{"categories":4424},[159],{"categories":4426},[118],{"categories":4428},[118],{"categories":4430},[],{"categories":4432},[144],{"categories":4434},[121],{"categories":4436},[204],{"categories":4438},[144],{"categories":4440},[],{"categories":4442},[159],{"categories":4444},[144],{"categories":4446},[556],{"categories":4448},[159],{"categories":4450},[159],{"categories":4452},[121],{"categories":4454},[127],{"categories":4456},[127],{"categories":4458},[],{"categories":4460},[127],{"categories":4462},[144],{"categories":4464},[],{"categories":4466},[],{"categories":4468},[118],{"categories":4470},[],{"categories":4472},[118],{"categories":4474},[118],{"categories":4476},[113],{"categories":4478},[144],{"categories":4480},[113],{"categories":4482},[159],{"categories":4484},[113],{"categories":4486},[127],{"categories":4488},[127],{"categories":4490},[127],{"categories":4492},[113],{"categories":4494},[144],{"categories":4496},[118],{"categories":4498},[469],{"categories":4500},[108],{"categories":4502},[469],{"categories":4504},[469],{"categories":4506},[127],{"categories":4508},[469],{"categories":4510},[469],[4512,4577,4851,5112],{"id":4513,"title":4514,"ai":4515,"body":4519,"categories":4557,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4558,"navigation":86,"path":4568,"published_at":88,"question":59,"scraped_at":4569,"seo":4570,"sitemap":4571,"source_id":92,"source_name":93,"source_type":94,"source_url":95,"stem":4572,"tags":4573,"thumbnail_url":59,"tldr":4574,"tweet":59,"unknown_tags":4575,"__hash__":4576},"summaries\u002Fsummaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-summary.md","OpenMythos: 770M RDT Matches 1.3B Transformer",{"provider":7,"model":8,"input_tokens":9,"output_tokens":4516,"processing_time_ms":4517,"cost_usd":4518},1854,16835,0.0020005,{"type":14,"value":4520,"toc":4552},[4521,4525,4528,4531,4534,4536,4539,4542,4546,4549],[17,4522,4524],{"id":4523},"recurrent-depth-transformers-scale-reasoning-with-loops-not-layers","Recurrent-Depth Transformers Scale Reasoning with Loops, Not Layers",[22,4526,4527],{},"Standard transformers like GPT or Llama stack unique layers with independent weights, where capability ties directly to parameter count. Recurrent-Depth Transformers (RDTs), or Looped Transformers, reuse a fixed set of weights iteratively across T=16 loop steps in a single forward pass. This decouples reasoning depth from stored parameters: run more loops at inference for harder problems, exit early for simple ones.",[22,4529,4530],{},"The structure follows Prelude → Recurrent Block → Coda. Prelude and Coda are one-time standard transformer layers. The Recurrent Block updates hidden state ht+1 = A·ht + B·e + Transformer(ht, e), reinjecting encoded input e each step to prevent drift. Reasoning stays in continuous latent space—no mid-loop token emissions—equivalent to chain-of-thought over vectors, per Saunshi et al. (2025). This supports multi-step reasoning natively: a model trained on 5-hop chains handles 10-hop at inference by doubling loops, unlike fixed-depth transformers.",[22,4532,4533],{},"FFN uses Mixture-of-Experts (MoE) from DeepSeekMoE: sparse top-K experts per token plus shared experts, with router selecting distinct subsets per loop for varied computation. Attention employs Multi-Latent Attention from DeepSeek-V2, compressing KV to latents for 10–20× memory savings.",[17,4535,34],{"id":33},[22,4537,4538],{},"Looped models risk residual explosion (unbounded ht growth) or overthinking (drift past solutions). OpenMythos enforces Linear Time-Invariant (LTI) constraints from Parcae: spectral radius ρ(A) \u003C 1 by construction, ensuring stability independent of learning rate.",[22,4540,4541],{},"Adaptive Computation Time (ACT) halting uses a learned scalar per position to stop loops dynamically—harder tokens get more compute. Depth-Wise LoRA adapters add low-rank matrices per iteration, differentiating behavior without full untying, keeping params lean.",[17,4543,4545],{"id":4544},"half-the-params-for-equivalent-performance-reshapes-scaling","Half the Params for Equivalent Performance Reshapes Scaling",[22,4547,4548],{},"Parcae (Prairie et al., 2026) shows 770M RDT matches 1.3B dense transformer on identical data—~50% param efficiency. Optimal recurrence and token count follow power laws, yielding predictable scaling for looped training. Inference compute via loop depth becomes the key axis, not training params, challenging bigger-is-better assumptions.",[22,4550,4551],{},"OpenMythos delivers PyTorch code for RDT with MoE, LTI injection, depth-LoRA, and baselines—falsifiable hypothesis for testing Claude Mythos and advancing looped architectures beyond parameter races.",{"title":52,"searchDepth":53,"depth":53,"links":4553},[4554,4555,4556],{"id":4523,"depth":53,"text":4524},{"id":33,"depth":53,"text":34},{"id":4544,"depth":53,"text":4545},[],{"content_references":4559,"triage":4566},[4560,4563,4564],{"type":65,"title":66,"author":4561,"url":67,"context":4562},"Kye Gomez","recommended",{"type":70,"title":71,"url":72,"context":73},{"type":70,"title":4565,"url":77,"context":73},"Parcae (Prairie et al., 2026)",{"relevance":82,"novelty":82,"quality":83,"actionability":53,"composite":84,"reasoning":4567},"Category: AI & LLMs. The article discusses a new architecture for transformers, which is relevant to AI engineering, but lacks practical applications or frameworks that the audience can implement directly. While it presents some novel insights into the architecture of Recurrent-Depth Transformers, it does not provide actionable steps for product builders.","\u002Fsummaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-summary","2026-04-20 16:57:34",{"title":4514,"description":52},{"loc":4568},"summaries\u002Fopenmythos-770m-rdt-matches-1-3b-transformer-summary",[98,99,100,101],"OpenMythos reconstructs Claude Mythos as a Recurrent-Depth Transformer (RDT) in PyTorch, using looped weights for reasoning depth that delivers 1.3B transformer performance at 770M params—half the size via inference-time iteration.",[],"0CuMpRpinH512AlQeFcAkWyaVRk8bDmWG3vhPtcmdT4",{"id":4578,"title":4579,"ai":4580,"body":4585,"categories":4833,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4834,"navigation":86,"path":4839,"published_at":4840,"question":59,"scraped_at":4840,"seo":4841,"sitemap":4842,"source_id":4843,"source_name":4844,"source_type":94,"source_url":4845,"stem":4846,"tags":4847,"thumbnail_url":59,"tldr":4848,"tweet":59,"unknown_tags":4849,"__hash__":4850},"summaries\u002Fsummaries\u002Ftransformers-core-library-for-multimodal-ml-models-summary.md","Transformers: Core Library for Multimodal ML Models",{"provider":7,"model":8,"input_tokens":4581,"output_tokens":4582,"processing_time_ms":4583,"cost_usd":4584},9481,2241,18312,0.00272555,{"type":14,"value":4586,"toc":4826},[4587,4591,4611,4614,4656,4659,4663,4666,4707,4710,4713,4724,4727,4731,4734,4737,4744,4747,4754,4758,4765,4768,4771,4775,4822],[17,4588,4590],{"id":4589},"standardized-access-to-cutting-edge-models","Standardized Access to Cutting-Edge Models",[22,4592,4593,4594,4598,4599,4602,4603,4606,4607,4610],{},"Transformers centralizes implementations of state-of-the-art architectures across modalities: text (e.g., BERT, GPT), vision (e.g., ViT), audio (e.g., Whisper), and multimodal (e.g., CLIP, BLIP). Load any model from the Hugging Face Hub with ",[4595,4596,4597],"code",{},"from_pretrained(model_id)","—handles tokenizers, configs, and weights automatically. Supports PyTorch, TensorFlow, JAX, and Flax for flexible inference or training pipelines. Trade-off: Massive scope means occasional bloat; stick to ",[4595,4600,4601],{},"pip install transformers"," core for most needs, add extras like ",[4595,4604,4605],{},"torch",", ",[4595,4608,4609],{},"tensorflow"," only when required.",[22,4612,4613],{},"Example quickstart (inferred from src structure and examples folder):",[4615,4616,4619],"pre",{"className":4617,"code":4618,"language":101,"meta":52,"style":52},"language-python shiki shiki-themes github-light github-dark","from transformers import AutoTokenizer, AutoModelForCausalLM\n\ntokenizer = AutoTokenizer.from_pretrained('gpt2')\nmodel = AutoModelForCausalLM.from_pretrained('gpt2')\ninputs = tokenizer('Hello world', return_tensors='pt')\noutputs = model(**inputs)\n",[4595,4620,4621,4629,4634,4639,4644,4650],{"__ignoreMap":52},[4622,4623,4626],"span",{"class":4624,"line":4625},"line",1,[4622,4627,4628],{},"from transformers import AutoTokenizer, AutoModelForCausalLM\n",[4622,4630,4631],{"class":4624,"line":53},[4622,4632,4633],{"emptyLinePlaceholder":86},"\n",[4622,4635,4636],{"class":4624,"line":82},[4622,4637,4638],{},"tokenizer = AutoTokenizer.from_pretrained('gpt2')\n",[4622,4640,4641],{"class":4624,"line":83},[4622,4642,4643],{},"model = AutoModelForCausalLM.from_pretrained('gpt2')\n",[4622,4645,4647],{"class":4624,"line":4646},5,[4622,4648,4649],{},"inputs = tokenizer('Hello world', return_tensors='pt')\n",[4622,4651,4653],{"class":4624,"line":4652},6,[4622,4654,4655],{},"outputs = model(**inputs)\n",[22,4657,4658],{},"This pattern scales to 100k+ models on the Hub, enabling rapid prototyping of RAG, agents, or generation apps.",[17,4660,4662],{"id":4661},"developer-ecosystem-for-production-pipelines","Developer Ecosystem for Production Pipelines",[22,4664,4665],{},"Repo structure prioritizes real-world use:",[4667,4668,4669,4677,4683,4689,4695,4701],"ul",{},[4670,4671,4672,4676],"li",{},[4673,4674,4675],"strong",{},"src\u002Ftransformers",": Model definitions, pipelines, tokenizers—core engine.",[4670,4678,4679,4682],{},[4673,4680,4681],{},"docs",": Comprehensive guides (recently updated with Qianfan-OCR VLM).",[4670,4684,4685,4688],{},[4673,4686,4687],{},"examples",": End-to-end scripts for training, serving (e.g., refactored serving modules with batching, streaming, tool calls, VLM support).",[4670,4690,4691,4694],{},[4673,4692,4693],{},"notebooks",": Jupyter demos, including AMD dev cloud notebooks for hardware testing.",[4670,4696,4697,4700],{},[4673,4698,4699],{},"benchmark\u002Fbenchmark_v2",": Performance measurement tools, with recent cache optimizations and continuous batching (CB) tweaks for throughput.",[4670,4702,4703,4706],{},[4673,4704,4705],{},"docker",": Containers for QA, type checking, reproducible envs.",[22,4708,4709],{},"These let you benchmark latency (e.g., CB memory fixes for int64 tensors), deploy via examples\u002Fserving (now modular with model_manager, response\u002Fchat endpoints), and automate with scripts (e.g., bandit S110 for secure except blocks).",[22,4711,4712],{},"Recent commits show maturity:",[4667,4714,4715,4718,4721],{},[4670,4716,4717],{},"Typing rules (e.g., rule 15 for tie_word_embeddings) ensure config robustness.",[4670,4719,4720],{},"ZeRO-3 fixes for from_pretrained load buffers correctly in sharded setups.",[4670,4722,4723],{},"Serving refactor: Added queue draining, locks for concurrency, transcription guards—directly actionable for API servers.",[22,4725,4726],{},"\"🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.\"",[17,4728,4730],{"id":4729},"active-maintenance-signals-reliability","Active Maintenance Signals Reliability",[22,4732,4733],{},"160k stars, 32.9k forks, 1.1k issues, 1.3k PRs—vibrant community. Main branch at commit a29df2d (Apr 17, 2026) with 22k+ commits. Folders like .ai (typing rules), .github (workflows), .circleci (CI) indicate CI\u002FCD rigor. Recent PRs (#45495 revert for AMD CI, #45280 Qianfan-OCR integration with modular VLM tests) add niche models while fixing dtype mismatches, DDP errors.",[22,4735,4736],{},"Benchmark updates rework deps, remove outdated templates for cleaner DX. Examples PR #44796 refactors serving: Supports compile graphs, tool calls, VLMs—\"better stream\", \"batch output\" for prod-scale inference.",[22,4738,4739,4740,4743],{},"Trade-offs: Frequent commits (daily) mean test your branch; use tags (265 available) for stability. For indie builders, pin versions like ",[4595,4741,4742],{},"transformers==4.40.0"," to avoid breaks.",[22,4745,4746],{},"\"Fix ZeRO-3 from_pretrained: load registered buffers in _load_state_dict_into_zero3_model\"—fixes real sharding pain in distributed training.",[22,4748,4749,4750,4753],{},"\"",[4622,4751,4752],{},"refactor"," Serving into proper modules (#44796)\"—streamlines deploying chat\u002Fcompletion endpoints with metrics, warmup.",[17,4755,4757],{"id":4756},"scaling-from-prototype-to-production","Scaling from Prototype to Production",[22,4759,4760,4761,4764],{},"Use pipelines for no-code inference: ",[4595,4762,4763],{},"pipeline('sentiment-analysis')",". For agents, combine with function calling in causal LMs. Fine-tune via Trainer API in examples. Benchmarks reveal throughput gains (e.g., CB tweaks reduce memory via int64). Docker for edge deployment; notebooks for experimentation.",[22,4766,4767],{},"Opinion: Skip rolling your own tokenizer\u002Fmodel loader—Transformers handles edge cases (e.g., tie embeddings, modular VLMs) you won't. Pair with Accelerate for multi-GPU, Optimum for ONNX\u002FTensorRT export.",[22,4769,4770],{},"\"Rework dependencies and extras + Remove outdated templates folder (#43536)\"—keeps installs lean.",[17,4772,4774],{"id":4773},"key-takeaways","Key Takeaways",[4667,4776,4777,4790,4797,4804,4807,4810,4813,4816,4819],{},[4670,4778,4779,4780,4782,4783,4785,4786,4789],{},"Install minimally: ",[4595,4781,4601],{},"—add ",[4595,4784,4605],{}," or ",[4595,4787,4788],{},"tf"," as needed; avoids 1GB+ bloat.",[4670,4791,4792,4793,4796],{},"Load models instantly: ",[4595,4794,4795],{},"AutoModel.from_pretrained('microsoft\u002FDialoGPT-medium')"," for chatbots.",[4670,4798,4799,4800,4803],{},"Benchmark first: Run ",[4595,4801,4802],{},"benchmark_v2"," scripts to measure your hardware's tokens\u002Fsec before scaling.",[4670,4805,4806],{},"Deploy via examples\u002Fserving: Supports streaming, batching, tool calls—test with VLM endpoints.",[4670,4808,4809],{},"Check docs for new models like Qianfan-OCR; use modular inheritance for custom VLMs.",[4670,4811,4812],{},"Fix common pitfalls: Verify buffers load in ZeRO-3; use typing rules for config safety.",[4670,4814,4815],{},"Prototype in notebooks (AMD\u002FGPU ready); productionize with Docker\u002FCI from .github.",[4670,4817,4818],{},"Pin versions for stability; follow main for bleeding-edge (e.g., CB optimizations).",[4670,4820,4821],{},"Contribute via PRs: Focus on benchmarks or examples for max impact.",[4823,4824,4825],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":52,"searchDepth":53,"depth":53,"links":4827},[4828,4829,4830,4831,4832],{"id":4589,"depth":53,"text":4590},{"id":4661,"depth":53,"text":4662},{"id":4729,"depth":53,"text":4730},{"id":4756,"depth":53,"text":4757},{"id":4773,"depth":53,"text":4774},[],{"content_references":4835,"triage":4836},[],{"relevance":4646,"novelty":83,"quality":83,"actionability":83,"composite":4837,"reasoning":4838},4.35,"Category: AI & LLMs. The article provides a comprehensive overview of the Hugging Face Transformers library, detailing its capabilities for building and deploying multimodal ML models, which directly addresses the needs of developers looking to integrate AI into their products. It includes practical examples and a structured approach to using the library, making it actionable for the target audience.","\u002Fsummaries\u002Ftransformers-core-library-for-multimodal-ml-models-summary","2026-04-19 14:53:09",{"title":4579,"description":52},{"loc":4839},"53d940334a2a5afd","__oneoff__","https:\u002F\u002Fgithub.com\u002Fhuggingface\u002Ftransformers","summaries\u002Ftransformers-core-library-for-multimodal-ml-models-summary",[98,99,100,101],"Hugging Face Transformers delivers PyTorch\u002FTensorFlow\u002FJAX code for SOTA text, vision, audio, multimodal models—use it to run inference or fine-tune without reinventing wheels.",[],"5eY0eLOpWWwl9vmY5PNYHyejEo__qg4nVFqeIUQFitI",{"id":4852,"title":4853,"ai":4854,"body":4859,"categories":5078,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":5079,"navigation":86,"path":5101,"published_at":59,"question":59,"scraped_at":5102,"seo":5103,"sitemap":5104,"source_id":5105,"source_name":4844,"source_type":94,"source_url":5106,"stem":5107,"tags":5108,"thumbnail_url":59,"tldr":5109,"tweet":59,"unknown_tags":5110,"__hash__":5111},"summaries\u002Fsummaries\u002Fturboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary.md","TurboQuant+: 6.4x KV Cache Compression at q8_0 Speed",{"provider":7,"model":8,"input_tokens":4855,"output_tokens":4856,"processing_time_ms":4857,"cost_usd":4858},11014,3209,20267,0.0037848,{"type":14,"value":4860,"toc":5071},[4861,4865,4868,4871,4877,4880,4884,4887,4890,4895,4898,4901,4905,4908,5019,5024,5028,5031,5036,5039,5041],[17,4862,4864],{"id":4863},"turboquant-formats-deliver-extreme-compression-with-minimal-quality-loss","TurboQuant Formats Deliver Extreme Compression with Minimal Quality Loss",[22,4866,4867],{},"TurboQuant+ ports Google's TurboQuant (ICLR 2026) to llama.cpp, compressing KV cache via PolarQuant (multi-centroid scalar quantization) + Walsh-Hadamard Transform (WHT) rotation, dropping the paper's 1-bit QJL error correction which amplified softmax variance. Formats: turbo2 (2.5 bits\u002Fval, 6.4x vs fp16), turbo3 (3.5 bits\u002Fval at block=32, 4.6x; 3.125 bits\u002Fval at block=128, 5.12x), turbo4 (4.25 bits\u002Fval, 3.8x). On M5 Max (Qwen3.5-27B\u002F35B-A3B), turbo4 PPL 6.125 (+0.23% vs q8_0 baseline 6.111 on wikitext-2 512 chunks); turbo3 6.176 (+1.06%). turbo4 outperforms q4_0 (6.142, +0.52%) in quality at similar compression.",[22,4869,4870],{},"Block size optimization (study: docs\u002Fpapers\u002Fblock-size-experiment.md) boosts turbo3 to 5.12x at block=128 with identical PPL across 512-32K contexts, 3 architectures (Qwen2.5-1.5B, Llama3.1-8B, Qwen3.5-27B), validated on M2 Pro\u002FM5 Max Metal. Larger blocks reduce overhead but risk cache thrashing on older hardware—default block=32 balances.",[4872,4873,4874],"blockquote",{},[22,4875,4876],{},"\"Compresses transformer KV cache 3.8-6.4x using PolarQuant + Walsh-Hadamard rotation. Near q8_0 prefill speed and ~0.9x decode throughput at long context (Apple Silicon).\"",[22,4878,4879],{},"Asymmetric K\u002FV caching preserves quality on Q4_K_M weights: keep K at q8_0 (attention routing), compress V (turbo3\u002F4). E.g., Qwen2.5-7B Q4_K_M: q8_0-K + turbo4-V PPL 6.64 (+1.0% vs q8_0); symmetric turbo3 catastrophic (3556 PPL). Bigger models tolerate symmetric better (104B Command-R+: turbo3 +3.6%). Config guide: docs\u002Fturboquant-recommendations.md.",[17,4881,4883],{"id":4882},"layer-aware-and-sparse-optimizations-maximize-speed-and-quality","Layer-Aware and Sparse Optimizations Maximize Speed and Quality",[22,4885,4886],{},"Boundary V (layer-aware): Protects first\u002Flast 2 layers at q8_0-V, turbo2-V elsewhere. Recovers 37-91% of quality gap to turbo3 (e.g., Qwen3.5-35B MoE: turbo2 5.257 → Boundary 5.148 vs turbo3 5.137). Scales with depth (91% on 64L MoE). Enabled via TURBO_LAYER_ADAPTIVE=7; no speed hit.",[22,4888,4889],{},"Sparse V dequant: Skips V dequant for softmax weights \u003C1e-6 (most at long context). +22.8% decode at 32K (turbo3: 0.76x → 0.93x q8_0), no PPL change (wikitext-103 50 chunks, CI±0.021). General opt: +5% on q8_0 KV. Validated 1.5B-104B; dense models gain less (1-2% as FFN dominates).",[4872,4891,4892],{},[22,4893,4894],{},"\"Sparse V: Attention-gated KV cache decoding that skips low-weight V positions during inference. Up to +22.8% decode speed at 32K context... no measurable PPL change.\"",[22,4896,4897],{},"Prefill scales 2K-32K: turbo3\u002F4 ≥ q8_0 (e.g., 32K: turbo3 1204 vs 1098 t\u002Fs). Decode (M5 Max Qwen3.5-35B-A3B Sparse V): turbo4 1060 t\u002Fs long ctx (0.90x q8_0); real 24K PDF: turbo4 63.7 t\u002Fs (0.93x). M1 Max 38K doc: turbo4 +33.9% decode vs q8_0.",[22,4899,4900],{},"Optimization path (4K prefill): fp32 WHT (739 t\u002Fs, 0.27x q8_0) → fp16 + vectorized butterfly + graph rotation + block-32 + dequant → 2524 t\u002Fs (0.98x). KL div vs f16: turbo4 0.009633 (lower than q4_0 0.008091? Wait, table shows turbo4 better top-p agreement 95.98%).",[17,4902,4904],{"id":4903},"cross-hardware-benchmarks-confirm-production-readiness","Cross-Hardware Benchmarks Confirm Production Readiness",[22,4906,4907],{},"Apple Silicon (M5 Max 128GB): 104B@128K turbo3 (PPL 4.024? Wait, table 6.415 +3.6%; 74GB peak). Raise iogpu.wired_limit_mb=117964. M1 Max: turbo4 beats q8_0 long ctx. CUDA (RTX3090 Qwen3.5-9B Q4_K_M): turbo3\u002F4 decode 95-98 t\u002Fs (0.93-0.96x q8_0). AMD RX9070 XT (RDNA4 HIP): q8_0-K + turbo4-V +1.0% PPL, +2.5% decode.",[4909,4910,4911,4936],"table",{},[4912,4913,4914],"thead",{},[4915,4916,4917,4921,4924,4927,4930,4933],"tr",{},[4918,4919,4920],"th",{},"Hardware",[4918,4922,4923],{},"Model",[4918,4925,4926],{},"Config",[4918,4928,4929],{},"Decode t\u002Fs",[4918,4931,4932],{},"vs q8_0",[4918,4934,4935],{},"Notes",[4937,4938,4939,4960,4980,4999],"tbody",{},[4915,4940,4941,4945,4948,4951,4954,4957],{},[4942,4943,4944],"td",{},"M5 Max",[4942,4946,4947],{},"Qwen3.5-35B-A3B",[4942,4949,4950],{},"turbo4 + Sparse V",[4942,4952,4953],{},"1060 (32K)",[4942,4955,4956],{},"0.90x",[4942,4958,4959],{},"MoE",[4915,4961,4962,4965,4968,4971,4974,4977],{},[4942,4963,4964],{},"RTX3090",[4942,4966,4967],{},"Qwen3.5-9B Q4_K_M",[4942,4969,4970],{},"turbo4\u002Fturbo4",[4942,4972,4973],{},"95.87",[4942,4975,4976],{},"0.93x",[4942,4978,4979],{},"CUDA",[4915,4981,4982,4985,4987,4990,4993,4996],{},[4942,4983,4984],{},"M1 Max 64GB",[4942,4986,4947],{},[4942,4988,4989],{},"turbo4",[4942,4991,4992],{},"16.6 (38K)",[4942,4994,4995],{},"+33.9%",[4942,4997,4998],{},"Real doc",[4915,5000,5001,5004,5007,5010,5013,5016],{},[4942,5002,5003],{},"RX9070 XT",[4942,5005,5006],{},"Qwen2.5-7B Q4_K_M",[4942,5008,5009],{},"q8_0-K\u002Fturbo4-V",[4942,5011,5012],{},"86.8",[4942,5014,5015],{},"+2.5%",[4942,5017,5018],{},"HIP",[4872,5020,5021],{},[22,5022,5023],{},"\"104B at 128K context on a MacBook with turbo3 (PPL 4.024, 74 GB peak memory).\"",[17,5025,5027],{"id":5026},"retrieval-and-perplexity-validate-fidelity","Retrieval and Perplexity Validate Fidelity",[22,5029,5030],{},"NIAH (Kamradt\u002FRULER): turbo4 31\u002F33 (+3% vs q8_0 30\u002F33); turbo3 + Sparse V 9\u002F9. Multi-key 100% to 32K. Long ctx PPL (32K wikitext-103 50ch): turbo3 +1.64% vs q8_0, Sparse V delta=0. PPL stable: Llama3.1-70B turbo4 +6.3%, Command-R+104B +1.9%.",[4872,5032,5033],{},[22,5034,5035],{},"\"turbo4 beats q8_0 on retrieval (31\u002F33 vs 30\u002F33). Shared failure at 8K\u002F100% is a model weakness, not quantization.\"",[22,5037,5038],{},"Python prototype confirms: turbo4 cosine sim 0.96, MSE 0.0007. Gaussianization exact (kurtosis 900→2.9).",[17,5040,4774],{"id":4773},[4667,5042,5043,5046,5053,5056,5059,5062,5065,5068],{},[4670,5044,5045],{},"Use turbo4 for best quality\u002Fcompression balance (3.8x, +0.23% PPL); turbo3 for max (5.12x block=128, +1% PPL).",[4670,5047,5048,5049,5052],{},"Asymmetric q8_0-K + turbo",[4622,5050,5051],{},"3\u002F4","-V on Q4_K_M weights; symmetric on Q8_0+ or large models.",[4670,5054,5055],{},"Enable Sparse V always (+22% long decode, no PPL hit); Boundary V on deep models.",[4670,5057,5058],{},"Prefill ≥ q8_0 speed; validate decode on your hardware (M5+ best for turbo3).",[4670,5060,5061],{},"Build llama.cpp from fork; test PPL\u002FNIAH on your model before deploy.",[4670,5063,5064],{},"For Apple Silicon max ctx: sysctl iogpu.wired_limit_mb=90% RAM.",[4670,5066,5067],{},"Upstream path: Stable pieces as llama.cpp patches.",[4670,5069,5070],{},"MLX Swift fork for 2.5x faster Apple decode (144 t\u002Fs Qwen3.5-35B-A3B).",{"title":52,"searchDepth":53,"depth":53,"links":5072},[5073,5074,5075,5076,5077],{"id":4863,"depth":53,"text":4864},{"id":4882,"depth":53,"text":4883},{"id":4903,"depth":53,"text":4904},{"id":5026,"depth":53,"text":5027},{"id":4773,"depth":53,"text":4774},[],{"content_references":5080,"triage":5099},[5081,5084,5087,5091,5095],{"type":70,"title":5082,"url":5083,"context":68},"TurboQuant: Redefining AI Efficiency with Extreme Compression","https:\u002F\u002Fresearch.google\u002Fblog\u002Fturboquant-redefining-ai-efficiency-with-extreme-compression\u002F",{"type":65,"title":5085,"url":5086,"context":68},"llama-cpp-turboquant","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fllama-cpp-turboquant",{"type":65,"title":5088,"author":5089,"url":5090,"context":4562},"mlx-swift-lm","ekryski","https:\u002F\u002Fgithub.com\u002Fekryski\u002Fmlx-swift-lm",{"type":65,"title":5092,"author":5093,"url":5094,"context":73},"LLMTest_NeedleInAHaystack","gkamradt","https:\u002F\u002Fgithub.com\u002Fgkamradt\u002FLLMTest_NeedleInAHaystack",{"type":65,"title":5096,"author":5097,"url":5098,"context":73},"RULER","NVIDIA","https:\u002F\u002Fgithub.com\u002FNVIDIA\u002FRULER",{"relevance":82,"novelty":82,"quality":83,"actionability":53,"composite":84,"reasoning":5100},"Category: AI & LLMs. The article discusses a specific implementation of TurboQuant for KV cache compression, which is relevant to AI engineering. However, it lacks practical application details that the target audience could act on immediately, focusing more on technical specifications and performance metrics.","\u002Fsummaries\u002Fturboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary","2026-04-16 03:08:34",{"title":4853,"description":52},{"loc":5101},"2a9849ad35620d4f","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fturboquant_plus.git","summaries\u002Fturboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary",[98,100,99,101],"Implements TurboQuant in llama.cpp for 3.8-6.4x KV cache compression (turbo2\u002F3\u002F4 formats) with PPL near q8_0, matching prefill speed, and 0.9x decode on Apple Silicon, CUDA, AMD—plus Sparse V for +22.8% decode.",[],"plWu_YBdijURN1H3PHIgB0YlJzvsqC5LW0H3Gmp6cl8",{"id":5113,"title":5114,"ai":5115,"body":5120,"categories":5175,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":5176,"navigation":86,"path":5194,"published_at":5195,"question":59,"scraped_at":5196,"seo":5197,"sitemap":5198,"source_id":5199,"source_name":93,"source_type":94,"source_url":5200,"stem":5201,"tags":5202,"thumbnail_url":59,"tldr":5203,"tweet":59,"unknown_tags":5204,"__hash__":5205},"summaries\u002Fsummaries\u002Ftrl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary.md","TRL Code Guide: SFT to GRPO LLM Alignment on T4 GPU",{"provider":7,"model":8,"input_tokens":5116,"output_tokens":5117,"processing_time_ms":5118,"cost_usd":5119},9458,2615,35753,0.00269195,{"type":14,"value":5121,"toc":5169},[5122,5126,5133,5137,5147,5151,5157,5161],[17,5123,5125],{"id":5124},"lora-and-trl-setup-enables-post-training-on-limited-hardware","LoRA and TRL Setup Enables Post-Training on Limited Hardware",[22,5127,5128,5129,5132],{},"Use LoRA (r=8, alpha=16, dropout=0.05, targets=",[4622,5130,5131],{},"'q_proj','k_proj','v_proj','o_proj'",") with TRL trainers to adapt Qwen\u002FQwen2.5-0.5B-Instruct on T4 GPU (16GB). Common args across stages: num_train_epochs=1, gradient_checkpointing=True, bf16 if supported else fp16, logging_steps=10, report_to=\"none\", save_strategy=\"no\". Install stack: torchao>=0.16, trl>=0.20, transformers>=4.45, peft>=0.13, bitsandbytes. Helpers like chat_generate apply chat template, generate with temp=0.7\u002Ftop_p=0.9. Cleanup VRAM with gc.collect() + torch.cuda.empty_cache() between stages to fit in Colab.",[17,5134,5136],{"id":5135},"sft-and-rm-build-imitation-and-reward-signals","SFT and RM Build Imitation and Reward Signals",[22,5138,5139,5140,5143,5144,5146],{},"For Supervised Fine-Tuning, load trl-lib\u002FCapybara (train",[4622,5141,5142],{},":300","), use SFTConfig(per_device_train_batch_size=2, gradient_accumulation_steps=4, learning_rate=2e-4, max_length=768). Trainer imitates high-quality chat responses; post-train inference on \"Explain bias-variance tradeoff in two sentences\" yields coherent output. Reward Modeling on trl-lib\u002Fultrafeedback_binarized (train",[4622,5145,5142],{},") uses RewardConfig(batch_size=2, accum_steps=2, lr=1e-4, max_length=512), LoRA task_type=\"SEQ_CLS\". Trains to score chosen vs. rejected pairs, producing a preference-based reward without explicit RL.",[17,5148,5150],{"id":5149},"dpo-skips-rm-for-direct-preference-alignment","DPO Skips RM for Direct Preference Alignment",[22,5152,5153,5154,5156],{},"DPOTrainer on same ultrafeedback_binarized",[4622,5155,5142],{}," simplifies via implicit rewards: DPOConfig(batch_size=1, accum_steps=4, lr=5e-6, beta=0.1, max_length=512, max_prompt_length=256). Beta controls KL-divergence from reference policy, preventing mode collapse. Optimizes policy to prefer chosen over rejected responses directly, reducing steps vs. traditional RM+PPO.",[17,5158,5160],{"id":5159},"grpo-uses-custom-rewards-to-sharpen-reasoning","GRPO Uses Custom Rewards to Sharpen Reasoning",[22,5162,5163,5164,5168],{},"GRPOTrainer generates num_generations=4 completions per prompt (max_prompt_length=128, max_completion_length=96, max_steps=15), ranks via reward_funcs. Custom dataset: 200 synthetic math problems (e.g., \"Solve 17 + 28 =\", gold=eval). Rewards: correctness_reward (1.0 if last extracted number matches gold else 0), brevity_reward (max(0,1-len(c)\u002F200)",[5165,5166,5167],"em",{},"0.2). GRPOConfig(lr=1e-5, batch=2, accum=2). Inference on \"17+28?\", \"9","7?\", \"100-47?\" produces accurate, concise answers like final numbers, improving verifiable task performance over base.",{"title":52,"searchDepth":53,"depth":53,"links":5170},[5171,5172,5173,5174],{"id":5124,"depth":53,"text":5125},{"id":5135,"depth":53,"text":5136},{"id":5149,"depth":53,"text":5150},{"id":5159,"depth":53,"text":5160},[144],{"content_references":5177,"triage":5191},[5178,5181,5184,5186,5188],{"type":65,"title":5179,"url":5180,"context":68},"TRL","https:\u002F\u002Fgithub.com\u002Fhuggingface\u002Ftrl",{"type":5182,"title":5183,"context":68},"dataset","trl-lib\u002FCapybara",{"type":5182,"title":5185,"context":68},"trl-lib\u002Fultrafeedback_binarized",{"type":65,"title":5187,"context":68},"Qwen\u002FQwen2.5-0.5B-Instruct",{"type":79,"title":5189,"url":5190,"context":4562},"trl_llm_post_training_sft_dpo_grpo_marktechpost.py","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FLLM%20Projects\u002Ftrl_llm_post_training_sft_dpo_grpo_marktechpost.py",{"relevance":4646,"novelty":83,"quality":83,"actionability":4646,"composite":5192,"reasoning":5193},4.55,"Category: AI & LLMs. The article provides a detailed guide on using TRL and LoRA for LLM post-training, addressing practical applications for developers looking to implement AI features. It includes specific configurations and techniques that can be directly applied in production, making it highly actionable.","\u002Fsummaries\u002Ftrl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary","2026-05-01 20:52:08","2026-05-03 17:01:49",{"title":5114,"description":52},{"loc":5194},"79f82c07ea7441fe","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fa-coding-guide-on-llm-post-training-with-trl-from-supervised-fine-tuning-to-dpo-and-grpo-reasoning\u002F","summaries\u002Ftrl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary",[98,101,99],"Train Qwen2.5-0.5B via SFT, RM, DPO, GRPO using TRL+LoRA on Colab T4: configs include r=8 LoRA, 300-sample datasets, epochs=1, small batches\u002Faccum for memory efficiency, custom math rewards boost reasoning.",[],"4miREre7IX2LguMbkA_nsqybys6v0iG-V2aT-eEsJ4g"]