[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-minimal-numpy-rnn-for-char-level-text-gen-summary":3,"summaries-facets-categories":207,"summary-related-minimal-numpy-rnn-for-char-level-text-gen-summary":4612},{"id":4,"title":5,"ai":6,"body":13,"categories":184,"created_at":186,"date_modified":186,"description":178,"extension":187,"faq":186,"featured":188,"kicker_label":186,"meta":189,"navigation":190,"path":191,"published_at":192,"question":186,"scraped_at":186,"seo":193,"sitemap":194,"source_id":195,"source_name":196,"source_type":197,"source_url":198,"stem":199,"tags":200,"thumbnail_url":186,"tldr":204,"tweet":186,"unknown_tags":205,"__hash__":206},"summaries\u002Fsummaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary.md","Minimal NumPy RNN for Char-Level Text Gen",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",10743,1482,11844,0.0024192,{"type":14,"value":15,"toc":177},"minimark",[16,21,38,61,81,85,104,128,135,139,149,164,174],[17,18,20],"h2",{"id":19},"rnn-architecture-and-one-hot-encoding","RNN Architecture and One-Hot Encoding",[22,23,24,25,29,30,33,34,37],"p",{},"Load text from 'input.txt' into ",[26,27,28],"code",{},"data",", extract unique ",[26,31,32],{},"chars"," for vocabulary (vocab_size = len(chars)). Map chars to indices with ",[26,35,36],{},"char_to_ix"," and reverse. Use one-hot encoding: inputs are lists of indices turned into (vocab_size, 1) vectors with 1 at input index.",[22,39,40,41,44,45,48,49,52,53,56,57,60],{},"Hidden layer size fixed at 100 neurons (",[26,42,43],{},"hidden_size=100","), sequence length 25 (",[26,46,47],{},"seq_length=25","), learning rate 0.1. Weights initialized small: ",[26,50,51],{},"Wxh = np.random.randn(100, vocab_size)*0.01"," (input-to-hidden), ",[26,54,55],{},"Whh"," (hidden-to-hidden, 100x100), ",[26,58,59],{},"Why"," (hidden-to-output, vocab_size x 100). Biases zero-initialized. 
Scaling by 0.01 keeps initial activations small for tanh stability and breaks symmetry so hidden units learn distinct features.",[22,62,63,64,67,68,71,72,75,76,80],{},"Forward step per timestep t: ",[26,65,66],{},"hs[t] = tanh(Wxh @ xs[t] + Whh @ hs[t-1] + bh)",", then ",[26,69,70],{},"ys[t] = Why @ hs[t] + by",", softmax ",[26,73,74],{},"ps[t] = exp(ys[t])\u002Fsum(exp(ys[t]))"," for next-char probs. Loss is the negative log-likelihood: sum of -log(ps[t][",[77,78,79],"span",{},"target","]).",[17,82,84],{"id":83},"backpropagation-through-time-and-gradients","Backpropagation Through Time and Gradients",[22,86,87,88,91,92,95,96,99,100,103],{},"In ",[26,89,90],{},"lossFun(inputs, targets, hprev)",": forward pass stores xs, hs, ys, ps for all timesteps. Backward pass starts from output: ",[26,93,94],{},"dy = ps[t].copy(); dy[target] -= 1"," (softmax + cross-entropy gradient simplifies to this). Accumulate ",[26,97,98],{},"dWhy += dy @ hs[t].T",", ",[26,101,102],{},"dby += dy",".",[22,105,106,107,110,111,114,115,99,118,99,121,99,124,127],{},"Propagate to hidden: ",[26,108,109],{},"dh = Why.T @ dy + dhnext"," (dhnext from future timestep), ",[26,112,113],{},"dhraw = (1 - hs[t]**2) * dh"," (tanh derivative), then ",[26,116,117],{},"dbh += dhraw",[26,119,120],{},"dWxh += dhraw @ xs[t].T",[26,122,123],{},"dWhh += dhraw @ hs[t-1].T",[26,125,126],{},"dhnext = Whh.T @ dhraw"," for prior timestep.",[22,129,130,131,134],{},"Clip all gradients to [",[77,132,133],{},"-5, 5","] to prevent exploding gradients. Returns the total loss, all dparams, and the final h for the next sequence.",[17,136,138],{"id":137},"adagrad-training-and-text-sampling","Adagrad Training and Text Sampling",[22,140,141,142,145,146,103],{},"Infinite loop sweeps data left-to-right in seq_length=25 chunks: reset hprev=zeros every epoch (when p >= len(data)). 
Compute inputs\u002Ftargets as char indices for ",[77,143,144],{},"data[p:p+25]"," and shifted ",[77,147,148],{},"data[p+1:p+26]",[22,150,151,152,155,156,159,160,163],{},"Every 100 iterations: sample 200 chars from model starting with ",[77,153,154],{},"inputs[0]"," as seed: forward like training but pick ",[26,157,158],{},"ix = np.random.choice(vocab_size, p=ps.ravel())",", decode to text, print. Smooth loss: ",[26,161,162],{},"smooth_loss = smooth_loss * 0.999 + loss * 0.001",", printed every 100 iterations.",[22,165,166,167,99,170,173],{},"Update with Adagrad: mem vars track ",[26,168,169],{},"mem += dparam**2",[26,171,172],{},"param -= lr * dparam \u002F sqrt(mem + 1e-8)",". Advance p by 25, n += 1. Initial smooth_loss = -log(1\u002Fvocab_size)*25.",[22,175,176],{},"Common issues: input.txt must exceed seq_length+1 chars (else IndexError in loss); large datasets like Shakespeare need 100k+ iters for loss ~3.0 and coherent text.",{"title":178,"searchDepth":179,"depth":179,"links":180},"",2,[181,182,183],{"id":19,"depth":179,"text":20},{"id":83,"depth":179,"text":84},{"id":137,"depth":179,"text":138},[185],"Data Science & Visualization",null,"md",false,{},true,"\u002Fsummaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary","2026-04-08 21:21:20",{"title":5,"description":178},{"loc":191},"7fdb0ca0899660d5","Andrej Karpathy Gists","article","https:\u002F\u002Funknown","summaries\u002Fminimal-numpy-rnn-for-char-level-text-gen-summary",[201,202,203],"python","machine-learning","deep-learning","Build a vanilla RNN language model from scratch in ~170 lines of NumPy: processes text chunks of 25 chars, trains with BPTT and Adagrad, generates samples every 100 
iterations.",[],"ytSsn8v5OXyfyPKcCX7WUMYHNuzZlEdeMutJWRk1eM0",[208,211,213,216,218,221,224,227,230,232,234,236,238,240,242,244,247,249,251,253,255,257,259,262,264,266,268,270,272,274,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,416,418,420,422,424,426,428,430,432,434,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466,468,470,472,474,476,478,480,482,484,486,488,490,492,494,496,498,500,502,504,506,508,510,512,514,516,518,520,522,524,526,528,530,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566,568,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,114
8,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,194
8,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,274
8,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,354
8,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688,3690,3692,3694,3696,3698,3700,3702,3704,3706,3708,3710,3712,3714,3716,3718,3720,3722,3724,3726,3728,3730,3732,3734,3736,3738,3740,3742,3744,3746,3748,3750,3752,3754,3756,3758,3760,3762,3764,3766,3768,3770,3772,3774,3776,3778,3780,3782,3784,3786,3788,3790,3792,3794,3796,3798,3800,3802,3804,3806,3808,3810,3812,3814,3816,3818,3820,3822,3824,3826,3828,3830,3832,3834,3836,3838,3840,3842,3844,3846,3848,3850,3852,3854,3856,3858,3860,3862,3864,3866,3868,3870,3872,3874,3876,3878,3880,3882,3884,3886,3888,3890,3892,3894,3896,3898,3900,3902,3904,3906,3908,3910,3912,3914,3916,3918,3920,3922,3924,3926,3928,3930,3932,3934,3936,3938,3940,3942,3944,3946,3948,3950,3952,3954,3956,3958,3960,3962,3964,3966,3968,3970,3972,3974,3976,3978,3980,3982,3984,3986,3988,3990,3992,3994,3996,3998,4000,4002,4004,4006,4008,4010,4012,4014,4016,4018,4020,4022,4024,4026,4028,4030,4032,4034,4036,4038,4040,4042,4044,4046,4048,4050,4052,4054,4056,4058,4060,4062,4064,4066,4068,4070,4072,4074,4076,4078,4080,4082,4084,4086,4088,4090,4092,4094,4096,4098,4100,4102,4104,4106,4108,4110,4112,4114,4116,4118,4120,4122,4124,4126,4128,4130,4132,4134,4136,4138,4140,4142,4144,4146,4148,4150,4152,4154,4156,4158,4160,4162,4164,4166,4168,4170,4172,4174,4176,4178,4180,4182,4184,4186,4188,4190,4192,4194,4196,4198,4200,4202,4204,4206,4208,4210,4212,4214,4216,4218,4220,4222,4224,4226,4228,4230,4232,4234,4236,4238,4240,4242,4244,4246,4248,4250,4252,4254,4256,4258,4260,4262,4264,4266,4268,4270,4272,4274,4276,4278,4280,4282,4284,4286,4288,4290,4292,4294,4296,4298,4300,4302,4304,4306,4308,4310,4312,4314,4316,4318,4320,4322,4324,4326,4328,4330,4332,4334,4336,4338,4340,4342,4344,4346,434
8,4350,4352,4354,4356,4358,4360,4362,4364,4366,4368,4370,4372,4374,4376,4378,4380,4382,4384,4386,4388,4390,4392,4394,4396,4398,4400,4402,4404,4406,4408,4410,4412,4414,4416,4418,4420,4422,4424,4426,4428,4430,4432,4434,4436,4438,4440,4442,4444,4446,4448,4450,4452,4454,4456,4458,4460,4462,4464,4466,4468,4470,4472,4474,4476,4478,4480,4482,4484,4486,4488,4490,4492,4494,4496,4498,4500,4502,4504,4506,4508,4510,4512,4514,4516,4518,4520,4522,4524,4526,4528,4530,4532,4534,4536,4538,4540,4542,4544,4546,4548,4550,4552,4554,4556,4558,4560,4562,4564,4566,4568,4570,4572,4574,4576,4578,4580,4582,4584,4586,4588,4590,4592,4594,4596,4598,4600,4602,4604,4606,4608,4610],{"categories":209},[210],"Business & SaaS",{"categories":212},[210],{"categories":214},[215],"AI News & Trends",{"categories":217},[],{"categories":219},[220],"AI Automation",{"categories":222},[223],"Marketing & Growth",{"categories":225},[226],"Design & Frontend",{"categories":228},[229],"Software Engineering",{"categories":231},[220],{"categories":233},[],{"categories":235},[226],{"categories":237},[226],{"categories":239},[220],{"categories":241},[226],{"categories":243},[226],{"categories":245},[246],"AI & LLMs",{"categories":248},[226],{"categories":250},[226],{"categories":252},[],{"categories":254},[226],{"categories":256},[226],{"categories":258},[246],{"categories":260},[261],"Developer 
Productivity",{"categories":263},[246],{"categories":265},[246],{"categories":267},[246],{"categories":269},[215],{"categories":271},[246],{"categories":273},[220],{"categories":275},[210],{"categories":277},[215],{"categories":279},[223],{"categories":281},[],{"categories":283},[],{"categories":285},[220],{"categories":287},[220],{"categories":289},[220],{"categories":291},[223],{"categories":293},[246],{"categories":295},[261],{"categories":297},[215],{"categories":299},[],{"categories":301},[],{"categories":303},[],{"categories":305},[185],{"categories":307},[],{"categories":309},[220],{"categories":311},[229],{"categories":313},[220],{"categories":315},[220],{"categories":317},[246],{"categories":319},[223],{"categories":321},[220],{"categories":323},[],{"categories":325},[],{"categories":327},[],{"categories":329},[226],{"categories":331},[226],{"categories":333},[220],{"categories":335},[223],{"categories":337},[261],{"categories":339},[226],{"categories":341},[246],{"categories":343},[229],{"categories":345},[246],{"categories":347},[],{"categories":349},[220],{"categories":351},[246],{"categories":353},[261],{"categories":355},[261],{"categories":357},[],{"categories":359},[223],{"categories":361},[210],{"categories":363},[246],{"categories":365},[210],{"categories":367},[210],{"categories":369},[220],{"categories":371},[223],{"categories":373},[220],{"categories":375},[210],{"categories":377},[220],{"categories":379},[226],{"categories":381},[246],{"categories":383},[226],{"categories":385},[246],{"categories":387},[210],{"categories":389},[246],{"categories":391},[223],{"categories":393},[],{"categories":395},[246],{"categories":397},[210],{"categories":399},[],{"categories":401},[215],{"categories":403},[229],{"categories":405},[],{"categories":407},[246],{"categories":409},[226],{"categories":411},[246],{"categories":413},[226],{"categories":415},[],{"categories":417},[220],{"categories":419},[],{"categories":421},[],{"categories":423},[],{"categories":4
25},[246],{"categories":427},[],{"categories":429},[246],{"categories":431},[246],{"categories":433},[226],{"categories":435},[246],{"categories":437},[261],{"categories":439},[220],{"categories":441},[223],{"categories":443},[261],{"categories":445},[261],{"categories":447},[261],{"categories":449},[223],{"categories":451},[223],{"categories":453},[246],{"categories":455},[246],{"categories":457},[226],{"categories":459},[210],{"categories":461},[226],{"categories":463},[229],{"categories":465},[210],{"categories":467},[210],{"categories":469},[210],{"categories":471},[226],{"categories":473},[],{"categories":475},[],{"categories":477},[246],{"categories":479},[246],{"categories":481},[229],{"categories":483},[246],{"categories":485},[246],{"categories":487},[],{"categories":489},[246],{"categories":491},[246],{"categories":493},[],{"categories":495},[246],{"categories":497},[215],{"categories":499},[215],{"categories":501},[],{"categories":503},[],{"categories":505},[223],{"categories":507},[223],{"categories":509},[229],{"categories":511},[246],{"categories":513},[],{"categories":515},[],{"categories":517},[220],{"categories":519},[246],{"categories":521},[246],{"categories":523},[],{"categories":525},[246,210],{"categories":527},[246],{"categories":529},[],{"categories":531},[246],{"categories":533},[246],{"categories":535},[],{"categories":537},[],{"categories":539},[220],{"categories":541},[246],{"categories":543},[246],{"categories":545},[220],{"categories":547},[246],{"categories":549},[],{"categories":551},[],{"categories":553},[246],{"categories":555},[],{"categories":557},[246],{"categories":559},[246],{"categories":561},[],{"categories":563},[220],{"categories":565},[226],{"categories":567},[],{"categories":569},[220,570],"DevOps & 
Cloud",{"categories":572},[246],{"categories":574},[220],{"categories":576},[246],{"categories":578},[],{"categories":580},[],{"categories":582},[],{"categories":584},[],{"categories":586},[246],{"categories":588},[220],{"categories":590},[],{"categories":592},[220],{"categories":594},[],{"categories":596},[246],{"categories":598},[],{"categories":600},[],{"categories":602},[],{"categories":604},[],{"categories":606},[220],{"categories":608},[226],{"categories":610},[246],{"categories":612},[223],{"categories":614},[215],{"categories":616},[210],{"categories":618},[261],{"categories":620},[],{"categories":622},[220],{"categories":624},[220],{"categories":626},[246],{"categories":628},[],{"categories":630},[],{"categories":632},[],{"categories":634},[220],{"categories":636},[],{"categories":638},[220],{"categories":640},[220],{"categories":642},[215],{"categories":644},[220],{"categories":646},[246],{"categories":648},[],{"categories":650},[246],{"categories":652},[],{"categories":654},[215],{"categories":656},[220,657],"Product 
Strategy",{"categories":659},[229],{"categories":661},[570],{"categories":663},[657],{"categories":665},[246],{"categories":667},[220],{"categories":669},[],{"categories":671},[215],{"categories":673},[215],{"categories":675},[220],{"categories":677},[],{"categories":679},[220],{"categories":681},[246],{"categories":683},[246],{"categories":685},[261],{"categories":687},[246],{"categories":689},[],{"categories":691},[246,229],{"categories":693},[215],{"categories":695},[246],{"categories":697},[215],{"categories":699},[220],{"categories":701},[215],{"categories":703},[],{"categories":705},[229],{"categories":707},[210],{"categories":709},[],{"categories":711},[220],{"categories":713},[220],{"categories":715},[220],{"categories":717},[220],{"categories":719},[210],{"categories":721},[226],{"categories":723},[223],{"categories":725},[],{"categories":727},[220],{"categories":729},[],{"categories":731},[215],{"categories":733},[215],{"categories":735},[215],{"categories":737},[220],{"categories":739},[215],{"categories":741},[246],{"categories":743},[261],{"categories":745},[246],{"categories":747},[229],{"categories":749},[246,261],{"categories":751},[261],{"categories":753},[261],{"categories":755},[261],{"categories":757},[261],{"categories":759},[246],{"categories":761},[],{"categories":763},[],{"categories":765},[223],{"categories":767},[],{"categories":769},[246],{"categories":771},[261],{"categories":773},[246],{"categories":775},[226],{"categories":777},[229],{"categories":779},[],{"categories":781},[246],{"categories":783},[261],{"categories":785},[223],{"categories":787},[215],{"categories":789},[229],{"categories":791},[246],{"categories":793},[],{"categories":795},[229],{"categories":797},[226],{"categories":799},[210],{"categories":801},[210],{"categories":803},[],{"categories":805},[226],{"categories":807},[210],{"categories":809},[215],{"categories":811},[261],{"categories":813},[220],{"categories":815},[220],{"categories":817},[246],{"categories":819},[2
46],{"categories":821},[215],{"categories":823},[215],{"categories":825},[261],{"categories":827},[215],{"categories":829},[],{"categories":831},[657],{"categories":833},[220],{"categories":835},[215],{"categories":837},[215],{"categories":839},[215],{"categories":841},[246],{"categories":843},[220],{"categories":845},[220],{"categories":847},[210],{"categories":849},[210],{"categories":851},[246],{"categories":853},[215],{"categories":855},[],{"categories":857},[246],{"categories":859},[210],{"categories":861},[220],{"categories":863},[220],{"categories":865},[220],{"categories":867},[226],{"categories":869},[220],{"categories":871},[261],{"categories":873},[215],{"categories":875},[215],{"categories":877},[215],{"categories":879},[215],{"categories":881},[215],{"categories":883},[],{"categories":885},[],{"categories":887},[261],{"categories":889},[215],{"categories":891},[215],{"categories":893},[215],{"categories":895},[],{"categories":897},[246],{"categories":899},[],{"categories":901},[],{"categories":903},[226],{"categories":905},[210],{"categories":907},[],{"categories":909},[215],{"categories":911},[220],{"categories":913},[220],{"categories":915},[220],{"categories":917},[223],{"categories":919},[220],{"categories":921},[],{"categories":923},[215],{"categories":925},[215],{"categories":927},[246],{"categories":929},[],{"categories":931},[223],{"categories":933},[223],{"categories":935},[246],{"categories":937},[215],{"categories":939},[210],{"categories":941},[229],{"categories":943},[246],{"categories":945},[],{"categories":947},[246],{"categories":949},[246],{"categories":951},[229],{"categories":953},[246],{"categories":955},[246],{"categories":957},[246],{"categories":959},[223],{"categories":961},[215],{"categories":963},[246],{"categories":965},[246],{"categories":967},[215],{"categories":969},[220],{"categories":971},[261],{"categories":973},[210],{"categories":975},[246],{"categories":977},[261],{"categories":979},[261],{"categories":981},[],{"categ
ories":983},[223],{"categories":985},[215],{"categories":987},[215],{"categories":989},[261],{"categories":991},[220],{"categories":993},[220],{"categories":995},[220],{"categories":997},[220],{"categories":999},[226],{"categories":1001},[246],{"categories":1003},[246],{"categories":1005},[657],{"categories":1007},[246],{"categories":1009},[246],{"categories":1011},[220],{"categories":1013},[210],{"categories":1015},[223],{"categories":1017},[],{"categories":1019},[210],{"categories":1021},[210],{"categories":1023},[],{"categories":1025},[226],{"categories":1027},[246],{"categories":1029},[],{"categories":1031},[],{"categories":1033},[215],{"categories":1035},[215],{"categories":1037},[215],{"categories":1039},[215],{"categories":1041},[],{"categories":1043},[215],{"categories":1045},[246],{"categories":1047},[246],{"categories":1049},[],{"categories":1051},[215],{"categories":1053},[215],{"categories":1055},[210],{"categories":1057},[246],{"categories":1059},[],{"categories":1061},[],{"categories":1063},[215],{"categories":1065},[215],{"categories":1067},[215],{"categories":1069},[246],{"categories":1071},[215],{"categories":1073},[215],{"categories":1075},[215],{"categories":1077},[215],{"categories":1079},[215],{"categories":1081},[],{"categories":1083},[220],{"categories":1085},[246],{"categories":1087},[223],{"categories":1089},[210],{"categories":1091},[220],{"categories":1093},[246],{"categories":1095},[],{"categories":1097},[223],{"categories":1099},[215],{"categories":1101},[215],{"categories":1103},[215],{"categories":1105},[215],{"categories":1107},[261],{"categories":1109},[229],{"categories":1111},[],{"categories":1113},[246],{"categories":1115},[220],{"categories":1117},[220],{"categories":1119},[220],{"categories":1121},[570],{"categories":1123},[220],{"categories":1125},[246],{"categories":1127},[246],{"categories":1129},[229],{"categories":1131},[570],{"categories":1133},[185],{"categories":1135},[246],{"categories":1137},[185],{"categories":1139},[
],{"categories":1141},[223],{"categories":1143},[223],{"categories":1145},[226],{"categories":1147},[570],{"categories":1149},[220],{"categories":1151},[246],{"categories":1153},[246],{"categories":1155},[220],{"categories":1157},[220],{"categories":1159},[220],{"categories":1161},[261],{"categories":1163},[261],{"categories":1165},[220],{"categories":1167},[220],{"categories":1169},[],{"categories":1171},[220],{"categories":1173},[220],{"categories":1175},[246],{"categories":1177},[185],{"categories":1179},[220],{"categories":1181},[220],{"categories":1183},[220],{"categories":1185},[220],{"categories":1187},[210],{"categories":1189},[226],{"categories":1191},[215],{"categories":1193},[229],{"categories":1195},[570],{"categories":1197},[229],{"categories":1199},[185],{"categories":1201},[],{"categories":1203},[229],{"categories":1205},[],{"categories":1207},[],{"categories":1209},[229],{"categories":1211},[246],{"categories":1213},[],{"categories":1215},[],{"categories":1217},[],{"categories":1219},[210],{"categories":1221},[],{"categories":1223},[],{"categories":1225},[185],{"categories":1227},[246],{"categories":1229},[570],{"categories":1231},[246],{"categories":1233},[],{"categories":1235},[220],{"categories":1237},[261],{"categories":1239},[261],{"categories":1241},[223],{"categories":1243},[223],{"categories":1245},[223],{"categories":1247},[570],{"categories":1249},[229],{"categories":1251},[220],{"categories":1253},[210],{"categories":1255},[210],{"categories":1257},[229],{"categories":1259},[226],{"categories":1261},[185],{"categories":1263},[226],{"categories":1265},[],{"categories":1267},[246],{"categories":1269},[220],{"categories":1271},[220],{"categories":1273},[261],{"categories":1275},[220],{"categories":1277},[220],{"categories":1279},[226],{"categories":1281},[226],{"categories":1283},[220],{"categories":1285},[570],{"categories":1287},[246],{"categories":1289},[],{"categories":1291},[223],{"categories":1293},[220],{"categories":1295},[210],{"cate
gories":1297},[220],{"categories":1299},[220],{"categories":1301},[],{"categories":1303},[246],{"categories":1305},[220],{"categories":1307},[220],{"categories":1309},[261],{"categories":1311},[220],{"categories":1313},[246],{"categories":1315},[],{"categories":1317},[220],{"categories":1319},[],{"categories":1321},[226],{"categories":1323},[261],{"categories":1325},[246],{"categories":1327},[229],{"categories":1329},[226],{"categories":1331},[261],{"categories":1333},[185],{"categories":1335},[261],{"categories":1337},[],{"categories":1339},[246],{"categories":1341},[246],{"categories":1343},[657],{"categories":1345},[229],{"categories":1347},[246,220],{"categories":1349},[220],{"categories":1351},[246],{"categories":1353},[220],{"categories":1355},[220,229],{"categories":1357},[220],{"categories":1359},[246],{"categories":1361},[],{"categories":1363},[261],{"categories":1365},[246],{"categories":1367},[220],{"categories":1369},[246],{"categories":1371},[],{"categories":1373},[229],{"categories":1375},[210],{"categories":1377},[220],{"categories":1379},[],{"categories":1381},[185],{"categories":1383},[229],{"categories":1385},[220],{"categories":1387},[229],{"categories":1389},[],{"categories":1391},[220],{"categories":1393},[],{"categories":1395},[220],{"categories":1397},[],{"categories":1399},[],{"categories":1401},[226],{"categories":1403},[261],{"categories":1405},[246],{"categories":1407},[220],{"categories":1409},[],{"categories":1411},[220],{"categories":1413},[229],{"categories":1415},[246],{"categories":1417},[246],{"categories":1419},[229],{"categories":1421},[229],{"categories":1423},[261],{"categories":1425},[210],{"categories":1427},[],{"categories":1429},[246],{"categories":1431},[246],{"categories":1433},[246],{"categories":1435},[220],{"categories":1437},[246],{"categories":1439},[],{"categories":1441},[226],{"categories":1443},[246],{"categories":1445},[220],{"categories":1447},[],{"categories":1449},[246],{"categories":1451},[],{"categories":1453
15],{"categories":4291},[210],{"categories":4293},[220],{"categories":4295},[229],{"categories":4297},[246],{"categories":4299},[246],{"categories":4301},[220],{"categories":4303},[229],{"categories":4305},[220],{"categories":4307},[246],{"categories":4309},[223],{"categories":4311},[],{"categories":4313},[246],{"categories":4315},[],{"categories":4317},[246],{"categories":4319},[246],{"categories":4321},[229],{"categories":4323},[],{"categories":4325},[185],{"categories":4327},[246],{"categories":4329},[220],{"categories":4331},[220],{"categories":4333},[229],{"categories":4335},[261],{"categories":4337},[261],{"categories":4339},[215],{"categories":4341},[246],{"categories":4343},[220],{"categories":4345},[],{"categories":4347},[220],{"categories":4349},[246],{"categories":4351},[215],{"categories":4353},[246],{"categories":4355},[246],{"categories":4357},[246],{"categories":4359},[220],{"categories":4361},[185],{"categories":4363},[246],{"categories":4365},[226],{"categories":4367},[246],{"categories":4369},[246],{"categories":4371},[246],{"categories":4373},[246],{"categories":4375},[],{"categories":4377},[246],{"categories":4379},[185],{"categories":4381},[226],{"categories":4383},[246],{"categories":4385},[226],{"categories":4387},[],{"categories":4389},[],{"categories":4391},[],{"categories":4393},[246],{"categories":4395},[],{"categories":4397},[],{"categories":4399},[],{"categories":4401},[],{"categories":4403},[220],{"categories":4405},[261],{"categories":4407},[220],{"categories":4409},[220],{"categories":4411},[229],{"categories":4413},[210],{"categories":4415},[246],{"categories":4417},[246],{"categories":4419},[246],{"categories":4421},[210],{"categories":4423},[261],{"categories":4425},[],{"categories":4427},[185],{"categories":4429},[223],{"categories":4431},[246],{"categories":4433},[226],{"categories":4435},[261],{"categories":4437},[261],{"categories":4439},[657],{"categories":4441},[220],{"categories":4443},[246],{"categories":4445},[246],{"categ
ories":4447},[261],{"categories":4449},[246],{"categories":4451},[],{"categories":4453},[],{"categories":4455},[570],{"categories":4457},[226],{"categories":4459},[261],{"categories":4461},[246],{"categories":4463},[215],{"categories":4465},[261],{"categories":4467},[210],{"categories":4469},[220],{"categories":4471},[220],{"categories":4473},[215],{"categories":4475},[246],{"categories":4477},[],{"categories":4479},[],{"categories":4481},[],{"categories":4483},[246],{"categories":4485},[],{"categories":4487},[215],{"categories":4489},[],{"categories":4491},[246],{"categories":4493},[],{"categories":4495},[215],{"categories":4497},[220],{"categories":4499},[246],{"categories":4501},[570],{"categories":4503},[246],{"categories":4505},[261],{"categories":4507},[246],{"categories":4509},[261],{"categories":4511},[261],{"categories":4513},[],{"categories":4515},[],{"categories":4517},[261],{"categories":4519},[261],{"categories":4521},[261],{"categories":4523},[],{"categories":4525},[261],{"categories":4527},[220],{"categories":4529},[220],{"categories":4531},[],{"categories":4533},[246],{"categories":4535},[223],{"categories":4537},[185],{"categories":4539},[246],{"categories":4541},[],{"categories":4543},[261],{"categories":4545},[246],{"categories":4547},[657],{"categories":4549},[261],{"categories":4551},[261],{"categories":4553},[223],{"categories":4555},[229],{"categories":4557},[229],{"categories":4559},[],{"categories":4561},[229],{"categories":4563},[246],{"categories":4565},[],{"categories":4567},[],{"categories":4569},[220],{"categories":4571},[],{"categories":4573},[220],{"categories":4575},[220],{"categories":4577},[215],{"categories":4579},[246],{"categories":4581},[215],{"categories":4583},[261],{"categories":4585},[215],{"categories":4587},[229],{"categories":4589},[229],{"categories":4591},[229],{"categories":4593},[215],{"categories":4595},[246],{"categories":4597},[220],{"categories":4599},[570],{"categories":4601},[210],{"categories":4603},[570],{"ca
tegories":4605},[570],{"categories":4607},[229],{"categories":4609},[570],{"categories":4611},[570],[4613,4792,4860,4934],{"id":4614,"title":4615,"ai":4616,"body":4621,"categories":4781,"created_at":186,"date_modified":186,"description":178,"extension":187,"faq":186,"featured":188,"kicker_label":186,"meta":4782,"navigation":190,"path":4783,"published_at":192,"question":186,"scraped_at":186,"seo":4784,"sitemap":4785,"source_id":4786,"source_name":196,"source_type":197,"source_url":198,"stem":4787,"tags":4788,"thumbnail_url":186,"tldr":4789,"tweet":186,"unknown_tags":4790,"__hash__":4791},"summaries\u002Fsummaries\u002Fnumpy-batched-lstm-forward-backward-summary.md","NumPy Batched LSTM Forward\u002FBackward",{"provider":7,"model":8,"input_tokens":4617,"output_tokens":4618,"processing_time_ms":4619,"cost_usd":4620},8684,1415,14034,0.0019739,{"type":14,"value":4622,"toc":4775},[4623,4627,4630,4634,4641,4690,4693,4697,4707,4762,4765,4769,4772],[17,4624,4626],{"id":4625},"parameter-initialization-for-stable-training","Parameter Initialization for Stable Training",[22,4628,4629],{},"LSTM weights form a single matrix WLSTM of shape (input_size + hidden_size + 1, 4 * hidden_size), with +1 for biases as the first row. Use Xavier initialization: random normal scaled by 1\u002Fsqrt(input_size + hidden_size). Set biases to zero initially, but apply 'fancy_forget_bias_init=3' to forget gate biases (indices hidden_size:2*hidden_size) to start with a positive bias, encouraging forget gates to stay open (remember by default) early in training, since raw gate pre-activations are ~N(0,1).",[17,4631,4633],{"id":4632},"batched-forward-pass-logic","Batched Forward Pass Logic",[22,4635,4636,4637,4640],{},"Input X: (n,b,input_size). Hidden d = WLSTM.shape",[77,4638,4639],{},"1","\u002F4. Init c0\u002Fh0 as zeros((b,d)) if None. 
For each timestep t:",[4642,4643,4644,4664,4673,4676,4682],"ul",{},[4645,4646,4647,4648,4651,4652,4655,4656,4659,4660,4663],"li",{},"Build Hin",[77,4649,4650],{},"t,:,0","=1 (bias), Hin",[77,4653,4654],{},"t,:,1:input_size+1","=X",[77,4657,4658],{},"t",", Hin",[77,4661,4662],{},"t,:,input_size+1:","=prev_h (h0 at t=0).",[4645,4665,4666,4667,4669,4670,4672],{},"Compute raw IFOG",[77,4668,4658],{}," = Hin",[77,4671,4658],{}," @ WLSTM (main compute).",[4645,4674,4675],{},"Gates: sigmoid on first 3*d (input\u002Fforget\u002Foutput), tanh on last d (gate candidate).",[4645,4677,4678,4679,4681],{},"Cell C",[77,4680,4658],{}," = input_gate * gate_candidate + forget_gate * prev_c.",[4645,4683,4684,4685,4687,4688,80],{},"Output Hout",[77,4686,4658],{}," = output_gate * tanh(C",[77,4689,4658],{},[22,4691,4692],{},"Cache stores all intermediates (Hin, IFOG, IFOGf, C, Ct, etc.) for backward. Returns full Hout (n,b,d), final C\u002FH, cache.",[17,4694,4696],{"id":4695},"backward-pass-gradient-computation","Backward Pass Gradient Computation",[22,4698,4699,4700,4703,4704,4706],{},"Input dHout_in (n,b,d). Accumulate dC",[77,4701,4702],{},"n-1","\u002FdHout",[77,4705,4702],{}," if provided for state carryover. 
Reverse loop over t:",[4642,4708,4709,4717,4726,4729,4744],{},[4645,4710,4711,4712,4714,4715,103],{},"dIFOGf output slice (2d:3d) = tanh(C",[77,4713,4658],{},") * dHout",[77,4716,4658],{},[4645,4718,4719,4720,4722,4723,4725],{},"dC",[77,4721,4658],{}," from tanh' * output_gate * dHout",[77,4724,4658],{},", plus forget\u002Finput contributions to prev_c.",[4645,4727,4728],{},"Backprop activations: tanh' on gate candidate, sigmoid'=(y(1-y)) on gates.",[4645,4730,4731,4732,4734,4735,4737,4738,4740,4741,4743],{},"dWLSTM += Hin",[77,4733,4658],{},".T @ dIFOG",[77,4736,4658],{},"; dHin",[77,4739,4658],{}," = dIFOG",[77,4742,4658],{}," @ WLSTM.T.",[4645,4745,4746,4747,4749,4750,4753,4754,4757,4758,4761],{},"Extract dX",[77,4748,4658],{}," = dHin",[77,4751,4752],{},"t,1:input+1","; propagate dHout",[77,4755,4756],{},"t-1","\u002Fdh0 from dHin",[77,4759,4760],{},"t,input+1:","; dc0\u002Fdh0 similarly.",[22,4763,4764],{},"Returns dX (n,b,input), dWLSTM, dc0, dh0.",[17,4766,4768],{"id":4767},"verification-ensures-correctness","Verification Ensures Correctness",[22,4770,4771],{},"Test 1 (sequential vs batch): n=5,b=3,d=4,input=10. Run forward sequentially (one timestep at a time, carrying c\u002Fh), confirm Hout matches full batch forward.",[22,4773,4774],{},"Test 2 (gradient check): Numerical grad = (fwd(+δ) - fwd(-δ))\u002F(2δ), δ=1e-5. Relative error threshold warning=1e-2, error=1. Checks every element of X\u002FWLSTM\u002Fc0\u002Fh0 against analytic grads from loss = sum(H * wrand). 
All params pass with low error, confirming backprop accuracy.",{"title":178,"searchDepth":179,"depth":179,"links":4776},[4777,4778,4779,4780],{"id":4625,"depth":179,"text":4626},{"id":4632,"depth":179,"text":4633},{"id":4695,"depth":179,"text":4696},{"id":4767,"depth":179,"text":4768},[185],{},"\u002Fsummaries\u002Fnumpy-batched-lstm-forward-backward-summary",{"title":4615,"description":178},{"loc":4783},"ed69ec8dcc565dc4","summaries\u002Fnumpy-batched-lstm-forward-backward-summary",[201,202,203],"Efficient pure NumPy LSTM processes batched sequences (n,b,input_size); init with Xavier + forget bias=3; verified via sequential match and numerical gradients.",[],"5dD3n1TS6LbPVttHG7t_U-CvXEtPl5LjBkztvfD9Gxw",{"id":4793,"title":4794,"ai":4795,"body":4800,"categories":4849,"created_at":186,"date_modified":186,"description":178,"extension":187,"faq":186,"featured":188,"kicker_label":186,"meta":4850,"navigation":190,"path":4851,"published_at":192,"question":186,"scraped_at":186,"seo":4852,"sitemap":4853,"source_id":4854,"source_name":196,"source_type":197,"source_url":198,"stem":4855,"tags":4856,"thumbnail_url":186,"tldr":4857,"tweet":186,"unknown_tags":4858,"__hash__":4859},"summaries\u002Fsummaries\u002Fpolicy-gradients-for-pong-100-line-rl-agent-summary.md","Policy Gradients for Pong: 100-Line RL Agent",{"provider":7,"model":8,"input_tokens":4796,"output_tokens":4797,"processing_time_ms":4798,"cost_usd":4799},12952,1480,13868,0.00286,{"type":14,"value":4801,"toc":4843},[4802,4806,4809,4812,4816,4823,4827,4833,4837,4840],[17,4803,4805],{"id":4804},"network-architecture-and-forwardbackward-passes","Network Architecture and Forward\u002FBackward Passes",[22,4807,4808],{},"Build a fully connected policy network with 200 ReLU hidden units: input is 80x80=6400D (binary diff frame), W1 (200x6400 Xavier init), ReLU, W2 (200x1), sigmoid for P(UP=action 2). Forward: h = ReLU(W1 @ x), p = sigmoid(W2 @ h). 
Sample action stochastically: UP if uniform() \u003C p else DOWN.",[22,4810,4811],{},"Backward computes policy gradient analytically. For episode: stack epx (inputs), eph (hiddens), epdlogp (y - p where y=1 for UP). dW2 = eph.T @ epdlogp. dh = np.outer(epdlogp, W2), zero ReLU grads (eph\u003C=0), dW1 = dh.T @ epx. Accumulate into grad_buffer over batch_size=10 episodes.",[17,4813,4815],{"id":4814},"image-preprocessing-for-atari-pong","Image Preprocessing for Atari Pong",[22,4817,4818,4819,4822],{},"Transform 210x160x3 uint8 frame: crop top\u002Fbottom to 160x160 (rows 35:195), downsample 2x to 80x80 grayscale (I",[77,4820,4821],{},"::2,::2,0","), binarize (set bg 144\u002F109=0, else=1), flatten to 6400D float. Use difference frames x = cur_x - prev_x (motion highlights ball\u002Fpaddles, zeros static bg). This reduces noise, enables end-to-end from pixels.",[17,4824,4826],{"id":4825},"reward-discounting-and-advantage-normalization","Reward Discounting and Advantage Normalization",[22,4828,4829,4830,4832],{},"Pong rewards: +1 win, -1 lose (sparse, at episode end). For trajectory drs: discount backwards with gamma=0.99, reset running sum at r",[77,4831,4658],{},"!=0 (game boundaries). Standardize discounted_epr to mean=0, std=1 (controls gradient variance). Modulate: epdlogp *= discounted_epr (REINFORCE: grad log pi(a|s) * advantage).",[17,4834,4836],{"id":4835},"training-loop-and-optimization","Training Loop and Optimization",[22,4838,4839],{},"OpenAI Gym Pong-v0. Loop: prepro obs, forward policy, sample\u002Fact, record x\u002Fh\u002Fdlogp\u002Fr. On done: compute discounted\u002Fcentered advantages, backward, add to grad_buffer. Every 10 eps: RMSProp update (decay=0.99, lr=1e-4): g \u002F (sqrt(rms_cache) + 1e-5), reset buffer. Track running_reward (EWMA 0.99), save model every 100 eps. Render optional. 
Resume from save.p.",[22,4841,4842],{},"Prints episode rewards; agent learns to beat random policy quickly, human-level after ~1-2hr CPU (per blog link in comments).",{"title":178,"searchDepth":179,"depth":179,"links":4844},[4845,4846,4847,4848],{"id":4804,"depth":179,"text":4805},{"id":4814,"depth":179,"text":4815},{"id":4825,"depth":179,"text":4826},{"id":4835,"depth":179,"text":4836},[229],{},"\u002Fsummaries\u002Fpolicy-gradients-for-pong-100-line-rl-agent-summary",{"title":4794,"description":178},{"loc":4851},"7c1c3951efe2f58d","summaries\u002Fpolicy-gradients-for-pong-100-line-rl-agent-summary",[201,202,203],"Train a 2-layer NN to play Atari Pong from raw pixels using REINFORCE policy gradients. Uses 80x80 binary diff frames, discounts rewards with gamma=0.99, standardizes advantages, RMSProp updates every 10 episodes. Converges on CPU in hours.",[],"6XMa-na9tAra5BBDuY7gL83XBVHrWa_0VpHOibbLrNQ",{"id":4861,"title":4862,"ai":4863,"body":4868,"categories":4908,"created_at":186,"date_modified":186,"description":178,"extension":187,"faq":186,"featured":188,"kicker_label":186,"meta":4909,"navigation":190,"path":4920,"published_at":4921,"question":186,"scraped_at":4922,"seo":4923,"sitemap":4924,"source_id":4925,"source_name":4926,"source_type":197,"source_url":4927,"stem":4928,"tags":4929,"thumbnail_url":186,"tldr":4931,"tweet":186,"unknown_tags":4932,"__hash__":4933},"summaries\u002Fsummaries\u002Fpreprocessing-swings-cnn-accuracy-from-65-to-87-on-summary.md","Preprocessing Swings CNN Accuracy from 65% to 87% on CIFAR-10",{"provider":7,"model":8,"input_tokens":4864,"output_tokens":4865,"processing_time_ms":4866,"cost_usd":4867},8876,1567,16564,0.00205185,{"type":14,"value":4869,"toc":4903},[4870,4874,4885,4889,4896,4900],[17,4871,4873],{"id":4872},"scale-pixels-to-stabilize-gradients-and-boost-baseline-performance","Scale Pixels to Stabilize Gradients and Boost Baseline Performance",[22,4875,4876,4877,4880,4881,4884],{},"Train CNNs on raw CIFAR-10 images (32x32x3 
pixels, 0-255 range) without preprocessing for a 65.47% test accuracy baseline after 10 epochs using Adam optimizer and sparse categorical cross-entropy. Large pixel values (up to 255) cause exploding gradients: ∂L\u002F∂w ≈ 255 × δ, leading to overshooting and oscillations in weight updates. Normalize by dividing by 255.0 to scale to ",[77,4878,4879],{},"0,1",", reducing gradients to 1 × δ for smooth convergence, raising accuracy to 69.38%. Standardization (Z-score: (x - μ)\u002Fσ per channel) matches this at 69.38%, centering data at mean 0 and std 1—E",[77,4882,4883],{},"z"," = 0 and Var(z) = 1 proven via linearity of expectation and variance properties—but offers no extra gain for CNNs on images, as basic normalization suffices for stable training.",[17,4886,4888],{"id":4887},"use-geometric-augmentation-for-invariance-but-avoid-photometric-overkill","Use Geometric Augmentation for Invariance but Avoid Photometric Overkill",[22,4890,4891,4892,4895],{},"Apply geometric augmentations (RandomFlip horizontal, RandomRotation 0.1, RandomZoom 0.1) after normalization, training 20 epochs: accuracy dips to 67.13% on simple CNN, as added variability challenges the model without deeper capacity. These create rotation\u002Fscale\u002Fflip invariance via affine transformations—e.g., flip: x' = -x, rotation: ",[77,4893,4894],{},"cosθ -sinθ; sinθ cosθ",", zoom: s scaling—forcing feature learning (wheels, wings) over memorization. 
Photometric augmentations (RandomBrightness\u002FContrast 0.2) after normalization catastrophically drop accuracy to 20.62%: clipping saturates pixels to 0\u002F1 (e.g., 0.9 + 0.2 → 1.0), destroying edges\u002Ftextures in low-res 32x32 images, worsening signal-to-noise ratio and erasing discriminative features like airplane wings or cat eyes.",[17,4897,4899],{"id":4898},"stack-normalization-geometric-augs-and-architecture-for-87-accuracy","Stack Normalization, Geometric Augs, and Architecture for 87% Accuracy",[22,4901,4902],{},"Combine Z-score standardization ((X - mean)\u002Fstd, ε=1e-7), geometric augmentations (add RandomTranslation 0.1,0.1), one-hot labels with 0.1 label smoothing (y_smooth = (1-α)y_true + α\u002FK, injecting 0.01 uniform noise across 10 classes to curb overconfidence), and deeper CNN (64-128-256 filters in padded conv blocks, BatchNorm, Dropout 0.2-0.5, MaxPool): achieves 87.32% test accuracy with EarlyStopping (patience=8 on val_acc) and ReduceLROnPlateau (factor=0.5, patience=3). BatchNorm normalizes layer activations: ˆx = (x - μ_B)\u002F√(σ²_B + ε), then γˆx + β for learnable scaling\u002Fshift, stabilizing internal distributions. This pipeline aligns preprocessing with model capacity, proving no single technique wins—success demands tailored combinations avoiding info destruction while enforcing generalization.",{"title":178,"searchDepth":179,"depth":179,"links":4904},[4905,4906,4907],{"id":4872,"depth":179,"text":4873},{"id":4887,"depth":179,"text":4888},{"id":4898,"depth":179,"text":4899},[185],{"content_references":4910,"triage":4915},[4911],{"type":4912,"title":4913,"context":4914},"dataset","CIFAR-10","mentioned",{"relevance":4916,"novelty":4916,"quality":4917,"actionability":4917,"composite":4918,"reasoning":4919},3,4,3.45,"Category: Data Science & Visualization. 
The article discusses preprocessing techniques that significantly improve CNN accuracy on the CIFAR-10 dataset, which is relevant for AI product builders looking to enhance model performance. It provides actionable insights on normalization and augmentation strategies that can be directly applied in practice.","\u002Fsummaries\u002Fpreprocessing-swings-cnn-accuracy-from-65-to-87-on-summary","2026-04-20 16:07:06","2026-04-21 15:25:42",{"title":4862,"description":178},{"loc":4920},"03a80d45cc3addfe","Level Up Coding","https:\u002F\u002Flevelup.gitconnected.com\u002Fwhen-preprocessing-helps-and-when-it-hurts-why-your-image-classification-models-accuracy-varies-a6761f20e09e?source=rss----5517fd7b58a6---4","summaries\u002Fpreprocessing-swings-cnn-accuracy-from-65-to-87-on-summary",[202,203,4930,201],"data-science","Raw CIFAR-10 pixels yield 65% test accuracy; normalization\u002Fstandardization lift to 69%; geometric augmentation maintains ~67%; photometric brightness\u002Fcontrast crashes to 20%; combined pipeline with deeper CNN hits 87%.",[],"Lk6CsNdjDDk9VrZYIAxPRIjBCpRAx_6Kn92kO-p3qmQ",{"id":4935,"title":4936,"ai":4937,"body":4942,"categories":5114,"created_at":186,"date_modified":186,"description":178,"extension":187,"faq":186,"featured":188,"kicker_label":186,"meta":5115,"navigation":190,"path":5124,"published_at":5125,"question":186,"scraped_at":5126,"seo":5127,"sitemap":5128,"source_id":5129,"source_name":5130,"source_type":197,"source_url":5131,"stem":5132,"tags":5133,"thumbnail_url":186,"tldr":5135,"tweet":186,"unknown_tags":5136,"__hash__":5137},"summaries\u002Fsummaries\u002Fbuild-fno-pinn-surrogates-for-darcy-flow-with-phys-summary.md","Build FNO & PINN Surrogates for Darcy Flow with 
PhysicsNeMo",{"provider":7,"model":8,"input_tokens":4938,"output_tokens":4939,"processing_time_ms":4940,"cost_usd":4941},9889,3106,28970,0.00323995,{"type":14,"value":4943,"toc":5108},[4944,4948,4954,4977,5001,5005,5008,5011,5026,5030,5041,5044,5064,5068,5071,5074,5089,5092,5095,5098,5101,5104],[17,4945,4947],{"id":4946},"synthetic-darcy-flow-data-pipeline-from-grf-permeability-to-pressure-solutions","Synthetic Darcy Flow Data Pipeline: From GRF Permeability to Pressure Solutions",[22,4949,4950,4951,4953],{},"The core skill taught is generating high-fidelity training data for operator learning on the 2D Darcy equation: -∇·(k∇u) = f over ",[77,4952,4879],{},"² with Dirichlet BCs u=0. Start with DarcyFlowDataGenerator(resolution=32, length_scale=0.15, variance=1.0). It builds a Gaussian Random Field (GRF) covariance matrix for permeability k(x,y) = exp(GRF), using exponential kernel exp(-dist²\u002F(2*length_scale²)) + jitter, Cholesky decomposed for efficient sampling: z ~ N(0,I), samples = L @ z.",[22,4955,4956,4957,4960,4961,4964,4965,4968,4969,4972,4973,4976],{},"Solve for pressure u using iterative Jacobi: for interior points, u",[77,4958,4959],{},"i,j"," = (k_e u",[77,4962,4963],{},"i,j+1"," + k_w u",[77,4966,4967],{},"i,j-1"," + k_n u",[77,4970,4971],{},"i-1,j"," + k_s u",[77,4974,4975],{},"i+1,j"," + dx² f) \u002F (k_e + k_w + k_n + k_s), converging in ~5000 steps or tol=1e-6. Generate n_samples=200 train\u002F50 test pairs. Wrap in PyTorch Dataset with channel dim and optional z-score normalization (store mean\u002Fstd for denorm). Use DataLoader(batch_size=16). Principle: GRF captures realistic heterogeneous permeability (e.g., subsurface flows); finite differences provide ground-truth without external solvers. Common mistake: Underdamped length_scale (>0.2) yields smooth k, poor generalization—use 0.1-0.15 for multiscale. 
Quality check: Visualize 3 samples side-by-side (viridis for k, hot for u) to confirm pressure pools in high-k regions.",[4978,4979,4982],"pre",{"className":4980,"code":4981,"language":201,"meta":178,"style":178},"language-python shiki shiki-themes github-light github-dark","# Key generation snippet\ngenerator = DarcyFlowDataGenerator(resolution=32, length_scale=0.15)\nperm_train, press_train = generator.generate_dataset(200)\n",[26,4983,4984,4991,4996],{"__ignoreMap":178},[77,4985,4988],{"class":4986,"line":4987},"line",1,[77,4989,4990],{},"# Key generation snippet\n",[77,4992,4993],{"class":4986,"line":179},[77,4994,4995],{},"generator = DarcyFlowDataGenerator(resolution=32, length_scale=0.15)\n",[77,4997,4998],{"class":4986,"line":4916},[77,4999,5000],{},"perm_train, press_train = generator.generate_dataset(200)\n",[17,5002,5004],{"id":5003},"fourier-neural-operator-spectral-kernels-for-resolution-independent-mapping","Fourier Neural Operator: Spectral Kernels for Resolution-Independent Mapping",[22,5006,5007],{},"FNO learns function-to-function operators k → u by parameterizing Fourier multipliers. Key blocks: SpectralConv2d(in_ch=1, out_ch=1, modes1=8, modes2=8) does FFT → low-freq multiply (weights ~1\u002F(in*out)) → iFFT; handles wraparound with dual weights for positive\u002Fnegative freqs. FNOBlock adds local Conv2d(1x1) residual + GELU. Full FourierNeuralOperator2D: lift k (32x32x1) + grid (x,y linspace 0-1) via Linear(3→width=32), pad=5, 4 FNOBlocks, unpad, project Linear(32→128→1). ~100k params. Forward: permute to NCHW, cat grid, process, return NC(1)HW.",[22,5009,5010],{},"Why spectral? Convolution = Fourier multiply; truncating high modes (modes=12 max for 64res) ignores noise, enables zero-shot super-res. Trade-off: Padding needed for FFT modes; fix via consistent pad\u002Funpad. Train with MSE on full fields (no points). Mistake: Forgetting grid encoding—FNOs are translation-equivariant but need pos for bounded domains. 
Eval: Relative L2 = ||u_pred - u|| \u002F ||u|| \u003C 1e-3 good for surrogates.",[4978,5012,5014],{"className":4980,"code":5013,"language":201,"meta":178,"style":178},"fno = FourierNeuralOperator2D(modes1=8, modes2=8, width=32, n_layers=4).to(device)\n# Forward: out = fno(perm_batch)  # learns k → u operator\n",[26,5015,5016,5021],{"__ignoreMap":178},[77,5017,5018],{"class":4986,"line":4987},[77,5019,5020],{},"fno = FourierNeuralOperator2D(modes1=8, modes2=8, width=32, n_layers=4).to(device)\n",[77,5022,5023],{"class":4986,"line":179},[77,5024,5025],{},"# Forward: out = fno(perm_batch)  # learns k → u operator\n",[17,5027,5029],{"id":5028},"physics-informed-nns-pde-residuals-without-full-data","Physics-Informed NNs: PDE Residuals Without Full Data",[22,5031,5032,5033,5036,5037,5040],{},"PINNs solve unsupervised via multi-task loss on sparse\u002Fno data. PINN_MLP(input_dim=3: x,y,k → u): Fourier embedding (sin\u002Fcos(2π B · ",[77,5034,5035],{},"x,y","), B fixed rand, 64 freqs) + k, then Tanh MLP ",[77,5038,5039],{},"256→128→...→1",", Xavier init. Loss (lambda_data=1, pde=1, bc=10): data MSE(u_pred, u_obs), PDE residual -k(u_xx + u_yy) -1 via dual autograd (grad(u,x)→u_x→u_xx), BC MSE(u_bc=0). Collocation: sample interior\u002Fpde\u002Fbc points uniformly.",[22,5042,5043],{},"Principle: Autodiff enforces physics everywhere; Fourier feats boost freq capture vs ReLU. Trade-off: Stiff losses (tune lambdas, start data>>physics); slower than data-driven (grad graph). Mistake: No requires_grad_(True) on coords or forgetting create_graph=True for Hessians. 
Quality: Balance losses \u003C1e-4 each; physics loss drops signal overfit.",[4978,5045,5047],{"className":4980,"code":5046,"language":201,"meta":178,"style":178},"pinn = PINN_MLP(hidden_dims=[128]*4, n_frequencies=64).to(device)\nloss_fn = DarcyPINNLoss()\n# Usage: losses = loss_fn(pinn, x_data,y_data,k_data,u_data, x_pde,...)\n",[26,5048,5049,5054,5059],{"__ignoreMap":178},[77,5050,5051],{"class":4986,"line":4987},[77,5052,5053],{},"pinn = PINN_MLP(hidden_dims=[128]*4, n_frequencies=64).to(device)\n",[77,5055,5056],{"class":4986,"line":179},[77,5057,5058],{},"loss_fn = DarcyPINNLoss()\n",[77,5060,5061],{"class":4986,"line":4916},[77,5062,5063],{},"# Usage: losses = loss_fn(pinn, x_data,y_data,k_data,u_data, x_pde,...)\n",[17,5065,5067],{"id":5066},"cnn-surrogate-baseline-and-inference-benchmarking","CNN Surrogate Baseline and Inference Benchmarking",[22,5069,5070],{},"Add convolutional surrogate: UNet-like with Conv2d blocks as baseline (not physics-aware). Train all (FNO\u002FPINN\u002FCNN) via Trainer: Adam(lr=1e-3), MSE\u002Fdata loss for supervised, full physics loss for PINN. Loop: train_epoch (zero_grad→pred→loss→backward→step), validate no_grad MSE, save best val state, CosineAnnealLR. Plot semilogy train\u002Fval curves.",[22,5072,5073],{},"Benchmark: Time 1000 inferences on test set (torch.no_grad(), sync). FNO fastest (spectral lift), CNN mid, PINN slowest (autodiff). Save torch.save(model.state_dict(), 'fno_darcy.pth'). Principle: Surrogates 1000x faster than FD solvers for repeated k. Trade-off: FNO best gen (res-invariant), PINN data-efficient but eval slow. 
Post-train: Denorm preds, L2\u002Frel err plots.",[4978,5075,5077],{"className":4980,"code":5076,"language":201,"meta":178,"style":178},"trainer = Trainer(fno, Adam(fno.parameters(),1e-3))\nhistory = trainer.train(train_loader, test_loader, 100)\n",[26,5078,5079,5084],{"__ignoreMap":178},[77,5080,5081],{"class":4986,"line":4987},[77,5082,5083],{},"trainer = Trainer(fno, Adam(fno.parameters(),1e-3))\n",[77,5085,5086],{"class":4986,"line":179},[77,5087,5088],{},"history = trainer.train(train_loader, test_loader, 100)\n",[22,5090,5091],{},"\"The Fourier Neural Operator (FNO) learns mappings between function spaces by parameterizing the integral kernel in Fourier space. Key insight: Convolution in physical space = multiplication in Fourier space.\"",[22,5093,5094],{},"\"Physics-Informed Neural Networks (PINNs) incorporate physical laws directly into the loss function... residual of the PDE at collocation points.\"",[22,5096,5097],{},"\"GRF for permeability: realistic heterogeneous fields critical for subsurface modeling—smooth k leads to trivial solutions.\"",[22,5099,5100],{},"\"Benchmark shows FNO at 50ms\u002Finference vs FD Jacobi 2s—key for real-time surrogates in optimization loops.\"",[22,5102,5103],{},"\"Fourier features in PINN: sine activations capture high freqs better than Tanh alone, converging 2x faster.\"",[5105,5106,5107],"style",{},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: 
var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":178,"searchDepth":179,"depth":179,"links":5109},[5110,5111,5112,5113],{"id":4946,"depth":179,"text":4947},{"id":5003,"depth":179,"text":5004},{"id":5028,"depth":179,"text":5029},{"id":5066,"depth":179,"text":5067},[185],{"content_references":5116,"triage":5121},[5117],{"type":5118,"title":5119,"url":5120,"context":4914},"tool","NVIDIA PhysicsNeMo","https:\u002F\u002Fgithub.com\u002FNVIDIA\u002Fphysicsnemo",{"relevance":4917,"novelty":4916,"quality":4917,"actionability":4917,"composite":5122,"reasoning":5123},3.8,"Category: AI & LLMs. The article provides a detailed step-by-step guide on building surrogate models for Darcy flow using PhysicsNeMo, which directly addresses practical applications in AI engineering. It includes specific coding examples and techniques that can be implemented, making it actionable for developers looking to integrate AI into their projects.","\u002Fsummaries\u002Fbuild-fno-pinn-surrogates-for-darcy-flow-with-phys-summary","2026-04-13 17:07:34","2026-04-13 17:53:26",{"title":4936,"description":178},{"loc":5124},"70fa59cd85bd7438","MarkTechPost","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F13\u002Fa-step-by-step-coding-tutorial-on-nvidia-physicsnemo-darcy-flow-fnos-pinns-surrogate-models-and-inference-benchmarking\u002F","summaries\u002Fbuild-fno-pinn-surrogates-for-darcy-flow-with-phys-summary",[202,203,201,5134],"ai-tools","Step-by-step Colab guide: generate 2D Darcy datasets via GRF & finite differences, implement\u002Ftrain FNO operators and PINNs, add CNN baselines, benchmark inference speeds for fast physics surrogates.",[],"4aRIDAtT3k5p3j_0yt0EECKKCYyaQXTBCw3QfJ4Qj8w"]