diff --git a/c#/.gitignore b/c#/.gitignore
new file mode 100644
index 0000000..2afa2e2
--- /dev/null
+++ b/c#/.gitignore
@@ -0,0 +1,454 @@
+## Ignore Visual Studio temporary files, build results, and
+## files generated by popular Visual Studio add-ons.
+##
+## Get latest from https://github.com/github/gitignore/blob/master/VisualStudio.gitignore
+
+# User-specific files
+*.rsuser
+*.suo
+*.user
+*.userosscache
+*.sln.docstates
+
+# User-specific files (MonoDevelop/Xamarin Studio)
+*.userprefs
+
+# Mono auto generated files
+mono_crash.*
+
+# Build results
+[Dd]ebug/
+[Dd]ebugPublic/
+[Rr]elease/
+[Rr]eleases/
+x64/
+x86/
+[Ww][Ii][Nn]32/
+[Aa][Rr][Mm]/
+[Aa][Rr][Mm]64/
+bld/
+[Bb]in/
+[Oo]bj/
+[Ll]og/
+[Ll]ogs/
+
+# Visual Studio 2015/2017 cache/options directory
+.vs/
+# Uncomment if you have tasks that create the project's static files in wwwroot
+#wwwroot/
+
+# Visual Studio 2017 auto generated files
+Generated\ Files/
+
+# MSTest test Results
+[Tt]est[Rr]esult*/
+[Bb]uild[Ll]og.*
+
+# NUnit
+*.VisualState.xml
+TestResult.xml
+nunit-*.xml
+
+# Build Results of an ATL Project
+[Dd]ebugPS/
+[Rr]eleasePS/
+dlldata.c
+
+# Benchmark Results
+BenchmarkDotNet.Artifacts/
+
+# .NET
+project.lock.json
+project.fragment.lock.json
+artifacts/
+
+# Tye
+.tye/
+
+# ASP.NET Scaffolding
+ScaffoldingReadMe.txt
+
+# StyleCop
+StyleCopReport.xml
+
+# Files built by Visual Studio
+*_i.c
+*_p.c
+*_h.h
+*.ilk
+*.meta
+*.obj
+*.iobj
+*.pch
+*.pdb
+*.ipdb
+*.pgc
+*.pgd
+*.rsp
+*.sbr
+*.tlb
+*.tli
+*.tlh
+*.tmp
+*.tmp_proj
+*_wpftmp.csproj
+*.log
+*.vspscc
+*.vssscc
+.builds
+*.pidb
+*.svclog
+*.scc
+
+# Chutzpah Test files
+_Chutzpah*
+
+# Visual C++ cache files
+ipch/
+*.aps
+*.ncb
+*.opendb
+*.opensdf
+*.sdf
+*.cachefile
+*.VC.db
+*.VC.VC.opendb
+
+# Visual Studio profiler
+*.psess
+*.vsp
+*.vspx
+*.sap
+
+# Visual Studio Trace Files
+*.e2e
+
+# TFS 2012 Local Workspace
+$tf/
+
+# Guidance Automation Toolkit
+*.gpState
+
+# ReSharper is a .NET coding add-in
+_ReSharper*/
+*.[Rr]e[Ss]harper
+*.DotSettings.user
+
+# TeamCity is a build add-in
+_TeamCity*
+
+# DotCover is a Code Coverage Tool
+*.dotCover
+
+# AxoCover is a Code Coverage Tool
+.axoCover/*
+!.axoCover/settings.json
+
+# Coverlet is a free, cross platform Code Coverage Tool
+coverage*.json
+coverage*.xml
+coverage*.info
+
+# Visual Studio code coverage results
+*.coverage
+*.coveragexml
+
+# NCrunch
+_NCrunch_*
+.*crunch*.local.xml
+nCrunchTemp_*
+
+# MightyMoose
+*.mm.*
+AutoTest.Net/
+
+# Web workbench (sass)
+.sass-cache/
+
+# Installshield output folder
+[Ee]xpress/
+
+# DocProject is a documentation generator add-in
+DocProject/buildhelp/
+DocProject/Help/*.HxT
+DocProject/Help/*.HxC
+DocProject/Help/*.hhc
+DocProject/Help/*.hhk
+DocProject/Help/*.hhp
+DocProject/Help/Html2
+DocProject/Help/html
+
+# Click-Once directory
+publish/
+
+# Publish Web Output
+*.[Pp]ublish.xml
+*.azurePubxml
+# Note: Comment the next line if you want to checkin your web deploy settings,
+# but database connection strings (with potential passwords) will be unencrypted
+*.pubxml
+*.publishproj
+
+# Microsoft Azure Web App publish settings. Comment the next line if you want to
+# checkin your Azure Web App publish settings, but sensitive information contained
+# in these scripts will be unencrypted
+PublishScripts/
+
+# NuGet Packages
+*.nupkg
+# NuGet Symbol Packages
+*.snupkg
+# The packages folder can be ignored because of Package Restore
+**/[Pp]ackages/*
+# except build/, which is used as an MSBuild target.
+!**/[Pp]ackages/build/
+# Uncomment if necessary however generally it will be regenerated when needed
+#!**/[Pp]ackages/repositories.config
+# NuGet v3's project.json files produces more ignorable files
+*.nuget.props
+*.nuget.targets
+
+# Microsoft Azure Build Output
+csx/
+*.build.csdef
+
+# Microsoft Azure Emulator
+ecf/
+rcf/
+
+# Windows Store app package directories and files
+AppPackages/
+BundleArtifacts/
+Package.StoreAssociation.xml
+_pkginfo.txt
+*.appx
+*.appxbundle
+*.appxupload
+
+# Visual Studio cache files
+# files ending in .cache can be ignored
+*.[Cc]ache
+# but keep track of directories ending in .cache
+!?*.[Cc]ache/
+
+# Others
+ClientBin/
+~$*
+*~
+*.dbmdl
+*.dbproj.schemaview
+*.jfm
+*.pfx
+*.publishsettings
+orleans.codegen.cs
+
+# Including strong name files can present a security risk
+# (https://github.com/github/gitignore/pull/2483#issue-259490424)
+#*.snk
+
+# Since there are multiple workflows, uncomment next line to ignore bower_components
+# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
+#bower_components/
+
+# RIA/Silverlight projects
+Generated_Code/
+
+# Backup & report files from converting an old project file
+# to a newer Visual Studio version. Backup files are not needed,
+# because we have git ;-)
+_UpgradeReport_Files/
+Backup*/
+UpgradeLog*.XML
+UpgradeLog*.htm
+ServiceFabricBackup/
+*.rptproj.bak
+
+# SQL Server files
+*.mdf
+*.ldf
+*.ndf
+
+# Business Intelligence projects
+*.rdl.data
+*.bim.layout
+*.bim_*.settings
+*.rptproj.rsuser
+*- [Bb]ackup.rdl
+*- [Bb]ackup ([0-9]).rdl
+*- [Bb]ackup ([0-9][0-9]).rdl
+
+# Microsoft Fakes
+FakesAssemblies/
+
+# GhostDoc plugin setting file
+*.GhostDoc.xml
+
+# Node.js Tools for Visual Studio
+.ntvs_analysis.dat
+node_modules/
+
+# Visual Studio 6 build log
+*.plg
+
+# Visual Studio 6 workspace options file
+*.opt
+
+# Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
+*.vbw
+
+# Visual Studio LightSwitch build output
+**/*.HTMLClient/GeneratedArtifacts
+**/*.DesktopClient/GeneratedArtifacts
+**/*.DesktopClient/ModelManifest.xml
+**/*.Server/GeneratedArtifacts
+**/*.Server/ModelManifest.xml
+_Pvt_Extensions
+
+# Paket dependency manager
+.paket/paket.exe
+paket-files/
+
+# FAKE - F# Make
+.fake/
+
+# CodeRush personal settings
+.cr/personal
+
+# Python Tools for Visual Studio (PTVS)
+__pycache__/
+*.pyc
+
+# Cake - Uncomment if you are using it
+# tools/**
+# !tools/packages.config
+
+# Tabs Studio
+*.tss
+
+# Telerik's JustMock configuration file
+*.jmconfig
+
+# BizTalk build output
+*.btp.cs
+*.btm.cs
+*.odx.cs
+*.xsd.cs
+
+# OpenCover UI analysis results
+OpenCover/
+
+# Azure Stream Analytics local run output
+ASALocalRun/
+
+# MSBuild Binary and Structured Log
+*.binlog
+
+# NVidia Nsight GPU debugger configuration file
+*.nvuser
+
+# MFractors (Xamarin productivity tool) working folder
+.mfractor/
+
+# Local History for Visual Studio
+.localhistory/
+
+# BeatPulse healthcheck temp database
+healthchecksdb
+
+# Backup folder for Package Reference Convert tool in Visual Studio 2017
+MigrationBackup/
+
+# Ionide (cross platform F# VS Code tools) working folder
+.ionide/
+
+# Fody - auto-generated XML schema
+FodyWeavers.xsd
+
+##
+## Visual studio for Mac
+##
+
+
+# globs
+Makefile.in
+*.userprefs
+*.usertasks
+config.make
+config.status
+aclocal.m4
+install-sh
+autom4te.cache/
+*.tar.gz
+tarballs/
+test-results/
+
+# Mac bundle stuff
+*.dmg
+*.app
+
+# content below from: https://github.com/github/gitignore/blob/master/Global/macOS.gitignore
+# General
+.DS_Store
+.AppleDouble
+.LSOverride
+
+# Icon must end with two \r
+Icon
+
+
+# Thumbnails
+._*
+
+# Files that might appear in the root of a volume
+.DocumentRevisions-V100
+.fseventsd
+.Spotlight-V100
+.TemporaryItems
+.Trashes
+.VolumeIcon.icns
+.com.apple.timemachine.donotpresent
+
+# Directories potentially created on remote AFP share
+.AppleDB
+.AppleDesktop
+Network Trash Folder
+Temporary Items
+.apdisk
+
+# content below from: https://github.com/github/gitignore/blob/master/Global/Windows.gitignore
+# Windows thumbnail cache files
+Thumbs.db
+ehthumbs.db
+ehthumbs_vista.db
+
+# Dump file
+*.stackdump
+
+# Folder config file
+[Dd]esktop.ini
+
+# Recycle Bin used on file shares
+$RECYCLE.BIN/
+
+# Windows Installer files
+*.cab
+*.msi
+*.msix
+*.msm
+*.msp
+
+# Windows shortcuts
+*.lnk
+
+# JetBrains Rider
+.idea/
+*.sln.iml
+
+##
+## Visual Studio Code
+##
+.vscode/*
+!.vscode/settings.json
+!.vscode/tasks.json
+!.vscode/launch.json
+!.vscode/extensions.json
diff --git a/c#/Chapter1/Chapter1.cs b/c#/Chapter1/Chapter1.cs
new file mode 100644
index 0000000..a91889a
--- /dev/null
+++ b/c#/Chapter1/Chapter1.cs
@@ -0,0 +1,132 @@
+// See https://aka.ms/new-console-template for more information
+
+using System.Diagnostics;
+using StackExchange.Redis;
+
+namespace Chapter1;
+
+public class Chapter1 {
+    private const int OneWeekInSeconds = 7 * 86400;
+    private const int VoteScore = 432;
+    private const int ArticlesPerPage = 25;
+
+    public static void Main() {
+        new Chapter1().run();
+    }
+
+    private void run() {
+        var con = ConnectionMultiplexer.Connect("localhost");
+        var db = con.GetDatabase();
+
+        var articleId = postArticle(db, "username", "A title", "https://www.google.com");
+        Console.WriteLine("We posted a new article with id: " + articleId);
+        Console.WriteLine("Its HASH looks like:");
+        var articleData = db.HashGetAll("article:" + articleId);
+
+        foreach (var entry in articleData) {
+            Console.WriteLine("    " + entry.Name + ": " + entry.Value);
+        }
+
+        Console.WriteLine();
+
+        articleVote(db, "other_user", "article:" + articleId);
+        var votes = (int?)db.HashGet("article:" + articleId, "votes") ?? 0;
+        Console.WriteLine("We voted for the article, it now has votes: " + votes);
+        Debug.Assert(votes > 1, "Vote count should be greater than 1");
+
+        Console.WriteLine("The currently highest-scoring articles are:");
+        var articles = getArticles(db, 1);
+        printArticles(articles);
+        Debug.Assert(articles.Count >= 1, "Article count is less than 1");
+
+        addGroups(db, articleId, new[]{"new-group"});
+        Console.WriteLine("We added the article to a new group, other articles include:");
+        var groupArticles = getGroupArticles(db, "new-group", 1);
+        printArticles(groupArticles);
+        Debug.Assert(groupArticles.Count >= 1, "Article group count is less than 1");
+    }
+
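+    // Key layout used by postArticle and friends:
+    //   article:<id>  HASH with title, link, user, now, votes
+    //   voted:<id>    SET of voters, expiring after a week so the
+    //                 one-vote-per-user guard disappears when voting closes
+    //   score:/time:  ZSETs mapping the article key to its score / post time
+    // A new article starts with one vote (the poster's own), so its initial
+    // score is its post time plus one VoteScore.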
+    private string postArticle(IDatabase db, string user, string title, string link) {
+        var articleId = db.StringIncrement("article:").ToString();
+
+        var voted = "voted:" + articleId;
+        db.SetAdd(voted, user);
+        db.KeyExpire(voted, TimeSpan.FromSeconds(OneWeekInSeconds));
+
+        var now = DateTimeOffset.Now.ToUnixTimeSeconds();
+        var article = "article:" + articleId;
+        var articleData = new List<HashEntry> {
+            new("title", title),
+            new("link", link),
+            new("user", user),
+            new("now", now.ToString()),
+            new("votes", "1")
+        };
+        db.HashSet(article, articleData.ToArray());
+
+        db.SortedSetAdd("score:", article, now + VoteScore);
+        db.SortedSetAdd("time:", article, now);
+
+        return articleId;
+    }
+
+    private void articleVote(IDatabase db, string user, string article) {
+        var cutoff = DateTimeOffset.Now.ToUnixTimeSeconds() - OneWeekInSeconds;
+        var articleScore = db.SortedSetScore("time:", article) ?? 0;
+
+        if (articleScore < cutoff) {
+            return;
+        }
+
+        var articleId = article.Substring(article.IndexOf(':') + 1);
+
+        if (db.SetAdd("voted:" + articleId, user)) {
+            db.SortedSetIncrement("score:", article, VoteScore);
+            db.HashIncrement(article, "votes");
+        }
+    }
+
+    private List<Dictionary<RedisValue, RedisValue>>
+        getArticles(IDatabase db, int page, string order = "score:") {
+        var start = (page - 1) * ArticlesPerPage;
+        var end = start + ArticlesPerPage - 1;
+
+        var ids = db.SortedSetRangeByRank(order, start, end, order: Order.Descending);
+        var articles = new List<Dictionary<RedisValue, RedisValue>>();
+
+        foreach (var id in ids) {
+            var articleData = db.HashGetAll(id.ToString())
+                .ToDictionary(c => c.Name, c => c.Value);
+            articleData["id"] = id;
+            articles.Add(articleData);
+        }
+
+        return articles;
+    }
+
+    private void printArticles(List<Dictionary<RedisValue, RedisValue>> articles) {
+        foreach (var article in articles) {
+            Console.WriteLine("  id: " + article["id"]);
+            foreach (var articleData in article.Where(c => !c.Key.Equals("id"))) {
+                Console.WriteLine("    " + articleData.Key + ": " + articleData.Value);
+            }
+        }
+    }
+
+    private void addGroups(IDatabase db, string articleId, string[] toAdd) {
+        var article = "article:" + articleId;
+        foreach (var group in toAdd) {
+            db.SetAdd("group:" + group, article);
+        }
+    }
+
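+    // Group listings intersect the group SET with the score:/time: ZSET
+    // (ZINTERSTORE) and cache the result for 60 seconds, trading a little
+    // staleness for not recomputing the intersection on every page view.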
+    private List<Dictionary<RedisValue, RedisValue>> getGroupArticles(IDatabase db, string group, int page, string order = "score:") {
+        var key = order + group;
+        if (!db.KeyExists(key)) {
+            db.SortedSetCombineAndStore(SetOperation.Intersect, key, "group:" + group, order, aggregate: Aggregate.Max);
+            db.KeyExpire(key, TimeSpan.FromSeconds(60));
+        }
+
+        return getArticles(db, page, key);
+    }
+}
diff --git a/c#/Chapter1/Chapter1.csproj b/c#/Chapter1/Chapter1.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter1/Chapter1.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <PropertyGroup>
+    <OutputType>Exe</OutputType>
+    <TargetFramework>net6.0</TargetFramework>
+    <ImplicitUsings>enable</ImplicitUsings>
+    <Nullable>enable</Nullable>
+  </PropertyGroup>
+
+  <ItemGroup>
+    <PackageReference Include="StackExchange.Redis" Version="2.*" />
+  </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter1/Chapter1.sln b/c#/Chapter1/Chapter1.sln
new file mode 100644
index 0000000..76ddc4b
--- /dev/null
+++ b/c#/Chapter1/Chapter1.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter1", "Chapter1.csproj", "{F01C7220-1D82-4691-8EB2-DB79842BEC82}"
+EndProject
+Global
+    GlobalSection(SolutionConfigurationPlatforms) = preSolution
+        Debug|Any CPU = Debug|Any CPU
+        Release|Any CPU = Release|Any CPU
+    EndGlobalSection
+    GlobalSection(ProjectConfigurationPlatforms) = postSolution
+        {F01C7220-1D82-4691-8EB2-DB79842BEC82}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+        {F01C7220-1D82-4691-8EB2-DB79842BEC82}.Debug|Any CPU.Build.0 = Debug|Any CPU
+        {F01C7220-1D82-4691-8EB2-DB79842BEC82}.Release|Any CPU.ActiveCfg = Release|Any CPU
+        {F01C7220-1D82-4691-8EB2-DB79842BEC82}.Release|Any CPU.Build.0 = Release|Any CPU
+    EndGlobalSection
+EndGlobal
diff --git a/c#/Chapter2/CacheRowsThread.cs b/c#/Chapter2/CacheRowsThread.cs
new file mode 100644
index 0000000..af07be6
--- /dev/null
+++ b/c#/Chapter2/CacheRowsThread.cs
@@ -0,0 +1,63 @@
+using StackExchange.Redis;
+using System.Text.Json;
+
+namespace Chapter2;
+
+public class CacheRowsThread {
+    private readonly IDatabase _db;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CacheRowsThread(IDatabase db) {
+        _db = db;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
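+    // Polling protocol: "schedule:" maps a row id to the next time it should
+    // be cached, and "delay:" maps it to the refresh interval in seconds.
+    // A delay <= 0 is the signal to stop caching the row and to delete its
+    // cached copy at "inv:<row-id>".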
+    private void run() {
+        while (!_quit) {
+            var range = _db.SortedSetRangeByRankWithScores("schedule:", 0, 0);
+            var next = range.Length > 0 ? range[0] : (SortedSetEntry?)null;
+            var now = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
+            if (next == null || next.Value.Score > now) {
+                try {
+                    Thread.Sleep(50);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+
+                continue;
+            }
+
+            var rowId = next.Value.Element.ToString();
+            var delay = _db.SortedSetScore("delay:", rowId) ?? 0;
+            if (delay <= 0) {
+                _db.SortedSetRemove("delay:", rowId);
+                _db.SortedSetRemove("schedule:", rowId);
+                _db.KeyDelete("inv:" + rowId);
+                continue;
+            }
+
+            var row = new Inventory(rowId);
+            _db.SortedSetAdd("schedule:", rowId, now + delay);
+            _db.StringSet("inv:" + rowId, JsonSerializer.Serialize(row));
+        }
+    }
+}
diff --git a/c#/Chapter2/Chapter2.cs b/c#/Chapter2/Chapter2.cs
new file mode 100644
index 0000000..ccbff69
--- /dev/null
+++ b/c#/Chapter2/Chapter2.cs
@@ -0,0 +1,244 @@
+using System.Diagnostics;
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+public class Chapter2 {
+    public static void Main() {
+        new Chapter2().run();
+    }
+
+    private void run() {
+        var connection = ConnectionMultiplexer.Connect("localhost");
+        var db = connection.GetDatabase(15);
+
+        testLoginCookies(db);
+        testShoppingCartCookies(db);
+        testCacheRows(db);
+        testCacheRequest(db);
+    }
+
+    private void testLoginCookies(IDatabase conn) {
+        Console.WriteLine("\n----- testLoginCookies -----");
+        var token = Guid.NewGuid().ToString();
+
+        var username = "someUser";
+        updateToken(conn, token, username, "itemX");
+        Console.WriteLine("We just logged in/updated token: " + token);
+        Console.WriteLine($"For user: '{username}'");
+        Console.WriteLine();
+
+        Console.WriteLine("What username do we get when we look up that token?");
+        var r = checkToken(conn, token);
+        Console.WriteLine(r);
+        Console.WriteLine();
+        Debug.Assert(r is not null, "Token is null");
+        Debug.Assert(username.Equals(r), "username retrieved from token does not match initial username.");
+
+        Console.WriteLine("Let's drop the maximum number of cookies to 0 to clean them out");
+        Console.WriteLine("We will start a thread to do the cleaning, then stop it");
+
+        var thread = new CleanSessionsThread(conn, 0);
+        thread.Start();
+        Thread.Sleep(1000);
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The clean sessions thread is still alive?!?");
+        }
+
+        var s = conn.HashLength("login:");
+        Console.WriteLine("The current number of sessions still available is: " + s);
+        Debug.Assert(s == 0, "sessions are not zero");
+    }
+
+    private void testShoppingCartCookies(IDatabase conn) {
+        Console.WriteLine("\n----- testShoppingCartCookies -----");
+        var token = Guid.NewGuid().ToString();
+
+        Console.WriteLine("We'll refresh our session...");
+        updateToken(conn, token, "username", "itemX");
+        Console.WriteLine("And add an item to the shopping cart");
+        addToCart(conn, token, "itemY", 3);
+
+        var r = conn.HashGetAll("cart:" + token);
+
+        Console.WriteLine("Our shopping cart currently has:");
+        foreach (var entry in r) {
+            Console.WriteLine("  " + entry.Name + ": " + entry.Value);
+        }
+
+        Console.WriteLine();
+
+        Debug.Assert(r.Length >= 1, "Shopping cart is empty");
+
+        Console.WriteLine("Let's clean out our sessions and carts");
+        var thread = new CleanFullSessionsThread(conn, 0);
+        thread.Start();
+        Thread.Sleep(1000);
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The clean sessions thread is still alive?!?");
+        }
+
+        r = conn.HashGetAll("cart:" + token);
+        Console.WriteLine("Our shopping cart now contains:");
+        foreach (var entry in r) {
+            Console.WriteLine("  " + entry.Name + ": " + entry.Value);
+        }
+
+        Debug.Assert(r.Length == 0, "cart is not empty");
+    }
+
+    private void testCacheRows(IDatabase conn) {
+        Console.WriteLine("\n----- testCacheRows -----");
+        Console.WriteLine("First, let's schedule caching of itemX every 5 seconds");
+        scheduleRowCache(conn, "itemX", 5);
+        Console.WriteLine("Our schedule looks like:");
+
+        var s = conn.SortedSetRangeByRankWithScores("schedule:", 0, -1);
+        foreach (var entry in s) {
+            Console.WriteLine("  " + entry.Element + ", " + entry.Score);
+        }
+
+        Debug.Assert(s.Length != 0, "schedule set is empty");
+
+        Console.WriteLine("We'll start a caching thread that will cache the data...");
+
+        var thread = new CacheRowsThread(conn);
+        thread.Start();
+        Thread.Sleep(1000);
+        Console.WriteLine("Our cached data looks like:");
+        string? r = conn.StringGet("inv:itemX");
+        Console.WriteLine(r);
+        Debug.Assert(r is not null, "cached data is null");
+        Console.WriteLine();
+
+        Console.WriteLine("We'll check again in 5 seconds...");
+        Thread.Sleep(5000);
+        Console.WriteLine("Notice that the data has changed...");
+        string? r2 = conn.StringGet("inv:itemX");
+        Console.WriteLine(r2);
+        Console.WriteLine();
+        Debug.Assert(r2 is not null, "changed cached data is null");
+        Debug.Assert(!r.Equals(r2), "cached data did not change");
+
+        Console.WriteLine("Let's force un-caching");
+        scheduleRowCache(conn, "itemX", -1);
+        Thread.Sleep(1000);
+        r = conn.StringGet("inv:itemX");
+        Console.WriteLine("The cache was cleared? " + (r == null));
+        Debug.Assert(r is null, "cached data was not un-cached");
+
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The database caching thread is still alive?!?");
+        }
+    }
+
+    private void testCacheRequest(IDatabase conn) {
+        Console.WriteLine("\n----- testCacheRequest -----");
+        var token = Guid.NewGuid().ToString();
+
+        updateToken(conn, token, "username", "itemX");
+        var url = "http://test.com/?item=itemX";
+        Console.WriteLine("We are going to cache a simple request against " + url);
+        var result = cacheRequest(conn, url, s => "content for " + s);
+        Console.WriteLine("We got initial content:\n" + result);
+        Console.WriteLine();
+
+        Debug.Assert(result is not null, "Request was not cached");
+
+        Console.WriteLine("To test that we've cached the request, we'll pass a bad callback");
+        var result2 = cacheRequest(conn, url, null);
+        Console.WriteLine("We ended up getting the same response!\n" + result2);
+
+        Debug.Assert(result.Equals(result2), "Cached request was not altered");
+
+        Debug.Assert(!canCache(conn, "http://test.com/"));
+        Debug.Assert(!canCache(conn, "http://test.com/?item=itemX&_=1234536"));
+    }
+
+    private string? checkToken(IDatabase conn, string token) {
+        return conn.HashGet("login:", token);
+    }
+
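+    // Session bookkeeping used throughout this chapter:
+    //   login:          HASH  token -> user
+    //   recent:         ZSET  token -> last-seen timestamp (drives cleanup)
+    //   viewed:<token>  ZSET  items this session viewed, trimmed to the 25
+    //                         most recent (ranks 0..-26 are removed)
+    //   viewed:         ZSET  global view counter, decremented so that the
+    //                         most-viewed items have the lowest rank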
+    private void updateToken(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
+        conn.HashSet("login:", token, user);
+        conn.SortedSetAdd("recent:", token, timestamp);
+        if (item != null) {
+            conn.SortedSetAdd("viewed:" + token, item, timestamp);
+            conn.SortedSetRemoveRangeByRank("viewed:" + token, 0, -26);
+            conn.SortedSetIncrement("viewed:", item, -1);
+        }
+    }
+
+    private void addToCart(IDatabase conn, string session, string item, int count) {
+        if (count <= 0) {
+            conn.HashDelete("cart:" + session, item);
+        } else {
+            conn.HashSet("cart:" + session, item, count);
+        }
+    }
+
+    private static void scheduleRowCache(IDatabase conn, string rowId, int delay) {
+        conn.SortedSetAdd("delay:", rowId, delay);
+        conn.SortedSetAdd("schedule:", rowId, DateTimeOffset.UtcNow.ToUnixTimeSeconds());
+    }
+
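+    // Pages that canCache approves are stored under "cache:<request-hash>"
+    // for 300 seconds; on a miss the callback generates the content, which
+    // is then cached for subsequent callers.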
+    private string? cacheRequest(IDatabase conn, string request, Func<string, string>? callback) {
+        if (!canCache(conn, request)) {
+            return callback?.Invoke(request);
+        }
+
+        var pageKey = "cache:" + hashRequest(request);
+        var content = conn.StringGet(pageKey);
+
+        if (!content.HasValue && callback != null) {
+            content = callback(request);
+            conn.StringSet(pageKey, content);
+            conn.KeyExpire(pageKey, TimeSpan.FromSeconds(300));
+        }
+
+        return content;
+    }
+
+    private bool canCache(IDatabase conn, string request) {
+        try {
+            var url = new Uri(request);
+            var parameters = new Dictionary<string, string?>();
+            if (!string.IsNullOrEmpty(url.Query)) {
+                foreach (var par in url.Query[1..].Split("&")) {
+                    var pair = par.Split("=", 2);
+                    parameters.Add(pair[0], pair.Length == 2 ? pair[1] : null);
+                }
+            }
+
+            var itemId = extractItemId(parameters);
+            if (itemId == null || isDynamic(parameters)) {
+                return false;
+            }
+
+            var rank = conn.SortedSetRank("viewed:", itemId);
+            return rank is < 10000;
+        } catch (FormatException) {
+            return false;
+        }
+    }
+
+    private bool isDynamic(Dictionary<string, string?> parameters) {
+        return parameters.ContainsKey("_");
+    }
+
+    private string? extractItemId(Dictionary<string, string?> parameters) {
+        parameters.TryGetValue("item", out var result);
+        return result;
+    }
+
+    private string hashRequest(string request) {
+        return request.GetHashCode().ToString();
+    }
+}
diff --git a/c#/Chapter2/Chapter2.csproj b/c#/Chapter2/Chapter2.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter2/Chapter2.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <PropertyGroup>
+    <OutputType>Exe</OutputType>
+    <TargetFramework>net6.0</TargetFramework>
+    <ImplicitUsings>enable</ImplicitUsings>
+    <Nullable>enable</Nullable>
+  </PropertyGroup>
+
+  <ItemGroup>
+    <PackageReference Include="StackExchange.Redis" Version="2.*" />
+  </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter2/Chapter2.sln b/c#/Chapter2/Chapter2.sln
new file mode 100644
index 0000000..2213ece
--- /dev/null
+++ b/c#/Chapter2/Chapter2.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter2", "Chapter2.csproj", "{71562795-1023-4377-9E81-49653BE6E86D}"
+EndProject
+Global
+    GlobalSection(SolutionConfigurationPlatforms) = preSolution
+        Debug|Any CPU = Debug|Any CPU
+        Release|Any CPU = Release|Any CPU
+    EndGlobalSection
+    GlobalSection(ProjectConfigurationPlatforms) = postSolution
+        {71562795-1023-4377-9E81-49653BE6E86D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+        {71562795-1023-4377-9E81-49653BE6E86D}.Debug|Any CPU.Build.0 = Debug|Any CPU
+        {71562795-1023-4377-9E81-49653BE6E86D}.Release|Any CPU.ActiveCfg = Release|Any CPU
+        {71562795-1023-4377-9E81-49653BE6E86D}.Release|Any CPU.Build.0 = Release|Any CPU
+    EndGlobalSection
+EndGlobal
diff --git a/c#/Chapter2/CleanFullSessionsThread.cs b/c#/Chapter2/CleanFullSessionsThread.cs
new file mode 100644
index 0000000..934d0b1
--- /dev/null
+++ b/c#/Chapter2/CleanFullSessionsThread.cs
@@ -0,0 +1,59 @@
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+public class CleanFullSessionsThread {
+    private readonly IDatabase _db;
+    private readonly int _limit;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CleanFullSessionsThread(IDatabase db, int limit) {
+        _db = db;
+        _limit = limit;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
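+    // Cleanup loop: while more than `limit` sessions exist, take up to 100
+    // of the oldest tokens from "recent:" and delete their per-session keys
+    // (viewed:<token> and cart:<token>) plus their login:/recent: entries.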
+    private void run() {
+        while (!_quit) {
+            var size = _db.SortedSetLength("recent:");
+
+            if (size <= _limit) {
+                try {
+                    Thread.Sleep(1000);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+
+                continue;
+            }
+
+            var endIndex = Math.Min(size - _limit, 100);
+            var tokens = _db.SortedSetRangeByRank("recent:", 0, endIndex - 1);
+
+            var sessionKeys = new List<RedisKey>();
+            foreach (var token in tokens) {
+                sessionKeys.Add("viewed:" + token);
+                sessionKeys.Add("cart:" + token);
+            }
+
+            _db.KeyDelete(sessionKeys.ToArray());
+            _db.HashDelete("login:", tokens);
+            _db.SortedSetRemove("recent:", tokens);
+        }
+    }
+}
diff --git a/c#/Chapter2/CleanSessionsThread.cs b/c#/Chapter2/CleanSessionsThread.cs
new file mode 100644
index 0000000..89c4f75
--- /dev/null
+++ b/c#/Chapter2/CleanSessionsThread.cs
@@ -0,0 +1,60 @@
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+// Same polling pattern as CleanFullSessionsThread, but only removes the
+// per-session viewed: history (shopping carts are left alone).
+public class CleanSessionsThread {
+    private readonly IDatabase _db;
+    private readonly int _limit;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CleanSessionsThread(IDatabase db, int limit) {
+        _db = db;
+        _limit = limit;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
+    private void run() {
+        while (!_quit) {
+            var size = _db.SortedSetLength("recent:");
+
+            if (size <= _limit) {
+                try {
+                    Thread.Sleep(1000);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+
+                continue;
+            }
+
+            var endIndex = Math.Min(size - _limit, 100);
+            var tokens = _db.SortedSetRangeByRank("recent:", 0, endIndex - 1);
+
+            var sessionKeys = new List<RedisKey>();
+            foreach (var token in tokens) {
+                sessionKeys.Add("viewed:" + token);
+            }
+
+            _db.KeyDelete(sessionKeys.ToArray());
+            _db.HashDelete("login:", tokens);
+            _db.SortedSetRemove("recent:", tokens);
+        }
+    }
+}
diff --git a/c#/Chapter2/Inventory.cs b/c#/Chapter2/Inventory.cs
new file mode 100644
index 0000000..32d7a17
--- /dev/null
+++ b/c#/Chapter2/Inventory.cs
@@ -0,0 +1,7 @@
+// ReSharper disable NotAccessedPositionalProperty.Global
+// Disabled since this is just for demo purposes
+namespace Chapter2;
+
+public record Inventory(string Id, string Data, long Time) {
+    public Inventory(string id) : this(id, "data to cache...", DateTimeOffset.UtcNow.ToUnixTimeSeconds()) { }
+}
diff --git a/c#/Chapter4/Chapter4.cs b/c#/Chapter4/Chapter4.cs
new file mode 100644
index 0000000..c4ab10e
--- /dev/null
+++ b/c#/Chapter4/Chapter4.cs
@@ -0,0 +1,240 @@
+using System.Diagnostics;
+using System.Reflection;
+using StackExchange.Redis;
+
+namespace Chapter4;
+
+public class Chapter4 {
+    private const string MarketKey = "market:";
+
+    public static void Main() {
+        new Chapter4().run();
+    }
+
+    private void run() {
+        var connection = ConnectionMultiplexer.Connect("localhost");
+        var db = connection.GetDatabase(15);
+
+        testListItem(db, false);
+        testPurchaseItem(db);
+        testBenchmarkUpdateToken(db);
+    }
+
+    private static void testListItem(IDatabase conn, bool nested) {
+        if (!nested) {
+            Console.WriteLine("\n----- testListItem -----");
+        }
+
+        Console.WriteLine("We need to set up just enough state so that a user can list an item");
+        var sellerId = "userX";
+        var item = "itemX";
+        conn.SetAdd("inventory:" + sellerId, item);
+        var i = conn.SetMembers("inventory:" + sellerId);
+
+        Console.WriteLine("The user's inventory has:");
+        foreach (var member in i) {
+            Console.WriteLine("  " + member);
+        }
+
+        Debug.Assert(i.Length > 0, "Inventory is empty");
+        Console.WriteLine();
+
+        Console.WriteLine("Listing the item...");
+        var listResult = listItem(conn, item, sellerId, 10);
+        Console.WriteLine("Listing the item succeeded? " + listResult);
+        Debug.Assert(listResult, "Changes were not committed");
+        var marketItems = conn.SortedSetRangeByRankWithScores(MarketKey, 0, -1);
+        Console.WriteLine("The market contains:");
+        foreach (var marketItem in marketItems) {
+            Console.WriteLine("  " + marketItem.Element + ", " + marketItem.Score);
+        }
+
+        Debug.Assert(marketItems.Length > 0, "Market items is empty");
+    }
+
+    private void testPurchaseItem(IDatabase conn) {
+        Console.WriteLine("\n----- testPurchaseItem -----");
+        testListItem(conn, true);
+
+        Console.WriteLine("We need to set up just enough state so a user can buy an item");
+        conn.HashSet("users:userY", "funds", "125");
+        var r = conn.HashGetAll("users:userY");
+        Console.WriteLine("The user has some money:");
+        foreach (var entry in r) {
+            Console.WriteLine("  " + entry.Name + ": " + entry.Value);
+        }
+
+        Debug.Assert(r.Length > 0, "User hash not found!");
+        var funds = r.Any(a => a.Name == "funds");
+        Debug.Assert(funds, "Didn't find a hash entry for funds");
+        Console.WriteLine();
+
+        Console.WriteLine("Let's purchase an item");
+        var purchaseResult = purchaseItem(conn, "userY", "itemX", "userX", 10);
+        Console.WriteLine("Purchasing an item succeeded? " + purchaseResult);
+        Debug.Assert(purchaseResult, "Changes were not committed");
+
+        r = conn.HashGetAll("users:userY");
+        Console.WriteLine("Their money is now:");
+        foreach (var entry in r) {
+            Console.WriteLine("  " + entry.Name + ": " + entry.Value);
+        }
+
+        Debug.Assert(r.Length > 0, "User data is empty");
+
+        var buyer = "userY";
+        var i = conn.SetMembers("inventory:" + buyer);
+        Console.WriteLine("Their inventory is now:");
+        foreach (var member in i) {
+            Console.WriteLine("  " + member);
+        }
+
+        Debug.Assert(i.Length > 0, "Buyer inventory is empty");
+        Debug.Assert(i.Any(item => item.Equals("itemX")), "itemX was not moved to buyer's inventory");
+        Debug.Assert(conn.SortedSetScore(MarketKey, "itemX.userX") == null, "Market still contains itemX.userX");
+    }
+
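+    // listItem retries for up to 5 seconds: each attempt snapshots the
+    // seller's inventory and turns the snapshot into transaction Conditions,
+    // so the MULTI/EXEC only commits if the inventory was untouched in the
+    // meantime - an optimistic lock in the style of WATCH.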
" + listResult); + Debug.Assert(listResult, "Changes were not committed"); + var marketItems = conn.SortedSetRangeByRankWithScores(MarketKey, 0, -1); + Console.WriteLine("The market contains:"); + foreach (var marketItem in marketItems) { + Console.WriteLine(" " + marketItem.Element + ", " + marketItem.Score); + } + + Debug.Assert(marketItems.Length > 0, "Market items is empty"); + } + + private void testPurchaseItem(IDatabase conn) { + Console.WriteLine("\n----- testPurchaseItem -----"); + testListItem(conn, true); + + Console.WriteLine("We need to set up just enough state so a user can buy an item"); + conn.HashSet("users:userY", "funds", "125"); + var r = conn.HashGetAll("users:userY"); + Console.WriteLine("The user has some money:"); + foreach (var entry in r) { + Console.WriteLine(" " + entry.Name + ": " + entry.Value); + } + + Debug.Assert(r.Length > 0, "User hashset not found!"); + var funds = r.Any(a => a.Name == "funds"); + Debug.Assert(funds, "Didn't find a hash entry for funds"); + Console.WriteLine(); + + Console.WriteLine("Let's purchase an item"); + var purchaseResult = purchaseItem(conn, "userY", "itemX", "userX", 10); + Console.WriteLine("Purchasing an item succeeded? " + purchaseResult); + Debug.Assert(purchaseResult, "Changes were not committed"); + + r = conn.HashGetAll("users:userY"); + Console.WriteLine("Their money is now:"); + foreach (var entry in r) { + Console.WriteLine(" " + entry.Name + ": " + entry.Value); + } + + Debug.Assert(r.Length > 0, "Used data is empty"); + + var buyer = "userY"; + var i = conn.SetMembers("inventory:" + buyer); + Console.WriteLine("Their inventory is now:"); + foreach (var member in i) { + Console.WriteLine(" " + member); + } + + Debug.Assert(i.Length > 0, "Buyer inventory is empty"); + Debug.Assert(i.Any(item => item.Equals("itemX")), "itemX was not moved to buyers inventory"); + Debug.Assert(conn.SortedSetScore(MarketKey, "itemX.userX") == null, "Market still contains itemX.userX"); + } + + private static bool listItem(IDatabase conn, string itemId, string sellerId, double price) { + var inventory = "inventory:" + sellerId; + var item = itemId + '.' + sellerId; + var end = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + 5000; + + while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) { + // The client has a multiplexer approach to connections. + // As a result we can't use multi/exec/watch directly. + // We can however add transaction conditions which are functioning similarly behind the scenes. + // So we will add them by hand in order to simulate a watch request. + // We will load the set, verify that cardinality remained the same as well as that items were unchanged + var inventorySet = conn.SetMembers(inventory); + var trans = conn.CreateTransaction(); + + trans.AddCondition(Condition.SetContains(inventory, itemId)); + trans.AddCondition(Condition.SetLengthEqual(inventory, inventorySet.Length)); + foreach (var invItem in inventorySet) { + trans.AddCondition(Condition.SetContains(inventory, invItem)); + } + + trans.SortedSetAddAsync(MarketKey, item, price); + trans.SetRemoveAsync(inventory, itemId); + var committed = trans.Execute(); + + if (!committed) { + continue; + } + + return true; + } + + return false; + } + + private static bool purchaseItem( + IDatabase conn, string buyerId, string itemId, string sellerId, double listedPrice) { + var buyer = "users:" + buyerId; + var seller = "users:" + sellerId; + var item = itemId + '.' 
+    private void benchmarkUpdateToken(IDatabase conn, int duration) {
+        var methods = new List<Action<IDatabase, string, string, string?>> {
+            updateToken,
+            updateTokenPipeline
+        };
+
+        Console.WriteLine("{0,-20} {1,-10} {2,-15} {3,-30}", "Update method", "#Runs", "Delta(seconds)", "#Runs to delta(seconds) ratio");
+        foreach (var method in methods) {
+            var count = 0;
+            var start = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
+            var end = start + (duration * 1000);
+            while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+                count++;
+                method(conn, "token", "user", "item");
+            }
+
+            var delta = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() - start;
+            Console.WriteLine("{0,-20} {1,-10} {2,-15} {3,-30}",
+                method.GetMethodInfo().Name,
+                count,
+                delta / 1000,
+                count / (delta / 1000));
+        }
+    }
+
+    private static void updateToken(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000;
+
+        conn.HashSet("login:", token, user);
+        conn.SortedSetAdd("recent:", token, timestamp);
+        if (item != null) {
+            conn.SortedSetAdd("viewed:" + token, item, timestamp);
+            conn.SortedSetRemoveRangeByRank("viewed:" + token, 0, -26);
+            conn.SortedSetIncrement("viewed:", item, -1);
+        }
+    }
+
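+    // Pipelined variant: the *Async calls queue all five commands without
+    // blocking on individual replies, and WaitAll blocks once until every
+    // reply has arrived, so the whole update costs roughly one round trip.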
+    private static void updateTokenPipeline(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000;
+        var tasks = new List<Task> {
+            conn.HashSetAsync("login:", token, user),
+            conn.SortedSetAddAsync("recent:", token, timestamp)
+        };
+        if (item != null) {
+            tasks.Add(conn.SortedSetAddAsync("viewed:" + token, item, timestamp));
+            tasks.Add(conn.SortedSetRemoveRangeByRankAsync("viewed:" + token, 0, -26));
+            tasks.Add(conn.SortedSetIncrementAsync("viewed:", item, -1));
+        }
+
+        conn.WaitAll(tasks.ToArray());
+    }
+}
diff --git a/c#/Chapter4/Chapter4.csproj b/c#/Chapter4/Chapter4.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter4/Chapter4.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <PropertyGroup>
+    <OutputType>Exe</OutputType>
+    <TargetFramework>net6.0</TargetFramework>
+    <ImplicitUsings>enable</ImplicitUsings>
+    <Nullable>enable</Nullable>
+  </PropertyGroup>
+
+  <ItemGroup>
+    <PackageReference Include="StackExchange.Redis" Version="2.*" />
+  </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter4/Chapter4.sln b/c#/Chapter4/Chapter4.sln
new file mode 100644
index 0000000..6396e93
--- /dev/null
+++ b/c#/Chapter4/Chapter4.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter4", "Chapter4.csproj", "{BE353086-9CF8-486F-B94D-7C7677932731}"
+EndProject
+Global
+    GlobalSection(SolutionConfigurationPlatforms) = preSolution
+        Debug|Any CPU = Debug|Any CPU
+        Release|Any CPU = Release|Any CPU
+    EndGlobalSection
+    GlobalSection(ProjectConfigurationPlatforms) = postSolution
+        {BE353086-9CF8-486F-B94D-7C7677932731}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+        {BE353086-9CF8-486F-B94D-7C7677932731}.Debug|Any CPU.Build.0 = Debug|Any CPU
+        {BE353086-9CF8-486F-B94D-7C7677932731}.Release|Any CPU.ActiveCfg = Release|Any CPU
+        {BE353086-9CF8-486F-B94D-7C7677932731}.Release|Any CPU.Build.0 = Release|Any CPU
+    EndGlobalSection
+EndGlobal
diff --git a/c#/README.md b/c#/README.md
new file mode 100644
index 0000000..c18ce4a
--- /dev/null
+++ b/c#/README.md
@@ -0,0 +1,16 @@
+## Prerequisites
+
+* A running Redis instance, as mentioned in the book
+* .NET 6.0 or higher installed
+
+## Running
+
+Open a command line/terminal in the `c#` directory and do one of the following:
+
+* Windows:
+  `runChapter [chapter-number]`. Use numbers 1 through 9, depending on which chapter's examples you want to run,
+  e.g. `runChapter 1`
+
+* Linux/Mac:
+  `./runChapter.sh [chapter-number]`. Use numbers 1 through 9, depending on which chapter's examples you want to run,
+  e.g. `./runChapter.sh 1`
\ No newline at end of file
diff --git a/c#/runChapter.bat b/c#/runChapter.bat
new file mode 100644
index 0000000..1bf5800
--- /dev/null
+++ b/c#/runChapter.bat
@@ -0,0 +1,26 @@
+@echo off
+set chapter=Chapter%1
+set dir=%~dp0%chapter%
+rem Keep the old directory before changing
+if exist %dir%\ (
+pushd .
+cd %dir%
+
+rem build and run our project
+echo:
+echo -------------------------------------------------------------
+echo ^|                     Building %chapter%                     ^|
+echo -------------------------------------------------------------
+echo:
+dotnet build
+echo:
+echo -------------------------------------------------------------
+echo ^|                     Running %chapter%                      ^|
+echo -------------------------------------------------------------
+echo:
+dotnet run
+rem Return to the original directory
+popd
+) else (
+  echo Could not locate directory "%dir%"
+)
\ No newline at end of file
diff --git a/c#/runChapter.sh b/c#/runChapter.sh
new file mode 100644
index 0000000..7fd7a83
--- /dev/null
+++ b/c#/runChapter.sh
@@ -0,0 +1,22 @@
+#!/bin/sh
+dir="Chapter"$1
+
+if [ -d "$dir" ]; then
+  echo
+  echo -------------------------------------------------------------
+  echo "|                     Building $dir                     |"
+  echo -------------------------------------------------------------
+  echo
+  cd "$dir"
+  dotnet build
+  echo
+  echo -------------------------------------------------------------
+  echo "|                     Running $dir                      |"
+  echo -------------------------------------------------------------
+  echo
+  dotnet run
+  exit 0
+else
+  echo "Could not locate directory $dir"
+  exit 1
+fi
\ No newline at end of file