diff --git a/c#/.gitignore b/c#/.gitignore new file mode 100644 index 0000000..2afa2e2 --- /dev/null +++ b/c#/.gitignore @@ -0,0 +1,454 @@ +## Ignore Visual Studio temporary files, build results, and +## files generated by popular Visual Studio add-ons. +## +## Get latest from https://github.com/github/gitignore/blob/master/VisualStudio.gitignore + +# User-specific files +*.rsuser +*.suo +*.user +*.userosscache +*.sln.docstates + +# User-specific files (MonoDevelop/Xamarin Studio) +*.userprefs + +# Mono auto generated files +mono_crash.* + +# Build results +[Dd]ebug/ +[Dd]ebugPublic/ +[Rr]elease/ +[Rr]eleases/ +x64/ +x86/ +[Ww][Ii][Nn]32/ +[Aa][Rr][Mm]/ +[Aa][Rr][Mm]64/ +bld/ +[Bb]in/ +[Oo]bj/ +[Ll]og/ +[Ll]ogs/ + +# Visual Studio 2015/2017 cache/options directory +.vs/ +# Uncomment if you have tasks that create the project's static files in wwwroot +#wwwroot/ + +# Visual Studio 2017 auto generated files +Generated\ Files/ + +# MSTest test Results +[Tt]est[Rr]esult*/ +[Bb]uild[Ll]og.* + +# NUnit +*.VisualState.xml +TestResult.xml +nunit-*.xml + +# Build Results of an ATL Project +[Dd]ebugPS/ +[Rr]eleasePS/ +dlldata.c + +# Benchmark Results +BenchmarkDotNet.Artifacts/ + +# .NET +project.lock.json +project.fragment.lock.json +artifacts/ + +# Tye +.tye/ + +# ASP.NET Scaffolding +ScaffoldingReadMe.txt + +# StyleCop +StyleCopReport.xml + +# Files built by Visual Studio +*_i.c +*_p.c +*_h.h +*.ilk +*.meta +*.obj +*.iobj +*.pch +*.pdb +*.ipdb +*.pgc +*.pgd +*.rsp +*.sbr +*.tlb +*.tli +*.tlh +*.tmp +*.tmp_proj +*_wpftmp.csproj +*.log +*.vspscc +*.vssscc +.builds +*.pidb +*.svclog +*.scc + +# Chutzpah Test files +_Chutzpah* + +# Visual C++ cache files +ipch/ +*.aps +*.ncb +*.opendb +*.opensdf +*.sdf +*.cachefile +*.VC.db +*.VC.VC.opendb + +# Visual Studio profiler +*.psess +*.vsp +*.vspx +*.sap + +# Visual Studio Trace Files +*.e2e + +# TFS 2012 Local Workspace +$tf/ + +# Guidance Automation Toolkit +*.gpState + +# ReSharper is a .NET coding add-in +_ReSharper*/ +*.[Rr]e[Ss]harper +*.DotSettings.user + +# TeamCity is a build add-in +_TeamCity* + +# DotCover is a Code Coverage Tool +*.dotCover + +# AxoCover is a Code Coverage Tool +.axoCover/* +!.axoCover/settings.json + +# Coverlet is a free, cross platform Code Coverage Tool +coverage*.json +coverage*.xml +coverage*.info + +# Visual Studio code coverage results +*.coverage +*.coveragexml + +# NCrunch +_NCrunch_* +.*crunch*.local.xml +nCrunchTemp_* + +# MightyMoose +*.mm.* +AutoTest.Net/ + +# Web workbench (sass) +.sass-cache/ + +# Installshield output folder +[Ee]xpress/ + +# DocProject is a documentation generator add-in +DocProject/buildhelp/ +DocProject/Help/*.HxT +DocProject/Help/*.HxC +DocProject/Help/*.hhc +DocProject/Help/*.hhk +DocProject/Help/*.hhp +DocProject/Help/Html2 +DocProject/Help/html + +# Click-Once directory +publish/ + +# Publish Web Output +*.[Pp]ublish.xml +*.azurePubxml +# Note: Comment the next line if you want to checkin your web deploy settings, +# but database connection strings (with potential passwords) will be unencrypted +*.pubxml +*.publishproj + +# Microsoft Azure Web App publish settings. Comment the next line if you want to +# checkin your Azure Web App publish settings, but sensitive information contained +# in these scripts will be unencrypted +PublishScripts/ + +# NuGet Packages +*.nupkg +# NuGet Symbol Packages +*.snupkg +# The packages folder can be ignored because of Package Restore +**/[Pp]ackages/* +# except build/, which is used as an MSBuild target. 
+!**/[Pp]ackages/build/ +# Uncomment if necessary however generally it will be regenerated when needed +#!**/[Pp]ackages/repositories.config +# NuGet v3's project.json files produces more ignorable files +*.nuget.props +*.nuget.targets + +# Microsoft Azure Build Output +csx/ +*.build.csdef + +# Microsoft Azure Emulator +ecf/ +rcf/ + +# Windows Store app package directories and files +AppPackages/ +BundleArtifacts/ +Package.StoreAssociation.xml +_pkginfo.txt +*.appx +*.appxbundle +*.appxupload + +# Visual Studio cache files +# files ending in .cache can be ignored +*.[Cc]ache +# but keep track of directories ending in .cache +!?*.[Cc]ache/ + +# Others +ClientBin/ +~$* +*~ +*.dbmdl +*.dbproj.schemaview +*.jfm +*.pfx +*.publishsettings +orleans.codegen.cs + +# Including strong name files can present a security risk +# (https://github.com/github/gitignore/pull/2483#issue-259490424) +#*.snk + +# Since there are multiple workflows, uncomment next line to ignore bower_components +# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622) +#bower_components/ + +# RIA/Silverlight projects +Generated_Code/ + +# Backup & report files from converting an old project file +# to a newer Visual Studio version. Backup files are not needed, +# because we have git ;-) +_UpgradeReport_Files/ +Backup*/ +UpgradeLog*.XML +UpgradeLog*.htm +ServiceFabricBackup/ +*.rptproj.bak + +# SQL Server files +*.mdf +*.ldf +*.ndf + +# Business Intelligence projects +*.rdl.data +*.bim.layout +*.bim_*.settings +*.rptproj.rsuser +*- [Bb]ackup.rdl +*- [Bb]ackup ([0-9]).rdl +*- [Bb]ackup ([0-9][0-9]).rdl + +# Microsoft Fakes +FakesAssemblies/ + +# GhostDoc plugin setting file +*.GhostDoc.xml + +# Node.js Tools for Visual Studio +.ntvs_analysis.dat +node_modules/ + +# Visual Studio 6 build log +*.plg + +# Visual Studio 6 workspace options file +*.opt + +# Visual Studio 6 auto-generated workspace file (contains which files were open etc.) 
+*.vbw + +# Visual Studio LightSwitch build output +**/*.HTMLClient/GeneratedArtifacts +**/*.DesktopClient/GeneratedArtifacts +**/*.DesktopClient/ModelManifest.xml +**/*.Server/GeneratedArtifacts +**/*.Server/ModelManifest.xml +_Pvt_Extensions + +# Paket dependency manager +.paket/paket.exe +paket-files/ + +# FAKE - F# Make +.fake/ + +# CodeRush personal settings +.cr/personal + +# Python Tools for Visual Studio (PTVS) +__pycache__/ +*.pyc + +# Cake - Uncomment if you are using it +# tools/** +# !tools/packages.config + +# Tabs Studio +*.tss + +# Telerik's JustMock configuration file +*.jmconfig + +# BizTalk build output +*.btp.cs +*.btm.cs +*.odx.cs +*.xsd.cs + +# OpenCover UI analysis results +OpenCover/ + +# Azure Stream Analytics local run output +ASALocalRun/ + +# MSBuild Binary and Structured Log +*.binlog + +# NVidia Nsight GPU debugger configuration file +*.nvuser + +# MFractors (Xamarin productivity tool) working folder +.mfractor/ + +# Local History for Visual Studio +.localhistory/ + +# BeatPulse healthcheck temp database +healthchecksdb + +# Backup folder for Package Reference Convert tool in Visual Studio 2017 +MigrationBackup/ + +# Ionide (cross platform F# VS Code tools) working folder +.ionide/ + +# Fody - auto-generated XML schema +FodyWeavers.xsd + +## +## Visual studio for Mac +## + + +# globs +Makefile.in +*.userprefs +*.usertasks +config.make +config.status +aclocal.m4 +install-sh +autom4te.cache/ +*.tar.gz +tarballs/ +test-results/ + +# Mac bundle stuff +*.dmg +*.app + +# content below from: https://github.com/github/gitignore/blob/master/Global/macOS.gitignore +# General +.DS_Store +.AppleDouble +.LSOverride + +# Icon must end with two \r +Icon + + +# Thumbnails +._* + +# Files that might appear in the root of a volume +.DocumentRevisions-V100 +.fseventsd +.Spotlight-V100 +.TemporaryItems +.Trashes +.VolumeIcon.icns +.com.apple.timemachine.donotpresent + +# Directories potentially created on remote AFP share +.AppleDB +.AppleDesktop +Network Trash Folder +Temporary Items +.apdisk + +# content below from: https://github.com/github/gitignore/blob/master/Global/Windows.gitignore +# Windows thumbnail cache files +Thumbs.db +ehthumbs.db +ehthumbs_vista.db + +# Dump file +*.stackdump + +# Folder config file +[Dd]esktop.ini + +# Recycle Bin used on file shares +$RECYCLE.BIN/ + +# Windows Installer files +*.cab +*.msi +*.msix +*.msm +*.msp + +# Windows shortcuts +*.lnk + +# JetBrains Rider +.idea/ +*.sln.iml + +## +## Visual Studio Code +## +.vscode/* +!.vscode/settings.json +!.vscode/tasks.json +!.vscode/launch.json +!.vscode/extensions.json diff --git a/c#/Chapter1/Chapter1.cs b/c#/Chapter1/Chapter1.cs new file mode 100644 index 0000000..a91889a --- /dev/null +++ b/c#/Chapter1/Chapter1.cs @@ -0,0 +1,132 @@ +// See https://aka.ms/new-console-template for more information + +using System.Diagnostics; +using StackExchange.Redis; + +namespace Chapter1; + +public class Chapter1 { + private const int OneWeekInSeconds = 7 * 86400; + private const int VoteScore = 432; + private const int ArticlesPerPage = 25; + + public static void Main() { + new Chapter1().run(); + } + + private void run() { + var con = ConnectionMultiplexer.Connect("localhost"); + var db = con.GetDatabase(); + + var articleId = postArticle(db, "username", "A title", "https://www.google.com"); + Console.WriteLine("We posted a new article with id: " + articleId); + Console.WriteLine("Its HASH looks like:"); + var articleData = db.HashGetAll("article:" + articleId); + + foreach (var entry in articleData) { + 
+            Console.WriteLine(" " + entry.Name + ": " + entry.Value);
+        }
+
+        Console.WriteLine();
+
+        articleVote(db, "other_user", "article:" + articleId);
+        var votes = (int?)db.HashGet("article:" + articleId, "votes") ?? 0;
+        Console.WriteLine("We voted for the article, it now has votes: " + votes);
+        Debug.Assert(votes > 1, "Vote count is not greater than 1");
+
+        Console.WriteLine("The currently highest-scoring articles are:");
+        var articles = getArticles(db, 1);
+        printArticles(articles);
+        Debug.Assert(articles.Count >= 1, "Article count is less than 1");
+
+        addGroups(db, articleId, new[]{"new-group"});
+        Console.WriteLine("We added the article to a new group, other articles include:");
+        var groupArticles = getGroupArticles(db, "new-group", 1);
+        printArticles(groupArticles);
+        Debug.Assert(groupArticles.Count >= 1, "Article group count is less than 1");
+    }
+
+    private string postArticle(IDatabase db, string user, string title, string link) {
+        var articleId = db.StringIncrement("article:").ToString();
+
+        var voted = "voted:" + articleId;
+        db.SetAdd(voted, user);
+        db.KeyExpire(voted, TimeSpan.FromSeconds(OneWeekInSeconds));
+
+        var now = DateTimeOffset.Now.ToUnixTimeSeconds();
+        var article = "article:" + articleId;
+        var articleData = new List<HashEntry> {
+            new("title", title),
+            new("link", link),
+            new("user", user),
+            new("now", now.ToString()),
+            new("votes", "1")
+        };
+        db.HashSet(article, articleData.ToArray());
+
+        db.SortedSetAdd("score:", article, now + VoteScore);
+        db.SortedSetAdd("time:", article, now);
+
+        return articleId;
+    }
+
+    private void articleVote(IDatabase db, string user, string article) {
+        var cutoff = DateTimeOffset.Now.ToUnixTimeSeconds() - OneWeekInSeconds;
+        var articleScore = db.SortedSetScore("time:", article) ?? 0;
+
+        if (articleScore < cutoff) {
+            return;
+        }
+
+        var articleId = article.Substring(article.IndexOf(':') + 1);
+
+        if (db.SetAdd("voted:" + articleId, user)) {
+            db.SortedSetIncrement("score:", article, VoteScore);
+            db.HashIncrement(article, "votes");
+        }
+    }
+
+    private List<Dictionary<RedisValue, RedisValue>>
+        getArticles(IDatabase db, int page, string order = "score:") {
+        var start = (page - 1) * ArticlesPerPage;
+        var end = start + ArticlesPerPage - 1;
+
+        var ids = db.SortedSetRangeByRank(order, start, end, order: Order.Descending);
+        var articles = new List<Dictionary<RedisValue, RedisValue>>();
+
+        foreach (var id in ids) {
+            var articleData = db.HashGetAll(id.ToString())
+                .ToDictionary(c => c.Name, c => c.Value);
+            articleData["id"] = id;
+            articles.Add(articleData);
+        }
+
+        return articles;
+    }
+
+    private void printArticles(List<Dictionary<RedisValue, RedisValue>> articles) {
+        foreach (var article in articles) {
+            Console.WriteLine(" id: " + article["id"]);
+            foreach (var articleData in article.Where(c => !c.Key.Equals("id"))) {
+                Console.WriteLine(" " + articleData.Key + ": " + articleData.Value);
+            }
+        }
+    }
+
+    private void addGroups(IDatabase db, string articleId, string[] toAdd) {
+        var article = "article:" + articleId;
+        foreach (var group in toAdd) {
+            db.SetAdd("group:" + group, article);
+        }
+    }
+
+    private List<Dictionary<RedisValue, RedisValue>> getGroupArticles(IDatabase db, string group, int page, string order = "score:") {
+        var key = order + group;
+        if (!db.KeyExists(key)) {
+            db.SortedSetCombineAndStore(SetOperation.Intersect, key, "group:" + group, order, aggregate: Aggregate.Max);
+            db.KeyExpire(key, TimeSpan.FromSeconds(60));
+        }
+
+        return getArticles(db, page, key);
+    }
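+
+    // A minimal sketch (not part of the original listing) of the scoring rule
+    // the constants above imply: an article's score is its posting time plus
+    // VoteScore per vote. VoteScore is 432 because a day has 86400 seconds and
+    // 86400 / 200 = 432, so 200 up-votes offset one day of age.
+    private static double expectedScore(long postedAtUnixSeconds, int votes) {
+        return postedAtUnixSeconds + (double)VoteScore * votes;
+    }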
+}
diff --git a/c#/Chapter1/Chapter1.csproj b/c#/Chapter1/Chapter1.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter1/Chapter1.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+    <PropertyGroup>
+        <OutputType>Exe</OutputType>
+        <TargetFramework>net6.0</TargetFramework>
+        <ImplicitUsings>enable</ImplicitUsings>
+        <Nullable>enable</Nullable>
+    </PropertyGroup>
+
+    <ItemGroup>
+        <PackageReference Include="StackExchange.Redis" />
+    </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter1/Chapter1.sln b/c#/Chapter1/Chapter1.sln
new file mode 100644
index 0000000..76ddc4b
--- /dev/null
+++ b/c#/Chapter1/Chapter1.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter1", "Chapter1.csproj", "{F01C7220-1D82-4691-8EB2-DB79842BEC82}"
+EndProject
+Global
+	GlobalSection(SolutionConfigurationPlatforms) = preSolution
+		Debug|Any CPU = Debug|Any CPU
+		Release|Any CPU = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(ProjectConfigurationPlatforms) = postSolution
+		{F01C7220-1D82-4691-8EB2-DB79842BEC82}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{F01C7220-1D82-4691-8EB2-DB79842BEC82}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{F01C7220-1D82-4691-8EB2-DB79842BEC82}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{F01C7220-1D82-4691-8EB2-DB79842BEC82}.Release|Any CPU.Build.0 = Release|Any CPU
+	EndGlobalSection
+EndGlobal
diff --git a/c#/Chapter2/CacheRowsThread.cs b/c#/Chapter2/CacheRowsThread.cs
new file mode 100644
index 0000000..af07be6
--- /dev/null
+++ b/c#/Chapter2/CacheRowsThread.cs
@@ -0,0 +1,63 @@
+using StackExchange.Redis;
+using System.Text.Json;
+
+namespace Chapter2;
+
+public class CacheRowsThread {
+    private readonly IDatabase _db;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CacheRowsThread(IDatabase db) {
+        _db = db;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
+    private void run() {
+        while (!_quit) {
+            var range = _db.SortedSetRangeByRankWithScores("schedule:", 0, 0);
+            var next = range.Length > 0 ? (SortedSetEntry?)range[0] : null;
+            var now = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
+            if (next == null || next.Value.Score > now) {
+                try {
+                    Thread.Sleep(50);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+
+                continue;
+            }
+
+            var rowId = next.Value.Element.ToString();
+            var delay = _db.SortedSetScore("delay:", rowId) ?? 0;
+            if (delay <= 0) {
+                _db.SortedSetRemove("delay:", rowId);
+                _db.SortedSetRemove("schedule:", rowId);
+                _db.KeyDelete("inv:" + rowId);
+                continue;
+            }
+
+            var row = new Inventory(rowId);
+            _db.SortedSetAdd("schedule:", rowId, now + delay);
+            _db.StringSet("inv:" + rowId, JsonSerializer.Serialize(row));
+        }
+    }
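+
+    // Minimal sketch (not part of the original listing) of how a reader would
+    // consume what this thread writes: the cached row lives under "inv:<rowId>"
+    // as the JSON produced by JsonSerializer above.
+    public Inventory? ReadCachedRow(string rowId) {
+        string? json = _db.StringGet("inv:" + rowId);
+        return json == null ? null : JsonSerializer.Deserialize<Inventory>(json);
+    }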
+}
diff --git a/c#/Chapter2/Chapter2.cs b/c#/Chapter2/Chapter2.cs
new file mode 100644
index 0000000..ccbff69
--- /dev/null
+++ b/c#/Chapter2/Chapter2.cs
@@ -0,0 +1,244 @@
+using System.Diagnostics;
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+public class Chapter2 {
+    public static void Main() {
+        new Chapter2().run();
+    }
+
+    private void run() {
+        var connection = ConnectionMultiplexer.Connect("localhost");
+        var db = connection.GetDatabase(15);
+
+        testLoginCookies(db);
+        testShoppingCartCookies(db);
+        testCacheRows(db);
+        testCacheRequest(db);
+    }
+
+    private void testLoginCookies(IDatabase conn) {
+        Console.WriteLine("\n----- testLoginCookies -----");
+        var token = Guid.NewGuid().ToString();
+
+        var username = "someUser";
+        updateToken(conn, token, username, "itemX");
+        Console.WriteLine("We just logged-in/updated token: " + token);
+        Console.WriteLine($"For user: '{username}'");
+        Console.WriteLine();
+
+        Console.WriteLine("What username do we get when we look-up that token?");
+        var r = checkToken(conn, token);
+        Console.WriteLine(r);
+        Console.WriteLine();
+        Debug.Assert(r is not null, "Token is null");
+        Debug.Assert(username.Equals(r), "username retrieved from token does not match initial username.");
+
+        Console.WriteLine("Let's drop the maximum number of cookies to 0 to clean them out");
+        Console.WriteLine("We will start a thread to do the cleaning, and stop it later");
+
+        var thread = new CleanSessionsThread(conn, 0);
+        thread.Start();
+        Thread.Sleep(1000);
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The clean sessions thread is still alive?!?");
+        }
+
+        var s = conn.HashLength("login:");
+        Console.WriteLine("The current number of sessions still available is: " + s);
+        Debug.Assert(s == 0, "sessions are not zero");
+    }
+
+    private void testShoppingCartCookies(IDatabase conn) {
+        Console.WriteLine("\n----- testShoppingCartCookies -----");
+        var token = Guid.NewGuid().ToString();
+
+        Console.WriteLine("We'll refresh our session...");
+        updateToken(conn, token, "username", "itemX");
+        Console.WriteLine("And add an item to the shopping cart");
+        addToCart(conn, token, "itemY", 3);
+
+        var r = conn.HashGetAll("cart:" + token);
+
+        Console.WriteLine("Our shopping cart currently has:");
+        foreach (var entry in r) {
+            Console.WriteLine(" " + entry.Name + ": " + entry.Value);
+        }
+
+        Console.WriteLine();
+
+        Debug.Assert(r.Length >= 1, "Shopping cart is empty");
+
+        Console.WriteLine("Let's clean out our sessions and carts");
+        var thread = new CleanFullSessionsThread(conn, 0);
+        thread.Start();
+        Thread.Sleep(1000);
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The clean sessions thread is still alive?!?");
+        }
+
+        r = conn.HashGetAll("cart:" + token);
+        Console.WriteLine("Our shopping cart now contains:");
+        foreach (var entry in r) {
+            Console.WriteLine(" " + entry.Name + ": " + entry.Value);
+        }
+
+        Debug.Assert(r.Length == 0, "cart is not empty");
+    }
+
+    private void testCacheRows(IDatabase conn) {
+        Console.WriteLine("\n----- testCacheRows -----");
+        Console.WriteLine("First, let's schedule caching of itemX every 5 seconds");
+        scheduleRowCache(conn, "itemX", 5);
+        Console.WriteLine("Our schedule looks like:");
+
+        var s = conn.SortedSetRangeByRankWithScores("schedule:", 0, -1);
+        foreach (var entry in s) {
+            Console.WriteLine(" " + entry.Element + ", " + entry.Score);
+        }
+
+        Debug.Assert(s.Length != 0, "schedule set is empty");
+
+        Console.WriteLine("We'll start a caching thread that will cache the data...");
+
+        var thread = new CacheRowsThread(conn);
+        thread.Start();
+        Thread.Sleep(1000);
+        Console.WriteLine("Our cached data looks like:");
+        string? r = conn.StringGet("inv:itemX");
+        Console.WriteLine(r);
+        Debug.Assert(r is not null, "cached data is null");
+        Console.WriteLine();
+
+        Console.WriteLine("We'll check again in 5 seconds...");
+        Thread.Sleep(5000);
+        Console.WriteLine("Notice that the data has changed...");
+        string? r2 = conn.StringGet("inv:itemX");
+        Console.WriteLine(r2);
+        Console.WriteLine();
+        Debug.Assert(r2 is not null, "changed cached data is null");
+        Debug.Assert(!r.Equals(r2), "cached data did not change");
+
+        Console.WriteLine("Let's force un-caching");
+        scheduleRowCache(conn, "itemX", -1);
+        Thread.Sleep(1000);
+        r = conn.StringGet("inv:itemX");
+        Console.WriteLine("The cache was cleared? " + (r == null));
+        Debug.Assert(r is null, "cached data was not un-cached");
+
+        thread.Quit();
+        Thread.Sleep(2000);
+        if (thread.IsAlive()) {
+            throw new Exception("The database caching thread is still alive?!?");
+        }
+    }
+
+    private void testCacheRequest(IDatabase conn) {
+        Console.WriteLine("\n----- testCacheRequest -----");
+        var token = Guid.NewGuid().ToString();
+
+        updateToken(conn, token, "username", "itemX");
+        var url = "http://test.com/?item=itemX";
+        Console.WriteLine("We are going to cache a simple request against " + url);
+        var result = cacheRequest(conn, url, s => "content for " + s);
+        Console.WriteLine("We got initial content:\n" + result);
+        Console.WriteLine();
+
+        Debug.Assert(result is not null, "Request was not cached");
+
+        Console.WriteLine("To test that we've cached the request, we'll pass a bad callback");
+        var result2 = cacheRequest(conn, url, null);
+        Console.WriteLine("We ended up getting the same response!\n" + result2);
+
+        Debug.Assert(result.Equals(result2), "Cached response did not match the original");
+
+        Debug.Assert(!canCache(conn, "http://test.com/"));
+        Debug.Assert(!canCache(conn, "http://test.com/?item=itemX&_=1234536"));
+    }
+
+    private string? checkToken(IDatabase conn, string token) {
+        return conn.HashGet("login:", token);
+    }
+
+    private void updateToken(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
+        conn.HashSet("login:", token, user);
+        conn.SortedSetAdd("recent:", token, timestamp);
+        if (item != null) {
+            conn.SortedSetAdd("viewed:" + token, item, timestamp);
+            conn.SortedSetRemoveRangeByRank("viewed:" + token, 0, -26);
+            conn.SortedSetIncrement("viewed:", item, -1);
+        }
+    }
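+
+    // Sketch (not in the original listing): updateToken *decrements* scores in
+    // "viewed:", so the most-viewed items have the lowest scores and sit at
+    // the front of the ascending range.
+    private RedisValue[] topViewedItems(IDatabase conn, int count) {
+        return conn.SortedSetRangeByRank("viewed:", 0, count - 1);
+    }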
+
+    private void addToCart(IDatabase conn, string session, string item, int count) {
+        if (count <= 0) {
+            conn.HashDelete("cart:" + session, item);
+        } else {
+            conn.HashSet("cart:" + session, item, count);
+        }
+    }
+
+    private static void scheduleRowCache(IDatabase conn, string rowId, int delay) {
+        conn.SortedSetAdd("delay:", rowId, delay);
+        conn.SortedSetAdd("schedule:", rowId, DateTimeOffset.UtcNow.ToUnixTimeSeconds());
+    }
+
+    private string? cacheRequest(IDatabase conn, string request, Func<string, string>? callback) {
+        if (!canCache(conn, request)) {
+            return callback?.Invoke(request);
+        }
+
+        var pageKey = "cache:" + hashRequest(request);
+        var content = conn.StringGet(pageKey);
+
+        if (!content.HasValue && callback != null) {
+            content = callback(request);
+            conn.StringSet(pageKey, content);
+            conn.KeyExpire(pageKey, TimeSpan.FromSeconds(300));
+        }
+
+        return content;
+    }
+
+    private bool canCache(IDatabase conn, string request) {
+        try {
+            var url = new Uri(request);
+            var parameters = new Dictionary<string, string?>();
+            if (!string.IsNullOrEmpty(url.Query)) {
+                foreach (var par in url.Query[1..].Split("&")) {
+                    var pair = par.Split("=", 2);
+                    parameters.Add(pair[0], pair.Length == 2 ? pair[1] : null);
+                }
+            }
+
+            var itemId = extractItemId(parameters);
+            if (itemId == null || isDynamic(parameters)) {
+                return false;
+            }
+
+            var rank = conn.SortedSetRank("viewed:", itemId);
+            return rank is < 10000;
+        } catch (FormatException) {
+            return false;
+        }
+    }
+
+    private bool isDynamic(Dictionary<string, string?> parameters) {
+        return parameters.ContainsKey("_");
+    }
+
+    private string? extractItemId(Dictionary<string, string?> parameters) {
+        parameters.TryGetValue("item", out var result);
+        return result;
+    }
+
+    private string hashRequest(string request) {
+        return request.GetHashCode().ToString();
+    }
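+
+    // string.GetHashCode is randomized per process in .NET, so keys from
+    // hashRequest above do not survive a restart. A stable alternative
+    // (a sketch, not part of the original listing) hashes the URL with SHA-256:
+    private static string stableHashRequest(string request) {
+        var bytes = System.Security.Cryptography.SHA256.HashData(
+            System.Text.Encoding.UTF8.GetBytes(request));
+        return Convert.ToHexString(bytes);
+    }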
+}
diff --git a/c#/Chapter2/Chapter2.csproj b/c#/Chapter2/Chapter2.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter2/Chapter2.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+    <PropertyGroup>
+        <OutputType>Exe</OutputType>
+        <TargetFramework>net6.0</TargetFramework>
+        <ImplicitUsings>enable</ImplicitUsings>
+        <Nullable>enable</Nullable>
+    </PropertyGroup>
+
+    <ItemGroup>
+        <PackageReference Include="StackExchange.Redis" />
+    </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter2/Chapter2.sln b/c#/Chapter2/Chapter2.sln
new file mode 100644
index 0000000..2213ece
--- /dev/null
+++ b/c#/Chapter2/Chapter2.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter2", "Chapter2.csproj", "{71562795-1023-4377-9E81-49653BE6E86D}"
+EndProject
+Global
+	GlobalSection(SolutionConfigurationPlatforms) = preSolution
+		Debug|Any CPU = Debug|Any CPU
+		Release|Any CPU = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(ProjectConfigurationPlatforms) = postSolution
+		{71562795-1023-4377-9E81-49653BE6E86D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{71562795-1023-4377-9E81-49653BE6E86D}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{71562795-1023-4377-9E81-49653BE6E86D}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{71562795-1023-4377-9E81-49653BE6E86D}.Release|Any CPU.Build.0 = Release|Any CPU
+	EndGlobalSection
+EndGlobal
diff --git a/c#/Chapter2/CleanFullSessionsThread.cs b/c#/Chapter2/CleanFullSessionsThread.cs
new file mode 100644
index 0000000..934d0b1
--- /dev/null
+++ b/c#/Chapter2/CleanFullSessionsThread.cs
@@ -0,0 +1,59 @@
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+public class CleanFullSessionsThread {
+    private readonly IDatabase _db;
+    private readonly int _limit;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CleanFullSessionsThread(IDatabase db, int limit) {
+        _db = db;
+        _limit = limit;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
+    private void run() {
+        while (!_quit) {
+            var size = _db.SortedSetLength("recent:");
+
+            if (size <= _limit) {
+                try {
+                    Thread.Sleep(1000);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+                continue;
+            }
+
+            var endIndex = Math.Min(size - _limit, 100);
+
+            var tokens = _db.SortedSetRangeByRank("recent:", 0, endIndex - 1);
+
+            var sessionKeys = new List<RedisKey>();
+
+            foreach (var token in tokens) {
+                sessionKeys.Add("viewed:" + token);
+                sessionKeys.Add("cart:" + token);
+            }
+
+            _db.KeyDelete(sessionKeys.ToArray());
+            _db.HashDelete("login:", tokens);
+            _db.SortedSetRemove("recent:", tokens);
+        }
+    }
+}
diff --git a/c#/Chapter2/CleanSessionsThread.cs b/c#/Chapter2/CleanSessionsThread.cs
new file mode 100644
index 0000000..89c4f75
--- /dev/null
+++ b/c#/Chapter2/CleanSessionsThread.cs
@@ -0,0 +1,60 @@
+using StackExchange.Redis;
+
+namespace Chapter2;
+
+public class CleanSessionsThread {
+    private readonly IDatabase _db;
+    private readonly int _limit;
+    private bool _quit;
+    private readonly Thread _thread;
+
+    public CleanSessionsThread(IDatabase db, int limit) {
+        _db = db;
+        _limit = limit;
+        _thread = new Thread(run);
+        _quit = false;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    public bool IsAlive() {
+        return _thread.IsAlive;
+    }
+
+    private void run() {
+        while (!_quit) {
+            var size = _db.SortedSetLength("recent:");
+
+            if (size <= _limit) {
+                try {
+                    Thread.Sleep(1000);
+                } catch (Exception ex) {
+                    Console.WriteLine("error at thread:" + ex);
+                }
+
+                continue;
+            }
+
+            var endIndex = Math.Min(size - _limit, 100);
+
+            var tokens = _db.SortedSetRangeByRank("recent:", 0, endIndex - 1);
+
+            var sessionKeys = new List<RedisKey>();
+
+            foreach (var token in tokens) {
+                sessionKeys.Add("viewed:" + token);
+            }
+
+            _db.KeyDelete(sessionKeys.ToArray());
+
+            _db.HashDelete("login:", tokens);
+            _db.SortedSetRemove("recent:", tokens);
+        }
+    }
+}
diff --git a/c#/Chapter2/Inventory.cs b/c#/Chapter2/Inventory.cs
new file mode 100644
index 0000000..32d7a17
--- /dev/null
+++ b/c#/Chapter2/Inventory.cs
@@ -0,0 +1,7 @@
+// ReSharper disable NotAccessedPositionalProperty.Global
+// Disabled since this is just for demo purposes
+namespace Chapter2;
+
+public record Inventory(string Id, string Data, long Time) {
+    public Inventory(string id) : this(id, "data to cache...", DateTimeOffset.UtcNow.ToUnixTimeSeconds()) { }
+}
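+
+// Usage sketch (illustrative, not part of the original listing): the record
+// round-trips through System.Text.Json, which is how CacheRowsThread stores
+// it under "inv:<id>":
+//   var json = JsonSerializer.Serialize(new Inventory("itemX"));
+//   var back = JsonSerializer.Deserialize<Inventory>(json);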
" + listResult); + Debug.Assert(listResult, "Changes were not committed"); + var marketItems = conn.SortedSetRangeByRankWithScores(MarketKey, 0, -1); + Console.WriteLine("The market contains:"); + foreach (var marketItem in marketItems) { + Console.WriteLine(" " + marketItem.Element + ", " + marketItem.Score); + } + + Debug.Assert(marketItems.Length > 0, "Market items is empty"); + } + + private void testPurchaseItem(IDatabase conn) { + Console.WriteLine("\n----- testPurchaseItem -----"); + testListItem(conn, true); + + Console.WriteLine("We need to set up just enough state so a user can buy an item"); + conn.HashSet("users:userY", "funds", "125"); + var r = conn.HashGetAll("users:userY"); + Console.WriteLine("The user has some money:"); + foreach (var entry in r) { + Console.WriteLine(" " + entry.Name + ": " + entry.Value); + } + + Debug.Assert(r.Length > 0, "User hashset not found!"); + var funds = r.Any(a => a.Name == "funds"); + Debug.Assert(funds, "Didn't find a hash entry for funds"); + Console.WriteLine(); + + Console.WriteLine("Let's purchase an item"); + var purchaseResult = purchaseItem(conn, "userY", "itemX", "userX", 10); + Console.WriteLine("Purchasing an item succeeded? " + purchaseResult); + Debug.Assert(purchaseResult, "Changes were not committed"); + + r = conn.HashGetAll("users:userY"); + Console.WriteLine("Their money is now:"); + foreach (var entry in r) { + Console.WriteLine(" " + entry.Name + ": " + entry.Value); + } + + Debug.Assert(r.Length > 0, "Used data is empty"); + + var buyer = "userY"; + var i = conn.SetMembers("inventory:" + buyer); + Console.WriteLine("Their inventory is now:"); + foreach (var member in i) { + Console.WriteLine(" " + member); + } + + Debug.Assert(i.Length > 0, "Buyer inventory is empty"); + Debug.Assert(i.Any(item => item.Equals("itemX")), "itemX was not moved to buyers inventory"); + Debug.Assert(conn.SortedSetScore(MarketKey, "itemX.userX") == null, "Market still contains itemX.userX"); + } + + private static bool listItem(IDatabase conn, string itemId, string sellerId, double price) { + var inventory = "inventory:" + sellerId; + var item = itemId + '.' + sellerId; + var end = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + 5000; + + while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) { + // The client has a multiplexer approach to connections. + // As a result we can't use multi/exec/watch directly. + // We can however add transaction conditions which are functioning similarly behind the scenes. + // So we will add them by hand in order to simulate a watch request. + // We will load the set, verify that cardinality remained the same as well as that items were unchanged + var inventorySet = conn.SetMembers(inventory); + var trans = conn.CreateTransaction(); + + trans.AddCondition(Condition.SetContains(inventory, itemId)); + trans.AddCondition(Condition.SetLengthEqual(inventory, inventorySet.Length)); + foreach (var invItem in inventorySet) { + trans.AddCondition(Condition.SetContains(inventory, invItem)); + } + + trans.SortedSetAddAsync(MarketKey, item, price); + trans.SetRemoveAsync(inventory, itemId); + var committed = trans.Execute(); + + if (!committed) { + continue; + } + + return true; + } + + return false; + } + + private static bool purchaseItem( + IDatabase conn, string buyerId, string itemId, string sellerId, double listedPrice) { + var buyer = "users:" + buyerId; + var seller = "users:" + sellerId; + var item = itemId + '.' 
+
+    private static bool purchaseItem(
+        IDatabase conn, string buyerId, string itemId, string sellerId, double listedPrice) {
+        var buyer = "users:" + buyerId;
+        var seller = "users:" + sellerId;
+        var item = itemId + '.' + sellerId;
+        var inventory = "inventory:" + buyerId;
+        var end = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + 10000;
+
+        while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+            var trans = conn.CreateTransaction();
+
+            // As in listItem, transaction conditions stand in for WATCH: we
+            // load the buyer's hash and the market zset, then require that
+            // both are unchanged when the transaction executes.
+            var userSet = conn.HashGetAll(buyer);
+            trans.AddCondition(Condition.HashLengthEqual(buyer, userSet.Length));
+            foreach (var entry in userSet) {
+                trans.AddCondition(Condition.HashEqual(buyer, entry.Name, entry.Value));
+            }
+
+            var marketSortedSet = conn.SortedSetRangeByRankWithScores(MarketKey, 0, -1);
+            trans.AddCondition(Condition.SortedSetLengthEqual(MarketKey, marketSortedSet.Length));
+            foreach (var entry in marketSortedSet) {
+                trans.AddCondition(Condition.SortedSetEqual(MarketKey, entry.Element, entry.Score));
+            }
+
+            var price = conn.SortedSetScore(MarketKey, item);
+            var funds = double.Parse(conn.HashGet(buyer, "funds").ToString());
+            if (price != listedPrice || price > funds) {
+                return false;
+            }
+
+            trans.HashIncrementAsync(seller, "funds", (int)price);
+            trans.HashIncrementAsync(buyer, "funds", (int)-price);
+            trans.SetAddAsync(inventory, itemId);
+            trans.SortedSetRemoveAsync(MarketKey, item);
+            var result = trans.Execute();
+            // A false result means a condition failed (our simulated WATCH
+            // detected a change), so we retry until the deadline.
+            if (result) {
+                return true;
+            }
+        }
+
+        return false;
+    }
+
+    private void testBenchmarkUpdateToken(IDatabase conn) {
+        Console.WriteLine("\n----- testBenchmarkUpdate -----");
+        benchmarkUpdateToken(conn, 5);
+    }
+
+    private void benchmarkUpdateToken(IDatabase conn, int duration) {
+        var methods = new List<Action<IDatabase, string, string, string?>> {
+            updateToken,
+            updateTokenPipeline
+        };
+
+        Console.WriteLine("{0,-20} {1,-10} {2,-15} {3,-30}", "Update method", "#Runs", "Delta(seconds)", "#Runs to delta(seconds) ratio");
+        foreach (var method in methods) {
+            var count = 0;
+            var start = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
+            var end = start + (duration * 1000);
+            while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+                count++;
+                method(conn, "token", "user", "item");
+            }
+
+            var delta = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() - start;
+            Console.WriteLine("{0,-20} {1,-10} {2,-15} {3,-30}",
+                method.GetMethodInfo().Name,
+                count,
+                (delta / 1000),
+                (count / (delta / 1000)));
+        }
+    }
+
+    private static void updateToken(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000;
+
+        conn.HashSet("login:", token, user);
+        conn.SortedSetAdd("recent:", token, timestamp);
+        if (item != null) {
+            conn.SortedSetAdd("viewed:" + token, item, timestamp);
+            conn.SortedSetRemoveRangeByRank("viewed:" + token, 0, -26);
+            conn.SortedSetIncrement("viewed:", item, -1);
+        }
+    }
+
+    private static void updateTokenPipeline(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000;
+        var tasks = new List<Task> {
+            conn.HashSetAsync("login:", token, user),
+            conn.SortedSetAddAsync("recent:", token, timestamp)
+        };
+        if (item != null) {
+            tasks.Add(conn.SortedSetAddAsync("viewed:" + token, item, timestamp));
+            tasks.Add(conn.SortedSetRemoveRangeByRankAsync("viewed:" + token, 0, -26));
+            tasks.Add(conn.SortedSetIncrementAsync("viewed:", item, -1));
+        }
+
+        conn.WaitAll(tasks.ToArray());
+    }
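+
+    // An equivalent async form of updateTokenPipeline (a sketch, assuming the
+    // same key layout) that awaits the batch instead of blocking on WaitAll:
+    private static async Task updateTokenPipelineAsync(IDatabase conn, string token, string user, string? item) {
+        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000;
+        var tasks = new List<Task> {
+            conn.HashSetAsync("login:", token, user),
+            conn.SortedSetAddAsync("recent:", token, timestamp)
+        };
+        if (item != null) {
+            tasks.Add(conn.SortedSetAddAsync("viewed:" + token, item, timestamp));
+            tasks.Add(conn.SortedSetRemoveRangeByRankAsync("viewed:" + token, 0, -26));
+            tasks.Add(conn.SortedSetIncrementAsync("viewed:", item, -1));
+        }
+
+        await Task.WhenAll(tasks);
+    }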
+}
diff --git a/c#/Chapter4/Chapter4.csproj b/c#/Chapter4/Chapter4.csproj
new file mode 100644
index 0000000..e9d9c00
--- /dev/null
+++ b/c#/Chapter4/Chapter4.csproj
@@ -0,0 +1,14 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+    <PropertyGroup>
+        <OutputType>Exe</OutputType>
+        <TargetFramework>net6.0</TargetFramework>
+        <ImplicitUsings>enable</ImplicitUsings>
+        <Nullable>enable</Nullable>
+    </PropertyGroup>
+
+    <ItemGroup>
+        <PackageReference Include="StackExchange.Redis" />
+    </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter4/Chapter4.sln b/c#/Chapter4/Chapter4.sln
new file mode 100644
index 0000000..6396e93
--- /dev/null
+++ b/c#/Chapter4/Chapter4.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter4", "Chapter4.csproj", "{BE353086-9CF8-486F-B94D-7C7677932731}"
+EndProject
+Global
+	GlobalSection(SolutionConfigurationPlatforms) = preSolution
+		Debug|Any CPU = Debug|Any CPU
+		Release|Any CPU = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(ProjectConfigurationPlatforms) = postSolution
+		{BE353086-9CF8-486F-B94D-7C7677932731}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{BE353086-9CF8-486F-B94D-7C7677932731}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{BE353086-9CF8-486F-B94D-7C7677932731}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{BE353086-9CF8-486F-B94D-7C7677932731}.Release|Any CPU.Build.0 = Release|Any CPU
+	EndGlobalSection
+EndGlobal
diff --git a/c#/Chapter5/AccessTimer.cs b/c#/Chapter5/AccessTimer.cs
new file mode 100644
index 0000000..1c3ca2d
--- /dev/null
+++ b/c#/Chapter5/AccessTimer.cs
@@ -0,0 +1,33 @@
+using System.Diagnostics;
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public class AccessTimer {
+    private readonly IDatabase _conn;
+    private readonly Stopwatch _watch;
+
+    public AccessTimer(IDatabase conn, Stopwatch watch) {
+        _conn = conn;
+        _watch = watch;
+    }
+
+    public void Start() {
+        _watch.Restart();
+    }
+
+    public void Stop(string context) {
+        _watch.Stop();
+        var delta = _watch.Elapsed.TotalSeconds;
+        var stats = StatsOperations.UpdateStats(_conn, context, "AccessTime", delta);
+        if (stats is null) {
+            throw new InvalidOperationException("UpdateStats timed out without returning statistics");
+        }
+        var average = stats["sum"] / stats["count"];
+
+        var trans = _conn.CreateTransaction();
+        trans.SortedSetAddAsync("slowest:AccessTime", context, average);
+        trans.SortedSetRemoveRangeByRankAsync("slowest:AccessTime", 0, -101);
+        trans.Execute();
+    }
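+
+    // Usage sketch (illustrative): time a block of work and feed the
+    // stats:<context>:AccessTime and slowest:AccessTime keys maintained above.
+    //   var timer = new AccessTimer(db, new Stopwatch());
+    //   timer.Start();
+    //   handleRequest();            // hypothetical work being measured
+    //   timer.Stop("request-name");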
+}
diff --git a/c#/Chapter5/Chapter5.cs b/c#/Chapter5/Chapter5.cs
new file mode 100644
index 0000000..98a98f0
--- /dev/null
+++ b/c#/Chapter5/Chapter5.cs
@@ -0,0 +1,223 @@
+using System.Diagnostics;
+using Chapter5.Configuration;
+using Chapter5.Counters;
+using Chapter5.IpLookup;
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public class Chapter5 {
+    public static void Main() {
+        new Chapter5().run();
+    }
+
+    private void run() {
+        var connection = ConnectionMultiplexer.Connect("localhost");
+        var db = connection.GetDatabase(15);
+
+        testLogRecent(db);
+        testLogCommon(db);
+        testCounters(db);
+        testStats(db);
+        testAccessTime(db);
+        testIpLookup(db);
+        testIsUnderMaintenance(db);
+        testConfig(db);
+    }
+
+    private void testLogRecent(IDatabase conn) {
+        Console.WriteLine("\n----- testLogRecent -----");
+        Console.WriteLine("Let's write a few logs to the recent log");
+        for (var i = 0; i < 5; i++) {
+            LoggerOperations.LogRecent(conn, "test", "this is message " + i);
+        }
+
+        var recent = conn.ListRange("recent:test:info", 0, -1);
+        Console.WriteLine(
+            "The current recent message log has this many messages: " +
+            recent.Length);
+        Console.WriteLine("Those messages include:");
+        foreach (var message in recent) {
+            Console.WriteLine(message);
+        }
+
+        Debug.Assert(recent.Length >= 5, "Expected at least 5 recently logged messages");
+    }
+
+    private void testLogCommon(IDatabase conn) {
+        Console.WriteLine("\n----- testLogCommon -----");
+        Console.WriteLine("Let's write some items to the common log");
+        for (var count = 1; count < 6; count++) {
+            for (var i = 0; i < count; i++) {
+                LoggerOperations.LogCommon(conn, "test", "message-" + count);
+            }
+        }
+
+        var common = conn.SortedSetRangeByRankWithScores("common:test:info", 0, -1, Order.Descending);
+        Console.WriteLine("The current number of common messages is: " + common.Length);
+        Console.WriteLine("Those common messages are:");
+        foreach (var tuple in common) {
+            Console.WriteLine(" " + tuple.Element + ", " + tuple.Score);
+        }
+
+        Debug.Assert(common.Length >= 5, "The common logs contain less than 5 entries");
+    }
+
+    private void testCounters(IDatabase conn) {
+        Console.WriteLine("\n----- testCounters -----");
+        Console.WriteLine("Let's update some counters for now and a little in the future");
+        var now = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
+        var random = new Random();
+        for (var i = 0; i < 10; i++) {
+            var count = (int)random.NextInt64(1, 6);
+            CounterOperations.UpdateCounter(conn, "test", count, now + i);
+        }
+
+        var counter = CounterOperations.GetCounter(conn, "test", 1);
+        Console.WriteLine("We have some per-second counters: " + counter.Count);
+        Console.WriteLine("These counters include:");
+        foreach (var count in counter) {
+            Console.WriteLine(" " + count);
+        }
+
+        Debug.Assert(counter.Count >= 10, "Counters are less than 10");
+
+        counter = CounterOperations.GetCounter(conn, "test", 5);
+        Console.WriteLine("We have some per-5-second counters: " + counter.Count);
+        Console.WriteLine("These counters include:");
+        foreach (var count in counter) {
+            Console.WriteLine(" " + count);
+        }
+
+        Debug.Assert(counter.Count >= 2, "Counters are less than 2");
+        Console.WriteLine();
+
+        Console.WriteLine("Let's clean out some counters by setting our sample count to 0");
+        var thread = new CleanCountersThread(0, 2 * 86400000);
+        thread.Start();
+        Thread.Sleep(1000);
+        thread.Quit();
+        counter = CounterOperations.GetCounter(conn, "test", 86400);
+        Console.WriteLine("Did we clean out all of the counters? " + (counter.Count == 0));
+        Debug.Assert(counter.Count == 0, "Counters were not cleaned up");
+    }
+
+    private void testStats(IDatabase conn) {
+        Console.WriteLine("\n----- testStats -----");
+        Console.WriteLine("Let's add some data for our statistics!");
+        Dictionary<string, double>? r = null;
+        var random = new Random();
+        for (var i = 0; i < 5; i++) {
+            var value = (random.NextDouble() * 11) + 5;
+            r = StatsOperations.UpdateStats(conn, "temp", "example", value);
+        }
+
+        Debug.Assert(r != null, "Aggregate statistics are null");
+
+        Console.WriteLine("We have some aggregate statistics: " + string.Join(", ", r));
+        var stats = StatsOperations.GetStats(conn, "temp", "example");
+        Console.WriteLine("Which we can also fetch manually:");
+        foreach (var pair in stats) {
+            Console.WriteLine($"{pair.Key}:{pair.Value}");
+        }
+
+        Debug.Assert(stats["count"] >= 5, "Count is less than 5 in our stats");
+    }
+
+    private void testAccessTime(IDatabase conn) {
+        Console.WriteLine("\n----- testAccessTime -----");
+        Console.WriteLine("Let's calculate some access times...");
+        var timer = new AccessTimer(conn, new Stopwatch());
+        var rand = new Random();
+        for (var i = 0; i < 10; i++) {
+            timer.Start();
+            Thread.Sleep(rand.Next(0, 1001));
+            timer.Stop("req-" + i);
+        }
+
+        Console.WriteLine("The slowest access times are:");
+        var accessTimes = conn.SortedSetRangeByRankWithScores("slowest:AccessTime", 0, -1);
+        foreach (var pair in accessTimes) {
+            Console.WriteLine(" " + pair.Element + ", " + pair.Score);
+        }
+
+        Debug.Assert(accessTimes.Length >= 10, "Our access times dataset has less than 10 elements");
+        Console.WriteLine();
+    }
+
+    private void testIpLookup(IDatabase conn) {
+        Console.WriteLine("\n----- testIpLookup -----");
+        var cwd = Environment.GetEnvironmentVariable("user.dir");
+
+        if (string.IsNullOrEmpty(cwd)) {
+            throw new ArgumentNullException(nameof(cwd), "Environment variable user.dir should be set");
+        }
+
+        var blocksPath = cwd + "/GeoLiteCity-Blocks.csv";
+        var locationsPath = cwd + "/GeoLiteCity-Location.csv";
+        if (!File.Exists(blocksPath)) {
+            Console.WriteLine("********");
+            Console.WriteLine("GeoLiteCity-Blocks.csv not found at: " + blocksPath);
+            Console.WriteLine("********");
+            return;
+        }
+
+        if (!File.Exists(locationsPath)) {
+            Console.WriteLine("********");
+            Console.WriteLine("GeoLiteCity-Location.csv not found at: " + locationsPath);
+            Console.WriteLine("********");
+            return;
+        }
+
+        Console.WriteLine("Importing IP addresses to Redis... (this may take a while)");
+        IpLookupOperations.ImportIpsToRedis(conn, blocksPath);
+        var ranges = conn.SortedSetLength("ip2cityid:");
+        Console.WriteLine("Loaded ranges into Redis: " + ranges);
+        Debug.Assert(ranges > 1000, "Ip2CityId dataset has a cardinality less than 1000");
+        Console.WriteLine();
+
+        Console.WriteLine("Importing Location lookups to Redis... (this may take a while)");
+        IpLookupOperations.ImportCitiesToRedis(conn, locationsPath);
+        var cities = conn.HashLength("cityid2city:");
+        Console.WriteLine("Loaded city lookups into Redis: " + cities);
+        Debug.Assert(cities > 1000, "Cities are less than 1000");
+        Console.WriteLine();
+
+        Console.WriteLine("Let's lookup some locations!");
+
+        for (var i = 0; i < 5; i++) {
+            var ip =
+                $"{IpLookupOperations.RandomOctet(255)}.{IpLookupOperations.RandomOctet(256)}.{IpLookupOperations.RandomOctet(256)}.{IpLookupOperations.RandomOctet(256)}";
+            var result = IpLookupOperations.FindCityByIp(conn, ip) ?? Array.Empty<string>();
+            var toDisplay = $"[{string.Join(",", result)}]";
+            Console.WriteLine(toDisplay);
+        }
+    }
+
+    private static void testIsUnderMaintenance(IDatabase conn) {
+        Console.WriteLine("\n----- testIsUnderMaintenance -----");
+        Console.WriteLine("Are we under maintenance (we shouldn't be)? " + Maintenance.IsUnderMaintenance(conn));
+        conn.StringSet("is-under-maintenance", "yes");
+        Console.WriteLine("We cached this, so it should be the same: " + Maintenance.IsUnderMaintenance(conn));
+        Thread.Sleep(1000);
+        Console.WriteLine("But after a sleep, it should change: " + Maintenance.IsUnderMaintenance(conn));
+        Console.WriteLine("Cleaning up...");
+        conn.KeyDelete("is-under-maintenance");
+        Thread.Sleep(1000);
+        Console.WriteLine("Should be False again: " + Maintenance.IsUnderMaintenance(conn));
+    }
+
+    private static void testConfig(IDatabase conn) {
+        Console.WriteLine("\n----- testConfig -----");
+        Console.WriteLine("Let's set a config and then get a connection from that config...");
+        var config = new Dictionary<string, object>();
+        config.Add("db", 15);
+        ConfigOperations.SetConfig(conn, "redis", "test", config);
+
+        var conn2 = ConfigOperations.RedisConnection("test");
+        Console.WriteLine(
+            "We can run commands from the configured connection: " + (conn2 != null));
+    }
+}
diff --git a/c#/Chapter5/Chapter5.csproj b/c#/Chapter5/Chapter5.csproj
new file mode 100644
index 0000000..4b39da3
--- /dev/null
+++ b/c#/Chapter5/Chapter5.csproj
@@ -0,0 +1,15 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+    <PropertyGroup>
+        <OutputType>Exe</OutputType>
+        <TargetFramework>net6.0</TargetFramework>
+        <ImplicitUsings>enable</ImplicitUsings>
+        <Nullable>enable</Nullable>
+    </PropertyGroup>
+
+    <ItemGroup>
+        <PackageReference Include="StackExchange.Redis" />
+        <PackageReference Include="TinyCsvParser" />
+    </ItemGroup>
+
+</Project>
diff --git a/c#/Chapter5/Chapter5.sln b/c#/Chapter5/Chapter5.sln
new file mode 100644
index 0000000..1de7358
--- /dev/null
+++ b/c#/Chapter5/Chapter5.sln
@@ -0,0 +1,16 @@
+
+Microsoft Visual Studio Solution File, Format Version 12.00
+Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Chapter5", "Chapter5.csproj", "{B8DA5E72-DE9A-49B7-9D4C-DA87C5396702}"
+EndProject
+Global
+	GlobalSection(SolutionConfigurationPlatforms) = preSolution
+		Debug|Any CPU = Debug|Any CPU
+		Release|Any CPU = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(ProjectConfigurationPlatforms) = postSolution
+		{B8DA5E72-DE9A-49B7-9D4C-DA87C5396702}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{B8DA5E72-DE9A-49B7-9D4C-DA87C5396702}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{B8DA5E72-DE9A-49B7-9D4C-DA87C5396702}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{B8DA5E72-DE9A-49B7-9D4C-DA87C5396702}.Release|Any CPU.Build.0 = Release|Any CPU
+	EndGlobalSection
+EndGlobal
" + (counter.Count == 0)); + Debug.Assert(counter.Count == 0, "Counters were not cleaned up"); + } + + private void testStats(IDatabase conn) { + Console.WriteLine("\n----- testStats -----"); + Console.WriteLine("Let's add some data for our statistics!"); + var r = new Dictionary(); + var random = new Random(); + for (var i = 0; i < 5; i++) { + var value = (random.NextDouble() * 11) + 5; + r = StatsOperations.UpdateStats(conn, "temp", "example", value); + } + + Debug.Assert(r != null, "Aggregate statistics are null"); + + Console.WriteLine("We have some aggregate statistics: " + string.Join(", ", r)); + var stats = StatsOperations.GetStats(conn, "temp", "example"); + Console.WriteLine("Which we can also fetch manually:"); + foreach (var pair in stats) { + Console.WriteLine($"{pair.Key}:{pair.Value}"); + } + + Debug.Assert(stats["count"] >= 5, "Count is less than 5 in our stats"); + } + + private void testAccessTime(IDatabase conn) { + Console.WriteLine("\n----- testAccessTime -----"); + Console.WriteLine("Let's calculate some access times..."); + var timer = new AccessTimer(conn, new Stopwatch()); + var rand = new Random(); + for (var i = 0; i < 10; i++) { + timer.Start(); + Thread.Sleep(rand.Next(0, 1001)); + timer.Stop("req-" + i); + } + + Console.WriteLine("The slowest access times are:"); + conn.SortedSetRangeByRankWithScores("slowest:AccessTime", 0, -1); + var accessTimes = conn.SortedSetRangeByRankWithScores("slowest:AccessTime", 0, -1); + foreach (var pair in accessTimes) { + Console.WriteLine(" " + pair.Element + ", " + pair.Score); + } + + Debug.Assert(accessTimes.Length >= 10, "Our access times dataset has less than 10 elements"); + Console.WriteLine(); + } + + private void testIpLookup(IDatabase conn) { + Console.WriteLine("\n----- testIpLookup -----"); + var cwd = Environment.GetEnvironmentVariable("user.dir"); + + if (string.IsNullOrEmpty(cwd)) { + throw new ArgumentNullException(nameof(cwd), "Environment variable user.dir should be set"); + } + + var blocksPath = (cwd + "/GeoLiteCity-Blocks.csv"); + var locationsPath = cwd + "/GeoLiteCity-Location.csv"; + if (!File.Exists(blocksPath)) { + Console.WriteLine("********"); + Console.WriteLine("GeoLiteCity-Blocks.csv not found at: " + blocksPath); + Console.WriteLine("********"); + return; + } + + if (!File.Exists(locationsPath)) { + Console.WriteLine("********"); + Console.WriteLine("GeoLiteCity-Location.csv not found at: " + locationsPath); + Console.WriteLine("********"); + return; + } + + Console.WriteLine("Importing IP addresses to Redis... (this may take a while)"); + IpLookupOperations.ImportIpsToRedis(conn, blocksPath); + var ranges = conn.SortedSetLength("ip2cityid:"); + Console.WriteLine("Loaded ranges into Redis: " + ranges); + Debug.Assert(ranges > 1000, "Ip2CityId dataset has a cardinality less than 1000"); + Console.WriteLine(); + + Console.WriteLine("Importing Location lookups to Redis... (this may take a while)"); + IpLookupOperations.ImportCitiesToRedis(conn, locationsPath); + var cities = conn.HashLength("cityid2city:"); + Console.WriteLine("Loaded city lookups into Redis:" + cities); + Debug.Assert(cities > 1000, "Cities are less than 1000"); + Console.WriteLine(); + + Console.WriteLine("Let's lookup some locations!"); + + for (var i = 0; i < 5; i++) { + var ip = + $"{IpLookupOperations.RandomOctet(255)}.{IpLookupOperations.RandomOctet(256)}.{IpLookupOperations.RandomOctet(256)}.{IpLookupOperations.RandomOctet(256)}"; + var result = IpLookupOperations.FindCityByIp(conn, ip) ?? 
+
+    public static IDatabase? RedisConnection(string component) {
+        REDIS_CONNECTIONS.TryGetValue("config", out var configConn);
+        if (configConn == null) {
+            var connection = ConnectionMultiplexer.Connect("localhost");
+            configConn = connection.GetDatabase(15);
+            REDIS_CONNECTIONS["config"] = configConn;
+        }
+
+        var key = $"config:redis:{component}";
+        CONFIGS.TryGetValue(key, out var oldConfig);
+        oldConfig ??= new();
+        var config = GetConfig(configConn, "redis", component);
+
+        var configsAreEqual = config.Count == oldConfig.Count && !config.Except(oldConfig).Any();
+        if (!configsAreEqual || !REDIS_CONNECTIONS.ContainsKey(key)) {
+            var conn = ConnectionMultiplexer.Connect("localhost");
+            var db = conn.GetDatabase();
+            if (config.ContainsKey("db")) {
+                var dbNo = int.Parse(config["db"].ToString() ?? throw new InvalidOperationException());
+                db = conn.GetDatabase(dbNo);
+            }
+
+            REDIS_CONNECTIONS[key] = db;
+        }
+
+        return REDIS_CONNECTIONS[key];
+    }
+}
diff --git a/c#/Chapter5/Configuration/Maintenance.cs b/c#/Chapter5/Configuration/Maintenance.cs
new file mode 100644
index 0000000..c15fb8d
--- /dev/null
+++ b/c#/Chapter5/Configuration/Maintenance.cs
@@ -0,0 +1,18 @@
+using StackExchange.Redis;
+
+namespace Chapter5.Configuration;
+
+public static class Maintenance {
+    private static long _lastChecked;
+    private static bool _underMaintenance;
+
+    public static bool IsUnderMaintenance(IDatabase conn) {
+        // Re-check the flag at most once per second; serve the cached answer otherwise.
+        if (_lastChecked < DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() - 1000) {
+            _lastChecked = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
+            var flag = conn.StringGet("is-under-maintenance");
+            _underMaintenance = "yes".Equals(flag);
+        }
+
+        return _underMaintenance;
+    }
+}
diff --git a/c#/Chapter5/Counters/CleanCountersThread.cs b/c#/Chapter5/Counters/CleanCountersThread.cs
new file mode 100644
index 0000000..d441728
--- /dev/null
+++ b/c#/Chapter5/Counters/CleanCountersThread.cs
@@ -0,0 +1,101 @@
+using StackExchange.Redis;
+
+namespace Chapter5.Counters;
+
+public class CleanCountersThread {
+    private readonly IDatabase _db;
+    private bool _quit;
+    private readonly Thread _thread;
+    private readonly int _sampleCount;
+    private readonly long _timeOffset;
+
+    public CleanCountersThread(int sampleCount, long timeOffset) {
+        var connection = ConnectionMultiplexer.Connect("localhost");
+        _db = connection.GetDatabase(15);
+        _thread = new Thread(run) {
+            IsBackground = true
+        };
+        _quit = false;
+        _sampleCount = sampleCount;
+        _timeOffset = timeOffset;
+    }
+
+    public void Start() {
+        _thread.Start();
+    }
+
+    public void Quit() {
+        _quit = true;
+    }
+
+    private void run() {
+        try {
+            var passes = 0;
+            while (!_quit) {
+                var start = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + _timeOffset;
+                var index = 0;
+                while (index < _db.SortedSetLength("known:") && !_quit) {
+                    var counterToCheck = _db.SortedSetRangeByRank("known:", index, index);
+                    index++;
+                    if (counterToCheck.Length == 0) {
+                        break;
+                    }
+
+                    var hash = counterToCheck[0].ToString();
+
+                    // A counter with N minutes of precision only needs to be
+                    // cleaned every N passes.
+                    var precision = int.Parse(hash.Substring(0, hash.IndexOf(':')));
+                    var bprec = precision / 60;
+                    if (bprec == 0) {
+                        bprec = 1;
+                    }
+
+                    if ((passes % bprec) != 0) {
+                        continue;
+                    }
+
+                    var counterHashKey = "count:" + hash;
+                    var cutoff = (((DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + _timeOffset) / 1000) - _sampleCount * precision).ToString();
+                    var samples = new List<string>(_db.HashKeys(counterHashKey).Select(c => c.ToString()));
+                    samples.Sort();
+                    var remove = bisectRight(samples, cutoff);
+
+                    if (remove != 0) {
+                        var samplesToRemove = samples.GetRange(0, remove).Select(c => (RedisValue)c).ToArray();
+                        _db.HashDelete(counterHashKey, samplesToRemove);
+                        if (remove == samples.Count) {
+                            // Drop the counter from known: only while its hash is
+                            // still empty; the length condition stands in for WATCH.
+                            var trans = _db.CreateTransaction();
+                            trans.AddCondition(Condition.HashLengthEqual(counterHashKey, 0));
+                            trans.SortedSetRemoveAsync("known:", hash);
+                            if (trans.Execute()) {
+                                index--;
+                            }
+                        }
+                    }
+                }
+
+                passes++;
+                var duration = Math.Min((DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() + _timeOffset) - start + 1000, 60000);
+                var timeToSleep = TimeSpan.FromMilliseconds(Math.Max(60000 - duration, 1000));
+                Thread.Sleep(timeToSleep);
+            }
+        } catch (Exception ex) when (ex is ThreadAbortException || ex is ThreadInterruptedException) {
+            Thread.CurrentThread.Interrupt();
+        }
+    }
+
+    // mimic python's bisect.bisect_right
+    private int bisectRight(List<string> values, string key) {
+        var index = values.BinarySearch(key);
+        return index < 0 ? Math.Abs(index) - 1 : index + 1;
+    }
+}
diff --git a/c#/Chapter5/Counters/CounterOperations.cs b/c#/Chapter5/Counters/CounterOperations.cs
new file mode 100644
index 0000000..7c3cbfd
--- /dev/null
+++ b/c#/Chapter5/Counters/CounterOperations.cs
@@ -0,0 +1,29 @@
+using StackExchange.Redis;
+
+namespace Chapter5.Counters;
+
+public static class CounterOperations {
+    private static readonly int[] _precisions = { 1, 5, 60, 300, 3600, 18000, 86400 };
+
+    public static List<(int, int)> GetCounter(
+        IDatabase conn, string name, int precision) {
+        var hash = $"{precision}:{name}";
+        var data = conn.HashGetAll($"count:{hash}");
+        var results = new List<(int, int)>();
+        foreach (var entry in data) {
+            results.Add((Convert.ToInt32(entry.Name), Convert.ToInt32(entry.Value)));
+        }
+
+        results.Sort();
+        return results;
+    }
+
+    public static void UpdateCounter(IDatabase conn, string name, int count, long now) {
+        var trans = conn.CreateTransaction();
+        foreach (var precision in _precisions) {
+            var precisionNow = (now / precision) * precision;
+            var hash = $"{precision}:{name}";
+            trans.SortedSetAddAsync("known:", hash, 0);
+            trans.HashIncrementAsync($"count:{hash}", precisionNow.ToString(), count);
+        }
+        trans.Execute();
+    }
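+
+    // Worked example (illustrative): with now = 1_000_007 and precision = 300,
+    // (now / 300) * 300 = 999_900, so every update inside the same five-minute
+    // window increments the same field of the "count:300:name" hash.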
+}
diff --git a/c#/Chapter5/DateTimeExtensions.cs b/c#/Chapter5/DateTimeExtensions.cs
new file mode 100644
index 0000000..6853e34
--- /dev/null
+++ b/c#/Chapter5/DateTimeExtensions.cs
@@ -0,0 +1,12 @@
+using System.Globalization;
+
+namespace Chapter5;
+
+public static class DateTimeExtensions {
+    public static string ToIsoFormat(this DateTime dateTime) {
+        // Callers use this value as an hour-granular rotation key, so truncate
+        // to the start of the hour before formatting as ISO-8601.
+        var utc = dateTime.ToUniversalTime();
+        return new DateTime(utc.Year, utc.Month, utc.Day, utc.Hour, 0, 0, DateTimeKind.Utc).ToString("s");
+    }
+
+    public static string ToTimestampFormat(this DateTime dateTime) {
+        return dateTime.ToString("ddd MMM HH:00:00 yyyy", CultureInfo.InvariantCulture);
+    }
+}
diff --git a/c#/Chapter5/IpLookup/IpLookupCsvHelper.cs b/c#/Chapter5/IpLookup/IpLookupCsvHelper.cs
new file mode 100644
index 0000000..324a387
--- /dev/null
+++ b/c#/Chapter5/IpLookup/IpLookupCsvHelper.cs
@@ -0,0 +1,64 @@
+using System.Text;
+using TinyCsvParser;
+using TinyCsvParser.Mapping;
+
+namespace Chapter5.IpLookup;
+
+public static class IpLookupCsvHelper {
+    public class CityBlock {
+        public string StartIp { get; set; }
+        public string EndIp { get; set; }
+        public string LocationId { get; set; }
+    }
+
+    public class CityLocation {
+        public string CityId { get; set; }
+        public string Country { get; set; }
+        public string Region { get; set; }
+        public string City { get; set; }
+    }
+
+    private class CsvCityBlockMapping : CsvMapping<CityBlock> {
+        public CsvCityBlockMapping() {
+            MapProperty(0, x => x.StartIp);
+            MapProperty(1, x => x.EndIp);
+            MapProperty(2, x => x.LocationId);
+        }
+    }
+
+    private class CsvCityLocationMapping : CsvMapping<CityLocation> {
+        public CsvCityLocationMapping() {
+            MapProperty(0, x => x.CityId);
+            MapProperty(1, x => x.Country);
+            MapProperty(2, x => x.Region);
+            MapProperty(3, x => x.City);
+        }
+    }
+
+    public static List<CityBlock> LoadCityBlocks(string path) {
+        var mapper = new CsvCityBlockMapping();
+        var option = new CsvParserOptions(true, ',');
+        var parser = new CsvParser<CityBlock>(option, mapper);
+
+        var result = parser
+            .ReadFromFile(path, Encoding.ASCII)
+            .Select(c => c.Result)
+            .Where(c => !c.StartIp.Equals("startIpNum", StringComparison.InvariantCultureIgnoreCase))
+            .ToList();
+
+        return result;
+    }
+
+    public static List<CityLocation> LoadCityLocations(string path) {
+        var mapper = new CsvCityLocationMapping();
+        var option = new CsvParserOptions(false, ',');
+        var parser = new CsvParser<CityLocation>(option, mapper);
+
+        var result = parser
+            .ReadFromFile(path, Encoding.ASCII)
+            .Select(c => c.Result)
+            .ToList();
+
+        return result;
+    }
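+
+    // Input shape assumed by the mappings above (legacy GeoLite City CSV,
+    // illustrative):
+    //   Blocks:   startIpNum,endIpNum,locId
+    //   Location: locId,country,region,city,...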
+}
diff --git a/c#/Chapter5/IpLookup/IpLookupOperations.cs b/c#/Chapter5/IpLookup/IpLookupOperations.cs
new file mode 100644
index 0000000..d3fa402
--- /dev/null
+++ b/c#/Chapter5/IpLookup/IpLookupOperations.cs
@@ -0,0 +1,77 @@
+using System.Text.Json;
+using StackExchange.Redis;
+
+namespace Chapter5.IpLookup;
+
+public static class IpLookupOperations {
+    public static void ImportIpsToRedis(IDatabase conn, string path) {
+        var blocks = IpLookupCsvHelper.LoadCityBlocks(path);
+        Parallel.ForEach(blocks, block => {
+            var startIp = block.StartIp;
+            if (startIp.ToLower().IndexOf('i') != -1) {
+                return;
+            }
+
+            var score = 0L;
+            if (startIp.IndexOf('.') != -1) {
+                score = ipToScore(startIp);
+            } else {
+                try {
+                    score = long.Parse(startIp);
+                } catch (FormatException) {
+                    return;
+                }
+            }
+
+            // Append a unique suffix so that duplicate start IPs don't collide.
+            var cityId = block.LocationId + '_' + Guid.NewGuid().ToString("N");
+            conn.SortedSetAdd("ip2cityid:", cityId, score);
+        });
+    }
+
+    public static void ImportCitiesToRedis(IDatabase conn, string path) {
+        var locations = IpLookupCsvHelper.LoadCityLocations(path)
+            .Where(c =>
+                !string.IsNullOrEmpty(c.City)
+                && !string.IsNullOrEmpty(c.CityId)
+                && !string.IsNullOrEmpty(c.Region)
+                && !string.IsNullOrEmpty(c.Country)
+                && char.IsDigit(c.CityId[0])
+            );
+
+        Parallel.ForEach(locations, location => {
+            var json = JsonSerializer.Serialize(new[] { location.City, location.Region, location.Country });
+            conn.HashSet("cityid2city:", location.CityId, json);
+        });
+    }
+
+    public static string[]? FindCityByIp(IDatabase conn, string ipAddress) {
+        var score = ipToScore(ipAddress);
+        // Take the single highest entry at or below the score, i.e.
+        // ZREVRANGEBYSCORE ... LIMIT 0 1.
+        var results = conn.SortedSetRangeByScore("ip2cityid:", score, 0, Exclude.None, Order.Descending, 0, 1);
+        if (results.Length == 0) {
+            return null;
+        }
+
+        var cityId = results[0].ToString();
+        cityId = cityId.Substring(0, cityId.IndexOf('_'));
+        var value = conn.HashGet("cityid2city:", cityId).ToString();
+        if (string.IsNullOrEmpty(value)) {
+            return null;
+        }
+        var result = JsonSerializer.Deserialize<string[]>(value);
+        return result;
+    }
+
+    public static string RandomOctet(int max) {
+        // The upper bound is exclusive, so RandomOctet(256) yields 0..255.
+        var rand = new Random();
+        return rand.Next(0, max).ToString();
+    }
+
+    private static long ipToScore(string ipAddress) {
+        var score = 0L;
+        foreach (var v in ipAddress.Split('.')) {
+            score = score * 256 + int.Parse(v);
+        }
+
+        return score;
+    }
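+
+    // Worked example (illustrative): ipToScore("1.2.3.4")
+    //   = ((1 * 256 + 2) * 256 + 3) * 256 + 4 = 16,909,060,
+    // so numerically adjacent addresses stay adjacent in "ip2cityid:".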
+}
diff --git a/c#/Chapter5/LoggerOperations.cs b/c#/Chapter5/LoggerOperations.cs
new file mode 100644
index 0000000..02331e3
--- /dev/null
+++ b/c#/Chapter5/LoggerOperations.cs
@@ -0,0 +1,56 @@
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public static class LoggerOperations {
+    public enum MessageSeverity {
+        Debug,
+        Info,
+        Warning,
+        Error,
+        Critical,
+    }
+
+    public static void LogRecent(IDatabase conn, string name, string message, MessageSeverity severity = MessageSeverity.Info) {
+        var destination = $"recent:{name}:{severity.ToString().ToLower()}";
+        var tasks = new List<Task> {
+            conn.ListLeftPushAsync(destination, $"{DateTime.UtcNow.ToTimestampFormat()} {message}"),
+            conn.ListTrimAsync(destination, 0, 99)
+        };
+        conn.WaitAll(tasks.ToArray());
+    }
+
+    public static void LogCommon(IDatabase conn, string name, string message, MessageSeverity severity = MessageSeverity.Info, int timeout = 5000) {
+        var severityString = severity.ToString().ToLower();
+        var commonDest = $"common:{name}:{severityString}";
+        var startKey = $"{commonDest}:start";
+        var end = DateTimeOffset.UtcNow.AddMilliseconds(timeout).ToUnixTimeMilliseconds();
+
+        while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+            var existing = conn.StringGet(startKey);
+            var transaction = conn.CreateTransaction();
+            if (!string.IsNullOrEmpty(existing)) {
+                transaction.AddCondition(Condition.StringEqual(startKey, existing));
+            }
+
+            var hourStart = DateTime.UtcNow.ToIsoFormat();
+
+            if (existing != RedisValue.Null && DateTime.Compare(DateTime.Parse(existing.ToString()), DateTime.Parse(hourStart)) < 0) {
+                // A new hour began: rotate the current counts out to :last and
+                // keep the previous start timestamp under :pstart.
+                transaction.KeyRenameAsync(commonDest, $"{commonDest}:last");
+                transaction.KeyRenameAsync(startKey, $"{commonDest}:pstart");
+                transaction.StringSetAsync(startKey, hourStart);
+            }
+
+            transaction.SortedSetIncrementAsync(commonDest, message, 1);
+
+            var recentDest = $"recent:{name}:{severityString}";
+            transaction.ListLeftPushAsync(recentDest, $"{DateTime.UtcNow.ToTimestampFormat()} {message}");
+            transaction.ListTrimAsync(recentDest, 0, 99);
+            var result = transaction.Execute();
+
+            if (result) {
+                return;
+            }
+        }
+    }
+}
diff --git a/c#/Chapter5/StatsOperations.cs b/c#/Chapter5/StatsOperations.cs
new file mode 100644
index 0000000..58faacc
--- /dev/null
+++ b/c#/Chapter5/StatsOperations.cs
@@ -0,0 +1,72 @@
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public static class StatsOperations {
+    public static Dictionary<string, double>? UpdateStats(IDatabase conn, string context, string type, double value) {
+        var timeout = 5000;
+        var destination = $"stats:{context}:{type}";
+        var startKey = $"{destination}:start";
+        var end = DateTimeOffset.UtcNow.AddMilliseconds(timeout).ToUnixTimeMilliseconds();
+        while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+            var startPrev = conn.StringGet(startKey);
+            var trans = conn.CreateTransaction();
+
+            trans.AddCondition(startPrev != RedisValue.Null ? Condition.StringEqual(startKey, startPrev) : Condition.KeyNotExists(startKey));
+
+            var hourStart = DateTime.UtcNow.ToIsoFormat();
+
+            if (startPrev != RedisValue.Null && DateTime.Compare(DateTime.Parse(startPrev.ToString()), DateTime.Parse(hourStart)) < 0) {
+                trans.KeyRenameAsync(destination, destination + ":last");
+                trans.KeyRenameAsync(startKey, destination + ":pstart");
+                trans.StringSetAsync(startKey, hourStart);
+            }
+
+            var tkey1 = Guid.NewGuid().ToString();
+            var tkey2 = Guid.NewGuid().ToString();
+            trans.SortedSetAddAsync(tkey1, "min", value);
+            trans.SortedSetAddAsync(tkey2, "max", value);
+
+            trans.SortedSetCombineAndStoreAsync(SetOperation.Union, destination, destination, tkey1, Aggregate.Min);
+            trans.SortedSetCombineAndStoreAsync(SetOperation.Union, destination, destination, tkey2, Aggregate.Max);
+
+            trans.KeyDeleteAsync(tkey1);
+            trans.KeyDeleteAsync(tkey2);
+            trans.SortedSetIncrementAsync(destination, "count", 1);
+            trans.SortedSetIncrementAsync(destination, "sum", value);
+            trans.SortedSetIncrementAsync(destination, "sumsq", value * value);
+
+            var result = trans.Execute();
+            if (!result) {
+                continue;
+            }
+
+            var statsNames = new List<string> { "count", "sum", "sumsq" };
+
+            var sortedSet = conn.SortedSetRangeByScoreWithScores(destination)
+                .Where(c => statsNames.Contains(c.Element.ToString()))
+                .OrderBy(c => c.Element.ToString())
+                .ToDictionary(c => c.Element.ToString(), c => c.Score);
+
+            return sortedSet;
+        }
+
+        return null;
+    }
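+
+    // Note on UpdateStats above: the two single-member temporary zsets exist
+    // only so ZUNIONSTORE with Aggregate.Min / Aggregate.Max can fold the new
+    // value into the running "min" and "max" members inside the transaction.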
diff --git a/c#/Chapter5/IpLookup/IpLookupOperations.cs b/c#/Chapter5/IpLookup/IpLookupOperations.cs
new file mode 100644
index 0000000..d3fa402
--- /dev/null
+++ b/c#/Chapter5/IpLookup/IpLookupOperations.cs
@@ -0,0 +1,77 @@
+using System.Text.Json;
+using StackExchange.Redis;
+
+namespace Chapter5.IpLookup;
+
+public static class IpLookupOperations {
+    public static void ImportIpsToRedis(IDatabase conn, string path) {
+        var blocks = IpLookupCsvHelper.LoadCityBlocks(path);
+        Parallel.ForEach(blocks, block => {
+            var startIp = block.StartIp;
+            // skip rows whose "IP" field is actually text (e.g. a stray header)
+            if (startIp.ToLower().IndexOf('i') != -1) {
+                return;
+            }
+
+            var score = 0L;
+            if (startIp.IndexOf('.') != -1) {
+                score = ipToScore(startIp);
+            } else {
+                try {
+                    score = Int64.Parse(startIp);
+                } catch (FormatException) {
+                    return;
+                }
+            }
+
+            // suffix with a GUID so that members stay unique per block
+            var cityId = block.LocationId + '_' + Guid.NewGuid().ToString("N");
+            conn.SortedSetAdd("ip2cityid:", cityId, score);
+        });
+    }
+
+    public static void ImportCitiesToRedis(IDatabase conn, string path) {
+        var locations = IpLookupCsvHelper.LoadCityLocations(path)
+            .Where(c =>
+                !string.IsNullOrEmpty(c.City)
+                && !string.IsNullOrEmpty(c.CityId)
+                && !string.IsNullOrEmpty(c.Region)
+                && !string.IsNullOrEmpty(c.Country)
+                && char.IsDigit(c.CityId[0])
+            );
+
+        Parallel.ForEach(locations, location => {
+            var json = JsonSerializer.Serialize(new[] { location.City, location.Region, location.Country });
+            conn.HashSet("cityid2city:", location.CityId, json);
+        });
+    }
+
+    public static string[]? FindCityByIp(IDatabase conn, string ipAddress) {
+        var score = ipToScore(ipAddress);
+        // highest-scoring entry at or below this IP's score
+        var results = conn.SortedSetRangeByScore("ip2cityid:", score, 0, Exclude.None, Order.Descending);
+        if (results.Length == 0) {
+            return null;
+        }
+
+        var cityId = results[0].ToString();
+        cityId = cityId.Substring(0, cityId.IndexOf('_'));
+        var value = conn.HashGet("cityid2city:", cityId).ToString();
+        if (string.IsNullOrEmpty(value)) {
+            return null;
+        }
+        var result = JsonSerializer.Deserialize<string[]>(value);
+        return result;
+    }
+
+    public static string RandomOctet(int max) {
+        var rand = new Random();
+        return rand.Next(0, max + 1).ToString();
+    }
+
+    private static long ipToScore(string ipAddress) {
+        var score = 0L;
+        foreach (var v in ipAddress.Split('.')) {
+            score = score * 256 + int.Parse(v);
+        }
+
+        return score;
+    }
+}
diff --git a/c#/Chapter5/LoggerOperations.cs b/c#/Chapter5/LoggerOperations.cs
new file mode 100644
index 0000000..02331e3
--- /dev/null
+++ b/c#/Chapter5/LoggerOperations.cs
@@ -0,0 +1,56 @@
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public static class LoggerOperations {
+    public enum MessageSeverity {
+        Debug,
+        Info,
+        Warning,
+        Error,
+        Critical,
+    }
+
+    public static void LogRecent(IDatabase conn, string name, string message, MessageSeverity severity = MessageSeverity.Info) {
+        var destination = $"recent:{name}:{severity.ToString().ToLower()}";
+        var tasks = new List<Task> {
+            conn.ListLeftPushAsync(destination, $"{DateTime.UtcNow.ToTimestampFormat()} {message}"),
+            // keep only the 100 most recent messages
+            conn.ListTrimAsync(destination, 0, 99)
+        };
+        conn.WaitAll(tasks.ToArray());
+    }
+
+    public static void LogCommon(IDatabase conn, string name, string message, MessageSeverity severity = MessageSeverity.Info, int timeout = 5000) {
+        var severityString = severity.ToString().ToLower();
+        var commonDest = $"common:{name}:{severityString}";
+        var startKey = $"{commonDest}:start";
+        var end = DateTimeOffset.UtcNow.AddMilliseconds(timeout).ToUnixTimeMilliseconds();
+
+        while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+            var existing = conn.StringGet(startKey);
+            var transaction = conn.CreateTransaction();
+            // only commit if no concurrent writer rotated the hour key under us
+            if (!string.IsNullOrEmpty(existing)) {
+                transaction.AddCondition(Condition.StringEqual(startKey, existing));
+            } else {
+                transaction.AddCondition(Condition.KeyNotExists(startKey));
+            }
+
+            var hourStart = DateTime.UtcNow.ToIsoFormat();
+
+            if (existing != RedisValue.Null && DateTime.Compare(DateTime.Parse(existing.ToString()), DateTime.Parse(hourStart)) < 0) {
+                // a new hour has started: archive the previous hour's counts
+                transaction.KeyRenameAsync(commonDest, $"{commonDest}:last");
+                transaction.KeyRenameAsync(startKey, $"{commonDest}:pstart");
+                transaction.StringSetAsync(startKey, hourStart);
+            }
+
+            transaction.SortedSetIncrementAsync(commonDest, message, 1);
+
+            var recentDest = $"recent:{name}:{severityString}";
+            transaction.ListLeftPushAsync(recentDest, $"{DateTime.UtcNow.ToTimestampFormat()} {message}");
+            transaction.ListTrimAsync(recentDest, 0, 99);
+            var result = transaction.Execute();
+
+            if (result) {
+                return;
+            }
+        }
+    }
+}
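A short usage sketch for the two loggers above; the `db` handle, the log name `test`, and the messages are assumptions for illustration:

using StackExchange.Redis;
using Chapter5;

var db = ConnectionMultiplexer.Connect("localhost").GetDatabase(); // assumed local instance

// LogRecent keeps a capped list of the 100 newest messages per name/severity
LoggerOperations.LogRecent(db, "test", "service started");

// LogCommon additionally counts how often each message is seen in the current hour,
// retrying for up to 5 seconds if a concurrent writer invalidates the transaction
LoggerOperations.LogCommon(db, "test", "cache miss", LoggerOperations.MessageSeverity.Warning);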
diff --git a/c#/Chapter5/StatsOperations.cs b/c#/Chapter5/StatsOperations.cs
new file mode 100644
index 0000000..58faacc
--- /dev/null
+++ b/c#/Chapter5/StatsOperations.cs
@@ -0,0 +1,72 @@
+using StackExchange.Redis;
+
+namespace Chapter5;
+
+public static class StatsOperations {
+    public static Dictionary<string, double>? UpdateStats(IDatabase conn, string context, string type, double value) {
+        var timeout = 5000;
+        var destination = $"stats:{context}:{type}";
+        var startKey = $"{destination}:start";
+        var end = DateTimeOffset.UtcNow.AddMilliseconds(timeout).ToUnixTimeMilliseconds();
+        while (DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() < end) {
+            var startPrev = conn.StringGet(startKey);
+            var trans = conn.CreateTransaction();
+
+            // only commit if no concurrent writer rotated the hour key under us
+            trans.AddCondition(startPrev != RedisValue.Null ? Condition.StringEqual(startKey, startPrev) : Condition.KeyNotExists(startKey));
+
+            var hourStart = DateTime.UtcNow.ToIsoFormat();
+
+            if (startPrev != RedisValue.Null && DateTime.Compare(DateTime.Parse(startPrev.ToString()), DateTime.Parse(hourStart)) < 0) {
+                // a new hour has started: archive the previous hour's stats
+                trans.KeyRenameAsync(destination, destination + ":last");
+                trans.KeyRenameAsync(startKey, destination + ":pstart");
+                trans.StringSetAsync(startKey, hourStart);
+            }
+
+            // ZUNIONSTORE with MIN/MAX keeps a running minimum and maximum
+            var tkey1 = Guid.NewGuid().ToString();
+            var tkey2 = Guid.NewGuid().ToString();
+            trans.SortedSetAddAsync(tkey1, "min", value);
+            trans.SortedSetAddAsync(tkey2, "max", value);
+
+            trans.SortedSetCombineAndStoreAsync(SetOperation.Union, destination, destination, tkey1, Aggregate.Min);
+            trans.SortedSetCombineAndStoreAsync(SetOperation.Union, destination, destination, tkey2, Aggregate.Max);
+
+            trans.KeyDeleteAsync(tkey1);
+            trans.KeyDeleteAsync(tkey2);
+            trans.SortedSetIncrementAsync(destination, "count", 1);
+            trans.SortedSetIncrementAsync(destination, "sum", value);
+            trans.SortedSetIncrementAsync(destination, "sumsq", value * value);
+
+            var result = trans.Execute();
+            if (!result) {
+                continue;
+            }
+
+            var statsNames = new List<string>() { "count", "sum", "sumsq" };
+
+            var sortedSet = conn.SortedSetRangeByScoreWithScores(destination)
+                .Where(c => statsNames.Contains(c.Element.ToString()))
+                .OrderBy(c => c.Element.ToString())
+                .ToDictionary(c => c.Element.ToString(), c => c.Score);
+
+            return sortedSet;
+        }
+
+        return null;
+    }
+
+    public static Dictionary<string, double> GetStats(IDatabase conn, string context, string type) {
+        var key = $"stats:{context}:{type}";
+        var stats = new Dictionary<string, double>();
+        var data = conn.SortedSetRangeByRankWithScores(key, 0, -1);
+        foreach (var pair in data) {
+            stats.Add(pair.Element.ToString(), pair.Score);
+        }
+
+        stats.Add("average", stats["sum"] / stats["count"]);
+        // sample standard deviation: sqrt((sumsq - sum^2 / count) / (count - 1))
+        var numerator = stats["sumsq"] - Math.Pow(stats["sum"], 2) / stats["count"];
+        var count = stats["count"];
+        stats.Add("stddev", Math.Pow(numerator / (count > 1 ? count - 1 : 1), .5));
+        return stats;
+    }
+}
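To make the statistics arithmetic concrete: after recording the sample values 1, 2, 3, and 4, the sorted set holds count=4, sum=10, and sumsq=30, so GetStats derives average = 10/4 = 2.5 and stddev = sqrt((30 - 10*10/4) / (4 - 1)) = sqrt(5/3) ≈ 1.29. A sketch, assuming a local Redis instance and a hypothetical ProfilePage context:

using StackExchange.Redis;
using Chapter5;

var db = ConnectionMultiplexer.Connect("localhost").GetDatabase(); // assumed local instance

// record four sample access times
foreach (var v in new[] { 1.0, 2.0, 3.0, 4.0 }) {
    StatsOperations.UpdateStats(db, "ProfilePage", "AccessTime", v);
}

// expected: average = 2.5, stddev ≈ 1.29 (plus min, max, count, sum, sumsq entries)
var stats = StatsOperations.GetStats(db, "ProfilePage", "AccessTime");
Console.WriteLine($"avg={stats["average"]}, stddev={stats["stddev"]}");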
diff --git a/c#/README.md b/c#/README.md
new file mode 100644
index 0000000..c18ce4a
--- /dev/null
+++ b/c#/README.md
@@ -0,0 +1,16 @@
+## Prerequisites
+
+* A running Redis instance, as described in the book
+* .NET 6.0 or higher installed
+
+## Running
+
+Open a command line/terminal in the `c#` directory and do one of the following:
+
+* Windows:
+  `runChapter [chapter-number]`. Use a number from 1 through 9 to select the chapter whose examples you want to run,
+  e.g. `runChapter 1`
+
+* Linux/Mac:
+  `./runChapter.sh [chapter-number]`. Use a number from 1 through 9 to select the chapter whose examples you want to run,
+  e.g. `./runChapter.sh 1`
\ No newline at end of file
diff --git a/c#/runChapter.bat b/c#/runChapter.bat
new file mode 100644
index 0000000..1bf5800
--- /dev/null
+++ b/c#/runChapter.bat
@@ -0,0 +1,26 @@
+@echo off
+set chapter=Chapter%1
+set dir=%~dp0%chapter%
+:: Remember the current directory before changing into the chapter
+if exist %dir%\ (
+pushd .
+cd %dir%
+
+:: build and run the chapter's project
+echo:
+echo -------------------------------------------------------------
+echo ^| Building %chapter% ^|
+echo -------------------------------------------------------------
+echo:
+dotnet build
+echo:
+echo -------------------------------------------------------------
+echo ^| Running %chapter% ^|
+echo -------------------------------------------------------------
+echo:
+dotnet run
+:: Return to the original directory
+popd
+) else (
+    echo Could not locate directory "%dir%"
+)
\ No newline at end of file
diff --git a/c#/runChapter.sh b/c#/runChapter.sh
new file mode 100644
index 0000000..7fd7a83
--- /dev/null
+++ b/c#/runChapter.sh
@@ -0,0 +1,22 @@
+#!/bin/sh
+dir="Chapter"$1
+
+if [ -d "$dir" ]; then
+    echo
+    echo -------------------------------------------------------------
+    echo "| Building $dir |"
+    echo -------------------------------------------------------------
+    echo
+    cd "$dir"
+    dotnet build
+    echo
+    echo -------------------------------------------------------------
+    echo "| Running $dir |"
+    echo -------------------------------------------------------------
+    echo
+    dotnet run
+    exit 0
+else
+    echo "Could not locate directory \"$dir\""
+    exit 1
+fi
\ No newline at end of file