91 Commits

Author SHA1 Message Date
Tommy Parnell
d3946d61d4 grid 2020-08-09 22:33:33 -04:00
Tommy Parnell
de0fafbc74 fix header 2020-08-09 19:58:43 -04:00
Tommy Parnell
8fd4dbd4ec dark theme 2020-08-09 19:27:47 -04:00
Tommy Parnell
3795a95ed0 make nav sticky 2020-08-09 18:08:05 -04:00
tparnell
0dce62ab99 split out desktop vs mobile 2020-08-09 13:35:30 -04:00
tparnell
dbea3664d6 fix gist lint 2020-08-09 10:25:33 -04:00
Tommy Parnell
e26c47f91f add my face back 2020-08-08 23:25:52 -04:00
Tommy Parnell
58e34d8177 A11y (#11)
* Accessibility Driven Development
2020-08-08 22:19:37 -04:00
Tommy Parnell
0e36180218 a11y done 2020-08-08 22:18:35 -04:00
tparnell
a245a21e02 a11y 2020-08-08 19:45:04 -04:00
tparnell
792919cb70 stop 2020-08-07 18:37:55 -04:00
tparnell
934a762939 preload script 2020-08-07 17:15:11 -04:00
Tommy Parnell
be76863dc2 fix up css better 2020-08-07 02:49:10 -04:00
Tommy Parnell
da8a0d9a4f 3.1 merge 2020-08-01 18:47:09 -04:00
Tommy Parnell
71cf945baf canonical tags, dotnet 3.1 2020-08-01 18:38:12 -04:00
tparnell
abacb42468 switch domains 2019-03-13 09:36:57 -04:00
tparnell
198f99f7f1 redirect plural tags to singular 2019-03-09 08:39:57 -05:00
tparnell
4fe77edfbc its 2019 duh 2019-02-23 14:33:44 -05:00
tparnell
78e73f2a4c Merge branch 'master' of github.com:TerribleDev/blog.terribledev.io.dotnet 2019-02-23 14:09:34 -05:00
tparnell
521b335f8a fix some nav stuff 2019-02-23 14:09:27 -05:00
tparnell
e2ad204571 web performance tips 2019-02-23 14:05:19 -05:00
Tommy Parnell
755c03303b fix media 2019-02-20 05:33:09 -05:00
Tommy Parnell
4c25db4039 prevent navbar reflow on load 2019-02-17 09:10:38 -05:00
Tommy Parnell
c3a583a33b add robots 2019-02-08 00:11:31 -05:00
Tommy Parnell
718b938a76 case insensitive tags 2019-02-07 23:20:44 -05:00
Tommy Parnell
415b62e1e7 get better 404 info 2019-02-07 23:13:13 -05:00
Tommy Parnell
e367072f21 alexa skills fix 2019-02-07 23:01:11 -05:00
tparnell
1327d87b96 site creator 2019-02-07 20:39:38 -05:00
tparnell
39dfcfe70d another twit meta 2019-02-07 20:37:16 -05:00
tparnell
86cb0cec23 more mettah fixes 2019-02-07 19:14:12 -05:00
tparnell
f87bd029f4 include redux gist 2019-02-07 18:47:05 -05:00
tparnell
69b2112e4b precconnect to analytics domain vs 2019-02-07 18:43:51 -05:00
tparnell
0b57031fc1 inject deps, fix links for rss feeds 2019-02-07 18:10:41 -05:00
tparnell
71929bb1ba fix external links 2019-02-07 16:06:52 -05:00
tparnell
b1959082dc I think I got links working 2019-02-06 22:38:30 -05:00
tparnell
be5c4cc806 fix rss 2019-02-06 22:32:14 -05:00
tparnell
ca0344c902 hopefully this fixes rss readers 2019-02-06 20:55:23 -05:00
tparnell
5a844f34f9 add img back in meta 2019-02-06 20:42:25 -05:00
tparnell
57a129cf8d tools for frontend devs 2019-02-06 19:12:18 -05:00
tparnell
a31b9d4fa9 stop 2019-02-06 19:01:46 -05:00
Tommy Parnell
7013e61c2f alt text for pictures 2019-02-06 07:56:24 -05:00
Tommy Parnell
6ce47adb8a webp avatar 2019-02-06 07:53:29 -05:00
tparnell
b9b9e81213 support webp 2019-02-05 23:32:21 -05:00
tparnell
53b8b448da webp 2019-02-05 23:32:21 -05:00
tparnell
aa6ed52d93 picture elem 2019-02-05 23:32:21 -05:00
Tommy Parnell
16c10c9ca1 app insights 2019-02-03 22:37:57 -05:00
Tommy Parnell
c3cb61619b fix some spacing 2019-02-03 21:40:49 -05:00
Tommy Parnell
7ff61450f9 Merge branch 'master' of github.com:TerribleDev/blog.terribledev.io.dotnet 2019-02-03 21:22:06 -05:00
Tommy Parnell
f3faede79e cache home page and posts in cf for 15 minutes 2019-02-03 21:21:57 -05:00
tparnell
38f82061e9 do not block anything 2019-02-03 20:22:21 -05:00
tparnell
57a8bba66a add csp back 2019-02-03 19:18:11 -05:00
tparnell
43d6e33638 rm csp 2019-02-03 19:11:28 -05:00
tparnell
d875ca6fea headers 🎉 2019-02-03 18:53:16 -05:00
Tommy Parnell
d846a538a0 headerz 2019-02-03 17:33:01 -05:00
Tommy Parnell
00b711aef4 increase hsts header 2019-02-03 13:14:20 -05:00
Tommy Parnell
dbb6ae208b output cache all the things 2019-02-03 13:01:21 -05:00
Tommy Parnell
de62e6275d inline styles 2019-02-03 11:53:52 -05:00
tparnell
d873be97d8 bettah config 2019-02-02 14:17:20 -05:00
tparnell
f3080faae0 start to move things to config 2019-02-02 13:49:46 -05:00
tparnell
6ed0ef4205 only render gtm in production 2019-02-02 13:30:30 -05:00
tparnell
ab9250b968 Revert "fix menu animation"
This reverts commit c24684fa8b.
2019-02-02 12:38:01 -05:00
tparnell
c24684fa8b fix menu animation 2019-02-02 12:28:55 -05:00
tparnell
6ebf9a6574 Merge branch 'master' of github.com:TerribleDev/blog.terribledev.io.dotnet 2019-01-29 17:36:42 -05:00
tparnell
7cf143c078 manifest should be async 2019-01-29 17:36:31 -05:00
Tommy Parnell
f7984258a5 make list mutation obvious 2019-01-26 22:07:21 -05:00
Tommy Parnell
712e92ff6b minor fixes 2019-01-26 21:35:17 -05:00
tparnell
8bf5a55dcb more meta fixes 2019-01-24 18:08:19 -05:00
tparnell
04d5f29fee apostrophe 2019-01-23 15:50:48 -05:00
tparnell
0aa95d9988 more unneeded redirects 2019-01-23 15:41:17 -05:00
tparnell
506188041a fix tags url trailing slash in link 2019-01-23 15:40:15 -05:00
tparnell
099f570e84 change png and compress 2019-01-23 15:36:07 -05:00
tparnell
6813370179 bettah mettah redux 2019-01-23 14:13:22 -05:00
tparnell
f981ba3f39 publish tinypng 2019-01-23 10:52:26 -05:00
tparnell
365f1730f5 add xml namespace 2019-01-23 10:19:37 -05:00
tparnell
d1c2d60c5a bettah mettah 2019-01-23 09:48:45 -05:00
tparnell
7f28de0655 add tag pages to sitemap 2019-01-23 09:22:11 -05:00
tparnell
fb14bb735e cache all the things 2019-01-22 09:37:07 -05:00
tparnell
fdbc9c6d6a fix external links 2019-01-22 09:07:40 -05:00
Tommy Parnell
a2e6c43e56 progressive jpg, more perf tips 2019-01-21 19:50:58 -05:00
Tommy Parnell
4acdfb3d4c make external links target blank 2019-01-21 19:15:05 -05:00
Tommy Parnell
45d0f3361e tag fixes 2019-01-21 18:28:18 -05:00
Tommy Parnell
9044e8679f only add headers when not static file 2019-01-21 18:05:23 -05:00
Tommy Parnell
7852083a8c bottom rule 2019-01-21 17:50:29 -05:00
Tommy Parnell
c353269c52 fix styles 2019-01-21 17:48:20 -05:00
Tommy Parnell
091abd4561 fix media 2019-01-21 15:32:51 -05:00
Tommy Parnell
a2dd0c0d2a rss html content 2019-01-21 15:25:44 -05:00
Tommy Parnell
3143f1c76b stop for now 2019-01-21 15:02:56 -05:00
Tommy Parnell
fd5c668820 fix all the things 2019-01-21 14:36:20 -05:00
Tommy Parnell
031c5c2598 no opener 2019-01-21 13:37:58 -05:00
Tommy Parnell
a46631b5e6 more 404 fixes, catch sw issues 2019-01-21 13:13:27 -05:00
Tommy Parnell
4c752803c0 add menu to all 2019-01-21 12:45:01 -05:00
191 changed files with 1670 additions and 426 deletions

.vscode/extensions.json vendored Normal file

@@ -0,0 +1,13 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846 to learn about workspace recommendations.
// Extension identifier format: ${publisher}.${name}. Example: vscode.csharp
// List of extensions which should be recommended for users of this workspace.
"recommendations": [
"ban.spellright"
],
// List of extensions recommended by VS Code that should not be recommended for users of this workspace.
"unwantedRecommendations": [
]
}

.vscode/launch.json vendored

@@ -1,18 +1,17 @@
{
// Use IntelliSense to find out which attributes exist for C# debugging
// Use hover for the description of the existing attributes
// For further information visit https://github.com/OmniSharp/omnisharp-vscode/blob/master/debugger-launchjson.md
"version": "0.2.0",
"configurations": [
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": ".NET Core Launch (web)",
"type": "coreclr",
"request": "launch",
"preLaunchTask": "build",
// If you have changed target frameworks, make sure to update the program path.
"program": "${workspaceFolder}/TerribleDev.Blog.Web/bin/Debug/netcoreapp2.2/TerribleDev.Blog.Web.dll",
"program": "${workspaceFolder}/src/TerribleDev.Blog.Web/bin/Debug/netcoreapp3.1/TerribleDev.Blog.Web.dll",
"args": [],
"cwd": "${workspaceFolder}/TerribleDev.Blog.Web",
"cwd": "${workspaceFolder}/src/TerribleDev.Blog.Web",
"stopAtEntry": false,
"internalConsoleOptions": "openOnSessionStart",
"launchBrowser": {
@@ -42,5 +41,5 @@
"request": "attach",
"processId": "${command:pickProcess}"
}
,]
}
]
}

.vscode/settings.json vendored Normal file

@@ -0,0 +1,19 @@
{
"files.eol": "\n",
"spellchecker.language": "en_US",
"spellchecker.ignoreWordsList": [
"dotnet",
"csproj's",
"VS2017",
"vs2017",
"refactor"
],
"spellchecker.documentTypes": [
"markdown",
"latex",
"plaintext"
],
"spellchecker.ignoreRegExp": [],
"spellchecker.ignoreFileExtensions": [],
"spellchecker.checkInterval": 5000
}

.vscode/spellright.dict vendored Normal file

@@ -0,0 +1,5 @@
intellisense
docker
env
mydocklinting
eslint

.vscode/tasks.json vendored

@@ -7,7 +7,7 @@
"type": "process",
"args": [
"build",
"${workspaceFolder}/TerribleDev.Blog.Web/TerribleDev.Blog.Web.csproj"
"${workspaceFolder}/src/TerribleDev.Blog.Web/TerribleDev.Blog.Web.csproj"
],
"problemMatcher": "$msCompile"
}

Readme.md Normal file

@@ -0,0 +1,10 @@
## Compress webp
find . -iname '*.png' -exec cwebp -lossless '{}' -o '{}'.webp \;
find . -iname '*.jpg' -exec cwebp '{}' -o '{}'.webp \;
find . -iname '*.gif' -exec gif2webp -mixed '{}' -o '{}'.webp \;
## resize image
find . -iname '*' -exec convert '{}' -resize 750 '{}' \;
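The `find -exec` one-liners above re-convert every image on every run. A small wrapper sketch that skips already-converted files (assumes `cwebp` is installed; the function and script names are illustrative, not part of the repo):

```shell
#!/bin/sh
# Derive the output name the commands above produce: original name + ".webp"
webp_target() {
    printf '%s.webp' "$1"
}

# Convert PNGs losslessly, skipping files converted on a previous run.
convert_pngs() {
    command -v cwebp >/dev/null 2>&1 || { echo "cwebp not installed" >&2; return 1; }
    find . -iname '*.png' | while read -r f; do
        out=$(webp_target "$f")
        [ -e "$out" ] || cwebp -lossless "$f" -o "$out"
    done
}
```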

docker-compose.yml Normal file

@@ -0,0 +1,7 @@
version: '3'
services:
webapp:
build: ./src/TerribleDev.Blog.Web
ports:
- "80:80"
- "443:443"


@@ -0,0 +1,7 @@
{
"ProviderId": "Microsoft.ApplicationInsights.ConnectedService.ConnectedServiceProvider",
"Version": "8.14.11009.1",
"GettingStartedDocument": {
"Uri": "https://go.microsoft.com/fwlink/?LinkID=798432"
}
}


@@ -12,48 +12,25 @@ namespace TerribleDev.Blog.Web.Controllers
{
public class HomeController : Controller
{
public static List<IPost> postsAsList = new BlogFactory().GetAllPosts().OrderByDescending(a=>a.PublishDate).ToList();
public static Dictionary<string, List<IPost>> tagToPost = postsAsList.Where(a=>a.tags != null)
.Aggregate(
new Dictionary<string, List<IPost>>(),
(accum, item) => {
foreach(var tag in item.tags)
{
if(accum.TryGetValue(tag, out var list))
{
list.Add(item);
}
else
{
accum[tag] = new List<IPost>() { item };
}
}
return accum;
});
public static IDictionary<string, IPost> posts = postsAsList.ToDictionary(a=>a.Url);
public static IDictionary<int, List<IPost>> postsByPage = postsAsList.Aggregate(new Dictionary<int, List<IPost>>() { [1] = new List<IPost>() }, (accum, item) =>
{
var highestPage = accum.Keys.Max();
var current = accum[highestPage].Count;
if (current >= 10)
{
accum[highestPage + 1] = new List<IPost>() { item };
return accum;
}
accum[highestPage].Add(item);
return accum;
});
private readonly PostCache postCache;
public HomeController(PostCache postCache)
{
this.postCache = postCache;
}
[Route("/")]
[Route("/index.html")]
[Route("/page/{pageNumber?}" )]
[OutputCache(Duration = 31536000, VaryByParam = "pageNumber")]
[ResponseCache(Duration = 900)]
public IActionResult Index(int pageNumber = 1)
{
if(!postsByPage.TryGetValue(pageNumber, out var result))
if(!postCache.PostsByPage.TryGetValue(pageNumber, out var result))
{
return NotFound();
return Redirect($"/404/?from=/page/{pageNumber}/");
}
return View(new HomeViewModel() { Posts = result, Page = pageNumber, HasNext = postsByPage.ContainsKey(pageNumber + 1), HasPrevious = postsByPage.ContainsKey(pageNumber - 1) });
return View(new HomeViewModel() { Posts = result, Page = pageNumber, HasNext = postCache.PostsByPage.ContainsKey(pageNumber + 1), HasPrevious = postCache.PostsByPage.ContainsKey(pageNumber - 1) });
}
[Route("/theme/{postName?}")]
public IActionResult Theme(string postName)
@@ -61,6 +38,7 @@ namespace TerribleDev.Blog.Web.Controllers
return View(model: postName);
}
[Route("/offline")]
[Route("/offline.html")]
[ResponseCache(Duration = 3600)]
public IActionResult Offline()
{
@@ -75,12 +53,12 @@ namespace TerribleDev.Blog.Web.Controllers
[Route("{postUrl}")]
[OutputCache(Duration = 31536000, VaryByParam = "postUrl")]
[ResponseCache(Duration = 180)]
[ResponseCache(Duration = 900)]
public IActionResult Post(string postUrl)
{
if(!posts.TryGetValue(postUrl, out var currentPost))
if(!postCache.UrlToPost.TryGetValue(postUrl, out var currentPost))
{
return NotFound();
return Redirect($"/404/?from={postUrl}");
}
return View(model: currentPost);
}
@@ -88,13 +66,23 @@ namespace TerribleDev.Blog.Web.Controllers
[ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
public IActionResult Error()
{
this.Response.StatusCode = 500;
return View(new ErrorViewModel { RequestId = Activity.Current?.Id ?? HttpContext.TraceIdentifier });
}
[Route("/404")]
[Route("{*url}", Order = 999)]
[ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
public IActionResult FourOhFour()
public IActionResult FourOhFour(string from = null)
{
return View();
this.Response.StatusCode = 404;
return View(viewName: nameof(FourOhFour));
}
[Route("/404.html")]
[ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
public IActionResult FourOhFourCachePage()
{
//make a route so the service worker can cache a 404 page, but get a valid status code
return View(viewName: nameof(FourOhFour));
}
}
}


@@ -15,12 +15,19 @@ namespace TerribleDev.Blog.Web.Controllers
{
public class SeoController : Controller
{
public static DateTimeOffset publishDate = DateTimeOffset.UtcNow; // keep publish date in memory so we just return when the server was kicked
public static IEnumerable<SyndicationItem> postsToSyndication = HomeController.postsAsList.Select(a => a.ToSyndicationItem()).ToList();
private readonly BlogConfiguration configuration;
private readonly PostCache postCache;
public SeoController(BlogConfiguration configuration, PostCache postCache)
{
this.configuration = configuration;
this.postCache = postCache;
}
public static DateTimeOffset publishDate = DateTimeOffset.UtcNow; // keep publish date in memory so we just return when the server was kicked
[Route("/rss")]
[Route("/rss.xml")]
[ResponseCache(Duration = 7200)]
[OutputCache(Duration = 86400)]
[OutputCache(Duration = 31536000)]
public async Task Rss()
{
Response.StatusCode = 200;
@@ -28,11 +35,11 @@ namespace TerribleDev.Blog.Web.Controllers
using (XmlWriter xmlWriter = XmlWriter.Create(this.Response.Body, new XmlWriterSettings() { Async = true, Indent = false, Encoding = Encoding.UTF8 }))
{
var writer = new RssFeedWriter(xmlWriter);
await writer.WriteTitle("The Ramblings of TerribleDev");
await writer.WriteValue("link", "https://blog.terribledev.io");
await writer.WriteTitle(configuration.Title);
await writer.WriteValue("link", configuration.Link);
await writer.WriteDescription("My name is Tommy Parnell. I usually go by TerribleDev on the internets. These are just some of my writings and rants about the software space.");
foreach (var item in postsToSyndication)
foreach (var item in postCache.PostsAsSyndication)
{
await writer.Write(item);
}
@@ -43,22 +50,22 @@ namespace TerribleDev.Blog.Web.Controllers
}
[Route("/sitemap.xml")]
[ResponseCache(Duration = 7200)]
[OutputCache(Duration = 86400)]
[OutputCache(Duration = 31536000)]
public void SiteMap()
{
Response.StatusCode = 200;
Response.ContentType = "text/xml";
var sitewideLinks = new List<SiteMapItem>()
var sitewideLinks = new List<SiteMapItem>(postCache.TagsToPosts.Keys.Select(a => new SiteMapItem() { LastModified = DateTime.UtcNow, Location = $"https://blog.terrible.dev/tag/{a}/" }))
{
new SiteMapItem() { LastModified = DateTime.UtcNow, Location="https://blog.terribledev.io/all-tags/" }
new SiteMapItem() { LastModified = DateTime.UtcNow, Location="https://blog.terrible.dev/all-tags/" }
};
var ser = new XmlSerializer(typeof(SiteMapRoot));
var sitemap = new SiteMapRoot()
{
Urls = HomeController.postsAsList.Select(a => new SiteMapItem() { LastModified = DateTime.UtcNow, Location = $"https://blog.terribledev.io/{a.Url}" }).ToList()
Urls = postCache.PostsAsLists.Select(a => new SiteMapItem() { LastModified = DateTime.UtcNow, Location = a.CanonicalUrl }).ToList()
};
sitemap.Urls.AddRange(sitewideLinks);
ser.Serialize(this.Response.Body, sitemap);
}
}
}
}


@@ -3,26 +3,45 @@ using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using TerribleDev.Blog.Web.Models;
namespace TerribleDev.Blog.Web.Controllers
{
public class TagsController : Controller
{
private readonly PostCache postCache;
public TagsController(PostCache postCache)
{
this.postCache = postCache;
}
[Route("/all-tags")]
[OutputCache(Duration = 31536000)]
public IActionResult AllTags()
{
return View(HomeController.tagToPost);
return View(postCache.TagsToPosts);
}
[Route("/tags/{tagName}")]
[OutputCache(Duration = 31536000, VaryByParam = "tagName")]
public IActionResult TagPluralRedirect(string tagName)
{
if(string.IsNullOrEmpty(tagName))
{
return Redirect($"/404/?from=/tags/emptyString/");
}
return Redirect($"/tag/{tagName}/");
}
[Route("/tag/{tagName}")]
[OutputCache(Duration = 31536000, VaryByParam = "tagName")]
public IActionResult GetTag(string tagName)
{
if(!HomeController.tagToPost.TryGetValue(tagName, out var models))
if(!postCache.TagsToPosts.TryGetValue(tagName.ToLower(), out var models))
{
return NotFound();
return Redirect($"/404/?from=/tag/{tagName}/");
}
{
return View(new Models.GetTagViewModel { Tag = tagName, Posts = models });
return View(new Models.GetTagViewModel { Tag = tagName, Posts = models, CanonicalUrl = $"https://blog.terrible.dev/tag/{tagName.ToLower()}/" });
}
}
}
}
}


@@ -1,9 +1,9 @@
FROM microsoft/dotnet:2.2-aspnetcore-runtime AS base
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM microsoft/dotnet:2.2-sdk AS build
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /src
COPY ["./TerribleDev.Blog.Web.csproj", "."]
RUN dotnet restore "TerribleDev.Blog.Web.csproj"
@@ -17,4 +17,4 @@ RUN dotnet publish "TerribleDev.Blog.Web.csproj" -c Release -o /app
FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "TerribleDev.Blog.Web.dll"]
ENTRYPOINT ["dotnet", "TerribleDev.Blog.Web.dll"]


@@ -0,0 +1,20 @@
title: Hosting your blog on the cheap
date: 2019-08-17 04:49:46
tags:
- cloud
---
A load of people have been asking me lately how I host my blog. In case it's not apparent, I make zero dollars on this blog. I refuse to place ads on the page just to gain pennies of revenue, not because I feel I shouldn't get paid, but simply because I find ads disruptive to the reader. At the end of the day, blogs should have a high signal-to-noise ratio.
<!-- more -->
Since I make no money on this, my strategy is about cutting costs. My grandfather used to say, "take care of the pounds, let the pennies take care of themselves." Since my grandfather is in England, where the currency is the pound, he was telling me to focus on the bigger picture.
The first big decision for a blog is what "engine" you are going to use, or whether you are going to make your own. These usually fall into two categories: static sites, where posts are written in text files and compiled into static HTML, and server-rendered blogs such as WordPress. When a request is made to a server-rendered blog, the HTML is built dynamically and delivered to the consumer. Static sites, on the other hand, are precomputed and thus just delivered to the browser.
I won't go into the details of what is better for different scenarios. If you are being cheap, you will want a static site. Static sites are precomputed, which essentially means you just need to serve files to the user. There is no dynamic server to host, no database to run, etc. There are a few generators I like, but my favorite is [gatsbyjs](https://www.gatsbyjs.org/).
So I know what you are thinking: static sites are just 'better' for page load time. While this is true, they can lack dynamic features that might be important to you, such as publishing new blog posts on a schedule, limiting IP addresses, or even some kind of login/subscription model.


@@ -0,0 +1,13 @@
title: Hosting your webapp on the cheap
date: 2018-08-22 05:11:20
tags:
- cloud
---
So many people have asked me how I've hosted apps in the past. There is a bit of an art at the moment to making your apps extremely cheap in the cloud. I've heard of hosting costs cut from thousands to pennies.
<!-- more -->
## Hosting


@@ -0,0 +1,3 @@
title: 'I used ask.com for 30 days, and this is what I learned'
tags:
---


@@ -0,0 +1,3 @@
title: Migrating from azure web app to containers
tags:
---


@@ -0,0 +1,3 @@
title: Precompiling razor views in dotnet core
tags:
---


@@ -0,0 +1,3 @@
title: Securing your dotnet core apps with hardhat
tags:
---


@@ -0,0 +1,16 @@
title: The ultimate chaos monkey. When your cloud provider goes down!
date: 2017-03-13 15:20:14
tags:
- amazon
- aws
- cloud
- DevOps
---
A few weeks ago, the internet dealt with the fallout that was [the aws outage](https://techcrunch.com/2017/02/28/amazon-aws-s3-outage-is-breaking-things-for-a-lot-of-websites-and-apps/). AWS, or Amazon Web Services, is Amazon's cloud platform, and the most popular one in use. There are other platforms similar in scope, such as Microsoft's Azure. Amazon had an S3 outage that ultimately caused other services to fail in the most popular, and oldest, region they own: the region dubbed `us-east-1`, which is in Virginia.
This was one of the largest cloud outages we have seen, and users of the cloud found out first-hand that the cloud is imperfect. In short, when you use the cloud, you are using services and infrastructure developed by human beings. However, most people turn to tools such as cloud vendors because the scope of their applications does not, and should not, include management of large infrastructure.
The Netflixes and Amazons of the world are large. Really large, and total availability is not just a preferred option but a basic requirement. Companies that are huge users of the cloud have started to think about region-level dependencies. In short, for huge companies, being in one region is perilous and fraught with danger.
In fact, this isn't the first time we have heard such things. In 2013 Netflix published [an article](http://techblog.netflix.com/2013/05/denominating-multi-region-sites.html) describing how they run in multiple regions. There is an obvious cost in making something work multi-region. This is pretty much for large companies, but if you are a multi-billion-dollar organization, working multi-region would probably be an awesome idea.


@@ -11,13 +11,24 @@ namespace TerribleDev.Blog.Web
{
public static SyndicationItem ToSyndicationItem(this IPost x)
{
return new SyndicationItem()
Uri.TryCreate(x.CanonicalUrl, UriKind.Absolute, out var url);
var syn = new SyndicationItem()
{
Title = x.Title,
Description = x.ContentPlain,
Id = $"https://blog.terribledev.io/{x.Url}",
Description = x.Content.ToString(),
Id = url.ToString(),
Published = x.PublishDate
};
syn.AddLink(new SyndicationLink(url));
return syn;
}
public static ISet<string> ToNormalizedTagList(this IPost x)
{
if(x.tags == null)
{
return new HashSet<string>();
}
return new HashSet<string>(x.tags.Where(a => !string.IsNullOrWhiteSpace(a)).Select(a => a.ToLower()));
}
}
}


@@ -0,0 +1,69 @@
using System.Collections.Generic;
using TerribleDev.Blog.Web.Models;
using System.Linq;
using System.Collections.Immutable;
using System.Diagnostics;
using System;
using Microsoft.SyndicationFeed;
namespace TerribleDev.Blog.Web.Factories
{
public static class BlogCacheFactory
{
public static PostCache ProjectPostCache(IEnumerable<IPost> rawPosts)
{
var orderedPosts = rawPosts.OrderByDescending(a => a.PublishDate);
var posts = new List<IPost>();
var urlToPosts = new Dictionary<string, IPost>();
var tagsToPost = new Dictionary<string, IList<IPost>>();
var postsByPage = new Dictionary<int, IList<IPost>>();
var syndicationPosts = new List<SyndicationItem>();
foreach(var post in orderedPosts)
{
posts.Add(post);
urlToPosts.Add(post.UrlWithoutPath, post);
syndicationPosts.Add(post.ToSyndicationItem());
foreach(var tag in post.ToNormalizedTagList())
{
if(tagsToPost.TryGetValue(tag, out var list))
{
list.Add(post);
}
else
{
tagsToPost.Add(tag, new List<IPost>() { post });
}
}
if(postsByPage.Keys.Count < 1)
{
postsByPage.Add(1, new List<IPost>() { post });
}
else
{
var highestPageKey = postsByPage.Keys.Max();
var highestPage = postsByPage[highestPageKey];
if(highestPage.Count < 10)
{
highestPage.Add(post);
}
else
{
postsByPage.Add(highestPageKey + 1, new List<IPost>() { post });
}
}
}
return new PostCache()
{
PostsAsLists = posts,
TagsToPosts = tagsToPost,
UrlToPost = urlToPosts,
PostsByPage = postsByPage,
PostsAsSyndication = syndicationPosts
};
}
}
}
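The paging loop above fills page 1 until it holds 10 posts, then opens page 2, and so on. A minimal sketch of the same grouping in Python (names are illustrative, not from the repo):

```python
def posts_by_page(posts, page_size=10):
    """Group an ordered sequence of posts into 1-indexed pages of page_size."""
    pages = {}
    for post in posts:
        highest = max(pages) if pages else 1
        pages.setdefault(highest, [])
        if len(pages[highest]) < page_size:
            pages[highest].append(post)      # current page still has room
        else:
            pages[highest + 1] = [post]      # start the next page
    return pages
```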


@@ -7,23 +7,37 @@ using TerribleDev.Blog.Web.Models;
using YamlDotNet.Serialization;
using Microsoft.AspNetCore.Html;
using Markdig;
using TerribleDev.Blog.Web.MarkExtension.TerribleDev.Blog.Web.ExternalLinkParser;
using TerribleDev.Blog.Web.MarkExtension;
using Microsoft.AspNetCore.Hosting;
using System.Diagnostics;
using System.Collections.Concurrent;
namespace TerribleDev.Blog.Web
{
public class BlogFactory
{
public List<IPost> GetAllPosts()
public IEnumerable<IPost> GetAllPosts(string domain)
{
// why didn't I use f# I'd have a pipe operator by now
var posts = GetPosts();
var allPosts = posts.AsParallel().Select(a =>
var list = new ConcurrentBag<IPost>();
Parallel.ForEach(posts, post =>
{
var fileInfo = new FileInfo(a);
var fileText = File.ReadAllText(fileInfo.FullName);
return ParsePost(fileText, fileInfo.Name);
var (text, fileInfo) = GetFileText(post);
list.Add(ParsePost(text, fileInfo.Name, domain));
});
return allPosts.ToList();
return list;
}
private static (string text, FileInfo fileInfo) GetFileText(string filePath)
{
var fileInfo = new FileInfo(filePath);
var text = File.ReadAllText(fileInfo.FullName);
return (text, fileInfo);
}
public IEnumerable<string> GetPosts() => Directory.EnumerateFiles(Path.Combine(Directory.GetCurrentDirectory(), "Posts"), "*.md", SearchOption.TopDirectoryOnly);
public PostSettings ParseYaml(string ymlText)
@@ -32,30 +46,43 @@ namespace TerribleDev.Blog.Web
return serializer.Deserialize<PostSettings>(ymlText);
}
public IPost ParsePost(string postText, string fileName)
public IPost ParsePost(string postText, string fileName, string domain)
{
var splitFile = postText.Split("---");
var ymlRaw = splitFile[0];
var markdownText = string.Join("", splitFile.Skip(1));
var pipeline = new MarkdownPipelineBuilder().UseEmojiAndSmiley().Build();
var postContent = Markdown.ToHtml(markdownText, pipeline);
var postContentPlain = String.Join("", Markdown.ToPlainText(markdownText, pipeline).Split("<!-- more -->"));
var postSettings = ParseYaml(ymlRaw);
var resolvedUrl = !string.IsNullOrWhiteSpace(postSettings.permalink) ? postSettings.permalink : fileName.Split('.')[0].Replace(' ', '-').WithoutSpecialCharacters();
List<string> postImages = new List<string>();
var pipeline = new MarkdownPipelineBuilder()
.Use(new AbsoluteLinkConverter(resolvedUrl, domain))
.Use<ImageRecorder>(new ImageRecorder(ref postImages))
.Use<TargetLinkExtension>()
.UseMediaLinks()
.Use<PictureInline>()
.UseEmojiAndSmiley()
.Build();
var postContent = Markdown.ToHtml(markdownText, pipeline);
var postContentPlain = String.Join("", Markdown.ToPlainText(markdownText, pipeline).Split("<!-- more -->"));
var summary = postContent.Split("<!-- more -->")[0];
var postSummaryPlain = postContentPlain.Split("<!-- more -->")[0];
return new Post()
{
PublishDate = postSettings.date,
tags = postSettings.tags?.Select(a=>a.Replace(' ', '-').WithoutSpecialCharacters().ToLower()).ToList() ?? new List<string>(),
PublishDate = postSettings.date.ToUniversalTime(),
tags = postSettings.tags?.Select(a => a.Replace(' ', '-').WithoutSpecialCharacters().ToLower()).ToList() ?? new List<string>(),
Title = postSettings.title,
Url = resolvedUrl,
RelativeUrl = $"/{resolvedUrl}/",
CanonicalUrl = $"https://blog.terrible.dev/{resolvedUrl}/",
UrlWithoutPath = resolvedUrl,
Content = new HtmlString(postContent),
Summary = new HtmlString(summary),
SummaryPlain = postSummaryPlain,
ContentPlain = postContentPlain
SummaryPlainShort = (postContentPlain.Length <= 147 ? postContentPlain : postContentPlain.Substring(0, 146)) + "...",
ContentPlain = postContentPlain,
Images = postImages.Distinct().ToList()
};
}
}
}
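`ParsePost` above splits each markdown file on `---` into a YAML front-matter block and the post body, then cuts the summary at the `<!-- more -->` marker. A rough Python sketch of that parsing shape (it only handles simple `key: value` front matter, not the list-valued `tags` field, and all names are assumptions):

```python
def parse_post(text):
    # Front matter sits before the first "---"; everything after is the body.
    front, _, body = text.partition("---")
    settings = {}
    for line in front.strip().splitlines():
        key, _, value = line.partition(":")
        settings[key.strip()] = value.strip()
    # The summary is everything up to the <!-- more --> marker.
    summary = body.split("<!-- more -->")[0]
    return settings, body, summary
```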


@@ -0,0 +1,61 @@
using System;
using Markdig;
using Markdig.Renderers;
using Markdig.Renderers.Html.Inlines;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.MarkExtension
{
public class AbsoluteLinkConverter : IMarkdownExtension
{
public string BaseUrl { get; }
public string Domain { get; }
public AbsoluteLinkConverter(string baseUrl, string domain)
{
BaseUrl = baseUrl;
Domain = domain;
}
public void Setup(MarkdownPipelineBuilder pipeline)
{
}
public void Setup(MarkdownPipeline pipeline, IMarkdownRenderer renderer)
{
var htmlRenderer = renderer as HtmlRenderer;
if (htmlRenderer != null)
{
var inlineRenderer = htmlRenderer.ObjectRenderers.FindExact<LinkInlineRenderer>();
inlineRenderer.TryWriters.Add(TryLinkAbsoluteUrlWriter);
}
}
private bool TryLinkAbsoluteUrlWriter(HtmlRenderer renderer, LinkInline linkInline)
{
var prevDynamic = linkInline.GetDynamicUrl;
linkInline.GetDynamicUrl = () => {
var escapeUrl = prevDynamic != null ? prevDynamic() ?? linkInline.Url : linkInline.Url;
if(!System.Uri.TryCreate(escapeUrl, UriKind.RelativeOrAbsolute, out var parsedResult))
{
throw new Exception($"Error making link for {escapeUrl} @ {BaseUrl}");
}
if(parsedResult.IsAbsoluteUri)
{
return escapeUrl;
}
var uriBuilder = new UriBuilder(Domain);
if(!escapeUrl.StartsWith("/"))
{
uriBuilder = uriBuilder.WithPathSegment($"/{BaseUrl}/{escapeUrl}");
}
else
{
uriBuilder = uriBuilder.WithPathSegment(parsedResult.ToString());
}
return uriBuilder.Uri.ToString();
};
return false;
}
}
}
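The converter above rewrites links in post markdown to absolute URLs: already-absolute links pass through unchanged, root-relative links are joined to the domain, and other relative links are resolved under the post's own path. The standard-library equivalent of that resolution, as a sketch (domain and paths are illustrative):

```python
from urllib.parse import urljoin, urlparse

def absolutize(url, domain, post_path):
    # Already-absolute links pass through untouched.
    if urlparse(url).scheme:
        return url
    if url.startswith("/"):
        return urljoin(domain, url)                    # root-relative: join to domain
    return urljoin(f"{domain}/{post_path}/", url)      # resolve under the post's path
```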


@@ -0,0 +1,71 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.MarkExtension
{
using System;
using System.Collections.Generic;
using System.Linq;
using Markdig;
using Markdig.Renderers;
using Markdig.Renderers.Html;
using Markdig.Renderers.Html.Inlines;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.ExternalLinkParser
{
/// <summary>
/// Extension for extending image Markdown links in case a video or an audio file is linked and output proper link.
/// </summary>
/// <seealso cref="Markdig.IMarkdownExtension" />
public class TargetLinkExtension : IMarkdownExtension
{
public void Setup(MarkdownPipelineBuilder pipeline)
{
}
public void Setup(MarkdownPipeline pipeline, IMarkdownRenderer renderer)
{
var htmlRenderer = renderer as HtmlRenderer;
if (htmlRenderer != null)
{
var inlineRenderer = htmlRenderer.ObjectRenderers.FindExact<LinkInlineRenderer>();
if (inlineRenderer != null)
{
inlineRenderer.TryWriters.Remove(TryLinkInlineRenderer);
inlineRenderer.TryWriters.Add(TryLinkInlineRenderer);
}
}
}
private bool TryLinkInlineRenderer(HtmlRenderer renderer, LinkInline linkInline)
{
if (linkInline.Url == null)
{
return false;
}
Uri uri;
// Only process absolute Uri
if (!Uri.TryCreate(linkInline.Url, UriKind.RelativeOrAbsolute, out uri) || !uri.IsAbsoluteUri)
{
return false;
}
RenderTargetAttribute(uri, renderer, linkInline);
return false;
}
private void RenderTargetAttribute(Uri uri, HtmlRenderer renderer, LinkInline linkInline)
{
linkInline.SetAttributes(new HtmlAttributes() { Properties = new List<KeyValuePair<string, string>>() { new KeyValuePair<string, string>("target", "_blank"), new KeyValuePair<string, string>("rel", "noopener"), } });
}
}
}
}


@@ -0,0 +1,57 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.MarkExtension
{
using System;
using System.Collections.Generic;
using System.Linq;
using Markdig;
using Markdig.Renderers;
using Markdig.Renderers.Html;
using Markdig.Renderers.Html.Inlines;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.ExternalLinkParser
{
public class ImageRecorder : IMarkdownExtension
{
private List<string> images = null;
public ImageRecorder(ref List<string> images)
{
this.images = images;
}
public void Setup(MarkdownPipelineBuilder pipeline)
{
}
public void Setup(MarkdownPipeline pipeline, IMarkdownRenderer renderer)
{
var htmlRenderer = renderer as HtmlRenderer;
if (htmlRenderer != null)
{
var inlineRenderer = htmlRenderer.ObjectRenderers.FindExact<LinkInlineRenderer>();
if (inlineRenderer != null)
{
inlineRenderer.TryWriters.Add(TryLinkInlineRenderer);
}
}
}
private bool TryLinkInlineRenderer(HtmlRenderer renderer, LinkInline linkInline)
{
if (linkInline.Url == null || !linkInline.IsImage)
{
return false;
}
var url = linkInline.GetDynamicUrl != null ? linkInline.GetDynamicUrl() ?? linkInline.Url : linkInline.Url;
this.images.Add(url);
return false;
}
}
}
}


@@ -0,0 +1,69 @@
using System;
using Markdig;
using Markdig.Renderers;
using Markdig.Renderers.Html.Inlines;
using Markdig.Syntax.Inlines;
namespace TerribleDev.Blog.Web.MarkExtension
{
public class PictureInline : IMarkdownExtension
{
public void Setup(MarkdownPipelineBuilder pipeline)
{
}
public void Setup(MarkdownPipeline pipeline, IMarkdownRenderer renderer)
{
var htmlRenderer = renderer as HtmlRenderer;
if (htmlRenderer != null)
{
var inlineRenderer = htmlRenderer.ObjectRenderers.FindExact<LinkInlineRenderer>();
if (inlineRenderer != null)
{
inlineRenderer.TryWriters.Add(TryLinkInlineRenderer);
}
}
}
private bool TryLinkInlineRenderer(HtmlRenderer renderer, LinkInline linkInline)
{
if (linkInline == null || !linkInline.IsImage)
{
return false;
}
renderer.Write("<picture>");
WriteImageTag(renderer, linkInline, ".webp", "image/webp");
WriteImageTag(renderer, linkInline, string.Empty);
renderer.Write("</picture>");
return true;
}
private void WriteImageTag(HtmlRenderer renderer, LinkInline link, string suffix, string type = null)
{
renderer.Write(string.IsNullOrWhiteSpace(type) ? $"<img src=\"" : $"<source type=\"{type}\" srcset=\"");
var escapeUrl = link.GetDynamicUrl != null ? link.GetDynamicUrl() ?? link.Url : link.Url;
renderer.WriteEscapeUrl($"{escapeUrl}{suffix}");
renderer.Write("\"");
renderer.WriteAttributes(link);
if (renderer.EnableHtmlForInline)
{
renderer.Write(" alt=\"");
}
var wasEnableHtmlForInline = renderer.EnableHtmlForInline;
renderer.EnableHtmlForInline = false;
renderer.WriteChildren(link);
renderer.EnableHtmlForInline = wasEnableHtmlForInline;
if (renderer.EnableHtmlForInline)
{
renderer.Write("\"");
}
if (renderer.EnableHtmlForInline)
{
renderer.Write(" />");
}
}
}
}


@@ -0,0 +1,8 @@
namespace TerribleDev.Blog.Web.Models
{
public class BlogConfiguration
{
public string Title { get; set; }
public string Link { get; set; }
}
}


@@ -9,5 +9,6 @@ namespace TerribleDev.Blog.Web.Models
{
public IEnumerable<IPost> Posts { get; set; }
public string Tag { get; set; }
public string CanonicalUrl { get; set; }
}
}


@@ -9,13 +9,18 @@ namespace TerribleDev.Blog.Web.Models
{
public interface IPost
{
string Url { get; set; }
string CanonicalUrl { get; set; }
string UrlWithoutPath { get; set; }
string RelativeUrl { get; set; }
string Title { get; set; }
HtmlString Summary { get; set; }
DateTime PublishDate { get; set; }
HtmlString Content { get; set; }
string ContentPlain { get; set; }
string SummaryPlain { get; set; }
string SummaryPlainShort { get; set; }
IList<string> tags { get; set; }
IList<string> Images { get; set;}
}
}


@@ -1,18 +1,25 @@
using Microsoft.AspNetCore.Html;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Diagnostics.CodeAnalysis;
namespace TerribleDev.Blog.Web.Models
{
[DebuggerDisplay("{Title}")]
public class Post : IPost
{
public string Url { get; set; }
public string CanonicalUrl { get; set; }
public string UrlWithoutPath { get; set; }
public string RelativeUrl { get; set; }
public string Title { get; set; }
public DateTime PublishDate { get; set; }
public HtmlString Content { get; set; }
public HtmlString Summary { get; set; }
public string ContentPlain { get; set; }
public string SummaryPlain { get; set; }
public string SummaryPlainShort { get; set; }
public IList<string> tags { get; set; }
public IList<string> Images { get; set; }
}
}


@@ -0,0 +1,15 @@
using System.Collections.Generic;
using Microsoft.SyndicationFeed;
namespace TerribleDev.Blog.Web.Models
{
public class PostCache
{
public IList<IPost> PostsAsLists { get; set;}
public IDictionary<string, IList<IPost>> TagsToPosts { get; set; }
public IDictionary<string, IPost> UrlToPost { get; set; }
public IDictionary<int, IList<IPost>> PostsByPage { get; set; }
public IList<SyndicationItem> PostsAsSyndication { get; set; }
}
}


@@ -0,0 +1,20 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.CodeAnalysis;
namespace TerribleDev.Blog.Web.Models
{
public class PostComparer
{
public static PostComparisonByDateInternal PostComparisonByDate = new PostComparisonByDateInternal();
public class PostComparisonByDateInternal : IComparer<IPost>
{
public int Compare([AllowNull] IPost x, [AllowNull] IPost y)
{
return DateTime.Compare(x.PublishDate, y.PublishDate);
}
}
}
}


@@ -6,7 +6,7 @@ using System.Xml.Serialization;
namespace TerribleDev.Blog.Web.Models
{
[XmlRoot("urlset")]
[XmlRoot("urlset", Namespace="http://www.sitemaps.org/schemas/sitemap/0.9")]
public class SiteMapRoot
{
[XmlElement("url")]


@@ -0,0 +1,46 @@
title: 5 web perf tips for 2019
date: 2019-02-23 01:32
tags:
- web
- performance
- javascript
- battle of the bulge
---
As more and more of the world gets online, a larger part of the internet community is using lower powered devices. Making websites fast is becoming paramount. Here are 5 tips to improve your web page's performance.
<!-- more -->
## Brotli and gzip
So in case you didn't know, when your browser makes a request to a server it sends along a header called `Accept-Encoding`. This is a comma-separated list of compression formats the browser understands, which the server can use to compress data sent to the user. The common ones in the past have been `gzip` and `deflate`. [Brotli](https://en.wikipedia.org/wiki/Brotli) is a compression algorithm invented by Google to be more efficient for the web. It is about 35% more effective than gzip based on my own testing, meaning your content will be almost 1/3rd smaller over the wire. Most browsers [support it already](https://caniuse.com/#feat=brotli). You can use Cloudflare to serve Brotli (br) to your users, and most web servers support it today. Make sure your server serves br, or at minimum gzip.
## Webp, JPEG 2000
Images are among the largest types of files on the internet today, and picking the right file type is as important as getting your data structures right. In the past we told everyone to keep photography in `jpeg`, and logos and screenshots in `png`. However, Google has come out with a new file format that is massively smaller than either `jpeg` or `png`: `webp`. Webp is only supported in [Chrome, Edge and Firefox](https://caniuse.com/#search=webp), but don't worry, for iOS Safari you can use `JPEG 2000`. Sizing images is also a key concern: you can use srcset to size images appropriately, and you can use the picture element to select the right format given browser support.
```html
<picture>
<source type="image/webp" srcset="3.webp">
<source type="image/jp2" srcset="3.jp2">
<img src="3.png" alt="an image showing the tiny png results">
</picture>
```
## Lighthouse
Ok, so this is less of a trick to implement and more of a tool to use. Man, I keep mentioning Google, but they keep making amazing web stuff, so here we are. Google has made an awesome performance tool called [lighthouse](https://developers.google.com/web/tools/lighthouse/). A version of this tool is built into Chrome: open the developer tools and click the `audits` tab. That tool is lighthouse. You can install newer versions with `npm install -g lighthouse` or `yarn global add lighthouse`, then just run `lighthouse --view <url>`, so this blog would be `lighthouse --view https://blog.terrible.dev`. You should be hit with a pretty in-depth report on how you can fix and improve your web pages. You can also have your CI system run lighthouse on every build, fail PRs if they reduce performance, or just track your accessibility over time.
## HTTP/2
HTTP version 2 is a newer version of the HTTP spec. Supported [by all major browsers](https://caniuse.com/#feat=http2), this protocol offers compression of HTTP headers, a [push feature](https://en.wikipedia.org/wiki/HTTP/2_Server_Push) that lets you push files down to the browser before they are requested, [http pipelining](https://en.wikipedia.org/wiki/HTTP_pipelining), and multiplexing of multiple requests over a single TCP connection. You can easily get HTTP/2 working if you let [cloudflare](https://www.cloudflare.com/) front your HTTP traffic, but you will still want to implement HTTP/2 in your server eventually.
## Service workers
My last and probably favorite feature. [Service Workers](https://developers.google.com/web/fundamentals/primers/service-workers/) are scripts that stand in between your server and the web page in the browser. They are essentially a proxy that lets you do things like cache your content and support offline capabilities. They are easy to implement: you need a `manifest.json` file, which you can generate from Microsoft's [PWA Builder](https://www.pwabuilder.com/), and you need to serve traffic over https only. PWA Builder even has [pre-made service workers](https://www.pwabuilder.com/serviceworker) for most scenarios, so you don't even need to write your own. I use this for my blog to cache static content, preload blog posts, and provide offline support.


@@ -0,0 +1,103 @@
title: Accessibility Driven Development
date: 2020-08-07 05:27:00
tags:
- a11y
- accessibility
---
I've been working at [CarGurus.com](https://www.cargurus.com) for the last 2 years or so. One of the biggest journeys we've been undertaking is to take accessibility far more seriously. However, with an engineering team well into the triple digits, it gets harder and harder to scale accessibility knowledge.
<!-- more -->
Knowledge gaps aside, CarGurus has a multitude of technologies UIs are built with. The two major ones are [Freemarker](https://freemarker.apache.org/) and [React](https://reactjs.org/). I manage one of our infrastructure teams; we build the tools and technologies the site is created with. This includes our component library, our build systems, linting tools, authentication systems, and core utilities for product development. When we first started really taking accessibility seriously we went to several teams in the business. Many of them did not have anyone with accessibility expertise.
> Our first approach was to teach accessibility. At the same time we worked with our brand marketing team to ensure our color palette would be accessible from the start.
After identifying advocates on every team, we set out to streamline identifying accessibility issues. One approach I decided to take was to show borders around failing elements during development. I first heard of this idea years ago when GitHub released something it called [accessibilityjs](https://github.com/github/accessibilityjs). GitHub included this script in its pages to put a giant ugly red border around failing elements. I thought this was a really slick way to point out issues during development.
> I was going to use accessibility JS until I found axe-core
So [axe](https://www.deque.com/axe/) is a technology built by Deque to identify accessibility issues. It is highly configurable and includes libraries for developers, browser extensions, and bots you can scan sites with. Deque has open sourced the core of axe as a JavaScript library called [axe-core](https://github.com/dequelabs/axe-core).
> I first started out by writing a script to use axe-core and to add a 10px red border around elements, but I quickly ran into trouble
First problem: I needed to re-run axe every time the page changes. If we click to open a nav-bar, we'll need to rescan the page. Second problem: every time we changed the DOM, the script would crash React apps. And finally, axe-core is quite slow on large HTML documents.
## Mutation Observers
So the first problem was easily solvable. The browser has an API called [Mutation Observer](https://developer.mozilla.org/en-US/docs/Web/API/MutationObserver). This API lets you listen for changes to certain elements and fire a function when those elements change. In our case we wanted to listen for any changes to the `<body>` tag and all of its descendants.
```js
function scanForAccessibilityIssues() { /* scan for issues */ }
const observer = new MutationObserver(scanForAccessibilityIssues);
observer.observe(document.querySelector('body'), { childList: true, subtree: true });
```
## Shadow DOM
Several UI frameworks such as React keep an in-memory representation of the HTML document. When you want to change the UI, React diffs its current in-memory DOM with the next DOM and determines the most efficient way to actually apply the changes to the browser. Any application, such as a browser extension or our accessibility detector, that edits the DOM outside of React's in-memory DOM will cause React to freak out and either crash or apply a change in an unexpected way. Luckily, in recent years browsers have added a [Shadow DOM](https://developer.mozilla.org/en-US/docs/Web/Web_Components/Using_shadow_DOM). This is essentially a DOM that is used to apply visual changes for a user, but sits outside the light DOM (the regular DOM). However, not all HTML elements support the shadow DOM. For us to apply the red border we need to use the shadow DOM, and if an element does not support shadow then we have to apply the border to a parent element. I wrote a [recursive function](https://en.wikipedia.org/wiki/Recursion_(computer_science)#Tail-recursive_functions) called `resolveClosestShadowRoot` which walks up the DOM and finds the closest parent of a target element that supports shadow. You can tell if a node supports shadow because it will have an `.attachShadow` method, so we can simply check whether that property is defined.
```js
/**
*
* @param {HTMLElement} node
* @returns
*/
function resolveClosestShadowRoot(node) {
if (!node) {
return null;
}
if (node.attachShadow) {
return node;
}
return resolveClosestShadowRoot(node.parentElement);
}
```
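Since the function only reads `attachShadow` and `parentElement`, you can sanity-check the walk-up logic outside a browser with stub objects. The stubs below are hypothetical stand-ins, not real DOM nodes.

```javascript
// Sanity-check resolveClosestShadowRoot's walk-up logic with stub
// objects that mimic the two properties the function reads.
function resolveClosestShadowRoot(node) {
  if (!node) {
    return null;
  }
  if (node.attachShadow) {
    return node;
  }
  return resolveClosestShadowRoot(node.parentElement);
}

const shadowCapable = { attachShadow: () => ({}), parentElement: null };
const child = { attachShadow: null, parentElement: shadowCapable };
const grandchild = { attachShadow: null, parentElement: child };

console.log(resolveClosestShadowRoot(grandchild) === shadowCapable); // true
console.log(resolveClosestShadowRoot(null)); // null
```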
After we identify which element to style, we just have to apply the border. The code below does that by calling `attachShadow` and setting the shadow root's innerHTML.
```js
const resolvedNode = resolveClosestShadowRoot(node);
const shadowRoot = resolvedNode.attachShadow({ mode: 'open' });
shadowRoot.innerHTML = '<style>:host { outline: red solid 1rem; }</style><slot></slot>';
```
The `<slot></slot>` element renders the content of the light DOM, since we still have to show the existing content, and the `:host` pseudo-class selector targets the host of the shadow DOM.
## Debounce 🎉
In web development we often use what's known as a "debounce" to delay doing something. The simple example: sometimes people click a button multiple times, often by accident, sometimes intentionally. Rather than taking the same action once per click, you might wait a moment until they stop clicking before doing anything. This is where debounce comes into play.
```js
function debounce(fn, wait) {
let timeout = null;
return function (...args) {
const next = () => fn.apply(this, args);
clearTimeout(timeout);
timeout = setTimeout(next, wait);
};
}
```
A debounce function accepts a function and a "wait time", a delay that must elapse before your function is actually executed. To debounce a button's onclick handler you would pass the standard onclick function into the debounce function:
```js
const onclick = () => { };
const debouncedClick = debounce(onclick, 500); // 500 milliseconds before the function is actually fired
```
```html
<button onclick="debouncedClick()" ></button>
```
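To see the collapsing behavior in action, here is a small self-contained check (it repeats the debounce function from above so it runs on its own):

```javascript
// Five rapid "clicks" through a debounced handler collapse into a
// single execution once the wait time has elapsed.
function debounce(fn, wait) {
  let timeout = null;
  return function (...args) {
    const next = () => fn.apply(this, args);
    clearTimeout(timeout);
    timeout = setTimeout(next, wait);
  };
}

let calls = 0;
const debouncedClick = debounce(() => { calls += 1; }, 50);

for (let i = 0; i < 5; i++) {
  debouncedClick(); // simulate rapid clicking
}

setTimeout(() => {
  console.log(calls); // 1
}, 200);
```

The same pattern is what keeps the mutation observer from re-running axe on every single DOM change.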
## The result
So the result of all this is a function that listens for changes to the HTML document, waits 1 second for all the changes to finish applying, then scans the page for failing elements and uses the shadow DOM to apply a red border around those elements. You can see a basic version of the code at [this GitHub Gist](https://gist.github.com/TerribleDev/51049146e00b36b0d8643f5e09d21ea8).
We log the Deque error object to the console which includes links to the failing elements. The result is whenever anyone develops new UI at CarGurus a giant ugly red border surrounds elements they don't write as accessible. This provides **immediate** feedback during the development process and prevents huge categories of accessibility issues from reaching production.
![An example of a failing element](1.jpg)


@@ -3,6 +3,7 @@ permalink: anti-forgery-tokens-in-nancyfx-with-razor
id: 33
updated: '2014-06-11 20:00:34'
date: 2014-06-11 19:34:13
tags:
---
Getting started with anti-forgery tokens in NancyFX with razor views is pretty simple.


@@ -7,7 +7,7 @@ tags:
- docker
---
Here we are, its 2017 dotnet core is out, and finally dotnet has a proper cli. In a previous post [we explored the new cli](http://blog.terribledev.io/Exploring-the-dotnet-cli/). In short you can use the dotnet cli to build, test, package, and publish projects. However sometimes just using the cli is not enough. Sometimes, you land in a place where you have many projects to compile, test, and package.
Here we are, its 2017 dotnet core is out, and finally dotnet has a proper cli. In a previous post [we explored the new cli](/Exploring-the-dotnet-cli/). In short you can use the dotnet cli to build, test, package, and publish projects. However sometimes just using the cli is not enough. Sometimes, you land in a place where you have many projects to compile, test, and package.
<!-- more -->
You sometimes need a more complex tool to help you manage your versions and set the right properties as part of your builds. This is where a tasking system like [gulp](http://gulpjs.com/) can help. Now gulp is not the only task engine; there are Rake, Cake, MSBuild, etc. Plenty to pick from. I personally use gulp a lot because I'm a web developer. I need a JS based system to help me run the [babels](https://babeljs.io) and [webpacks](https://webpack.github.io/docs/) of the world.


@@ -0,0 +1,19 @@
title: Compressing images with tinypng's CLI
date: 2019-01-23 10:50
tags:
- javascript
- tools
---
Ok, so I'm really lazy, and I honestly think that has helped me a lot in this industry. I always try to work smarter, not harder. I take many screenshots for this blog, and I need to optimize them. In case you didn't know, many images are often larger than they need to be, slowing download times. However, I don't ever want to load them into photoshop. Too much time and effort!
<!-- more -->
At first I tried to compress images locally, but it took too long to run through all the images I had. So recently I started using a service called [tiny png](https://tinypng.com/) to compress images. The website seems to indicate that you upload images and get back optimized versions, but to me this takes too much time. I don't want the hassle of zipping my images, uploading them, and downloading the results. Again, lazy!
So I figured out they have a cli in npm. It's easy to install: just use npm to install it globally with `npm install -g tinypng-cli`.
Now you have to call the cli. These are the flags I use: `tinypng . -r -k YourKeyHere`. The period tells tinypng to look in the current directory for images, `-r` tells it to look recursively (essentially through child directories as well), and `-k YourKeyHere` is the key you get by logging in. On the free plan you get 500 compressions a month. Hopefully you will fall into the pit of success like I did!
![an image showing the tiny png results](3.png)


@@ -3,6 +3,7 @@ permalink: fixing-could-not-load-file-or-assembly-microsoft-dnx-host-clr-2
id: 53
updated: '2015-09-09 17:34:41'
date: 2015-09-09 10:08:18
tags:
---
So I recently ran into this error where the latest bits could not load Microsoft.Dnx.Host.Clr here is what I did to fix it.


@@ -8,5 +8,4 @@ I put together [some materials](https://github.com/TerribleDev/intro-to-docker)
<br />
<!--more-->
{% youtube 6EGyhDlr8rs %}
![video](https://www.youtube.com/watch?v=6EGyhDlr8rs)


@@ -15,7 +15,7 @@ Getting Started:
Ok, so the alexa .net sdk is for the full framework only, and its built for webapi. The best way to get going is in visual studio `file -> new project -> ASP.NET Web Application .net framework` A dialog comes up, and I picked `Azure API App`.
![dialog picker](dialog.png)
![dialog picker](dialog.PNG)
Now you have an empty webapi project. We don't need swashbuckle/swagger so lets get rid of that


@@ -0,0 +1,83 @@
title: Rebuilding this blog for performance
date: 2019-01-21 17:56:34
tags:
- performance
- battle of the bulge
- javascript
- dotnet
---
So many people know me as a very performance focused engineer, and as someone that cares about perf I've always been a bit embarrassed about this blog. In actual fact this blog as it sits now is **fast** by most people's standards. I got a new job in July, and well I work with an [absolute mad lad](https://twitter.com/markuskobler) that is making me feel pretty embarrassed with his 900ms page load times. So I've decided to build my own blog engine, and compete against him.
<!-- more -->
## Approach
Ok, so I want a really fast blog, but one that does not sacrifice design. I plan to pre-compute the HTML into memory, but I am not going to serve static files. In this case, I'll need an application server. I'm going to have my own CSS styles, but I'm hoping to be in the (almost) no-JS camp. Not that I dislike JS, but I want to do as much pre-computing as possible, and I don't want to slow the page down with compute in the client.
## Features
This blog has a view to read a post, a home page with links to the last 10 blog posts and a pager to go back further in time, and a page listing posts by tag with a link from each tag to its posts.
## Picking Technologies
So in the past my big philosophy has been that most programming languages and technologies really don't matter for most applications. In fact this use-case *could*, and probably should, be one of them, but when you go to the extremes I go to, you want to look at benchmarks. [Tech empower](https://www.techempower.com/benchmarks/) benchmarks top programming languages and frameworks. Since my blog will mostly be bytes in, bytes out, precomputed, we should look at the plain text benchmark. The top 10 webservers include go, java, rust, c++, and C#. Now I know rust, go and C# pretty well. Since the rust and go webservers listed in the benchmark were mostly things no one really uses, I decided to use dotnet. This is also for a bit of a laugh, because my competition hates dotnet, and I have deep dotnet expertise I can leverage.
## Server-side approach
So as previously mentioned we'll be precomputing blog posts. I plan to compute the posts and hand them down to the views. If we use completely immutable data structures we'll prevent any locking that could slow down our app.
## ASPNET/Dotnet Gotchas
So dotnet is a managed language with a runtime. Microsoft has some [performance best practices](https://docs.microsoft.com/en-us/aspnet/core/performance/performance-best-practices?view=aspnetcore-2.2), but here are some of my thoughts.
* There is a tool called [cross gen](https://github.com/dotnet/coreclr/blob/master/Documentation/building/crossgen.md) which compiles DLLs to native code.
* Dotnet's garbage collector is really good, but it struggles to collect long living objects. Our objects will need to either be ephemeral, or pinned in memory forever.
* The garbage collector struggles with large objects, especially large strings. We'll have to avoid large string allocations when possible.
* Dotnet has reference types such as objects, classes, and strings, while most primitives and structs are value types. [Value types are allocated](/c-strings/) on the stack, which is far cheaper than the heap.
* Exceptions are expensive when thrown in dotnet. I'm going to always avoid hitting them.
* Cache all the things!
In the past we had to pre-compile razor views, but in 2.x of dotnet core that is now built in, so that's one thing I don't have to worry about.
## Client side page architecture and design
So here are my thoughts on the client side of things.
* Minify all the content
* Fingerprint all css/js content and set cache headers to maximum time
* Deliver everything with brotli compression
* Zopfli and gzip for fallbacks
* Always use `Woff2` for fonts
* Avoid expensive css selectors
* `:nth-child`
* `fixed`
* partial matching `[class^="wrap"]`
* Use HTTP/2 for **all requests**
* Images
* Use SVG's when possible
* Recompile all images in the build to `jpeg 2000, jpeg xr, and webp`
* Serve `jpeg 2000` to ios
* `jpeg XR` to ie11 and edge
* Send `webp` to everyone else
* PWA
* Use a service worker to cache assets
* Also use a service worker to prefetch blog posts
* Offline support
* CDN
* Use Cloudflare to deliver assets faster
* Cloudflare's argo improves geo-routing and latency issues
* Throw any expected 301's inside cloudflares own datacenters with workers
## Tools
These are the list of tools I'm using to measure performance.
* `lighthouse` - Built into chrome (it's in the audit tab in the devtools), this displays a lot of performance and PWA improvements.
* [Web Hint](https://webhint.io/) is like a linter for your web pages. The tool provides a ton of improvements from accessibility to performance
* I really like [pingdom's](https://tools.pingdom.com/) page load time tool.
* Good ol' [web page test is also great](https://www.webpagetest.org/)
* The chrome devtools can also give you a breakdown as to what unused css you have on the page


@@ -16,8 +16,7 @@ Today marks the release of Visual Studio 2017, and with it the final release of
So I bet you are wondering, how is VS2017 improved. When you first boot the vs2017 installer you are immediately hit with a very sleek UI for the installer. The installer actually has reasonable install sizes for scenarios like nodejs only.
{% image "fancybox" vs.PNG "vs 2017 installer" %}
![vs 2017 installer](vs.PNG)
VS2017 can understand which lines of code are linked to your unit tests. As you alter or refactor code, VS can run the tests. This allows the editor to show checkmarks or red `x`'s. This is huge as it can seemingly provide constant feedback to developers during development.


@@ -0,0 +1,49 @@
title: Must have vscode plugins for front-end devs
date: 2019-02-06
tags:
- visual studio
- javascript
- css
- front-end
---
I've had a lot of people ask me about my choice of editors, and plugins. A while back I switched to vscode for all my programming work, for both front and back end. In the past I've blogged about [the best plugins for visual studio](/VS-2017-best-extensions-on-launch/) as a backend dev, but I thought I'd give you a more front-end angle
<!-- more -->
## Document this
My first one, and in my opinion the most underrated, is [document this](https://marketplace.visualstudio.com/items?itemName=joelday.docthis). If you have ever had to write [jsdoc](http://usejsdoc.org/) comments you know how tedious it gets, and if you haven't, trust me, you should. VSCode and most other editors can read [jsdoc](http://usejsdoc.org/) comments above functions and class declarations to improve intellisense and type completion. Simply put your cursor over a function, invoke document this, and you will quickly be given jsdoc comments for your code.
![Animated gif showing off document this](document-this.gif)
## Import Cost
Another extension I find vital to my every day is [import cost](https://marketplace.visualstudio.com/items?itemName=wix.vscode-import-cost). This is a package that leaves you little notes next to any import you have as to how big it will be. It will even highlight the size text in red for large imports, which you can configure. What I love about this package is it tells me if the package I'm about to use is going to be very expensive size-wise. That way I find out long before I commit the code and my pages get slow.
![a static image showing off import cost](import-cost.png)
## ESlint and Prettier
Hopefully both of these will not be new to you. ESLint is a linting tool that looks for potential errors in your code. Prettier is an opinionated style enforcer for your code. The [eslint](https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint) and [prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode) extensions for vscode can automatically show you problems in your code as you type, and can even fix your code on save. What I love about both of these tools, is together they make a great force for improving your code base. Prettier eliminates many debates over code style between team members, and eslint prevents you from shipping many bugs to production. These extensions can call out problems as you type, which decreases the feedback loops, and increases your productivity.
## Filesize
As a web developer I spend a lot of my time looking at file size. Right now file sizes are ever inflating, and are causing pain for bandwidth constrained devices. I often download bundles and inspect their compiled source, or just have to look at how big a file is on the filesystem. A big tool in my belt is [filesize](https://marketplace.visualstudio.com/items?itemName=mkxml.vscode-filesize). This is a crazy simple extension, but one that brings me joy every day. The premise is simple: print the size of the current file in the status bar at the bottom. Click on it, and you get a nice output of what it's like gzipped, plus the mime type. Dirt simple, but it saves me a ton of time every day!
![a picture of the filesize plugin in action](filesize2.jpg)
## Runner ups
Here is a list of additional extensions I certainly couldn't live without
* [path intellisense](https://marketplace.visualstudio.com/items?itemName=christian-kohler.path-intellisense) - autocomplete file paths in various files (including html)
* [npm intellisense](https://marketplace.visualstudio.com/items?itemName=christian-kohler.npm-intellisense) - autocomplete npm packages in imports
* [html 5 boilerplate](https://marketplace.visualstudio.com/items?itemName=sidthesloth.html5-boilerplate) - dirt simple html boilerplate snippets
* [icon fonts](https://marketplace.visualstudio.com/items?itemName=idleberg.icon-fonts) - Autocomplete for various icon fonts such as font awesome
* [git lens](https://marketplace.visualstudio.com/items?itemName=eamodio.gitlens) - Show git history inline, along with other information from git


@@ -1,21 +1,24 @@
title: The battle of the buldge. Visualizing your javascript bundle
title: The battle of the bulge. Visualizing your javascript bundle
date: 2018-10-17 13:19:18
tags:
- javascript
- battle of the bulge
- performance
---
So in case you haven't been following me: I joined CarGurus in July. At CarGurus we're currently working on our mobile web experience, written in React, Redux and reselect. As our implementation grew, so did our time to first paint.
<!-- more -->
So I've been spending a lot of time working on our performance. One tool I have found invaluable in the quest for page perf mecca is [source-map-explorer](https://www.npmjs.com/package/source-map-explorer). This is a tool that dives into a bundled file, and its map. Then visualizes the bundle in a tree view. This view lets you easily understand exactly what is taking up space in the bundle. What I love about this tool is that it works with any type of bundled javascript file, and is completely seperate of the build. So any bugs in webpack where you have duplicate files in a bundle will appear here.
So I've been spending a lot of time working on our performance. One tool I have found invaluable in the quest for page perf mecca is [source-map-explorer](https://www.npmjs.com/package/source-map-explorer). This tool dives into a bundled file and its source map, then visualizes the bundle in a tree view. This view lets you easily understand exactly what is taking up space in the bundle. What I love about this tool is that it works with any type of bundled JavaScript file, and is completely decoupled from your build. So any bugs in your webpack config leading to duplicate files in a bundle will show up here.
## Getting started
You get started by `npm install -g source-map-explorer` then just download your bundles, and sourcemaps. In the command line run `source-map-explorer ./yourbundle.js ./yourbundlemap.js` Your browser should then open with a great tree view of what is inside your bundle. From here you can look to see what dependencies you have, and their sizes. Obviously, you can then decide to keep or throw them away.
You get started with `npm install -g source-map-explorer`, then just download your bundles and sourcemaps. You can grab these from production if you have them; otherwise, build the bundles locally. **Note:** you should always run this on minified code where any tree shaking and dead code elimination has occurred. From the command line, run `source-map-explorer ./yourbundle.js ./yourbundle.js.map`. Your browser should then open with a great tree view of what is inside your bundle. From here you can see what dependencies you have and their sizes. Obviously, you can then decide to keep them or throw them away.
![an example visualization](1.png)
Here is a great youtube video explaining it in detail!
{% youtube 7aY9BoMEpG8 %}
![video](https://www.youtube.com/watch?v=7aY9BoMEpG8)

View File

@@ -0,0 +1,94 @@
title: 'Measuring, Visualizing and Debugging your React Redux Reselect performance bottlenecks'
date: 2019-01-14 22:04:56
tags:
- battle of the bulge
- javascript
- performance
---
In the battle of performance one tool constantly reigns supreme: the all-powerful profiler! In JavaScript land Chrome has a pretty awesome profiler, but every time I looked into our React perf issues I was hit by a slow function called `anonymous function`
<!-- more -->
## Using the chrome profiler
So if you open the Chrome devtools, you will see a tab called `Performance`. Click on that tab. If you are looking into CPU-bound workloads, click the CPU dropdown and pick 6x slowdown, which emulates a much slower device.
![An image showing the chrome devtools](1.png)
Press the record button, click around on your page, then click the record button again. You now get a timeline of your app and which scripts ran during that time.
So what I personally like to do is find the orange bars that often make up the bulk of the time. However, I've often noticed the bulk of bigger Redux apps' time is taken up by `anonymous functions`, functions that essentially have no name. They often look like this: `() => {}`. This is largely because they are inside [reselect selectors](https://github.com/reduxjs/reselect). In case you are unfamiliar, selectors are functions that cache computations off the Redux store. Back to the Chrome profiler. One thing you can do is use the `window.performance` API to measure and record performance metrics in the browser. If you expand the `User Timing` section in the Chrome profiler, you may find that React in dev mode has included some visualizations of how long components take to render.
![react user timings in chrome](3.png)
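Those visualizations come from the User Timing API. A minimal sketch of what it looks like to use it yourself (the names `work-start`, `work-end`, and `work` are just illustrative):

```javascript
// Mark two points, measure between them, and the entry shows up in
// the User Timing lane of the Performance tab.
performance.mark('work-start');
let total = 0;
for (let i = 0; i < 1e6; i += 1) total += i; // stand-in for real work
performance.mark('work-end');
performance.measure('work', 'work-start', 'work-end');

const [entry] = performance.getEntriesByName('work');
console.log(entry.duration); // milliseconds between the two marks
```

Anything you measure this way gets a labeled bar in the profiler, which is exactly how we'll name those anonymous functions below.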
## Adding your own visualizations
So digging into other blog posts, I found posts showing how to [visualize your redux actions](https://medium.com/@vcarl/performance-profiling-a-redux-app-c85e67bf84ae) using the same performance API mechanisms React uses. That blog post uses Redux middleware to add timings to actions. This narrowed down our performance problems, but did not point out the exact selector that was slow. Clearly we had an action that was triggering an expensive state update, but the time was still spent in `anonymous function`. That's when I had the idea to wrap reselect selector functions in a function that appends the timings. [This gist is what I came up with](https://gist.github.com/TerribleDev/db48b2c8e143f9364292161346877f93)
```js
import { createSelector } from 'reselect';

const hasPerformanceApi =
  window &&
  window.performance &&
  window.performance.measure &&
  window.performance.mark;

const createFuncWithMark = (name, callback) => (...args) => {
  const startMark = `${name}-Startmark`;
  const endMark = `${name}-EndMark`;
  const measureName = '♻️ ' + `${name}-Selector`;
  window.performance.mark(startMark);
  const result = callback(...args);
  window.performance.mark(endMark);
  window.performance.measure(measureName, startMark, endMark);
  // Clean up so the entries don't accumulate between runs.
  window.performance.clearMarks(startMark);
  window.performance.clearMarks(endMark);
  window.performance.clearMeasures(measureName);
  return result;
};

export const createMarkedSelector = (name, ...args) => {
  if (!hasPerformanceApi) {
    return createSelector(...args);
  }
  if (!name || typeof name !== 'string') {
    throw new Error('marked selectors must have names');
  }
  const callback = args.pop();
  const funcWithMark = createFuncWithMark(name, callback);
  args.push(funcWithMark);
  return createSelector(...args);
};
```
So how does this work exactly? Well, it wraps the function you pass to reselect and writes marks to the performance timeline, telling you how long your selectors take to run. Combined with the previously mentioned blog post, you can now get selector timings in Chrome's Performance tool! You can also combine this with the [redux middleware](https://medium.com/@vcarl/performance-profiling-a-redux-app-c85e67bf84ae) I previously mentioned to get deeper insight into how your app is performing
![a preview of selectors reporting their performance](2.png)
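For completeness, here is a hedged sketch of what that action-timing middleware looks like. This is my own illustration of the idea, not the linked post's exact code:

```javascript
// Redux middleware that brackets every dispatched action with
// performance marks, so action costs show up in the User Timing
// lane of the profiler alongside the selector timings.
const actionTimingMiddleware = () => (next) => (action) => {
  performance.mark(`${action.type}-start`);
  const result = next(action);
  performance.mark(`${action.type}-end`);
  performance.measure(
    `action ${action.type}`,
    `${action.type}-start`,
    `${action.type}-end`
  );
  return result;
};

// Standalone demonstration; normally you'd pass this to applyMiddleware.
const dispatched = actionTimingMiddleware()((a) => a)({ type: 'PING' });
```

With both in place, a slow action and the selectors it triggers line up in the same timeline, which is what finally named our `anonymous function`.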
## So how do I use your gist?
You can copy the code into a file of your own. If you use reselect you probably have code that looks like the following.
```js
export const computeSomething = createSelector([getState], (state) => { /* compute projection */ });
```
You just need to replace the above with the following
```js
export const computeSomething = createMarkedSelector('computeSomething', [getState], (state) => { /* compute projection */ });
```
It's pretty simple: it just requires you to pass a string as the first argument. That string will be the name used to write to the performance API, and it will show up in the Chrome profiler. Inside vscode you can even do a regex find-and-replace to add this string.
```
find: const(\s?)(\w*)(\s?)=(\s)createSelector\(
replace: const$1$2$3=$4createMarkedSelector('$2',
```

View File

@@ -43,7 +43,7 @@ Essentially I add the routing package to the container, and then have the a
foreach(var route in Routes.RoutesDictionary)
{
a.MapGet("docker101", handler: async b=>{
b.Response.Redirect("https://blog.terribledev.io/Getting-started-with-docker-containers/", true);
b.Response.Redirect("https://blog.terrible.dev/Getting-started-with-docker-containers/", true);
});
}
});

View File

@@ -1,49 +1,66 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.HttpsPolicy;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Rewrite;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.FileProviders;
using Microsoft.Net.Http.Headers;
using HardHat.Middlewares;
using HardHat;
using TerribleDev.Blog.Web.Models;
using TerribleDev.Blog.Web.Factories;
using Microsoft.Extensions.Hosting;
namespace TerribleDev.Blog.Web
{
public class Startup
{
public Startup(IConfiguration configuration)
public Startup(IConfiguration configuration, IWebHostEnvironment env)
{
Configuration = configuration;
Env = env;
}
public IConfiguration Configuration { get; }
public IWebHostEnvironment Env { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
Func<BlogConfiguration> getBlog = () => Configuration.GetSection("Blog").Get<BlogConfiguration>();
if (Env.IsDevelopment())
{
services.AddTransient(a => getBlog());
}
else
{
services.AddSingleton(getBlog());
}
services.AddSingleton((i) => {
var posts = new BlogFactory().GetAllPosts(Env.IsDevelopment() ? "https://localhost:5001": "https://blog.terrible.dev");
return BlogCacheFactory.ProjectPostCache(posts);
});
services.AddApplicationInsightsTelemetry();
var controllerBuilder = services.AddControllersWithViews();
#if DEBUG
if (Env.IsDevelopment())
{
controllerBuilder.AddRazorRuntimeCompilation();
}
#endif
services.AddResponseCompression(a =>
{
a.EnableForHttps = true;
})
.AddMemoryCache()
.AddMvcCore()
.AddCacheTagHelper()
.AddRazorViewEngine()
.SetCompatibilityVersion(CompatibilityVersion.Version_2_2);
services.AddOutputCaching();
.AddOutputCaching();
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
if (env.IsDevelopment())
{
@@ -52,38 +69,57 @@ namespace TerribleDev.Blog.Web
else
{
app.UseExceptionHandler("/Error");
// The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
app.UseHsts(TimeSpan.FromDays(30), false, preload: true);
}
app.UseIENoOpen();
app.UseNoMimeSniff();
app.UseCrossSiteScriptingFilters();
app.UseFrameGuard(new FrameGuardOptions(FrameGuardOptions.FrameGuard.SAMEORIGIN));
app.UseHttpsRedirection();
app.UseResponseCompression();
var cacheTime = env.IsDevelopment() ? 0 : 31536000;
app.UseStaticFiles(new StaticFileOptions
{
OnPrepareResponse = ctx =>
{
ctx.Context.Response.Headers[HeaderNames.CacheControl] =
"public,max-age=" + cacheTime;
}
OnPrepareResponse = ctx =>
{
ctx.Context.Response.Headers[HeaderNames.CacheControl] =
"public,max-age=" + cacheTime;
}
});
app.UseStaticFiles(new StaticFileOptions
{
FileProvider = new PhysicalFileProvider(Path.Combine(Directory.GetCurrentDirectory(), "wwwroot", "img")),
OnPrepareResponse = ctx =>
{
ctx.Context.Response.Headers[HeaderNames.CacheControl] =
"public,max-age=" + cacheTime;
}
OnPrepareResponse = ctx =>
{
ctx.Context.Response.Headers[HeaderNames.CacheControl] =
"public,max-age=" + cacheTime;
}
});
app.UseRewriter(new Microsoft.AspNetCore.Rewrite.RewriteOptions().AddRedirect("(.*[^/|.xml|.html])$", "$1/", 301));
app.UseIENoOpen();
app.UseNoMimeSniff();
app.UseCrossSiteScriptingFilters();
app.UseFrameGuard(new FrameGuardOptions(FrameGuardOptions.FrameGuard.SAMEORIGIN));
app.UseHsts(TimeSpan.FromDays(365), false, preload: true);
app.UseContentSecurityPolicy(
new ContentSecurityPolicy()
{
// DefaultSrc = new HashSet<string>() {
// CSPConstants.Self, "https://www.google-analytics.com", "https://www.googletagmanager.com", "https://stats.g.doubleclick.net"
// },
// ScriptSrc = new HashSet<string>()
// {
// CSPConstants.Self, CSPConstants.UnsafeInline, "https://www.google-analytics.com", "https://www.googletagmanager.com", "https://stats.g.doubleclick.net"
// },
// StyleSrc = new HashSet<string>()
// {
// CSPConstants.Self, CSPConstants.UnsafeInline
// },
UpgradeInsecureRequests = true
});
app.UseOutputCaching();
app.UseMvc();
app.UseRouting();
app.UseEndpoints(endpoints =>
{
endpoints.MapControllers();
});
}
}
}

View File

@@ -0,0 +1,73 @@
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Razor.TagHelpers;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.FileProviders;
using Microsoft.Extensions.Hosting;
using System.IO;
using System.Threading.Tasks;
namespace TerribleDev.Blog.Web.Taghelpers
{
[HtmlTargetElement("inline-style")]
public class InlineStyleTagHelper : TagHelper
{
[HtmlAttributeName("href")]
public string Href { get; set; }
private IWebHostEnvironment HostingEnvironment { get; }
private IMemoryCache Cache { get; }
public InlineStyleTagHelper(IWebHostEnvironment hostingEnvironment, IMemoryCache cache)
{
HostingEnvironment = hostingEnvironment;
Cache = cache;
}
public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
{
var path = Href;
// Get the value from the cache, or compute the value and add it to the cache
var fileContent = await Cache.GetOrCreateAsync("InlineStyleTagHelper-" + path, async entry =>
{
var fileProvider = HostingEnvironment.WebRootFileProvider;
if(HostingEnvironment.IsDevelopment())
{
var changeToken = fileProvider.Watch(path);
entry.AddExpirationToken(changeToken);
}
entry.SetPriority(CacheItemPriority.NeverRemove);
var file = fileProvider.GetFileInfo(path);
if (file == null || !file.Exists)
return null;
return await ReadFileContent(file);
});
if (fileContent == null)
{
output.SuppressOutput();
return;
}
output.TagName = "style";
output.Attributes.RemoveAll("href");
output.Content.AppendHtml(fileContent);
}
private static async Task<string> ReadFileContent(IFileInfo file)
{
using (var stream = file.CreateReadStream())
using (var textReader = new StreamReader(stream))
{
return await textReader.ReadToEndAsync();
}
}
}
}

View File

@@ -1,10 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>netcoreapp2.2</TargetFramework>
<TargetFramework>netcoreapp3.1</TargetFramework>
<AspNetCoreHostingModel>InProcess</AspNetCoreHostingModel>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
<UserSecretsId>9a1f51b6-f4d9-4df7-a0af-e345176e9927</UserSecretsId>
<ApplicationInsightsResourceId>/subscriptions/088a81c7-d703-41c9-a1d0-476bce11df60/resourcegroups/WebResourceGroup/providers/microsoft.insights/components/tparnellblognew</ApplicationInsightsResourceId>
<ApplicationInsightsAnnotationResourceId>/subscriptions/088a81c7-d703-41c9-a1d0-476bce11df60/resourcegroups/WebResourceGroup/providers/microsoft.insights/components/tparnellblognew</ApplicationInsightsAnnotationResourceId>
</PropertyGroup>
<ItemGroup>
@@ -21,19 +23,28 @@
<ItemGroup>
<PackageReference Include="BuildBundlerMinifier" Version="2.8.391" />
<PackageReference Include="Markdig" Version="0.15.7" />
<PackageReference Include="Microsoft.AspNetCore.App" />
<PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.2.0" PrivateAssets="All" />
<PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.8.2" />
<PackageReference Include="Microsoft.VisualStudio.Azure.Containers.Tools.Targets" Version="1.0.2105168" />
<PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="2.2.0" />
<PackageReference Include="UriBuilder.Fluent" Version="1.5.2" />
<PackageReference Include="YamlDotNet" Version="5.3.0" />
<PackageReference Include="HardHat" Version="2.0.0" />
<PackageReference Include="HardHat" Version="2.1.1" />
<PackageReference Include="Microsoft.SyndicationFeed.ReaderWriter" Version="1.0.2" />
<PackageReference Include="WebEssentials.AspNetCore.OutputCaching" Version="1.0.16" />
<PackageReference Include="Microsoft.AspNetCore.Mvc.Razor.RuntimeCompilation" Version="3.1.0" Condition="'$(Configuration)' == 'Debug'" />
</ItemGroup>
<ItemGroup>
<Content Include="Posts\*.md" CopyToOutputDirectory="Always" />
<Watch Include="Posts\*.md" />
</ItemGroup>
<ItemGroup>
<WCFMetadata Include="Connected Services" />
</ItemGroup>
<ItemGroup>
<Folder Include="BackgroundWorker\" />
</ItemGroup>
</Project>

View File

@@ -1,4 +1,4 @@
@inject Microsoft.AspNetCore.Hosting.IHostingEnvironment env
@inject Microsoft.AspNetCore.Hosting.IWebHostEnvironment env
@{
ViewData["Title"] = "Debug";
}

View File

@@ -1,7 +1,6 @@
@{
ViewData["Title"] = "FourOhFour";
ViewData["DisableHeader"] = true;
}
<h1>Ruh Oh!</h1>

View File

@@ -2,7 +2,6 @@
@{
ViewData["Title"] = "Home Page";
ViewData["DisableHeader"] = true;
}
<cache vary-by-route="pageNumber">
@@ -10,14 +9,17 @@
{
<partial name="PostSummary" model="post" />
}
@if (Model.HasNext)
{
<div class="bottomNavButtons">
<a href="/page/@(Model.Page - 1)/" class="btn">&#8592; Previous Page</a>
@if (Model.HasPrevious)
{
<a href="/page/@(Model.Page - 1)/" class="btn">&#8592; Previous Page</a>
}
<div class="spacer"></div>
<a href="/page/@(Model.Page + 1)/" class="btn">Next Page &#8594;</a>
@if (Model.HasNext)
{
<a href="/page/@(Model.Page + 1)/" class="btn">Next Page &#8594;</a>
}
</div>
}
</cache>
@section Head {

View File

@@ -1,7 +1,7 @@
@model IPost
@inject BlogConfiguration config
@model IPost
@{
ViewData["Title"] = "Post";
ViewData["HideNav"] = true;
ViewData["Title"] = @Model.Title;
}
<cache vary-by-route="postUrl">
@@ -9,17 +9,26 @@
</cache>
@section Head {
<meta name="description" content="@Model.SummaryPlain" />
<meta name="description" content="@Model.SummaryPlainShort" />
<meta property="og:type" content="blog">
<meta property="og:title" content="@Model.Title">
<meta property="og:url" content="https://blog.terribledev.io/Visualizing-your-react-redux-performance-bottlenecks/index.html">
<meta property="og:site_name" content="The Ramblings of TerribleDev">
<meta property="og:description" content="@Model.SummaryPlain">
<meta property="og:updated_time" content="2019-01-20T15:07:51.000Z">
<meta property="og:url" content="@Model.CanonicalUrl">
<meta property="og:site_name" content="@config.Title">
<meta property="og:description" content="@Model.SummaryPlainShort">
<meta property="og:updated_time" content="@Model.PublishDate.ToString("O")">
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="@Model.Title">
<meta name="twitter:description" content="@Model.SummaryPlain">
<meta name="twitter:image" content="https://blog.terribledev.io/1.png">
<meta name="twitter:description" content="@Model.SummaryPlainShort">
<meta name="twitter:site" content="@@TerribleDev">
<meta name="twitter:creator" content="@@TerribleDev">
<link rel="canonical" href="@Model.CanonicalUrl" />
@foreach(var image in Model.Images.Take(6))
{
<meta property="og:image" content="@image">
}
@if(Model.Images.Count > 0)
{
<meta name="twitter:image" content="@(Model.Images[0])">
}
<meta property="og:image" content="https://www.gravatar.com/avatar/333e3cea32cd17ff2007d131df336061?s=640" />
}

View File

@@ -10,7 +10,7 @@
<span>Tagged In:</span><br />
@foreach (var tag in Model.tags)
{
<a href="/tag/@tag" class="btn block">@tag</a>
<a href="/tag/@tag/" class="btn block">@tag</a>
}
</div>
}

View File

@@ -5,11 +5,12 @@
<link rel="preconnect" href="https://www.google-analytics.com">
<link rel="preconnect" href="https://stats.g.doubleclick.net">
<link rel="preconnect" href="https://www.googletagmanager.com">
<link rel="preconnect" href="https://az416426.vo.msecnd.net" />
<link rel="preconnect" href="https://dc.services.visualstudio.com" />
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-48128396-1"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'UA-48128396-1');
</script>
window.dataLayer = window.dataLayer || [];
function gtag() { dataLayer.push(arguments); }
gtag('js', new Date());
gtag('config', 'UA-48128396-1');
</script>

View File

@@ -1,23 +1,18 @@
@{
var hideNav = ViewData["HideNav"] != null ? "hide" : "";
}
<nav class="navBar @hideNav">
@if (ViewData["HideNav"] != null)
{
<img src="" data-src="~/content/tommyAvatar3.jpg" class="lazy round" />
}
else
{
<img src="~/content/tommyAvatar3.jpg" class="round" />
}
<span>Tommy "Terrible Dev" Parnell</span>
<ul class="sidebarBtns">
<li><a href="/" class="link-unstyled">Home</a></li>
<li><a href="/all-tags" class="link-unstyled">Tags</a></li>
<li><a href="/about" class="link-unstyled">About</a></li>
<li><a href="https://github.com/terribledev" target="_blank" class="link-unstyled">Github</a></li>
<li><a href="https://twitter.com/terribledev" target="_blank" class="link-unstyled">Twitter</a></li>
<li><a href="mailto:tommy@terribledev.io" class="link-unstyled">Email</a></li>
</ul>
</nav>
<nav class="navBar hide" id="navBar">
<div class="navContent">
<picture class="navHero">
<source srcset="" type="image/webp" alt="An image of TerribleDev" data-src="/content/tommyAvatar4.jpg.webp" class="lazy round" />
<img src="" alt="An image of TerribleDev" data-src="/content/tommyAvatar4.jpg" class="lazy round" />
</picture>
<span>Tommy "Terrible Dev" Parnell</span>
<ul class="sidebarBtns">
<li><a href="/" class="link-unstyled">Home</a></li>
<li><a href="/all-tags" class="link-unstyled">Tags</a></li>
<li><a href="/rss.xml" class="link-unstyled">RSS Feed</a></li>
<li><a href="https://github.com/terribledev" rel="noopener" target="_blank" class="link-unstyled">Github</a></li>
<li><a href="https://twitter.com/terribledev" rel="noopener" target="_blank" class="link-unstyled">Twitter</a></li>
<li><a href="mailto:tommy@terribledev.io" class="link-unstyled">Email</a></li>
<li><span class="link-unstyled" id="closeNav">Close Navbar</span></li>
</ul>
</div>
</nav>

View File

@@ -1,10 +1,10 @@
@model IPost
<article class="btmRule">
<h3 itemprop="headline" class="headline"><a href="/@Model.Url" class="link-unstyled">@Model.Title</a></h3>
<h3 itemprop="headline" class="headline"><a href="@Model.RelativeUrl" class="link-unstyled">@Model.Title</a></h3>
<time class="headlineSubtext" itemprop="datePublished" content="@Model.PublishDate.ToString()">@Model.PublishDate.ToString("D")</time>
<div itemprop="articleBody">
@Model.Summary
</div>
<a href="/@Model.Url">Continue Reading </a>
</article>
<a href="@Model.RelativeUrl">Continue Reading </a>
</article>

View File

@@ -1,11 +1,12 @@
<meta name="description" content="My name is Tommy Parnell. I usually go by TerribleDev on the internets. These are just some of my writings and rants about the software space." />
@inject BlogConfiguration config
<meta name="description" content="My name is Tommy Parnell. I usually go by TerribleDev on the internets. These are just some of my writings and rants about the software space." />
<meta property="og:type" content="blog">
<meta property="og:title" content="The Ramblings of TerribleDev">
<meta property="og:url" content="https://blog.terribledev.io/index.html">
<meta property="og:site_name" content="The Ramblings of TerribleDev">
<meta property="og:title" content="@config.Title">
<meta property="og:url" content="https://blog.terrible.dev/">
<meta property="og:site_name" content="@config.Title">
<meta property="og:description" content="My name is Tommy Parnell. I usually go by TerribleDev on the internets. These are just some of my writings and rants about the software space.">
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="The Ramblings of TerribleDev">
<meta name="twitter:title" content="@config.Title">
<meta name="twitter:description" content="My name is Tommy Parnell. I usually go by TerribleDev on the internets. These are just some of my writings and rants about the software space.">
<meta name="twitter:creator" content="@@TerribleDev">
<meta property="og:image" content="https://www.gravatar.com/avatar/333e3cea32cd17ff2007d131df336061?s=640" />

View File

@@ -1,60 +1,57 @@
<!DOCTYPE html>
@inject BlogConfiguration config
<!DOCTYPE html>
<html lang="en">
<head>
<partial name="Gtm" />
<meta charset="utf-8" />
<meta http-equiv="Content-Type" content="text/html" charset="UTF-8" />
<environment names="Production">
<partial name="Gtm" />
</environment>
<meta name="author" content="Tommy &quot;TerribleDev&quot; Parnell" />
<meta name="theme-color" content="#4A4A4A" />
<link rel="alternate" type="application/atom+xml" title="RSS" href="/rss.xml" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<link rel="alternate" type="application/atom+xml" title="RSS" href="/rss.xml">
<link rel="manifest" href="~/manifest.json" asp-append-version="true">
<link asp-append-version="true" rel="icon" href="~/favicon.ico" />
<title>@ViewData["Title"] - The Ramblings of TerribleDev</title>
<link rel="alternate" type="application/atom+xml" async title="RSS" href="/rss.xml">
<link rel="manifest" href="~/manifest.json" async asp-append-version="true">
<link asp-append-version="true" rel="icon" async href="~/favicon.ico" />
<title>@ViewData["Title"] - @config.Title</title>
<environment names="Development">
<link asp-append-version="true" rel="stylesheet" href="~/css/site.css" />
</environment>
<environment names="Production">
<link asp-append-version="true" rel="stylesheet" href="~/css/site.min.css" />
</environment>
<environment names="Development">
<link asp-append-version="true" media="screen and (min-width: 769px)" rel="stylesheet" href="~/css/site.desktop.css" />
</environment>
<environment names="Production">
<link asp-append-version="true" media="screen and (min-width: 769px)" rel="stylesheet" href="~/css/site.desktop.min.css" />
</environment>
<environment names="Development">
<link asp-append-version="true" media="screen and (max-width: 768px)" rel="stylesheet" href="~/css/site.mobile.css" />
</environment>
<environment names="Production">
<link asp-append-version="true" media="screen and (max-width: 768px)" rel="stylesheet" href="~/css/site.mobile.min.css" />
</environment>
<environment names="Development">
<link asp-append-version="true" rel="preload" as="script" href="~/js/swi.js" />
</environment>
<environment names="Production">
<link asp-append-version="true" rel="preload" as="script" href="~/js/site.min.js" />
</environment>
@RenderSection("Head", false)
</head>
<body>
<partial name="Nav" />
@if (ViewData["DisableHeader"] == null)
{
<header class="header">
<div><a href="/" class="link-unstyled">The Ramblings of TerribleDev</a></div>
</header>
}
@{
var bodyBump = ViewData["HideNav"] == null ? "bodyWithNav": "";
var headerBump = ViewData["DisableHeader"] == null ? "headerBump" : "";
}
<main role="main" class="@bodyBump @headerBump">
<div class="main-content-wrap">
@RenderBody()
</div>
</main>
@*@if (ViewData["DisableHeader"] != null)
{
<main role="main" class="bodyWithNav">
<div class="main-content-wrap">
@RenderBody()
</div>
</main>
}
else
{
<header class="header">
<div><a href="/" class="link-unstyled">The Ramblings of TerribleDev</a></div>
</header>
<main role="main" class="main-content-wrap headerBump">
@RenderBody()
</main>
}*@
<div class="rootbox">
<header class="header">
<svg aria-label="Open Menu" id="menuBtn" role="button" xmlns="http://www.w3.org/2000/svg" width="32" height="32"><path d="M4 10h24c1.104 0 2-.896 2-2s-.896-2-2-2H4c-1.104 0-2 .896-2 2s.896 2 2 2zm24 4H4c-1.104 0-2 .896-2 2s.896 2 2 2h24c1.104 0 2-.896 2-2s-.896-2-2-2zm0 8H4c-1.104 0-2 .896-2 2s.896 2 2 2h24c1.104 0 2-.896 2-2s-.896-2-2-2z" /></svg>
<div class="headerCallout"><a href="/" class="link-unstyled ">@config.Title</a></div>
</header>
<partial name="Nav" />
<main class="headerBump main-content-wrap">
@RenderBody()
</main>
</div>
</div>
@RenderSection("Scripts", required: false)
<environment names="Development">
<script asp-append-version="true" src="~/js/swi.js" async></script>

View File

@@ -1,14 +1,15 @@
@model Dictionary<string, List<IPost>>
@model IDictionary<string, IList<IPost>>
@{
ViewData["Title"] = "all-tags";
ViewData["DisableHeader"] = true;
}
<h2>All Tags</h2>
<cache>
@foreach (var tag in Model.Keys)
{
<a href="/tag/@tag/" class="btn block">@tag</a>
}
</cache>
@section Head {
<partial name="StockMeta" />
}
<link rel="canonical" href="https://blog.terrible.dev/all-tags/" />
}

View File

@@ -2,9 +2,16 @@
@model GetTagViewModel
@{
ViewData["Tag:" + Model.Tag] = "GetTag";
ViewData["DisableHeader"] = true;
}
<cache vary-by-route="tagName">
@foreach (var post in Model.Posts)
{
<partial name="PostSummary" model="post" />
}
</cache>
@section Head {
@if(!String.IsNullOrEmpty(Model.CanonicalUrl)) {
<link rel="canonical" href="@Model.CanonicalUrl" />
}
}

View File

@@ -0,0 +1,5 @@
{
"ApplicationInsights": {
"InstrumentationKey": "974b47d2-1f08-42df-b498-bbfda7425f0b"
}
}

View File

@@ -9,5 +9,9 @@
}
}
},
"AllowedHosts": "*"
}
"AllowedHosts": "*",
"Blog": {
"title": "The Ramblings of TerribleDev",
"link": "https://blog.terrible.dev"
}
}

View File

@@ -5,6 +5,18 @@
"wwwroot/css/site.css"
]
},
{
"outputFileName": "wwwroot/css/site.desktop.min.css",
"inputFiles": [
"wwwroot/css/site.desktop.css"
]
},
{
"outputFileName": "wwwroot/css/site.mobile.min.css",
"inputFiles": [
"wwwroot/css/site.mobile.css"
]
},
{
"outputFileName": "wwwroot/js/site.min.js",
"inputFiles": [
