My Blog of Various Ramblings
Multilayer Caching in .NET
Caching is a powerful tool in a programmer's toolbox, but it isn't magic. It can help an application scale to a vast number of users, or it can be the very thing dragging your application down. Layered caching is a technique of stacking different types of cache on top of each other, each playing to its own strengths.
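The idea can be sketched in a few lines. This is a minimal, language-agnostic illustration (in Python rather than .NET) with hypothetical `MemoryLayer` and `LayeredCache` names: a fast in-memory layer sits in front of slower layers, and a hit in a slower layer back-fills the faster ones above it.

```python
from time import monotonic

class MemoryLayer:
    """In-memory layer: fastest, but per-process and volatile."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if monotonic() >= expires:
            del self._store[key]  # entry expired, evict it
            return None
        return value

    def set(self, key, value, ttl):
        self._store[key] = (value, monotonic() + ttl)

class LayeredCache:
    """Checks layers in order; on a hit, back-fills the faster layers above."""
    def __init__(self, layers):
        self.layers = layers

    def get_or_add(self, key, factory, ttl=60):
        for i, layer in enumerate(self.layers):
            value = layer.get(key)
            if value is not None:
                for faster in self.layers[:i]:
                    faster.set(key, value, ttl)  # promote to faster layers
                return value
        value = factory()  # miss in every layer: compute once
        for layer in self.layers:
            layer.set(key, value, ttl)
        return value
```

In practice the second layer would be something distributed like Redis rather than another in-memory dictionary; the point is only the lookup-then-back-fill flow.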
Levenshtein Distance (Part 3: Optimize Everything!)
In Part 1 we went through what the Levenshtein Distance is and in Part 2 we covered a few major optimizations for memory and performance. In Part 3 (this post) we will be taking things up to 11 and trying to squeeze every bit of performance out of our code.
Levenshtein Distance with SIMD (Bonus Part)
This is a bonus part because the other post was already jam-packed with optimizations, plus this is a pretty exotic optimization that fewer developers are likely to use directly.
Levenshtein Distance (Part 2: Gotta Go Fast)
In Part 1 I explained what the Levenshtein Distance is and that, in its simple form, it is inefficient in both computation and memory. In Part 2 (this post), I'll cover ways to decrease the memory overhead and increase the performance.
Levenshtein Distance (Part 1: What is it?)
The Levenshtein Distance is a deceptively simple algorithm - by looping over two strings, it can provide the "distance" (the number of differences) between the two. These differences are calculated in terms of "inserts", "deletions" and "substitutions".
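The simple form the series starts from is the classic dynamic-programming matrix. As a rough sketch (in Python here, though the series itself works in .NET), each cell `d[i][j]` holds the distance between the first `i` characters of one string and the first `j` of the other:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein Distance."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i  # i deletions to turn a prefix of `a` into ""
    for j in range(len(b) + 1):
        d[0][j] = j  # j insertions to build a prefix of `b` from ""
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(a)][len(b)]
```

For example, `levenshtein("kitten", "sitting")` gives 3: two substitutions and one insertion.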
What is Microdata and why should I care?
So you want to launch a product? First you need an idea.
If you're anything like me, you have an itch you just can't scratch when working for someone else. Maybe you're not very interested in the types of work presented. Perhaps you're feeling burnt out from working many long hours maintaining legacy systems. At the very least, you want to create something new and launch it for the world to see.
MiniProfiler ❤ MongoDB
If you're familiar with .NET, you may have heard of an awesome project called MiniProfiler, made by the folks at Stack Overflow.
Bad design is everywhere. Let's be part of the solution.
Ever encountered poorly written code? How about a confusing UI in a program or web page? What about accidentally pulling a "Push" door? Each of these cases is a form of bad design through poor User Experience (UX), each in its own medium for its own users.
Building a Polite Web Crawler
Web crawling is the act of a program or script accessing a website, capturing its content and discovering any pages linked from that content. On the surface it really is only performing HTTP requests and parsing HTML, both of which can be accomplished quite easily in a variety of languages and frameworks.
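The "discover linked pages" half can be sketched with nothing but a standard library. This is an illustrative fragment (Python here, not the crawler the post builds), with the fetching step left out; a polite crawler would also check `robots.txt` and rate-limit its requests:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from <a href="..."> tags in a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links against the page's own URL
                self.links.append(urljoin(self.base_url, href))

def discover_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

Feeding this the HTML of a fetched page yields the next set of URLs to visit, which is the core of the crawl loop.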
Halt and Hangfire
I have a website and I want to schedule a task to run every X minutes. The majority of the time, you would reach for cron, throw together a fancy cron expression and be on your way.
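For instance, a hypothetical crontab entry for the "every X minutes" case might look like this (the script path is made up for illustration):

```
# Fields: minute hour day-of-month month day-of-week command
# Run a task every 15 minutes
*/15 * * * * /usr/local/bin/my-task.sh
```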
When you DIY a Time Zone library
You are an experienced developer who works on a Calendar application, allowing users to add events to their calendar, including time and location. A new JIRA issue comes in asking you to let users know when there is X amount of time remaining before an event starts. To do this accurately, with events potentially being in different regions, you need to think about time zones.
The anatomy of a critical vulnerability
Note: This was responsibly disclosed to SilverStripe immediately when it was discovered. It has now been publicly disclosed by SilverStripe (CVE-2019-5715 / SS-2018-021) as of the 18th of February with patches for all supported versions released.
I left my job today after 7 years
Today was my last day at the job I have had for the past 7 years. This day has been a long time coming, not because anything was bad about the job, but because of a grand plan I have had for a long time.
Our ability to search online for various things and get relevant results is quite a technical achievement, especially at the scale at which search engines need to operate. They need to build large indexes of websites and content so they can process our queries and bring us to the content we are after.
No Robots Allowed
You probably have heard about web crawlers/spiders/bots etc, generally in the context of a search engine indexing a site to appear in its search results.
Shuffle Fail: Fixing my car stereo with code!
I have a 40-ish minute commute to work and love blasting music in my car (sorry if you were in the car next to me). I bought my car used 6 years ago and it still has the same stereo it came with.
Onion Coding: Programming in Layers
If you've come wanting to know interesting details about that well known "onion project", I am sorry to disappoint. Instead, this article will be talking about my experiences structuring code into layers as well as touching on some well-known patterns.