In perhaps the majority of cases, integer overflow is not intended. It is easy to find many instances of integer overflow and underflow causing unexpected behaviour in software - ranging from the amusing -127 lives glitch in Super Mario Bros. to a frightening bug in the software of the Boeing 787. In the hope of preventing these issues, the Rust compiler inserts checks to catch potentially unintended integer overflow at runtime (panicking in debug builds), while also offering a few different ways for programmers to explicitly allow overflow where it is desired - this is what we will explore in this article.
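As a brief taste of what the article covers, here is a minimal sketch of the standard-library methods Rust provides for making overflow behaviour explicit (the values chosen here are just illustrative):

```rust
fn main() {
    let x: u8 = 250;

    // In a debug build, a plain `x + 10` would panic at runtime.
    // These methods state the intended overflow behaviour explicitly:
    assert_eq!(x.checked_add(10), None);          // overflow reported as None
    assert_eq!(x.wrapping_add(10), 4);            // wraps around modulo 256
    assert_eq!(x.saturating_add(10), u8::MAX);    // clamps at the type's maximum
    assert_eq!(x.overflowing_add(10), (4, true)); // wrapped value plus an overflow flag
}
```

Each method trades a little verbosity for making the programmer's intent unambiguous to both the compiler and future readers.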
Hooray! I made it through yet another year on planet Earth. What fun. Well done to me. Now I'm going to begin 2023 by reflecting on the past year through the very limited lens of programming languages, libraries, and tools, before going over what I intend to learn this year.
x86 is a complex architecture with numerous features and possible configuration options (naturally, given that it is categorised as a complex instruction set - a label it certainly lives up to). At the start of the long journey of building my own operating system, while trying my best not to make the emulated computer crash, I noticed a recurring theme: everything is specified with tables. Memory segmentation, paging, interrupt handling - all are configured by defining table structures in memory and then informing the CPU of their existence using special-purpose assembly instructions (lgdt for the Global Descriptor Table, lidt for the Interrupt Descriptor Table, and so on). The focus of this article is only the first table a budding x86 operating system developer is likely to define: the Global Descriptor Table (GDT). The GDT is quite an annoying table, because for the majority of OSes it is essentially useless, yet it must still be declared, or you risk causing your future self much frustration. Let's take a look at how it works.
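To give a flavour of why the GDT is fiddly, here is a sketch of how one 8-byte GDT entry is laid out: the base and limit fields are scattered in pieces across the quadword for historical reasons. The gdt_entry helper below is a hypothetical name, but the bit layout it packs is the standard x86 segment descriptor format:

```rust
// Hypothetical helper: packs an x86 segment descriptor into the 8-byte
// GDT entry layout, where base and limit are split across the quadword.
fn gdt_entry(base: u32, limit: u32, access: u8, flags: u8) -> u64 {
    let mut e: u64 = 0;
    e |= (limit as u64) & 0xFFFF;               // limit bits 0..15
    e |= ((base as u64) & 0x00FF_FFFF) << 16;   // base bits 0..23
    e |= (access as u64) << 40;                 // access byte (type, ring, present)
    e |= (((limit as u64) >> 16) & 0xF) << 48;  // limit bits 16..19
    e |= ((flags as u64) & 0xF) << 52;          // flags nibble (granularity, size)
    e |= (((base as u64) >> 24) & 0xFF) << 56;  // base bits 24..31
    e
}

fn main() {
    // A flat ring-0 32-bit code segment: base 0, limit 0xFFFFF,
    // access 0x9A (present, executable, readable),
    // flags 0xC (4 KiB granularity, 32-bit).
    let code = gdt_entry(0, 0x000F_FFFF, 0x9A, 0xC);
    assert_eq!(code, 0x00CF_9A00_0000_FFFF);
    println!("{:#018x}", code);
}
```

The fact that a "flat" segment covering all of memory still requires this bit-shuffling is a good preview of the frustration discussed below.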