I built a browser-based tool for constructing, visualizing, and querying Bayesian Networks — no code, no MATLAB, no installs. Here's the story behind it and how to use it.
My journey toward reliable remote access to my homelab: after struggling with ToDesk, Tailscale, and Cloudflare, I finally settled on a self-hosted FRP setup.
Building upon our basic GPT, we now implement a Sparse Mixture of Experts (MoE) architecture. This allows us to scale up model capacity (parameters) without proportionally increasing computational cost (FLOPs) during inference.
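The core idea can be sketched in a few lines: a router picks one expert per token, so compute per token stays constant even as the total number of experts (and thus parameters) grows. This is a hypothetical toy illustration, not the post's actual implementation; the expert and router functions here are made-up stand-ins for real learned layers.

```python
# Toy sketch of sparse top-1 MoE routing (illustrative assumption, not the
# post's code). Each "expert" is a scalar function; only the top-scoring
# expert runs, so per-token FLOPs do not grow with the number of experts.

def make_expert(scale):
    # Each expert applies a different transform (stand-in for an FFN).
    return lambda x: scale * x

# Capacity (parameters) grows with len(experts); compute per token does not.
experts = [make_expert(s) for s in (1.0, 2.0, 3.0, 4.0)]

def router_scores(x, num_experts):
    # Toy deterministic router (a real router is a learned linear layer).
    return [(x * (i + 1)) % 7 for i in range(num_experts)]

def moe_forward(x):
    scores = router_scores(x, len(experts))
    top = max(range(len(experts)), key=lambda i: scores[i])  # top-1 gating
    return experts[top](x)  # only ONE expert executes for this token
```

A real implementation would route each token to the top-k experts and mix their outputs by the gating weights, but the sparsity principle is the same.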
My open notebook for mastering AI building blocks.
Ep.01 focuses on the pretraining phase of a decoder-only, character-level GPT architecture, built from first principles.