The best Show HN stories from Hacker News from the past day
Latest posts:
Show HN: Onlyrecipe 2.0 – I added all features HN requested – 4 years later
Show HN: FastLanes based integer compression in Zig
Show HN: Microlandia, a brutally honest city builder
It all started as an experiment to see if I could build a game making heavy use of Deno and its SQLite driver. After sharing an early build in the "What are you working on?" thread here, I got the encouragement I needed to polish it and make a version 1.0 for Steam.

So here it is: Microlandia, a SimCity Classic-inspired game with parameters drawn from real-life datasets, statistics, and research. It also introduces aspects that are conveniently hidden in other games (like homelessness), and my plan is to continue updating, expanding, and refining the models indefinitely.
Show HN: Fresh – A new terminal editor built in Rust
I built Fresh to challenge the status quo that terminal editing must require a steep learning curve or endless configuration. My goal was to create a fast, resource-efficient TUI editor with the usability and features of a modern GUI editor (like a command palette, mouse support, and LSP integration).

Core philosophy:

- Ease of use: fundamentally non-modal; prioritizes standard keybindings and a minimal learning curve.
- Efficiency: uses a lazy-loading piece tree to avoid loading huge files into RAM; reads only what's needed for user interactions. Coded in Rust.
- Extensibility: uses TypeScript (via Deno) for plugins, making it accessible to a large developer base.

The performance challenge:

I focused on resource consumption and speed, with large-file support as a core feature. In a quick benchmark loading a 2 GB log file with ANSI color codes, here is the comparison against other popular editors:

    - Fresh:   load time ~600 ms  | memory ~36 MB
    - Neovim:  load time ~6.5 s   | memory ~2 GB
    - Emacs:   load time ~10 s    | memory ~2 GB
    - VS Code: load time ~20 s    | OOM-killed (~4.3 GB available)

(Only Fresh rendered the ANSI colors.)

Development process:

I embraced Claude Code and made an effort to get good mileage out of it. I gave it strong, specific directions, especially in architecture, code structure, and UX-sensitive areas. It required constant supervision and re-alignment, especially in performance-critical areas. I added very extensive tests (compared to my normal standards) to keep it aligned as the code grows, focusing in particular on end-to-end tests where I could easily enforce a specific behavior or user flow.

Fresh is an open-source project (GPL-2) seeking early adopters. You're welcome to send feedback, feature requests, and bug reports.

Website: https://sinelaw.github.io/fresh/
GitHub repository: https://github.com/sinelaw/fresh
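The lazy-loading idea behind a piece tree can be sketched with a simple piece table in Python. This is a hypothetical simplification, not Fresh's Rust implementation: the original buffer (which would be a memory-mapped file) is never copied, edits are recorded as pieces pointing into one of two buffers, and reading a range only touches the pieces that overlap it.

```python
# Minimal piece-table sketch: edits never rewrite the original buffer;
# they only split the list of (buffer, start, length) pieces.
class PieceTable:
    def __init__(self, original: bytes):
        self.original = original           # stands in for an mmap'd file
        self.add = bytearray()             # inserted text accumulates here
        self.pieces = [("orig", 0, len(original))]

    def _buf(self, name):
        return self.original if name == "orig" else self.add

    def insert(self, pos: int, data: bytes):
        new_piece = ("add", len(self.add), len(data))
        self.add.extend(data)
        out, offset = [], 0
        for name, start, length in self.pieces:
            if 0 <= pos - offset <= length:
                left = pos - offset        # split the piece at pos
                if left:
                    out.append((name, start, left))
                out.append(new_piece)
                if length - left:
                    out.append((name, start + left, length - left))
                pos = -1                   # done; copy the rest verbatim
            else:
                out.append((name, start, length))
            offset += length
        self.pieces = out

    def read(self, lo: int, hi: int) -> bytes:
        """Materialize only the requested byte range."""
        out, offset = bytearray(), 0
        for name, start, length in self.pieces:
            piece_lo, piece_hi = offset, offset + length
            if piece_hi > lo and piece_lo < hi:
                a = max(lo, piece_lo) - piece_lo
                b = min(hi, piece_hi) - piece_lo
                out += self._buf(name)[start + a : start + b]
            offset += length
        return bytes(out)
```

A tree of pieces (rather than this flat list) is what makes edits and lookups logarithmic on huge files, but the lazy-read property is the same: opening a 2 GB file costs one piece, not 2 GB of RAM.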
Show HN: I built a dashboard to compare mortgage rates across 120 credit unions
When I bought my home, the big bank I'd been using for years quoted me 7% APR. A local credit union was offering 5.5% for the exact same mortgage.

I was surprised until I learned that mortgages are basically standardized products: the government buys almost all of them (see Bits About Money: https://www.bitsaboutmoney.com/archive/mortgages-are-a-manufactured-product/). So what is the price difference paying for? A recent Bloomberg Odd Lots episode makes the case that it's largely advertising and marketing (https://www.bloomberg.com/news/audio/2025-11-28/odd-lots-this-is-why-credit-card-rates-are-so-high-podcast). Credit unions are non-profits without big marketing budgets, so they can pass those savings on, but a lot of people don't know about them.

I built this dashboard to make it easier to shop around. I pull public rates from 120+ credit union websites and compare them against the weekly FRED national benchmark.

Features:

- Filter by loan type (30Y/15Y/etc.), eligibility (the hardest part, honestly), and rate type
- Payment calculator with refi mode (CUs can be a bit slower than big lenders, but that makes them great for refinancing)
- Links to each CU's rates page and eligibility requirements
- Toggle to show/hide statistical outliers

At the time of writing, the average CU rate is 5.91% vs. a 6.23% national average: about $37k of difference in total interest on a $500k loan. I used seaborn to visualize the rate spread against the four big banks: https://www.reddit.com/r/dataisbeautiful/comments/1pcj9t7/oc_the_high_cost_of_big_banks_i_tracked_daily/

Stack: Python for the data/backend, Svelte/SvelteKit for the frontend. No signup, no ads, no referral fees.

Happy to answer questions about the methodology, or to add CUs people suggest.
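The $37k figure checks out against the standard amortization formula. A quick sketch (assuming a 30-year fixed loan and monthly compounding, which the post doesn't state explicitly):

```python
# Compare total interest paid on a $500k loan at the average CU rate
# (5.91%) vs. the national average (6.23%), 30-year fixed.

def total_interest(principal, annual_rate, years=30):
    r = annual_rate / 12                          # monthly rate
    n = years * 12                                # number of payments
    payment = principal * r / (1 - (1 + r) ** -n) # standard annuity formula
    return payment * n - principal

cu = total_interest(500_000, 0.0591)
national = total_interest(500_000, 0.0623)
print(f"difference: ${national - cu:,.0f}")       # roughly $37k
```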
Show HN: Doomscrolling Research Papers
Hi HN,

Would love your thoughts on Open Paper Digest. It's a mobile feed that lets you "doomscroll" through summaries of popular papers that were published recently.

Backstory

A combination of factors led me to build this:

1. The quality of content on social media apps has decreased, but I still notice that it is harder than ever for me to stay away from them.
2. I've been saying for a while now that I should start reading papers to keep up with what's going on in the AI world.

Initially, I set out to build something solely for point 2. That version was more search-focused, and focused on simplifying the whole text of a paper rather than summarizing it. Still, I wasn't using it. After yet another 30-minute doomscroll on a bus last month, point 1 came into the picture and I changed how Open Paper Digest worked. That's what you can see today!

How it works

It checks Hugging Face trending papers and the large research labs daily to find papers to add to the index. Each PDF gets converted to markdown using Mistral OCR, which is then given to Gemini 2.5 to create a five-minute summary.

I notice that I am now visiting the site daily, which is a good sign. I'm curious what you all think, and what feedback you might have.

Cheers,
Arthur
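The daily ingestion loop described above can be sketched in a few lines of Python. All three service calls below are stubs standing in for the Hugging Face trending feed, Mistral OCR, and Gemini; the names and return values are illustrative, not the real APIs:

```python
# Daily pipeline sketch: trending papers -> OCR'd markdown -> summary
# -> index. Each helper is a hypothetical stub for a real service.

def fetch_trending_papers():
    # Stub: would fetch the trending feed and return (id, pdf_bytes) pairs.
    return [("2501.00001", b"%PDF-fake-bytes")]

def ocr_to_markdown(pdf_bytes):
    # Stub: would send the PDF to an OCR service and get markdown back.
    return "# Paper Title\n\nAbstract text..."

def summarize(markdown):
    # Stub: would prompt an LLM for a ~5-minute summary.
    title = markdown.splitlines()[0].lstrip("# ")
    return "Five-minute summary of: " + title

def run_daily_ingest(index: dict) -> dict:
    for paper_id, pdf in fetch_trending_papers():
        if paper_id in index:          # skip papers already summarized
            continue
        index[paper_id] = summarize(ocr_to_markdown(pdf))
    return index
```

Keying the index by paper ID makes the daily job idempotent: re-running it only pays the OCR and LLM cost for papers it hasn't seen.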
Show HN: FFmpeg Engineering Handbook
Show HN: RunMat – runtime with auto CPU/GPU routing for dense math
Hi, I’m Nabeel. In August I released RunMat as an open-source runtime for MATLAB code that was already much faster than GNU Octave on the workloads I tried: https://news.ycombinator.com/item?id=44972919

Since then, I’ve taken it further with RunMat Accelerate: the runtime now automatically fuses operations and routes work between CPU and GPU. You write MATLAB-style code, and RunMat runs your computation across CPUs and GPUs for speed. No CUDA, no kernel code.

Under the hood, it builds a graph of your array math, fuses long chains into a few kernels, keeps data on the GPU when that helps, and falls back to CPU JIT / BLAS for small cases.

On an Apple M2 Max (32 GB), here are some current benchmarks (median of several runs):

* 5M-path Monte Carlo
  * RunMat ≈ 0.61 s
  * PyTorch ≈ 1.70 s
  * NumPy ≈ 79.9 s
  → ~2.8× faster than PyTorch and ~130× faster than NumPy on this test.

* 64 × 4K image preprocessing pipeline (mean/std, normalize, gain/bias, gamma, MSE)
  * RunMat ≈ 0.68 s
  * PyTorch ≈ 1.20 s
  * NumPy ≈ 7.0 s
  → ~1.8× faster than PyTorch and ~10× faster than NumPy.

* 1B-point elementwise chain (sin / exp / cos / tanh mix)
  * RunMat ≈ 0.14 s
  * PyTorch ≈ 20.8 s
  * NumPy ≈ 11.9 s
  → ~140× faster than PyTorch and ~80× faster than NumPy.

If you want more detail on how the fusion and CPU/GPU routing work, I wrote up a longer post here: https://runmat.org/blog/runmat-accel-intro-blog

You can run the same benchmarks yourself from the GitHub repo in the main HN link. Feedback, bug reports, and “here’s where it breaks or is slow” examples are very welcome.
Show HN: Webclone.js – A simple tool to clone websites
I needed a lightweight way to archive documentation from a website. wget and similar tools failed to clone the site reliably (missing assets, broken links, etc.), so I ended up building a full website-cloning tool using Node.js + Puppeteer.

Repo: https://github.com/jademsee/webclone

Feedback, issues, and PRs are very welcome.
Show HN: Marmot – Single-binary data catalog (no Kafka, no Elasticsearch)