The best Show HN stories from Hacker News from the past day
Latest posts:
Show HN: Real-time AI (audio/video in, voice out) on an M3 Pro with Gemma E2B
Related: <a href="https://news.ycombinator.com/item?id=47653752">https://news.ycombinator.com/item?id=47653752</a>
Show HN: Ghost Pepper – Local hold-to-talk speech-to-text for macOS
I built this because I wanted to see how far I could get with a voice-to-text app that uses 100% local models, so no data leaves my computer. I've been using it a ton for coding and emails, and I'm experimenting with using it as a voice interface for my other agents too. It's 100% open source under the MIT license; I'd love feedback, PRs, and ideas on where to take it.
Show HN: I made a YouTube search form with advanced filters
Show HN: I built a tiny LLM to demystify how language models work
Built a ~9M param LLM from scratch to understand how they actually work. Vanilla transformer, 60K synthetic conversations, ~130 lines of PyTorch. Trains in 5 min on a free Colab T4. The fish thinks the meaning of life is food.<p>Fork it and swap the personality for your own character.
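The heart of any such vanilla transformer is scaled dot-product attention. A minimal illustrative sketch in plain Python (not the author's PyTorch code, and single-head with no learned projections, purely to show the mechanism):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Single-head scaled dot-product attention over lists of vectors."""
    d = len(queries[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Each output is a weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Two tokens with 2-dimensional embeddings; each token attends
# mostly to itself because queries and keys coincide.
q = k = v = [[1.0, 0.0], [0.0, 1.0]]
result = attention(q, k, v)
```

A real model wraps this in learned query/key/value projections, multiple heads, and feed-forward layers, but the weighting step above is the core idea the project demystifies.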
Show HN: I built a small app for FSI German Course
Hi everyone. I built a small application on top of the FSI German basic course and I need some feedback on it.<p>It is a small web app. Currently only Unit 01 is available; I will feed in the rest of the units later down the road as I use it myself.<p>Some of the features include:<p>- Slow and fast audio for every single word and sentence in the app. You can play them with the click of a button. No need to rewind a tape back and forth.
- Flashcards with keyboard control to quickly go through the material and drill them out.<p>You can access the website at <a href="https://detawk.com/" rel="nofollow">https://detawk.com/</a> . There is a demo video on the landing page. Give it a look before signing up.<p>If you have any questions or feedback for me, let me know. I hope you like the app.
Show HN: OsintRadar – Curated directory for osint tools
A project that groups together curated open-source intelligence (OSINT) tools, frameworks, and techniques.
Show HN: Contrapunk – Real-time counterpoint harmony from guitar input
Hi HN, I built Contrapunk because I wanted to play guitar and hear
counterpoint harmonies generated in real time. It takes audio from your guitar, a MIDI player, or your computer keyboard and generates harmony voices that follow counterpoint rules. You can choose the key you'd like to improvise in, the voice-leading style, and which voice of the harmony you'd like to play yourself.<p>macOS DMG:
<a href="https://github.com/contrapunk-audio/contrapunk/releases/tag/v1.0.0" rel="nofollow">https://github.com/contrapunk-audio/contrapunk/releases/tag/...</a><p>Source: <a href="https://github.com/contrapunk-audio/contrapunk" rel="nofollow">https://github.com/contrapunk-audio/contrapunk</a> (please open issues if you run into any)<p>Would love feedback on the DSP approach and the harmony algorithms. I am also looking at training an ML model for better real-time guitar-to-MIDI detection; I expect that will take some time.
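The counterpoint rules the post mentions can be captured in a few lines. A simplified first-species sketch over MIDI note numbers (this is a hypothetical illustration, not Contrapunk's actual algorithm): harmony notes must form consonant intervals with the melody, and parallel perfect fifths or octaves are forbidden.

```python
# Consonant intervals in semitones (mod 12): unison/octave,
# minor/major third, perfect fifth, minor/major sixth.
CONSONANT = {0, 3, 4, 7, 8, 9}
PERFECT = {0, 7}  # unison/octave and perfect fifth

def is_consonant(lower, upper):
    return (upper - lower) % 12 in CONSONANT

def allowed_motion(prev_pair, next_pair):
    """Reject parallel perfect consonances, a core counterpoint rule."""
    (pl, pu), (nl, nu) = prev_pair, next_pair
    prev_int = (pu - pl) % 12
    next_int = (nu - nl) % 12
    same_direction = (nl - pl) * (nu - pu) > 0
    if prev_int == next_int and next_int in PERFECT and same_direction:
        return False
    return is_consonant(nl, nu)

def harmonize(melody, offsets=(3, 4, 7, 8, 9)):
    """Pick, for each melody note, the first legal consonant note below it."""
    voice, prev = [], None
    for note in melody:
        for off in offsets:
            cand = note - off
            if prev is None or allowed_motion(prev, (cand, note)):
                voice.append(cand)
                prev = (cand, note)
                break
    return voice

# C-D-E gets harmonized in parallel thirds (which counterpoint permits).
line = harmonize([60, 62, 64])
```

The real-time challenge is the step before this: extracting a clean pitch stream from guitar audio fast enough, which is where the author's planned guitar-to-MIDI model comes in.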
Show HN: I made open source, zero power PCB hackathon badges
I love getting cool swag from hackathons and I also love designing PCBs, so when my friend asked me if I would design hackathon badges for a large game jam in Singapore, I was absolutely down!<p>The theme of Overglade was "The game jam within a game" — pretty cool concept, right? High schoolers from around the world were flown out to the event by Hack Club after they spent about 70 hours designing their own game.<p>These badges needed to be really cheap and simple, because we were going to manufacture about a hundred in a pretty limited amount of time. I went with a zero-power approach, which means sticking with e-ink, and I decided to include NFC in case the organizers wanted to work it into the roleplay of the event, and so participants could add their website or GitHub if they so chose.<p>I used an RP2040-based architecture because it's really easy and cheap to get right on the first try, and then added an ST25 passive NFC tag, which was really simple to configure. The badge is in the shape of a ticket, because you got a "ticket" to the event after spending a lot of time designing games to qualify! 20 GPIOs are broken out onto the edges in case you're ever in a pinch at a hackathon, and I wanted the badges to feel really fun, so there's a lot of art designed by various people in the community.<p>The badge worked really well and I learned quite a lot in the process.
My takeaways are to manufacture a BUNCH of extra badges, because some will end up breaking; to think about your PCB in 3D, because one of the inductors was a bit tall and caused more badges to break; and to have a strong vision of your final product, because it really helped me to create something unique and beautiful :D<p>I like to journal about all my projects, so if you'd like to read my full design process, feel free to take a look at my journal (<a href="https://github.com/KaiPereira/Overglade-Badges/blob/master/JOURNAL.md" rel="nofollow">https://github.com/KaiPereira/Overglade-Badges/blob/master/J...</a>). If you also have any questions or feedback, I'd be happy to answer them!
Show HN: M. C. Escher spiral in WebGL inspired by 3Blue1Brown
The latest 3Blue1Brown video [1] about the M. C. Escher print gallery effect inspired me to re-implement the effect as a WebGL fragment shader on my own.<p>[1]: <a href="https://www.youtube.com/watch?v=ldxFjLJ3rVY" rel="nofollow">https://www.youtube.com/watch?v=ldxFjLJ3rVY</a>
Show HN: ctx – an Agentic Development Environment (ADE)
Show HN: Travel Hacking Toolkit – Points search and trip planning with AI
I use points and miles for most of my travel. Every booking comes down to the same decision: use points or pay cash? To answer that, you need award availability across multiple programs, cash prices, your current balances, transfer partner ratios, and the math to compare them. I got tired of doing it manually across a dozen tabs.<p>This toolkit teaches Claude Code and OpenCode how to do it. 7 skills (markdown files with API docs and curl examples) and 6 MCP servers (real-time tools the AI calls directly).<p>It searches award flights across 25+ mileage programs (Seats.aero), compares cash prices (Google Flights, Skiplagged, Kiwi.com, Duffel), pulls your loyalty balances (AwardWallet), searches hotels (Trivago, LiteAPI, Airbnb, Booking.com), finds ferry routes across 33 countries, and looks up weird hidden gems near your destination (Atlas Obscura).<p>Reference data is included: transfer partner ratios for Chase UR, Amex MR, Bilt, Capital One, and Citi TY. Point valuations sourced from TPG, Upgraded Points, OMAAT, and View From The Wing. Alliance membership, sweet spot redemptions, booking windows, hotel chain brand lookups.<p>5 of the 6 MCP servers need zero API keys. Clone, run setup.sh, start
searching.<p>Skills are, as usual, plain markdown. They work in OpenCode and Claude Code automatically (I added a tiny setup script), and they'll work in anything else that supports skills.<p>PRs welcome! Help me expand the toolkit! :)<p><a href="https://github.com/borski/travel-hacking-toolkit" rel="nofollow">https://github.com/borski/travel-hacking-toolkit</a>
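The points-or-cash decision the author describes boils down to a cents-per-point calculation. A minimal sketch with made-up numbers (the 1.5¢/pt baseline is an assumption for illustration, not a valuation from the toolkit):

```python
def cents_per_point(cash_price, award_fees, points_required):
    """Value (in cents) you get per point by redeeming instead of paying cash."""
    return (cash_price - award_fees) / points_required * 100

# Hypothetical booking: a $450 flight costs 30,000 points plus $11.20 in fees.
cpp = cents_per_point(450.00, 11.20, 30_000)

# Redeem only if the redemption beats your baseline valuation for the program.
BASELINE_CPP = 1.5  # assumed valuation, cents per point
use_points = cpp > BASELINE_CPP
```

Here the redemption yields under 1.5¢/pt, so paying cash and saving the points is the better play. The toolkit automates gathering the inputs (award availability, cash prices, balances, transfer ratios) that feed this comparison.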
Show HN: TinyOS – A minimalist RTOS for Cortex-M written in C
Show HN: TurboQuant-WASM – Google's vector quantization in the browser
Show HN: sllm – Split a GPU node with other developers, unlimited tokens
Running DeepSeek V3 (685B) requires 8×H100 GPUs, which is about $14k/month. Most developers only need 15-25 tok/s. sllm lets you join a cohort of developers sharing a dedicated node. You reserve a spot with your card, and nobody is charged until the cohort fills. Prices start at $5/mo for smaller models.<p>The LLMs are completely private (we don't log any traffic).<p>The API is OpenAI-compatible (we run vLLM), so you just swap the base URL. Currently offering a few models.
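"Swap the base URL" means the request shape stays the standard OpenAI chat-completions format and only the endpoint changes. A sketch using only the standard library (the base URL and model id below are placeholders, not real sllm values):

```python
import json
import urllib.request

# Placeholder endpoint: substitute the base URL your cohort is given.
BASE_URL = "https://example-sllm-node.invalid/v1"

payload = {
    "model": "deepseek-v3",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",
    },
)
# urllib.request.urlopen(req) would send it; not executed here.
```

Any OpenAI client library works the same way: point its `base_url` at the shared node and keep the rest of your code unchanged.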