The best Show HN stories from Hacker News from the past day
Latest posts:
Show HN: Mazelit - My wife and I released our first game
Hey folks,

About a year ago my wife and I, both closing in on 40, quit our jobs at Red Hat to start a games company and learn game development. Many things happened along the way, and about a week ago we released our first (small) game on Steam.

https://store.steampowered.com/app/2816120/Mazelit/

The demo is free to play up to level 8 (the full game goes up to level 80), and we'd appreciate any feedback you have, whether it's on the store page or the game itself.

We made the game in Godot 4.2 in roughly 3 months, while I was also working full time. Since we ran into a bunch of roadblocks, we decided to offer the entire source code as a DLC, in case someone wants to see how we implemented the game, mod it, or compile it for a different platform. The only thing we can't redistribute with the game code is the Steamworks SDK, which is available for download from Steam. (The game is fully runnable without the SDK, minus the Steam integration.)

Cheers and happy weekend!
Show HN: Next-token prediction in JavaScript
What inspired this project was watching the amazing 3Blue1Brown video "But what is a GPT?" on YouTube (https://www.youtube.com/watch?v=wjZofJX0v4M - I highly recommend watching it). I added it to the repo for reference.

When it clicked in my head that "knowing a fact" is nearly synonymous with predicting a word (or series of words), I wanted to put it to the test, because it seemed so simple. I chose JavaScript because I can exploit the way it structures objects to aid in modeling language.

For example:

"I want to be at the beach",
"I will do it later",
"I want to know the answer",
...

becomes:

    {
      I: {
        want: {
          to: {
            be: { ... },
            know: { ... }
          }
        },
        will: { ... }
      },
      ...
    }
in JavaScript. You can exploit the language's fast object lookups to find known sentences this way, rather than recursively searching text - which is the conventional approach, and which would take forever (or not work at all), considering there are several full books loaded in by default (and it could support many more).

Accompanying research taught me what "tokens" and "embeddings" are, what is meant by "training", and most of the rest - though I'm still learning the jargon. I wrote a script that iterates over every single word of every single book to rank how likely that word is to appear next, given a cursor, and extended that to rank entire phrases. The base decoder started out what I'll call "token-agnostic" - it didn't care whether you were looking for the next letter, word, or pixel; it's the same logic. But actually it's not, and it soon evolved into a text (language) model. I have plans to get into image generation next (next-pixel prediction) using this. Overall the concepts are similar, but there are differences, primarily around extraction and formatting.

Goals of the project:

- Demystify LLMs for people, show that it's just regular code that does normal stuff

- Actually make a pretty good LLM in JavaScript, with a version at least capable of running in a browser tab
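To make the nested-object idea concrete, here's a minimal sketch of how such a structure could be built and queried (my own TypeScript illustration; names like train and predictNext are hypothetical, not the library's actual API):

    // Minimal sketch of a nested-object "trie" for next-token prediction.
    // Hypothetical names; not the project's actual API.
    type TrieNode = { count: number; next: Record<string, TrieNode> };

    const root: TrieNode = { count: 0, next: {} };

    // Insert every sentence word-by-word, counting how often each path occurs.
    function train(sentence: string): void {
      let node = root;
      for (const word of sentence.toLowerCase().split(/\s+/)) {
        node.next[word] ??= { count: 0, next: {} };
        node = node.next[word];
        node.count++;
      }
    }

    // Walk the trie along the prompt, then rank candidates by count.
    function predictNext(prompt: string): string | null {
      let node = root;
      for (const word of prompt.toLowerCase().split(/\s+/)) {
        const child = node.next[word];
        if (!child) return null; // unseen prefix
        node = child;
      }
      const ranked = Object.entries(node.next).sort((a, b) => b[1].count - a[1].count);
      return ranked.length ? ranked[0][0] : null;
    }

    ["I want to be at the beach", "I will do it later", "I want to know the answer"].forEach(train);
    console.log(predictNext("I want to")); // "be" or "know" (tied counts; insertion order wins)

The lookup cost here depends only on the prompt length, not on how much text was ingested, which is the property the post is leaning on.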
Show HN: I made a new sensor out of 3D printer filament for my PhD
Here's a "behind-the-scenes" look at my development of a cool sensor during my PhD (electrical engineering). This sensor is only about 1/3 of my total research for my degree and took about a year.<p>I've been on HN for a while now and I've seen my fair share of posts about the woes of pursuing a PhD. Now that I'm done with mine I wanna share some anecdotal evidence that doing a PhD can actually be enjoyable (not necessarily easy) and also be doable in 3 years.<p>When I started I knew I didn't want to work on something that would never leave the lab or languish in a dissertation PDF no one will ever read. Thanks to an awesome advisor I think I managed to thread the needle between simplicity and functionality.<p>Looking back, the ideas and methods behind it are pretty straightforward, but getting there took some doing. It’s funny how things seem obvious once you've figured them out!<p>Oh, I love creating GUIs for sensor data and visualizations as you'll see -- it's such a game changer! pyqtgraph is my go-to at the moment - such a great library.
Show HN: Deco.cx – realtime TypeScript web editor
Hi, HN.
Gui, Lucis, and the deco.cx team here — we're a bunch of Brazilians building an open-source, all-in-one web editor for the modern TypeScript dev. Check out our GitHub at https://github.com/deco-cx/deco and discover more at our site https://deco.cx/.

We built frontend tools at a hyper-growth e-commerce platform for 9 years. Deco.cx is the thing we wish had existed when we were managing hundreds of enterprise-grade React-based websites.

The gist: JSX, HTMX, TypeScript, Tailwind, and Deno, all visually editable in a beautiful, collaborative, REALTIME (!) admin UI. Great for devs, great for content.

In essence, we're a git-based, TypeScript-first configuration management system. We take TypeScript code that represents your site, your entities, your UI, and we allow anyone to "save" configuration and content based on those types.

It's great for developers, because you can code directly on the web, or on your machine, and get instant feedback. You "export interface Props" and boom, an automatic form is generated. It's great for business users, the content editors, because you just press "." (dot) on your keyboard and you go into a visual edit mode for the page you're on right now. Everything the developer exported in TypeScript is editable.

And when you hit "publish", it's already live, with a proper CDN, proper caching, everything already set up and just working. Ready to scale!

But you can't fly blind. We're democratizing pro-level tools for web analytics and observability by bundling them into our Pro plan: fully-managed Plausible analytics and HyperDX observability. Check them out — these are great tools on their own, and they shine in the deco.cx platform bundle.

We don't want to be the only answer, and we don't claim to be right about everything; we just want to make the friendly, accessible open-source tool that we wish had existed when we were junior developers. If you want to participate, please join our Discord community at https://deco.cx/discord

On open source: our admin UI core is not ready for prime time, but we're working to open-source it this year. Basically everything else (our integrations, templates, and configuration management runtime) is already open source! Take a look at these: https://github.com/deco-cx/deco https://github.com/deco-cx/apps https://github.com/deco-sites/storefront Making the admin UI core extensible and open source is a MAIN OBJECTIVE for this year, so expect news here soon!

Thanks for your support!
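To illustrate the "export interface Props" flow, here's a hedged sketch (the file conventions and component shape are assumptions for illustration, not necessarily deco's exact API):

    // Sketch of a deco-style section component (assumes a JSX runtime such
    // as Preact). The idea: the exported Props interface is what drives the
    // auto-generated editor form.
    export interface Props {
      /** Headline shown at the top of the section. */
      title: string;
      /** Optional call-to-action label; editors can leave it blank. */
      ctaLabel?: string;
      /** Number of featured items to render. */
      itemCount: number;
    }

    export default function HeroSection({ title, ctaLabel, itemCount }: Props) {
      return (
        <section>
          <h1>{title}</h1>
          {ctaLabel && <button>{ctaLabel}</button>}
          <p>Showing {itemCount} items</p>
        </section>
      );
    }

The point is that the types alone are enough to derive a form: a string becomes a text input, an optional field becomes one editors can clear, a number becomes a numeric input.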
Show HN: QWANJI
Hey HN! I've built this bit of cuteware (software not solving a problem, just for fun) inspired by the patterns made by swipe typing on mobile devices. To me it seemed like the closest thing to an English version of Kanji, and it would be cool to consistently recreate those patterns.

The implementation is vanilla JS, to keep things simple (and out of a bit of framework fatigue).

I'm keen to see how people use qwanji. Any and all feedback welcome!
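For a rough sense of the underlying idea (my own sketch, not qwanji's actual code): map each letter to an approximate key position on a QWERTY layout, then connect the dots to get the word's swipe pattern.

    // Sketch: turn a word into the polyline a swipe-typing finger would trace.
    // Key coordinates are approximate QWERTY positions, not qwanji's real data.
    const ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"];

    function keyPosition(letter: string): { x: number; y: number } | null {
      for (let row = 0; row < ROWS.length; row++) {
        const col = ROWS[row].indexOf(letter.toLowerCase());
        // Offset lower rows slightly, like a real staggered keyboard.
        if (col !== -1) return { x: col + row * 0.5, y: row };
      }
      return null; // not a letter key
    }

    function swipePath(word: string): { x: number; y: number }[] {
      return [...word]
        .map((ch) => keyPosition(ch))
        .filter((p): p is { x: number; y: number } => p !== null);
    }

    console.log(swipePath("hello"));
    // [{x: 5.5, y: 1}, {x: 2, y: 0}, {x: 8.5, y: 1}, {x: 8.5, y: 1}, {x: 8, y: 0}]

Drawing that list of points as a connected line gives the Kanji-like glyph for the word.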
Show HN: CompressX, my FFmpeg wrapper for macOS
Hey HN, just wanted to share my success story with CompressX, my FFmpeg wrapper for macOS.<p>For those who may not be familiar, FFmpeg is a powerful tool for converting, streaming, and recording audio and video content.<p>I started CompressX as a weekend project to serve my 9-5 jobs, primarily to compress demo videos for uploading to GitLab and sharing with my colleagues. It took me 2 weeks to make the first working version. I shared the demo on Twitter and the reaction was extraordinary. People loved it, they said that I was bringing the Pied Piper to life.<p>Four months later, I hit the $9,000 mark in revenue. I never expected to make a dime from this project, let alone eight thousand dollars. It's been a surreal experience, but it's also been incredibly rewarding.<p>I put a lot of time and effort into developing this tool, and it's amazing to see it paying off. It's been a great journey so far and I'm excited to see where it takes me next.
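For anyone curious what "wrapping FFmpeg" typically involves (a generic sketch, not CompressX's actual code; the flags shown are standard ffmpeg options), a wrapper usually shells out to the ffmpeg binary with compression settings, e.g. from Node:

    // Generic sketch of shelling out to ffmpeg for video compression.
    import { execFile } from "node:child_process";

    function compress(input: string, output: string): Promise<void> {
      return new Promise((resolve, reject) => {
        execFile(
          "ffmpeg",
          [
            "-i", input,          // source video
            "-c:v", "libx264",    // H.264 encoder
            "-crf", "28",         // higher CRF = smaller file, lower quality
            "-preset", "slow",    // spend more CPU time for better compression
            "-c:a", "aac",        // re-encode audio as AAC
            output,
          ],
          (err) => (err ? reject(err) : resolve())
        );
      });
    }

    compress("demo.mov", "demo-compressed.mp4").then(() => console.log("done"));

The product work is everything around that call: presets, progress reporting, and a native macOS UI.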
Show HN: ADS-B visualizer
I've created a web app for querying and visualizing ADS-B datasets: https://adsb.exposed/

Source code: https://github.com/ClickHouse/adsb.exposed/

The results significantly exceeded my expectations: the pictures are insanely beautiful, and the data is a treasure trove.

It proves several statements that were not certain:
- it is feasible to generate tiles by aggregation at the pixel level, instead of hexagons or a rectangular grid (see the sketch below);
- it does not require JPG/PNG tiles - we can transfer raw bitmap data with zstd compression;
- it is possible to do it in real time.
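Here's a toy version of the pixel-level aggregation idea (the real project does this server-side with ClickHouse queries; this sketch just shows the concept in miniature):

    // Bin points into a tile's pixels and keep one counter per pixel.
    type Point = { lon: number; lat: number };

    function aggregateTile(
      points: Point[],
      bounds: { minLon: number; maxLon: number; minLat: number; maxLat: number },
      size = 256 // tile is size x size pixels
    ): Uint32Array {
      const counts = new Uint32Array(size * size);
      for (const { lon, lat } of points) {
        const x = Math.floor(((lon - bounds.minLon) / (bounds.maxLon - bounds.minLon)) * size);
        const y = Math.floor(((lat - bounds.minLat) / (bounds.maxLat - bounds.minLat)) * size);
        if (x >= 0 && x < size && y >= 0 && y < size) counts[y * size + x]++;
      }
      // Render by mapping counts to brightness (log-scaled works well for
      // heavy-tailed data like flight tracks), then compress the raw bitmap.
      return counts;
    }

Because each pixel is an independent counter, the aggregation parallelizes trivially and never needs an intermediate hexagon or grid layer.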
Show HN: Sonauto – A more controllable AI music creator
Hey HN,

My cofounder and I trained an AI music generation model, and after a month of testing we're launching 1.0 today. Ours is interesting because it's a latent diffusion model instead of a language model, which makes it more controllable: https://sonauto.ai/

Others do music generation by training a Vector Quantized Variational Autoencoder like Descript Audio Codec (https://github.com/descriptinc/descript-audio-codec) to turn music into tokens, then training an LLM on those tokens. Instead, we ripped the tokenization part off and replaced it with a normal variational autoencoder bottleneck (along with some other important changes to enable insane compression ratios). This gave us a nice, normally distributed latent space on which to train a diffusion transformer (like Sora). Our diffusion model is also particularly interesting because it is the first audio diffusion model to generate coherent lyrics!

We like diffusion models for music generation because they have some interesting properties that make controlling them easier (so you can make *your own* music instead of just taking what the machine gives you). For example, we have a rhythm control mode where you can upload your own percussion line or set a BPM. Very soon you'll also be able to generate proper variations of an uploaded or previously generated song (e.g., you could even sing into Voice Memos for a minute and upload that!). Musicians of HN, try uploading your songs and using Rhythm Control, and let us know what you think! Our goal is to enable more of you, not replace you.

For example, we turned this drum line (https://sonauto.ai/songs/uoTKycBghUBv7wA2YfNz) into this full song (https://sonauto.ai/songs/KSK7WM1PJuz1euhq6lS7 - skip to 1:05 if impatient) or this other song I like better (https://sonauto.ai/songs/qkn3KYv0ICT9kjWTmins - we accidentally compressed it with AAC instead of Opus, which hurt quality, though).

We also like diffusion models because while they're expensive to train, they're cheap to serve. We built our own efficient inference infrastructure instead of using the expensive inference-as-a-service startups that are all the rage. That's why we're making generations on our site free and unlimited for as long as possible.

We'd love to answer your questions. Let us know what you think of our first model! https://sonauto.ai/
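To make the architectural contrast concrete, here's a conceptual sketch with stubbed model calls (not our actual code; just the control flow each approach implies):

    // Approach A: VQ-VAE tokens + language model (autoregressive).
    function generateWithLM(lm: (tokens: number[]) => number, length: number): number[] {
      const tokens: number[] = [];
      for (let i = 0; i < length; i++) {
        tokens.push(lm(tokens)); // each discrete audio token depends on all previous ones
      }
      return tokens; // decode tokens back to audio with the VQ-VAE decoder
    }

    // Approach B: VAE latent + diffusion (iterative denoising of the whole clip).
    function generateWithDiffusion(
      denoise: (latent: Float32Array, step: number) => Float32Array,
      latentSize: number,
      steps: number
    ): Float32Array {
      // Start from Gaussian noise over the entire latent at once...
      let latent = Float32Array.from({ length: latentSize }, () => gaussian());
      // ...and refine it globally, step by step. Because every step sees the
      // whole clip, constraints like "match this drum line" can steer all of it.
      for (let step = steps - 1; step >= 0; step--) {
        latent = denoise(latent, step);
      }
      return latent; // decode with the VAE decoder
    }

    function gaussian(): number {
      // Box-Muller transform for a standard normal sample.
      const u = 1 - Math.random();
      const v = Math.random();
      return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
    }

The whole-clip refinement loop in approach B is what makes controls like Rhythm Control natural to express.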