The best Hacker News stories from Show HN from the past day
Latest posts:
Show HN: Tiny Chrome extension to disable images to reduce distractions
Show HN: I built a backend so simple that it fits in a YAML file
Show HN: I built an interactive cloth solver for Apple Vision Pro
A bit more context: the cloth sim is part of my app, Lungy (https://www.lungy.app). It's designed as an active meditation/relaxation app, where you can play relaxing instruments in space and do immersive breathing exercises. The original Lungy is a breathing app for iOS that pairs real-time breathing with interactive visuals.

The cloth sim uses Verlet integration running on a regular grid. So far I have tried a couple of different cloth scenes: a touch-reactive 'pad', where different parts of the cloth are mapped to different sounds, and a cloth that blows in sync with breathing. Collision detection is a little tricky with the deforming mesh, but it seems to work well overall, and it feels like a cool interaction to explore.

The cloth sim is live on the App Store now (and free); I would love to hear feedback from anyone with a Vision Pro.

App Store (Vision Pro): https://apps.apple.com/app/id6470201263

Lungy, original for iOS: https://apps.apple.com/app/id1545223887
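For anyone curious what "Verlet integration on a regular grid" looks like in practice, here is a minimal Python sketch of the general technique (a square particle grid with the top row pinned and simple distance constraints). This is an illustration only, not the Lungy app's code, and the collision handling mentioned above is omitted.

```python
# Minimal sketch of a Verlet-integrated cloth on a regular grid
# (illustration only; not code from the Lungy app).
import numpy as np

W, H = 16, 16                 # grid resolution
REST = 0.05                   # rest length between neighbouring particles
DT = 1.0 / 60.0               # timestep (seconds)
GRAVITY = np.array([0.0, -9.8, 0.0])

# Particles laid out on a regular grid in the x/y plane.
pos = np.zeros((H, W, 3))
pos[..., 0] = np.arange(W) * REST
pos[..., 1] = -np.arange(H)[:, None] * REST
prev = pos.copy()             # Verlet stores no explicit velocity
pinned = pos[0].copy()        # pin the top row so the cloth hangs

def step():
    global pos, prev
    # Verlet integration: x_new = 2*x - x_prev + a*dt^2
    new = 2.0 * pos - prev + GRAVITY * DT * DT
    prev, pos = pos, new
    # Relax distance constraints between horizontal and vertical neighbours.
    for _ in range(4):
        for view in (pos, np.swapaxes(pos, 0, 1)):
            delta = view[:, 1:] - view[:, :-1]
            dist = np.linalg.norm(delta, axis=-1, keepdims=True)
            corr = 0.5 * (dist - REST) / np.maximum(dist, 1e-9) * delta
            view[:, :-1] += corr
            view[:, 1:] -= corr
        pos[0] = pinned       # keep the pinned row fixed

for _ in range(60):
    step()
print(pos[-1, 0])             # a bottom corner after one simulated second
```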
Show HN: Explore Websites by Chosen Color
I crawled lots of websites and extracted their color palettes (primary, secondary, etc.) to build a searchable collection of website colors!

It was a fun side project and I learned a lot about color theory. Let me know what you think and what features you would like to see :)

If you have any questions about the tech behind it, feel free to ask in the comments. Feedback is also greatly appreciated.
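The post doesn't say how the palettes are extracted, but one common approach is to quantize a page screenshot down to a handful of colors and rank them by pixel count. A hypothetical Python sketch of that idea (not the site's actual pipeline):

```python
# Hypothetical sketch: pull a small palette out of a page screenshot.
# Illustration only; the site's real extraction pipeline is not described.
from collections import Counter
from PIL import Image

def extract_palette(screenshot_path, n_colors=5):
    img = Image.open(screenshot_path).convert("RGB")
    # Quantize to a small adaptive palette, then rank colors by pixel count.
    quantized = img.quantize(colors=n_colors)
    palette = quantized.getpalette()[: n_colors * 3]
    counts = Counter(quantized.getdata())
    ranked = [idx for idx, _ in counts.most_common(n_colors)]
    return ["#%02x%02x%02x" % tuple(palette[i * 3 : i * 3 + 3]) for i in ranked]

# e.g. extract_palette("homepage.png") -> ['#ffffff', '#1a73e8', ...]
```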
Show HN: I built a directory about big life decisions/regrets
Show HN: Wadzilla. Play DOOM in Zork
Wadzilla is currently a PoC. Although in theory it “works”, it does not yet make for an entertaining game, although to me it is quite fun just to see it output ZIL for all of the rooms in a DOOM WAD, with all of the objects in their correct locations under their plain-English names, along with the 8-character texture names for the walls, floors, and ceilings in their relative positions in the room. Part of that enjoyment comes from knowing what it takes just to get that far, so you may derive far less entertainment from it. In fact, I suspect that for most people at this point the most amusing thing about Wadzilla will be the very concept of its existence - and of course the name, which I will take credit for while also acknowledging how fortuitous it is that the project practically named itself. “What should I name a tool that converts WAD to ZIL? Oh yeah, right. Of course. Wadzilla.”

I am sharing it at this early stage because I suspect many in this audience will enjoy just reading about it, others might be excited by it and want to contribute to the effort, and because I welcome feedback here as well as contributions via GitHub issues and PRs.
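For context on the input side: a DOOM WAD is just a 12-byte header plus a directory of named "lumps" (map markers like E1M1 followed by THINGS, LINEDEFS, SECTORS, etc.). The Python sketch below reads that directory; it is an illustration of the file format only, not Wadzilla's code.

```python
# Illustration only: read the lump directory of a DOOM WAD file.
# (Not Wadzilla's actual code; shown just to hint at the input format.)
import struct

def read_wad_directory(path):
    with open(path, "rb") as f:
        data = f.read()
    # Header: 4-byte magic ("IWAD"/"PWAD"), lump count, directory offset.
    magic, num_lumps, dir_offset = struct.unpack_from("<4sii", data, 0)
    assert magic in (b"IWAD", b"PWAD"), "not a WAD file"
    lumps = []
    for i in range(num_lumps):
        # Each 16-byte directory entry: file offset, size, 8-byte padded name.
        offset, size, raw_name = struct.unpack_from("<ii8s", data, dir_offset + 16 * i)
        lumps.append((raw_name.rstrip(b"\x00").decode("ascii"), offset, size))
    return lumps

# e.g. read_wad_directory("DOOM1.WAD") yields (name, offset, size) tuples,
# including map markers such as "E1M1" followed by its THINGS and LINEDEFS lumps.
```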
Show HN: CommitAsync – $100K+ dev jobs 100% remote only
Show HN: Every mountain, building and tree shadow mapped for any date and time
I've been working on this project for about 4 years. It began as terrain only, because worldwide elevation data was publicly available. I then added buildings from OpenStreetMap (crowd-sourced) and, more recently, from Overture Maps data. Some computer vision/machine learning advancements [1] in the past few years have made it possible to estimate tree canopy heights from satellite imagery alone, finally making it possible to add trees to the map. The data isn't perfect, but it's within +/- 3 meters or so; good enough to give a general idea for any location on Earth. Happy to answer any questions.

[1] https://www.nature.com/articles/s41559-023-02206-6
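To give a sense of the core computation implied by the title, here is a minimal Python sketch of one standard way to shadow-test a height grid: march from each cell toward the sun and check whether anything rises above the sun ray. It assumes a uniform grid of surface heights (terrain plus buildings plus canopy) and a precomputed sun azimuth/altitude; it is not the site's actual code, which is not shown here.

```python
# Minimal sketch: mark cells of a height grid as shadowed by marching
# toward the sun. Illustration only; slow nested loops, no optimisation.
import math
import numpy as np

def shadow_mask(heights, cell_size, sun_azimuth_deg, sun_altitude_deg, max_dist=2000.0):
    """heights: 2D array of surface heights in metres; cell_size in metres."""
    az = math.radians(sun_azimuth_deg)
    alt = math.radians(sun_altitude_deg)
    dx, dy = math.sin(az), math.cos(az)   # east and north components toward the sun
    rise_per_m = math.tan(alt)            # how fast the sun ray climbs per metre
    rows, cols = heights.shape
    shadowed = np.zeros_like(heights, dtype=bool)
    for r in range(rows):
        for c in range(cols):
            d = cell_size
            while d < max_dist:
                rr = int(round(r - dy * d / cell_size))   # north = decreasing row
                cc = int(round(c + dx * d / cell_size))   # east = increasing column
                if not (0 <= rr < rows and 0 <= cc < cols):
                    break
                # Shadowed if something along the sun direction rises above the ray.
                if heights[rr, cc] > heights[r, c] + rise_per_m * d:
                    shadowed[r, c] = True
                    break
                d += cell_size
    return shadowed

# e.g. shadow_mask(np.random.rand(50, 50) * 30, cell_size=10,
#                  sun_azimuth_deg=220, sun_altitude_deg=25)
```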
I've been working on this project for about 4 years. It began as terrain only because world wide elevation data was publicly available. I then added buildings from OpenStreetMap (crowd sourced) and more recently from Overture Maps data. Some computer vision/machine learning advancements [1] in the past few years have made it possible to estimate tree canopy heights using satellite imagery alone making it possible to finally add trees to the map. The data isn't perfect, but it's within +/- 3 meters of so. Good enough to give a general idea for any location on Earth. Happy to answer any questions.<p>[1] <a href="https://www.nature.com/articles/s41559-023-02206-6" rel="nofollow">https://www.nature.com/articles/s41559-023-02206-6</a>