The best Show HN stories from Hacker News from the past day

Latest posts:

Show HN: Browse Anime from the Terminal

Show HN: Meelo, self-hosted music server for collectors and music maniacs

I've been working on this alternative to Plex for almost 3 years now. Its main selling point is that it correctly handles multiple versions of albums and songs. As of today, it only has a web client.

It tries to be as flexible as possible, but still requires a bit of configuration (including regexes, though if metadata is embedded in the files, this can be skipped).

I just released v3.0, which makes videos first-class data and speeds up scanning and metadata matching.
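To give a sense of what regex-based scanner configuration involves, here is a hypothetical example of the kind of path pattern a music scanner might use to extract metadata from file names. This is illustration only; Meelo's actual configuration format and keys are documented in the project and may differ.

```python
# Hypothetical path-matching regex of the sort a library scanner uses.
# The pattern and directory layout here are assumptions, not Meelo's config.
import re

PATTERN = re.compile(
    r"(?P<artist>[^/]+)/(?P<album>[^/]+)/(?P<track>\d+)\s*-\s*(?P<title>[^/]+)\.(flac|mp3)$"
)

m = PATTERN.search("library/Daft Punk/Discovery/01 - One More Time.flac")
if m:
    # Prints: Daft Punk Discovery 01 One More Time
    print(m.group("artist"), m.group("album"), m.group("track"), m.group("title"))
```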

Show HN: DeepSeek Your HN Profile

A fun project that I built to try out R1 Distill Llama 70B. Enjoy :)

Show HN: Orange intelligence, an open source alternative to Apple Intelligence

Hi HN! I’m excited to share Orange Intelligence, an open-source alternative to Apple Intelligence for macOS.

Orange Intelligence lets you interact with any text on your macOS system in a more powerful and customizable way. It brings a floating text processor that integrates seamlessly with your workflow. Whether you’re a developer, writer, or productivity enthusiast, this tool can boost your efficiency.

Key features:

- Floating text processor: trigger a floating window by double-tapping the Option key to process selected text.
- Run any Python function: from basic text manipulations to calling large language models (LLMs) like OpenAI or a local LLaMA, you can execute any Python function on the fly.
- Full customization: want to add your own functions or logic? Just write them in Python, and they’ll appear in the floating window.

How does it work?

1. Capture: uses AppleScript to simulate a global Cmd+C and capture selected text from any active macOS app.
2. Process: a floating window pops up, letting you choose what to do with the text (run a function, format it, or apply an LLM).
3. Replace: after processing, the app returns focus to the original application and pastes the processed text back with a global Cmd+V.

Why open source? I built this to overcome the limitations of Apple’s proprietary tools, and I wanted to make it fully customizable and extensible. Orange Intelligence is built with Python and PyQt6, so it’s easy to adapt, extend, and contribute to.

It’s not just a text processor; it’s a platform for building custom workflows, whether you want to automate simple tasks or integrate with complex AI systems.

If you’re on macOS and interested in boosting your productivity with Python and AI, I’d love for you to try it out and give feedback: https://github.com/sharingan-no-kakashi/orange-intelligence

I’m looking forward to your thoughts, ideas, and contributions. Thanks!
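For the curious, the capture/process/replace loop described above can be sketched in a few lines of Python. This is a minimal illustration of the mechanism, not the app's actual code; the `pbpaste`/`pbcopy` shortcut stands in for proper pasteboard handling, and it assumes the script has macOS accessibility permissions.

```python
# Minimal sketch of the capture -> process -> replace loop described above.
# Uses osascript to send a global Cmd+C/Cmd+V via System Events; this mirrors
# the described approach, not Orange Intelligence's actual implementation.
import subprocess
import time

def keystroke(key: str) -> None:
    """Ask System Events to press Cmd+<key> in the frontmost app."""
    subprocess.run([
        "osascript", "-e",
        f'tell application "System Events" to keystroke "{key}" using command down',
    ], check=True)

def pasteboard() -> str:
    """Read the macOS clipboard."""
    return subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

def set_pasteboard(text: str) -> None:
    """Write to the macOS clipboard."""
    subprocess.run(["pbcopy"], input=text, text=True)

# Capture: copy the current selection from the active app.
keystroke("c")
time.sleep(0.1)                      # give the pasteboard time to update
selected = pasteboard()

# Process: any Python function could run here (an LLM call, a formatter, ...).
processed = selected.upper()

# Replace: paste the result back over the selection.
set_pasteboard(processed)
keystroke("v")
```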

Show HN: I Created ErisForge, a Python Library for Abliteration of LLMs

ErisForge is a Python library designed to modify Large Language Models (LLMs) by applying transformations to their internal layers. Named after Eris, the goddess of strife and discord, ErisForge allows you to alter model behavior in a controlled manner, creating both ablated and augmented versions of LLMs that respond differently to specific types of input.

It is also quite useful for studies of propaganda and bias in LLMs (I'm planning to experiment with DeepSeek).

Features:

- Modify internal layers of LLMs to produce altered behaviors.
- Ablate or enhance model responses with the AblationDecoderLayer and AdditionDecoderLayer classes.
- Measure refusal expressions in model responses using the ExpressionRefusalScorer.
- Supports custom behavior directions for applying specific types of transformations.
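As background, here is a minimal sketch of the general abliteration technique this kind of library builds on: estimate a "refusal direction" from hidden-state activations, then project it out of (or add it back into) each layer's output. The function names below are illustrative, not ErisForge's actual API.

```python
# Sketch of the core abliteration math, assuming paired activation batches
# collected from prompts the model refuses vs. complies with.
import torch

def refusal_direction(h_refuse: torch.Tensor, h_comply: torch.Tensor) -> torch.Tensor:
    """Difference of mean activations over refused vs. complied prompts,
    normalized to a unit vector. Input shapes: (n_prompts, hidden_dim)."""
    direction = h_refuse.mean(dim=0) - h_comply.mean(dim=0)
    return direction / direction.norm()

def ablate(hidden: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    """Remove the component of each hidden state along `direction`; this is
    the per-layer operation an ablation decoder layer would apply."""
    return hidden - (hidden @ direction).unsqueeze(-1) * direction

def augment(hidden: torch.Tensor, direction: torch.Tensor, alpha: float = 1.0) -> torch.Tensor:
    """Add the direction instead, steering the model toward the behavior."""
    return hidden + alpha * direction
```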

Show HN: I Created ErisForge, a Python Library for Abliteration of LLMs

ErisForge is a Python library designed to modify Large Language Models (LLMs) by applying transformations to their internal layers. Named after Eris, the goddess of strife and discord, ErisForge allows you to alter model behavior in a controlled manner, creating both ablated and augmented versions of LLMs that respond differently to specific types of input.<p>It is also quite useful to perform studies on propaganda and bias in LLMs (planning to experiment with deepseek).<p>Features - Modify internal layers of LLMs to produce altered behaviors. - Ablate or enhance model responses with the AblationDecoderLayer and AdditionDecoderLayer classes. - Measure refusal expressions in model responses using the ExpressionRefusalScorer. - Supports custom behavior directions for applying specific types of transformations.

Show HN: I Created ErisForge, a Python Library for Abliteration of LLMs

ErisForge is a Python library designed to modify Large Language Models (LLMs) by applying transformations to their internal layers. Named after Eris, the goddess of strife and discord, ErisForge allows you to alter model behavior in a controlled manner, creating both ablated and augmented versions of LLMs that respond differently to specific types of input.<p>It is also quite useful to perform studies on propaganda and bias in LLMs (planning to experiment with deepseek).<p>Features - Modify internal layers of LLMs to produce altered behaviors. - Ablate or enhance model responses with the AblationDecoderLayer and AdditionDecoderLayer classes. - Measure refusal expressions in model responses using the ExpressionRefusalScorer. - Supports custom behavior directions for applying specific types of transformations.

Show HN: I Created ErisForge, a Python Library for Abliteration of LLMs

ErisForge is a Python library designed to modify Large Language Models (LLMs) by applying transformations to their internal layers. Named after Eris, the goddess of strife and discord, ErisForge allows you to alter model behavior in a controlled manner, creating both ablated and augmented versions of LLMs that respond differently to specific types of input.<p>It is also quite useful to perform studies on propaganda and bias in LLMs (planning to experiment with deepseek).<p>Features - Modify internal layers of LLMs to produce altered behaviors. - Ablate or enhance model responses with the AblationDecoderLayer and AdditionDecoderLayer classes. - Measure refusal expressions in model responses using the ExpressionRefusalScorer. - Supports custom behavior directions for applying specific types of transformations.

Show HN: Making AR experiences is still painful – had to make my own editor

Hey HN!

My co-founder and I have spent over a decade building mixed reality projects and have grown more and more frustrated with the process: the number of tools needed, client sign-off, the complex, esoteric tech stacks, and especially how slow the iteration loop is when dealing with interactivity and UX.

Two years ago we decided to stop whining and fix the fundamental issues. Ordinary Objects [0] was built around our core needs for AR prototyping: a very tight iteration loop between editor and real device, real-time interactivity while editing, and clear, concise flow management and mapping. Many things layered on to make all of that possible: making it multi-user from the ground up, handling assets without a fuss, and building up a new interaction language.

From a technical standpoint we wanted to be native on all of the platforms we support, and to get there as quickly as possible. Two years ago the best tool for that was Unity, and I believe that is still the case today. Everything else lives inside our custom C# Redux implementation. Our multi-user needs are very different from games, and it helped a lot to learn from Figma's technical notes when implementing our pseudo-eventual-consistency setup. It's been super nice to be multi-user from the get-go; we've been able to explore much more functionality this way.

Once the core churn eases up a bit more, we will open source this particular C# Redux setup, since it has nothing to do with any engine code.

The website has some quick examples of how the design tool works [0]. If you want to see a more complete prototype, here is something Gregor, my co-founder, put together recently [1]. We now also have an easy getting-started playlist [2].

Over the past year we've been testing with closed groups and have been excited by what everyone is making. Now we are ready to open it up for all of you to try! Give it a spin and let me know what you think! Happy to answer any questions here :)

[0]: https://ordinary.space

[1]: https://www.linkedin.com/posts/onlinegregor_mixedreality-sportsinnovation-wearabletech-activity-7275472488883462144-dT7j

[2]: https://www.youtube.com/watch?v=S_A9gShKENE&list=PLidKy8OpOyFMa5oynAjF1viiT7ynKYYbJ
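For readers who know Redux only from JavaScript, here is a minimal sketch of the pattern (in Python for brevity; their implementation is in C# and not yet released): one store holds all state, pure reducers compute the next state from an action, and subscribers react to each dispatch.

```python
# Minimal sketch of the Redux pattern: one store, immutable state, pure
# reducers. The editor-flavored action names are illustrative only.
from dataclasses import dataclass, replace
from typing import Callable

@dataclass(frozen=True)
class State:
    selected_object: str | None = None
    objects: tuple[str, ...] = ()

@dataclass(frozen=True)
class Action:
    kind: str
    payload: str

def reducer(state: State, action: Action) -> State:
    """Pure function: (state, action) -> new state. No mutation."""
    if action.kind == "ADD_OBJECT":
        return replace(state, objects=state.objects + (action.payload,))
    if action.kind == "SELECT":
        return replace(state, selected_object=action.payload)
    return state

class Store:
    def __init__(self, reducer: Callable[[State, Action], State], state: State):
        self._reducer, self.state, self._subscribers = reducer, state, []

    def dispatch(self, action: Action) -> None:
        self.state = self._reducer(self.state, action)
        for fn in self._subscribers:   # e.g. re-render the scene view
            fn(self.state)

    def subscribe(self, fn: Callable[[State], None]) -> None:
        self._subscribers.append(fn)

store = Store(reducer, State())
store.dispatch(Action("ADD_OBJECT", "cube"))
store.dispatch(Action("SELECT", "cube"))
assert store.state.selected_object == "cube"
```

Keeping every change funneled through a single dispatch point like this is also what makes multi-user sync tractable: actions, not raw state, are what get replicated between clients.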

Show HN: I Made an iOS Podcast Player with Racket

Show HN: Bagels – TUI expense tracker

Hi! I'm Jax, and I've been building this cool little terminal app for myself to track my expenses and budgets!

Other than challenging myself to learn Python, I built this mainly around the habit of budget tracking at the end of the day. (I tried tracking on the go, but the balance was always out of sync.) All data is stored in a single SQLite file, so you can export and process it however you want!

The app is built with Textual for Python. Awesome framework which feels like I'm doing webdev haha.

You can check out some screenshots on GitHub: https://github.com/EnhancedJax/Bagels

Thanks!
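Because everything lives in one SQLite file, you can analyze your data with plain Python. The file path and the table/column names below are assumptions for illustration; inspect the real schema first (e.g. with `sqlite3 <file> .schema`).

```python
# Hypothetical analysis of the Bagels database: total spend per category.
# "bagels.db" and the `record` table/columns are placeholders, not the
# app's documented schema.
import sqlite3

conn = sqlite3.connect("bagels.db")
for category, total in conn.execute(
    "SELECT category, SUM(amount) FROM record GROUP BY category"
):
    print(f"{category}: {total:.2f}")
conn.close()
```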

Show HN: A new native app for 20 year old OS X

A few of us here are probably familiar with the original Xbox modding scene and the iconic xbins FTP server. Recently, I came across an amazing tool called Pandora by Team Resurgent [0], which got me thinking about how incredible something like this would have been 20 years ago. Just to clarify, I had no involvement in creating Pandora; I'm just inspired by their work.

For those who aren't familiar, getting access to xbins involves a rather dated process: you connect to a channel on an EFnet IRC server, message a bot for temporary credentials, then plug those credentials into your FTP client to access xbins. Pandora (and my app) simplifies this entire workflow into a single click.

Inspired by Pandora, I decided to build my own take on what this dream tool might have looked like back in the day. I wrote a native Mac app on original hardware, an Intel iMac (20-inch, 2007), running a 20-year-old operating system: Mac OS X 10.4 Tiger.

This was my first foray into native Mac app development, though I've done some iOS development in the past. The result is Uppercut [1], and the source is available on GitHub [2].

For the development process, I used Claude to help with a lot of the coding, especially since I was constrained to Xcode 2.5 and the pre-"Objective-C 2.0" features available at the time. I had to be very specific in prompting Claude to avoid newer features that didn't exist back then. Since the majority of Objective-C code out there comes from the era of iOS development (which relied heavily on Objective-C 2.0 until the arrival of Swift), this was a unique and challenging exercise in retro development.

[0] - https://github.com/Team-Resurgent/Pandora

[1] - https://uppercut.chadbibler.com

[2] - https://github.com/chadkeck/Uppercut
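The two-step workflow Pandora and Uppercut automate can be sketched roughly as follows (in Python for illustration; Uppercut itself is Objective-C). The bot nickname, command, and reply handling are placeholders, since those details aren't in this post.

```python
# Sketch of the xbins access workflow described above: ask an IRC bot on
# EFnet for temporary credentials, then use them with an FTP client.
import socket
from ftplib import FTP

# Step 1: connect to EFnet and ask the bot for temporary credentials.
irc = socket.create_connection(("irc.efnet.org", 6667))
irc.sendall(b"NICK tempnick\r\nUSER tempnick 0 * :tempnick\r\n")
irc.sendall(b"PRIVMSG xbins-bot :!creds\r\n")   # placeholder bot and command
# ...read and parse the bot's reply here; the format varies, so it's omitted...
host, user, password = "ftp.example.org", "u123", "p456"  # placeholder values

# Step 2: plug the temporary credentials into an FTP client.
with FTP(host) as ftp:
    ftp.login(user, password)
    print(ftp.nlst())   # list the server's contents
```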
