The best "Show HN" stories from Hacker News from the past day

Latest posts:

Show HN: Reverse-Engineering a Switch Lite with 1,917 wires

Hey Hackers. This is a project I solo-developed that turns completed PCB assemblies into an easy-to-use boardview with accompanying board scans. There are easier and better ways of doing this, but this is an experiment in doing it as cheaply as possible, with the highest quality and lowest chance of errors. The technical details are in the link.

Most public boardviews are almost entirely the result of industrial espionage, aside from a few encrypted, subscription-based software platforms that provide extensive access. The output of this process is released as donationware: my main concern is that even as a low-cost purchase, there is a very strong culture of sharing this type of information at no cost. I would like a more sophisticated suggested-donation system that adapts to the user's country, but I wasn't able to find a good solution.

In terms of "good startup ideas", I don't think this is one of them. The very high level of soldering skill required makes it difficult to scale, and the prevailing piracy culture makes it challenging to monetize. My main advantage is that costs are very low now that the entire pipeline works. Other than forging ahead at a loss and hoping for the best, or pivoting hard to leverage the imaging technology, I'm not sure what other options I have. It feels too complicated and repetitive for short-form video content. If you have any feedback, questions, suggestions, etc., I'd love to hear them.

Show HN: Hacker News Telegram Bot

Show HN: Consol3 – A 3D engine for the terminal that executes on the CPU

Hi all,

This has been my hobby project for quite a few years now. It started as a small engine to serve as a sandbox for trying out new 3D graphics ideas. After adding many features throughout the years and rewriting the entire engine a few times, this is the latest state.

It currently supports loading models with animations, textures, lights, shadow maps, normal maps, and some other goodies. I've also recently added voxel raymarching as an alternative renderer, along with a fun physics simulation :)

Show HN: CodeMate – The Revolutionary Search Engine for Developers

Show HN: Chartbrew v3.0 released – an open-source client reporting platform

Show HN: Little Fixes – a spatial forum to improve your city

I love urban planning and think the way we interact with the built environment has a huge impact on individuals. But I also think most people have been trained to take the built environment as a given rather than something they have partial ownership of. By building a place to discuss their community on a hyper-local scale, I'm hoping to encourage residents to feel like they are an important piece of their city.

I thought building a sort of spatial forum, where city residents can discuss the little annoyances in their neighborhoods, might help people a) start thinking about which parts of the built environment bug them and b) realize that other people in their neighborhood probably have the same complaint. Of course, I know that local politics can turn nasty quickly, hence the name of the site: I'm hoping to keep discussion focused on potential *fixes* for each problem.

If you're excited about this but your city isn't on the list, I'm happy to add it as long as you promise to make at least one post. It's extra helpful if you go to geojson.io and create GeoJSON for a closed polygon marking where you think the bounds of your city should be (it doesn't need to match official city boundaries), but I'm happy to guess and do that part myself. Let me know here or by email if you want your city added!
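
For anyone sketching bounds by hand rather than via geojson.io, a city boundary is just a GeoJSON Polygon: one ring of [longitude, latitude] pairs whose first and last points are identical. A minimal sketch with made-up coordinates and a hypothetical city name:

```python
import json

# A hypothetical city boundary as a GeoJSON Feature. Coordinates are
# [longitude, latitude] pairs, and the ring must be closed: the first
# and last points are identical.
city_bounds = {
    "type": "Feature",
    "properties": {"name": "Exampleville"},
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-122.45, 37.70],
            [-122.35, 37.70],
            [-122.35, 37.80],
            [-122.45, 37.80],
            [-122.45, 37.70],  # same as the first point -> closed ring
        ]],
    },
}

# Quick sanity check before submitting: the ring is closed.
ring = city_bounds["geometry"]["coordinates"][0]
assert ring[0] == ring[-1]
print(json.dumps(city_bounds))
```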

Show HN: GPU Prices on eBay

Howdy!

Keeping with the trend of being influenced by https://diskprices.com, I wanted to make a resource for GPUs on eBay that also takes into account a performance metric I tend to look at when comparing GPUs.

It's still a work in progress, but it's at a state where I think some of HN might find it useful!

There are a few things I plan to add in the coming days:

1. Filters for compatible slots and connectors
2. Support for different eBay regions, rather than just a US focus

Let me know if you have any questions, want more or different filters on the page, or if I missed something important.
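
The post doesn't say which performance metric the site uses, but the usual value ranking (analogous to diskprices.com's dollars-per-terabyte) is dollars per benchmark point. A sketch with entirely made-up listings and scores:

```python
# Hypothetical eBay listings as (model, price_usd), plus a made-up
# benchmark score per model. Lower dollars-per-point = better value.
listings = [
    ("RTX 3060", 250.0),
    ("RTX 3080", 450.0),
    ("RX 6700 XT", 280.0),
]
benchmark_score = {"RTX 3060": 8800, "RTX 3080": 17600, "RX 6700 XT": 11200}

def dollars_per_point(item):
    model, price = item
    return price / benchmark_score[model]

# Sort best value first, the way diskprices.com sorts by $/TB.
ranked = sorted(listings, key=dollars_per_point)
for model, price in ranked:
    print(f"{model}: ${price / benchmark_score[model]:.4f} per point")
```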

Show HN: Refractify – Optical software against myopia

Last summer there was an Ask HN[1] about a Nature article suggesting that blurring the blue and green color channels on screen may help against early myopia development. The OP wanted such software, and there was none available.

So I quit my job and implemented it, and made a short video about it with a 3D artist. Marketing turned out to be expensive, so I made an open-source browser-extension version too.

How does it work? There is a small neural network in the retina that tries to detect whether the eye is far-sighted (most people are born far-sighted), and it produces dopamine to slow or increase the eye's growth rate. It is not very smart, and if you do a lot of near work it can conclude that you are still hyperopic, causing further myopia progression.

So, based on the refractive properties of the eye, the software calculates a signal that would convince this retinal network that the eye is already long enough, so that it produces dopamine, a known signal to stop axial eye growth (based on the myopic-defocus LCA effect from the papers[2][3]). Some myopia-control techniques work similarly, such as MiSight and Hoya lenses.

Since then I've gained a neurobiologist co-founder, and the goal is to understand the retinal network well enough to create the best anti-myopic effect that does not interfere with productivity.

The effect can be tried live on the site. Also check out the GitHub repo. Any questions or suggestions are welcome!

[1] https://news.ycombinator.com/item?id=37019143
[2] https://www.nature.com/articles/s41598-022-26323-7
[3] https://www.sciencedirect.com/science/article/abs/pii/S0014483522002676?via=ihub
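
The core image operation described (blur the blue and green channels, keep red sharp) can be sketched in pure Python. This is not Refractify's actual algorithm, just a rough illustration of the idea from the cited papers, using a simple 3x3 box blur instead of whatever kernel the real software uses:

```python
# Sketch: blur only the green and blue channels of an RGB image, leaving
# red sharp, to crudely simulate myopic defocus from longitudinal
# chromatic aberration. NOT Refractify's actual algorithm.

def box_blur(channel, width, height):
    """3x3 box blur of a flat channel list; edge pixels average fewer taps."""
    out = [0] * (width * height)
    for y in range(height):
        for x in range(width):
            total = count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < width and 0 <= ny < height:
                        total += channel[ny * width + nx]
                        count += 1
            out[y * width + x] = total // count
    return out

def defocus_blue_green(pixels, width, height):
    """pixels: flat list of (r, g, b) tuples; blur only g and b."""
    r = [p[0] for p in pixels]
    g = box_blur([p[1] for p in pixels], width, height)
    b = box_blur([p[2] for p in pixels], width, height)
    return list(zip(r, g, b))

# A 3x3 black image with one bright yellow centre pixel: after the
# filter, green spreads to the neighbours while red stays untouched.
img = [(0, 0, 0)] * 9
img[4] = (255, 255, 0)
out = defocus_blue_green(img, 3, 3)
```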

Show HN: OK-Robot: open, modular home robot framework for pick-and-drop anywhere

Hi all, excited to share our latest work, OK-Robot, an open, modular framework for navigation and manipulation with a robot assistant in practically any home, without having to teach the robot anything new! You can simply unbox the target robot, install OK-Robot, give it a "scan" (think a 60-second iPhone video), and start asking the robot to move arbitrary things from A to B. We have already tested it in 10 home environments in New York City, and one environment each in Pittsburgh and Fremont.

We based everything on the current best machine learning models, so things don't quite work perfectly all the time, and we are hoping to build it together with the community! Our code is open: https://github.com/ok-robot/ok-robot and we have a Discord server for discussion and support: https://discord.gg/wzzZJxqKYC If you are curious about what works and what doesn't, take a quick look at https://ok-robot.github.io/#analysis or read our paper for a detailed analysis: https://arxiv.org/abs/2401.12202

P.S.: While the code is open, the project unfortunately isn't fully open source, since one of our dependencies, AnyGrasp, has a closed-source educational license. Apologies in advance, but we used it because it was the best grasping model we could get access to!

Would love to hear more thoughts and feedback on this project!
