The best Hacker News stories from Show HN from the past day
Latest posts:
Show HN: Aidlab – Health Data for Devs
Hey HN! I'm Jakub, and together with my co-founders Agnieszka and Nathan, we built Aidlab, a wearable that gives developers gold-standard physiological data.

Unlike health trackers with locked-down APIs, Aidlab ships with a free SDK [1] across 6+ platforms, so you can just `pip install aidlabsdk` or `flutter pub add aidlab_sdk` (or use whatever platform you like, even Unity) and start streaming raw health data and events in real time through simple `didReceive*(timestamp, value)` callbacks.

Currently we expose 13 data types through the API, including raw ECG, cough/snoring, motion, raw respiration, skin temperature, bodyweight reps, and body position, plus 20 high-level stats like stress and readiness.

The two most common questions I get are:

1) "How is it better than my smartwatch?"

2) "Why did you build it?"

Chest-mounted wearables are considered the gold standard for physiological measurements. For example, whenever Apple validates their watch, they benchmark against chest straps [2], because some signals can only be reliably measured (or measured at all!) near the heart, including continuous ECG, true respiration (based on lung volume changes), and body position/orientation.

As for the second question: the problem for us was that smartwatches were too simple and their data too inaccurate, while advanced medical devices were too pricey or too complicated. We found a sweet spot between accuracy and accessibility: Aidlab delivers medical-grade signals without the hospital-level complexity. Since "medical-grade" is a bold claim, we've published validation papers comparing Aidlab's performance with other certified medical devices [3].

Today Aidlab is already a pretty mature product. We've been building it for years: we shipped our first version in 2020 and landed our first clients, including Boeing/Jeppesen (monitoring pilots' bio-signals during tests and training).

Now we're about to release Aidlab 2 [4], with additional signals like EDA and GPS, plus a bunch of new features, including on-device ML (we've trained a few small LSTM models running inference with TensorFlow Lite for Micro). The cool part is that we've built a custom shell on top of FreeRTOS, letting anyone invoke POSIX-like commands directly on the device, for example:

`timeout 10 temperature --sampling-rate 1 | tee /data/temperature.csv | tail -n 5`

The biggest breakthrough for us was realizing that cloud-based processing was the wrong approach. In the beginning we pushed most of the computation to the cloud. It seemed natural, but it turned out to be slow and costly, and devs didn't want it ("hey, is there a way to use your product without the cloud?"). For example, our ECG analysis pipeline used to send raw data to an external microservice, processing it in 30-minute chunks through Bull queues; a 24-hour Holter analysis could spawn 100k+ event objects and take a long time to complete. Now we're doing everything we can to move computation to the edge. In an ideal world, the cloud wouldn't store or process anything, just receive already-analyzed, privacy-preserving results straight from the device.

Another lesson: don't hand-solder prototypes at 3 a.m. to save money. Pay professionals to assemble PCBs.

We decided to showcase this now for three reasons:

- health feels more relevant than ever with the rise of longevity research and biohacking,

- we are close to finalizing Aidlab 2,

- and I am super curious to see if anyone here finds it useful!

If you'd like to check Aidlab's quality for yourself, we publish free datasets recorded during different activities every week [5].

[1] https://github.com/Aidlab

[2] https://www.apple.com/health/pdf/Heart_Rate_Calorimetry_Activity_on_Apple_Watch_November_2024.pdf

[3] https://aidlab.com/validation

[4] https://aidlab.com/aidlab-2

[5] https://aidlab.com/datasets
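To make the callback pattern concrete, here is a minimal Python sketch. The import path, class name, and method names are assumptions based on the `pip install aidlabsdk` package name and the `didReceive*(timestamp, value)` convention described above; check the SDK repo [1] for the real API.

```python
# Hypothetical sketch of streaming Aidlab data from Python.
# Names below are guesses based on the didReceive*(timestamp, value)
# callback convention; the actual SDK API may differ.
from aidlabsdk import AidlabManager  # assumed import path


class MyDelegate:
    def didReceiveECG(self, timestamp, value):
        print(f"{timestamp}: ECG sample {value}")

    def didReceiveSkinTemperature(self, timestamp, value):
        print(f"{timestamp}: skin temperature {value:.1f} degC")


manager = AidlabManager(delegate=MyDelegate())
manager.connect()  # scan for a nearby Aidlab over BLE and start streaming
```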
Show HN: Baby's first international landline
Hi HN,

As a weekend project, I hacked together a physical phone, a Raspberry Pi running Asterisk, and Twilio to let toddlers safely make international calls.

I've documented the setup in this write-up and published the code and Ansible playbooks on GitHub so others can replicate it.

I built this so kids of expats can easily stay in touch with family on other continents.

Would love feedback from anyone who's worked on something similar or wants to try building it themselves!

Write-up: https://wip.tf/posts/telefonefix-building-babys-first-international-landline/
GitHub repos:
- https://github.com/nbr23/ansible-role-telefonefix
- https://github.com/nbr23/allo-wed
Show HN: AI toy I worked on is in stores
Alt link: https://mrchristmas.com/products/santas-magical-telephone

Video demo: https://www.youtube.com/watch?v=0z7QJxZWFQg

The first time I talked with the AI Santa and it responded with a joke, I was HOOKED. The fun/nonsense doesn't click until you try it yourself. What's even more exciting is that you can build it yourself:

libpeer: https://github.com/sepfy/libpeer

Pion: https://github.com/pion/webrtc

Then do all your fun logic in your Pion server. Connect to any voice-AI provider, or roll your own with open source. Anything is possible.

If you have questions or hit any roadblocks, I would love to help. I have lots of hardware snippets on my GitHub: https://github.com/sean-der.
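If you want to try the DIY route, a Pion server that accepts the device's audio is only a few lines of Go. This is a minimal sketch, not the toy's actual code; the signaling step (exchanging the SDP offer/answer with the device) is omitted:

```go
// Minimal Pion sketch: accept a peer connection and handle incoming
// audio, which you would forward to your voice-AI provider of choice.
package main

import "github.com/pion/webrtc/v3"

func main() {
	pc, err := webrtc.NewPeerConnection(webrtc.Configuration{})
	if err != nil {
		panic(err)
	}
	pc.OnTrack(func(track *webrtc.TrackRemote, receiver *webrtc.RTPReceiver) {
		// read RTP packets here and pipe the audio to your AI backend
		for {
			if _, _, err := track.ReadRTP(); err != nil {
				return
			}
		}
	})
	// ... exchange the SDP offer/answer with the device (signaling omitted)
	select {} // block forever
}
```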
Show HN: SQLite Online – 11 years of solo development, 11K daily users
Show HN: A Lisp Interpreter for Shell Scripting
Redstart is a lightweight Lisp interpreter written in C++ with a focus on shell scripting. It lets you combine the expressive power of Lisp with the practicality of the Unix shell: you can run commands, capture output, pipe between processes, and still use Lisp syntax for logic and structure. Think of it as writing your shell scripts in Lisp instead of Bash.
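For a rough flavor of the idea, here is a hypothetical script in the style the description suggests. These forms are invented for illustration, not Redstart's actual syntax; check the project for its real primitives:

```lisp
;; Hypothetical sketch, not Redstart's actual syntax:
;; capture a command's output, branch on it in Lisp, then run a pipeline.
(let ((os (capture "uname" "-s")))
  (if (equal os "Linux")
      (pipe (run "ps" "aux") (run "grep" "ssh"))
      (print "not on Linux")))
```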
Show HN: I made an esoteric programming language that's read like a spellbook
i made an esoteric programming language which i call spellscript.
every program is a "spell" written in a "grimoire," and you have to use keywords like summon, enchant, inscribe, and conjure.

it's literally read like a spellbook: the syntax is all natural language, and newlines are optional. your code can now be an essay, like everybody wants!

for example, if you want to print something, you'd write:

`begin the grimoire. inscribe whispers of "hello, world!". close the grimoire.`

it has variables, dynamic typing, arrays, functions, conditionals, loops, string manipulation, array manipulation, type conversion, and user input, among others (all listed in the docs!)

but why? i wanted to see how far you could push natural language syntax while still being parseable. most esolangs are intentionally obtuse (BF, Malbolge), but i wanted something that's weird yet readable, like you're reading instructions from a spellbook, which makes it incredibly easy to read and understand. like an anti-esolang? hmm...

github: https://github.com/sirbread/spellscript

docs: https://github.com/sirbread/spellscript/blob/main/resources/documentation.md
Show HN: Rift – A tiling window manager for macOS
Show HN: I built a simple ambient sound app with no ads or subscriptions
I've always liked having background noise while working or falling asleep, but I got frustrated that most "white noise" or ambient sound apps are either paywalled, stuffed with ads, or try to upsell subscriptions for basic features.

So I made Ambi, a small iOS app with a clean interface and a set of freely available ambient sounds: rain, waves, wind, birds, that sort of thing. You can mix them, adjust volume levels, and just let it play all night or while you work. Everything works offline and there are no hidden catches.

It's something I built for myself first, but I figured others might find it useful too. Feedback, bugs, and suggestions are all welcome.

https://apps.apple.com/app/ambi-white-noise-sleep-sounds/id6753184615