The best Hacker News stories from Show HN from the past day
Latest posts:
Show HN: Bi-directional sync between Postgres and SQLite
Hi HN,<p>Today we’re launching PowerSync, a Postgres<>SQLite bi-directional sync engine that enables an offline-first app architecture. It currently supports Flutter, React Native and web (JavaScript) using Wasm SQLite in the browser, with more client SDKs on the way.<p>Conrad and I (Ralf) have been working on our sync engine since 2009, originally as part of a full-stack app platform. That version of the system is still used in production worldwide and we’ve learnt a lot from its use cases and scaling. About a year ago we started spinning off PowerSync as a standalone product designed to be stack-agnostic.<p>If you’d like to see a simple demo, check out the pebbles widget on the landing page here: <a href="https://www.powersync.com/" rel="nofollow noreferrer">https://www.powersync.com/</a><p>We wrote about our architecture and design philosophy here: <a href="https://www.powersync.com/blog/introducing-powersync-v1-0-postgres-sqlite-sync-layer" rel="nofollow noreferrer">https://www.powersync.com/blog/introducing-powersync-v1-0-po...</a><p>This covers, amongst other things, how we designed the system for scalable dynamic partial replication, why we use a server-authority architecture based on an event log instead of CRDTs for merging changes, and our approach to consistency.<p>Our docs can be found here: <a href="https://docs.powersync.com/" rel="nofollow noreferrer">https://docs.powersync.com/</a><p>We would love to hear your feedback!
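The server-authority idea above can be illustrated with a minimal sketch (not PowerSync's actual implementation; the `Event` shape and names are hypothetical): every change is an entry in a server-ordered event log, and clients converge by replaying the log in the server-assigned order rather than by CRDT merging.

```python
from dataclasses import dataclass

@dataclass
class Event:
    seq: int        # server-assigned order (the authority)
    row_id: str
    column: str
    value: object

def apply_log(state: dict, events: list[Event]) -> dict:
    # Apply events strictly in server-assigned order: every client
    # that replays the same log converges to the same state, with
    # no per-field merge logic needed on the client.
    for ev in sorted(events, key=lambda e: e.seq):
        state.setdefault(ev.row_id, {})[ev.column] = ev.value
    return state

# Two concurrent edits to the same row: the one the server
# sequenced later wins, regardless of arrival order.
state = apply_log({}, [
    Event(2, "todo:1", "title", "buy milk (edited)"),
    Event(1, "todo:1", "title", "buy milk"),
])
print(state["todo:1"]["title"])  # buy milk (edited)
```

The design trade-off the post describes: ordering is centralized at the server, so conflict resolution stays simple and auditable, at the cost of requiring a server round-trip to settle concurrent writes.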
- Ralf, Conrad, Kobie, Phillip and team
Show HN: Docu – Never sign a sketchy contract again. GPT-4 contract review
A simple tool for people like me whose brain hurts when they read any legal document full of bloated jargon.<p>It's called Docu and it makes contract review super simple. It gives you a very readable summary, highlights beneficial clauses, flags potential risks, and gives you actions you can take to make the contract work for you.<p>Built on the OpenAI Assistants API using a combination of prompts.<p><a href="https://docu.review" rel="nofollow noreferrer">https://docu.review</a>
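The "combination of prompts" approach can be sketched as a function that assembles the four review tasks into one instruction block (a hypothetical illustration, not Docu's actual prompts):

```python
def build_review_prompt(contract_text: str) -> str:
    """Combine the review tasks into a single model prompt."""
    tasks = [
        "1. Give a plain-language summary of the contract.",
        "2. Highlight clauses that benefit the signer.",
        "3. Flag clauses that carry potential risks.",
        "4. Suggest concrete actions the signer can take.",
    ]
    return (
        "You are a contract-review assistant.\n"
        + "\n".join(tasks)
        + "\n\nContract:\n"
        + contract_text
    )

prompt = build_review_prompt("The Client shall pay within 90 days...")
```

The assembled prompt would then be sent to the model; keeping each task as a separate numbered instruction makes the response easier to split back into summary, highlights, risks, and actions.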
Show HN: pgxman – npm for Postgres extensions
pgxman is npm for Postgres extensions, simplifying the discovery and use of extensions so you can easily enhance your applications.<p>Installing and updating Postgres extensions is an uphill battle. You're left searching for the right build tools and grappling with often unclear and incomplete compiling instructions just to try one out. But with pgxman, we've streamlined the process to one simple step: pgxman install [extension name].<p>For example, to build parquet_s3_fdw manually, you'd need to:<p>1. Download the parquet_s3_fdw source code.<p>2. Figure out how to build it from the README; when that's unclear, read the source code. The README of parquet_s3_fdw says it needs libarrow, libparquet & aws_sdk, but doesn’t say where to get them.<p>3. Make sure all dependencies are available to build the project: install them from apt if available, or build them manually if not. For parquet_s3_fdw, aws_sdk has to be built manually, since it’s not available in any apt repos.<p>4. Build the extension targeting the correct OS, CPU architecture and Postgres version.<p>5. Determine and build to the right path (/usr/lib/postgresql/15/lib); otherwise, Postgres won’t be able to find the extension.<p>6. Repeat across all relevant Postgres instances. Hopefully these Postgres versions are recent, or else you’ll have to update Postgres, set a maintenance window, etc.<p>* Added friction: parquet_s3_fdw is not designed for use in a cloud environment, so we forked it and made the changes needed to use it there.<p>Using pgxman, you can just do:<p><pre><code> pgxman install parquet_s3_fdw
</code></pre>
pgxman integrates with your system package manager, ensuring the correct versions of shared dependencies between extensions are installed without duplicate packages. pgxman’s automated build system creates APT packages for each Postgres version, platform, and OS supported by the extension. Extensions are built from a “buildkit” formula, written in YAML, and are contributed through GitHub.<p>To install pgxman, you can do<p><pre><code> brew install pgxman/tap/pgxman
</code></pre>
or, if you don't mind pipe-to-shell,<p><pre><code> curl -sfL https://install.pgx.sh | sh -
</code></pre>
If you'd like to learn more, we have an extensive blog post here: <a href="https://www.hydra.so/blog-posts/the-design-of-postgres-extension-manager-pgxman">https://www.hydra.so/blog-posts/the-design-of-postgres-exten...</a>.
Show HN: Python-Type-Challenges, master Python typing with online exercises
Hi HN, I'm excited to share Python-Type-Challenges, a collection of hands-on, interactive challenges designed to help Python developers master type annotations. Whether you're new to type hints or looking to deepen your understanding, these exercises provide a fun and educational way to explore Python's type system. I'd love to get your feedback and contributions!<p><a href="https://github.com/laike9m/Python-Type-Challenges">https://github.com/laike9m/Python-Type-Challenges</a>
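A mini-challenge in the same spirit (an illustration I wrote, not taken from the repo): annotate a function so a type checker accepts lists of ints and rejects lists of strings, then inspect the hints at runtime.

```python
from typing import get_type_hints

# Challenge: annotate `gather` so that `gather([1, 2, 3])` type-checks
# but `gather(["a", "b"])` is rejected by mypy/pyright.
def gather(values: list[int], *, default: int = 0) -> int:
    return sum(values) or default

# Annotations are ordinary runtime objects, so you can verify them:
hints = get_type_hints(gather)
print(hints["return"])  # <class 'int'>
```

The site's challenges work the same way: you edit annotations until the embedded type checker passes, which makes the feedback loop much tighter than reading typing docs alone.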
Resume Matcher – An open source, free tool to improve your resume
Show HN: Error return traces for Go, inspired by Zig
Show HN: Hacky Meta Glasses GPT4 Vision Integration
Super hacky implementation due to the lack of an SDK. Fun project though.<p>In the foodlog demonstration I just made a fake fb account (sorry zucc) called "Mye Food-Log".
Show HN: Dobb·E – towards home robots with an open-source platform
Hi HN! Proud to share our open-source robot platform, Dobb·E, a home robot system that needs just 5 minutes of human teaching to learn new tasks. We've already taken Dobb·E to 10 different homes in New York, taught it 100+ tasks, and we are just getting started! I would love to hear your thoughts about this.<p>Here are some more details, below (or see a Twitter thread with attached media: <a href="https://twitter.com/i/status/1729515379892826211" rel="nofollow noreferrer">https://twitter.com/i/status/1729515379892826211</a> or <a href="https://nitter.net/i/status/1729515379892826211" rel="nofollow noreferrer">https://nitter.net/i/status/1729515379892826211</a>):<p>We engineered Dobb·E to maximize efficiency, safety, and user comfort. As a system, it is composed of four parts: a data collection tool, a home dataset, a pretrained vision model, and a policy fine-tuning recipe.<p>We teach our robots with imitation learning, and for data collection, we created the “Stick”, a tool made out of $25 of hardware and an iPhone.<p>Then, using the Stick, we collected a 13-hour dataset in 22 New York homes, called Homes of New York (HoNY). HoNY has 1.5M frames collected across 216 different "environments", an order of magnitude larger than similar open-source datasets.<p>Then we trained a foundational vision model that we can fine-tune fast (15 minutes!) on a new task with only 5 minutes (human time) / 90 seconds (demo time) of data. So from start to finish, it takes about 20 minutes to teach the robot a new task.<p>Over a month, we visited 10 homes, tried 109 tasks, and achieved an 81% success rate on simple household tasks.
We also found a range of challenges, from mirrors to heavy objects, that we must overcome if we are to get a general-purpose home robot.<p>We open-sourced our entire system because our primary goal is to get more robotics and AI researchers, engineers, and enthusiasts to go beyond constrained lab environments and start getting into homes!<p>So here is how you can get started:<p>1. Code and STL files: <a href="https://github.com/notmahi/dobb-e/">https://github.com/notmahi/dobb-e/</a><p>2. Technical documentation: <a href="https://docs.dobb-e.com/" rel="nofollow noreferrer">https://docs.dobb-e.com/</a><p>3. Paper: <a href="https://arxiv.org/abs/2311.16098" rel="nofollow noreferrer">https://arxiv.org/abs/2311.16098</a><p>4. More videos and the dataset: <a href="https://dobb-e.com" rel="nofollow noreferrer">https://dobb-e.com</a><p>5. Robot we used: <a href="https://hello-robot.com" rel="nofollow noreferrer">https://hello-robot.com</a>
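The fine-tuning recipe described above, a frozen pretrained vision encoder with a small policy head cloned from a short demonstration, can be sketched in a toy form (synthetic NumPy data standing in for encoder features and Stick-recorded actions; this is illustrative, not Dobb·E's actual training code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: per-frame embeddings from a frozen pretrained encoder,
# and the demonstrated actions from a ~90-second recording.
features = rng.normal(size=(256, 32))   # 256 frames, 32-dim embeddings
true_w = rng.normal(size=(32, 6))       # unknown demo policy (6-DoF actions)
actions = features @ true_w             # demonstrated actions per frame

def loss(w):
    # Behavior cloning objective: mean-squared error between the
    # policy head's predictions and the demonstrated actions.
    return float(np.mean((features @ w - actions) ** 2))

# Only the small linear head is trained; the encoder stays frozen,
# which is what makes per-task fine-tuning fast.
w = np.zeros((32, 6))
initial = loss(w)
for _ in range(200):
    grad = 2 * features.T @ (features @ w - actions) / len(features)
    w -= 0.01 * grad
final = loss(w)
```

Because only the head's parameters are updated against minutes of data, the per-task cost stays small; the heavy lifting is amortized into the pretrained encoder trained once on the full HoNY dataset.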
Show HN: Spaceflight News made using Htmx
Show HN: Multiple – Load test any API with JavaScript and NPM packages
Hey HN,<p>I wanted a better load testing solution – so I built one with my team at Multiple. We just opened early access and would love to get your feedback.<p>We created Multiple to solve three challenges with existing tools:<p>1. Limited scripting capabilities. XML or GUI-based scripting can only test basic scenarios. Existing code-based tools struggle with auth, generating synthetic data, and testing anything other than HTTP requests. We went with JavaScript for ease of use, versatility, and integration with existing developer workflows.<p>2. Cannot use existing libraries or code. Instead of forcing you to learn a new system and rewrite code, Multiple leverages the JavaScript and NPM ecosystem so you can use packages you're already familiar with. By supporting NPM packages, Multiple can test nearly any API, service, or protocol.<p>3. Tedious infrastructure management. There's no reason to spend time spinning up and configuring machines, and then destroying them after a test. Multiple abstracts that away. You just enter the test size and duration, and press start.<p>My favorite feature we've built so far is the Debug Run. You can use Debug Run as you write your tests to execute a single run-through. It's helpful to verify correct behavior and capture logs, and it allows you to iterate quickly, without spinning up a full load test each time.<p>We have so much in store for developers: pass/fail conditions, CLI, and repo integration, to name a few. Thanks for reading, and let us know what you think.
Show HN: Transform Notes into Visual Mind Maps
Hey, HN Community!<p>I'm a Brazilian software engineer, and my educational journey led to a significant discovery.
During my master's degree, I often struggled to recall foundational concepts studied years earlier, which were crucial for my current studies. This challenge sparked a reevaluation of traditional note-taking and inspired the creation of cmaps.io.<p>Human thinking is inherently associative, not linear. We often draw connections between learned concepts unconsciously. To leverage this natural process, making these connections explicit is essential – and that's precisely what cmaps.io offers.<p>I can personally vouch for the impact of cmaps.io on my learning and retention. It's become an invaluable tool for me, and I'm confident it can be for you as well!<p>You can find more details on the ProductHunt post:
<a href="https://www.producthunt.com/posts/cmaps-io" rel="nofollow noreferrer">https://www.producthunt.com/posts/cmaps-io</a><p>Discover cmaps.io and enjoy a revolutionary note-taking experience!<p>I eagerly await your feedback and thoughts.
Show HN: Cap – open-source alternative to Loom