The best Show HN stories from Hacker News from the past day


Latest posts:

Show HN: Claude Composer

The central feature is something like a "yolo mode", but with fine-grained controls over how yolo you're feeling. It also makes it easy to use "presets" of tools and permissions.

Let me know if you have any questions, and feel free to contact me on X at https://x.com/possibilities


Show HN: I made a 3D SVG Renderer that projects textures without rasterization

Show HN: ClickStack – Open-source Datadog alternative by ClickHouse and HyperDX

Hey HN! Mike & Warren here from HyperDX (now part of ClickHouse)! We've been building ClickStack, an open-source observability stack that helps you collect, centralize, and search/visualize/alert on your telemetry (logs, metrics, traces) in just a few minutes, all powered by ClickHouse (Apache 2.0) for storage, HyperDX (MIT) for visualization, and OpenTelemetry (Apache 2.0) for ingestion.

You can check out the quick start for spinning things up in the repo here: https://github.com/hyperdxio/hyperdx

ClickStack makes it really easy to instrument your application, so you can go from a bug report of "my checkout didn't go through" to a session replay of the user, the backend API calls, the DB queries, and the infrastructure metrics related to that specific request, all in a single view.

For those migrating from Very Expensive Observability Vendor (TM) to something open source and more performant that doesn't require aggressive culling of retention limits and sampling rates, ClickStack gives you a batteries-included way to start that migration journey.

For those unfamiliar with ClickHouse: it's a high-performance database already used by companies such as Anthropic, Cloudflare, and DoorDash to power their core observability at scale, thanks to its flexibility, ease of use, and cost effectiveness. Until now, though, that required teams to dedicate engineers to building a custom observability stack: it was difficult both to get telemetry data into ClickHouse easily and to work without a native UI experience.

That's why we're building ClickStack. We wanted to bundle an easy way to start ingesting your telemetry data, whether it's logs and traces from Node.js or Ruby or metrics from Kubernetes or your bare-metal infrastructure. Just as important, we wanted users to enjoy a visualization experience that lets them search quickly using a familiar Lucene-like syntax (similar to what you'd use in Google!). We recognise, though, that a SQL mode is needed for the most complex queries. We've also added high-cardinality outlier analysis, which charts the delta between outlier and inlier events (we've found it really helpful in narrowing down causes of regressions and anomalies in our traces), as well as log patterns to condense clusters of similar logs.

We're really excited about the roadmap ahead, both for ClickStack as a product and for the ClickHouse core database as an observability engine. We'd love to hear everyone's feedback!

Spinning up a container is pretty simple:

`docker run -p 8080:8080 -p 4317:4317 -p 4318:4318 docker.hyperdx.io/hyperdx/hyperdx-all-in-one`

In-browser live demo (no sign-ups or anything silly; it runs fully in your browser!): https://play.hyperdx.io/
Landing page: https://clickhouse.com/o11y
GitHub repo: https://github.com/hyperdxio/hyperdx
Discord community: https://hyperdx.io/discord
Docs: https://clickhouse.com/docs/use-cases/observability/clickstack/getting-started
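The all-in-one container exposes the standard OTLP ports (4317 for gRPC, 4318 for HTTP). As an illustration of what lands on that HTTP endpoint, here is a sketch, using only the Python standard library, of building a minimal OTLP/JSON log record; the field names follow the OTLP JSON mapping, while the `service.name` and attribute values are made up for the example.

```python
import json
import time


def otlp_log_payload(body, severity="INFO", attrs=None):
    """Build a minimal OTLP/HTTP JSON log payload (a sketch of the
    encoding; field names follow the OTLP JSON mapping)."""
    attrs = attrs or {}
    return {
        "resourceLogs": [{
            "resource": {"attributes": [
                {"key": "service.name", "value": {"stringValue": "checkout"}},
            ]},
            "scopeLogs": [{
                "logRecords": [{
                    "timeUnixNano": str(time.time_ns()),
                    "severityText": severity,
                    "body": {"stringValue": body},
                    "attributes": [
                        {"key": k, "value": {"stringValue": str(v)}}
                        for k, v in attrs.items()
                    ],
                }],
            }],
        }],
    }


payload = otlp_log_payload("checkout failed", "ERROR", {"user.id": "42"})
# POST this JSON to http://localhost:4318/v1/logs (the OTLP/HTTP port
# exposed by the container above), e.g. with urllib.request.
print(json.dumps(payload)[:80])
```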

Show HN: Air Lab – A portable and open air quality measuring device

Hi HN!

I've been working on an air quality measuring device called Air Lab for the past three years. It measures CO2, temperature, relative humidity, air pollutants (VOC, NOx), and atmospheric pressure. You can log and analyze the data directly on the device, no smartphone or laptop needed.

To better show what the device can do and how it feels, I spent the past week developing a web-based simulator using Emscripten. It runs the stock firmware with most features available except for networking. Check it out and let me know what you think!

The firmware will be open source and available once the first batch of devices ships. We're currently finishing up our crowdfunding campaign on Crowd Supply. If you want to get one, now is the time to support the project: https://www.crowdsupply.com/networked-artifacts/air-lab

We started building the Air Lab because most air quality measuring devices we found were locked down or hard to tinker with. Air quality is a growing concern, and we're hoping a more open, playful approach can help make the topic more accessible. It is important to us that the bar for customizing and extending the Air Lab stays low. Until we ship, we plan to create rich documentation and further tools, like the simulator, to make this as easy as possible.

The technical details: the device is powered by the popular ESP32-S3 microcontroller and equipped with a precise CO2, temperature, and relative humidity sensor (SCD41) as well as a VOC/NOx sensor (SGP41) and an atmospheric pressure sensor (LPS22). The support circuitry provides built-in battery charging, a real-time clock, an RGB LED, a buzzer, an accelerometer, and capacitive touch, which makes the Air Lab a powerful stand-alone device. The firmware itself is written on top of ESP-IDF and uses LVGL for rendering the UI.

If you want more high-level info, here are some videos covering the project:
- https://www.youtube.com/watch?v=oBltdMLjUyg (Introduction)
- https://www.youtube.com/watch?v=_tzjVYPm_MU (Product Update)

Would love your feedback on the device, hardware choices, potential use cases, or anything else worth improving. If you want to get notified of project updates, subscribe on Crowd Supply.

Happy to answer any questions!
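The Air Lab firmware itself is C on ESP-IDF, but as a host-side illustration of what the SCD41 hands back over I2C: the sensor returns three 16-bit words (CO2, temperature, humidity), each followed by a Sensirion CRC-8 (polynomial 0x31, init 0xFF), with the conversion formulas from the SCD4x datasheet applied afterwards. This Python sketch decodes such a frame; the synthetic measurement values are made up for the example.

```python
def sensirion_crc8(data: bytes) -> int:
    """CRC-8 as used by Sensirion sensors: polynomial 0x31, init 0xFF."""
    crc = 0xFF
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = ((crc << 1) ^ 0x31) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc


def decode_scd41(frame: bytes):
    """Decode a 9-byte SCD41 read_measurement response: three 16-bit
    big-endian words (CO2, temperature, humidity), each followed by a CRC."""
    words = []
    for i in range(0, 9, 3):
        word, crc = frame[i:i + 2], frame[i + 2]
        if sensirion_crc8(word) != crc:
            raise ValueError(f"CRC mismatch at byte {i}")
        words.append(int.from_bytes(word, "big"))
    co2_raw, t_raw, rh_raw = words
    co2_ppm = co2_raw                        # CO2 word is ppm directly
    temp_c = -45 + 175 * t_raw / 65535       # datasheet conversion
    rh_pct = 100 * rh_raw / 65535
    return co2_ppm, temp_c, rh_pct


# Build a synthetic frame (CO2 = 500 ppm, T ~ 25 degC, RH ~ 50 %) and decode it.
frame = b"".join(
    w.to_bytes(2, "big") + bytes([sensirion_crc8(w.to_bytes(2, "big"))])
    for w in (500, 26214, 32768)
)
print(decode_scd41(frame))
```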

Show HN: Gradle plugin for faster Java compiles

Hey HN,

We've written a pretty cool Gradle plugin I wanted to share.

It turns out that if you native-image the Java and Kotlin compilers, you get a serious speed gain, especially for "smaller" projects (under 10,000 classes).

By compiling the compiler with Native Image, the JIT warmup normally experienced by Gradle, Maven, et al. is skipped. Startup time is extremely fast, since Native Image seals the heap into the binary itself. The native version of javac produces identical outputs from identical inputs: it's the exact same code, just AOT-compiled, translated to machine code, and pre-optimized by GraalVM.

Of course, Native Image isn't optimal in all cases. A warm JIT still outperforms it, but I think most projects never hit fully warmed JIT through Gradle or Maven, because the VM running the compiler so rarely survives long enough.

Elide (the tool used by this plugin) also supports fetching Maven dependencies. When active, it prepares a local m2 root where Gradle can find your dependencies already on disk when it needs them. Preliminary benchmarking shows a 100x+ gain, since lockfiles prevent needless re-resolution and native-imaging the resolver yields a gain similar to the compiler's.

We (the authors) are very much open to feedback on improving this Gradle plugin or the underlying toolchain. Please let us know what you think!
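A quick way to see the cold-start gap described above is to time whole processes rather than warm in-VM benchmarks. This rough harness is purely illustrative (it is not part of the plugin); one could compare, say, a JVM `javac -version` against a native-image'd javac binary.

```python
import subprocess
import sys
import time


def cold_start_seconds(cmd, runs=3):
    """Median wall-clock time to run `cmd` to completion from a cold start.
    Each run pays full process startup cost, which is where AOT-compiled
    binaries shine and the JVM pays its JIT-warmup tax."""
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        times.append(time.perf_counter() - t0)
    return sorted(times)[len(times) // 2]


# Example with the current Python interpreter as a stand-in process:
print(f"{cold_start_seconds([sys.executable, '-c', 'pass']):.3f}s")
```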

Show HN: App.build, an open-source AI agent that builds full-stack apps

Show HN: PinSend – Share text between devices using a PIN(P2P, no login)

Hi HN,

I built PinSend (https://pinsend.app), a free web app for instantly sharing text between devices using a simple 6-character PIN.

- No login, no account, no install.
- Peer-to-peer WebRTC transfer (no server relay, no cloud).
- Cross-platform: works in any modern browser.

I built PinSend for myself while developing web apps; I was always copying ngrok links and sending error logs between my laptop and mobile devices. I wanted a frictionless, instant way to move links and text between anything.

Demo:
1. Open https://pinsend.app on your phone and laptop.
2. Paste or type some text and hit "Send", then enter the PIN on the other device.
3. Instant sync!
4. No more emailing or WhatsApping notes to yourself.

Would love feedback!
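PinSend's actual PIN scheme isn't described here, so purely as an illustration of the rendezvous-code idea: a 6-character code drawn from an uppercase-plus-digits alphabet carries about 31 bits of entropy (36^6 combinations), which is reasonable for a short-lived pairing code. Both the charset and the length below are assumptions, not PinSend's implementation.

```python
import secrets
import string

# Assumed charset; the real PinSend alphabet may differ.
ALPHABET = string.ascii_uppercase + string.digits


def make_pin(length: int = 6) -> str:
    """Generate a short rendezvous code with a CSPRNG (~31 bits at 6 chars)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))


print(make_pin())
```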

Show HN: Hacker News historic upvote and score data

Hi y'all!

I've been using Hacker News for a while, but one of the things I started wanting recently was alerts for any stories I post.

The thing that pushed me over the edge was Hack Club's Shipwrecked (https://shipwrecked.hackclub.com/), a hackathon in the Boston bay for anyone who can make four projects over the summer and get at least one of them to go viral. One of the options for "going viral" was to get to the front page of Hacker News, but I was constantly scared that I would miss it getting on there, lol, so I whipped up a quick Slack bot to send alerts to a channel. It was dead simple, but it did work.

Once I had the bot, I realized I could do way more with the data I was collecting, so I decided to add some historical data. I initially thought I would generate graphs and embed them in the message, but I decided to quickly try using Bun.serve to host a dashboard, mainly because I wanted to see how the developer experience was. Spoiler: it is amazing. I've gotten really inspired by web components and the idea of only using universally supported HTML, CSS, and JS. Bun gives an amazingly nice developer experience where you can just import the `index.html`, assign it to your root route, and be done. Sorry for shilling for Bun, but it truly was one of my favorite parts of building this, besides Drizzle.

The dashboard has a graph of points earned and position on the leaderboard over time (updated every 5 minutes), plus the expected stats: peak points, peak position, author, and comment count.

All the code is, of course, open source on both my Tangled repo (https://tangled.sh/@dunkirk.sh/hn-alerts) and a GitHub repo (https://github.com/taciturnaxolotl/hn-alerts), and you can try the hosted version at https://hn.dunkirk.sh. I'm planning to add the ability to install the Slack bot in any workspace, with workspace-specific leaderboards, but that will require a bit of refactoring and probably abandoning the slack-edge package.

You can also view a specific item's data by simply replacing news.ycombinator.com with hn.dunkirk.sh, like: https://hn.dunkirk.sh/item?id=44115853
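The post doesn't say which data source the bot polls, but the official Hacker News Firebase API is the usual one; a sketch of a single polling step (the dashboard samples every 5 minutes) might look like this, using only the Python standard library.

```python
import json
import urllib.request

HN_API = "https://hacker-news.firebaseio.com/v0"


def item_url(item_id: int) -> str:
    """URL for one item (story, comment, ...) in the official HN API."""
    return f"{HN_API}/item/{item_id}.json"


def fetch_item(item_id: int) -> dict:
    """One poll of an item's current state (score, descendants, title, ...).
    A tracker would call this periodically and store the score deltas."""
    with urllib.request.urlopen(item_url(item_id)) as resp:
        return json.load(resp)


print(item_url(44115853))
```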

Show HN: GPT image editing, but for 3D models

Hey HN!

I'm Zach, one of the co-founders of Adam (https://www.adamcad.com). We're building AI-powered tools for CAD and 3D modeling [1].

We've recently been exploring a new way to bring GPT-style image editing directly into 3D model generation, and we're excited to showcase it in our web app today. We're calling it creative mode, and we're intrigued by the fun use cases it could enable by making 3D generation more conversational!

For example, you can enter a prompt such as "an elephant", then follow it up with "have it ride a skateboard", and it preserves the context and identity of, and maintains consistency with, the previous model. We believe this lends itself better to an iterative design process when prototyping creative 3D assets or models for printing.

We're offering everyone 10 free generations to start (ramping up soon!). Here's a short video explaining how it works: https://www.loom.com/share/cf9ab91375374a4f93d6cc89619a043b

We'd also love you to try our parametric mode (free!), which uses LLMs to create a conversational interface for solid modeling, as touched on in a recent HN thread [2]. We are leveraging the code generation capabilities of these models to generate OpenSCAD code (an open-source, script-based CAD), and we surface the variables as sliders the user can adjust to tweak their design. We hope this gives a glimpse of what it could be like to "vibe-CAD". We will soon be releasing our results on Will Patrick's text-to-CAD eval [3] and adding B-rep-compatible export!

We'd love to hear what you think and where we should take this next :)

[1] https://x.com/zachdive/status/1882858765613228287
[2] https://news.ycombinator.com/item?id=43774990
[3] https://willpatrick.xyz/technology/2025/04/23/teaching-llms-how-to-solid-model.html
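One way to surface OpenSCAD variables as sliders (Adam's actual mechanism isn't described; this regex-based sketch and the sample model are assumptions for illustration) is to pull top-level numeric assignments out of the generated source:

```python
import re

# A hypothetical snippet of LLM-generated OpenSCAD source.
SCAD = """
radius = 12;   // wheel radius, mm
height = 30;
cylinder(h=height, r=radius);
"""


def scad_variables(src: str) -> dict:
    """Extract top-level `name = number;` assignments from OpenSCAD source,
    i.e. the kind of variables a UI could expose as sliders."""
    return {
        m.group(1): float(m.group(2))
        for m in re.finditer(r"^\s*(\w+)\s*=\s*(-?\d+(?:\.\d+)?)\s*;", src, re.M)
    }


print(scad_variables(SCAD))
```

When a slider moves, the UI would rewrite the matching assignment and re-render the model, so the geometry stays fully parametric.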
