The best Show HN stories from Hacker News from the past day
Latest posts:
Show HN: Firm, a text-based work management system
Show HN: Scriber Pro – Offline AI transcription for macOS
Hey HN! I built this because I was tired of waiting hours for transcription services and didn't want to upload sensitive recordings to the cloud.

Real metrics from my M1 Max: a 4.5-hour video file transcribed in 3 minutes 32 seconds. Works completely offline.

The first 5 HN users who click the button on the page get it free: the promo code goes straight to the App Store.

Key differences vs. Rev/Otter:
- No 2-hour file limits (handles any length)
- Timecodes stay accurate on long files (no drift from chunking)
- Supports MP3, WAV, MP4, MOV, M4A, FLAC
- Exports to SRT, VTT, JSON, PDF, DOCX, CSV, Markdown

Built for macOS. Happy to answer questions!
Show HN: Halloy – Modern IRC client
I started working on Halloy back in 2022, with the goal of giving something back to the community I've been a part of for the past two decades. I wanted to create a modern, multi-platform IRC client written in Rust.

Three years later, I've made new friends who have become core contributors, and there are now over 200 people idling in our #halloy channel on Libera.

My hope is that this client will outlive me and that IRC will live on.
Show HN: Metorial (YC F25) – Vercel for MCP
Hey HN! We're Wen and Tobias, and we're building Metorial (https://metorial.com), an integration platform that connects AI agents to external tools and data using MCP.

The problem: while MCP works great locally (e.g., in Cursor or Claude Desktop), server-side deployments are painful. Running MCP servers means managing Docker configs, per-user OAuth flows, scaling concurrent sessions, and building observability from scratch. This infrastructure work turns simple integrations into weeks of setup.

Metorial handles all of this automatically. We maintain an open catalog of ~600 MCP servers (GitHub, Slack, Google Drive, Salesforce, databases, etc.) that you can deploy in three clicks. You can also bring your own MCP server or fork existing ones.

For OAuth, just provide your client ID and secret and we handle the entire flow, including token refresh. Each user then gets an isolated MCP server instance configured with their own OAuth credentials automatically.

What makes us different is that our serverless runtime hibernates idle MCP servers and resumes them with sub-second cold starts while preserving state and connections. Our custom MCP engine can manage thousands of concurrent connections, giving you a scalable service with per-user isolation. Other alternatives either run shared servers (security issues) or provision separate VMs per user (expensive and slow to scale).

Our Python and TypeScript SDKs let you connect LLMs to MCP tools in a single function call, abstracting away the protocol complexity. But if you want to dig deeper, you can use standard MCP and our REST API (https://metorial.com/api) to connect to our platform.

You can self-host (https://github.com/metorial/metorial) or use the managed version at https://metorial.com.

So far, we've seen enterprise teams use Metorial as a central integration hub for tools like Salesforce, while startups use it to cut weeks of infra work when building AI agents with integrations.

Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8

Our repos: Metorial: https://github.com/metorial/metorial, MCP Containers: https://github.com/metorial/mcp-containers

SDKs: Node/TypeScript: https://github.com/metorial/metorial-node, Python: https://github.com/metorial/metorial-python

We'd love to hear feedback, especially if you've dealt with deploying MCP at scale!
Show HN: CSS Extras
Show HN: I extracted BASIC listings from Tim Hartnell's 1986 book
Tim Hartnell was one of the most prolific authors during the early days of the home-computing boom, writing many popular books covering game genres on different platforms and, in this case, artificial intelligence.

I've extracted the BASIC program listings from Hartnell's 1986 book 'Exploring Artificial Intelligence on Your IBM PC' and organized them along with a PC-BASIC runtime environment and instructions so you can try these programs out yourself.

Even though the AI landscape has changed enormously since Hartnell first wrote this book, I hope one or two of you will get some value out of these program listings if you're interested in exploring the fundamentals of AI on home-computing platforms as they were in the 1980s.

Tim Hartnell unfortunately passed away in 1991 at the young age of 40, and without his writing I imagine more than a few of us would not have found the start in computing we did. Thanks, Tim.