The best Show HN stories from Hacker News from the past day

Latest posts:

Show HN: Open-Source Webhooks Gateway for Platform Engineers

Hey Friends,

Convoy is an open-source webhooks gateway. Webhooks are still hard at scale, and large teams need consistent tooling for inbound and outbound webhooks. Convoy lets developers securely send, receive, and manage millions of webhooks reliably, with features like retries, rate limiting, circuit breaking, customer-facing webhook logs, zero-downtime secrets rotation, and more.

Since our initial launch [0], we've learned a lot about our users and made several important improvements we're excited to share:

- We are now a webhooks gateway! Just as API gateways sit at the edge of your network to receive all API traffic and route it to the appropriate microservice, a webhooks gateway sits at the edge of your network to receive webhooks from any backend service and route them to client endpoints, as well as to ingest events from multiple providers and route them to the required backend services.

- We now have first-class integration with pub/sub systems. Our users wanted stronger delivery guarantees: your backend services write events to a queue or topic, and Convoy consumes the queue, creates webhook events, and dispatches them reliably to client endpoints. We currently support Amazon SQS and Google Pub/Sub; Kafka, RabbitMQ, and NATS are on the roadmap (in that order).

- We switched our backend store to PostgreSQL. This improves the self-hosted experience tremendously. MongoDB was great for storing schemaless data, but it was severely lacking in some features our users needed: transactions don't work on a single node (you have to bootstrap a replica set), and the lack of migrations for schema and data changes slowed down how frequently we could ship updates.

- We decided to go the open-core route of OSS monetisation because it offered us a good balance between serving our community and making enough money to run the company. Like GitLab, we hope to be good stewards of our community edition. Since our enterprise edition is simply a wrapper around the community edition with enterprise features like RBAC and audit logs, we are properly incentivised to keep making it excellent.

- Our cloud platform is in the private alpha stage. Please contact us at founders@getconvoy.io to gain access!

Our mission is to serve everyone from hobbyist developers to the most ambitious teams on the planet with a consistent, easy-to-use webhooks gateway for asynchronous communication on the internet.

We welcome you to try it out using our getting-started guide at https://github.com/frain-dev/convoy#installation-getting-started. Share your webhook horror stories with us and give us feedback.

[0]: https://news.ycombinator.com/item?id=30469078
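To make the zero-downtime secrets rotation idea concrete, here is a minimal sketch (not Convoy's actual implementation; the function names are illustrative) of HMAC webhook signing where the receiver accepts any currently-active secret, so the old secret keeps working while consumers move to the new one:

```python
import hashlib
import hmac

def sign(secret: str, payload: bytes) -> str:
    """Hex-encoded HMAC-SHA256 signature of a webhook payload."""
    return hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str, active_secrets: list[str]) -> bool:
    """Accept a payload signed with ANY currently-active secret.

    Keeping the old secret in the active set during a rotation window
    means in-flight webhooks signed with it still verify: no downtime.
    compare_digest avoids leaking timing information.
    """
    return any(
        hmac.compare_digest(sign(s, payload), signature)
        for s in active_secrets
    )

body = b'{"event": "invoice.paid"}'
sig = sign("old-secret", body)
assert verify(body, sig, ["new-secret", "old-secret"])  # rotation window
assert not verify(body, sig, ["new-secret"])            # old secret retired
```

Once every consumer has picked up the new secret, the old one is simply dropped from the active set.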

Show HN: DigicamFinder – open-sourced DPReview camera data

Ever since the DPReview closure announcement (https://news.ycombinator.com/item?id=35248296), we have been thinking about how to preserve 25 years of valuable DPReview camera data. Archive.org has been great, but it's not usable by the general public.

The best way to keep the data safe going forward is for the community to own it, so we open-sourced it: https://github.com/open-product-data/digital-cameras

I'm aware of a number of attempts to open-source product data, but none have the power of the photo geeks behind them :)

Thoughts or ideas? Also really looking for some contribution love.

Show HN: Random Aerial Airport Views

Hi HN!

Sharing Random Airport, inspired by RandomStreetView (which I find weirdly addictive) and a passion for air travel.

Probably not for everyone, but I hope some of you find it interesting! Needless to say, open to feedback. Enjoy clicking!

Further reading:

TECH: It's built in React and NodeJS, with a Notion DB. The code is public on GitHub. It is spaghetti, though... especially open to feedback here.

DB: The DB is publicly available (and editable); I can add the link in the comments if anyone would like to have a look.

KNOWN ISSUES: I would like to improve the design, picture loading performance, and the quality of (some) pictures.

Show HN: Multi-display screen sharing with CoScreen

Good to be back on HN with the all-new CoScreen, a little more than 3 years after it launched over here!

With CoScreen 5.0, you can now share your windows from multiple displays at the same time, a long-standing request from our most avid users and impossible in other apps. It also has a lightning-fast, Rust-based window compositing, scaling, and streaming engine now.

CoScreen was always meant to be different: you and your team can share your screens simultaneously and multi-directionally, and you control what is being shared. We saw it as a natural extension, closely coupled with your OS: instant, fast, and seamless. A better way to pair program, debug tough incidents, or jam on great ideas by sharing multi-modal information like code, commands, graphs, or logs.

All of that made a lot of sense conceptually, but to be frank, it was hard to get right. Now part of Datadog and with major parts of our app rewritten in Rust, we feel we're closer than ever.

Here's what pair programmers liked about CoScreen, so we made it even better:

- High-definition code sharing: windows are video-streamed in real time at their native resolution whenever possible. You never have to search for your IDE anymore or worry about sharing the wrong window.
- Multi-directional collaboration: you can share, while Alice shares, while Bob shares. Side by side, across multiple displays. With built-in crisp audio and video chat.
- 60+ FPS super-smooth mouse pointers. Type, click, and draw on any shared window as if it were your own.

What some of you did NOT like, so we fixed it in CoScreen v5:

- CPU utilization and latency have been reduced drastically, as various parts of our desktop client are now implemented in Rust, building on crates such as cxx, rust-skia, and iced, as well as Neon for our native remote-control plugins.
- No more accidental clicking into remote windows, thanks to the new remote window toggles.
- You're no longer bound by your displays: you can share windows from several of them at the same time and even move them across displays while sharing, without stopping.
- You'll also soon be able to join meetings from your browser on any platform.

CoScreen runs on macOS (x64 and Apple Silicon) and Windows, soon also on the web, and is currently free. We're planning to charge for larger teams and enterprise features in the future. Hopefully, finally, we'll also have a Linux version one day. Tell us if you need it urgently, and if you have any other requirements!

Show HN: Open-source ETL framework to sync data from SaaS tools to vector stores

Hey Hacker News, we launched a few weeks ago as a GPT-powered chatbot for developer docs, and quickly realized that the value of what we're doing isn't the chatbot itself. Rather, it's the time we save developers by automating the extraction of data from their SaaS tools (GitHub, Zendesk, Salesforce, etc.) and helping transform it into contextually relevant chunks that fit into GPT's context window.

A lot of companies are building prototypes with GPT right now, and they're all using some combination of LangChain/LlamaIndex + Weaviate/Pinecone + GPT-3.5/GPT-4 as their stack for retrieval-augmented generation (RAG). This works great for prototypes, but what we learned is that as you scale your RAG app to more users and ingest more sources of content, it becomes a real pain to manage your data pipelines.

For example, if you want to ingest your developer docs, process them into chunks of under 500 tokens, and add those chunks to a vector store, you can build a prototype with LangChain fairly quickly. However, if you want to deploy it to customers, as we did for BentoML (https://www.bentoml.com/), you'll quickly realize that a naive chunking method that splits by character/token leads to poor results, and that "delete and re-vectorize everything" when the source docs change doesn't scale as a data synchronization strategy.

We took the code we used to build chatbots for our early customers and turned it into an open-source framework for rapidly building new data Connectors and Chunkers. This way, developers can use community-built Connectors and Chunkers to start running vector searches on data from any source in a matter of minutes, or write their own in a matter of hours.

Here's a video demo: https://youtu.be/I2V3Cu8L6wk

The repo has instructions on how to get started and set up API endpoints to load, chunk, and vectorize data quickly. Right now it only works with websites and GitHub repos, but we'll be adding Zendesk, Google Drive, and Confluence integrations soon.
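To illustrate why naive fixed-size splitting hurts, here is a minimal structure-aware chunker sketch. It is not this framework's actual Chunker interface, and the ~4-characters-per-token estimate is a rough assumption; the point is that packing whole paragraphs under a token budget keeps chunks semantically coherent instead of cutting mid-sentence:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English prose.
    return max(1, len(text) // 4)

def chunk_by_paragraph(text: str, max_tokens: int = 500) -> list[str]:
    """Pack whole paragraphs into chunks that stay under max_tokens.

    Unlike a fixed-size character split, this never cuts a paragraph
    in half, so each chunk remains a coherent unit of meaning.
    """
    chunks: list[str] = []
    current: list[str] = []
    current_tokens = 0
    for para in text.split("\n\n"):
        tokens = estimate_tokens(para)
        if current and current_tokens + tokens > max_tokens:
            chunks.append("\n\n".join(current))
            current, current_tokens = [], 0
        current.append(para)
        current_tokens += tokens
    if current:
        chunks.append("\n\n".join(current))
    return chunks

docs = "\n\n".join(f"Paragraph {i}: " + "lorem ipsum " * 40 for i in range(10))
chunks = chunk_by_paragraph(docs, max_tokens=500)
```

A real Chunker would also need to handle single paragraphs that exceed the budget (falling back to sentence-level splits) and to track chunk identity so that only changed chunks are re-vectorized when the source docs update.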

Show HN: Walkie-talkie for teams

Hey team, I'm Arjun! Builder at flowy.live.

So, it's quite simple: a walkie-talkie with a way to play back the last 24 hours. A new medium of communication for active teams.

We have released for Mac and Windows.

Any feedback or thoughts would mean the world to us!

Open to a feedback session? https://calendly.com/arjun-flowy/onboarding

Appreciate all of you,
Arjun Patel
arjun@flowy.live

Show HN: Marvin – build AI functions that use an LLM as a runtime

Hey HN! We're excited to share our new open-source project, Marvin. Marvin is a high-level library for building AI-powered software. We developed it to address the challenges of integrating LLMs into more traditional applications. One of the biggest issues is the fact that LLMs only deal with strings (and conversational strings at that), so using them to process structured data is especially difficult.

Marvin introduces a new concept called AI Functions. These look and feel just like regular Python functions: you provide typed inputs, outputs, and docstrings. However, instead of relying on traditional source code, AI functions use LLMs like GPT-4 as a sort of "runtime" to generate outputs on demand, based on the provided inputs and other details. The results are then parsed and converted back into native data types.

This "functional prompt engineering" means you can seamlessly integrate AI functions with your existing codebase. You can chain them together with other functions to form sophisticated, AI-enabled pipelines. They're particularly useful for tasks that are simple to describe yet challenging to code, such as entity extraction, semantic scraping, complex filtering, template-based data generation, and categorization. For example, you could extract terms from a contract as JSON, scrape websites for quotes that support an idea, or build a list of questions from a customer support request. All of these would yield structured data that you could immediately start to process.

We initially created Marvin to tackle broad internal use cases in customer service and knowledge synthesis. AI Functions are just a piece of that, but have proven to be even more effective than we anticipated, and have quickly become one of our favorite features! We're eager for you to try them out for yourself.

We'd love to hear your thoughts, feedback, and any creative ways you could use Marvin in your own projects. Let's discuss in the comments!
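The "LLM as runtime" idea can be sketched in a few lines. This is an illustration of the concept, not Marvin's actual API: `ai_fn` and the injected `llm` callable are hypothetical, and a real deployment would pass a function that calls a model like GPT-4:

```python
import inspect
import json
from typing import Callable

def ai_fn(llm: Callable[[str], str]):
    """Turn a typed, documented Python stub into an LLM-backed function.

    The decorated function's name, signature, and docstring become the
    prompt; the completion is parsed back into native types via JSON.
    """
    def decorator(fn):
        sig = inspect.signature(fn)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            prompt = (
                "You are the runtime for this Python function.\n"
                f"def {fn.__name__}{sig}:\n"
                f'    """{fn.__doc__}"""\n'
                f"Arguments: {json.dumps(dict(bound.arguments))}\n"
                "Respond with ONLY the JSON-encoded return value."
            )
            return json.loads(llm(prompt))
        return wrapper
    return decorator

# Stand-in "model" so the example runs offline.
def fake_llm(prompt: str) -> str:
    return '["Alice", "Bob"]'

@ai_fn(fake_llm)
def extract_names(text: str) -> list[str]:
    """Extract all person names mentioned in `text`."""

print(extract_names("Alice met Bob for coffee."))  # -> ['Alice', 'Bob']
```

Because the return value round-trips through JSON, the output is ordinary structured data that the rest of the pipeline can consume directly.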

Show HN: RoboPianist, a piano playing robot simulation in the browser

Show HN: YakGPT – A locally running, hands-free ChatGPT UI

Greetings!

YakGPT is a simple, frontend-only ChatGPT UI you can use to either chat normally or, more excitingly, use your mic + OpenAI's Whisper API to chat hands-free.

Some features:

* A few fun characters pre-installed
* No tracking or analytics; OpenAI is the only thing it calls out to
* Optimized for mobile use via hands-free mode and cross-platform compressed audio recording
* Your API key and chat history are stored in browser local storage only
* Open-source; you can either use the deployed version at Vercel or run it locally

Planned features:

* Integrate Eleven Labs and other TTS services to enable full hands-free conversation
* Implement LangChain and/or plugins
* Integrate more ASR services that allow for streaming

Source code: https://github.com/yakGPT/yakGPT

I'd love for you to try it out and hear your feedback!
