The best Show HN stories from Hacker News from the past week

Latest posts:

Show HN: Tabby – A self-hosted GitHub Copilot

I would like to introduce Tabby, a self-hosted alternative to GitHub Copilot that you can run on your own hardware. While GitHub Copilot has made coding more efficient and less time-consuming by suggesting and completing code for developers, it raises concerns around privacy and security.

Tabby is in its early stages, and we are excited to receive feedback from the community.

Its GitHub repository is located here: https://github.com/TabbyML/tabby

We have also deployed the latest Docker image to Hugging Face for a live demo: https://huggingface.co/spaces/TabbyML/tabby

Tabby is built on top of the popular Hugging Face Transformers / Triton FasterTransformer backend and is designed to be self-hosted, giving you complete control over your data and privacy. In Tabby's next feature iteration, you will be able to fine-tune the model to meet your project's requirements.
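
As a rough sketch of what self-hosting looks like from the editor's side, here is a hypothetical Python snippet that queries a completion server running on your own machine. The port, endpoint path, and payload shape are illustrative assumptions, not Tabby's documented API; check the repository's README for the real interface.

```python
import requests

# Hypothetical request to a locally hosted code-completion server.
# The port, path, and payload below are illustrative assumptions,
# not Tabby's documented API.
response = requests.post(
    "http://localhost:8080/v1/completions",
    json={"language": "python", "prompt": "def parse_config(path):"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```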

Show HN: Want something better than k-means? Try BanditPAM

Want something better than k-means? I'm happy to announce our SOTA k-medoids algorithm from NeurIPS 2020, BanditPAM, is now publicly available! `pip install banditpam` or `install.packages("banditpam")` and you're good to go!

k-means is one of the most widely-used algorithms to cluster data. However, it has several limitations: a) it requires the use of L2 distance for efficient clustering, which also b) restricts the data you're clustering to be vectors, and c) doesn't require the means to be datapoints in the dataset.

Unlike in k-means, the k-medoids problem requires cluster centers to be actual datapoints, which permits greater interpretability of your cluster centers. k-medoids also works better with arbitrary distance metrics, so your clustering can be more robust to outliers if you're using metrics like L1. Despite these advantages, most people don't use k-medoids because prior algorithms were too slow.

In our NeurIPS 2020 paper, BanditPAM, we sped up the best known algorithm from O(n^2) to O(n log n) by using techniques from multi-armed bandits. We were inspired by prior research that demonstrated many algorithms can be sped up by sampling the data intelligently, instead of performing exhaustive computations.

We've released our implementation, which is pip- and CRAN-installable. It's written in C++ for speed, but callable from Python and R. It also supports parallelization and intelligent caching at no extra complexity to end users. Its interface also matches the sklearn.cluster.KMeans interface, so minimal changes are necessary to existing code.

PyPI: https://pypi.org/project/banditpam

CRAN: https://cran.r-project.org/web/packages/banditpam/index.html

Repo: https://github.com/motiwari/BanditPAM

Paper: https://arxiv.org/abs/2006.06856

If you find our work valuable, please consider starring the repo or citing our work. These help us continue development on this project.

I'm Mo Tiwari (motiwari.com), a PhD student in Computer Science at Stanford University. A special thanks to my collaborators on this project, Martin Jinye Zhang, James Mayclin, Sebastian Thrun, Chris Piech, and Ilan Shomorony, as well as the author of the R package, Balasubramanian Narasimhan.

(This is my first time posting on HN; I've read the FAQ before posting, but please let me know if I broke any rules)
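
As a minimal sketch of the sklearn-style usage described above: class, argument, and attribute names (KMedoids, n_medoids, medoids, labels, and the metric string passed to fit) are assumptions based on the post's description, so consult the repo's README for the authoritative API.

```python
import numpy as np
from banditpam import KMedoids  # pip install banditpam

# Toy data: two Gaussian blobs in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(100, 2)),
    rng.normal(loc=5.0, scale=1.0, size=(100, 2)),
])

# Fit k-medoids with the BanditPAM algorithm; "L2" selects Euclidean distance,
# but other metrics such as "L1" can make the clustering more outlier-robust.
kmed = KMedoids(n_medoids=2, algorithm="BanditPAM")
kmed.fit(X, "L2")

# Unlike k-means centroids, the medoids are actual rows of X
# (attribute names here are assumed from the sklearn-like interface).
print("Medoid indices:", kmed.medoids)
print("Labels of first 5 points:", kmed.labels[:5])
```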

Show HN: Live coaching app for remote SWE interviews, uses Whisper and GPT-4

Posting from a throwaway account to maintain privacy.

This project is a salvo against leetcode-style interviews that require candidates to study useless topics and confidently write code in front of a live audience, in order to get a job where none of that stuff matters.

Cheetah is an AI-powered macOS app designed to assist users during remote software engineering interviews by providing real-time, discreet coaching and integration with CoderPad. It uses Whisper for audio transcription and GPT-4 to generate hints/answers. The UI is intentionally minimal to allow for discreet use during a video call.

It was fun dipping into the world of LLMs, prompt chaining, etc. I didn't find a Swift wrapper for whisper.cpp, so in the repo there's also a barebones Swift framework that wraps whisper.cpp and is designed for real-time transcription on M1/M2.

I'll be around if anyone has questions or comments!
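
For readers curious how a transcribe-then-hint pipeline fits together conceptually, here is a hypothetical Python sketch using OpenAI's hosted APIs (the 0.x-era openai package). Cheetah itself is a Swift app that runs whisper.cpp locally, so this swaps in the hosted Whisper API for brevity and is not the app's actual code.

```python
import openai  # pip install openai (0.x-era API shown; newer clients differ)

openai.api_key = "sk-..."  # your own key

# 1) Transcribe the interviewer's audio with Whisper.
with open("interviewer_question.wav", "rb") as audio_file:
    transcript = openai.Audio.transcribe("whisper-1", audio_file)

# 2) Ask GPT-4 for a concise hint based on the transcribed question.
completion = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "system",
         "content": "You are a discreet interview coach. Give short, "
                    "actionable hints, not full solutions."},
        {"role": "user", "content": transcript["text"]},
    ],
)
print(completion.choices[0].message["content"])
```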

Show HN: Ermine.ai – Record and transcribe speech, 100% client-side (WASM)

Show HN: We are building an open-source IDE powered by AI

Show HN: Hocus – self-hosted alternative to GitHub Codespaces using Firecracker

Show HN: Unknown Pleasures, a tiny web experiment with WebGL

Show HN: Coursemate – connect with other self learners

Hey Hacker News!

My name is Collin, I'm 18 years old, and I'm doing a gap year after finishing high school last year.

This was my first real project after starting to learn web development around 5 months ago.

I came up with this idea because it was a real pain for me to find other people from my country, and especially my age, who were learning and taking online courses about the same stuff. Lots of these online courses include their own Discord communities and forums, but I still found it very hard to connect with other people there.

That's why I built Coursemate.

I would love to get your feedback on it! :)

Show HN: Multi-display screen sharing with CoScreen

Good to be back on HN with the all-new CoScreen, a little more than 3 years after it launched over here!

With CoScreen 5.0, you can now share your windows from multiple displays at the same time, a long-standing request by our most avid users and impossible in other apps. It also has a lightning-fast, Rust-based window compositing, scaling, and streaming engine now.

CoScreen was always meant to be different, so that you and your team can share your screens simultaneously and multi-directionally, and control what is being shared. We saw it as a natural extension of and closely coupled with your OS — instant, fast, and seamless. A better way to pair program, debug tough incidents, or jam on great ideas by sharing multi-modal information like code, commands, graphs, or logs.

All that made a lot of sense conceptually but, to be frank, it was hard to get it right. Now a part of Datadog and with major parts of our app rewritten in Rust, we feel we’re closer than ever.

Here’s what pair programmers liked about CoScreen, so we made it even better:

- High-definition code sharing: windows are video-streamed in real time at their native resolution whenever possible. You never have to search for your IDE anymore or be anxious about sharing the wrong window.
- Multi-directional collaboration: you can share, while Alice shares, while Bob shares. Side by side, across multiple displays. With built-in crisp audio and video chat.
- 60+ FPS super-smooth mouse pointers. Type, click, and draw on any shared window as if it were your own.

What some of you did NOT like, so we fixed it in CoScreen V5:

- CPU utilization and latency have been reduced drastically, as various parts of our desktop client are now implemented in Rust, building on crates such as cxx, rust-skia, and iced, as well as Neon for our native remote-control plugins.
- No more accidental clicking into remote windows, thanks to the new remote window toggles.
- You’re no longer bound by your displays: you can share windows from multiple of them at the same time and even move them across displays while sharing, without stopping.
- You’ll also soon be able to join meetings from your browser on any platform.

CoScreen runs on macOS (x64 and Apple Silicon) and Windows, soon also on the web, and is currently free. We’re planning to charge for larger teams and enterprise features in the future. Hopefully - finally - we’ll also have a Linux version one day. Tell us if you need it urgently and if you have any other requirements!

Show HN: YakGPT – A locally running, hands-free ChatGPT UI

Greetings!

YakGPT is a simple, frontend-only ChatGPT UI you can use to either chat normally or, more excitingly, use your mic + OpenAI's Whisper API to chat hands-free.

Some features:

* A few fun characters pre-installed
* No tracking or analytics; OpenAI is the only thing it calls out to
* Optimized for mobile use via hands-free mode and cross-platform compressed audio recording
* Your API key and chat history are stored in browser local storage only
* Open-source; you can either use the deployed version at Vercel or run it locally

Planned features:

* Integrate Eleven Labs & other TTS services to enable full hands-free conversation
* Implement LangChain and/or plugins
* Integrate more ASR services that allow for streaming

Source code: https://github.com/yakGPT/yakGPT

I’d love for you to try it out and hear your feedback!

Show HN: Gut – An easy-to-use CLI for Git

Hi Hacker News!

I’m Julien and I built an alternative CLI for Git: gut.

Even though I haven’t been coding for very long (I’m in my first year studying computer science), I’ve always found git to be frustrating. The command naming is inconsistent, and git lets you easily shoot yourself in the foot.

I made gut, another git porcelain, to solve these issues.

It provides consistent command naming. To do so, its syntax is based on subcommands. For example, to delete a branch, run gut branch rm rather than git branch -d; the same goes for deleting a remote (gut remote rm), and so on.

Gut also prevents you from shooting yourself in the foot. It provides nice defaults and always prompts you before doing something destructive. It also won’t allow you to rewrite history that has already been pushed to the remote. Creating commits in detached HEAD is also prohibited.

Finally, git was made when GitHub and others didn’t exist yet. To diff commits, gut opens the compare view in the browser. And to merge a branch, gut opens a pull request.

I have been working on this project for the past few months and I am happy to be able to share it.

I hope you’ll like it. Any suggestions are welcome!

Repo: https://github.com/julien040/gut

Show HN: StratusGFX, my open-source real-time 3D rendering engine

It's been closed source for a long time while I worked on it on and off as a hobby research project, but yesterday the repo was made public for the first time under the MPL 2.0 license.

A feature reel showing its capabilities can be found here: https://ktstephano.github.io/rendering/stratusgfx/feature_reel

A technical breakdown of a single frame can be found here: https://ktstephano.github.io/rendering/stratusgfx/frame_analysis

It's still in a very beta state (bugs and instability expected), but I felt like it was a good time to make it public since a lot of its core features are mostly presentable. I plan to continue working on it in my spare time to try and improve the usability of the code.

Two main use cases I could see for it:

1) People using it for educational purposes.

2) People integrating it into other more general-purpose engines that they're working on, since Stratus is primarily a rendering engine. Any extensions to the rendering code that are made public would then further help others.

So I think it will remain very niche, but I'm hoping it will still be helpful for people in the future.

Show HN: Customizable, embeddable ChatGPT based on your own documents

Hi Hacker News!

My name is Bea. I built a site called Libraria that uses GPT to do a few things:

1. Let you spin up multiple assistants based on your own documents. You can make them public, private, or protected. Each has its own subdomain and landing page.
2. Respond in full markdown always, so it can output images, links, code, and more.
3. Let you upload articles on the fly within the chat, so you can ask it questions.
4. Make it embeddable in your site with one line of code.
5. Let you update it for fun / with your branding.
6. Enable syncing for any URLs you let us scrape, so that you can make sure it's always up to date.
7. Let you upload multiple file types.

I've been working on this for about a month now by myself, and you can keep track of my feature updates here: https://libraria.dev/feature-updates

I would LOVE your feedback on anything, and if you're willing to try it out, I'm looking for a few beta users who can provide more continuous feedback, for whom I would gladly waive the fee!
