The best Show HN stories from Hacker News from the past day

Latest posts:

Show HN: Electrico – Electron Without Node and Chrome

Show HN: Open Scanner, an open-source document scanning app for iPhone

Show HN: Void, an open-source Cursor/GitHub Copilot alternative

Hey HN, I'm Andrew, one of the creators of Void. I made this open source version of Cursor where you can get all of Cursor's core features but in a fully-customizable IDE (ctrl+k, ctrl+L). We love Cursor but there are so many other features we want to build, like allowing AI to edit multiple files at once, or giving AI better understanding of your file system. Void is the open-source, fully customizable tool we've been wanting.

The hard part: we're building Void as a fork of vscode. The repo has great documentation for extensions, but going deeper gets pretty involved. All of the code is OOP-based, and they mount DOM nodes the old-school way (which is what React was supposed to solve...). So adding new UI features isn't exactly trivial. Microsoft also made its extension marketplace closed-source, so we (and Cursor) have to hack our way through it. One thing we're excited about is refactoring and creating docs so that it's much easier for anyone to contribute.

The other benefit of open source is that we don't need to hide how our prompts are built, so we can transfer the private API logic that Cursor has right onto your local machine. This lets you host a model on-prem and have your data stay completely private. It also means you can go directly to LLM providers (OpenAI, Anthropic) instead of going through us as a middleman.

There's still a lot to build, and full disclosure, we are very early stage. But we're super excited about building and have a working prototype that we're quickly adding features to.

Let us know if there's anything you want to see in a Cursor-style editor. Or feel free to shoot us a pull request. Cheers!

Show HN: TensorZero – open-source data and learning flywheel for LLMs

Hi HN!

We're Gabriel & Viraj, and we're excited to open source TensorZero.

To be a little cheeky, TensorZero is an open-source platform that helps LLM applications graduate from API wrappers into defensible AI products.

1. Integrate our model gateway
2. Send metrics or feedback
3. Unlock compounding improvements in quality, cost, and latency

It enables a data & learning flywheel for LLMs by unifying:

• Inference: one API for all LLMs, with <1ms P99 overhead
• Observability: inference & feedback → your database
• Optimization: better prompts, models, inference strategies
• Experimentation: built-in A/B testing, routing, fallbacks

Our goal is to help engineers build, manage, and optimize the next generation of LLM applications: AI systems that learn from real-world experience.

In addition to a Quick Start (5min) [1] and a Tutorial (30min) [2], we've also published a series of complete runnable examples illustrating TensorZero's data & learning flywheel.

• Writing Haikus to Satisfy a Judge with Hidden Preferences [3] – my personal favorite
• Fine-Tuning TensorZero JSON Functions for Named Entity Recognition (CoNLL++) [4]
• Automated Prompt Engineering for Math Reasoning (GSM8K) with a Custom Recipe (DSPy) [5]

[1] https://www.tensorzero.com/docs/gateway/quickstart
[2] https://www.tensorzero.com/docs/gateway/tutorial
[3] https://github.com/tensorzero/tensorzero/tree/main/examples/haiku-hidden-preferences
[4] https://github.com/tensorzero/tensorzero/tree/main/examples/ner-fine-tuning-json-functions
[5] https://github.com/tensorzero/tensorzero/tree/main/examples/gsm8k-custom-recipe-dspy

We hope you find TensorZero useful! Feedback and questions are very welcome. If you're interested in using it at work, we'd be happy to set up a Slack channel with your team (free).
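
A hedged sketch of what the three-step flow above could look like in code: the endpoint paths, payload fields, and names below are illustrative assumptions rather than the documented gateway API, so treat the Quick Start [1] as the source of truth.

    // Hypothetical sketch of the flywheel flow (assumed endpoints and fields).
    // 1. Route an inference call through a locally running gateway...
    const inference = await fetch('http://localhost:3000/inference', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        function_name: 'draft_email', // hypothetical function name
        input: { messages: [{ role: 'user', content: 'Write a follow-up email.' }] },
      }),
    }).then((r) => r.json());

    // 2. ...then send feedback for that inference so the gateway has
    // metrics to learn from and optimize against.
    await fetch('http://localhost:3000/feedback', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        metric_name: 'email_accepted',        // hypothetical metric name
        inference_id: inference.inference_id, // assumed response field
        value: true,
      }),
    });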

Show HN: Xnapper Studio – Web tool for creating attention-grabbing screenshots

Xnapper Studio aims to make it fast and easy to create eye-catching images and mockups, especially for people sharing their product images on social media and blogs.

Compared to our macOS app, we've doubled down on making the screenshots look better by adding device mockups (i.e. iPhone, MacBook, Safari, etc.), more social media dimensions, and image positioning.

Currently we're working on adding an option to share the image with a link (cloud storage and share).

Would love to get your feedback on our beta!

Show HN: Universal Logger for Node, Deno, Bun, Browser

Hey everyone, I posted about my project https://adzejs.com a couple of years ago and it was met with a lot of interest, so I'm writing about the major v2 update that's just been released to see if anyone is interested.

What makes Adze interesting compared to other logging libraries like pino, bunyan, winston, etc.?

Adze is universal. This means that Adze will "just work" in all of your environments. This is especially handy when working with SSR projects like SvelteKit, Nuxt, Next, etc. You can also use Adze with Bun or Deno without any special adaptations or considerations.

Adze 2.x is also smaller (13.29kb minified and brotlied) and faster than the original. Benchmarks put it at generating 100,000 logs in ~700ms.

Version 2 also offers a cleaner API than version 1, as it no longer uses factories and instead uses static class methods.

    import adze from 'adze';

    // Generating a log
    adze.timestamp.ns('foo').log('A log with a timestamp and namespace.');

    // Making a child logger
    const logger = adze.timestamp.ns('foo').seal();
    logger.log('A log with a timestamp and namespace.');

Adze 2.x comes with support for four different types of log formats out of the box:

- a human-readable pretty format
- a machine-readable JSON format that is compatible with the Bunyan CLI
- a format for common logs
- a format for simple stdout logging

Adze 2.x also offers better extensibility support. You can now create custom formatters and custom middleware for modifying log behavior or transporting logs to another source (like a file, etc.). Log listeners are also still supported.

Changing formats is easy.

    import adze, { setup } from 'adze';

    setup({
      format: 'json', // <- Change with an env var
    });

    adze.withEmoji.success('This is a pretty log!');

Adze 2.x also includes a handy new template literal logging feature for times when you are repeating logs frequently with slightly different messages (like error messages in a catch). Adze offers a new sealTag terminator that seals your configuration into a template literal tag function to further simplify your logging. Example:

    import adze from 'adze';

    // Let's create a reusable ERR tag with emojis, timestamps, and the "my-module" namespace.
    const ERR = adze.withEmoji.timestamp.ns('my-module').sealTag();

    try {
      // do something that could fail...
    } catch (e) {
      ERR`Printing my error as an error log! ${e}`;
    }

There is much, much more to Adze than what I can present in this post, but please check it out at https://adzejs.com and let me know what you think! Try it out! Also, please give it a star to bookmark it at https://github.com/adzejs/adze if you might use it in the future!

I appreciate any feedback as well. This has been a huge labor of love for me and I hope it benefits you all as well.

Thank you!

Show HN: JAQT – JavaScript Queries and Transformations

Hi all,

I've made a JavaScript library to simplify searching/sorting/filtering in arrays of objects. It's inspired by both GraphQL and SQL, but implemented using JavaScript Proxies. Instead of creating a new language, it's all just JavaScript.

I've made it as part of an experimental database, which uses JavaScript as the query engine. The normal JavaScript map/reduce/sort functions are quite difficult for junior developers to master. JAQT is easier to explain, and can still be used in combination with any existing array functions.

Please let me know what you think of the API and its ease of use!
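
To give a flavor of the Proxy approach, here is a minimal sketch of the underlying idea. It is an illustration under assumed names (`_`, `select`), not JAQT's actual API; see the project docs for the real syntax.

    // Sketch: a Proxy turns property access into a getter function,
    // which enables a GraphQL-style "select" over an array of objects.
    const _ = new Proxy({}, {
      get: (target, prop) => (obj) => obj[prop], // _.name becomes obj => obj.name
    });

    function select(data, shape) {
      return data.map((obj) =>
        Object.fromEntries(
          Object.entries(shape).map(([key, getter]) => [key, getter(obj)])
        )
      );
    }

    const people = [
      { name: 'Alice', age: 31, city: 'Utrecht' },
      { name: 'Bob', age: 24, city: 'Ghent' },
    ];

    // Pick only the fields you ask for, GraphQL-style.
    console.log(select(people, { name: _.name, city: _.city }));
    // [ { name: 'Alice', city: 'Utrecht' }, { name: 'Bob', city: 'Ghent' } ]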

Show HN: Sisi – Semantic Image Search CLI tool, locally without third party APIs

I wrote this tool to get familiar with the CLIP model. I know many people have written similar tools with CLIP before, but I'm new to machine learning and writing a classic tool helps me study.

The unusual thing about my version is that it is written in pure Node.js, with the power of node-mlx, a Node.js machine learning framework.

The repo in the link is mostly about implementing the indexing and the CLI; the code of the model implementation lives as a Node.js module: https://github.com/frost-beta/clip

Hope this helps other learners!
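
For readers new to CLIP-style search, here is a rough conceptual sketch of the ranking step; the helpers and data shapes are assumptions for illustration, not sisi's or node-mlx's actual API.

    // CLIP-style semantic search: images and the text query live in the same
    // embedding space, so results are just the nearest vectors by cosine similarity.
    function cosine(a, b) {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // `index` is a precomputed list of { path, embedding }, built by running
    // every image through the CLIP image encoder once at indexing time.
    function search(index, queryEmbedding, topK = 5) {
      return index
        .map(({ path, embedding }) => ({ path, score: cosine(queryEmbedding, embedding) }))
        .sort((a, b) => b.score - a.score)
        .slice(0, topK);
    }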

Show HN: Epitomē – A semantic search engine for ancient text

Show HN: Server Uptime

Show HN: I made a digital circuit drawing and simulation game

Inspired by games like Turing Complete/Virtual Circuit Board/Logic World, I tried to make a mix of Aseprite and Wireworld/wired-logic, the idea being that the user can build a digital circuit using a "fullstack" pixel-art creation workflow.

The circuit is just an image. The primitives are (i) connected wires, which have undefined, 0, or 1 state during simulation (displayed brighter or darker as a function of the state), and (ii) NANDs, which are little pixel triangles. During simulation the user can interact with any wire by clicking on it, toggling its state, which is cool for messing around when learning. The simulation uses a unit-delay event-driven algorithm.

Then, on top of that, there are little wire interfaces on the left side of the image that communicate with an external system. This external system is defined in Lua and is simulated together with the main circuit (they alternate until convergence). By default there's a sandbox mode with a clock and a power-on-reset signal. The user can choose other "levels", where the API changes and there are some problems to solve, from finding whether a number is a multiple of 3, to solving the Tower of Hanoi, to finding whether a number is prime. The idea is that if users want to learn but aren't sure what to do, they can try to solve these puzzles, or they can change the Lua scripts to add their own stuff/interface for a custom project.

I've also included a small wiki (circuitopedia) with some basic digital concepts to guide those who are new or a bit rusty. It's not super detailed, but I guess it can at the very least present the concepts so the user can dig further into more serious material if they want to.

I developed the game in C with raylib, with scripting in Lua/LuaJIT. I've put the game on Steam (for Windows) and released the source code on GitHub under GPLv3. There's also a web demo version on itch.io, even though it's a bit laggy: https://lets-all-be-stupid-foreva.itch.io/circuit-artist-demo

Feedback is appreciated!
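
For readers unfamiliar with the term, here is a rough sketch of what a unit-delay event-driven simulation can look like. This is an illustration in JavaScript with assumed data structures (the game itself is written in C), not the game's actual code.

    // Unit-delay event-driven simulation: a gate re-evaluates one time step
    // after any of its inputs change, and only changed wires schedule new events.
    function simulate(gates, wires, initialEvents, maxSteps = 1000) {
      // gates: [{ inputs: [wireId, wireId], output: wireId }]  (NAND gates)
      // wires: Map of wireId -> 0 | 1 | undefined
      let pending = [...initialEvents]; // [{ wire, value }]
      for (let step = 0; step < maxSteps && pending.length > 0; step++) {
        const changed = new Set();
        for (const { wire, value } of pending) {
          if (wires.get(wire) !== value) { // only real changes propagate
            wires.set(wire, value);
            changed.add(wire);
          }
        }
        pending = [];
        for (const gate of gates) {
          if (gate.inputs.some((w) => changed.has(w))) {
            const [a, b] = gate.inputs.map((w) => wires.get(w));
            if (a !== undefined && b !== undefined) {
              pending.push({ wire: gate.output, value: a && b ? 0 : 1 }); // NAND
            }
          }
        }
      }
      return wires; // simulation has converged (or maxSteps was hit)
    }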

Show HN: Wordllama – Things you can do with the token embeddings of an LLM

After working with LLMs for long enough, I found myself wanting a lightweight utility for doing various small tasks to prepare inputs, locate information, and create evaluators. This library is two things: a very simple model and utilities that inference it (e.g. fuzzy deduplication). The target platform is CPU, and it's intended to be light, fast, and pip installable — a library that lowers the barrier to working with strings *semantically*. You don't need to install PyTorch, or any deep learning runtime, to use it.

How can this be accomplished? The model is simply token embeddings that are average pooled. To create this model, I extracted token embedding (nn.Embedding) vectors from LLMs, concatenated them along the embedding dimension, added a learnable weight parameter, and projected them to a smaller dimension. Using the sentence transformers framework and datasets, I trained the pooled embedding with multiple negatives ranking loss and matryoshka representation learning so they can be truncated. After training, the weights and projections are no longer needed, because there are no contextual calculations. I inference the entire token vocabulary and save the new token embeddings to be loaded into numpy.

While the results are not impressive compared to transformer models, they perform well on MTEB benchmarks compared to word embedding models (which they are most similar to), while being much smaller in size (the smallest model, 32k vocab, 64-dim, is only 4MB).

On the utility side, I've been adding some tools that I think it'll be useful for. In addition to general embedding, there are algorithms for ranking, filtering, clustering, deduplicating, and similarity. Some of them have a Cython implementation, and I'm continuing to work on benchmarking and improving them as I have time. In addition to "standard" models that use cosine similarity for some algorithms, there are binarized models that use Hamming distance. This is a slightly faster similarity algorithm, with significantly less memory per embedding (float32 -> 1 bit).

Hope you enjoy it, and find it useful. PS: I haven't figured out Windows builds yet, but Linux and Mac are supported.
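
To make the pooling and binarization ideas concrete, here is a minimal conceptual sketch in plain JavaScript. The library itself is a Python package, so none of these names are its API; this only illustrates average pooling of token vectors and Hamming-distance comparison of binarized embeddings.

    // A text embedding is just the average of its token embedding vectors.
    function averagePool(tokenVectors) {
      const dim = tokenVectors[0].length;
      const out = new Float32Array(dim);
      for (const vec of tokenVectors) {
        for (let i = 0; i < dim; i++) out[i] += vec[i];
      }
      for (let i = 0; i < dim; i++) out[i] /= tokenVectors.length;
      return out;
    }

    // Binarization keeps 1 bit per dimension: positive components become 1, the rest 0.
    function binarize(embedding) {
      return Array.from(embedding, (x) => (x > 0 ? 1 : 0));
    }

    // Hamming distance between two binarized embeddings: count of differing bits.
    function hammingDistance(a, b) {
      let d = 0;
      for (let i = 0; i < a.length; i++) if (a[i] !== b[i]) d++;
      return d;
    }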
