The best Show HN stories from Hacker News from the past day
Latest posts:
Show HN: Chrome extension to close Zoom/Notion tabs after launching desktop app
Hi HN! I use the desktop versions of Zoom, Notion, and Asana, so at the end of the day, I have a ton of Chrome tabs left over from these services launching their apps. I threw together a little extension to clean these tabs up.

Do folks tend to use the browser versions of these apps? Or are there other sites that this extension should support?
Show HN: Syncing data to your customer’s Google Sheets
Hey HN! Charles here from Prequel (https://prequel.co). We just launched the ability to sync data from your own app/db/data warehouse to any of your customer's Google Sheets, CSV, or Excel, and I wanted to share a bit more about how we built the Google Sheets integration. If you're curious, here's a quick GIF demo of our Google Sheets destination: https://storage.googleapis.com/hn_asset/Prequel_GoogleSheetsDemo.webp

Quick background on us: we make it easy to integrate with and sync data to data warehouses. The problem is, there are plenty of folks who want access to their data but don't have, or don't know how to use, a data warehouse: FP&A teams, customer success teams, etc.

To get around that, we added some non-db destinations to Prequel: Google Sheets, CSV, and Excel. We had to rework some core assumptions to get Google Sheets to work.

By default, Prequel does incremental syncs, meaning we only write net-new or updated data to the destination. To avoid duplicate rows, we typically perform those writes as upserts, which is pretty trivial in most SQL dialects. But since Google Sheets is not actually a db, it doesn't have a concept of upserts, and we had to get creative.

We had two options. One was to force all Google Sheets syncs to be "full refreshes" every time (i.e., grab all the data and brute-force write it to the sheet). The downside is that this can get expensive quickly for our customers, especially when data gets refreshed at higher frequencies (e.g., every 15 minutes).

The other, and better, option was to figure out how to perform upserts in Sheets. To do so, we read the data from the sheet we're about to write to into memory, store it in a large map keyed by primary key, reconcile it with the data we're about to write, and then dump the contents of the map back to the sheet.

To make the user experience smoother, we also sort the rows by timestamp before writing them back. This guarantees that we don't accidentally shuffle rows with every transfer, which might leave users feeling confused.

"Wait, you keep all the data in memory… so how do you avoid blowing up your pods?" Great question! Luckily, Google Sheets has pretty stringent cell/row size limits. This lets us restrict the amount of data that can be written to these destinations (we throw a nice error if someone tries to sync too much data), which also guarantees that we don't OOM our poor pods.

Another interesting problem we had to solve was auth: how do we let users give us access to their sheets in a way that both feels intuitive and upholds strong security guarantees? The cleanest user experience seemed to be asking the spreadsheet owner to share access with a new user, much like they would with any real human user. To make this possible without creating a superuser with access to _all_ the sheets, we programmatically generate a different user for each of our customers. We do this via the GCP IAM API, creating a new service account every time, and then auth into the sheet through that service account.

One last fun UX challenge was how to prevent users from editing the "golden" data we just synced. It might not be immediately clear to them that this data is meant as a source-of-truth record rather than a playground. To get around this, we create protected ranges and prevent them from editing the sheets we write to. Sheets even adds a little padlock icon to the relevant sheets, which helps convey the "don't mess with this" message.

If you want to take it for a spin, you can sign up on our site or reach us at hello (at) prequel.co. Happy to answer any other questions about the design!
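The upsert-then-sort approach described above can be sketched in a few lines. This is a minimal illustration, not Prequel's actual code; the row shape and field names (`id`, `updated_at`) are hypothetical:

```python
def upsert_rows(existing_rows, incoming_rows, pk="id", ts="updated_at"):
    """Merge incoming rows into existing rows by primary key, then
    return the result sorted by timestamp for a stable row order."""
    merged = {row[pk]: row for row in existing_rows}  # index current sheet data
    for row in incoming_rows:                         # new/updated rows win
        merged[row[pk]] = row
    return sorted(merged.values(), key=lambda r: r[ts])

existing = [
    {"id": 1, "name": "alice", "updated_at": "2023-01-01"},
    {"id": 2, "name": "bob",   "updated_at": "2023-01-02"},
]
incoming = [
    {"id": 2, "name": "bobby", "updated_at": "2023-01-03"},  # update
    {"id": 3, "name": "carol", "updated_at": "2023-01-04"},  # insert
]
rows = upsert_rows(existing, incoming)
# id 2 is updated in place, id 3 is appended; no duplicate rows
```

The sort keeps row order deterministic across transfers, which is what prevents the row-shuffling confusion mentioned above.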
Show HN: Search inside 15,000 pitchdeck slides
Show HN: YouTube Summaries Using GPT
Hi, I'm Alex. I created Eightify to take my mind off things during a weekend, but I was surprised that my friends were genuinely interested in it. I kept going, and now it's been nine weeks since I started.

I got the idea to summarize videos when a friend once again sent me a lengthy video. This happens to me often; the video title is so enticing, and then it turns out to be nothing. I had been working with GPT for six months by then, so everything looked like a nail to me.

It's a Chrome extension, and I'm offering 5 free tries for videos under an hour. After that, you have to buy a package. I'm not making money yet, but it pays for GPT, which can be pricey for long texts. And some of Lex Fridman's podcasts are incredibly long.

I'm one of those overly optimistic people when it comes to GPT. So many people tell me, "Oh, it doesn't solve this problem yet; let's wait for GPT-4." The real issue is that their prompts are usually inadequate, and it takes anywhere from two days to two weeks to make them work: testing and debugging, preferably with automated tests. I believe you can solve many problems with GPT-3 already.

I would love to answer any questions you have about the product and GPT in general. I've invested at least 500 hours into prompt engineering. And I enjoy reading other people's prompts too!
Show HN: Don't lose track of HN post comments
Yesterday, I lost track of some HN post comments.

I fixed that problem → http://hntoast.com

I made this in the last couple of hours, so if you have any feature requests or feedback, let me know.

I'd love to hear your thoughts about this tool.
Show HN: GPT Joke Writer
An AI joke generation tool built on top of OpenAI's GPT-3 language models, fine-tuned with ~15k late-night comedy monologue jokes.

The web app and model creation are all open-sourced.
Show HN: Doc Converter – Convert PDF docs to Word documents on your computer
Show HN: 1Kb Webspace
Hey guys, I wanted to introduce you to my hacknight project.

It is a tribute to onekb.net, which stopped its service a few years ago. It is currently still in beta, where external resources are also possible (but not the point ;) ), so I can get your opinions.

When it is finished, the source code will be open source. The secret word is therefore also hackernews.

P.S.: The source code is currently 2.4Kb; I'm trying to make it smaller. 1Kb would be my goal.
Show HN: I'm a doctor and made a responsive breathing app for stress and anxiety
Hey HN! Some more info: I'm an NHS doctor and the founder of Pi-A (https://www.pi-a.io), which developed Lungy (https://www.lungy.app). Lungy is an app (iOS only for now) that responds to breathing in real time and was designed to make breathing exercises more engaging and beneficial to do. It hopefully has many aspects of interest to the HN community: real-time fluid, cloth, and soft-body sims running on the phone's GPU.

My background is as a junior surgical trainee, and I started building Lungy in 2020 during the first COVID lockdown in London. During COVID, there were huge numbers of patients coming off ventilators, and they are often given breathing exercises on a worksheet, plus disposable plastic devices called incentive spirometers to encourage deep breathing. This is intended to prevent chest infections and strengthen breathing muscles that have weakened. I often noticed the incentive spirometer would sit by the bedside whilst the patient was on their phone; this was the spark that led to Lungy!

The visuals are mostly built using Metal, with one or two using SpriteKit. There are 20 to choose from, including boids, cloth sims, fluid sims, a hacky DLA implementation, and rigid-body + soft-body sims. The audio uses AudioKit with a polyphonic synth, and a sequencer plays generated notes from a chosen scale (you can mess around with the sequencer and synth in Settings/Create Music).

There are obviously lots of breathing and meditation apps out there; I wanted Lungy to be different. It's about tuning into your surroundings and noticing the world around you, so all the visuals are nature-inspired or have some reference to the physical world.

I didn't like that other apps required large downloads and/or a wifi connection, so Lungy's download size is very small (<50MB), with no geometry, video, or audio files.

Lungy is initially a wellness app, but I'd like to develop a medical-device version for patients with breathing problems such as asthma, chronic obstructive pulmonary disease (COPD), and long COVID. Thanks for reading; would love to hear feedback!
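The "sequencer plays generated notes from a chosen scale" idea above is easy to sketch. This is a hypothetical illustration (the app itself uses AudioKit in Swift); the scale, MIDI numbers, and function names are my own assumptions, not the app's code:

```python
import random

# C major pentatonic, as MIDI note numbers starting at middle C
C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69]

def generate_sequence(scale, length, octaves=2, seed=None):
    """Pick random notes from the scale across a couple of octaves,
    so every generated note is guaranteed to be in key."""
    rng = random.Random(seed)
    pool = [note + 12 * octave for octave in range(octaves) for note in scale]
    return [rng.choice(pool) for _ in range(length)]

seq = generate_sequence(C_MAJOR_PENTATONIC, length=8, seed=42)
# feed each note to a synth on the sequencer's clock tick;
# constraining to a scale is what keeps generated audio consonant
```

Restricting the note pool to one scale is the standard trick for making randomly generated sequences sound musical rather than chaotic.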