10 Interesting Open Source Projects That Are Gaining Stars Fast

I’ve been poking around GitHub lately, and a few projects keep popping up with star counts climbing faster than expected. Nothing wildly hyped—just solid tools that solve real problems. Here’s a roundup of ten that stood out, all relatively new or newly active, with legit reasons for the attention.

First up is yt-dlp. You probably know youtube-dl, but this fork has been taking over because it actually keeps up with site changes. One command to download a video, and it handles playlists, subtitles, and even live streams. The star count’s been steady for years, but recently it spiked again after a few big sites broke the original. Basically, if you deal with online media, this is the tool you end up using.
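If you haven't used it, the interface is about as minimal as it gets. A quick sketch (the URLs are placeholders, and the demo skips itself politely if yt-dlp isn't on your PATH):

```shell
# Skip gracefully when yt-dlp isn't installed.
command -v yt-dlp >/dev/null 2>&1 || { echo "yt-dlp not installed"; exit 0; }

ver="$(yt-dlp --version)"   # releases are date-versioned, e.g. 2024.08.06
echo "yt-dlp $ver"

# Typical invocations (placeholder URLs, shown but not run here):
#   yt-dlp --embed-subs -o '%(title)s.%(ext)s' 'https://example.com/watch?v=VIDEO_ID'
#   yt-dlp -x --audio-format mp3 'https://example.com/playlist?list=PLAYLIST_ID'
```

The output template (`-o '%(title)s.%(ext)s'`) is the feature people tend to discover last and love most.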

Warp is a terminal emulator written in Rust. It’s fast, looks clean, and has built‑in AI that actually helps—like suggesting commands based on history or explaining error messages inline. I know terminal purists might roll their eyes, but for day‑to‑day work it reduces the friction of remembering obscure flags. The project’s been open source for a while, but the recent 0.12 release added proper SSH support and a new theme system, which pushed its stars way up.

Another interesting one is Ruff, a Python linter written in Rust. It’s not just fast—it’s 10–100x faster than existing tools like Flake8 or Pylint. The best part? It ships with hundreds of rules built in, zero configuration needed. Just run ruff check . and it catches everything from syntax errors to unused imports. I’ve seen entire codebases cleaned up in seconds. That kind of speed makes it an easy recommend for any Python project.
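Here's roughly what that looks like in practice: a throwaway file with an unused import, checked and then autofixed. (The sketch skips itself if ruff isn't installed.)

```shell
command -v ruff >/dev/null 2>&1 || { echo "ruff not installed (try: pip install ruff)"; exit 0; }

# A throwaway file with an unused import for ruff to flag:
cat > ruff_demo.py <<'EOF'
import os
import sys

print(sys.version)
EOF

# ruff exits non-zero when it finds violations, so don't let that stop the script:
report="$(ruff check ruff_demo.py || true)"
echo "$report"                          # expect an F401 for the unused `os`

ruff check --fix ruff_demo.py || true   # the autofix removes the unused import
rm -f ruff_demo.py
```

The F401 it reports is marked fixable, which is why `--fix` can clean it up without any configuration.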

InfluxDB 3.0 hasn’t officially launched, but the community’s been tracking the new core written in Rust. The recent star growth came from the announcement that it will support SQL natively while retaining its time‑series performance. That’s a big deal for teams that want to use standard tooling without sacrificing speed. The repo people are starring is still the original one, but the buzz around the rewrite is real.

Fly.io’s CLI tool flyctl has been quietly gaining stars as more people discover how simple it is to deploy apps globally. The key insight is that it uses Firecracker microVMs to run containers close to users, with no need to manage servers. One command: fly launch. That’s it. For anyone tired of configuring Kubernetes, this feels like a breath of fresh air.

Zed is a code editor built by the team behind Atom and Tree‑sitter. It’s designed for performance—editing even large files feels instant. The collaboration features are also smooth: CRDT‑based shared editing keeps pair programming feeling immediate, and calls and screen sharing run over WebRTC. Zed recently opened its beta, and the star count jumped because developers are genuinely looking for a faster alternative to VS Code. It’s not there yet for every workflow, but the direction is promising.

Bun is a JavaScript runtime and toolkit that replaces Node.js, npm, and Node‑based tools like ts‑node and nodemon. It’s written in Zig and aims to be drop‑in compatible. The speed difference is noticeable: package installs finish in seconds, and scripts start in under 50ms. The star count has been climbing steadily, and it’s now solid enough for real projects, though you should still expect the occasional compatibility gap with native Node modules.
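To make the drop‑in claim concrete, here's a tiny sketch: Bun executes a TypeScript file directly, with no transpile step and no ts‑node. (The demo skips itself if bun isn't installed.)

```shell
command -v bun >/dev/null 2>&1 || { echo "bun not installed"; exit 0; }

# Run a TypeScript file directly -- no tsc, no ts-node:
cat > hello.ts <<'EOF'
const greet = (name: string): string => `hello, ${name}`;
console.log(greet("bun"));
EOF
out="$(bun run hello.ts)"
echo "$out"
rm -f hello.ts

# Likewise, `bun install` reads package.json and writes a lockfile,
# standing in for npm in most projects.
```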

Coolify is an open‑source self‑hosted PaaS that works like Vercel or Netlify but on your own servers. It connects to your servers over SSH, deploys Docker containers, and handles SSL, reverse proxying, and monitoring for you. The interface is clean, and it supports databases, static sites, and Node.js apps. Star growth came from people who want to avoid vendor lock‑in but still want a sane deployment experience. Setup is one command: curl -fsSL https://cdn.coollabs.io/coolify/install.sh | bash.

Langfuse is an observability platform for LLM applications. It tracks prompts, completions, token usage, and latency, and gives you a nice dashboard to debug your AI workflows. The project went viral after a few popular YouTube videos showed how to use it with LangChain and OpenAI. It’s basically PostHog for LLMs. If you build anything with models, this is a no‑brainer.
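For the curious, self-hosting is a short detour. A sketch following the project's README (it assumes Docker with a running daemon and skips itself otherwise; paths and ports may change between releases):

```shell
# Self-hosting sketch; bows out unless a Docker daemon is actually up.
command -v docker >/dev/null 2>&1 || { echo "docker not installed"; exit 0; }
docker info >/dev/null 2>&1 || { echo "docker daemon not running"; exit 0; }

git clone --depth 1 https://github.com/langfuse/langfuse.git
cd langfuse
docker compose up -d    # dashboard on http://localhost:3000 by default
```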

Finally, Dify is an open‑source LLM app builder. You can create chatbots, text generators, or agents via a drag‑and‑drop interface, then deploy them as APIs. It supports multiple models (OpenAI, Anthropic, local models) and includes RAG pipelines out of the box. The star count jumped because it lowers the barrier for non‑ML engineers to build AI applications. And it’s free, no credit card required.
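Once an app is published, calling it is a plain HTTP request. A hedged sketch with a placeholder API key, so a real call would return 401 until you substitute your own (the trailing `|| true` keeps the demo alive even with no network):

```shell
# Hypothetical request to a Dify app published as an API; key is a placeholder.
curl -sS -X POST 'https://api.dify.ai/v1/chat-messages' \
  -H 'Authorization: Bearer app-PLACEHOLDER-KEY' \
  -H 'Content-Type: application/json' \
  -d '{"inputs": {}, "query": "Hello!", "response_mode": "blocking", "user": "demo-user"}' \
  || true
```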

None of these are flashy for the sake of being flashy. They all solve an actual pain point with minimal config. If any catch your eye, go give them a star—or better yet, try running them. That’s the best way to tell if they’re actually useful.