Every hackathon organizer has seen it: 9am Saturday, the clock starts, and forty students immediately open Google. They're not searching for the solution. They're searching for the right starting point. Auth library. Database. How do I get Stripe working in 20 minutes. Can I use OpenAI on a free tier.
They'll find answers — eventually. The question is whether they find them in a vetted, context-aware resource built for hackathons, or in a 2019 Stack Overflow thread for a version of the library that no longer exists.
That's what Codefest.ai is. And that's why we think you should make it the first tab your participants open.
The research was never the point
Ask yourself: when a hackathon team does well, what do the judges say at demo day?
They don't say "impressive — tell me about your process for selecting your database." They don't ask how the team discovered that Supabase existed, or how long it took them to learn that Next.js has an App Router now. That entire phase of a hackathon — the tool selection, the googling, the "wait which version is current" — is invisible to every outcome that anyone involved actually cares about.
What judges ask about: the problem you identified, why it matters, whether the thing works, how you'd take it further. What sponsors remember: did these students understand our challenge, did they ship something real, would we want to hire them. What participants remember: the moment something clicked, the time pressure, building with their team.
Nobody has ever said "that was the hackathon that shaped me — I'll never forget the three hours I spent figuring out which authentication library to use."
That friction was never pedagogy. It was always waste. And the question of whether a tool that eliminates it is "appropriate for an academic setting" is a category error — it's asking whether it's okay to remove something that was never teaching anything in the first place.
This is sharper in an AI hackathon
If your event involves AI tools — and increasingly, all of them do — the argument above becomes even more obvious.
In an AI hackathon, the differentiation between teams is almost entirely what they imagined and how they applied the tools, not whether they knew the tooling existed before the clock started. The interesting question is the problem statement. The creative leap. The application to a specific community or context. Whether the team understood the domain well enough to build something that actually matters.
The winning team didn't win because they spent longer on npm install. They won because they understood the problem better, scoped smarter, and built something that landed with the judges. Codefest compresses the irrelevant part. It doesn't touch the part that matters.
Penalizing participants for having a curated starting point is like penalizing a carpenter for owning a good set of tools before a timed competition. The craft is the cabinet. Not the time spent at the hardware store.
What we actually are
Codefest.ai is a curated reference layer for hackathon participants. We link to GitHub repos and official docs. We don't host code, write it for them, or build anything on their behalf. Every component in our library already exists as a public, open-source project — we've just added context: setup time estimates, difficulty ratings, compatibility notes, and which challenge domains each tool is suited for.
Think of it like a physical handbook. Chemistry students use the CRC Handbook during problem sets. It has formulas in it. Nobody calls it cheating. The knowledge was always public — the handbook just makes it findable in the right moment.
Your students are going to look this stuff up anyway. The question is what they find.
What we are not
We are not an AI that writes their project. We don't generate code, design their architecture, or solve their problem statement. We are a structured resource list.
If your concern is that AI tools are doing too much for participants — that's a legitimate pedagogical conversation worth having. But that conversation is about GitHub Copilot, ChatGPT, and Claude. Not about a curated link library with setup time estimates.
Banning Codefest.ai while permitting Google is like banning the bibliography but allowing the library. The same links are one search away. We just organized them.
Better projects are better for everyone
Here's what we've noticed from studying winning hackathon projects: the teams that perform best aren't the ones with the most technically sophisticated members. They're the ones who got to a working prototype the fastest and spent the remaining time iterating on the problem.
Every hour saved on "which payment library should I use" is an hour spent on the actual challenge you designed. Your sponsors notice. Your judges notice. Your students remember the hackathon as the one where they actually built something, not the one where they spent Saturday morning reading documentation.
Curated starting points raise the floor. The ceiling is still entirely up to your participants.
We align with what you're already trying to do
Our library is organized around SDG-aligned challenge domains: climate, health equity, civic infrastructure, financial inclusion, food systems. If you're running a hackathon with a social impact theme — which most institutional hackathons increasingly are — we're organized around the same framework you're using to design the prompt.
We're building organizer tools specifically because we know the challenge prompt is the most important thing you control. When the execution gap between participants is shrinking (and it is, fast), the quality of the problem statement is what determines the quality of the projects. We want to help you design better prompts, brief participants more effectively, and create better alignment between what your sponsors want and what your students build.
That work is coming. For now, the participant layer is live. It's free. It works offline. And your students will spend less time on setup and more time on the problem you gave them.
The practical case
- No login required to browse the full library. Students can use it anonymously.
- No code downloads. Every component is a link to a public GitHub repo or official docs page.
- No ads, no sponsored placements. Curation is based on hackathon-tested utility, not who paid us.
- Works on bad wifi. The library is statically rendered. No API calls blocking the page load.
- SDG-aligned. Challenge domains match the UN framework most institutional hackathons already use.
- Open about what it is. We're a reference resource. We say so clearly. There's nothing hidden.
A simple test
Before your next event, send this page to one skeptical faculty colleague and one of your most experienced participants. Ask them both: "Does this cross a line?"
We're confident about what they'll say. But you should ask rather than assume, and we hold ourselves to the same standard.
If there's something in our library that you think shouldn't be there, or a domain we've missed, or a way we could be more useful to your specific institution, we want to hear it. We're building this with organizers, not at them.
Codefest.ai is free for all hackathon participants. Organizer tools are in development — if you want early access or want to share how your institution runs hackathons, reach out at hello@codefest.ai.