Giving Cursor adequate context
hi guys, who here has ever built a PRD document (on Convex) with a Tasks/Subtasks breakdown so complete that you just had to repeatedly type "Next task" into Cursor?
I have never formalized content into a "PRD", but I do add an MDC file describing my tech stack and key design decisions/tradeoffs.
I set Cursor to always read this file for every new prompt conversation.
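For anyone who hasn't set one up: Cursor rule files are markdown with a YAML frontmatter block, and `alwaysApply: true` is what makes Cursor attach it to every conversation. A minimal sketch (the filename, stack items, and decisions here are made up, not my actual file):

```
# .cursor/rules/stack.mdc
---
description: Tech stack and key design decisions
alwaysApply: true
---

- Stack: Next.js frontend, Convex backend, TypeScript everywhere
- Decision: blog lives at /blog on the main domain, not a subdomain,
  so the whole app + content stays in one repo the editor can read
```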
I do plan to pursue an adjacent idea, which is to add a /blog URL route and, importantly, keep it on the same domain as my app, as opposed to creating a blog.myApp.com subdomain.
I didn't get it, what does /blog have to do with Cursor?
Here's an excerpt from my X post where you can read about this in more detail.
https://x.com/Matt_Luo/status/1912524697389916262
I also plan to implement a similar, bigger idea: Append a /blog URL route on claritytext.com, which importantly, keeps it on the same domain as claritytext.com. Write the blog articles as, say, MDX files. And command the code editor to read particular articles related to a new feature I am trying to build. This /blog URL route is in contrast to creating a blog.claritytext.com subdomain, which would manifest as a different app altogether. LLMs have a more difficult time reading multiple repos and apps.
Here is fundamentally the same content on LinkedIn, edited for clarity, typos, etc.:
https://www.linkedin.com/posts/matt-luo-0b1b9919_heres-my-learned-experience-software-programming-activity-7318294656222433284-9VUv?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAPpqkUBiEzVkQNGb0jjb2azWtmdaywSc4c
That version also assumes a less technical audience.