An LLM has “zero context” about your project’s specific stack and style guidelines. In other words, an AI might produce a generic <Modal> component, but integrating it into your app’s unique architecture is still a human task.
This is very outdated. Nowadays, in Copilot for example, you add files to the context and tell it, “hey, look at how I did that thing over there; do this new thing following the same structure, with the same naming conventions,” and that’s enough. And tools like Cursor just throw your whole project into the context by default.