1. Libraries and common code
1. Performance
1. Abstract structure
1. Real-world/Domain modeling
1. How do we fix it?
    1. Functional tests
    2. Enforce good debugging practices
    3. Build your own libraries
    4. Build real-world performance tests
    5. You gotta know how to design it
### Domain Modeling

The AI can be very good at getting some code to run, but it's like having an assistant who lives in another country and whom you only ever interact with through chat. Not even another country. It's like an assistant who was born in a fallout shelter and has only ever interacted with the world through a text terminal. Their entire life they have been sitting hunched over a black screen with green text, reading and reading to learn everything they can. But it's also pitch black down there. They've never _seen_ anything.

You have to tell them everything you want them to know about your specific case. Let's say you are modeling supply chain logistics. They've never actually worked on a supply chain. They've never seen a ship. They've never been annoyed by a late order or a project falling behind because of a shipping mishap. All they know is what they have read.

More than that, they have never read anything about your specific case. They only know what you have told them. You might think that this is at least as good as an intern, but even an intern sits in on meetings, chats with people in the halls, and observes the company functioning around them. They also have an intuitive grasp of things like space and time. They may have seen the ships sitting out in the harbor and realized just how hard those are to turn around.

The AI doesn't have any of that. It only knows what it's read. You have to tell it _everything_ it needs to know in order to write code that corresponds to whatever problem you want to solve.

AI sucks at domain modeling because it has no understanding of the domain. All it can do is imitate models it's read about that might sound similar.
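
One practical way to close that gap is to write the domain knowledge down as code before asking the assistant to build on it. Here's a minimal sketch in Python, with hypothetical names (`Vessel`, `PortCall`, `MIN_TURNAROUND_HOURS`) standing in for whatever your real domain uses, showing the kind of rule an intern would absorb by osmosis but an AI has to be told explicitly:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# A domain fact the AI would never "feel" on its own: big ships do not
# turn around quickly. Write it down instead of assuming it's understood.
MIN_TURNAROUND_HOURS = 24

@dataclass
class Vessel:
    name: str
    capacity_teu: int  # capacity in twenty-foot equivalent units

@dataclass
class PortCall:
    vessel: Vessel
    arrival: datetime
    departure: datetime

    def __post_init__(self) -> None:
        # Reject schedules that no real port could actually execute.
        turnaround = self.departure - self.arrival
        if turnaround < timedelta(hours=MIN_TURNAROUND_HOURS):
            raise ValueError(
                f"{self.vessel.name}: {turnaround} turnaround is shorter "
                f"than the {MIN_TURNAROUND_HOURS}h minimum"
            )

@dataclass
class Shipment:
    order_id: str
    promised_delivery: datetime
    port_call: PortCall

    def is_late(self, now: datetime) -> bool:
        # Stand-in rule: a shipment counts as late once the promised date
        # has passed and the vessel still hasn't left the port.
        return now > self.promised_delivery and now < self.port_call.departure
```

None of this is clever code. The point is that every constraint the model can't see for itself now lives somewhere it can read, and anything it generates on top of it gets checked against those rules instead of against its best guess.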