The AI industry defaults to embeddings for everything. Embed your code, vector search for relevant chunks, stuff them in a prompt. It's the RAG pattern, and it works — for text.
Code is not text. Code has structure: functions call other functions, types reference other types, files import other files. Embeddings flatten this structure into vectors, losing the relationship information that makes code intelligence possible.
What Embeddings Are Good At
Embeddings capture semantic similarity. "Find code similar to this function" returns functions that do comparable things, even if they use different names and patterns.
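The retrieval shape can be sketched in a few lines. This is a toy stand-in, not a real embedding model: it uses a bag of identifier tokens plus cosine similarity, where a production system would use a learned neural embedding and a vector index. The snippet names and the `embed` helper are hypothetical.

```python
import math
import re
from collections import Counter

def embed(code: str) -> Counter:
    # Toy stand-in for a learned embedding: a bag of identifier tokens.
    # A real system would call a neural embedding model here.
    return Counter(re.findall(r"[A-Za-z_]\w*", code))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse token-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A tiny "corpus" of code chunks (hypothetical examples).
snippets = {
    "sum_list": "def sum_list(xs):\n    total = 0\n    for x in xs:\n        total += x\n    return total",
    "parse_json": "def parse_json(raw):\n    import json\n    return json.loads(raw)",
}

# Query: a function that accumulates values, written with different names.
query = "def accumulate(items):\n    acc = 0\n    for item in items:\n        acc += item\n    return acc"

q = embed(query)
ranked = sorted(snippets, key=lambda name: cosine(q, embed(snippets[name])), reverse=True)
print(ranked[0])  # the accumulation loop outranks the JSON parser
```

Despite sharing no identifier names with the query, `sum_list` scores higher than `parse_json` because its shape (loop, accumulator, return) overlaps more. That is the property embeddings give you; note that nothing here knows who calls whom.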
Good for: