faangguyindia
On an old codebase, I usually only edit one function at a time with an LLM.

On greenfield projects, I ask Claude Sonnet to write all the function signatures, with return types etc.

Then I have a script which sends these signatures to Google Flash, which writes all the functions for me.

All this happens in parallel.

I've found that if you limit the scope, Google Flash writes the best code, and it's ultra-fast and cheap.
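The fan-out step described above could be sketched like this; `implement_signature` stands in for the real Flash API call (a hypothetical wrapper, not part of the original post), with each signature sent as its own narrowly scoped request:

```python
# Minimal sketch of fanning out one request per signature, assuming a
# hypothetical implement_signature() wrapper around the actual LLM API.
from concurrent.futures import ThreadPoolExecutor

def implement_signature(signature: str) -> str:
    # Placeholder for the real LLM call; returns a stub body here.
    return f"{signature}\n    raise NotImplementedError\n"

def implement_all(signatures: list[str], workers: int = 8) -> list[str]:
    # Each signature is independent, so all requests run concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(implement_signature, signatures))

if __name__ == "__main__":
    sigs = [
        "def add(a: int, b: int) -> int:",
        "def sub(a: int, b: int) -> int:",
    ]
    for body in implement_all(sigs):
        print(body)
```

`pool.map` preserves input order, so the generated bodies line up with the signatures Claude produced.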

mrtesthah
This symbolic link broke it:

srtp -> .

  File "repogather/file_filter.py", line 170, in process_directory
    if item.is_file():
       ^^^^^^^^^^^^^^
OSError: [Errno 62] Too many levels of symbolic links: 'submodules/externals/srtp/include/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp/srtp'
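One way to make a walk robust against this kind of self-referencing link (`srtp -> .`) is to use `os.walk`, which by default does not descend into symlinked directories, and to skip file-level symlinks explicitly. A minimal sketch (the `iter_files` name is mine, not from repogather):

```python
import os
from pathlib import Path

def iter_files(root):
    # os.walk defaults to followlinks=False, so a symlinked directory
    # like `srtp -> .` is listed once but never recursed into,
    # avoiding the ELOOP error from the traceback above.
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = Path(dirpath) / name
            if path.is_symlink():
                continue  # skip file-level symlinks as well
            yield path
```

Wrapping the `is_file()` check in a `try/except OSError` would be an alternative fix that keeps following links but tolerates loops.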
reacharavindh
Do you literally paste a wall of text (source code of the filtered whole repo) into the prompt and ask the LLM to give you a diff patch as an answer to your question?

Example,

Here is my whole project, now implement user authentication with plain username/password?

reidbarber
Nice! I built something similar, but in the browser with drag-and-drop at https://files2prompt.com

It doesn’t have all the fancy LLM integration though.

fellowniusmonk
This looks very cool for complex queries!

If your codebase is structured in a very modular way, then this one-liner mostly just works:

find . -type f -exec echo {} \; -exec cat {} \; | pbcopy

smcleod
There are so many of these popping up! Here's mine - https://github.com/sammcj/ingest
jondwillis
In this thread: nobody using Cursor, embedding documentation, using various RAG techniques…
ukuina
It's fascinating to see how different frameworks are dealing with the problem of populating context correctly. Aider, for example, asks users to manually add files to context. Claude Dev attempts to grep files based on LLM intent. And Continue.dev uses vector embeddings to find relevant chunks and files.

I wonder if an increase in usable (not advertised) context tokens may obviate many of these approaches.

faangguyindia
LLMs for coding are a bit meh after the novelty wears off.

I've had problems where the LLM doesn't know which library version I am using. It keeps suggesting methods which do not exist, etc.

As if LLMs are unaware of library versions.

The place where I've found LLMs to be most effective and effortless is the CLI.

My brother made this, but I use it every day: https://github.com/zerocorebeta/Option-K