User wants a keyboard shortcut in VS Code that sends the content of the current file, plus the other open tabs, to an AI endpoint for autocomplete. They want LLMs to act as an advanced autocomplete with broad context, something they feel current tools like Copilot don't fully provide.
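The core mechanic above is: gather the active file and the other open tabs, then assemble them into one prompt. A minimal sketch of that assembly step is below; the function name and the `{ path, text }` shape are my own illustration (in a real extension the strings would come from the VS Code API, e.g. `vscode.workspace.textDocuments`), kept as plain data here so the logic stands on its own:

```javascript
// Sketch: build the autocomplete prompt from the active file plus other
// open tabs. Inputs are plain objects so this is runnable outside VS Code;
// an extension would populate them from the editor API instead.
function buildAutocompletePrompt(activeFile, otherTabs, instructions) {
  // Concatenate every other open tab, labelled by its path, as context.
  const context = otherTabs
    .map((tab) => `--- ${tab.path} ---\n${tab.text}`)
    .join('\n\n');
  // Instructions first, then context, then the file to complete.
  return [
    instructions,
    'Other open tabs for context:',
    context,
    `Current file (${activeFile.path}):`,
    activeFile.text,
  ].join('\n\n');
}
```

A keybinding would then trigger a command that calls this and sends the result to the endpoint.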
2. For the second prompt, I took all four outputs from the first prompt, combined them into one text file, and pasted that into a fresh chat with each of the four models, with this prompt:

   I asked 4 LLMs: I want to be able to press a keyboard shortcut in VS Code, and the whole content of my current code (plus the other opened tabs), plus instructions, would be sent to an OpenAI-style endpoint; the AI would autocomplete and add code for the "section" it decides it is confident enough to predict. How should we do it? Below are the responses; considering them plus your own judgement, help me make that work. I want to use Groq; this is a snippet they recommend:

       import { Groq } from 'groq-sdk';

       const groq = new Groq();
       const chatCompletion = await groq.chat.completions.create({
         messages: [{ role: 'user', content: '' }],
         model: 'moonshotai/kimi-k2-instruct',
         temperature: 0.6,
         max_completion_tokens: 4096,
         top_p: 1,
         stream: true,
         stop: null,
       });
       for await (const chunk of chatCompletion) {
         process.stdout.write(chunk.choices[0]?.delta?.content || '');
       }

   I will be fiddling with the right prompts, just focus on the mechanics. My main goal is to use LLMs (because they are now fast enough) as autocomplete, instead of Cursor Tab or GitHub Copilot type of things.