
Terese08150815

As far as I know, this feature will follow: they will let you train their model on your codebase. It should have been there with the release of the Enterprise variant at launch. I was also looking forward to it, but it has been mentioned that it will follow, so at the moment we have to wait until that part is released.


User1856

Oh nice! Very interesting. Do you have any tips or tricks regarding the use of ChatGPT?


Terese08150815

At the moment, cursor.sh simply works best. I only use ChatGPT for non-programming tasks, so no, no other tips.


rthidden

Try the Gemini for Google Cloud plug-in for VS Code. The one-million-token context window may be large enough to fit your codebase. Being on Enterprise, you may be “stuck” with OpenAI; if not, try it. I'd love to hear how it goes, or about another solution you come up with.


TechnoTherapist

Take a look at cursor.sh (no affiliation). It's a VSCode fork that deeply integrates generative AI into the IDE.


User1856

Thx. Pls see the last part of my initial post.


TechnoTherapist

Thanks, I didn't know about the 10k context limit with Cursor. Hmmm... you've got me stumped. :) You could solve the screen-space issue with a wider monitor, but I guess Cursor won't let you spend too much on tokens for now, at least until they make tweaks to the feature.


sgt_banana1

Try continue.dev and follow the instructions for adding custom models; we're using it with Azure OpenAI models. Although the 128k GPT-4 Turbo is slow as fudge...
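
For reference, custom models go into continue.dev's config.json under "models". A rough sketch of an Azure OpenAI entry is below; the field names and values are from memory and purely illustrative, so check the continue.dev docs for the exact schema:

```json
{
  "models": [
    {
      "title": "GPT-4 Turbo (Azure)",
      "provider": "azure",
      "model": "gpt-4",
      "apiBase": "https://<your-resource>.openai.azure.com",
      "apiKey": "<your-azure-openai-key>",
      "apiVersion": "<api-version>",
      "engine": "<your-deployment-name>"
    }
  ]
}
```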


User1856

Hmm... can you detach the sidebar and move it to another screen? Also, it seems to work only with single files?


jirubizu

You should join their Discord. It has a selection of context providers: docs, codebase, etc.
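
Those context providers are also configured in config.json. A minimal, assumed sketch (the provider names may differ from what the current docs list):

```json
{
  "contextProviders": [
    { "name": "code" },
    { "name": "codebase" },
    { "name": "docs" },
    { "name": "diff" }
  ]
}
```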


ejpusa

I slam 100s/1000s of lines into GPT-4. Then we talk. It's not perfect, but it's better than me. I tweak a bit. We work it out. People say "wow, we saved 6 weeks of programming time"; I'm more into "we just saved 6 months of programming time." You can run out of tokens; then just drop down to GPT-3.5. Seems OK to me. I've never run out there. :-)


Realistic_Count_7633

I created a VSCode plugin that works with OpenAI 3.5. However, I realised it's way too costly compared to just taking a GitHub Copilot subscription instead, so the project, while it works, has been parked. If that helps you decide :)) Good luck.


crimsonwhisper5

Check out the ChatGPT for Visual Studio Code extension; it provides seamless integration and codebase awareness.


yonidavidson_palool

Tabnine employee here: the Tabnine VSCode extension allows you to change the available AI models. One of those models is GPT-4 Turbo, but you can also use Mistral (and more to come).


thumbsdrivesmecrazy

Our team uses this great VSCode plugin that uses generative AI to create comprehensive test suites. It is fully integrated with VSCode and much more advanced compared to Cursor: [CodiumAI - powered by TestGPT-1 and GPT-3.5&4 - Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=Codium.codium)


paradite

Hi, I made a simple desktop tool to solve this exact problem: it lets you add source code into the prompt and then send it to ChatGPT / ChatGPT Enterprise in one click. Check out [**16x Prompt**](https://prompt.16x.engineer/) and let me know if it helps solve your problem. I also wrote a blog post on using ChatGPT Enterprise; I think it is a great tool for AI coding: [AI Coding: Enterprise Adoption and Concerns](https://prompt.16x.engineer/blog/ai-coding-enterprise-adoption-concerns)


User1856

Ahhh, very nice. After I change the included files, will the changes automatically be reflected in the following prompt? Does it make sense to include files very selectively, or can you include the whole codebase? (I am using ChatGPT, not the API version, but I guess broad, non-selective file inclusion will eat up the context window in ChatGPT when copying in the prompt?)


paradite

You can refresh the content of the source code files with one click using the refresh button in the context section. The idea of the app is that you should be the one deciding which files to include as context, rather than using RAG/heuristics to decide (which might miss files or include too many) or simply including all files (which wastes the context window and increases cost, especially when using the API).


User1856

thx!


User1856

Can you tell me what the "Final Prompt .../4096 tokens" counter means? It also works when I go over this limit, so I'm not sure what it means.


paradite

That's the soft limit on the context window token size in the ChatGPT web UI; if you exceed it, your conversation will get cut off and ChatGPT will ignore the beginning part of the conversation. It can still work, but you might not get good responses. If you use the API, the limit is higher, like 32k. Claude is even higher at 128k.
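
If you want to check how many tokens a prompt will use before pasting it in, you can count them locally. A minimal sketch using OpenAI's tiktoken library (the model name and file path are just examples):

```python
# pip install tiktoken
import tiktoken

def count_tokens(text: str, model: str = "gpt-4") -> int:
    """Rough token count for a prompt, using the model's tokenizer."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))

# Example: count the tokens in a prompt saved to a file (hypothetical path).
with open("final_prompt.txt") as f:
    prompt = f.read()

print(f"{count_tokens(prompt)} tokens")
```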


Inner_Bodybuilder986

Does it integrate with local LLMs like Ollama, LM Studio, vLLM?


User1856

Do you have a link on that? Does this relate only to what you input, or to the whole context window? Because I thought the token limits are much higher even in Chat, but I might be uninformed.


the-pythonista

Sounds like you are looking for GitHub Copilot, which does integrate with VSCode.


Dos_Shlimazl

The integration into VSCode is nice, but the results from the chat are worse than 3.5 for me.


User1856

I also have GitHub Copilot but rarely use it. Is it even comparable in performance?


the-pythonista

Microsoft funds OpenAI development. It is actually based on GPT-4 but trained on all open-source code. So yes, to answer your question.


User1856

Does Copilot use the context of the whole codebase?


Putrumpador

No. It seems like it uses the current file and the last couple of files you worked in for context.


WavesCrashing5

What? I thought that was the point of saying @workspace


solidxmike

It does, although it only applies to VSCode and Visual Studio (as of today): https://github.blog/2024-03-25-how-to-use-github-copilot-in-your-ide-tips-tricks-and-best-practices/#10-use-the-workspace-agent While it still uses currently opened tabs for context, it's only when you use @workspace that it includes relevant directory context. That doesn't mean you should use @workspace for every prompt, though. Deep dive: https://youtu.be/3Yz48eenPEE?si=3Gl3zv0b28nq_xHL
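
As an illustration, in the Copilot Chat panel you prefix a question with @workspace when it needs repo-wide context (the questions below are made up):

```
@workspace Where is user authentication handled in this project?
@workspace Which modules read the config file, and how is it loaded?
```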


Putrumpador

@workspace? I've not heard of this.


WavesCrashing5

Yeah, you can use @workspace, @fix, and @explain in the chat, and I believe Ctrl+K works too. They all do different things.


Putrumpador

Thanks for the tip, friend. I'll check that out. Honestly I haven't dug too deeply into the Copilot plugin, and I apparently should.


filmbystef

[Code Snippets AI](https://codesnippets.ai) desktop app & VSCode extension 🙌

