Topic: AI Coding Assistants (Who uses them and which)  (Read 16462 times)

KynloStephen66515

AI Coding Assistants (Who uses them and which)
« on: August 02, 2024, 02:06 PM »
Hey guys,

I'm curious to see who on the DC forum uses AI coding assistants in their IDEs or CLIs to enhance their daily coding workflow.

If you do use them, which one did you pick, and why?!

Ath

Re: AI Coding Assistants (Who uses them and which)
« Reply #1 on: August 02, 2024, 02:27 PM »
Simple: Nope. To me that's garbage in, garbage out.

KynloStephen66515

Re: AI Coding Assistants (Who uses them and which)
« Reply #2 on: August 02, 2024, 02:45 PM »

Simple: Nope. To me that's garbage in, garbage out.

Not sure what you mean here. 

Do you mean the quality of the code? The LLMs themselves? or am I missing something

Ath

Re: AI Coding Assistants (Who uses them and which)
« Reply #3 on: August 02, 2024, 04:23 PM »
Do you mean the quality of the code? The LLMs themselves? or am I missing something
-KynloStephen66515 (August 02, 2024, 02:45 PM)

Any example I've seen so far seems to be a garbage answer, unusable or totally unfit to what it's supposed to be tailored to. I'll wait and see what's left after the dust of this hype settles :huh:

KynloStephen66515

Re: AI Coding Assistants (Who uses them and which)
« Reply #4 on: August 02, 2024, 04:50 PM »
Do you mean the quality of the code? The LLMs themselves? or am I missing something
-KynloStephen66515 (August 02, 2024, 02:45 PM)

Any example I've seen so far seems to be a garbage answer, unusable or totally unfit to what it's supposed to be tailored to. I'll wait and see what's left after the dust of this hype settles :huh:

Which assistants have you tried? I personally know quite a few (and even work for one) that are actually incredible, especially for autocomplete, but also for quickly getting code snippets, answers, bug fixes, code smells, etc.

Shades

Re: AI Coding Assistants (Who uses them and which)
« Reply #5 on: August 03, 2024, 03:25 AM »
Which assistants have you tried? I personally know quite a few (and even work for one) that are actually incredible, especially for autocomplete, but also for quickly getting code snippets, answers, bug fixes, code smells, etc.
-KynloStephen66515 (August 02, 2024, 04:50 PM)

Not assistants per se, but I have been using a tool, 'LM Studio', to run 8 LLMs locally. This tool provides an easy way to download LLMs, lets you use one or more of them in the provided chat screen, and allows you to run one or more models (at the same time) as a server, which you can access via an API that follows the same format as the OpenAI API.
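For anyone who would rather script against such a locally hosted, OpenAI-compatible server than use the chat screen, a minimal Python sketch is below. The base URL, port, and model name are assumptions (LM Studio's local server commonly defaults to http://localhost:1234/v1); adjust them to match your own setup.

Code: Python
# Minimal sketch: query a locally hosted, OpenAI-compatible server (e.g. LM Studio).
# Assumptions: the server listens on localhost:1234 and exposes the /v1 routes,
# and whatever model is currently loaded answers chat-completion requests.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="local-model",  # placeholder name; the locally loaded model is used
    messages=[
        {"role": "system", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "Write a one-line PowerShell command to list files larger than 100 MB."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)

The same snippet can be pointed at a hosted service instead by dropping base_url and supplying a real API key, which makes it easy to compare local and hosted answers to the same prompt.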

Right now I'm most impressed with the model 'bartowski\StableLM Instruct 3B'. It doesn't take up that much RAM and responds surprisingly well in CPU-only mode, even on an i3 10100F CPU. You can also set it to use the available GPU (NVidia/AMD) if that GPU has enough memory to offload one or more models into. And it lets you play with quite a few model-specific settings for the LLMs you load into memory. LM Studio is freeware.

Sometimes I verify the results by feeding the exact same prompt into ChatGPT (v3.5, I think that is the free one) and the locally running StableLM model. ChatGPT answers show up faster and usually use a lot more words to convey the same message.

Basic script generation works quite well in both, but ChatGPT can deal with a bit more complexity. Still, for my purposes, the StableLM model hasn't been too far off ChatGPT or too slow in comparison.

The thing I am looking for is a relatively easy way to train the StableLM model I have with company-specific documentation, our script language and documentation portals. For that purpose, the open-source tool 'LlamaIndex' appears to be very interesting.

Once I can train the LLM I have, turning my local AI instance into a proper personal AI Assistant shouldn't be too much of a problem.

Ath

Re: AI Coding Assistants (Who uses them and which)
« Reply #6 on: August 03, 2024, 03:36 AM »
Which assistants have you tried
-KynloStephen66515 (August 02, 2024, 04:50 PM)
None, opinion based on the crappy 'examples' / 'answers' that ppl posted in (another) forum.

Basically, I'm sitting out the storm, to see what eventually rolls out without the world crumbling to pieces. (Must be related to my age/experience and somewhat conservative approach to 'the latest hype') :)

KynloStephen66515

Re: AI Coding Assistants (Who uses them and which)
« Reply #7 on: August 03, 2024, 06:07 AM »
Not assistants per se, but I have been using a tool, 'LM Studio', to run 8 LLMs locally. [...]

Once I can train the LLM I have, turning my local AI instance into a proper personal AI Assistant shouldn't be too much of a problem.
-Shades (August 03, 2024, 03:25 AM)

You might be better served by RAG than a fine-tuned model (much quicker to set up, and vastly easier to keep up to date with ever-changing information).
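As a rough illustration of that retrieval-augmented (RAG) approach with the LlamaIndex tool Shades mentioned, a minimal sketch follows. It assumes the company documentation sits in a local ./docs folder and that an LLM/embedding backend is already configured for LlamaIndex (by default it expects OpenAI credentials, but it can be pointed at a local model); the folder name and query are made up for illustration, and recent LlamaIndex versions import from llama_index.core rather than llama_index.

Code: Python
# Minimal RAG sketch with LlamaIndex: index local documents, then answer
# questions against them instead of fine-tuning the model itself.
# Hypothetical ./docs folder and query; an LLM/embedding backend must be configured.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./docs").load_data()   # company-specific docs
index = VectorStoreIndex.from_documents(documents)        # build a vector index

query_engine = index.as_query_engine()
answer = query_engine.query("How do I declare a loop in our scripting language?")
print(answer)

Because the index is built from the documents rather than baked into the model weights, keeping it current is just a matter of re-indexing when the documentation changes.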

KynloStephen66515

Re: AI Coding Assistants (Who uses them and which)
« Reply #8 on: August 03, 2024, 06:09 AM »
Which assistants have you tried
-KynloStephen66515 (August 02, 2024, 04:50 PM)
None, opinion based on the crappy 'examples' / 'answers' that ppl posted in (another) forum.

Basically, I'm sitting out the storm, to see what eventually rolls out without the world crumbling to pieces. (Must be related to my age/experience and somewhat conservative approach to 'the latest hype') :)

Well, if you want any suggestions for good tools, let me know as I have a list! (Unbiased and based off of personal experience with the tools in question) XD

anandcoral

Re: AI Coding Assistants (Who uses them and which)
« Reply #9 on: August 03, 2024, 06:56 AM »
I use ChatGPT directly to get C# code snippets for my application features.
I do not use AI in the IDE.

Since I am not well versed in C#, I can vouch that it gives me working code, as required, most of the time.
I try to explain my request more elaborately if it gives a wrong answer. Sometimes I have had to abandon a query when I could not get what I wanted.

All the code I get I use in my application, and I can say it works as I need it to.

Deozaan

Re: AI Coding Assistants (Who uses them and which)
« Reply #10 on: August 03, 2024, 06:16 PM »
Well, if you want any suggestions for good tools, let me know as I have a list! (Unbiased and based off of personal experience with the tools in question) XD
-KynloStephen66515 (August 03, 2024, 06:09 AM)

I'm interested in seeing a list, because my experience has left me with the impression that AI models aren't yet fully baked, and rarely give me truly useful results.

KynloStephen66515

Re: AI Coding Assistants (Who uses them and which)
« Reply #11 on: August 03, 2024, 08:44 PM »
Well, if you want any suggestions for good tools, let me know as I have a list! (Unbiased and based off of personal experience with the tools in question) XD
-KynloStephen66515 (August 03, 2024, 06:09 AM)

I'm interested in seeing a list, because my experience has left me with the impression that AI models aren't yet fully baked, and rarely give me truly useful results.

Are you interested in IDE-based plugins (think GitHub Copilot), or standalone/web versions (ChatGPT, etc.)?

Tuxman

Re: AI Coding Assistants (Who uses them and which)
« Reply #12 on: August 06, 2024, 05:24 AM »
Two of the main problems with "AI coding assistants" - whether they claim to be able to write a complete application or simply complete variable names based on what they boozed up on the internet last night - are these:

  • They violate licences. There are many examples (and probably a large number of unreported cases) of parts of code copied verbatim that were under a clear licence, but that clear licence is not part of what was copied.
  • They write rubbish. A work colleague likes to have his work supposedly done by ChatGPT. In the time he needs to iron out the worst mistakes in the result, I could simply write what he wanted to achieve myself. Incidentally, this does not seem to be a ChatGPT-specific problem.

And I'm not even taking into account the mental embarrassment I would personally experience were I to be degraded from a developer to a supplicant to the computer...

anandcoral

Re: AI Coding Assistants (Who uses them and which)
« Reply #13 on: August 06, 2024, 07:06 AM »
Two of the main problems with "AI coding assistants" -
-Tuxman (August 06, 2024, 05:24 AM)

Never take what the seller insists at face value.

All the big IT vendors are throwing our own code back at us, labeled "AI". Yes, this machine-instructed program is reading our own open and closed sources and giving them back to us as "created by AI".

I do not think of or use ChatGPT as something "intelligent". I take it as a helper that assists me by "searching and sieving" for my query.

kunkel321

Re: AI Coding Assistants (Who uses them and which)
« Reply #14 on: August 06, 2024, 09:26 AM »
I'm not a professional coder. I just do AHK v2 for fun. Claude.ai has gotten pretty good, and I use it frequently. I mostly used ChatGPT before that. GPT was often pretty good at laying out the logic for how the code should work (assuming I articulated my needs well), but it would produce code with so many syntax errors that it was easier to just write the code myself. Claude, though, is pretty good with syntax too.

Tuxman

Re: AI Coding Assistants (Who uses them and which)
« Reply #15 on: August 06, 2024, 10:37 AM »
If you need to explain in detail how the algorithm is supposed to work, wouldn't it be easier to just write the algorithm yourself anyway?

Deozaan

Re: AI Coding Assistants (Who uses them and which)
« Reply #16 on: August 06, 2024, 12:17 PM »
Are you interested in IDE-based plugins (think GitHub Copilot), or standalone/web versions (ChatGPT, etc.)?
-KynloStephen66515 (August 03, 2024, 08:44 PM)

I haven't tried any IDE-based plugins. I've just used web versions.

Renegade

Re: AI Coding Assistants (Who uses them and which)
« Reply #17 on: September 09, 2024, 07:17 PM »
I use ChatGPT and CoPilot directly in VS.

The code quality varies. Sometimes good and sometimes way off. But no matter what, you have to check it and verify it. Sometimes I can use it exactly as is, and other times I only need minor modifications. And of course, sometimes I throw it out and start over.

Shades

Re: AI Coding Assistants (Who uses them and which)
« Reply #18 on: September 10, 2024, 10:14 PM »
Today I found another AI "toy" to play with.

It is called Tabby and you'll find it on GitHub.

Tabby is indeed an assistant, and one that you can self-host. It works no matter where your Linux, macOS or Windows computer is running (on-prem, hybrid or cloud). The instructions to download and run the software are very easy to follow. It will download two (smaller) LLMs, a Qwen model and a StarCoder model, if it doesn't find them on your system.

Currently I'm testing it on a computer based on a pre-Ryzen AMD APU that AMD adapted to fit motherboards supporting Ryzen 1st gen through 3rd gen. That computer also has an old NVidia GeForce 1650 card with (only) 4 GByte of VRAM. And yet both LLMs fit in there. The website lists which LLMs are supported and their requirements, including the required NVidia development software. It might all sound complicated, but it really isn't.

Once you have it running, Tabby becomes a server on your computer. Access it by entering http://localhost:8080 in a browser on the computer that hosts Tabby, or use any other computer with a browser on your network to visit: http://<ip address tabby>:8080
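If you would rather check from a script than from a browser that the server is up, something like the sketch below works; it assumes only the address mentioned above (change host and port if Tabby runs elsewhere on your network).

Code: Python
# Quick reachability check for a self-hosted Tabby server.
# Assumes the default address mentioned above (localhost:8080).
import requests

try:
    r = requests.get("http://localhost:8080", timeout=5)
    print("Tabby answered with HTTP status", r.status_code)
except requests.ConnectionError:
    print("No Tabby server reachable on localhost:8080")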

You will be asked to create an admin account the first time you access it. Tabby comes with a free community license (for up to 5 users). They also have subscription plans if that is more your thing.

My test machine could be considered crippled, yet Tabby performs really well: even on my testing clunker it is almost as fast as ChatGPT, which amazed me to no end. Sure, the models are small, and I have hardly had any time to really check how useful the answers it provides truly are.

But the responsiveness and ease of working with the community version were a (very) pleasant surprise, so I thought I'd mention it here at DC.

Oh, and while it comes with its own web interface, that interface contains links explaining how to incorporate Tabby into editors like VSCode, if that is more to your liking.
« Last Edit: September 11, 2024, 07:54 AM by Shades, Reason: Grammar »

wraith808

Re: AI Coding Assistants (Who uses them and which)
« Reply #19 on: September 17, 2024, 11:25 PM »
Do you mean the quality of the code? The LLMs themselves? or am I missing something
-KynloStephen66515 (August 02, 2024, 02:45 PM)

Any example I've seen so far seems to be a garbage answer, unusable or totally unfit to what it's supposed to be tailored to. I'll wait and see what's left after the dust of this hype settles :huh:
-Ath (August 02, 2024, 04:23 PM)

Not sure what you've used. We use Copilot at work and it's been a wonder. It saves me from having to do scut work, and I also use it in place of going to Stack Overflow.

What I use it for:

  • Algorithms that I know but would have to look up the implementation
  • Spotting inefficiencies in code
  • Unit tests
  • Documentation
  • Explaining new or under-documented code
  • Scaffolding new classes and POCOs/entities from examples
  • Generating test data

It really takes a lot of scut work out, and lets me concentrate on other work.

I also use one that's linked to our documentation to help take the load off of us having to do support for the API. We have the documentation, but putting it into a format where everyone can find what they're looking for (or even wants to find what they're looking for) is a challenge. Having users ask their question and being able to link them to documentation/logs has really cut down on our support queues.

Two of the main problems with "AI coding assistants" - whether they claim to be able to write a complete application or simply complete variable names based on what they boozed up on the internet last night - are these:

  • They violate licences. There are many examples (and probably a large number of unreported cases) of parts of code copied verbatim that were under a clear licence, but that clear licence is not part of what was copied.
  • They write rubbish. A work colleague likes to have his work supposedly done by ChatGPT. In the time he needs to iron out the worst mistakes in the result, I could simply write what he wanted to achieve myself. Incidentally, this does not seem to be a ChatGPT-specific problem.

And I'm not even taking into account the mental embarrassment I would personally experience were I to be degraded from a developer to a supplicant to the computer...
-Tuxman (August 06, 2024, 05:24 AM)

Incorrect. GitHub Copilot is set to filter out code that matches public sources; it bases its suggestions on context within the app. I asked it for a particular algorithm and had to massage the prompt because the code was getting filtered out.

If you need to explain in detail how the algorithm is supposed to work, wouldn't it be easier to just write the algorithm yourself anyway?
-Tuxman (August 06, 2024, 10:37 AM)

You don't really have to explain it in detail, depending on the context and what you're trying to get. In one case, I just couldn't remember the kind of algorithm I needed, gave it a general idea of what I wanted to do (a b-tree sort without modification or recursion), and it told me that I wanted the Morris Traversal, and gave me an example.
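For reference, a minimal sketch of Morris in-order traversal in Python (illustrative, not the example Copilot produced): it visits a binary tree in order without recursion or an explicit stack by temporarily threading each in-order predecessor's right pointer and restoring it afterwards.

Code: Python
# Morris in-order traversal: no recursion, no stack, O(1) extra space.
# The tree is temporarily modified (threaded) and fully restored afterwards.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def morris_inorder(root):
    current, result = root, []
    while current:
        if current.left is None:
            result.append(current.value)
            current = current.right
        else:
            # Find the in-order predecessor of current.
            pred = current.left
            while pred.right and pred.right is not current:
                pred = pred.right
            if pred.right is None:
                pred.right = current          # create a temporary thread
                current = current.left
            else:
                pred.right = None             # remove the thread, restore the tree
                result.append(current.value)
                current = current.right
    return result

# Example: a small binary search tree; prints [1, 2, 3, 4, 5]
tree = Node(3, Node(2, Node(1)), Node(5, Node(4)))
print(morris_inorder(tree))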
« Last Edit: September 18, 2024, 08:02 AM by wraith808 »

Shades

Re: AI Coding Assistants (Who uses them and which)
« Reply #20 on: October 14, 2024, 08:35 AM »
An addition to my earlier 'tabby' post:
You can link it to any Git repo, whether it is hosted locally or in the cloud, and it will use that repo to produce code in a similar style. It can also do this with a (large) document.

There is a Tabby extension for Visual Studio Code, so you can use Tabby directly in VSCode. And when it is hooked into your repository, VSCode can automagically autocomplete large(r) sections of code in the style you want.

Tabby works OK with an NVidia card that has only 4 GByte of VRAM, but it will load only the smallest chat model and the smallest code model, which gives reasonable Python and Bash support but not much else.

If you have an NVidia card with 8 GByte of VRAM or more, you can start playing with the 13B models in Tabby's repertoire, which support more coding languages and/or a better chat model.

Just add # in front of your request and it will look in the repository; add @ in front of your request and it will consult the document you have coupled with Tabby.

Deozaan

Re: AI Coding Assistants (Who uses them and which)
« Reply #21 on: October 17, 2024, 05:06 AM »
I tried getting Tabby to work and it wouldn't start up. It said "Starting..." and then spit out warnings every 5 seconds or so, forever.

How are you running it?

Shades

Re: AI Coding Assistants (Who uses them and which)
« Reply #22 on: October 17, 2024, 09:49 AM »
I tried getting Tabby to work and it wouldn't start up. It said "Starting..." and then spit out warnings every 5 seconds or so, forever.

How are you running it?

In my environment, Tabby runs on a standard Windows 10 computer with a GeForce 1650 card.

I downloaded 'tabby_x86_64-windows-msvc-cuda117.zip' from GitHub, extracted it, and created a Windows batch file to start it up with the models that fit into the VRAM of my GPU (which is only 4 GByte). The documentation provides examples of how to start Tabby. I use the 'cuda117' archive because the extra NVidia CUDA development software I needed to install only supports up to that version of CUDA.

The format I use to start Tabby:

Code: Text
.\tabby.exe serve --model StarCoder-3B --chat-model Qwen2-1.5B-Instruct --device cuda

This is the smallest combination of models you can use with Tabby. It supports more models for Chat and Code, but not as many as other local AI tools do. So, if you have a 30xx card or better, preferably with more VRAM, use better models.

The first time you start Tabby, it will need to download the chat model and the code model, so the first start will take quite some time if your connection isn't all that great. You'll find these downloaded models in C:\Users\<your.user.account>\.tabby.

When the start procedure finishes, it shows a Tabby logo in ASCII graphics and reports that it is accessible at the address http://0.0.0.0:8080
Once that text shows up, you can use any computer in your LAN to browse to that address and start configuring it; by that I mean create the first admin account. The free/community version can be used by a maximum of 5 users at the same time.

You can either continue configuring other accounts, a mail server (for invites), Git repos, etc., or go back to the web interface and see for yourself how responsive it is. There are more instructions in the web interface, in case you want to use Tabby directly in VSCode, JetBrains or Vim.

I followed those for VSCode and its Tabby extension. Works like a charm.

** edit:
If you need more help, I have just finished a more comprehensive manual for work.
« Last Edit: October 17, 2024, 02:38 PM by Shades »

Deozaan

Re: AI Coding Assistants (Who uses them and which)
« Reply #23 on: October 18, 2024, 03:40 AM »
Sorry, I thought I posted more relevant details. :-[

My machine is ancient. Core i7 2700K. GTX 670. 16 GB RAM. 👴

I installed the NVidia tools and they let me install CUDA 12.x, but I don't know if my GPU supports that.

The command I'm trying to run is very slightly different from yours (StarCoder-1B instead of 3B), taken from the Windows Installation documentation:

Code: Text
.\tabby.exe serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct --device cuda

But after counting up for about 10 seconds it starts spitting out messages like the following every few seconds:

Code: Text
10.104 s   Starting... 2024-10-18T08:32:25.331474Z  WARN llama_cpp_server::supervisor: crates\llama-cpp-server\src\supervisor.rs:98: llama-server <embedding> exited with status code -1073741795, args: `Command { std: "D:\\Apps\\tabby_x86_64-windows-msvc-cuda122\\llama-server.exe" "-m" "C:\\Users\\Deozaan\\.tabby\\models\\TabbyML\\Nomic-Embed-Text\\ggml\\model.gguf" "--cont-batching" "--port" "30888" "-np" "1" "--log-disable" "--ctx-size" "4096" "-ngl" "9999" "--embedding" "--ubatch-size" "4096", kill_on_drop: true }

I left it running for about 2 hours and it just kept doing that.

More help would be appreciated.
« Last Edit: October 18, 2024, 11:25 PM by Deozaan »

Shades

Re: AI Coding Assistants (Who uses them and which)
« Reply #24 on: October 18, 2024, 10:32 AM »
Sorry, I thought I posted more relevant details. :-[

My machine is ancient. Core i7 2700K. GTX 670 (6GB). 👴
[...]
I left it running for about 2 hours and it just kept doing that.

More help would be appreciated.
-Deozaan (October 18, 2024, 03:40 AM)

First, it may be handy to list the specifications of the computer I use with Tabby (for reference):
CPU: Intel i3 10100F (bog standard cooling, no tweaks of any kind)
GPU: MSI GeForce 1650 (4 GB of VRAM, 128-bit bus, 75 Watt version (without extra power connectors on the card))
RAM: Kingston 32 GByte (DDR4, 3200 MHz). As of this week I added another 16 GB RAM stick, just to see whether dual channel was an improvement or not; so far I haven't noticed one.
SSD: SSD via SATA interface (Crucial 500 GByte)
All inside a cheap, no-name "gamer" case

There are 3 things to try:
  • Tabby without GPU support
  • Tabby with GPU support
  • No Tabby

Tabby without GPU support:
You will need to download a (much smaller) Tabby build, extract it, and make sure your computer has as much of the fastest RAM the motherboard supports, to make this the best experience possible.
Start it with:

Code: Text
.\tabby.exe serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct

Less than ideal, but you would still be able to (patiently) see how it works.

Tabby with GPU support:
This overview from NVidia shows which CUDA compute capability their GPUs support.
My GPU is a 1650, which has compute capability 7.5.
Your GPU is a 670, which has compute capability 3.0.

This overview shows that you need NVidia driver 450.xxx for your card if you use the CUDA 11.x development software.
You can get v11.7 of the CUDA development tools here. As far as I know, you can go to the NVidia website and download a tool that identifies your card and the maximum driver version it supports. If that number isn't 450 or higher, then I'm pretty sure the Tabby version for CUDA devices won't work.

If you can install a sufficient driver for your GPU, download the 'tabby_x86_64-windows-msvc-cuda117.zip' archive, extract it and start it with:

Code: Text
.\tabby.exe serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct --device cuda

If you cannot install a sufficient driver for your GPU, you could still try a Tabby version that supports Vulkan devices. Vulkan is supported by NVidia/AMD/Intel GPUs and is often used to make Windows games work on Linux. As your GPU is from around 2012 and Vulkan is relatively recent, I don't know how far back Vulkan support goes; you might be lucky, though. Anyway, download the 'tabby_x86_64-windows-msvc-vulkan.zip' archive, extract it and start it with:

Code: Text
.\tabby.exe serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct --device vulkan

No Tabby:
If this also doesn't work, then you have to conclude that your GPU is simply too old for use with LLMs/AI and that you are relegated to software that provides CPU-only access to them. In that case, I recommend another free tool: LM Studio, in combination with the LLM 'bartowski/StableLM Instruct 3B'. This software is very versatile and takes about 1 GB of RAM, and the LLM takes about 4 GB of RAM, so you'll need 8 GByte or more in your computer for LM Studio to work halfway decently.