💬 Using GPT4All… on a Mac Pro 2013

I fully admit this might have been a bad idea. I found that you could load GPT4All onto your local machine and that it was available for Mac. Since the 2013 Mac Pro supports Monterey, I decided to give it a shot.

The install was a breeze, and in a few minutes I was downloading the recommended model – Hermes, a 13GB download.

With that done, it was time to give local AI a try. And that's when the joy ended.

I'm not going to say a 2013 Mac Pro was the right machine for the job, but processing my request took minutes to complete, and the result was typed out like it was 1980 and I was on a 300 baud dial-up connection.

Response speed is measured in tokens per second, and mine never got into the double digits. It crawled along at 6-8 at best, with lots of pauses.
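To put those numbers in perspective, here's a quick back-of-the-envelope calculation. The 6-8 tokens/second range is what I actually saw; the 500-token response length is just an illustrative assumption for a few paragraphs of text:

```python
# Rough estimate: how long does a reply take to stream at a given
# generation speed? The ~7 tok/s rate matches what I saw on the Mac
# Pro; 500 tokens is an assumed length for a multi-paragraph answer.
def generation_time_seconds(num_tokens: int, tokens_per_second: float) -> float:
    """Seconds to stream a response of num_tokens at the given rate."""
    return num_tokens / tokens_per_second

response_tokens = 500  # assumed: a few paragraphs

for rate in (7, 30):  # ~7 tok/s on the 2013 Mac Pro vs. a hypothetical faster machine
    secs = generation_time_seconds(response_tokens, rate)
    print(f"{rate} tok/s -> about {secs:.0f} seconds")
# At ~7 tok/s, a full answer takes over a minute just to type itself out.
```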

Maybe the AI engine was as annoyed as I was, because the answers were pretty sparse. They weren't as detailed as answers to the same question from an online AI. I usually got a single summary paragraph and that's it, hardly the multi-paragraph answer I'm used to.

The experience was both awesome and terrible. It’s awesome to think GPT can be loaded on a machine at home, all that knowledge at your fingertips without having to wait in queues, buy credits, or get filtered responses.

It was pretty terrible in the terse, incomplete answers. Again, I'm not going to pretend the machine isn't part of the problem; it is a decade old.

It makes me wonder, how would all this perform on a Mac Studio Ultra?

I like the direction GPT4All is heading and I hope it keeps advancing and improving. I love the idea of having that amazing amount of knowledge at the ready. And when I get a truly decent machine, I will try the process again. It was fun to try and I’m glad the project exists.

Will the Studio and Pro be GPT rated machines in the future? Once you have GPT installed, will it update itself in the background like the old Encarta? Will running it locally be the way to get the uncensored/unfiltered answers? Instead of “jailbreaking” AI, will you just run it locally and get all the goodies?

I look forward to GPT on my own machine, when that machine is a little more recent. As with all things, GPT will get faster, and computers will adapt to processing that information. That's going to be a fun time. GPT on an iPad will be a thing.

I still have the online services, and so far, those are pretty good too.

Maybe I should've written that in a different font.