using UnityMCP with ollama or local llm #107
Replies: 1 comment 1 reply
Update: I got it to somewhat work with the Continue VS Code extension. It was a bit tedious to set up, but I was able to create a sphere and a cube after a few very precise prompts. You really have to baby your prompts if you plan on using a local AI, or have a really good config file, or else it won't be able to successfully run the tools. I am still testing; I think it would be cool to have the AI build player objects like a spaceship or a car out of cubes and spheres.

I had to use a pretty specific prompt: "using the tools provided, create a cube in unity and name it playercube". If I didn't give it the name, it would fail at running the `manage_gameobject` tool. Here is my config.yaml for the Continue local assistant using an Ollama LLM. Feel free to copy it or give me tips on how to improve it.
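For reference, a minimal Continue `config.yaml` along these lines might look like the sketch below (the model name, roles, and the `uv run` launch command for the Unity MCP server are illustrative assumptions based on Continue's YAML config schema, not necessarily the poster's exact setup):

```yaml
# Continue local assistant config (sketch, paths are placeholders)
name: Local Assistant
version: 0.0.1
schema: v1

models:
  # Ollama must be running locally; the model tag is an example,
  # swap in whatever you have pulled (e.g. gemma3:12b).
  - name: Qwen2.5 7B
    provider: ollama
    model: qwen2.5:7b
    roles:
      - chat
      - edit

mcpServers:
  # Launch command for the Unity MCP Python server is an assumption;
  # point --directory at wherever your server source actually lives.
  - name: unity-mcp
    command: uv
    args:
      - run
      - --directory
      - /path/to/unity-mcp/UnityMcpServer/src
      - server.py
```

With something like this in place, the model's tool calls are routed through the `unity-mcp` server entry, which is why a weak or missing `mcpServers` section tends to make local models fail at tool use.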
Hello, fantastic project! I was able to get it working with Cursor in just a few minutes.
I was wondering if anyone has had success getting this to work with a local LLM like Gemma 3 12B or Qwen2.5 7B?
I have searched up and down the internet and installed dolphinMCP to try it out, but with no luck. I was hoping someone could point me in the right direction. I found
ZundamonnoVRChatkaisetu/unity-mcp-ollama,
but I can't seem to get Ollama to connect to the server.
Any help would be greatly appreciated.