* Updated code to use a locally hosted Ollama LLM and the nomic-embed-text embedding model
* Updating documentation and creating issue templates.
* Small updates to the issue templates
* Set the embedding model via an environment variable
* Updated V3-MCP-SUPPORT code to use local LLMs.
* Updated local embedding models and fixed a couple of small issues
* Updating root of repo for v3 release branch
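As a minimal sketch of reading the embedding model from an environment variable with a fallback (the variable name `EMBEDDING_MODEL` is an assumption; the project may use a different one):

```python
import os

# Hypothetical variable name; falls back to the locally hosted
# nomic-embed-text model when the variable is not set.
embedding_model = os.environ.get("EMBEDDING_MODEL", "nomic-embed-text")
```

This keeps the default usable out of the box while letting deployments override the model without code changes.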
---------
Co-authored-by: Aniket1995 <abhanavase@gmail.com>
* Added `import platform` to detect the operating system
* Used `platform.system()` to check whether the host is Windows
* Adjusted the pip and python paths:
  * Windows: uses the 'Scripts' folder and the '.exe' extension
  * Mac/Linux: uses the 'bin' folder without an extension

The script now automatically detects the operating system and uses the appropriate paths, making it compatible with Windows, macOS, and Linux.
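The path adjustment described above can be sketched as follows (the function name `venv_executables` and the default `venv` directory are illustrative assumptions, not the project's actual names):

```python
import platform
from pathlib import Path

def venv_executables(venv_dir: str = "venv") -> tuple[Path, Path]:
    """Return (python, pip) paths inside a virtual environment,
    adjusted for the host operating system."""
    venv = Path(venv_dir)
    if platform.system() == "Windows":
        # Windows venvs place executables under Scripts\ with a .exe extension
        return venv / "Scripts" / "python.exe", venv / "Scripts" / "pip.exe"
    # macOS/Linux venvs use bin/ with no extension
    return venv / "bin" / "python", venv / "bin" / "pip"

python_path, pip_path = venv_executables()
```

Centralizing the platform check in one helper avoids scattering `Scripts` vs `bin` logic throughout the script.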