
Setup and installation of 'DeepSeek & Llama powered All-in-One LLM Suite' on GCP


Note: We provide free demo access for the “DeepSeek & Llama-powered All-in-One LLM Suite.” To request a free demo, please reach out to us at marketing@techlatest.net with the subject “Free Demo Access Request - [Your Company Name]”



This section describes how to provision and connect to the ‘DeepSeek & Llama powered All-in-One LLM Suite’ VM solution on GCP.

  1. Open DeepSeek & Llama powered All-in-One LLM Suite listing on GCP Marketplace.

  2. Click Get Started.

/img/gcp/multi-llm-vm/marketplace.png

It will ask you to enable the APIs if they are not already enabled for your account. Please click Enable as shown in the screenshot.

/img/gcp/nvidia-ubuntu/enable-api.png

  • It will take you to the agreement page. On this page, you can change the project from the project selector in the top navigation bar, as shown in the screenshot below.

  • Accept the terms and agreements by ticking the checkbox and clicking the AGREE button. /img/common/gcp_agreement_page.png

  • A popup confirms that you have agreed successfully. Click Deploy. /img/common/gcp_agreement_accept_page.png

  • On the deployment page, give a name to your deployment.

  • In the Deployment Service Account section, click the Existing radio button and choose a service account from the Select a Service Account dropdown.
  • If you don't see any service account in the dropdown, switch the radio button to New Account and create a new service account here.
  • If, after selecting the New Account option, you get the permission error below, please reach out to your GCP admin to create a service account by following the Step by step guide to create GCP Service Account. Refresh this deployment page once the service account is created; it should then be available in the dropdown.

  • You are missing resourcemanager.projects.setIamPolicy permission, which is needed to set the required roles on the created Service Account
  • Select a zone where you want to launch the VM (such as us-east1-a).

  • Optionally change the number of cores and amount of memory. (This defaults to 4 vCPUs and 15 GB RAM.) Minimum VM specs: 2 vCPUs / 8 GB RAM; however, for swift performance go with 4 vCPUs / 16 GB RAM or a higher configuration.

  • Optionally change the boot disk type and size. (This defaults to ‘Standard Persistent Disk’ and 250 GB, respectively.)

  • Optionally change the network name and subnetwork names. Be sure that whichever network you specify has ports 22 (for SSH), 3389 (for RDP), 80 (for HTTP), and 443 (for HTTPS) exposed.

  • Click Deploy when you are done.

  • DeepSeek & Llama powered All-in-One LLM Suite will begin deploying.

/img/gcp/multi-llm-vm/deployed-01.png

/img/gcp/multi-llm-vm/deployed-02.png

/img/gcp/multi-llm-vm/deployed-03.png
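Once the deployment finishes, you can optionally verify from your local machine that the required ports are reachable. A minimal sketch, assuming a bash shell with the `/dev/tcp` feature and the `timeout` utility; `EXTERNAL_IP` below is a placeholder for your VM's external IP:

```shell
#!/usr/bin/env bash
# Check whether a TCP port on a host accepts connections, using bash's
# built-in /dev/tcp pseudo-device with a 2-second timeout.
check_port() {
  local host="$1" port="$2"
  timeout 2 bash -c "echo > /dev/tcp/${host}/${port}" 2>/dev/null
}

EXTERNAL_IP="203.0.113.10"   # placeholder: replace with your VM's external IP
for PORT in 22 80 443 3389; do
  if check_port "$EXTERNAL_IP" "$PORT"; then
    echo "port ${PORT}: open"
  else
    echo "port ${PORT}: closed or filtered"
  fi
done
```

If any port reports closed, review the firewall rules attached to the network you selected during deployment.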

  1. A summary page displays when the Compute Engine instance is successfully deployed. Click the Instance link to go to the instance page.

  2. On the instance page, click the “SSH” button and select “Open in browser window”.

/img/gcp/puppet-support/ssh-option.png

  1. This will open an SSH window in a browser. Switch to the ubuntu user and navigate to the ubuntu home directory.
sudo su ubuntu
cd /home/ubuntu/

/img/gcp/multi-llm-vm/switch-user.png

  1. Run the below command to set the password for the “ubuntu” user:
sudo passwd ubuntu

/img/gcp/multi-llm-vm/ssh-passwd.png

  1. Now that the password for the ubuntu user is set, you can connect to the VM’s desktop environment from any local Windows machine using RDP, or from a Linux machine using Remmina.

  2. To connect using RDP from a Windows machine, first note the external IP of the VM from the VM details page, as highlighted below.

/img/gcp/saltstack-semaphore/external-ip.png

  1. Then, from your local Windows machine, go to the “Start” menu, and in the search box type and select “Remote Desktop Connection”.

  2. In the “Remote Desktop Connection” wizard, paste the external IP and click Connect.

/img/gcp/jupyter-python-notebook/rdp.png

  1. This will connect you to the VM’s desktop environment. Provide “ubuntu” as the user ID and the password you set earlier to authenticate. Click OK.

/img/gcp/jupyter-python-notebook/rdp-login.png

  1. Now you are connected to the out-of-the-box DeepSeek & Llama powered All-in-One LLM Suite VM’s desktop environment via a Windows machine.

/img/azure/minikube/rdp-desktop.png

  1. To connect using RDP from a Linux machine, first note the external IP of the VM from the VM details page. Then, from your local Linux machine, go to the menu, and in the search box type and select “Remmina”.

    Note: If you don’t have Remmina installed on your Linux machine, first install Remmina as per your Linux distribution.

/img/gcp/common/remmina-search.png

  1. In the “Remmina Remote Desktop Client” wizard, select the RDP option from the dropdown, paste the external IP, and press Enter.

/img/gcp/common/remmina-external-ip.png

  1. This will connect you to the VM’s desktop environment. Provide “ubuntu” as the user ID and the password you set earlier to authenticate. Click OK.

/img/gcp/common/remmina-rdp-login.png

  1. Now you are connected to the out-of-the-box DeepSeek & Llama powered All-in-One LLM Suite VM’s desktop environment via a Linux machine.

/img/azure/minikube/rdp-desktop.png

  1. To access the Open WebUI interface, copy the public IP address of the VM and paste it into the browser:

The browser will display an SSL certificate warning. Accept the certificate warning and continue.

/img/aws/multi-llm-vm-vm/browser-warning.png
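The same endpoint can also be checked from a terminal. A hedged sketch using curl’s `-k` flag, which skips TLS certificate verification and is only appropriate here because the VM serves a known self-signed certificate; the IP in the comment is a placeholder:

```shell
# Fetch the Open WebUI page, skipping TLS certificate verification (-k).
# Only do this for the known self-signed certificate on your own VM.
fetch_insecure() {
  curl -ksS --max-time 5 "https://$1/"
}

# fetch_insecure "203.0.113.10"   # placeholder: your VM's external IP
```

A successful call returns the Open WebUI HTML page; once you set up a trusted certificate (next step), drop the `-k` flag.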

  1. The VM also comes with the Certbot Nginx plugin preinstalled. So, if you have a valid domain name (DNS) and your instance IP is configured for that DNS, you can generate free Let’s Encrypt SSL certificates to access the Open WebUI securely over HTTPS using that DNS and avoid browser warnings. To do so, connect via the terminal and run
sudo certbot --nginx

This command will prompt you for a valid DNS name; provide the DNS name associated with this instance.

/img/aws/multi-llm-vm/certbot-configure.png
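For scripted setups, Certbot also supports a non-interactive mode. A sketch of the invocation, where both the domain and the email are placeholders you must replace with your own values:

```shell
# Non-interactive issuance: --non-interactive suppresses prompts,
# --agree-tos accepts the Let's Encrypt terms, -m sets the contact
# email, and -d names the domain that points at this VM.
DOMAIN="llm.example.com"     # placeholder: your DNS name
EMAIL="admin@example.com"    # placeholder: your contact email
CMD="sudo certbot --nginx --non-interactive --agree-tos -m ${EMAIL} -d ${DOMAIN}"
# eval "$CMD"   # run on the VM once the DNS record resolves to the instance
```

Certbot will fail if the domain does not yet resolve to the VM’s external IP, so confirm the DNS record first.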

  1. Once your Certbot certificates are ready, you can navigate to any browser and access the Open WebUI securely using the DNS name.

/img/aws/multi-llm-vm/access-openwebui-using-dns.png

  1. Click Get Started on the very first page. This will take you to the registration page. Provide the details here and create your first admin account.

/img/aws/multi-llm-vm/open-webui-get-started.png

/img/aws/multi-llm-vm/open-webui-registration-page.png

  1. Now you are logged in to the Open WebUI interface. You can choose different preinstalled models from the dropdown and ask your queries.

/img/aws/multi-llm-vm/open-webui-home-page.png

/img/aws/multi-llm-vm/preinstalled-models.png

/img/aws/multi-llm-vm/query-in-open-webui.png

  1. You can also run various Ollama models from the VM’s terminal. To list the installed models, run -
ollama list

/img/aws/multi-llm-vm/ollama-list.png

  1. To run a specific model, use the below command. Replace modelname with the actual name of the model, e.g. qwen2.5:7b.

ollama run modelname

/img/aws/multi-llm-vm/run-model.png
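Besides the CLI, the models are exposed through Ollama’s local REST API (on its default port 11434), which is what Open WebUI itself talks to. A minimal sketch that builds a request body for the `/api/generate` endpoint; the model name and prompt are examples:

```shell
# Build a JSON body for Ollama's /api/generate endpoint.
# "stream":false asks for one complete JSON response instead of a token stream.
ollama_payload() {
  printf '{"model":"%s","prompt":"%s","stream":false}' "$1" "$2"
}

# On the VM, send it to the local Ollama server (default port 11434):
# ollama_payload "qwen2.5:7b" "Why is the sky blue?" | \
#   curl -s -X POST -d @- http://localhost:11434/api/generate
```

The installed models can likewise be listed over the API with `curl -s http://localhost:11434/api/tags`.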

  1. To pull any new Ollama model, run -
ollama pull modelname

/img/aws/multi-llm-vm/pull-model.png

Once your model is pulled successfully, you can start using it.

For more details, please visit the Official Documentation page.

For video tutorials on this solution, please visit Free course on ‘DeepSeek & Llama powered All-in-One LLM Suite’
