HuaweiCloudDeveloper/ollama-image


Ollama Inference Framework

English | 简体中文

Table of Contents

  • Repository Introduction
  • Prerequisites
  • Image Specifications
  • Getting Help
  • How to Contribute

Repository Introduction

Ollama is a cross-platform (macOS, Windows, Linux) inference framework client designed for seamless deployment of large language models (LLMs) such as Llama 2, Mistral, and LLaVA. With one-click setup, Ollama runs LLMs locally, keeping all interaction data on your own machine for stronger data privacy and security.
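As a sketch of what "local execution" looks like in practice, the commands below show everyday Ollama CLI usage. The model name `llama2` is only an example, and the snippet assumes the `ollama` binary is on your PATH (it is guarded so it is safe to run where Ollama is absent):

```shell
# Check whether the ollama CLI is available before using it.
if command -v ollama >/dev/null 2>&1; then
  HAVE_OLLAMA=1
  ollama pull llama2                         # download model weights locally
  ollama run llama2 "Why is the sky blue?"   # one-shot prompt against the model
  ollama list                                # show models stored on this machine
else
  HAVE_OLLAMA=0
  echo "ollama is not installed on this machine"
fi
```

Because the model and all prompts stay on the server, no interaction data leaves your machine.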

This project provides the open-source image product Ollama Inference Framework, which comes pre-installed with the Ollama inference framework, its runtime environment, and deployment templates. Follow the user guide for an efficient, out-of-the-box experience.
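Once the image is up, you can confirm that the pre-installed Ollama service is running. This is a minimal sketch assuming the service listens on Ollama's default API port, 11434, on the local address:

```shell
OLLAMA_URL="http://localhost:11434"   # Ollama's default API endpoint
# The root endpoint responds when the service is up; -f makes curl
# treat HTTP errors as failures, -s suppresses progress output.
if curl -sf "$OLLAMA_URL" >/dev/null 2>&1; then
  STATUS="up"
else
  STATUS="down"
fi
echo "ollama service is $STATUS"
```

If the service reports down, check that it was started on the server (for example via the image's deployment template) before pulling or running models.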

System Requirements:

  • CPU: 2 vCPUs or higher
  • RAM: 4GB or more
  • Disk: At least 40GB

Prerequisites

Register a Huawei Account and Activate Huawei Cloud

Image Specifications

| Image Specification | Features | Remarks |
| --- | --- | --- |
| Ollama-v0.9.2-kunpeng | Deployed on Kunpeng Cloud Server | Ubuntu 24.04 64bit / Huawei Cloud EulerOS 2.0 64bit |

Getting Help

  • For further issues, open an issue in this repository or contact Huawei Cloud Marketplace product support
  • Other open-source images can be found at open-source-image-repos

How to Contribute

  • Fork this repository and submit merge requests
  • Update README.md to reflect your open-source image information
