🎮

inference-playground

by huggingface (docker SDK)

---
title: Inference Playground
emoji: 🔋
colorFrom: blue
colorTo: pink
sdk: docker
pinned: false
app_port: 3000
---





🛠️ Technical Profile

Hardware & Scale

  • SDK: docker
  • Hardware: V100
  • Status: Running

🌐 Cloud & Rights

  • Source: huggingface
  • License: Open Access

🎮 Demo Preview

Content is generated by third-party code; interact with caution.

💻 Usage

docker pull inference-playground
git clone https://huggingface.co/spaces/huggingface/inference-playground

Space Overview

Hugging Face Inference Playground


This application provides a user interface to interact with various large language models, leveraging the @huggingface/inference library. It allows you to easily test and compare models hosted on Hugging Face, connect to different third-party Inference Providers, and even configure your own custom OpenAI-compatible endpoints.
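For a sense of what a custom OpenAI-compatible endpoint expects, the sketch below builds the standard chat-completion request body. The `buildChatRequest` helper, the model name, and the default values are illustrative assumptions, not code from this Space.

```typescript
// Illustrative sketch: the request shape an OpenAI-compatible chat
// endpoint accepts. Names and defaults here are assumptions, not
// code from the inference-playground repo.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  max_tokens: number;
  stream: boolean;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  maxTokens = 512,
): ChatRequest {
  // stream: false requests a single complete response rather than SSE chunks
  return { model, messages, max_tokens: maxTokens, stream: false };
}

// A payload like this would be POSTed to <base_url>/v1/chat/completions
// with an `Authorization: Bearer <token>` header.
const req = buildChatRequest("meta-llama/Llama-3.1-8B-Instruct", [
  { role: "user", content: "Hello!" },
]);
console.log(JSON.stringify(req));
```

The same body shape works whether the base URL points at a third-party provider or a self-hosted server, which is what makes the playground's custom-endpoint option possible.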

Local Setup

TL;DR: After cloning, run `pnpm i && pnpm run dev --open`

Prerequisites

Before you begin, ensure you have the following installed:

  • Node.js: Version 20 or later is recommended.
  • pnpm: Install it globally via npm install -g pnpm.
  • Hugging Face Account & Token: You'll need a free Hugging Face account and an access token to interact with models. Generate a token with at least read permissions from [hf.co/settings/tokens](https://hf.co/settings/tokens).
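The Node.js prerequisite can be sanity-checked by parsing the version string, as in this small sketch (`nodeMajor` is a hypothetical helper, not part of the repo):

```typescript
// Sketch: verify the "Node.js version 20 or later" prerequisite.
// `nodeMajor` is an illustrative helper, not code from this repo.
function nodeMajor(version: string): number {
  // process.version looks like "v20.11.1"; strip the "v" and
  // take the major component
  return Number(version.replace(/^v/, "").split(".")[0]);
}

if (nodeMajor(process.version) < 20) {
  console.error("Node.js 20 or later is recommended for this project.");
}
```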
