To use this model you need to have the node-llama-cpp module installed. It can be installed with npm install -S node-llama-cpp; the minimum supported version is 2.0.0. This also requires that you have a locally built version of Llama2 available on your machine.
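As a quick orientation, here is a minimal setup sketch. It assumes this page documents the LlamaCpp wrapper exported from @langchain/community/llms/llama_cpp and that a local model file exists at the path shown; both the import path and the file path are illustrative assumptions, not details confirmed on this page.

```typescript
// Minimal sketch: point the wrapper at a locally stored Llama model file.
// The import path and the model path below are illustrative assumptions.
import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

const llamaPath = "/path/to/your/local/llama/model.bin"; // hypothetical path

const model = new LlamaCpp({ modelPath: llamaPath });
```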

Hierarchy

Constructors

Properties

maxTokens?: number
temperature?: number
topK?: number
topP?: number
trimWhitespaceSuffix?: boolean
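The optional properties above are generation-tuning fields that can be passed to the constructor. Below is a hedged sketch of how they might be set; the class name and import path are the same assumptions as above, and the numeric values are arbitrary examples rather than recommended defaults.

```typescript
import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

// All five optional properties listed above, with illustrative values.
const model = new LlamaCpp({
  modelPath: "/path/to/your/local/llama/model.bin", // hypothetical path
  maxTokens: 256,             // cap on tokens generated per response
  temperature: 0.7,           // sampling temperature
  topK: 40,                   // sample only from the 40 most likely tokens
  topP: 0.95,                 // nucleus sampling threshold
  trimWhitespaceSuffix: true, // strip trailing whitespace from the output
});

const res = await model.invoke("Tell me a short story about a llama.");
console.log(res);
```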
""