novelai_api.Preset

class Order[source]

Bases: enum.IntEnum

Enumeration of the samplers/processors that can appear in a preset's order setting.

Temperature = 0
Top_K = 1
Top_P = 2
TFS = 3
Top_A = 4
Typical_P = 5
CFG = 6
Top_G = 7
Mirostat = 8
Unified = 9
Min_p = 10
enum_contains(enum_class: enum.EnumMeta, value: str) → bool[source]

Check if the value provided is valid for the enum

Parameters:
  • enum_class – Class of the Enum

  • value – Value to check
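
A minimal usage sketch, assuming the string values listed under Model below:

    from novelai_api.Preset import Model, enum_contains

    enum_contains(Model, "kayra-v1")  # True, matches Model.Kayra
    enum_contains(Model, "gpt-4")     # False, not a Model value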

collapse_model(enum_class: enum.EnumMeta, value: str)[source]

Collapse multiple versions of a model to the latest model value

Parameters:
  • enum_class – Class of the Enum

  • value – Value of the model to collapse
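
A sketch of a likely call; the exact behaviour for older version strings is an assumption based on the enum_member_values mapping listed under Model below:

    from novelai_api.Preset import Model, collapse_model

    # '6B-v4' is the current value of Model.Sigurd; earlier '6B-v*' strings are
    # assumed to collapse to the same member through the '6B' prefix
    collapse_model(Model, "6B-v4")  # expected to resolve to Model.Sigurd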

class StrEnum[source]

Bases: str, enum.Enum

String-based enumeration: members are str instances and compare equal to their string values.

class Model[source]

Bases: novelai_api.Preset.StrEnum

Enumeration of the NovelAI models and their API identifiers.

Sigurd = '6B-v4'
Euterpe = 'euterpe-v2'
Krake = 'krake-v2'
Clio = 'clio-v1'
Kayra = 'kayra-v1'
Erato = 'llama-3-erato-v1'
Genji = 'genji-jp-6b-v2'
Snek = 'genji-python-6b'
HypeBot = 'hypebot'
Inline = 'infillmodel'
enum_member_values = {'6B': Model.Sigurd, 'clio': Model.Clio, 'euterpe': Model.Euterpe, 'genji-jp-6b': Model.Genji, 'genji-python-6b': Model.Snek, 'hypebot': Model.HypeBot, 'infillmodel': Model.Inline, 'kayra': Model.Kayra, 'krake': Model.Krake, 'llama-3-erato': Model.Erato}
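
Because Model is a StrEnum, members compare equal to their string values; a minimal sketch:

    from novelai_api.Preset import Model

    Model.Kayra.value                  # 'kayra-v1'
    Model.Erato == "llama-3-erato-v1"  # True, thanks to the str base class
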
class PhraseRepPen[source]

Bases: novelai_api.Preset.StrEnum

Enumeration of the Phrase Repetition Penalty levels.

Off = 'off'
VeryLight = 'very_light'
Light = 'light'
Medium = 'medium'
Aggressive = 'aggressive'
VeryAggressive = 'very_aggressive'
PREAMBLE = {Model.Sigurd: '⁂\n', Model.Clio: '[ Author: Various ]\n[ Prologue ]\n', Model.Euterpe: '\n***\n', Model.Genji: [60, 198, 198], Model.Snek: '<|endoftext|>\n', Model.Kayra: '', Model.Krake: '<|endoftext|>[ Prologue ]\n', Model.Erato: '<|endoftext|>'}

Prompt sent to the model when the context is empty
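
A minimal lookup sketch; note that Model.Genji maps to a list of token ids rather than a string:

    from novelai_api.Preset import Model, PREAMBLE

    PREAMBLE[Model.Erato]  # '<|endoftext|>'
    PREAMBLE[Model.Genji]  # [60, 198, 198] (token ids, not text)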

class PresetView[source]

Bases: object

__init__(model: novelai_api.Preset.Model, officials_values: Dict[str, List[novelai_api.Preset.Preset]])[source]
model: novelai_api.Preset.Model
class Preset[source]

Bases: object

DEFAULTS = {'diversity_penalty': 0.0, 'length_penalty': 1.0, 'math1_quad': 0.0, 'math1_quad_entropy_scale': 0.0, 'math1_temp': 1.0, 'max_length': 40, 'min_length': 1, 'min_p': 0.0, 'order': [<Order.Temperature: 0>, <Order.Top_K: 1>, <Order.Top_P: 2>, <Order.TFS: 3>, <Order.Top_A: 4>, <Order.Typical_P: 5>, <Order.CFG: 6>, <Order.Top_G: 7>, <Order.Mirostat: 8>, <Order.Unified: 9>, <Order.Min_p: 10>], 'phrase_rep_pen': PhraseRepPen.Off, 'repetition_penalty': 1.0, 'repetition_penalty_default_whitelist': False, 'repetition_penalty_frequency': 0.0, 'repetition_penalty_presence': 0.0, 'repetition_penalty_range': 0, 'repetition_penalty_slope': 0.0, 'repetition_penalty_whitelist': [], 'stop_sequences': [], 'tail_free_sampling': 1.0, 'temperature': 1.0, 'top_a': 1.0, 'top_k': 0, 'top_p': 0.0, 'typical_p': 0.0}
textGenerationSettingsVersion: int

Preset version, only relevant for .preset files

temperature: float

https://naidb.miraheze.org/wiki/Generation_Settings#Randomness_(Temperature)

max_length: int

Response length, if not interrupted by a Stop Sequence

min_length: int

Minimum number of tokens, if interrupted by a Stop Sequence

top_k: int

https://naidb.miraheze.org/wiki/Generation_Settings#Top-K_Sampling

top_a: float

https://naidb.miraheze.org/wiki/Generation_Settings#Top-A_Sampling

top_p: float

https://naidb.miraheze.org/wiki/Generation_Settings#Nucleus_Sampling

typical_p: float

https://naidb.miraheze.org/wiki/Generation_Settings#Typical_Sampling (https://arxiv.org/pdf/2202.00666.pdf)

tail_free_sampling: float

https://naidb.miraheze.org/wiki/Generation_Settings#Tail-Free_Sampling

repetition_penalty: float

https://arxiv.org/pdf/1909.05858.pdf

repetition_penalty_range: int

Range (in tokens) the repetition penalty covers (https://arxiv.org/pdf/1909.05858.pdf)

repetition_penalty_slope: float

https://arxiv.org/pdf/1909.05858.pdf

repetition_penalty_frequency: float

https://platform.openai.com/docs/api-reference/parameter-details

repetition_penalty_presence: float

https://platform.openai.com/docs/api-reference/parameter-details

repetition_penalty_whitelist: list

List of tokens that are excluded from the repetition penalty (useful for colors and the like)

repetition_penalty_default_whitelist: bool

Whether to use the default whitelist. Kept for preset compatibility, as this setting is saved in preset files

phrase_rep_pen: str | novelai_api.Preset.PhraseRepPen

https://docs.novelai.net/text/phrasereppen.html

length_penalty: float

https://huggingface.co/docs/transformers/main_classes/configuration#transformers.PretrainedConfig

diversity_penalty: float

https://huggingface.co/docs/transformers/main_classes/configuration#transformers.PretrainedConfig

order: List[novelai_api.Preset.Order | int]

List of Order values defining the sampling order

cfg_scale: float

https://docs.novelai.net/text/cfg.html

mirostat_lr: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options

mirostat_tau: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options

math1_quad: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options (Unified quad)

math1_quad_entropy_scale: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options (Unified conf)

math1_temp: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options (Unified linear)

min_p: float

https://docs.novelai.net/text/Editor/slidersettings.html#advanced-options

pad_token_id: int

https://huggingface.co/docs/transformers/main_classes/text_generation#transformers.GenerationConfig

bos_token_id: int

https://huggingface.co/docs/transformers/main_classes/text_generation#transformers.GenerationConfig

eos_token_id: int

https://huggingface.co/docs/transformers/main_classes/text_generation#transformers.GenerationConfig

max_time: int

https://huggingface.co/docs/transformers/main_classes/text_generation#transformers.GenerationConfig

no_repeat_ngram_size: int

https://huggingface.co/docs/transformers/main_classes/configuration#transformers.PretrainedConfig

encoder_no_repeat_ngram_size: int

https://huggingface.co/docs/transformers/main_classes/configuration#transformers.PretrainedConfig

num_return_sequences: int

https://huggingface.co/docs/transformers/main_classes/configuration#transformers.PretrainedConfig

get_hidden_states: bool

PretrainedConfig.output_hidden_states

name: str

Name of the preset

model: novelai_api.Preset.Model

Model the preset is for

sampling_options: List[bool]

Enabled/disabled state of the sampling options

__init__(name: str, model: novelai_api.Preset.Model, settings: Dict[str, Any] | None = None)[source]
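
A construction sketch, assuming settings not passed in fall back to DEFAULTS and that the keys are the attribute names documented above:

    from novelai_api.Preset import Model, Preset

    preset = Preset("my-preset", Model.Kayra, {"temperature": 1.1, "max_length": 60})
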
set_sampling_options_state(sampling_options_state: List[bool])[source]

Set the state (enabled/disabled) of the sampling options. Call it after setting the order setting; the states must follow the same order as the order setting.
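
A hedged sketch, assuming the booleans map one-to-one onto the entries of the order setting:

    from novelai_api.Preset import Model, Order, Preset

    preset = Preset("custom", Model.Kayra)
    preset["order"] = [Order.Temperature, Order.Top_P, Order.Top_K]
    preset.set_sampling_options_state([True, True, False])  # disable Top-K only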

to_settings() → Dict[str, Any][source]

Return the values stored in the preset, for a generate function
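
A minimal sketch of exporting the stored values as a plain dict:

    from novelai_api.Preset import Model, Preset

    preset = Preset.from_default(Model.Kayra)  # assumes a default preset exists for this model
    settings = preset.to_settings()
    print(settings["temperature"], settings["max_length"])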

to_file(path: str) → NoReturn[source]

Write the current preset to a file

Parameters:

path – Path to the preset file to write

copy() → novelai_api.Preset.Preset[source]

Instantiate a new preset object from the current one

set(name: str, value: Any) → novelai_api.Preset.Preset[source]

Set a preset value. Same as preset[name] = value

update(values: Dict[str, Any] | None = None, **kwargs) → novelai_api.Preset.Preset[source]

Update the settings stored in the preset. Works like dict.update()
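
Both set and update return the preset itself, so calls can be chained; a sketch:

    from novelai_api.Preset import Model, Preset

    preset = Preset("tuned", Model.Kayra)
    preset.set("temperature", 0.8).update(top_p=0.95, repetition_penalty=1.15)
    preset["top_k"] = 40  # item assignment is equivalent to set()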

classmethod from_preset_data(data: Dict[str, Any] | None = None, **kwargs) → novelai_api.Preset.Preset[source]

Instantiate a preset from preset data. The data should have the same structure as a preset file. Works like dict.update()

classmethod from_file(path: str | bytes | os.PathLike | int) → novelai_api.Preset.Preset[source]

Instantiate a preset from the given file

Parameters:

path – Path to the preset file
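
A round-trip sketch; the file name is hypothetical:

    from novelai_api.Preset import Model, Preset

    preset = Preset("saved", Model.Kayra, {"temperature": 1.05})
    preset.to_file("my_preset.preset")              # hypothetical path
    restored = Preset.from_file("my_preset.preset")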

classmethod from_official(model: novelai_api.Preset.Model, name: str | None = None) → novelai_api.Preset.Preset | None[source]

Return a copy of an official preset

Parameters:
  • model – Model to get the preset of

  • name – Name of the preset. If None, a random official preset is returned

Returns:

The chosen preset, or None if the name was not found in the list of official presets
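
A sketch; the preset name used here is an assumption, not a guaranteed entry in the official list:

    from novelai_api.Preset import Model, Preset

    preset = Preset.from_official(Model.Kayra, "Carefree")  # assumed preset name
    if preset is None:
        preset = Preset.from_official(Model.Kayra)          # fall back to a random official preset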

classmethod from_default(model: novelai_api.Preset.Model) → novelai_api.Preset.Preset | None[source]

Return a copy of the default preset for the given model

Parameters:

model – Model to get the default preset of

Returns:

The chosen preset, or None if the default preset was not found for the model
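
A closing sketch combining from_default with a small adjustment:

    from novelai_api.Preset import Model, Preset

    preset = Preset.from_default(Model.Kayra)
    if preset is not None:
        preset.update(temperature=0.9)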