# Module: ml_model_info

## MlModelInfo

Bases: BaseModel
Attributes:

Name | Type | Description |
---|---|---|
id | str | The unique ID of the model, in the format {storage}/{owner}/{name}. Same format as RitualRepoId.to_unique_id(). |
quantization_type | Optional[str] | The quantization type used in the model. |
inference_engine | Optional[MLType] | The inference engine to be used for the model. |
inference_engine_hash | Optional[str] | The SHA-256 hash of the inference engine binary or source code. |
memory_requirements | int | The estimated minimum required memory, e.g. '1.63GB'. |
max_position_embeddings | Optional[int] | The maximum number of tokens that can be processed in a single forward pass, i.e. the context length supported by the model. |
cuda_capability | Optional[float] | The minimum required CUDA capability for the model. |
cuda_version | Optional[float] | The minimum required CUDA version for the model. |
cpu_cores | int | The minimum number of CPU cores required to run the model. |

An illustrative construction sketch follows the source reference below.
Source code in src/infernet_ml/utils/specs/ml_model_info.py
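A minimal construction sketch, assuming the import path mirrors the source file noted above, that the non-optional fields (id, memory_requirements, cpu_cores) are the only required ones, and that memory_requirements accepts a plain integer (the '1.63GB' example above suggests a human-readable form may also be in use). All values are hypothetical.

```python
from infernet_ml.utils.specs.ml_model_info import MlModelInfo

model_info = MlModelInfo(
    id="huggingface/ritual/example-model",  # {storage}/{owner}/{name}; hypothetical values
    quantization_type="q4_0",               # optional, illustrative value
    memory_requirements=1_750_000_000,      # typed as int above; assumed to be a byte count
    cpu_cores=4,                            # illustrative minimum core count
)
print(model_info.id)
```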
### calculate_hashes(ritual_manifest)

classmethod
Calculate the hash of the model using the Ritual manifest dictionary.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
ritual_manifest | dict | The dictionary containing the model information. Expected to have a "files" key containing a list of model file paths. | required |

Returns:

Type | Description |
---|---|
dict[str, str] | The SHA-256 hash of the model file(s). |
Source code in src/infernet_ml/utils/specs/ml_model_info.py
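A usage sketch for calculate_hashes, assuming the manifest only needs the documented "files" key and that the returned mapping is keyed per file. The paths are hypothetical, and the listed files would need to exist locally for hashing to succeed.

```python
from infernet_ml.utils.specs.ml_model_info import MlModelInfo

# Hypothetical manifest; only the documented "files" key is assumed here.
ritual_manifest = {
    "files": [
        "models/example-model/model.onnx",
        "models/example-model/tokenizer.json",
    ],
}

hashes = MlModelInfo.calculate_hashes(ritual_manifest)  # dict[str, str] of SHA-256 digests
for name, digest in hashes.items():
    print(f"{name}: {digest}")
```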
### from_dict(data)

classmethod
Create an MlModelInfo instance from a dictionary.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
data | dict | The dictionary containing the model information. | required |

Returns:

Name | Type | Description |
---|---|---|
MlModelInfo | MlModelInfo | The MlModelInfo instance. |
Source code in src/infernet_ml/utils/specs/ml_model_info.py
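A sketch of from_dict, assuming the dictionary keys mirror the attribute names in the table above; the values are illustrative.

```python
from infernet_ml.utils.specs.ml_model_info import MlModelInfo

data = {
    "id": "huggingface/ritual/example-model",  # hypothetical repo id
    "quantization_type": "q4_0",
    "memory_requirements": 1_750_000_000,
    "cpu_cores": 4,
}

model_info = MlModelInfo.from_dict(data)
print(model_info.cpu_cores)  # 4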
### to_dict()
Convert the MlModelInfo instance to a dictionary.
Returns:

Name | Type | Description |
---|---|---|
dict | dict[str, Any] | The dictionary containing the model information. |
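A round-trip sketch, assuming the output of to_dict can be fed back into from_dict and that instances compare by field values (standard Pydantic behaviour); the values remain illustrative.

```python
from infernet_ml.utils.specs.ml_model_info import MlModelInfo

model_info = MlModelInfo.from_dict({
    "id": "huggingface/ritual/example-model",  # hypothetical values
    "memory_requirements": 1_750_000_000,
    "cpu_cores": 4,
})

as_dict = model_info.to_dict()             # dict[str, Any]
restored = MlModelInfo.from_dict(as_dict)
assert restored == model_info              # Pydantic models compare field-by-field
```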