Module: hf_inference_client_workflow
Huggingface Inference Client Workflow
A class that uses the Huggingface Inference Client library to run inference on any model hosted on the Huggingface Hub.
Supported Tasks
"text_generation"
"text_classification"
"token_classification"
"summarization"
Example Classification Inference
```python
from infernet_ml.utils.hf_types import HFClassificationInferenceInput
from infernet_ml.workflows.inference.hf_inference_client_workflow import (
    HFInferenceClientWorkflow,
)

def main():
    # Instantiate the workflow
    workflow = HFInferenceClientWorkflow()
    # Setup the workflow
    workflow.setup()
    # Run the inference
    result = workflow.inference(
        HFClassificationInferenceInput(
            text="Decentralizing AI using crypto is awesome!",
        )
    )
    print(result)

if __name__ == "__main__":
    main()
```
Outputs:
{'output': [TextClassificationOutputElement(label='POSITIVE', score=0.9997395873069763), TextClassificationOutputElement(label='NEGATIVE', score=0.00026040704688057303)]}
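Example Token Classification Inference
Token classification (e.g. named-entity recognition) follows the same pattern. The sketch below assumes HFTokenClassificationInferenceInput accepts a text field analogous to HFClassificationInferenceInput above; consult infernet_ml.utils.hf_types for the exact signature.
```python
from infernet_ml.utils.hf_types import HFTokenClassificationInferenceInput
from infernet_ml.workflows.inference.hf_inference_client_workflow import (
    HFInferenceClientWorkflow,
)

def main():
    # Instantiate and set up the workflow
    workflow = HFInferenceClientWorkflow()
    workflow.setup()
    # Run the inference; the `text` field is assumed, mirroring
    # the classification input above
    result = workflow.inference(
        HFTokenClassificationInferenceInput(
            text="Satoshi Nakamoto wrote the Bitcoin whitepaper.",
        )
    )
    print(result)

if __name__ == "__main__":
    main()
```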
Example Text Generation Inference
```python
from infernet_ml.utils.hf_types import HFTextGenerationInferenceInput
from infernet_ml.workflows.inference.hf_inference_client_workflow import (
    HFInferenceClientWorkflow,
)

def main():
    # Instantiate the workflow
    workflow = HFInferenceClientWorkflow()
    # Setup the workflow
    workflow.setup()
    # Run the inference
    result = workflow.inference(
        HFTextGenerationInferenceInput(
            prompt="Decentralizing AI using crypto is awesome!",
        )
    )
    print(result["output"])

if __name__ == "__main__":
    main()
```
Outputs:
(the generated continuation of the prompt; exact output varies by model)
Example Summarization Inference
```python
from infernet_ml.workflows.inference.hf_inference_client_workflow import (
    HFInferenceClientWorkflow,
)
from infernet_ml.utils.hf_types import (
    HFSummarizationInferenceInput,
    HFSummarizationConfig,
)

def main():
    # Instantiate the workflow
    workflow = HFInferenceClientWorkflow()
    # Setup the workflow
    workflow.setup()
    # Define the inputs
    summarization_config = HFSummarizationConfig(
        min_length=28,
        max_length=56,
    )
    input_text = (
        "Artificial Intelligence has the capacity to positively "
        "impact humanity but the infrastructure in which it is being "
        "developed is not yet ready for the future. Decentralizing AI using "
        "crypto is awesome!"
    )
    # Run the inference
    result = workflow.inference(
        HFSummarizationInferenceInput(
            text=input_text,
            parameters=summarization_config,
        )
    )
    print(result)

if __name__ == "__main__":
    main()
```
Outputs:
{'output': SummarizationOutput(summary_text=' Artificial Intelligence has the capacity to positively impact artificial intelligence, says AI expert . Artificial Intelligence can be positively beneficial to society, he says .')}
Input Formats
Inputs are passed as the HFInferenceClientInput pydantic model, which is one of the following four formats (each is constructed as sketched after this list):
- HFClassificationInferenceInput
- HFTokenClassificationInferenceInput
- HFTextGenerationInferenceInput
- HFSummarizationInferenceInput
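For quick orientation, the sketch below constructs one instance of each input type. The field names for the token classification input are an assumption extrapolated from the examples above; consult infernet_ml.utils.hf_types for exact signatures.
```python
from infernet_ml.utils.hf_types import (
    HFClassificationInferenceInput,
    HFTokenClassificationInferenceInput,
    HFTextGenerationInferenceInput,
    HFSummarizationInferenceInput,
    HFSummarizationConfig,
)

# Classification: a single `text` field (as in the first example)
classification = HFClassificationInferenceInput(text="Great product!")

# Token classification: assumed to mirror the classification input
token_classification = HFTokenClassificationInferenceInput(
    text="Satoshi Nakamoto wrote the Bitcoin whitepaper."
)

# Text generation: takes a `prompt` instead of `text`
generation = HFTextGenerationInferenceInput(prompt="Once upon a time")

# Summarization: `text` plus an optional HFSummarizationConfig
summarization = HFSummarizationInferenceInput(
    text="A long article ...",
    parameters=HFSummarizationConfig(min_length=28, max_length=56),
)
```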
HFInferenceClientWorkflow
Bases: BaseInferenceWorkflow
Inference workflow for models available through Huggingface Hub.
Source code in src/infernet_ml/workflows/inference/hf_inference_client_workflow.py
__init__(token=None, *args, **kwargs)
Initialize the Huggingface Inference Workflow object.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| token | Optional[str] | API token for the inference client. Defaults to None. | None |
Source code in src/infernet_ml/workflows/inference/hf_inference_client_workflow.py
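If the target model is gated or you want authenticated rate limits, pass a Huggingface API token at construction time. A minimal sketch, assuming the token is stored in an environment variable (the variable name HF_TOKEN is just an illustrative convention):
```python
import os

from infernet_ml.workflows.inference.hf_inference_client_workflow import (
    HFInferenceClientWorkflow,
)

# Read the token from an environment variable (name is arbitrary here)
workflow = HFInferenceClientWorkflow(token=os.environ.get("HF_TOKEN"))
workflow.setup()
```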
do_run_model(hf_input)
Perform inference on the hf_input data.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| hf_input | HFInferenceClientInput | Input data for the inference call | required |

Returns:

| Name | Type | Description |
|---|---|---|
| HFInferenceClientOutput | HFInferenceClientOutput | Output data from the inference call |
Source code in src/infernet_ml/workflows/inference/hf_inference_client_workflow.py
do_setup()
inference(input_data, log_preprocessed_data=True)
Overrides the base class inference method to add typing annotations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| input_data | HFInferenceClientInput | Input data for the inference call | required |

Returns:

| Type | Description |
|---|---|
| HFInferenceClientOutput | Dict[str, Any]: output data from the inference call |
Source code in src/infernet_ml/workflows/inference/hf_inference_client_workflow.py
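Per the signature above, log_preprocessed_data defaults to True and can be disabled. A minimal sketch, assuming the flag controls whether the preprocessed payload is written to the logs:
```python
from infernet_ml.utils.hf_types import HFClassificationInferenceInput

# `workflow` is assumed to be an already set-up HFInferenceClientWorkflow
result = workflow.inference(
    HFClassificationInferenceInput(text="Sensitive input"),
    log_preprocessed_data=False,  # assumed to suppress logging of the preprocessed payload
)
```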
setup()
Set up the inference client. Overrides the base class setup method to add typing annotations.