[Inference Provider] Add TextCLF as an inference provider #2273
**`docs/inference-providers/providers/textclf.md`** (new file, +51 lines)

<!---
WARNING

This markdown file has been generated from a script. Please do not edit it directly.

### Template

If you want to update the content related to TextCLF's description, please edit the template file under `https://github.com/huggingface/hub-docs/tree/main/scripts/inference-providers/templates/providers/textclf.handlebars`.

### Logos

If you want to update TextCLF's logo, upload a file by opening a PR on https://huggingface.co/datasets/huggingface/documentation-images/tree/main/inference-providers/logos. Ping @wauplin and @celinah on the PR to let them know you uploaded a new logo.
Logos must be in .png format and be named `textclf-light.png` and `textclf-dark.png`. Visit https://huggingface.co/settings/theme to switch between light and dark mode and check that the logos are displayed correctly.

### Generation script

For more details, check out the `generate.ts` script: https://github.com/huggingface/hub-docs/blob/main/scripts/inference-providers/scripts/generate.ts.
--->

# TextCLF

> [!TIP]
> All supported TextCLF models can be found [here](https://huggingface.co/models?inference_provider=textclf&sort=trending)

<div class="flex justify-center">
  <a href="https://textclf.com/" target="_blank">
    <img class="block dark:hidden" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers/logos/textclf-light.png"/>
    <img class="hidden dark:block" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers/logos/textclf-dark.png"/>
  </a>
</div>

<div class="flex">
  <a href="https://huggingface.co/textclf" target="_blank">
    <img class="block dark:hidden" src="https://huggingface.co/datasets/huggingface/badges/resolve/main/follow-us-on-hf-lg.svg"/>
    <img class="hidden dark:block" src="https://huggingface.co/datasets/huggingface/badges/resolve/main/follow-us-on-hf-lg-dark.svg"/>
  </a>
</div>

[TextCLF](https://textclf.com) provides fast AI inference at low cost.

## Supported tasks

### Chat Completion (LLM)

Find out more about Chat Completion (LLM) [here](../tasks/chat-completion).

<InferenceSnippet
  pipeline=text-generation
  providersMapping={ {"textclf":{"modelId":"meta-llama/Llama-3.1-8B-Instruct","providerModelId":"meta-llama/Llama-3.1-8B-Instruct"} } }
  conversational />
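The `providersMapping` prop above tells the snippet renderer which provider-side model id serves a given Hub model. A minimal sketch of that lookup in Python (the dict mirrors the snippet; the function name is ours, not part of hub-docs):

```python
# Mapping as declared in the InferenceSnippet's providersMapping prop.
PROVIDERS_MAPPING = {
    "textclf": {
        "modelId": "meta-llama/Llama-3.1-8B-Instruct",
        "providerModelId": "meta-llama/Llama-3.1-8B-Instruct",
    }
}

def resolve_provider_model(provider: str, hub_model_id: str) -> str:
    """Return the provider-side model id serving a given Hub model id."""
    entry = PROVIDERS_MAPPING.get(provider)
    if entry is None or entry["modelId"] != hub_model_id:
        raise KeyError(f"{hub_model_id} is not served by provider {provider!r}")
    return entry["providerModelId"]

print(resolve_provider_model("textclf", "meta-llama/Llama-3.1-8B-Instruct"))
# → meta-llama/Llama-3.1-8B-Instruct
```

Here the two ids happen to coincide; for many providers the provider-side id differs from the Hub id, which is why the mapping carries both.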
**`scripts/inference-providers/templates/providers/textclf.handlebars`** (new file, +12 lines)

# TextCLF

> [!TIP]
> All supported TextCLF models can be found [here](https://huggingface.co/models?inference_provider=textclf-ai&sort=trending)

{{{logoSection}}}

{{{followUsSection}}}

Founded in 2026, TextCLF provides fast AI inference at the lowest cost.
> **Template and generated file have divergent content** (Low Severity)
>
> The template description says "Founded in 2026, TextCLF provides fast AI inference at the lowest cost", while the generated file says "TextCLF provides fast AI inference at low cost".
>
> Additional Locations (1)
{{{tasksSection}}}
> **Template uses wrong provider ID: `textclf-ai`** (Medium Severity)
>
> The template uses `inference_provider=textclf-ai` in the model-listing URL, but the provider ID is `textclf` everywhere else (`generate.ts`, `_toctree.yml`, the generated `.md` file). The generated `textclf.md` currently has the correct value (`textclf`), but the next time the docs are regenerated from the template it will be overwritten with the wrong `textclf-ai` value, producing a broken link to models.
>
> Additional Locations (1)
> `docs/inference-providers/providers/textclf.md#L22-L23`
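One way to catch this class of drift automatically is to check that every `inference_provider` query parameter in a template matches the expected provider id. A sketch of such a check (the regex and function are ours, not part of `generate.ts`):

```python
import re

def check_provider_id(template_text: str, provider_id: str) -> list[str]:
    """Return inference_provider values in model-listing URLs that do not
    match the expected provider id."""
    found = re.findall(r"inference_provider=([A-Za-z0-9_-]+)", template_text)
    return [value for value in found if value != provider_id]

# The template's TIP line as reported in the review comment.
template = "[here](https://huggingface.co/models?inference_provider=textclf-ai&sort=trending)"
print(check_provider_id(template, "textclf"))
# → ['textclf-ai']
```

Run against each `*.handlebars` template with its registered provider id, an empty result means the model-listing links are consistent.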