Currently, Hugging Face text-generation-inference only supports Llama, Falcon, StarCoder, BLOOM, GPT-NeoX, and T5. If you select any other LLM, using HuggingfaceChatClient will produce errors.
Yesterday I tried the gpt2 model with HuggingfaceChatClient and got a lot of errors, so I think adding a tip for novices would be helpful.
Comment From: markpollack
Thanks for pointing this out. I've updated the text a bit. Merged in 7801119a9244909e0f9825cc02f6122d101d9fc5