Anthropic models list error in LLM node

Today the LLM node in DevOps Agent switched to a different models list (different model names).

And chat streaming fails with an error: POST "https://api.anthropic.com/v1/messages?beta=true": 400 Bad Request (Request-ID: req_011CVfgGofBC2…) {"type":"error","error":{"type":"invalid_request_error","message":"model: Field required"},"request_id":"req_011CVfgGofBC…"}
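For context, this 400 means the request body reaching POST /v1/messages had no `model` field, which can happen when the node's previously selected model name no longer matches the new models list and gets dropped from the request. A minimal sketch of building a valid payload, assuming the field names from Anthropic's public Messages API documentation (the model name is illustrative):

```python
import json

def build_messages_payload(model, user_text, max_tokens=1024):
    """Build a minimal Messages API request body (sketch, not official code)."""
    if not model:
        # With no model selected, the API responds exactly as in the error above:
        # {"type":"invalid_request_error","message":"model: Field required"}
        raise ValueError("model: Field required")
    return {
        "model": model,  # illustrative name; pick one from the node's current list
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": user_text}],
    }

payload = build_messages_payload("claude-sonnet-4-5", "Hello")
print(json.dumps(payload))
```

If the node's model dropdown reset after the list change, re-selecting a model should restore the `model` field and clear the 400.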

Hi @qizqo

Thank you for your feedback! :folded_hands:

We are improving the LLM node, so its appearance has changed slightly.

I see that your problem has already been solved.

Best Regards,
Splox


Yes, I started using the new version of the DevOps Engineer Template (I can see the updates there).

Today, Anthropic is reporting an insufficient balance for the model. I believe this will be fixed soon.

I don't want to connect my BYOK key to the LLM node yet; your default settings are better suited to a code engineer.

UPD: the LLM node with Anthropic works again now.
