[Weave] Feature: Prompt playground response time from LLM

Feature: Prompt playground response time from LLM - it would be helpful to know how long a prompt took to run with a specific LLM. This helps users decide between models and tune prompt complexity.
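As a minimal sketch of what the playground could record, per-call latency can be measured by timing the call with a monotonic clock. The `timed_call` wrapper and `fake_llm` stand-in below are hypothetical names for illustration, not Weave APIs:

```python
import time

def timed_call(fn, *args, **kwargs):
    # Measure wall-clock latency of a single LLM call (or any callable).
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Stand-in for a real model call (hypothetical); a real client call
# such as a chat-completion request would go here instead.
def fake_llm(prompt):
    time.sleep(0.05)
    return f"echo: {prompt}"

response, latency = timed_call(fake_llm, "hello")
print(f"latency: {latency:.3f}s")
```

Surfacing the `elapsed` value next to each playground response would let users compare models on the same prompt directly.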

Discussion log: https://chasm-talk.slack.com/archives/C05PHU191GX/p1695637378933809


Status: In Review

Board: πŸ’‘ Request a feature

Date: Over 2 years ago

Author: John Koh
