Issue 1467 Fix #1523
Conversation
Changes requested ❌
Reviewed everything up to 8bd93ae in 58 seconds.
- Reviewed 24 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 1 draft comment; view it below
1. src/agents-api/agents_api/clients/litellm.py:140
   - Draft comment: Normalization is applied unconditionally in `aembedding` now. Consider aligning the handling logic with `acompletion` (perhaps via a shared utility) to prevent potential inconsistencies in future changes.
   - Reason this comment was not posted: marked as duplicate.
```diff
@@ -76,7 +76,7 @@ async def acompletion(
     )
     custom_api_key = secret and secret.value
-    model = f"openai/{model}"  # This is needed for litellm
+    model = f"openai/{model.removeprefix('openai/')}"  # This is needed for litellm
```
Consider extracting the model normalization (using removeprefix) into a helper function to avoid duplication and ensure consistent behavior across functions. Also, ensure that your minimum Python version supports str.removeprefix (Python 3.9+).
User description
🔧 fix: normalize `openai/` model prefix in `aembedding` and `acompletion`
Summary:
Fixes inconsistent and conditional handling of the `openai/` prefix in model names. Previously, `aembedding` only applied the prefix when `custom_api_key` was not set, leading to malformed model names like `openai/openai/...` and broken usage tracking.
Changes:
Normalize the `openai/` prefix in both `aembedding` and `acompletion`.
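The bug and the fix can be illustrated with a standalone sketch (the model name below is illustrative, not taken from the codebase):

```python
model = "openai/gpt-4o-mini"  # a name that already carries the prefix

# Old behavior: unconditional concatenation doubles the prefix
broken = f"openai/{model}"
# broken == "openai/openai/gpt-4o-mini"

# Fixed behavior: strip any existing prefix first (str.removeprefix, Python 3.9+)
fixed = f"openai/{model.removeprefix('openai/')}"
# fixed == "openai/gpt-4o-mini"
```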
PR Type
Bug fix
Description
- Fix inconsistent `openai/` prefix handling in model names
- Normalize model prefix in both `acompletion` and `aembedding` functions
- Prevent malformed model names like `openai/openai/...`
- Remove conditional prefix application in `aembedding`
Changes walkthrough 📝
litellm.py — Normalize openai model prefix handling
src/agents-api/agents_api/clients/litellm.py
- Normalize model prefix in `acompletion` function using `removeprefix`
- Normalize model prefix in `aembedding` function