(WIP) feat: aws bedrock adapter #116
Conversation
```ts
interface ModelMeta<TProviderOptions = unknown> {
  name: string
  id: string
  supports: {
```
I think this array should reflect the table found here:
https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html
@AlemTuzlak I've added this, but curious if this adapter should diverge from how we determine the modalities that a model has, when compared to other adapters (type map)?
I've left the type map for now.
Sorry, we didn't understand each other. Yes, what you added is what I wanted to see, but I meant in addition to input/output. For example, in some model-meta I've done supports.capabilities with all of the capabilities, and then you can do supports.input as well.
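A rough sketch of the combined shape being discussed, with a capability list alongside input/output modalities. All names here are illustrative, not taken from the PR:

```typescript
// Illustrative only: a `supports` shape carrying both modalities and
// capabilities, as suggested in the comment above.
type Modality = 'text' | 'image' | 'video'
type Capability = 'streaming' | 'toolCalls' | 'documentChat'

interface ModelSupports {
  input: Modality[]        // modalities the model accepts
  output: Modality[]       // modalities the model produces
  capabilities: Capability[] // feature flags beyond modalities
}

const exampleSupports: ModelSupports = {
  input: ['text', 'image'],
  output: ['text'],
  capabilities: ['streaming', 'toolCalls'],
}
```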
```ts
const US_LLAMA_4_SCOUT = {
  name: 'Llama 4 Scout 17B Instruct (US)',
  id: 'us.meta.llama4-scout-17b-instruct-v1:0',
```
the id here seems tied to the region?
Yes, some of the newer models have inference profiles, including Anthropic ones. https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles-support.html
Maybe we should have something typed for regions, since it's consistent across all models that require it. I have not added any of the EU models, etc. yet.
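One way the "something typed for regions" idea could look: a union type for the inference-profile region prefix that gets prepended to the base model id. The helper name is hypothetical, not part of the PR:

```typescript
// Sketch: type the region prefix used by Bedrock inference-profile
// model ids such as 'us.meta.llama4-scout-17b-instruct-v1:0'.
type RegionPrefix = 'us' | 'eu' | 'apac'

// Hypothetical helper, for illustration only.
function withRegion(prefix: RegionPrefix, baseId: string): string {
  return `${prefix}.${baseId}`
}

// withRegion('us', 'meta.llama4-scout-17b-instruct-v1:0')
// → 'us.meta.llama4-scout-17b-instruct-v1:0'
```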
To simplify, I've left only the ones where the inference profile is required for now.
The Bedrock SDK has been implemented for types only. The problem with using it at runtime is that it will not work on Cloudflare due to a […]. For this reason, it is using […].
```ts
 * 2. `AWS_DEFAULT_REGION` environment variable
 * 3. `'us-east-1'` as final fallback
 */
region?: string
```
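The visible part of the doc comment describes a fallback chain for the region option. A minimal sketch of how that resolution could be implemented, covering only the steps shown in the excerpt (the first step is cut off above, so it is not reproduced here):

```typescript
// Sketch of the documented fallback chain: explicit option, then the
// AWS_DEFAULT_REGION environment variable, then 'us-east-1'.
function resolveRegion(explicit?: string): string {
  return explicit
    ?? process.env.AWS_DEFAULT_REGION
    ?? 'us-east-1'
}
```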
Can we extract the model region from here? So, for example, if it's eu/us/jp/sa and the model supports eu, we can tell the user at runtime that the model and the region don't match by throwing, or default to a regionless model if that's supported?
e.g.:
- model is Anthropic, supported regions are eu and us, user specifies ca: it throws
- model is Anthropic, supported regions are eu and us, user specifies us: we append it to the model id before sending the request
- model is Anthropic, regionless: we send the request with whatever region the user specifies
How's this? Added to the model config, and checked against it at runtime.
```ts
inferenceProfile: { regions: ['us', 'eu', 'apac'], required: true },
```
Here `required` is optional, because some models require the inference profile and others do not.
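The runtime check described in this thread could be sketched as follows. The function and field names are illustrative, not necessarily what the PR uses:

```typescript
// Sketch: validate the user's region against the model's
// inference-profile config, per the scenarios discussed above.
interface InferenceProfile {
  regions: string[]   // allowed region prefixes, e.g. ['us', 'eu', 'apac']
  required?: boolean  // some models require the profile, others do not
}

function resolveModelId(
  baseId: string,
  userRegionPrefix: string,
  profile?: InferenceProfile,
): string {
  if (!profile) return baseId // regionless model: send the id as-is
  if (!profile.regions.includes(userRegionPrefix)) {
    if (profile.required) {
      // mismatch on a required profile: fail loudly at runtime
      throw new Error(
        `Model requires an inference profile in one of: ${profile.regions.join(', ')}`,
      )
    }
    return baseId // profile optional: fall back to the regionless id
  }
  return `${userRegionPrefix}.${baseId}` // prepend the region prefix
}
```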
🎯 Changes
✅ Checklist
pnpm run test:pr.
Release Impact