- 365a3c2: Updated the OpenInference semantic convention mapping to account for changes to the Vercel AI SDK semantic conventions
- 16a3815: ESM support

  Packages are now shipped as "Dual Package", meaning that both ESM and CJS module resolution are supported for each package.

  Support is described as "experimental" because OpenTelemetry describes support for auto-instrumenting ESM projects as "ongoing". See https://github.com/open-telemetry/opentelemetry-js/blob/61d5a0e291db26c2af638274947081b29db3f0ca/doc/esm-support.md
- Updated dependencies [16a3815]
  - @arizeai/[email protected]
  - @arizeai/[email protected]
- Updated dependencies [1188c6d]
  - @arizeai/[email protected]
  - @arizeai/[email protected]
- Updated dependencies [710d1d3]
  - @arizeai/[email protected]
  - @arizeai/[email protected]
- a0e6f30: Support tool_call_id and tool_call.id
- Updated dependencies [a0e6f30]
  - @arizeai/[email protected]
  - @arizeai/[email protected]
- a96fbd5: Add README documentation
- Updated dependencies [f965410]
- Updated dependencies [712b9da]
- Updated dependencies [d200d85]
  - @arizeai/[email protected]
  - @arizeai/[email protected]
- 4f9246f: migrate OpenInferenceSpanProcessor to OpenInferenceSimpleSpanProcessor and OpenInferenceBatchSpanProcessor to allow for filtering exported spans
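
  A minimal usage sketch of the new processors (assuming the `spanFilter` constructor option, the `isOpenInferenceSpan` helper exported by this package, and `@vercel/otel`'s `registerOTel`; the service name and collector URL are placeholders):

  ```typescript
  import { registerOTel } from "@vercel/otel";
  import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
  import {
    isOpenInferenceSpan,
    OpenInferenceSimpleSpanProcessor,
  } from "@arizeai/openinference-vercel";

  export function register() {
    registerOTel({
      serviceName: "my-vercel-ai-app",
      spanProcessors: [
        // Only export spans that carry OpenInference attributes, filtering out
        // unrelated Vercel / Next.js infrastructure spans.
        new OpenInferenceSimpleSpanProcessor({
          exporter: new OTLPTraceExporter({
            url: "http://localhost:6006/v1/traces",
          }),
          spanFilter: isOpenInferenceSpan,
        }),
      ],
    });
  }
  ```
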
- 3b8702a: remove generic log from withSafety and add onError callback
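
  A minimal sketch of the new callback, assuming `withSafety` is exported from `@arizeai/openinference-core` and accepts an object with `fn` and the new `onError` handler (the exact signature is not shown in this entry):

  ```typescript
  import { withSafety } from "@arizeai/openinference-core";

  // Wrap a function so that thrown errors are routed to onError instead of a
  // generic log; the wrapped call is assumed to return null on failure.
  const safeParse = withSafety({
    fn: (raw: string) => JSON.parse(raw) as Record<string, unknown>,
    onError: (error: unknown) => {
      console.warn("Failed to parse attribute payload", error);
    },
  });

  const result = safeParse("{not valid json"); // null, onError invoked
  ```
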
- ff2668c: capture input and output for tools; fix double-counting of tokens on LLM spans / chains
- Updated dependencies [3b8702a]
  - @arizeai/[email protected]
- 97ca03b: add OpenInferenceSpanProcessor to transform Vercel AI SDK Spans to conform to the OpenInference spec
- Updated dependencies [ba142d5]
  - @arizeai/[email protected]
  - @arizeai/[email protected]