
[BUG] Open Telemetry and tracing token error #3888

Open
Malagurti opened this issue Dec 19, 2024 · 0 comments
Labels
bug Something isn't working

Comments


Describe the bug
Two related issues in token collection:

AttributeError: 'TokenCollector' object has no attribute 'try_get_openai_tokens'
OpenTelemetry warnings about invalid token value types (dict and NoneType values where numeric values are expected)
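The AttributeError is consistent with the indentation problem mentioned under Additional context. A minimal sketch of how that can happen (identifiers are illustrative, not the actual promptflow source): if `try_get_openai_tokens` is accidentally indented one level too deep, it becomes a local function inside another method instead of a method on the class, so calling it on an instance raises AttributeError.

```python
# Hypothetical sketch of the suspected indentation bug; names are
# illustrative, not the actual promptflow implementation.
class TokenCollector:
    def collect_openai_tokens(self, usage):
        self._tokens = usage

        # Mis-indented: this is a local function inside
        # collect_openai_tokens, never bound as a class attribute.
        def try_get_openai_tokens(self):
            return getattr(self, "_tokens", None)

collector = TokenCollector()
print(hasattr(collector, "try_get_openai_tokens"))  # prints False
```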

How To Reproduce the bug
Steps to reproduce the behavior:

Run any LLM flow with token tracking enabled
Check logs for OpenTelemetry warnings about invalid attribute types
Flow execution fails with AttributeError for try_get_openai_tokens
The error occurs consistently on every run

Expected behavior

TokenCollector should properly track and collect token usage
Token values should be properly typed (numeric values)
No OpenTelemetry warnings about invalid types
No AttributeErrors during token collection
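One way to meet the "properly typed" expectation would be a normalization step before token counts are recorded as span attributes. The helper below is an assumption for illustration, not the actual fix: OpenTelemetry attribute values must be primitive types, so dict and None values are flattened or discarded rather than passed through.

```python
# Hypothetical normalization helper (illustrative, not promptflow's code):
# coerce token counts to int so span attributes are always numeric.
def normalize_token_count(value):
    if isinstance(value, bool):  # bool is a subclass of int; treat as invalid
        return 0
    if isinstance(value, int):
        return value
    if isinstance(value, dict):
        # e.g. a nested usage payload: sum its numeric fields
        return sum(v for v in value.values() if isinstance(v, int))
    return 0  # None and any other invalid type

print(normalize_token_count(None))              # prints 0
print(normalize_token_count(7))                 # prints 7
print(normalize_token_count({"a": 3, "b": 4}))  # prints 7
```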

Screenshots
Error logs show:
WARNING:opentelemetry.attributes:Invalid type dict for attribute 'computed.cumulative_token_count.completion'
...
TypeError: unsupported operand type(s) for +: 'NoneType' and 'dict'
...
AttributeError: 'TokenCollector' object has no attribute 'try_get_openai_tokens'
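The `TypeError: unsupported operand type(s) for +: 'NoneType' and 'dict'` suggests a None cumulative count being added directly to a dict usage payload. A guarded accumulation along the following lines would avoid both invalid operand types (field names here are illustrative, not promptflow internals):

```python
# Hypothetical sketch of a guarded accumulation; field names are
# illustrative. Adds usage counts field by field with safe defaults
# instead of applying `+` to None/dict operands.
def accumulate_tokens(cumulative, usage):
    cumulative = cumulative or {}  # first call: cumulative may be None
    result = dict(cumulative)
    for key in ("prompt", "completion", "total"):
        result[key] = result.get(key, 0) + int(usage.get(key) or 0)
    return result

print(accumulate_tokens(None, {"prompt": 5, "completion": 3, "total": 8}))
# prints {'prompt': 5, 'completion': 3, 'total': 8}
```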
Running Information

Promptflow Package Version: 1.16
Operating System: Windows
Python Version: Python 3.9

Additional context

Issue appears related to token tracking in tracing functionality
Occurs with Azure OpenAI service integration
Problem appears to stem from a method indentation error and inconsistent type handling
Affects both streaming and non-streaming LLM calls

@Malagurti Malagurti added the bug Something isn't working label Dec 19, 2024
@Malagurti Malagurti changed the title [BUG] [BUG] Open Telemetry and tracing token error Dec 19, 2024