[BUG] Cannot turn off telemetry and telemetry logs #3880

Open
aaronjolson opened this issue Dec 3, 2024 · 0 comments · May be fixed by #3889
Labels
bug Something isn't working

Comments

@aaronjolson

Describe the bug
I am seeing these telemetry warnings all over my stdout output

%6|123.795|GETSUBSCRIPTIONS|my-chat#producer-2| [thrd:main]: Telemetry client instance id changed from AAAAAAAAAAAAAAAAAAAAAA to asbgahgjksgbjk
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute '__computed__.cumulative_token_count.completion' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute '__computed__.cumulative_token_count.prompt' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute 'llm.usage.completion_tokens_details' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute 'llm.usage.prompt_tokens_details' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types

I do not want to see these anymore. I have tried running

 > pf config set telemetry.enabled=false
Set config [{'telemetry.enabled': 'false'}] successfully.

but that does not seem to have any effect on run output.
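
For reference, the persisted value can be checked with the devkit's config command (assuming a pf config show subcommand is available in this version):

 > pf config show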
I also tried creating a pf.yaml file in ~/.promptflow
The contents of this file look like

telemetry.enabled=false
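
If pf.yaml is parsed as YAML (an assumption on my part; I could not find the expected schema documented), the key=value line above may simply be ignored, and a YAML mapping entry would be needed instead, e.g.:

# ~/.promptflow/pf.yaml (hypothetical YAML form of the same setting)
telemetry.enabled: false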

How To Reproduce the bug
Steps to reproduce the behavior and how frequently the bug occurs:

  1. Run a command similar to:
python -m promptflow._cli._pf.entry flow test --flow /path/to/flow --user-agent "prompt-flow-extension/1.20.2 (darwin; arm64) VSCode/1.95.3"

Expected behavior
I expect none of the logs shown above to appear in stdout. I do not want these telemetry calls to run.
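
As a stopgap, the opentelemetry warnings can likely be silenced through Python's standard logging module, since the WARNING:opentelemetry.attributes: prefix suggests they come from that logger. A minimal sketch (not a confirmed promptflow setting), run before invoking the flow:

import logging

# Raise the threshold on the logger that emits the attribute warnings,
# so WARNING-level messages are dropped while errors still surface.
logging.getLogger("opentelemetry.attributes").setLevel(logging.ERROR)

Note this only hides the warnings; it does not stop the underlying telemetry calls.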


Running Information:

  • Promptflow Package Version using pf -v:
     {
       "promptflow": "1.16.2",
       "promptflow-core": "1.16.2",
       "promptflow-devkit": "1.16.2",
       "promptflow-tracing": "1.16.2"
      }
  • Operating System: OSX Sequoia 15.1.1
  • Python Version using python --version: python==3.11.10
aaronjolson added the bug label on Dec 3, 2024