Python: Bug: Python Semantic-Kernel 1.10.0 SDK is not compatible with the latest pydantic 2.9.2 #9012
Comments
Hi @HuskyDanny, thanks for filing the issue. Can you give us a bit more context as to what you're doing to get this error? I have not seen this before. During all check-ins to main we run unit tests and integration tests (which also exercise concept samples, learn site samples, and notebooks), and this error hasn't occurred. Any other info you can provide to help us reproduce it would be great. Thanks.
I have the same issue, but on my side it seems to be related to certain lines in my code: with them in place the import fails, and if I comment those lines out it works.
Hi @moonbox3, I was basically trying to spin up my FastAPI app locally, and it broke immediately. Switching the dependent package versions resolves this for now, but I'm concerned that if this issue isn't resolved, upgrading to newer Semantic Kernel versions will be blocked.
…model (#9292)

### Motivation and Context
- This fixes an issue with the default OpenTelemetry metric in KernelFunction. With Semantic Kernel versions >= 1.8 it causes `TypeError: cannot pickle '_thread.RLock' object` when importing Semantic Kernel.
- This was blocking us from updating to the latest Semantic Kernel version. It is difficult to reproduce outside my application code, but it seems to be the same issue mentioned in #9057 and #9012; changing the attribute to a Field with a default factory solves the issue.

### Description
- Update the default metrics value to use a Field with a `default_factory`.

### Contribution Checklist
- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

Co-authored-by: Evan Mattson <35585003+moonbox3@users.noreply.github.com>
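For context, the pattern the PR describes can be illustrated with a minimal pydantic sketch (hypothetical model and class names, not the actual KernelFunction code): a plain class-level default whose value holds a non-picklable lock gets deepcopied while pydantic builds the model's signature, whereas a `Field(default_factory=...)` defers construction and avoids the copy.

```python
import threading
from pydantic import BaseModel, ConfigDict, Field


class Telemetry:
    """Stand-in for an object that keeps a lock in its state (hypothetical)."""

    def __init__(self) -> None:
        self._lock = threading.RLock()


# Failing pattern (kept commented out): with a plain class-level default, pydantic
# deepcopies the default while generating the model signature, and deepcopy cannot
# handle the RLock, raising "TypeError: cannot pickle '_thread.RLock' object" at
# class-definition time (behaviour observed with pydantic 2.9.x; may vary by version).
#
# class BrokenModel(BaseModel):
#     model_config = ConfigDict(arbitrary_types_allowed=True)
#     telemetry: Telemetry = Telemetry()


# Working pattern (the approach taken in #9292): a default_factory defers
# construction until an instance is created, so nothing is deepcopied.
class FixedModel(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)
    telemetry: Telemetry = Field(default_factory=Telemetry)


if __name__ == "__main__":
    print(FixedModel().telemetry)  # constructed lazily, no deepcopy needed
```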
This should be fixed as of #9292. Please ping back on this if you continue to experience any issues. Thanks.
Describe the bug
I got the error `TypeError: cannot pickle '_thread.RLock' object` when working with semantic-kernel==1.10.0 and pydantic==2.9.2.
After downgrading to semantic-kernel==1.7.0, pydantic==2.8.2, pydantic-settings==2.4.0, and pydantic_core==2.20.1, everything works.
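A minimal way to reproduce the failure (assuming the affected version combination above is installed) is simply importing the package, since the error is raised while pydantic builds the KernelFunction model at import time:

```python
# Minimal reproduction sketch: with semantic-kernel==1.10.0 and pydantic==2.9.2
# installed, the import itself fails.
try:
    import semantic_kernel  # noqa: F401
except TypeError as exc:
    # Expected on the affected version combination:
    # TypeError: cannot pickle '_thread.RLock' object
    print(f"import failed: {exc}")
else:
    print("import succeeded (fixed or unaffected versions installed)")
```

Until the fix is released, pinning the older versions listed above avoids the error.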
Screenshots
Platform
Additional context
Full Error:
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/semantic_kernel/init.py", line 3, in
from semantic_kernel.kernel import Kernel
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/semantic_kernel/kernel.py", line 31, in
from semantic_kernel.functions.function_result import FunctionResult
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/semantic_kernel/functions/init.py", line 5, in
from semantic_kernel.functions.kernel_function import KernelFunction
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/semantic_kernel/functions/kernel_function.py", line 55, in
class KernelFunction(KernelBaseModel):
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py", line 205, in new
complete_model_class(
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py", line 567, in complete_model_class
generate_pydantic_signature(init=cls.init, fields=cls.model_fields, config_wrapper=config_wrapper),
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/_internal/_signature.py", line 159, in generate_pydantic_signature
merged_params = _generate_signature_parameters(init, fields, config_wrapper)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/_internal/_signature.py", line 115, in _generate_signature_parameters
kwargs = {} if field.is_required() else {'default': field.get_default(call_default_factory=False)}
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/fields.py", line 554, in get_default
return _utils.smart_deepcopy(self.default)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/site-packages/pydantic/_internal/_utils.py", line 318, in smart_deepcopy
return deepcopy(obj) # slowest way when we actually might need a deepcopy
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 271, in _reconstruct
state = deepcopy(state, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 231, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 271, in _reconstruct
state = deepcopy(state, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 211, in _deepcopy_tuple
y = [deepcopy(a, memo) for a in x]
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 211, in
y = [deepcopy(a, memo) for a in x]
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 231, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 271, in _reconstruct
state = deepcopy(state, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 231, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/allenpan/anaconda3/envs/provisionTest/lib/python3.10/copy.py", line 161, in deepcopy
rv = reductor(4)
TypeError: cannot pickle '_thread.RLock' object
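The last frames show the root cause: pydantic's signature builder deepcopies the field's default value, and `deepcopy` (which falls back on pickle-style reduction for arbitrary objects) cannot handle a `threading.RLock` held somewhere inside that default, such as inside an OpenTelemetry metric instrument. A standalone sketch of that failure, independent of semantic-kernel:

```python
import copy
import threading


class HoldsLock:
    """Any object that keeps a lock in its instance state."""

    def __init__(self) -> None:
        self._lock = threading.RLock()


try:
    copy.deepcopy(HoldsLock())
except TypeError as exc:
    # deepcopy reduces the object via __reduce_ex__, which refuses to pickle the RLock:
    print(exc)  # -> cannot pickle '_thread.RLock' object
```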