Problem
As shown in #33, the Multiple FHIR Resource Chat functionality of LLMonFHIR quickly hits the token limit of a single prompt to the OpenAI LLM provider.
Solution
#33 provides an easy and quick fix for this limitation; however, we should investigate how to properly summarize FHIR resource data points.
These limitations are quite hard to explore and reproduce, as we don't have the necessary extensive FHIR data points available (except for @aalami5).
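As a starting point for such an investigation, a minimal sketch of a per-resource summarization call is shown below. It talks to the plain OpenAI chat completions REST endpoint via URLSession rather than any LLMonFHIR-internal API; the function name `summarizeFHIRResource`, the system prompt wording, and the `gpt-3.5-turbo` model choice are assumptions for illustration only, not the project's implementation.

```swift
import Foundation

/// Hypothetical helper (not part of LLMonFHIR) that asks the OpenAI chat
/// completions API to summarize a single FHIR resource.
/// Assumes the resource has already been serialized to a JSON string.
func summarizeFHIRResource(resourceJSON: String, apiKey: String) async throws -> String {
    let systemPrompt = """
    You are a medical data assistant. Summarize the following FHIR resource \
    in at most three sentences, keeping only clinically relevant information.
    """

    // Request body for the OpenAI chat completions endpoint.
    let body: [String: Any] = [
        "model": "gpt-3.5-turbo",   // placeholder model choice
        "messages": [
            ["role": "system", "content": systemPrompt],
            ["role": "user", "content": resourceJSON]
        ]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the assistant message out of the response; a production version
    // would use Codable response models and proper error handling.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```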
Ideas to explore:
Experiment with an OpenAI prompt that summarizes FHIR resources
Give the summarization output a more fixed structure (e.g., JSON) instead of free-flowing text
Omit unnecessary FHIR data points (such as identifiers); a sketch of these last two ideas follows this list
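As a rough illustration of the last two ideas, the sketch below drops a few hypothetical token-heavy keys from a FHIR resource represented as a JSON dictionary and defines one possible fixed structure the summarization prompt could be asked to return. The key list, the `FHIRResourceSummary` fields, and the instruction wording are assumptions, not an agreed-on design.

```swift
import Foundation

/// Hypothetical keys to drop from a FHIR resource before summarization;
/// identifiers and metadata rarely add clinical value but consume tokens.
let omittedKeys: Set<String> = ["identifier", "meta", "text", "extension"]

/// Removes the omitted keys from a FHIR resource represented as a JSON dictionary.
func stripUnnecessaryDataPoints(from resource: [String: Any]) -> [String: Any] {
    resource.filter { !omittedKeys.contains($0.key) }
}

/// A fixed structure the summarization output could be parsed into instead of
/// free-flowing text (field names are assumptions, not an agreed-on schema).
struct FHIRResourceSummary: Codable {
    let resourceType: String
    let summary: String
    let relevantDates: [String]
    let keyValues: [String: String]
}

/// Instruction appended to the summarization prompt so the model answers
/// with JSON matching `FHIRResourceSummary`.
let structuredOutputInstruction = """
Respond only with a JSON object of the form:
{"resourceType": "...", "summary": "...", "relevantDates": ["..."], "keyValues": {"...": "..."}}
"""
```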
Additional context
No response
Code of Conduct
I agree to follow this project's Code of Conduct and Contributing Guidelines