### [CVE-2024-12704](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-12704)
![](https://img.shields.io/static/v1?label=Product&message=run-llama%2Fllama_index&color=blue)
![](https://img.shields.io/static/v1?label=Version&message=unspecified&color=brightgreen)
![](https://img.shields.io/static/v1?label=Vulnerability&message=CWE-755%20Improper%20Handling%20of%20Exceptional%20Conditions&color=brightgreen)
### Description
A vulnerability in the `LangChainLLM` class of the run-llama/llama_index repository, version v0.12.5, allows for a Denial of Service (DoS) attack. The `stream_complete` method runs the LLM in a separate thread and retrieves the result via the `get_response_gen` method of the `StreamingGeneratorCallbackHandler` class. If the thread terminates abnormally before `_llm.predict` executes, no exception handling covers this case, and `get_response_gen` enters an infinite loop. An attacker can trigger this by supplying an input of an incorrect type, which kills the worker thread while the consuming process continues running indefinitely.
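The failure mode described above, and the safeguard whose absence causes it, can be sketched as follows. This is a minimal illustration, not the library's actual code: the names `make_stream`, `bad_producer`, and the `_DONE` sentinel are hypothetical. The key point is the `finally` clause, which guarantees the consumer sees an end-of-stream marker even when the producer thread dies with an exception; without it, the consuming loop would block forever.

```python
import queue
import threading

_DONE = object()  # sentinel marking end of stream (illustrative)

def make_stream(produce):
    """Run `produce(put)` in a worker thread; yield tokens from a queue.

    The `finally` clause always enqueues the sentinel, so the consumer
    loop below terminates even if the producer raises before producing
    anything -- the safeguard missing in the vulnerable code path.
    """
    q = queue.Queue()

    def worker():
        try:
            produce(q.put)
        finally:
            q.put(_DONE)  # signal completion on success AND on failure

    threading.Thread(target=worker, daemon=True).start()

    def gen():
        while True:
            token = q.get()
            if token is _DONE:
                break
            yield token

    return gen()

# A producer that fails mid-stream, e.g. due to a wrong input type:
def bad_producer(put):
    put("hello")
    raise TypeError("wrong input type")

tokens = list(make_stream(bad_producer))  # terminates: ["hello"]
```

If the `finally` block is removed, `bad_producer`'s exception kills the worker thread before any sentinel is enqueued, and the consumer blocks on `q.get()` indefinitely, which matches the DoS behavior reported for `get_response_gen`.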
### POC
#### Reference
No PoCs from references.
#### Github
- https://github.com/Cr0nu3/Cr0nu3