Request for precomputed inference results #4
Open
Description
Hi,
I am currently trying to reproduce the DI-Bench results.
The dataset and repository instances work fine, but I am running into issues during the dependency-inference step due to OpenAI API rate limits, which prevent me from generating the prediction files.
I wanted to ask:
Do you provide precomputed inference results (prediction files) for the dataset, especially the large Python set, so that I can run the evaluation step directly?
This would help verify that my setup is correct before running the full inference.
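For context, here is the kind of retry wrapper I have been experimenting with to work around the rate limits; it is a minimal sketch (the backoff parameters and the broad exception catch are placeholders, not how DI-Bench itself calls the API):

```python
import random
import time


def with_backoff(fn, max_retries=5, base_delay=1.0):
    """Wrap fn so transient failures (e.g. rate-limit errors) are retried
    with exponential backoff plus jitter. In real use, catch the API
    client's specific rate-limit exception instead of Exception."""
    def wrapped(*args, **kwargs):
        for attempt in range(max_retries):
            try:
                return fn(*args, **kwargs)
            except Exception:  # placeholder: narrow this to the client's rate-limit error
                if attempt == max_retries - 1:
                    raise
                delay = base_delay * (2 ** attempt) + random.uniform(0, 1)
                time.sleep(delay)
    return wrapped
```

Even with this, the large Python set takes a long time to get through, which is why precomputed predictions would be very helpful.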
Thanks a lot for your work!