A common question many people have about AI LLMs like ChatGPT, and answers to it
Question:
AI language models can be downloaded and run offline on devices with large amounts of RAM. For example, a 500 GB AI language model can be downloaded and run offline, allowing me to get answers to a variety of questions across different disciplines. My doubt is this: if I had the many PDFs that were used to train that model, their total storage might exceed 1000 times that of the 500 GB model. How can such a 500 GB language model know the answers to my questions when the data it was trained on is so much larger than the model itself?
Answers to this question from different LLMs:
https://x.com/raddoc96/status/1802769059475542090
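To make the sizes in the question concrete, here is a minimal arithmetic sketch. The numbers are illustrative assumptions, not facts about any specific model: the 2 bytes per parameter assumes 16-bit weights, and the 1000x corpus-to-model ratio is taken from the question itself. It only quantifies the premise; the linked answers discuss why the mismatch is possible.

```python
# Illustrative arithmetic for the size mismatch described in the question.
# All figures below are assumptions for the sake of the example.

BYTES_PER_GB = 1024 ** 3

model_size_gb = 500                      # hypothetical downloadable model, per the question
bytes_per_parameter = 2                  # assumed 16-bit (fp16/bf16) weights
num_parameters = model_size_gb * BYTES_PER_GB / bytes_per_parameter

training_data_gb = 1000 * model_size_gb  # "1000 times the model size", per the question

print(f"Approx. parameters:    {num_parameters:.2e}")            # on the order of 2.7e11
print(f"Assumed training data: {training_data_gb:,} GB (~500 TB)")
print(f"Data-to-model ratio:   {training_data_gb / model_size_gb:.0f}x")
```

So a 500 GB model would hold on the order of a few hundred billion parameters, while the training corpus the question imagines would be roughly 500 TB.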