Deleted member 1083629
Guest
Correct. I believe you are talking about warm-starting retraining. Given how large ChatGPT is, even a warm start will still cost money to update it. I am not talking about retraining it from the ground up; I am talking about tweaking its 1.3 billion parameters (or however many there are). Is it possible? Hell yeah. But it's expensive AF.

Not always. The "transformer" setup learns how to "transform": if you get a new PFL or decision, you can project it into the model's embedding space without needing to retrain the entire network. And given how stupid Canadian immigration laws are, we may just get away with a GPT-2.
The biggest problem with PFLs is that recent legal decisions can be crucial when responding. To pull in those latest cases and write responses with the latest information, the model must always be kept up to date. Thus, see my first paragraph.
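To make the "project it into embedding space" point concrete: here is a minimal, self-contained sketch of indexing new decisions as vectors and retrieving the most relevant one for a query, with no retraining of any model. The bag-of-words "embedding", the sample decision texts, and the function names are all made up for illustration; a real system would use a learned sentence-embedding model, but the principle is identical: a new document costs one embedding call, not a training run.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" (term counts). Stands in for a learned
    # embedding model; new text is projected into the existing vector
    # space with no retraining of the language model itself.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical corpus of recent decisions (illustrative text only).
index = []
for doc in [
    "procedural fairness letter response misrepresentation finding",
    "study permit refusal judicial review allowed",
]:
    index.append((doc, embed(doc)))

def add_decision(text):
    # Indexing a brand-new case is just one embedding call -- no retraining.
    index.append((text, embed(text)))

def most_relevant(query):
    # Return the stored decision whose vector is closest to the query.
    q = embed(query)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]

add_decision("how to respond to a procedural fairness letter about inadmissibility")
print(most_relevant("how to respond to a procedural fairness letter"))
```

The retrieved text would then be pasted into the model's context window, which is how you get "latest cases" into the answer without touching the weights.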