
Why did they call GPT-3 "text-davinci-001" in this comparison?

Like, I know that the latter is a specific checkpoint in the GPT-3 "family", but a layman doesn't, and the marginal extra precision hardly seems worth the confusion.
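For anyone unfamiliar with the naming: the checkpoint name is what you actually pass to the API, while "GPT-3" is only the family label. A rough sketch using the legacy (pre-1.0) openai Python client, purely for illustration (the key is a placeholder and text-davinci-001 may no longer be served):

    import openai

    # Legacy openai client: the model parameter takes the specific
    # checkpoint identifier, not the family name "GPT-3".
    openai.api_key = "sk-..."  # placeholder key

    response = openai.Completion.create(
        model="text-davinci-001",  # specific checkpoint in the GPT-3 family
        prompt="Say hello.",
        max_tokens=5,
    )
    print(response["choices"][0]["text"])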



Thanks for noting that, as I am a layman who didn't know.


text-davinci-001 is just not GPT-3 in any real sense

(I work at OpenAI; I helped build this page and helped train text-davinci-001)


Huh, that was not my understanding. Do you have a link where we can read more?

Follow-up question then: why include text-davinci-001 on this page, rather than some version of GPT-3?



