It is trained on arXiv papers, so any pop-culture reference I type in just produces ridiculous output. But let's try using the language of a paper as a prompt.
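For context, the completions quoted below come from sampling the model on a prompt. Here is a minimal sketch of how that kind of prompting might look with the Hugging Face transformers library, assuming a GPT-2-style causal language model; the model name is a placeholder (the arXiv-tuned checkpoint isn't specified here), and the sampling settings are guesses:

```python
# Sketch of prompted sampling from a causal language model.
# "gpt2" is a placeholder; swap in whatever arXiv-tuned checkpoint you have.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = ("In the field of natural language processing we often find "
          "that the most challenging problems are")
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=120,
    do_sample=True,   # sample rather than greedy-decode, so each run differs
    top_k=40,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```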
> [In the field of natural language processing we often find that the most challenging problems are] ones in which the system cannot easily adapt a language model to the input, for example. This problem has become especially aggravated by the lack of available source-native translations in natural language processing (RNN). This problem has also become more aggravated by the lack of reliable information from external sources (such as the text itself, the database, and other non-native source language's ). Our approach to this problem, however, has been to rely on a small sample corpus of spoken language models that have already been acquired from a large corpus of natural language processing texts.
I'm still impressed by little things, like the fact that it remembers to close the parentheses and can even nest them, even though the output is clear gibberish.
> [Conventional NLP models struggle to produce convincing text generation. In this paper we introduce] a new, cross-lingual hybrid method which allows for both natural and induced text generation. The hybrid approach provides a method to generate the text without using a natural language model, in a manner similar to what is done in the traditional text generation ( Bengio et al., 2005). We introduce three new hybrid techniques to produce text using conventional NLP methods; in the first two categories, we propose a novel hybrid method, which we call the method for generating natural texts; in the third category, we propose a novel hybrid technique , which we call the method for generating text without a natural language model.
This one is much better. It has a citation and decently fits the format of an abstract or introduction. The content is still nonsense, but the structure is there. It gets that for a paper we need to introduce a new model type. It remembers that we are talking about text generation. It cites. It says it will introduce three methods, then says "the first two" and then "in the third". This surprised me a little. I'm not convinced that it is doing math, though, especially since it can't autocomplete simple math sentences; more likely it is picking up a pattern used in papers. I wouldn't be surprised if a 2 + 1 split is a fairly common speech pattern. I think examples like that are easy to read into, thinking that the model is doing something extraordinary when there's probably a simpler explanation lying around.
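If I wanted to check that hunch, a rough way would be to count how often the "first two ... third" template shows up in a pile of abstracts. A minimal sketch, assuming a hypothetical plain-text dump of abstracts separated by blank lines (the filename is made up for illustration):

```python
# Rough check of the "2 + 1" phrasing hypothesis: how many abstracts pair
# "first two" with "third"? "abstracts.txt" is a hypothetical file, not
# anything from this experiment.
import re

with open("abstracts.txt", encoding="utf-8") as f:
    abstracts = [a for a in f.read().lower().split("\n\n") if a.strip()]

hits = sum(
    1
    for a in abstracts
    if re.search(r"\bfirst two\b", a) and re.search(r"\bthird\b", a)
)
print(f"{hits} of {len(abstracts)} abstracts use the first-two/third phrasing")
```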
But still, cool stuff.