Imagine a new AWS service called “Amazon Monkey” that allowed you to quickly scale up random monkeys at typewriters. This service could scale up so that you would soon have many monkeys randomly banging away at keyboards (OK, this is not so different from Mechanical Turk). The cost per monkey-hour might be so low that you could run experiments to see how long it would actually take to produce the complete works of Shakespeare.
That tool would be a natural language generator.
However, we can likely be much more efficient if we deploy some of the new NLG technologies emerging today. GPT-3 by OpenAI makes its machine learning capabilities accessible through simple API calls.
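To make the idea concrete, here is a minimal sketch of what calling a hosted text-completion API like OpenAI's might look like. The endpoint URL, model name, and parameters below are illustrative assumptions, not a definitive client; check the provider's current documentation before relying on any of them.

```python
# Sketch: generating text via a hosted completion API over plain HTTP.
# Endpoint, model name, and parameters are assumptions for illustration.
import json
import urllib.request

API_URL = "https://api.openai.com/v1/completions"  # assumed endpoint


def build_request(prompt, api_key, model="text-davinci-003", max_tokens=64):
    """Assemble the HTTP request that asks the model to continue a prompt."""
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # your secret API key
        },
    )


def complete(prompt, api_key):
    """Send the prompt to the API and return the generated continuation."""
    with urllib.request.urlopen(build_request(prompt, api_key)) as resp:
        body = json.load(resp)
    # Assumed response shape: a list of choices, each with a "text" field.
    return body["choices"][0]["text"]
```

A few lines of glue code like this, rather than a warehouse of monkeys, is all it takes to start generating prose on demand.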
It’s only a matter of time before these tools become so good that their output is indistinguishable from our regular writing. First there was autocomplete, then there were suggested words in Google Mail, but very soon complete emails (beyond the single-line response) will be ready in our outboxes for us to send in reply.
Then my bot talks to your bot.
Then the lights flicker and we’re wearing loincloths and bearing stone tools.
In reality, NLG capabilities are going to mean less effort spent crafting responses that aren’t of extreme importance, or communicating events through traditional channels. They are also going to wreak havoc on social networks.