Poetry could be defined as the delicate balance between constraints and expression. Computers are pretty good at constraints, less so at expression.
Cracking the structure and meaning of a text has long been a goal of linguistics and Natural Language Processing. It was first addressed in a reductionist way, by decomposing sentences, counting, and matching; then, later, by learning from evidence and statistics; and lately, as has proven more and more effective, by training neural networks with a minimal set of assumptions, as in the seminal paper "Natural Language Processing (almost) from Scratch".
In this talk we'll see how to apply these techniques to spot poetry in unlikely textual places, using a minimal set of assumptions and hoping for meter and rhyme rules to emerge. Along the way we'll touch on topics such as the elusive definition of poetry and the tension between simple, classical rules and models that can defy interpretation. We'll also train our models on the very different systems of the French Alexandrine and classical English meter.