Generate chicken soup with 20 lines of Python code to create AI-mi


First, let me share some chicken soup quotes with you:

"Don't think of the overwhelming majority of the impossible."

"Grew up your bliss and the world."

"What we would end create, creates the ground and you is the one to warm it."

"Look and give to miracles."

But in fact, these chicken soup sentences are all computer generated, and the program that produces them is less than 20 lines of Python code.

When it comes to natural language generation, people often assume it must require an advanced AI system built on very advanced mathematics. But that is not the case. In this article, I (the original author, Ramtin Alami) will use Markov chains and a small chicken soup dataset to generate new chicken soup text.

Markov chain

A Markov chain is a stochastic model that predicts an event based solely on the previous event. As a simple example, let me explain it with the daily life of my cat. My cat is always eating, sleeping, or playing with toys. She sleeps most of the time, but occasionally wakes up to eat. Usually after a meal she gets energetic, starts playing with toys, plays until she has had enough, goes back to sleep, and then wakes up to eat again.

A Markov chain can easily simulate my cat's life, because she decides what to do next based on her previous state. She usually does not wake up and go straight to playing with toys, but right after eating there is a good chance she will play for a while. These life-state transitions can also be represented graphically:

Each ellipse represents a life state, each arrow points to a possible next state, and the number next to an arrow is the probability of moving from one state to the other. As you can see, the probability of a state transition depends only on the previous state.
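To make this concrete, here is a minimal sketch of such a chain in Python. The three states come from the cat example, but the transition probabilities are made-up numbers for illustration, not values from the article's chart:

import random

# Hypothetical transition probabilities for the cat's three states;
# each state maps to (next state, probability) pairs that sum to 1.
transitions = {
    'sleep': [('sleep', 0.6), ('eat', 0.4)],
    'eat':   [('play', 0.7), ('sleep', 0.3)],
    'play':  [('sleep', 0.8), ('eat', 0.2)],
}

def next_state(state):
    # The next state depends only on the current state.
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

# Simulate a few transitions in the cat's day.
state = 'sleep'
for _ in range(10):
    state = next_state(state)
    print(state)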

Generating text using a Markov chain

Generating text with a Markov chain uses the same idea: it tries to find the probability that one word follows another. To estimate these transition probabilities, we train the model on some example text.

For example, we use the following sentences to train the model:

I like to eat apples. You eat oranges.

From these two training sentences, we can conclude that "i", "like", "to" and "eat" always appear in that order, and that "you" and "eat" are always linked together. On the other hand, "oranges" and "apples" are equally likely to appear after the word "eat". The following transition chart shows what I mean:
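As a rough sketch of the counting behind that chart (my own illustration, not the article's code), you can tally which words follow which and turn the tallies into probabilities:

from collections import Counter, defaultdict

sentences = ["i like to eat apples", "you eat oranges"]

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for sentence in sentences:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

# 'eat' is followed by 'apples' and 'oranges' equally often.
total = sum(follows['eat'].values())
for word, count in follows['eat'].items():
    print(word, count / total)  # apples 0.5 / oranges 0.5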

These two training sentences can only generate two new sentences, but that is not always the case. I trained another model on the following four sentences, and the results were very different:

My friend makes the best raspberry pies in town. I think apple pies are the best pies. Steve thinks Apple makes the best computers in the world. I own two computers and they're not Apple, because I am not Steve or rich.

The transition chart for the model trained on these four sentences is much larger.

Although this chart looks very different from a typical Markov chain transition chart, the main idea behind both is the same.

Starting from the start node, a path randomly selects one following word after another, all the way to the end node. The width of the line between two words indicates the probability of that word being selected.

Although it was trained on only four sentences, the model above can generate hundreds of different sentences.

Code

The code for the text generator above is very simple; apart from Python's random module, it does not require any additional modules or libraries. The code consists of two parts: one for training and one for generating.

Training

The training code builds the model that we will later use to generate chicken soup sentences. I used a dictionary as the model, with each word as a key and a list of the words that may follow it as the corresponding value. For example, the dictionary for a model trained on the two sentences above, "I like to eat apples" and "You eat oranges", looks like this:

{'START': ['i', 'you'], 'i': ['like'], 'like': ['to'], 'to': ['eat'], 'you': ['eat'], 'eat': ['apples', 'oranges'], 'END': ['apples', 'oranges']}

We do not need to compute the probabilities of the following words, because if a word follows another word more often, it simply appears more times in the corresponding list. For example, suppose we add another training sentence, "We eat apples". The word "apples" now appears after the word "eat" in two sentences, so it will be selected with higher probability: since it appears twice in the "eat" list of the model dictionary, it is twice as likely to be chosen.

{'START': ['i', 'we', 'you'], 'i': ['like'], 'like': ['to'], 'to': ['eat'], 'you': ['eat'], 'we': ['eat'], 'eat': ['apples', 'oranges', 'apples'], 'END': ['apples', 'oranges', 'apples']}
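This is why no explicit probabilities are needed: random.choice picks uniformly from a list, so duplicates act as weights. A quick sanity check (my own illustration, not from the article):

import random
from collections import Counter

# 'apples' appears twice in the list, so random.choice should return
# it about two thirds of the time.
words = ['apples', 'oranges', 'apples']
counts = Counter(random.choice(words) for _ in range(10000))
print(counts)  # e.g. Counter({'apples': 6670, 'oranges': 3330})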

In addition, the model dictionary above contains two special keys: 'START' and 'END'. They hold the possible starting and ending words of a generated sentence.

# Build the model: map each word to the list of words that may follow it.
for line in dataset_file:
    line = line.lower().split()
    for i, word in enumerate(line):
        if i == len(line) - 1:
            # The last word of a line is a possible ending word.
            model['END'] = model.get('END', []) + [word]
        else:
            if i == 0:
                # The first word of a line is a possible starting word.
                model['START'] = model.get('START', []) + [word]
            model[word] = model.get(word, []) + [line[i + 1]]
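For completeness, here is the same training logic wrapped in a function and run on a tiny inline dataset; the function name and the sample sentences are my own additions for illustration:

def build_model(lines):
    # Train a Markov chain model from an iterable of sentences.
    model = {}
    for line in lines:
        words = line.lower().split()
        for i, word in enumerate(words):
            if i == len(words) - 1:
                model['END'] = model.get('END', []) + [word]
            else:
                if i == 0:
                    model['START'] = model.get('START', []) + [word]
                model[word] = model.get(word, []) + [words[i + 1]]
    return model

model = build_model(["i like to eat apples", "you eat oranges"])
print(model['START'])  # ['i', 'you']
print(model['eat'])    # ['apples', 'oranges']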

Create a chicken soup sentence

The generator part contains a loop. It first selects a random starting word and appends it to a list. Then it looks up, in the dictionary, the list of words that may follow the last generated word, randomly selects one of them, and appends the new word to the list. The generator keeps selecting random following words until it reaches an ending word, then stops the loop and outputs the generated sentence, the so-called "famous quote".

import random

generated = []
while True:
    if not generated:
        # Start with one of the possible starting words.
        words = model['START']
    elif generated[-1] in model['END']:
        # The last word can end a sentence, so stop here.
        break
    else:
        # Look up the words that may follow the last generated word.
        words = model[generated[-1]]
    generated.append(random.choice(words))
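The loop collects the words in the generated list; to print an actual quote you still need to join them with spaces. Here is a small wrapper, reusing the build_model sketch from the training section (the function and the final join are my additions, not part of the article's code):

import random

def generate(model):
    generated = []
    while True:
        if not generated:
            words = model['START']
        elif generated[-1] in model['END']:
            break
        else:
            words = model[generated[-1]]
        generated.append(random.choice(words))
    return ' '.join(generated)

# With the two-sentence model above this prints one of four sentences,
# e.g. "i like to eat oranges" or "you eat apples".
print(generate(model))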

I used a Markov chain to generate chicken soup text, but as a text generator, you can feed it any text and have it generate similar sentences.

There is another cool thing you can do with a Markov chain text generator: mixing different text types. For example, in my favorite TV series, Rick and Morty, there is a character called "Abradolf Lincler", whose name mixes "Abraham Lincoln" and "Adolf Hitler".

You can do the same thing by feeding the names of some celebrities into a Markov chain to generate playful mixed character names, such as "Guo Da Stanson" or "Nicolas Zhaoshi".

You can even go a step further and mix famous quotes, for example the speeches of Lincoln and Hitler mentioned above, and use a Markov chain to create a speech with a whole new style.
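Here is a minimal sketch of that name-mixing trick, using a character-level chain over two names; this is my own illustration, not code from the article:

import random

def char_model(names, order=2):
    # Map each two-character context to the characters that follow it.
    model = {}
    for name in names:
        padded = '^' * order + name.lower() + '$'
        for i in range(len(padded) - order):
            ctx = padded[i:i + order]
            model.setdefault(ctx, []).append(padded[i + order])
    return model

def mix_name(model, order=2):
    # Walk the chain from the start padding until the end marker.
    out = '^' * order
    while not out.endswith('$'):
        out += random.choice(model[out[-order:]])
    return out.strip('^$').capitalize()

model = char_model(["abraham lincoln", "adolf hitler"])
print(mix_name(model))  # e.g. "Adolf hitler" or "Abraham lincolf hitler"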

Markov chains can be used in almost every field. Text generation is not their most useful application, but I do think it is an interesting one. And who knows, maybe one day the chicken soup you generate will attract more fans than Amy?



Reprinted from: https://juejin.im/post/5af15cd051882567312429cb

