Feeling bored, and wanting to keep up with the trend, I decided to learn Python machine learning.
I bought a book, and the first step is to analyze its table of contents.
1. The first chapter is about the Python machine learning ecosystem.
1.1. The data science / machine learning workflow.
This is broken into six steps, each explained in detail: acquisition, inspection and exploration, cleaning and preparation, modeling, evaluation, and deployment. (A minimal end-to-end sketch follows after this chapter's outline.)
1.2. The Python libraries that cover each stage and what they offer.
This is again broken into five parts: acquisition, inspection, preparation, modeling and evaluation, and deployment.
1.3. Setting up the machine learning environment.
1.4. Finally, a summary.
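To make that workflow concrete, here is a minimal sketch of the acquire / inspect / prepare / model / evaluate steps; the iris dataset and the logistic regression model are my own stand-ins, not taken from the book.

```python
# A minimal sketch of the acquire -> inspect -> prepare -> model -> evaluate
# workflow. The dataset and model choice are assumptions for illustration.
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Acquisition: a toy dataset standing in for real scraped or downloaded data.
iris = load_iris(as_frame=True)
df = iris.frame

# Inspection and exploration: shape and summary statistics.
print(df.shape)
print(df.describe())

# Cleaning and preparation: split features/target and hold out a test set.
X = df.drop(columns="target")
y = df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Modeling: fit a simple classifier.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluation: score on the held-out data.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```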
2. The second chapter is a case study: build an application to find underpriced apartments.
2.1. First, get the apartment listing data. This uses Import.io to crawl the listing data.
2.2. Then inspect and prepare the data. This is split into two parts: first analyzing the data, then visualizing it.
2.3. Then model the data. This is also split into two parts: first making predictions, then extending the model. (A minimal sketch follows after this chapter's outline.)
2.4. Finally, a summary.
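Here is a minimal sketch of the underpriced-apartment idea as I understand it: fit a model of rent on listing features, then flag listings priced well below the prediction. The column names and numbers are hypothetical.

```python
# Fit a rent model on listing features, then flag listings priced well
# below the model's prediction. The data here is made up.
import pandas as pd
from sklearn.linear_model import LinearRegression

listings = pd.DataFrame({
    "beds":  [1, 2, 2, 3, 1, 3],
    "baths": [1, 1, 2, 2, 1, 2],
    "sqft":  [550, 800, 900, 1200, 500, 1100],
    "rent":  [2100, 2900, 3300, 4200, 2050, 3500],
})

X = listings[["beds", "baths", "sqft"]]
y = listings["rent"]
model = LinearRegression().fit(X, y)

# Listings renting for noticeably less than the predicted "fair" rent
# are candidates for a good deal.
listings["predicted_rent"] = model.predict(X)
listings["discount"] = listings["predicted_rent"] - listings["rent"]
print(listings.sort_values("discount", ascending=False).head())
```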
3. Chapter three is another case, this time digging for underpriced airline tickets.
3.1. First of all, get the fare data.
3.2. A learning point here: using more advanced web scraping techniques to retrieve the fare data.
3.3. After the data is parsed, clustering is used to identify anomalous fares, which is another learning point. (A minimal clustering sketch follows after this chapter's outline.)
3.4. Then comes the genuinely useful part: sending real-time alerts with IFTTT, so the whole thing can run as a service.
3.5. The project ties these scattered pieces of knowledge together in a simple way.
3.6. Finally, a summary. This chapter really gives you your money's worth.
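A minimal sketch of how clustering can flag an anomalous fare, assuming a DBSCAN-style approach where points that fit no dense cluster are treated as outliers; the fare numbers are made up.

```python
# DBSCAN marks points that belong to no dense cluster as noise (label -1),
# which can be read as "anomalous price". The fares here are invented.
import numpy as np
from sklearn.cluster import DBSCAN

fares = np.array([420, 415, 430, 410, 425, 435, 405, 199, 418, 428]).reshape(-1, 1)

labels = DBSCAN(eps=25, min_samples=3).fit_predict(fares)
for fare, label in zip(fares.ravel(), labels):
    tag = "ANOMALY - possible deal" if label == -1 else f"cluster {label}"
    print(fare, tag)
```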
4. Now something a bit more serious: use logistic regression to predict the IPO market, which is similar to the stock market.
4.1. It starts by introducing what an IPO is; in short, a company raises money from the public by offering its shares.
4.2. Then it explains the features used in the project, which is the impressive part.
4.3. Then it explains binary classification, which sounds very fancy. (A minimal sketch follows after this chapter's outline.)
4.4. Analyzing feature importance, which is also a key part of predicting the IPO market.
4.5. Finally, a summary.
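A minimal binary-classification sketch in the spirit of this chapter: logistic regression on a few invented deal features, with the fitted coefficients read as a rough sign of feature importance. None of the feature names or numbers come from the book.

```python
# Logistic regression on made-up IPO-style features; the coefficients give
# a rough (scale-dependent) read on which features matter.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "offer_price":   rng.uniform(10, 40, 200),
    "open_gap_pct":  rng.normal(0, 5, 200),
    "lead_mgr_rank": rng.integers(1, 10, 200),
})
# Hypothetical label: did the stock close above its opening price (1) or not (0)?
y = (X["open_gap_pct"] + rng.normal(0, 3, 200) > 0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(X.columns, clf.coef_[0]):
    print(f"{name:>14}: {coef:+.3f}")
```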
5. Create a custom news feed, which I like.
5.1. Use the Pocket application to build a supervised training set. So that is what "training" means; back at the company I always heard the gurus talking about training this and training that.
5.2. The second step is to get the data source, using the embed.ly API to download the article content.
5.3. Then come the basics of natural language processing, which is a hard part.
5.4. Then it explains how to use support vector machines, which is also very important and is what makes the training work. (A minimal sketch follows after this chapter's outline.)
5.5. Then a bit of a new concept: IFTTT, and integrating it with the article feed, Google Sheets, and email.
5.6. Then set up a daily personalized briefing based on your own interests; that is the final result.
5.7. Finally, a summary.
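A minimal sketch of the "train on the articles you saved" idea, assuming TF-IDF features plus a linear support vector machine; the tiny corpus and labels are invented.

```python
# TF-IDF features plus a linear SVM to separate "articles I would save"
# from "articles I would skip". The example texts and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

texts = [
    "new deep learning model beats benchmark",
    "python library release adds gpu support",
    "celebrity spotted at film premiere",
    "sports team wins championship final",
]
labels = [1, 1, 0, 0]  # 1 = would save to Pocket, 0 = would skip

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(texts, labels)

print(clf.predict(["open source machine learning framework announced"]))
```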
6. Here is something fun: predict whether your content will be widely shared. In other words, whether you will blow up.
6.1. It starts with a case study of viral content and how it caught on.
6.2. Statistics on popularity, that is, how much gets shared and what.
6.3. Exploring what makes things take off, that is, exploring the features of shareability.
6.4. Then build a predictive model for your own content to see whether it will take off. (A minimal sketch follows after this chapter's outline.)
6.5. Finally, a summary.
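A minimal sketch of a virality predictor, assuming a random forest over a few hypothetical content features; the features and the "viral" rule are invented purely to show the mechanics.

```python
# A random forest on made-up content features, with a made-up label rule
# standing in for real share counts.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "n_images":       rng.integers(0, 10, 300),
    "headline_words": rng.integers(3, 15, 300),
    "word_count":     rng.integers(200, 2000, 300),
})
# Hypothetical rule: image-heavy posts with short headlines go viral.
y = ((df["n_images"] > 4) & (df["headline_words"] < 9)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(df, y)
print(dict(zip(df.columns, clf.feature_importances_.round(3))))
```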
7. Earlier, logistic regression was used to predict the IPO market; here machine learning is used to forecast the stock market.
7.1. First, it looks at the types of analysis; there are many different kinds when it comes to the stock market.
7.2. Then, what research can tell us about the stock market, which of course covers many angles.
7.3. Here it gets real: it teaches you how to develop a trading strategy.
This is split into several parts. The first is the analysis period and lag, because analysis that is only finished after the moment has passed is not worth the candle.
Then support vector regression is used to build a model. (A minimal sketch follows after this chapter's outline.)
Finally, after the modeling, it teaches you dynamic time warping, which is a nice name.
7.4. Finally, a summary.
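A minimal support vector regression sketch: predict the next day's return from a few lagged returns. The price series is random noise, so this only shows the mechanics, not a real trading strategy.

```python
# Support vector regression on lagged returns of a synthetic price series.
import numpy as np
import pandas as pd
from sklearn.svm import SVR

rng = np.random.default_rng(2)
prices = pd.Series(100 + rng.normal(0, 1, 300).cumsum())
returns = prices.pct_change()

# Lagged features: returns at t-1, t-2, t-3 predicting the return at t.
data = pd.DataFrame({
    "lag1": returns.shift(1),
    "lag2": returns.shift(2),
    "lag3": returns.shift(3),
    "target": returns,
}).dropna()

train, test = data.iloc[:250], data.iloc[250:]
features = ["lag1", "lag2", "lag3"]
model = SVR(kernel="rbf", C=1.0, epsilon=0.001).fit(train[features], train["target"])
pred = model.predict(test[features])
print("first few predicted returns:", pred[:5].round(5))
```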
8. One more case: build an image similarity engine.
As the name implies, it finds similar images.
8.1. First, of course, understand the machine learning behind similarity.
8.2. Then you need some image processing knowledge.
8.3. With the basics in place, learn how to find similar images, which is of course the core content. (A minimal sketch follows after this chapter's outline.)
8.4. The next step is deep learning, which I take to mean learning deeper relationships between images; it involves the lower-level operations.
8.5. Then of course the engine that serves the results: building the image similarity system.
8.6. Finally, a summary.
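A minimal image-similarity sketch, assuming the simplest possible representation: flatten images into vectors and rank them by cosine similarity. A real engine (and the deep learning part) would use learned features instead of raw pixels; the arrays here are synthetic.

```python
# Rank "images" (random pixel vectors standing in for real 32x32 images)
# by cosine similarity to a query image.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(3)
images = rng.random((10, 32 * 32))                      # 10 fake flattened images
query = images[0] + rng.normal(0, 0.01, 32 * 32)        # a slightly perturbed copy of image 0

scores = cosine_similarity(query.reshape(1, -1), images)[0]
ranking = np.argsort(scores)[::-1]
print("most similar image indices:", ranking[:3], "scores:", scores[ranking[:3]].round(3))
```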
9. Now a chatbot. This one actually feels familiar.
9.1. First, learn about the Turing test. That is the most basic starting point.
9.2. Then the history and meaning of chatbots.
9.3. Then the design: how to implement it and with what methods; now the real thinking begins.
9.4. Write the code and build the chatbot. (A minimal sketch follows after this chapter's outline.)
9.5. Finally, a summary.
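A minimal retrieval-style chatbot sketch, assuming TF-IDF similarity against a handful of canned question/answer pairs; the pairs are invented and this is not presented as the book's design.

```python
# Match the user's message to the closest known question by TF-IDF cosine
# similarity and return its canned answer. The pairs are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

pairs = {
    "what is your name": "I'm a tiny demo bot.",
    "how is the weather today": "I can't see outside, sorry.",
    "tell me a joke": "Why do programmers prefer dark mode? Because light attracts bugs.",
}
questions = list(pairs)
vectorizer = TfidfVectorizer().fit(questions)
question_vecs = vectorizer.transform(questions)

def reply(message: str) -> str:
    sims = cosine_similarity(vectorizer.transform([message]), question_vecs)[0]
    return pairs[questions[sims.argmax()]]

print(reply("what's your name?"))
```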
10. Finally, a practical feature: a recommendation engine.
Most news apps today basically have this built in.
10.1. First, understand what collaborative filtering is. It comes in two flavors: user-based and item-based filtering. (A minimal user-based sketch follows after this chapter's outline.)
10.2. Then learn what content-based filtering is; this filters on the inner details of the items themselves.
10.3. Then it explains hybrid systems, which combine these filtering approaches to suit the user's needs.
10.4. Write the code and build a recommendation system.
10.5. Finally, a summary.
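A minimal user-based collaborative filtering sketch: find the users most similar to a target user in a ratings matrix and score the items they liked that the target has not seen. The ratings are made up.

```python
# User-based collaborative filtering on a tiny made-up ratings matrix.
import pandas as pd
from sklearn.metrics.pairwise import cosine_similarity

ratings = pd.DataFrame(
    [[5, 4, 0, 0, 1],
     [4, 5, 0, 1, 0],
     [0, 0, 5, 4, 0],
     [0, 1, 4, 5, 0]],
    index=["ann", "bob", "cat", "dan"],
    columns=["item_a", "item_b", "item_c", "item_d", "item_e"],
)

target = "ann"
sims = cosine_similarity(ratings.loc[[target]], ratings)[0]
neighbors = pd.Series(sims, index=ratings.index).drop(target).sort_values(ascending=False)

# Score unseen items by the neighbors' similarity-weighted ratings.
unseen = ratings.columns[ratings.loc[target] == 0]
scores = ratings.loc[neighbors.index, unseen].T.dot(neighbors)
print(scores.sort_values(ascending=False))
```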
11. Finally, my own personal summary.
Right now I know nothing about Python or machine learning. A complete beginner.
I do not know what kind of person I will be after reading this book; this is the start of a long machine learning journey. I love robots.
The first chapter lays a bit of foundation and the remaining nine chapters are all projects, which is a reasonable arrangement, and one I like.
Projects are what really drive knowledge forward.
This has been a table-of-contents analysis of Alexander's Python machine learning book.