Before I knew it, a year and a half had passed since graduation, and it was during my internship that I first came into contact with big data technology. Clueless me back then... no, I should say my reaction was: whoa, what is all this impressive stuff, I'll never be able to do it... And yet nothing went wrong. It's still the same even now.
This year again brought plenty of changes, but not one of them could hold back my enthusiasm for technology; that enthusiasm is like a surging river, hua la la la ~. So let's take a look: what skills does a big data engineer need to master?
At first glance it's scary enough to make you wet yourself. Just give me a moment to go take a shower first.
If you are willing to sacrifice all of your time to study, study, and practice, I believe you can master everything in it, but it is a long process. I think this picture is quite vivid: the left side represents engineering development, the right side represents algorithm development. In big companies the division of labor really is that clear-cut. But in this pluralistic society, if you can master knowledge that no one else has mastered, or knowledge that carries a high learning cost, then you will shine like a firefly in the dark, like a dung beetle in a paddy field glowing under the sun, like the first ray of sunshine on the windowsill ~ ~ okay, okay. Back to the point.
A lot of people say: that's way too much, I have no idea how to learn it all, or where to even start. So let me lay out my personal learning journey up to this point (personal experience only):
1. Starting in 2014, I came into contact with Hadoop. What is this thing? I bought a book (the "definitive guide" style book on Hadoop and YARN), understood none of it, but insisted on reading it through. Then I started building the environment, a single node, and the moment it actually ran felt absolutely great ~ (note that you need a Linux foundation first).
2. I started looking at the programming model: MapReduce. What even is MapReduce? It sounded so lofty. People are always afraid of the unknown, but afraid or not, you still have to look ~ even if you don't understand, you still have to look ~. Then came the development environment, with pitfalls everywhere. I hadn't even graduated yet; with so many tools, how was I supposed to know which to use? What error is it throwing now? And so the endless tinkering began.
3. The product manager said: we're going to use Spark! That was at the end of 2014. The initial investigation actually looked at Shark, because at the time Spark wasn't ready for production use, but in the end we chose Spark; right around the time of that decision, Spark 1.3 came out for production use and introduced the DataFrame API (of course I was completely lost at the time: what is this thing? a rough sketch of that API is shown after this list). Pushed along by my manager, I began my Spark journey. By then I already had a basic understanding of Hadoop.
4. Into 2015, the project began Spark development, application development of course, starting completely from zero, so I began shamelessly going to ask the legendary Orange Cloud distributed team; on every business trip I would buy them little gifts as tribute, haha ~~. Bit by bit I studied the theory, bit by bit I experimented, bit by bit I asked questions. Let me say something here: people doing technical work are especially afraid of being looked down on, afraid to ask, afraid of being laughed at. In fact, those who truly love technology are wildly enthusiastic about it. You have to remember: there is nothing shameful about asking, and besides, whatever you learn is your own ~. So I began buying books about Spark, summarizing all kinds of knowledge points: try, summarize, try again.
5. On the eve of graduation in 2015, I went for my first on-site launch, a project in Hubei, transforming a traditional application system into a big data project supported by Spark. For a fledgling like me: holy crap, a real cluster, a data center with more than 180 machines, batch processing, stream processing, it made my head spin. So scary ~~ scared of what, exactly? ~ Just dive in ~ deploying and tuning all kinds of parameters, chasing problems until two in the morning, with colleagues in Beijing providing remote support, overwhelmed by all kinds of front-end problems, and in the end we went live successfully ~ (one point here: people in technology like to get to the root of things. In a field environment, especially a complete big data environment like this one, what you need is exactly that shamelessness: ask about whatever you don't understand, and what you gain is a thorough familiarity with the workflow and processes of the whole big data system).
6. By the end of 2015 the project had been running for some time. At work I never forgot to cram all kinds of knowledge and round out my knowledge system, and then suddenly, at some point, it all seemed to click. Even though this was only scratching the surface ~ slowly does it. As the system went live in province after province, I started thinking about other things (a dangerous omen).
7. In early 2016, for various reasons, I came to a bank in Shanghai, which has a complete big data environment. At the time I was actually a little afraid. Why? Because although I had built up a big data knowledge system, my hands-on experience was still lacking, and I had mostly done Spark, so there was nothing for it but to practice directly. All kinds of torment, lights still on after midnight every day, learning HBase, Redis, Storm, Kafka, and digging deeper into Hadoop. Sometimes I even had the thought of giving up; under huge pressure, living on Red Bull every day: learn, practice, learn, practice.
8. And now, at some moment, I suddenly feel once again that I seem to know it all! As if in my mind I can feel the data being passed between the various components, the network transfers, when an OOM will hit, the JVM footprint, the network communication... And once again I have started building applications on Spark, and this time I am thrilled and full of passion every day. Why? Because I found that many knowledge points I once did not understand now all seem connected. In the quiet of the night, thinking of the people who have left, I silently shed tears: why couldn't you have waited for me?
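As a side note to item 3: here is a minimal sketch, in Scala, of roughly what the Spark 1.3-era DataFrame API looks like. The file name, columns, and object name are made up purely for illustration and are not taken from any of the projects above.

```scala
// Minimal sketch of the Spark 1.3-era DataFrame API (Scala).
// The input file and columns are hypothetical, for illustration only.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DataFrameSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("DataFrameSketch").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Load JSON into a DataFrame; the schema is inferred automatically.
    // (In Spark 1.3 this was sqlContext.jsonFile; later versions use sqlContext.read.json.)
    val people = sqlContext.jsonFile("people.json")

    // Column-oriented operations instead of hand-written map/reduce code.
    people.select("name", "age")
      .filter(people("age") > 30)
      .groupBy("age")
      .count()
      .show()

    sc.stop()
  }
}
```

The appeal, compared with hand-rolled MapReduce jobs, is that you describe column-level operations and let the engine plan and optimize the execution for you.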
Summary: technology is constantly updated and iterated on, but you must have your own knowledge system and your own understanding. Don't think: ah ~ I can write this MapReduce job, I can write Spark back-end code, I can use HBase, I can use Storm, and then grow complacent. Look a little farther and a little deeper, and you will find that you don't really have anything yet. Learn! Persist! Persist! Persist! Persist! And persist some more! With your tenacity and passion, inspire the people around you and influence the people around you.