CDA Data Analyst: Interviews with the Top Three of the 9th Level II Big Data Exam

2021-08-17

The 9th CDA Data Analyst Certification Exam concluded at the end of December 2018. We recently interviewed several outstanding candidates who placed at the top of this exam. In the previous article we spoke with the Level I champion; in this article we interview the top three in the Level II big data track. How did they prepare and study? Let's take a look.

Level 2 Big Data · No. 1 Hu Renfei
Graduated from Zhejiang University in 2009 and has worked in the communications industry ever since, holding roles in project management, wireless network optimization management, and business analysis.
1. Current work

I currently do big data modeling and business analysis at Zhejiang Mobile. My everyday work revolves around data: by analyzing various business data we understand the company's business status, and we build data models around actual business needs to support the company's business development.


2. Why I applied for the CDA certification exam

Although I already work in big data analysis, I felt my understanding and mastery of the various technologies and frameworks involved in a big data platform needed to improve. CDA is well recognized in the industry and its certification exams cover a wide range of knowledge points, so I wanted to use the CDA certification exam to recharge myself and improve my technical and business capabilities.


3. How did I prepare for the exam? I actually wanted to take the exam in the first half of 2018 but kept putting it off because of work. Around mid-to-late October I finally made up my mind to apply for the Level II big data analyst exam, downloaded the outline from the official website, and purchased the recommended textbooks based on the outline and other suggestions. That left me roughly two months to prepare.

Because preparation time was tight, daily work was busy, and family life was full of small chores, I basically tried to set aside a relatively free block of time for review every day: usually half an hour to an hour during the lunch break, and another 1-2 hours in the evening after work. Weekends were where the larger stretches of study time came from. Out of almost 8 weeks in total, I spent about 5 weeks working through the textbooks against the outline, including the big data technology fundamentals and Spark books recommended below, plus Mahout and other scattered materials. The next 2 weeks or so went to the exam guidance manual, and the final week or two before the exam was spent checking the outline and filling in gaps.

4. Recommended books and courses. During my review for this exam I found two very good books. One is "Big Data Technology Basics," compiled by Professor Lin Ziyu of Xiamen University.
This book explains the fundamentals of big data technology comprehensively and systematically in plain language. Through it you can get a good grasp of the principles, applications, and relationships of the key technologies in the Hadoop ecosystem, such as HDFS, MapReduce, and HBase.
The second book is "Spark Big Data Analysis Technology and Practice," published by the House of Management. It gives a detailed explanation of Spark, from installation and deployment to practical programming applications. By studying this book you can build a solid understanding and command of the Spark project and its main core components.

5. Advice for test takers. During the review process, I think two points matter most:
1. In addition to focusing on the knowledge points in the syllabus, if time permits, read the books carefully and systematically from beginning to end so the knowledge points connect into a whole.
2. Be sure to get hands-on: from installing virtual machines, to building and deploying platforms and components such as Hadoop, HBase, Hive, and Spark, to concrete programming, such as database operations and working with the main Spark components (a sketch of that kind of exercise follows this list).
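For illustration only (my own sketch, not code from the interview), here is the kind of minimal PySpark exercise the advice above points at: loading a small dataset, running a DataFrame aggregation, and issuing the same query through Spark SQL. The CSV path and the column names (user_id, plan, monthly_fee) are made up for the example, and a local Spark installation is assumed.

```python
# Minimal PySpark sketch; assumes Spark is installed and pyspark is importable.
# The file "users.csv" and its columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("cda-practice")
         .master("local[*]")   # run locally; point at a cluster master on a real deployment
         .getOrCreate())

# Load a small CSV into a DataFrame, inferring the schema from the data.
df = spark.read.csv("users.csv", header=True, inferSchema=True)

# Basic DataFrame operation: average monthly fee per plan.
df.groupBy("plan").agg(F.avg("monthly_fee").alias("avg_fee")).show()

# The same query expressed through Spark SQL.
df.createOrReplaceTempView("users")
spark.sql("SELECT plan, AVG(monthly_fee) AS avg_fee FROM users GROUP BY plan").show()

spark.stop()
```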

Level 2 Big Data · No. 2 Teng Yun


1. Current work: I am currently the director of cloud and Internet of Things operations at Guilin Telecom, responsible for technical support of cloud computing, the Internet of Things, and other emerging businesses. I have more than 10 years of experience in technical support and technical management.

2. Why I applied for the CDA certification exam: As a comprehensive information service provider, a telecom operator has huge room and potential for applying big data and artificial intelligence, with three natural advantages: data volume, computing power, and application scenarios. But as a traditional enterprise we are short of people who really know big data and have not yet monetized big data at scale. I hoped that by learning big data systematically I could improve my own and my team's abilities and drive the application of big data both inside and outside the company.
As a working professional I do not have much time to study, so I also wanted the exam to push me to study systematically. There is currently no unified national examination or certification for big data, and CDA's curriculum is relatively mature and has a good reputation in the industry, so I signed up for the CDA exam.

3. How did I prepare for the exam? My preparation started after the National Day holiday and took about 3 months in total, of which nearly one month went into building the operating environment.
First stage: preparation (5 days)
Analyze the contents of the outline; draw up a learning plan based on the textbook layout, my own familiarity with each module, and a reasonable weighting of the exam proportions; break the learning goals down into daily targets; and build my own knowledge framework with mind maps.
For example, I already had a foundation in Linux, Python, and MySQL, so they were not included in the plan; Spark carried 35%, the highest weight, and was the part I spent the most time studying.

Second stage: study (40 days)
Following the outline, work through the reference books completely; knowledge points not covered in the books can be filled in with online resources.


Third stage: practice (25 days)

Following the knowledge structure of the outline, and with the help of online resources, I set up three Ubuntu virtual machines and installed and configured Java, Python, MySQL, Hadoop, ZooKeeper, Hive, HBase, Spark, and so on, and also worked through programming cases with the Spark components (a sketch of such a case follows below). There are many pitfalls when installing this big data software, and it ate up a lot of time.
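As an illustration of the kind of programming case mentioned above (my own sketch, not the interviewee's code), here is the classic word count run with PySpark against a file stored in HDFS. The NameNode address and input path are placeholders that assume a self-built Hadoop and Spark environment like the one described.

```python
# Classic word count with PySpark reading from HDFS.
# The namenode URL and input path are hypothetical placeholders for a self-built cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()
sc = spark.sparkContext

lines = sc.textFile("hdfs://localhost:9000/user/hadoop/input/sample.txt")
counts = (lines.flatMap(lambda line: line.split())   # split each line into words
               .map(lambda word: (word, 1))          # pair each word with a count of 1
               .reduceByKey(lambda a, b: a + b))     # sum the counts per word

# Print a small sample of the results.
for word, count in counts.take(10):
    print(word, count)

spark.stop()
```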

Fourth stage: filling gaps (15 days)
Do some mock exercises, check how well the review has stuck, look for gaps, and revisit forgotten knowledge points.

Daily schedule:
6:00-7:00 in the morning: 1-1.5 hours of study. This is a fixed, energetic, undisturbed slot every day, so I use it for content that is difficult or needs deeper thinking.
After work, while helping my child with homework: 0.5-1 hour, usually spent on odds and ends of knowledge.
10:30-11:30 at night: after the child is asleep, 1 more hour of study.
That basically guarantees 2-3 hours per day.

4. Recommended books and courses: My preparation was based mainly on the outline and the preparation manual, consulting online resources against the outline. The main reference book was Dong Yiqun's "Spark Big Data Analysis Technology and Practice"; in addition, the topics in the outline can be looked up in "Hadoop: The Definitive Guide, 4th Edition" (Tsinghua University Press).

5. Advice for test takers: If the goal is just to get the certificate and preparation time is short, the outline and the preparation manual are more or less enough.
The CDA Level II big data exam still emphasizes basic theory. If you want to improve your competitiveness and apply what you learn to actual work, it is recommended to practice hands-on within the framework of the outline, deepening your understanding and retention of the principles so that you build real ability.

6. Future career development plan: I hope to go on to study for the Level III big data scientist certification and go deeper into big data and artificial intelligence to improve my competitiveness. Using machine learning, deep learning, and other techniques, I want to fully tap the potential value of the massive data held by telecom operators in areas such as network operations, precision marketing, and intelligent customer service, improving the user experience, reducing operating costs, and providing strong support for transformation and development.

Level 2 Big Data · No. 3 Feng Zhuoji
Graduated from Guangdong University of Technology in 2010 with a major in statistics, and has worked in Guangzhou since graduation, with four years of data analysis experience.
1. Current work: I currently work at MMS Technology Co., Ltd. on China Mobile's unified authentication project, as the data analyst for SIM Shield. Since I am the only data analyst on SIM Shield, I have to handle and learn many things on my own.


2. Why I applied for the CDA certification exam

Data analysis is a very popular field right now, but China does not yet have a unified national exam, and CDA is recognized by the company I work for, so I chose to apply for the CDA certification.
As a data analyst, I have some understanding of and experience with big data tools, but I did not know much about the underlying architecture of Hadoop and Spark. Through this CDA exam I wanted to become familiar with those principles and architectures, which will help my work in the future.

3. What were the difficult parts of the exam? For me, Python regular expressions, the working principle of Hadoop, and the working principle of Spark were relatively difficult.
Of the harder knowledge points, Python regular expressions are still worth the effort: although they carry few marks, they come up often in actual work, so get to know and use them (a small sketch follows the next paragraph). The working principle of Hadoop, on the other hand, needs more systematic study.
Don't try to learn the difficult topics in scattered scraps of time: it is easy to forget things and learning efficiency is very low. For the knowledge points above, try to learn them in one go and understand the overall context. Some working principles still need to be written out by hand. When a knowledge point feels fuzzy, it is best to go back to the exam manual and read it one more time; it may then click.
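Purely as an illustration (not from the interview), here is the sort of Python regular-expression task that comes up in everyday log and text cleaning: pulling named fields out of semi-structured lines. The log format and field names below are invented for the example.

```python
# Extracting fields from semi-structured text with Python's re module.
# The log line format here is invented for the example.
import re

log = "2018-12-20 10:15:32 user=13800138000 action=login status=OK"

# Named groups make the pattern self-documenting.
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}) "
    r"user=(?P<user>\d{11}) "
    r"action=(?P<action>\w+) "
    r"status=(?P<status>\w+)"
)

match = pattern.match(log)
if match:
    print(match.group("user"))   # 13800138000
    print(match.groupdict())     # all captured fields as a dict
```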

4. Recommended books and courses: Because many big data tools are written in Java, it is recommended to learn some Java. For the parts involving Hadoop and Spark, you can watch videos online.
I picked up many of the knowledge points through online videos. Videos have real advantages: compared with working through the material alone, a teacher's explanation makes the knowledge points much easier to understand and absorb.

5. Advice for test takers: It is best to study for 2-3 months, strictly following the syllabus. When conditions allow, build Hadoop and Spark yourself for practice, and memorize the common Linux commands (they are also used all the time at work). For the knowledge points in the exam syllabus, expand on them with Baidu or other resources as appropriate.
I use SQL, Linux, and Hadoop a lot in my own work and have built Hadoop by hand, so I have a decent understanding of its parameters and architecture. That experience also helps with the exam.
Also, when you hit a problem you don't understand, ask around in time, whether colleagues, classmates, or others; learn to make good use of the resources around you.
In a word: keep a good mindset, keep learning, face daily work positively, and make sure you can feel yourself improving every day.

6. Future career development and planning: As a statistics graduate, I naturally hope to keep working in data analysis and to keep working with big data, while also building up experience in Python modeling.

Thanks for reading
