1. Responsible for designing and implementing business solutions, including data research, requirements analysis, data collection, data cleaning, data analysis, and data services. Key tasks include but are not limited to:
• Data warehouse construction: build offline or real-time data warehouses, and design data models and subject areas according to business needs;
• Data development: implement ETL for the data models, participate in optimizing the ETL process, and resolve ETL-related technical issues;
• Data services: understand business requirements and provide business-oriented data services such as OLAP, reports, data-visualization dashboards, and data APIs.
2. Develop an in-depth understanding of business processes, drive the business through data, and facilitate updating and optimizing solutions.
1. Bachelor’s degree or above; graduates of Project 211/985 universities preferred;
2. Proficient in at least one programming language (Python or Java); familiar with big data computing platforms and tools such as Kafka, Hadoop, Hive, Spark, Flink, and Storm;
3. Command of the key skills involved in data development: data warehouse design, ETL development, data services, etc.;
4. Experience in architecture design, model design, and large-scale performance tuning for large-scale data platforms or data warehouses is preferred.