Contact

Email: maples7@163.com
Phone: 183*****579
WeChat: maples7_lq
Location: Soochow, China

Education

2012.9 - 2016.6
Shandong University - Electrical Engineering and Automation (Excellence Program) - Bachelor
2009.7 - 2012.6
Changjun High School - Science Experimental Class

Profile

I'm a back-end web engineer working daily with Node.js, Python, and Elixir, with an exploratory interest in Rust and Swift.

I love creating value with code and digging deep into new technologies. I regularly read good technical books and blogs to improve myself, and I also keep a technical blog of my own.

I am a proponent of open source; when I come across a good idea, I work on implementing it or try to build my own "wheels".

In high school, I won a provincial second prize in the NOIP (National Olympiad in Informatics in Provinces), which laid a solid foundation for my knowledge of data structures and algorithms.

In my spare time, I love to read, and as a MOOC enthusiast I enjoy time spent studying on my own.

Chinese Version

cv.maples7.com

Li Qi

Back-End Web Developer

Work Experience

Seniverse
Head of Back-End R&D
2017.6 - Now
As the back-end leader, I work with the CTO to design and implement the complete architecture of the backend weather data pipeline, from data collection to data distribution. The data processing side covers the management of processing task flows and the storage schemes for different data types - that is, how to process and store data quickly and efficiently - and is built mainly on the Python stack. The data distribution side is about building a stable API system that can handle highly concurrent requests, including optimizing the full request path from SLB down to the microservices, as well as an official website with a self-service ordering system, built mainly on the Node.js and Elixir stacks.
  • API v4 - Hogwarts: A completely new API architecture, designed and implemented to replace the older v3 system and to address its user-experience and concurrency problems. The underlying services were largely redesigned and rewritten in Elixir (previously Node.js), separating gateway logic from the stateless data services. For usability, the API implements Google+-style Partial Response, so clients can request any combination of fields to be returned (a minimal field-filtering sketch follows this list). For performance, internal services communicate over Erlang's native RPC; data storage was redesigned around OSS, with meteorological data normalized into a standard binary matrix layout, and multi-level caching improves access latency. For stability, the system was designed and built as a Cloud Native application, deployed on an Alibaba Cloud Kubernetes cluster with automatic scaling. For development efficiency, we treat data flow through the system at the level of "meta-programming" and "meta-flow" wherever possible, designing unified specifications and standards to support higher-level data flow needs, so developers mostly build platforms and rules, while individual data products can be validated and shipped through simple configuration. I was primarily responsible for most of the system architecture, standards, and specifications, and worked with the members of my group to complete the development.
  • Official Website v3 - Hansel: Based primarily on the NestJS framework, I used TypeScript to design and implement the official website's backend services, which support both the user-facing site and the admin backend. Its modules include Organization, User, Order, Transaction, Product Definition, and Product Instance. From the initial technology selection onward, I was personally responsible for setting up the service, organizing the code, implementing most of the modules, and taking the service through its official launch.
  • Data Acquisition System - Scorpio: For weather station data, I used Celery/RabbitMQ task queues in Python 3 to implement distributed crawling, pre-screening, fusion of different data sources, post-processing standardization, and storage into the database (a minimal Celery sketch follows this list). Along the way, StatsD/Graphite monitors task processing status and key system metrics, and the program is packaged for deployment with Twitter's PEX tool. I was responsible for the full design and implementation of this system. For professional grid data, Airflow manages the ETL task flows, and I implemented and maintain most of those flows.
  • Kong Gateway-based Microservice Architecture - Halo: Built on the open-source gateway Kong, with custom Lua plugins developed to fit our own business, the goal being to separate user logic from stateless data interface logic and make the system easier to maintain and scale. Its main functions include request parameter validation, user authentication, logging of user-related requests, usage statistics and billing deductions, and rate limiting. It also includes a simple time series database implementation for charting access volume at different time granularities. The final service is deployed on the Kubernetes cluster. I was responsible for most of the plugin development, maintenance, and iteration.
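The Partial Response behavior mentioned under API v4 - Hogwarts can be illustrated with a minimal sketch. The production logic lives in Elixir inside Hogwarts; the Python below is only a hypothetical illustration of the idea, assuming a flat "fields" query parameter of comma-separated, dot-delimited field paths.

    # Minimal sketch of Google+-style Partial Response field filtering.
    # Hypothetical illustration only; the real Hogwarts implementation is in
    # Elixir and supports a richer fields syntax.
    def filter_fields(payload, fields):
        """Return only the requested fields from a nested dict.

        `fields` is a comma-separated list such as "location.name,now.temperature".
        """
        result = {}
        for path in [f.strip() for f in fields.split(",") if f.strip()]:
            src, dst = payload, result
            keys = path.split(".")
            for i, key in enumerate(keys):
                if not isinstance(src, dict) or key not in src:
                    break  # silently ignore unknown fields
                if i == len(keys) - 1:
                    dst[key] = src[key]
                else:
                    src = src[key]
                    dst = dst.setdefault(key, {})
        return result

    if __name__ == "__main__":
        data = {
            "location": {"id": "city-123", "name": "Suzhou"},
            "now": {"text": "Sunny", "temperature": "23"},
        }
        # e.g. GET /weather/now?fields=location.name,now.temperature
        print(filter_fields(data, "location.name,now.temperature"))
        # -> {'location': {'name': 'Suzhou'}, 'now': {'temperature': '23'}}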
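The Scorpio crawl pipeline described above can be sketched with Celery over RabbitMQ. This is only a minimal, hypothetical sketch: the broker URL, task names, and task bodies are placeholders, not the production code.

    # Minimal Celery sketch of a crawl -> fuse -> store pipeline over RabbitMQ.
    # Broker URL, task names, and bodies are hypothetical placeholders.
    from celery import Celery, chain

    app = Celery("scorpio_sketch", broker="amqp://guest:guest@localhost:5672//")

    @app.task
    def fetch(source_url):
        # Crawl one upstream weather source and return its raw records.
        return {"source": source_url, "records": []}

    @app.task
    def fuse(raw):
        # Pre-screen the records and fuse them with other data sources.
        return raw["records"]

    @app.task
    def store(records):
        # Standardize the fused records and write them into the database.
        return len(records)

    def run(source_url):
        # Each step can run on any available worker; results flow through the chain.
        return chain(fetch.s(source_url), fuse.s(), store.s()).apply_async()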
Duoyi Network
Back-End Web Developer
2016.3/4/7 - 2017.5
I was mainly responsible for web backend development on the Node.js technology stack, involving the Express/Koa frameworks, MySQL, MongoDB, Redis, etc.
  • Game BBS: In the early stage I wrote the Activity and Tag modules; later I took over the entire project on my own, covering new feature development, daily maintenance, server deployment, releases, performance optimization, and the full process from writing code to going live. The project is based on an Express + MySQL + Redis architecture and consists of two parts: a front-end service responsible for simple page logic and rendering, and a back-end service that exposes a RESTful JSON API, which also serves the mobile app. This project gave me a deeper understanding of Node.js best practices around documentation, automated testing, logging, unified response formats, MVC layering, and sensible use of caches.

Open Source
  • barn: A CLI tool that generates and deploys resumes to GitHub Pages automatically - think of it as Hexo for resumes. Needless to say, this resume was generated with it.
  • Others: kong-config-manager, clean-promise, node-redis-cache, etc. See my GitHub for more.

Skills & Expertise

Node.js / Elixir / Git
Python / TypeScript / C
Erlang / Docker / Shell
Kubernetes / DevOps
Hosted by Coding Pages & GitHub Pages