Deng Honglin: Rejecting “Calculations”: How to Promote the Good Use of Algorithms?
Wed, Apr 09, 2025
As a core Internet technology, algorithms are now deeply embedded in mobile applications, drawing on massive amounts of data to deliver information to users through personalized recommendation, ranking and selection, retrieval and filtering, and scheduling and decision-making.
Yet the attendant problems, such as information cocoons, the rights and interests of workers in new forms of employment, and big-data-enabled price discrimination, should not be overlooked. Data should not be cold, and algorithms should not become "calculations" used to take advantage of others. So how can we promote the ethical application of algorithms?
Earlier, Meituan and Ele.me both stated that they would establish algorithm transparency mechanisms to make platform rules more open. Meituan, for instance, announced new progress in "canceling overtime deductions for riders," replacing the "overtime deduction" policy for crowdsourced riders in pilot cities with a points-based system.
Deng Honglin, an associate professor at the Institute for Advanced Management Studies of Tongji-SEM, said that many platform algorithms are designed with the primary goal of increasing traffic and user stickiness, and that this commercial orientation often leads the optimization process to overlook potential social risks such as discrimination, bias, or misleading consumers. Finding a precise balance between commercial needs and social responsibility, so as to ensure that algorithms are fair and transparent, has therefore become a core challenge in algorithm governance.
In addition, public recognition of and trust in algorithmic governance is an important prerequisite for making such governance routine; without it, the implementation of relevant policies may be hindered. Establishing a transparent, fair, and explainable algorithmic governance mechanism, and actively communicating and interacting with the public, are therefore key to building that trust.