How the internet can rebuild trust

Algorithms and generative AI models that decide what billions of users see should be transparent
The writer is co-founder of Wikipedia and author of ‘The Seven Rules of Trust’

When I founded Wikipedia in 2001, pioneers of the internet were excited by its promise to give the world access to truth and connection.

Two decades later, that optimism has curdled into cynicism. We scroll through feeds serving up news we no longer believe, interact with bots we cannot identify and brace for the next synthetic scandal created by fake images from artificial intelligence.

Before the web can move forward, it must remember how it earned trust in the first place.

The defining difference between web 1.0 and the platforms that dominate today is not technological sophistication but moral architecture. Early online communities were transparent about process and purpose. They exposed how information was created, corrected and shared. That visibility generated accountability. People could see how the system worked and participate in fixing its mistakes. Trust emerged not from perfection (there was still plenty of online trolling, flame wars and toxicity), but from openness.

Today’s digital landscape reverses that logic. Recommendation algorithms and generative AI models decide what billions of users see, yet their workings remain opaque. When platforms insist their systems are too complex to explain, users are asked to substitute faith for understanding.

AI intensifies the problem. Large language models can produce fluent paragraphs and convincing deepfakes. The tools that promised to democratise knowledge now threaten to make knowledge unrecognisable. If everything can be fabricated, the distinction between truth and illusion becomes a matter of persuasion.

Re-establishing trust in this environment requires more than fact-checking or content moderation. It requires structural transparency. Every platform that mediates information should make provenance visible: where data originated, how it was processed, and what uncertainty surrounds it. Think of it as nutritional labelling for information. Without it, citizens cannot make informed judgments and democracies cannot function.

Equally important is independence. As AI companies fight for dominance, the temptation to embed bias — commercial, political or cultural — into training data will be immense. Guardrails must ensure the entities curating public knowledge are accountable to the public, not just investors.

And we must revive civility too. Some of the best early online spaces relied on norms that valued reasoned argument over insult. They were imperfect but self-correcting because participants felt a duty to the collective project. Today’s social platforms monetise outrage. Restoring trust means designing systems that reward good-faith discourse — through visibility algorithms, community-based moderation, or friction that forces reflection before reposting.

Governments have a role to play but regulation alone cannot rebuild trust. It has to be observed in practice. Platforms should disclose not only how their algorithms work but also when they fail. AI developers should publish dataset sources and error rates.

The challenge of our time is not that information is scarce but that authenticity is. Important aspects of the early internet succeeded because people could trace what they read to another human being, even if the other human being was operating behind a pseudonym. The new internet must restore that chain of custody.

We are entering an era when machines can mimic any voice and invent any image. If we want truth to survive that onslaught, we must embed transparency, independence and empathy into the digital architecture itself. The early days of the web showed it could be done. The question is whether we still have the will to do it again.
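The "nutritional labelling for information" idea in the article can be made concrete as a small machine-readable provenance record attached to each piece of content. The sketch below is purely illustrative: the class name, field names and example values are assumptions for demonstration, not any platform's actual schema or an industry standard.

```python
# A minimal sketch of a provenance "nutrition label" for a piece of
# content, covering the three things the article asks platforms to
# disclose: where data originated, how it was processed, and what
# uncertainty surrounds it. All names here are hypothetical.
from dataclasses import dataclass, asdict
import json


@dataclass
class ProvenanceLabel:
    origin: str            # where the underlying material came from
    processing: list       # transformation steps applied, in order
    uncertainty: str       # plain-language caveat shown to readers

    def to_json(self) -> str:
        # Serialise so the label can travel with the content itself.
        return json.dumps(asdict(self), indent=2)


label = ProvenanceLabel(
    origin="wire-service copy (hypothetical example)",
    processing=["machine translation", "LLM summarisation"],
    uncertainty="summary has not been human-reviewed",
)
print(label.to_json())
```

A real scheme would also need the label to be tamper-evident (for example, cryptographically signed by each party in the chain of custody), which is exactly the "chain of custody" the article argues the new internet must restore.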

Copyright notice: the copyright of this article belongs to FT中文网. Without permission, no organisation or individual may reproduce, copy, or otherwise use all or part of this article; infringement will be pursued.
