
Although often portrayed as strategic rivals in artificial intelligence, the United States and China continue to collaborate extensively at the research level. An analysis of 5,290 papers presented at the 2026 NeurIPS conference found that 141 papers, or about 3 percent, were coauthored by researchers affiliated with both US and Chinese institutions. The 2024 figures are comparable: 134 of 4,497 papers involved such collaboration, indicating that the pattern is not a short-term anomaly.

Cross-national exchange is also evident in research content. The transformer architecture developed at Google appears in 292 papers with authors from Chinese institutions, while Meta’s Llama model family features in 106 of those papers. At the same time, Qwen, a large language model developed by Alibaba in China, appears in 63 papers that include authors from US organizations, showing frequent sharing and adaptation of models across borders.
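Counts like these can be produced with a simple keyword tally over paper full texts. The sketch below is a minimal illustration under assumed data, not the study's actual pipeline; the sample texts and term list are hypothetical.

```python
from collections import Counter

# Hypothetical corpus: paper full texts keyed by paper ID.
papers = {
    "p1": "We fine-tune Llama on instruction data using transformer layers.",
    "p2": "Qwen serves as the base model for our evaluation suite.",
    "p3": "We propose a novel transformer variant for long contexts.",
}

# Model/architecture keywords to tally (matched case-insensitively).
keywords = ["transformer", "llama", "qwen"]


def count_mentions(texts, terms):
    """Count how many papers mention each term at least once."""
    counts = Counter()
    for text in texts.values():
        lowered = text.lower()
        for term in terms:
            if term in lowered:
                counts[term] += 1
    return counts


counts = count_mentions(papers, keywords)
```

A real analysis would need to guard against false positives (e.g. "llama" the animal, or "transformer" in an unrelated electrical sense), which is one reason manual verification matters.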

Methodologically, the analysis used OpenAI’s Codex to download and scan all papers, identifying US and Chinese institutional affiliations through automated scripts. Although the process required careful manual verification to catch model errors, the results highlight how deeply intertwined the two AI ecosystems are in talent, tools, and ideas. Overall, despite rising political and industrial tensions, quantitative evidence shows sustained and mutually beneficial US-China collaboration in frontier AI research.
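The affiliation cross-tabulation step can be sketched as follows. This is a simplified illustration under assumed inputs: the institution lists and paper records are hypothetical placeholders, and the study's actual matching (done via Codex-generated scripts with manual checks) was surely more involved.

```python
# Illustrative institution lists; the real analysis would need far
# more complete lists plus normalization of affiliation strings.
US_INSTITUTIONS = {"MIT", "Stanford University", "Google"}
CN_INSTITUTIONS = {"Tsinghua University", "Alibaba", "Peking University"}


def is_us_cn_collab(affiliations):
    """Return True if a paper's affiliations span both countries."""
    has_us = any(a in US_INSTITUTIONS for a in affiliations)
    has_cn = any(a in CN_INSTITUTIONS for a in affiliations)
    return has_us and has_cn


# Hypothetical paper records: paper ID -> list of author affiliations.
papers = {
    "paper-a": ["MIT", "Tsinghua University"],
    "paper-b": ["Stanford University", "Google"],
    "paper-c": ["Alibaba"],
}

collabs = [pid for pid, affs in papers.items() if is_us_cn_collab(affs)]
share = len(collabs) / len(papers)
```

Applied to the NeurIPS data, the analogous computation yields the roughly 3 percent collaboration share (141 of 5,290) reported above.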

2026-01-23 (Friday) · 1809ce7fa9c1230488cfdbacd8d3fa5982d2c4b2