In June 2024, Apple introduced its AI platform, Apple Intelligence, and framed it at its annual developer conference as going “beyond” standard AI through deep hardware integration and personalization. Over roughly the next 18 months, Apple repeatedly marketed new launches through this branded lens: each new iPhone was said to be “designed from the ground up for Apple Intelligence,” new chips were “tailored” for on-device Apple Intelligence performance, and ads promised a “more personal Siri” and reinvented app experiences.

That narrative shifted sharply last week. On Jan. 12, Apple announced a multiyear partnership with Google under which Gemini models will power the future of Apple Intelligence; the deal was negotiated last fall, and Gemini is slated to support a long-planned overhaul of Siri. In effect, much of what iPhone users ultimately interact with may be supplied by Google. In a joint statement, the companies said Apple had concluded that Google’s AI offered the “most capable foundation” for Apple Foundation Models, a major win for Alphabet’s AI business and an implicit concession by Apple, one that also stands in contrast to Apple’s earlier selection of OpenAI to underpin Apple Intelligence’s ChatGPT integration.

In practice, Apple Intelligence has lagged its hype: image generation and editing trailed long-available third-party tools, Siri often failed to answer even basic questions, and email and text summaries were frequently unhelpful or unexpectedly inaccurate. The shortfall is clearest in the ChatGPT integration, which functions like “ChatGPT with extra clicks”: double-tap to open Siri, type a query, wait for Siri to ask whether to relay it, tap “Use ChatGPT,” then wait again for an answer. Apple has precedents for depending on outside software (Google Search as Safari’s default, Google Maps on the original iPhone, Adobe Photoshop and Microsoft Office on the iPad), but outsourcing a large share of its ecosystem’s AI foundation to Mountain View feels qualitatively different.

2026-01-20 (Tuesday)