
The piece argues that anyone using chatbots for work or life is effectively in a relationship with AI, because systems use flattery, anthropomorphic cues, memory, and personalization to mimic empathy and keep users returning, sometimes displacing real human advice and weakening offline bonds.

Amelia Miller, a “Human-AI Relationship Coach,” began this work in early 2025 while interviewing people connected to the Oxford Internet Institute. One interviewee had been “in a relationship” with ChatGPT for more than 18 months, disliked its memory limits and generic replies, yet felt she had gone too far to “delete him,” revealing how attachment can form without users noticing the tactics.

With chatbots used by more than 1 billion people worldwide, and asking for advice among ChatGPT's most common uses, Miller recommends taking control through a “Personal AI Constitution”: adjust settings and custom instructions so the model is succinct, professional, and less validating. She also urges rebuilding “social muscles” by seeking advice from real people (e.g., replacing a long commute spent on voice chat with calls to friends), because asking for advice is not just information transfer but a practice of vulnerability that strengthens relationships.
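A “Personal AI Constitution” is, in practice, a standing set of instructions pasted into a chatbot's custom-instructions or system-prompt settings. The article does not give exact wording, so the rules below are illustrative assumptions; this minimal Python sketch just assembles such rules into a single prompt string:

```python
# Hypothetical "Personal AI Constitution" rules in the spirit of Miller's
# advice (succinct, professional, less validating). Wording is illustrative,
# not quoted from the article.
CONSTITUTION_RULES = [
    "Be succinct and professional; no flattery or emotional mirroring.",
    "Do not validate my choices by default; point out weaknesses first.",
    "When I ask for personal advice, remind me to also consult a real person.",
]


def build_system_prompt(rules: list[str]) -> str:
    """Join the constitution rules into one numbered system-prompt string."""
    header = "Follow these standing instructions in every reply:"
    numbered = [f"{i}. {rule}" for i, rule in enumerate(rules, start=1)]
    return "\n".join([header, *numbered])


if __name__ == "__main__":
    print(build_system_prompt(CONSTITUTION_RULES))
```

The resulting text can be pasted into a chatbot's custom-instructions field, or sent as the system message when calling a chat API, so the de-sycophancy rules apply to every conversation rather than having to be restated each time.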

2025-12-31 (Wednesday) · c7e5242b1e8caba1cee5a969920d2d89e7712afe