🤖 AI Summary
The proliferation of AI-generated survey papers has led to information overload, content redundancy, and factual inaccuracies, severely undermining scholarly credibility. This paper introduces the concept of "survey paper DDoS attacks" to characterize their systemic disruption of the knowledge ecosystem. We propose a Dynamic Live Survey (DLS) platform framework integrating LLM-driven automated updates, multi-dimensional quality auditing, quantitative trend analysis, and community-based collaborative review. Furthermore, we advocate an expert-led, AI-augmented authorship paradigm. Our work establishes the first integrated assessment-and-governance methodology specifically for AI-generated surveys. It enables the development of transparent, verifiable, and sustainably evolving dynamic survey knowledge bases, providing both theoretical foundations and actionable pathways to restore trustworthiness in academic survey literature.
📝 Abstract
Survey papers are foundational to the scholarly progress of research communities, offering structured overviews that guide both novices and experts across disciplines. However, the recent surge of AI-generated surveys, especially those enabled by large language models (LLMs), has transformed this traditionally labor-intensive genre into a low-effort, high-volume output. While such automation lowers entry barriers, it also introduces a critical threat: what we term the "survey paper DDoS attack" on the research community. This refers to the unchecked proliferation of superficially comprehensive but often redundant, low-quality, or even hallucinated survey manuscripts, which floods preprint platforms, overwhelms researchers, and erodes trust in the scientific record. In this position paper, we argue that the community must stop the mass upload of AI-generated survey papers (i.e., the survey paper DDoS attack) by instituting strong norms for AI-assisted survey writing. We call for restoring expert oversight and transparency in AI usage and, moreover, for developing new infrastructures such as Dynamic Live Surveys: community-maintained, version-controlled repositories that blend automated updates with human curation. Through quantitative trend analysis, quality audits, and discussion of cultural impact, we show that safeguarding the integrity of surveys is no longer optional but imperative for the research community.