Higher education is caught in an “assessment arms race”: students use AI for assignments, and universities develop new methods to counter it. This essay explains the dynamic through Niklas Luhmann’s theory of “double contingency,” showing how the unpredictable back-and-forth is not merely a problem but a constant driver of change. It explores how universities adapt and predicts a future of perpetual adaptation rather than a stable resolution.
The “assessment arms race” in higher education is a long-standing challenge that predates AI and has evolved alongside successive technologies. It refers to an ongoing escalation in which students find new ways to complete or circumvent assessments, and universities, in turn, develop more sophisticated pedagogical and evaluative methods to counter them. The contemporary phase is marked by students increasingly using artificial intelligence (AI) for academic assignments, and by universities responding to this novel tool. This back-and-forth, where each side’s actions depend on and react to the other’s, illustrates what the sociologist Niklas Luhmann called “double contingency.” It is a dynamic, unpredictable dance that paradoxically drives significant change and adaptation within the academic world.
Double Contingency: Understanding Unpredictable Interactions
Imagine two “black boxes”—individuals or complex systems like universities—trying to interact. They influence each other’s behavior, but neither can fully understand or predict what’s happening inside the other. This is double contingency. Early thinkers like Talcott Parsons saw this as a problem of needing shared understanding to make interactions stable. However, Luhmann went further, suggesting that this inherent unpredictability isn’t just a problem to be solved, but a fundamental source of dynamic change. Without a pre-existing agreement, this “pure circle of self-referential determination” introduces an element of chance and makes any apparent consensus fragile.
This unpredictability isn’t just an external issue; it’s a built-in feature of the system itself. The paradox is that while communication requires overcoming this uncertainty, resolving it doesn’t mean perfect harmony or mutual understanding in the traditional sense. Instead, the dynamic acts as a “catalyst,” forcing constant, often surprising, decisions. It allows a continually evolving social order to emerge, one in which instability itself becomes the foundation of stability.
Luhmann saw society not as a collection of individuals or actions, but as a system made of communications. Communication, in his view, is a three-part process: selecting information from many possibilities, choosing an intentional way to express it (utterance), and then interpreting the difference between what was said and the information conveyed (understanding). When these three elements successfully combine, they create “connections within the system.” This process is “autotelic,” meaning communication primarily serves to reproduce itself, not necessarily to achieve perfect understanding or an external goal.
Solving the problem of double contingency fundamentally involves forming systems by stabilizing “expectations” rather than specific behaviors. These expectations become crucial for new systems to form and to enable ongoing communication and action. The very act of one system observing another, and being observed in return, creates a self-referential loop. This often leads to the development of “trust” or “distrust”—essential strategies that allow social systems to overcome the anxiety of unpredictable interactions. Trust, in this context, isn’t just a feeling; it’s a fundamental structure that emerges from double contingency, allowing systems to form and continue despite inherent risks.
Luhmann’s groundbreaking idea of “interpenetration” helps explain how double contingency is even possible. Interpenetration happens when two systems share their own complexities with each other, allowing each to build upon the other’s capabilities. This allows for more freedom despite increased reliance, as systems find common ground (like shared actions) but interpret and connect them in ways specific to their own internal workings. This continuous, moment-by-moment processing of unpredictable interactions is how meaningful social order is constantly renewed.
Universities as Adapting Systems
From Luhmann’s perspective, universities are complex social systems—specifically, organizations within the broader systems of science and education. They maintain their identity by distinguishing themselves from their environment and reproducing their own operations. The “assessment arms race” is an internal example of double contingency within the university system, where student academic output and the institution’s evaluation of learning are mutually dependent and unpredictable.
Universities operate based on their own internal rules and distinctions, like “truth/untruth” in scientific research or “pass/fail” in assessment. They are “operationally closed,” meaning their core activities (teaching, research, assessment) continuously fuel more activities of the same kind. However, this internal closure requires “structural coupling” with their environment, which includes individual people (students, faculty) and other social systems (like the tech industry producing AI tools).
The “arms race” constantly “irritates” the university system. Student AI use and the university’s reactions are fleeting “events” that force the system to find stability by constantly adapting. This ongoing irritation stimulates the university’s internal operations, but it doesn’t dictate exactly how the university must change. To cope, the university must “self-observe,” understanding its own boundaries and shaping its reality based on its unique perspectives.
A university’s ability to adapt depends on its capacity to increase its internal complexity to match the “hypercomplex environment” of evolving AI capabilities and student strategies. This means developing more sophisticated internal structures, such as better assessment guidelines, flexible teaching methods, and improved faculty training. The current shift toward evaluating uniquely human cognitive processes in assessment is a direct response to the complexity introduced by AI. The goal is to maintain a “complexity differential,” where the university’s internal structures are intricate enough to manage, but not overwhelmed by, the external environment, thus ensuring its continued identity and ability to make choices.
Furthermore, the “arms race” exposes how fragile expectations within academia have become. Faculty expect original student work; students expect assessments to reflect their learning. AI disrupts both expectations, pushing each side into a “reflexive anticipation” in which it tries to guess what the other expects of it. This creates a need for new “structures of expectation,” which are essential for the university’s ongoing self-reproduction. As a social system, the university is continuously forced to adjust its fundamental models because its core “substance” (such as the integrity of its assessment process) is constantly changing and must be re-established.
Predicting the Future of the ‘Arms Race’
Based on Luhmann’s theory, we can make several predictions about where the “assessment arms race” is headed:
- Continuous Instability and Internal Drive: This “arms race” isn’t a temporary phase leading to a stable outcome. Instead, it’s a perpetually “restless” and “unpredictable” dynamic. The university’s response to AI (e.g., new detection methods) becomes another “irritation” that fuels further AI innovation by students, and vice versa. This mutual self-disruption is, ironically, the “only source of its stability.” Expect continuous cycles of adaptation rather than a final “solution.”
- Increased System Complexity and Specialization: The university system will likely become more complex internally to manage the external complexity of AI. This could lead to more specialized departments for AI-integrated teaching, academic integrity, or digital literacy. Functional systems, like education, tend to adopt “essentially unstable criteria,” constantly adapting rather than sticking to rigid standards.
- Shift from Direct Control to Managing Uncertainty: Trying to achieve perfect control (e.g., foolproof detection) will become increasingly pointless due to the inherent unpredictability of self-referential systems. Instead, universities will become better at managing “uncertainty” as a natural condition, rather than eliminating it. This means moving away from preventing negative outcomes and focusing more on creating adaptable learning environments that can absorb and channel constant “irritations.”
- Redefining Knowledge and Learning: The ongoing challenge from AI will likely force a fundamental re-evaluation of what “knowledge” and “learning” truly mean within the university. As AI excels at reproducing and recombining existing information, there will be a greater emphasis on uniquely human cognitive processes—like critical analysis, creative problem-solving, ethical reasoning, and the ability to navigate ambiguous, rapidly changing situations. Assessments will need to prioritize these skills, which are less easily replicated by current AI. This evolutionary pressure will drive pedagogical innovation, moving beyond rote learning to higher-order thinking.
- Temporal Decoupling and “Eigentime”: The university system will further develop its “eigentime,” its own internal pace for operations. This means the speed of policy changes, teaching innovations, and assessment cycles will increasingly diverge from the rapid, external pace of AI development. The system will build structures (like institutional memory and future expectations) to manage these different timeframes, allowing it to speed up or slow down its reactions independently.
- Evolution, Not Planning, Shapes the Future: Ultimately, the future of this “arms race” won’t be determined by rational planning or a predefined end goal, but by ongoing social evolution. Society cannot predict or plan its own future; it relies on “blind variation and selective retention”—trying things out and keeping what works. The evolution of the social system confirms itself, leading to continuous adaptation without necessarily achieving an optimal fit or complete control. This means that while specific problems will be addressed, the fundamental dynamic of mutual contingency and adaptation will persist, transforming the very nature of academic life in unpredictable ways. The “future is decided not by decision but by evolution.”