In a pioneering study led by Cornell University, researchers explored algorithmic fairness in a two-player version of the classic game Tetris. The experiment produced a simple yet striking finding: players who received fewer turns during the game perceived their opponent as less likable, regardless of whether a human or an algorithm was responsible for allocating the turns.
This approach marked a significant shift from the traditional focus of algorithmic fairness research, which predominantly examines the algorithm or the decision itself. Instead, the Cornell study shed light on the relationships among the people affected by algorithmic decisions, a choice of focus driven by the real-world implications of AI decision-making.
“We are starting to see a lot of situations in which AI makes decisions on how resources should be distributed among people,” observed Malte Jung, associate professor of information science at Cornell University, who spearheaded the study. As AI becomes increasingly integrated into various aspects of life, Jung highlighted the need to understand how these machine-made decisions shape interpersonal interactions and perceptions. “We see more and more evidence that machines mess with the way we interact with each other,” he commented.
The Experiment: A Twist on Tetris
To conduct the study, Houston Claure, a postdoctoral researcher at Yale University, used open-source software to develop a modified version of Tetris. This new version, dubbed Co-Tetris, allowed two players to work together by taking alternating turns. Their shared goal was to manipulate the falling geometric blocks, stacking them neatly without leaving gaps and preventing the blocks from piling up to the top of the screen.
In a twist on the traditional game, an “allocator”—either a human or an AI—determined which player would take each turn. Turns were distributed so that a player received either 90%, 10%, or 50% of them.
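To make the allocation mechanic concrete, here is a minimal Python sketch of how such an allocator could assign each falling piece to one of two players according to a target split. This is an illustrative assumption, not the study's actual Co-Tetris implementation; the function name and parameters are hypothetical.

```python
import random


def allocate_turns(num_pieces, player_a_share, seed=None):
    """Assign each falling piece to player 0 ("A") or player 1 ("B").

    Illustrative sketch only -- not the researchers' Co-Tetris code.
    player_a_share is the fraction of turns given to player A,
    e.g. 0.9, 0.5, or 0.1 as in the study's three conditions.
    """
    rng = random.Random(seed)
    return [0 if rng.random() < player_a_share else 1 for _ in range(num_pieces)]


# Example: a 90/10 split over 40 pieces (the unequal condition).
schedule = allocate_turns(40, player_a_share=0.9, seed=42)
print("Player A turns:", schedule.count(0), "| Player B turns:", schedule.count(1))
```

The same function covers the 50/50 condition by passing `player_a_share=0.5`; whether a human or an AI produces the schedule, the players simply see whose turn comes next.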
The Concept of Machine Allocation Behavior
The researchers hypothesized that players receiving fewer turns would recognize the imbalance. However, what they didn’t anticipate was that players’ feelings towards their co-player would remain largely the same, regardless of whether a human or an AI was the allocator. This unexpected result led the researchers to coin the term “machine allocation behavior.”
The term refers to the observable behavior people exhibit in response to allocation decisions made by machines. It parallels the established phenomenon of “resource allocation behavior,” which describes how people react to decisions about resource distribution. The emergence of machine allocation behavior demonstrates how algorithmic decisions can shape social dynamics and interpersonal interactions.
Fairness and Performance: A Surprising Paradox
The study did not stop at perceptions of fairness; it also examined the relationship between allocation and gameplay performance. Here, the findings were somewhat paradoxical: fairness in turn allocation did not necessarily lead to better performance. In fact, equal allocation of turns often resulted in worse game scores than unequal allocation.
Explaining this, Claure said, “If a strong player receives most of the blocks, the team is going to do better. And if one person gets 90%, eventually they’ll get better at it than if two average players split the blocks.”
In our evolving world, where AI is increasingly integrated into decision-making processes across various fields, this study offers valuable insights. It provides an intriguing exploration of how algorithmic decision-making can influence perceptions, relationships, and even game performance. By highlighting the complexities that arise when AI intersects with human behaviors and interactions, the study prompts us to ponder crucial questions about how we can better understand and navigate this dynamic, tech-driven landscape.