Krasnow Institute for Advanced Study
The Krasnow Institute seeks to expand understanding of mind, brain, and intelligence by conducting research at the intersection of the separate fields of cognitive psychology, neurobiology, and the computer-driven study of artificial intelligence and complex adaptive systems. These separate disciplines increasingly overlap and promise progressively deeper insight into human thought processes. The Institute also examines how new insights from cognitive science research can be applied for human benefit in the areas of mental health, neurological disease, education, and computer design.
Browsing Krasnow Institute for Advanced Study by Author "Axtell, Robert L."
Now showing 1 - 8 of 8
Item: Agent-Based Modeling in Intelligence Analysis (2012)
Frank, Aaron Benjamin; Axtell, Robert L.
The United States Intelligence Community (IC) was born out of the experiences and organization of the Office of Strategic Services during World War II and became a permanent fixture of the national security establishment with the passage of the National Security Act of 1947. Since its inception, there has been a strong fascination with the secret aspects of its work, particularly with respect to the clandestine collection of information and covert efforts to influence foreign governments and undermine rival intelligence services. By comparison, intelligence analysis, specifically the ways in which intelligence professionals develop and present assessments about the international system to policy makers, has been relatively ignored. As a result, intelligence analysis has remained largely under-theorized within the study of international relations, despite its prominent role in strategic thinking, receiving significant attention only in the aftermath of perceived failures.

Item: Agent-Based Simulation of Tax Reporting Compliance (2012-09-17)
Bloomquist, Kim Michael; Axtell, Robert L.
Following the global financial crisis of 2008, many national governments have a renewed urgency to collect taxes not paid by noncompliant taxpayers. However, despite decades of theoretical and applied research, progress has lagged on the development of computational tools to help tax administrators devise effective compliance improvement strategies. This study aims to bridge this gap by introducing the Individual Reporting Compliance Model (IRCM), an agent-based computational model that simulates tax reporting compliance in a community of 85,000 individual taxpayers, their employers, and tax preparers.
The model uses detailed tax return information yet maintains taxpayer anonymity by replacing actual tax returns with cases from the Statistics of Income (SOI) Public Use File [Weber 2004]. After reviewing the theoretical and empirical literature on taxpayer compliance, this study describes the development of the IRCM and demonstrates its capabilities in several simulation experiments.

Item: Coupled Dynamics of Labor and Firms through Complex Networks (2013-08-20)
Guerrero, Omar A.; Axtell, Robert L.
This dissertation bridges the gap between labor and firm dynamics through the study of complex networks in labor markets. With extensive use of large-scale employer-employee matched micro-data and agent-based modeling, we examine the effects that networked structures (between individuals or between firms) exert on labor outcomes and employment dynamics. The contributions of this work include: (i) the first characterization of a network of firms for an entire economy, connected through labor flows (i.e., labor flow networks); (ii) the study of the relationship between labor flow networks and employment dynamics; (iii) agent-based models that generate rich stylized facts about labor, firm, and social dynamics from microeconomic behavior; and (iv) microeconomic foundations for the formation of labor flow networks, obtained by coupling job search models with models of complex network formation. We show that the study of labor dynamics can be enriched by coupling it with firm dynamics. Agent-based modeling is a natural way to deal with the heterogeneous experiences of workers and firms while maintaining a simple representation of the labor market. Despite their simplicity, these models are grounded in empirical evidence obtained from large-scale micro-data and are capable of generating numerous stylized facts simultaneously. This approach has great potential for the design and evaluation of labor policies.
Governments, regulators, and policy makers would therefore benefit greatly from collecting large-scale labor micro-data, analyzing labor flow networks, and developing agent-based models of labor markets.

Item: Individual and Social Learning: An Implementation of Bounded Rationality from First Principles (2015)
Palmer, Nathan Michael; Axtell, Robert L.
This dissertation expands upon a growing economic literature that uses tools from reinforcement learning and approximate dynamic programming to impose bounded rationality in intertemporal choice problems. It contributes to this literature by applying these tools to the canonical problem of household consumption under uncertainty. The three essays explore individual and social approaches to learning to optimize, and how these may be brought to data.

Item: Innovation from a Computational Social Science Perspective: Analyses and Models (2013)
Casstevens, Randy M.; Axtell, Robert L.
Innovation processes are critical for preserving and improving our standard of living. While innovation has been studied by many disciplines, the focus has been on qualitative measures specific to a single technological domain. I adopt a quantitative approach to investigate underlying regularities that generalize across multiple domains. To better understand the innovation process, I use a novel approach that combines computational models with empirical data on software development, on one hand, and the evolution of the English lexicon on the other. Innovation can be viewed as the recombination and mutation of existing building blocks, and I focus on how building blocks are used to generate innovations. The building blocks are pieces of code (e.g., functions or objects) for the software development data and words for the written language.
These data lie at extremes of time scale: innovation in software occurs over the course of a few days or a week, while language evolution unfolds over decades or centuries. This allows the examination of innovation processes that range from highly constrained to completely open-ended. Computational methods reinforce the findings from the data analyses and permit exploration of the general features of innovation processes through the construction of abstract models.

Item: Modeling Adaptive Economic Agents With PID Controllers (2015)
Carrella, Ernesto; Axtell, Robert L.
I provide a counterpoint to the rational agents that dominate economics: rather than adding rigidities and information limits to an otherwise classical feed-forward agent, I build a new feedback agent that achieves equilibrium without knowledge of the model or the market it is in.

Item: The Blind Lawmaker (2013-08)
Koehler, Matthew; Axtell, Robert L.
Many have written about how the Common Law should evolve. The few attempts to demonstrate this empirically, however, have not found evidence that this evolution takes place. This study uses a representation of the Article III United States Federal Courts and an agent-based model to demonstrate that a judicial system may evolve while simultaneously emitting signals to the contrary, by evolving via a punctuated equilibrium dynamic. The study then demonstrates that agent-based modeling is a viable method for understanding the performance of judicial institutions.
After reviewing concepts of jurisprudence and computational social science, the development of the model is discussed, followed by a presentation of the results of the aforementioned experiments.

Item: Towards Emergent Social Complexity (2015)
Rouly, Ovi Chris; Axtell, Robert L.; Crooks, Andrew
Complexity science often uses generative models to study and explain the emergent behavior of humans, human culture, and human patterns of social organization. In spite of this, little is known about how the lowest levels of human social organization came into being. That is, little is known about how the earliest members of our hominini tribe transitioned from being presumably small groups of ape-like polygamous/promiscuous individuals (beginning perhaps as early as Ardipithecus or Australopithecus, after the time of the Pan-Homo split in the late Pliocene to early Pleistocene epochs) into family units having stable breeding bonds, extended families, and clans. What were the causal mechanisms (biological, possibly cognitive, social, environmental, etc.) responsible for this conversion? To confound the issue, it is also possible that the conversion process itself was a complex system replete with input sensitivities and path dependencies, i.e., a nested complex system. These processes and their distinctive social arrangements may favorably be referred to, as one notable anthropologist has called them, as "the deep structure of society." This dissertation describes applied research that used discrete event computer modeling techniques in an attempt to model, then understand, a few of the underlying social, environmental, and biological systems present at the root of human sociality and of social complexity.
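The Carrella abstract above names a concrete mechanism that a short sketch can illustrate: a feedback agent that steers a decision variable using only an observed error signal, with no model of the market it operates in. The sketch below is a hypothetical illustration, not code from the dissertation; the seller, the linear demand curve, and all controller gains are invented for the example.

```python
class PIDAgent:
    """Feedback seller: adjusts its price from the error between target
    and observed sales, with no knowledge of the demand curve."""

    def __init__(self, kp=0.1, ki=0.02, kd=0.0, price=20.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.price = price
        self.integral = 0.0
        self.prev_error = 0.0

    def adjust(self, target_sales, observed_sales):
        error = target_sales - observed_sales
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        # Selling less than the target (positive error) pushes the price down.
        self.price -= self.kp * error + self.ki * self.integral + self.kd * derivative
        self.price = max(self.price, 0.0)
        return self.price


def demand(price):
    # Hypothetical linear demand curve; the agent never sees this function.
    return max(100.0 - 4.0 * price, 0.0)


agent = PIDAgent()
for _ in range(200):
    sold = demand(agent.price)
    agent.adjust(target_sales=60.0, observed_sales=sold)
```

With these gains the price converges to the level at which demand equals the sales target (here, 10.0, where demand is 60 units), even though the agent never forms expectations or solves an optimization problem; the feedback loop alone does the equilibrating.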