Our mission is to help computational modelers at all levels engage in the establishment and adoption of community standards and good practices for developing and sharing computational models. Model authors can freely publish their model source code in the Computational Model Library alongside narrative documentation, open science metadata, and other emerging open science norms that facilitate software citation, reproducibility, interoperability, and reuse. Model authors can also request peer review of their computational models to receive a DOI.
All users of models published in the library must cite model authors when they use and benefit from their code.
Please check out our model publishing tutorial and contact us if you have any questions or concerns about publishing your model(s) in the Computational Model Library.
We also maintain a curated database of over 7500 publications of agent-based and individual-based models, with additional detailed metadata on code availability and bibliometric information on the landscape of ABM/IBM publications, which we welcome you to explore.
The model simulates the diffusion of four low-carbon energy technologies among households: photovoltaic (PV) solar panels, electric vehicles (EVs), heat pumps, and home batteries. We model household decision making as the decision making of one person, the agent. The agent decides whether to adopt these technologies. The model can thus be used to study co-adoption behaviour, going beyond traditional diffusion models that focus on the adoption of single technologies. The combination of these technologies is of particular interest because (1) using the energy generated by PV solar panels for EVs and heat pumps can reduce emissions associated with transport and heating, respectively, and (2) EVs, heat pumps, and home batteries can help to integrate PV solar panels into local electricity grids by offering flexible demand (EVs and heat pumps) and energy storage (home batteries and EVs), thereby reducing grid impacts and associated upgrading costs.
The purpose of the model is to represent realistic adoption and co-adoption behaviour. This is achieved by grounding the decision model in the risks-as-feelings model (Loewenstein et al., 2001) and theory from environmental and social psychology, and by empirically informing agent behaviour with survey data from 1469 people in the Swiss region of Romandie.
The model can be used to construct scenarios for the diffusion of the four low-carbon energy technologies depending on different contexts, and as a virtual experimentation environment for ex ante evaluation of policy interventions to stimulate adoption and co-adoption.
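As a rough illustration of how co-adoption can arise from per-technology decisions, the sketch below lets PV ownership raise the utility of the complementary technologies. The utility values, the synergy bonus, and the logistic choice rule are assumptions for illustration only; the published model grounds the decision in the risks-as-feelings framework and Swiss survey data.

```python
import math
import random

TECHS = ["pv", "ev", "heat_pump", "battery"]

def adoption_step(agent, base_utility, synergy=0.5):
    """One illustrative decision step per technology. Owning PV raises the
    utility of the complementary technologies, so co-adoption can emerge.
    All values and the choice rule are placeholder assumptions."""
    for tech in TECHS:
        if agent["owns"][tech]:
            continue
        u = base_utility[tech]
        if tech != "pv" and agent["owns"]["pv"]:
            u += synergy  # assumed complementarity bonus with PV
        p_adopt = 1.0 / (1.0 + math.exp(-u))  # assumed logistic choice rule
        if random.random() < p_adopt:
            agent["owns"][tech] = True

agent = {"owns": {t: False for t in TECHS}}
for _ in range(10):
    adoption_step(agent, base_utility={"pv": 0.2, "ev": -0.5,
                                       "heat_pump": -0.3, "battery": -1.0})
print(agent["owns"])
```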
The Megafaunal Hunting Pressure Model (MHPM) is an interactive, agent-based model designed to conduct experiments to test megaherbivore extinction hypotheses. The MHPM is a model of large-bodied ungulate population dynamics with human predation in a simplified, but dynamic grassland environment. The overall purpose of the model is to understand how environmental dynamics and human predation preferences interact with ungulate life history characteristics to affect ungulate population dynamics over time. The model considers patterns in environmental change, human hunting behavior, prey profitability, herd demography, herd movement, and animal life history as relevant to this main purpose. The model is constructed in the NetLogo modeling platform (Version 6.3.0; Wilensky, 1999).
The model’s aim is to represent price dynamics under very simple market conditions, given the values the user adopts for the model parameters. We suppose that the market for a financial asset is populated by zero-intelligence agents. In each period, a certain number of agents are randomly selected to participate in the market. Each of these agents decides, in an equiprobable way, between proposing to make a transaction (talk = 1) or not (talk = 0). Again in an equiprobable way, each participating agent decides to speak on the supply (ask) or the demand (bid) side of the market, and proposes a volume of assets drawn randomly from a uniform distribution. The granularity depends on various factors, including market conventions, the type of assets or goods being traded, and regulatory requirements. In some markets, high granularity is essential to capture small price movements accurately, while in others, coarser granularity is sufficient given the nature of the assets or goods being traded.
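A minimal Python sketch of one trading period under this description is given below. The parameter names and defaults (number of agents, participation probability, maximum volume) are illustrative assumptions, not values from the model.

```python
import random

def trading_period(n_agents=100, p_participate=0.5, max_volume=10):
    """One period of a zero-intelligence market as described above.
    All names and defaults are illustrative assumptions."""
    asks, bids = [], []
    participants = [a for a in range(n_agents) if random.random() < p_participate]
    for agent in participants:
        if random.random() < 0.5:        # talk = 0: propose no transaction
            continue
        volume = random.uniform(0, max_volume)  # proposed volume, uniform draw
        if random.random() < 0.5:        # equiprobable choice of market side
            asks.append((agent, volume))  # supply side
        else:
            bids.append((agent, volume))  # demand side
    return asks, bids

asks, bids = trading_period()
print(len(asks), "asks and", len(bids), "bids this period")
```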
The Archaeological Sampling Experimental Laboratory (tASEL) is an interactive tool for setting up and conducting experiments about sampling strategies for archaeological excavation, survey, and prospection.
This is an extended replication of Abelson and Bernstein’s early computer simulation model of community referendum controversies, which was originally published in 1963 and is often cited but seldom analysed in detail. This replication is in NetLogo 6.3.0, accompanied by an ODD+D protocol and class and sequence diagrams.
This replication replaces the original scales for attitude position and interest in the referendum issue, which were distributed between 0 and 1, with values initialised according to a normal distribution with mean 0 and variance 1. This makes simulation results more easily comparable with scales derived from empirical survey data, such as the European Values Study, where scales are often derived via factor analysis or principal component analysis from the answers to sets of questions.
Another difference is that this model is run not only for Abelson and Bernstein’s ten-week referendum campaign but for an arbitrary length of time, so that one can find out whether the distributions of attitude position and interest in the (still one-dimensional) issue stabilise in the long run.
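A small sketch of the changed initialisation is shown below; the field names are illustrative and not taken from the NetLogo source.

```python
import random

def init_agent():
    """Initialise one agent's attitude position and interest on the
    standard-normal scales used in this replication (mean 0, variance 1),
    instead of the original uniform [0, 1] scales."""
    return {
        "attitude_position": random.gauss(0.0, 1.0),
        "interest": random.gauss(0.0, 1.0),
    }

population = [init_agent() for _ in range(500)]
```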
This is an agent-based model constructed in NetLogo v6.2.2 that seeks to provide a simple but flexible tool for researchers and dog-population managers to help inform management decisions.
It replicates the basic demographic processes including:
* reproduction
* natural death
* dispersal
…
This model system aims to simulate the whole process of task allocation, task execution, and evaluation in a team system through a feasible method. Drawing on Complex Adaptive Systems (CAS) theory and Agent-based Modelling (ABM) technologies and tools, this simulation system abstracts real-world teams into MAS models. The author designs various task allocation strategies from different perspectives, and the interactions among members during the task-performing process are taken into account. Additionally, members can acquire knowledge through such interactions when they encounter tasks they cannot handle directly. An artificial computational team is constructed through ABM in this simulation system to replace real teams and carry out computational experiments. In all, this model system has great potential for studying team dynamics, and model explorers are encouraged to expand on it to develop richer models for research.
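One possible allocation-and-learning round consistent with this description is sketched below. The random allocation rule, the help-seeking step, and the learning rate are assumptions for illustration, not the author's specification.

```python
import random

def run_round(members, tasks, learn_rate=0.1):
    """One illustrative round: each task goes to a randomly chosen member
    (one of many possible allocation strategies). A member who cannot handle
    a task asks the most knowledgeable teammate and gains a little knowledge
    from the interaction. All rules here are placeholder assumptions."""
    for difficulty in tasks:
        worker = random.choice(members)
        if worker["knowledge"] >= difficulty:
            worker["completed"] += 1
        else:
            helper = max(members, key=lambda m: m["knowledge"])
            if helper["knowledge"] >= difficulty:
                worker["completed"] += 1  # solved with help
            # interaction transfers some knowledge to the struggling member
            worker["knowledge"] += learn_rate * (helper["knowledge"] - worker["knowledge"])

members = [{"knowledge": random.uniform(0, 1), "completed": 0} for _ in range(5)]
run_round(members, tasks=[random.uniform(0, 1) for _ in range(20)])
print([round(m["knowledge"], 2) for m in members])
```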
The Modern Wage Dynamics Model is a generative model of coupled economic production and allocation systems. Each simulation describes a series of interactions between a single aggregate firm and a set of households through both labour and goods markets. The firm produces a representative consumption good using labour provided by the households, who in turn purchase these goods as desired using wages earned, thus the coupling.
In each model iteration, the firm decides the wage, price, and labour hours requested. Given price and wage, households decide hours worked based on their utility function for leisure and consumption. A labour market construct chooses the minimum of hours requested and aggregate hours supplied. The firm then uses these inputs to produce goods. Given the hours actually worked, the households decide actual consumption, and a market chooses the minimum of goods supplied and aggregate demand. The firm uses information about consumption demand, gained by observing market transactions, to refine its conception of the population’s demand.
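The sketch below walks through one such iteration end to end. The labour-supply and demand rules are placeholder assumptions, not the model's actual functional forms; only the short-side (minimum) market clearing follows the description above.

```python
def iteration(firm, households):
    """One iteration of the firm-household loop described above.
    All functional forms are illustrative placeholders."""
    wage, price = firm["wage"], firm["price"]

    # Households choose hours worked from their leisure/consumption trade-off
    # (placeholder rule: a fixed fraction of available hours).
    hours_supplied = sum(h["max_hours"] * h["work_preference"] for h in households)

    # Labour market: the minimum of hours requested and aggregate hours supplied.
    hours_worked = min(firm["hours_requested"], hours_supplied)

    # The firm produces the representative good from the labour actually hired.
    goods_supplied = firm["productivity"] * hours_worked

    # Households spend wage income on goods; the goods market again clears at
    # the minimum of supply and aggregate demand.
    goods_demanded = wage * hours_worked / price
    goods_sold = min(goods_supplied, goods_demanded)

    # The firm uses the observed transactions to refine its conception of demand.
    firm["perceived_demand"] = goods_demanded
    return hours_worked, goods_sold

firm = {"wage": 10.0, "price": 8.0, "hours_requested": 400.0,
        "productivity": 1.5, "perceived_demand": 0.0}
households = [{"max_hours": 40.0, "work_preference": 0.6} for _ in range(20)]
print(iteration(firm, households))
```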
The purpose of this model is to explore the general behaviour of an economy with coupled production and allocation systems, as well as to explore the effects of various policies on wage and production, such as minimum wage, tax credits, unemployment benefits, and universal income.
…
The SIM-VOLATILE model is a technology adoption model at the population level. The technology in this model is the Volatile Fatty Acid Platform (VFAP), which sits within the framework of the circular economy. The technology is considered an emerging technology and is in the optimization phase. By adopting VFAP, waste-treatment plants will be able to convert organic waste into high-end products rather than focusing on the production of biogas. Moreover, there are three adoption/investment scenarios, as the technology enables the production of polyhydroxyalkanoates (PHA), single-cell oils (SCO), and polyunsaturated fatty acids (PUFA). However, due to differences in the processing associated with each product, waste-treatment plants need to choose one adoption scenario.
In this simulation, there are several parameters and variables. Agents are heterogeneous waste-treatment plants facing the problem of circular economy technology adoption. Since the technology is emerging, the adoption decision carries high risks. First, agents evaluate the economic feasibility of the emerging technology for each product (investment scenario). Second, they check the trend of adoption in their social environment (i.e. the local pressure for each scenario). Third, they combine these economic and social assessments with an environmental assessment, their environmental decision-value (i.e. their status on green technology). This combination gives the agent an overall adaptability fitness value (detailed for each scenario). If this value is above a certain threshold, agents may decide to adopt the emerging technology, which ultimately depends on their predominant adoption probabilities and market gaps.
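A minimal sketch of the three-part assessment and threshold rule is given below. The weighted-sum form, the weights, the 0-1 scaling, and the example scores are illustrative assumptions, not the model's calibration.

```python
def adaptability_fitness(economic, social, environmental, weights=(0.4, 0.3, 0.3)):
    """Combine the economic, social (local pressure), and environmental
    (decision-value) assessments into one fitness value for a scenario.
    The weighted-sum form and the weights are placeholder assumptions."""
    w_econ, w_soc, w_env = weights
    return w_econ * economic + w_soc * social + w_env * environmental

def choose_scenario(scenarios, threshold=0.6):
    """Return the investment scenario (PHA, SCO, or PUFA) with the highest
    fitness above the threshold, if any; scores here are scaled 0-1."""
    best_name, best_fit = None, threshold
    for name, (econ, soc, env) in scenarios.items():
        fit = adaptability_fitness(econ, soc, env)
        if fit >= best_fit:
            best_name, best_fit = name, fit
    return best_name

# Example plant: per-scenario (economic, social, environmental) scores.
scenarios = {"PHA": (0.7, 0.5, 0.8), "SCO": (0.6, 0.4, 0.8), "PUFA": (0.5, 0.9, 0.8)}
print(choose_scenario(scenarios))
```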
Policymakers decide on alternative policies while facing restricted budgets and an uncertain future. Designing public policies is made more difficult by the need to set priorities and to handle effects across policies. Housing policies, specifically, involve the heterogeneous characteristics of properties themselves, the intricacy of housing markets, and the spatial context of cities. We propose PolicySpace2 (PS2) as an adapted and extended version of the open source PolicySpace agent-based model. PS2 is a computer simulation that relies on empirically detailed spatial data to model real estate, along with labor, credit, and goods and services markets. Interactions among workers, firms, a bank, households and municipalities follow literature benchmarks to integrate economic, spatial and transport scholarship. PS2 is applied to a comparison among three competing public policies aimed at reducing inequality and alleviating poverty: (a) house acquisition by the government and distribution to lower income households, (b) rental vouchers, and (c) monetary aid. Within the model context, monetary aid, that is, smaller amounts of help for a larger number of households, makes the economy perform better in terms of production, consumption, reduction of inequality, and maintenance of financial duties. PS2 as such is also a framework that may be further adapted to a number of related research questions.