Our mission is to help computational modelers at all levels engage in the establishment and adoption of community standards and good practices for developing and sharing computational models. Model authors can freely publish their model source code in the Computational Model Library alongside narrative documentation, open science metadata, and other emerging open science norms that facilitate software citation, reproducibility, interoperability, and reuse. Model authors can also request peer review of their computational models to receive a DOI.
All users of models published in the library must cite model authors when they use and benefit from their code.
Please check out our model publishing tutorial and contact us if you have any questions or concerns about publishing your model(s) in the Computational Model Library.
We also maintain a curated database of over 7,500 publications of agent-based and individual-based models, with detailed metadata on code availability and bibliometric information on the landscape of ABM/IBM publications, which we welcome you to explore.
Displaying 10 of 1154 results for "Ian M Hamilton"
Signaling chains are a special case of Lewis’ signaling games on networks. In a signaling chain, a sender tries to send a single unit of information to a receiver through a chain of players that do not share a common signaling system.
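To make the chain mechanic concrete, here is a minimal Python sketch (not the published model): each player holds its own arbitrary mapping from incoming to outgoing symbols, so no common signaling system exists, and transmission succeeds only if the receiver's act matches the sender's state. The binary state space, chain length, and fixed mappings are illustrative assumptions.

```python
import random

# Illustrative sketch, not the published model: a single unit of information
# (a world state) is passed along a chain of players, each with its own
# private symbol-to-symbol mapping, so meaning is not shared a priori.
N_STATES = 2          # assumed: binary state/signal space
CHAIN_LENGTH = 4      # assumed: sender -> two intermediaries -> receiver

def random_map():
    """A player's private mapping from incoming symbol to outgoing symbol."""
    return {s: random.randrange(N_STATES) for s in range(N_STATES)}

players = [random_map() for _ in range(CHAIN_LENGTH)]

def transmit(state):
    """Pass the sender's observed state through the chain; the final output
    is the receiver's act, which succeeds only if it matches the state."""
    symbol = state
    for mapping in players:
        symbol = mapping[symbol]
    return symbol

state = random.randrange(N_STATES)
print("success" if transmit(state) == state else "failure")
```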
The purpose of Hegmon’s Sharing model is to develop an understanding of the effect sharing strategies have on household survival.
We demonstrate how a simple model of community associated Methicillin-resistant Staphylococcus aureus (CA-MRSA) can be easily constructed by leveraging the statecharts and ReLogo capabilities in Repast Simphony.
How do households alter their spending patterns when they experience changes in income? This model answers this question using a random assignment scheme in which spending patterns are copied from a household already in the new income bracket.
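A minimal sketch of that random-assignment idea, assuming a simple bracket/spending-share representation (the data structure, category names, and values are illustrative, not taken from the model):

```python
import random

# Illustrative sketch: when a household's income moves into a new bracket,
# it copies the spending pattern of a randomly chosen household that is
# already in that bracket.
households = [
    {"bracket": 1, "spending": {"food": 0.40, "housing": 0.40, "other": 0.20}},
    {"bracket": 2, "spending": {"food": 0.25, "housing": 0.45, "other": 0.30}},
    {"bracket": 2, "spending": {"food": 0.30, "housing": 0.40, "other": 0.30}},
]

def change_income(household, new_bracket):
    peers = [h for h in households if h["bracket"] == new_bracket and h is not household]
    if peers:  # copy a randomly assigned peer's pattern; otherwise keep the old one
        household["spending"] = dict(random.choice(peers)["spending"])
    household["bracket"] = new_bracket

change_income(households[0], 2)
print(households[0])
```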
This model considers Peer Reviewing under the Influence of Impact Factor (PRIF), and its purpose is to explore whether this infamous metric affects the assessment of papers under review. The idea is to consider two types of reviewers: those who are agnostic towards the IF (IU1) and those who believe it is a measure of journal (and article) quality (IU2). This perception is reflected in the evaluation: the perceived scientific value of a paper becomes a function of the journal to which it has been submitted. Various mechanisms to update reviewer preferences are also implemented.
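A minimal sketch of the two reviewer types, assuming a simple linear weighting for IU2's Impact-Factor bias (the functional form and the weight w are illustrative assumptions, not the model's rules):

```python
import random

# Illustrative sketch of the IU1/IU2 distinction described above.
def perceived_value(true_quality, journal_if, reviewer_type, w=0.5):
    """Return a reviewer's perceived scientific value of a paper."""
    if reviewer_type == "IU1":        # agnostic towards the Impact Factor
        return true_quality
    if reviewer_type == "IU2":        # treats the journal's IF as a quality proxy
        return (1 - w) * true_quality + w * journal_if
    raise ValueError(reviewer_type)

paper_quality = random.uniform(0, 1)  # intrinsic quality of the submission
journal_if = 0.9                      # normalized Impact Factor of the venue (assumed)
print(perceived_value(paper_quality, journal_if, "IU2"))
```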
The purpose of the ABRam-BG model is to study belief dynamics as a potential driver of green (growth) transitions and to illustrate their dynamics in a closed, decentralized economy populated by utility-maximizing agents with an environmental attitude. The model is built on the ABRam-T model (https://doi.org/10.25937/ep45-k084) and introduces two types of capital – green (low carbon intensity) and brown (high carbon intensity) – with their respective technological progress levels. ABRam-BG simulates a green transition as an emergent phenomenon resulting from well-known opinion dynamics coupled to the economic process.
The model’s aim is to represent price dynamics under very simple market conditions, given the parameter values chosen by the user. We consider the market for a financial asset populated by zero-intelligence agents. In each period, a certain number of agents are randomly selected to participate in the market. Each of these agents decides, with equal probability, whether to propose a transaction (talk = 1) or not (talk = 0). Again with equal probability, each participating agent chooses the supply (ask) or the demand (bid) side of the market and proposes a volume of assets drawn randomly from a uniform distribution. The price granularity depends on various factors, including market conventions, the type of assets or goods being traded, and regulatory requirements. In some markets, high granularity is essential to capture small price movements accurately, while in others coarser granularity is sufficient given the nature of the assets or goods being traded.
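A minimal sketch of one trading period under these zero-intelligence assumptions (the price-update rule, tick size, and parameter values are illustrative, not taken from the model):

```python
import random

# Illustrative sketch of one trading period with zero-intelligence agents.
N_AGENTS = 100
N_PARTICIPANTS = 20     # agents randomly selected each period (assumed)
MAX_VOLUME = 10         # upper bound of the uniform volume draw (assumed)
TICK = 0.01             # price granularity; depends on market conventions

def one_period(price):
    ask_volume, bid_volume = 0, 0
    for _ in range(N_PARTICIPANTS):
        if random.random() < 0.5:          # talk = 0: agent proposes no transaction
            continue
        volume = random.randint(1, MAX_VOLUME)
        if random.random() < 0.5:          # equiprobable choice of market side
            ask_volume += volume           # supply (ask) side
        else:
            bid_volume += volume           # demand (bid) side
    # Assumed update rule: excess demand moves the price one tick per unit.
    return price + TICK * (bid_volume - ask_volume)

price = 100.0
for _ in range(5):
    price = one_period(price)
    print(round(price, 2))
```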
We model the relationship between natural resource users' individual time preferences and their use of destructive extraction methods in the context of small-scale fisheries.
The model simulates the process of widespread diffusion of something due to popularity (i.e., bandwagon) within an organization.
An algorithm implemented in NetLogo that can be used for searching for resources.