883 results for Agent-based systems
Abstract:
We examined drivers of article citations using 776 articles published from 1990 to 2012 in a broad-based, high-impact social sciences journal, The Leadership Quarterly. These articles had 1,191 unique authors who, at the time of their most recent article in our dataset, had published 16,817 articles and received 284,777 citations in total. Our models explained 66.6% of the variance in citations and showed that quantitative, review, method, and theory articles were cited significantly more than qualitative articles or agent-based simulations. For quantitative articles, which constituted the majority of the sample, our model explained 80.3% of the variance in citations; certain methods (e.g., use of SEM) and designs (e.g., meta-analysis), as well as theoretical approaches (e.g., use of transformational, charismatic, or visionary-type leadership theories), predicted higher article citations. Regarding the statistical conclusion validity of quantitative articles, articles with endogeneity threats received significantly fewer citations than those using a more robust design or an estimation procedure that ensured correct causal estimation. We make several general recommendations on how to improve research practice and article citations.
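For readers who want to see the shape of such an analysis, a minimal sketch follows: an OLS regression of citation counts on article-type dummies. The file and column names are hypothetical, and this is an illustration of the general approach, not the authors' actual model specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per article, with a citation count,
# an article-type label, an author count and a publication year.
df = pd.read_csv("leadership_quarterly_articles.csv")

# Article type as a categorical predictor, with qualitative articles
# as the reference category.
model = smf.ols(
    "citations ~ C(article_type, Treatment('qualitative')) + n_authors + year",
    data=df,
).fit()

print(model.rsquared)   # share of variance explained (cf. 66.6% overall)
print(model.summary())  # per-type coefficients relative to qualitative articles
```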
Abstract:
Public administrations in advanced countries are carrying out initiatives to manage risk and emergency information and communication through websites conceived and designed for that purpose. These sites are intended to provide information to citizens in the event of an emergency, but they also contain information useful to experts and the authorities. In this paper, and in light of Spanish emergency legislation, the sites of the Catalan autonomous administration and of the Spanish government are compared with the sites of three reference countries: the United States, France and the United Kingdom. At the same time, a simple methodology is proposed for carrying out a comparison that makes it possible to draw conclusions and put forward recommendations on an aspect of information management that can prove key to saving property and human lives.
Abstract:
Test-based assessment tools are mostly focused on the use of computers. However, advanced information and communication technologies, such as handheld devices, open up the possibility of creating new assessment scenarios, increasing teachers' choices when designing tests appropriate to their subject areas. In this paper we use the term Computing-Based Testing (CBT) instead of Computer-Based Testing, as it better captures the emerging trends. Within the CBT context, the paper proposes an approach for "assessment in situ" activities, where questions have to be answered in front of a real space or location (in situ). In particular, we present the QuesTInSitu software implementation, which includes both an editor and a player based on the IMS Question and Test Interoperability specification and Google Maps. With QuesTInSitu, teachers can create geolocated questions and tests (routes), and students can answer the tests using mobile devices with GPS while following a route. Three illustrative scenarios, and the results from implementing one of them in a real educational situation, show that QuesTInSitu enables the creation of innovative, enriched and context-aware assessment activities. The results also indicate that using mobile devices and location-based systems in assessment activities helps students put explorative and spatial skills into practice and fosters their motivation, reflection and personal observation.
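As an illustration of the location-based triggering such a player needs, here is a minimal sketch: fire a geolocated question when the student's GPS fix falls within a trigger radius of the question's coordinates. The data layout, function names and the 30 m radius are assumptions, not QuesTInSitu's actual implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def due_questions(fix, route, radius_m=30.0):
    """Return the route questions within radius_m of the current GPS fix."""
    return [q for q in route
            if haversine_m(fix[0], fix[1], q["lat"], q["lon"]) <= radius_m]

# One hypothetical geolocated question on a route.
route = [{"id": "q1", "lat": 41.3851, "lon": 2.1734, "prompt": "Name this façade."}]
print(due_questions((41.3852, 2.1735), route))  # within ~15 m, so q1 fires
```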
Abstract:
Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (both existing and newly proposed) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data of the affected areas to the corresponding disaster management actors by means of a geographic information system (GIS) interface. This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.
Abstract:
Three-dimensional analysis of the entire jump sequence in ski jumping is recommended when studying kinematics or evaluating performance. Camera-based systems that allow three-dimensional kinematic measurement are complex to set up and require extensive post-processing, usually limiting ski jumping analyses to small numbers of jumps. In this study, a simple method using a wearable inertial sensor-based system is described to measure the orientation of the lower-body segments (sacrum, thighs, shanks) and skis during the entire jump sequence. This new method combines the fusion of inertial signals with biomechanical constraints of ski jumping. Its performance was evaluated in terms of validity and sensitivity to different performances based on 22 athletes monitored during daily training. The validity of the method was assessed by comparing the inclination of the ski with the slope at the landing point, showing an error of -0.2 ± 4.8°. Validity was also assessed by comparing characteristic angles obtained with the proposed system against reference values in the literature; the differences were smaller than 6° for 75% of the angles and smaller than 15° for 90% of the angles. Sensitivity to different performances was evaluated by comparing the angles between two groups of athletes with different jump lengths and by assessing the association between angles and jump length. The differences in technique observed between athletes, and their associations with jump length, agreed with the literature. In conclusion, these results suggest that this system is a promising tool for generalizing three-dimensional kinematic analysis in ski jumping.
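To make the sensor-fusion idea concrete, the following is a minimal sketch of one standard approach: a complementary filter that blends integrated gyroscope rate with the accelerometer's gravity-referenced pitch estimate. The paper's method additionally exploits ski-jumping biomechanical constraints; this simplified scalar version and its 0.98 blending weight are assumptions, not the authors' algorithm.

```python
import math

def complementary_pitch(gyro_y, acc_x, acc_z, dt, pitch, alpha=0.98):
    """One filter step: blend the integrated gyro rate (rad/s), which is
    accurate short-term, with the accelerometer's gravity-referenced pitch
    (rad), which is drift-free long-term."""
    pitch_gyro = pitch + gyro_y * dt        # short-term: integrate angular rate
    pitch_acc = math.atan2(-acc_x, acc_z)   # long-term: gravity direction
    return alpha * pitch_gyro + (1 - alpha) * pitch_acc

# Example: 100 Hz samples from one shank-mounted sensor (synthetic values).
pitch = 0.0
for gyro_y, acc_x, acc_z in [(0.10, -0.5, 9.8), (0.12, -0.6, 9.7)]:
    pitch = complementary_pitch(gyro_y, acc_x, acc_z, dt=0.01, pitch=pitch)
print(math.degrees(pitch))
```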
Abstract:
Use of ICN's Internet and data services has continued to increase exponentially, reflecting the capacity needed for greater access to high-speed Internet (broadband). Users are incorporating more web-based applications that use larger amounts of bandwidth, such as transmitting hospital MRIs, video streaming, and web-based systems.
Abstract:
Empirical studies indicate that the transition to parenthood is influenced by an individual's peer group. To study the mechanisms creating interdependencies across individuals' transitions to parenthood and their timing, we apply an agent-based simulation model. We build a one-sex model and provide agents with three characteristics: age, intended education and parity. Agents endogenously form their network based on social closeness. Network members may then influence an agent's transition to higher parity levels. Our numerical simulations indicate that accounting for social interactions can explain the shift of first-birth probabilities in Austria over the period 1984 to 2004. Moreover, we apply our model to forecast age-specific fertility rates up to 2016.
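The mechanism is easy to sketch. Below is a minimal agent-based toy in which agents link to their socially closest peers, and peers who already have children raise an agent's transition probability. All parameter values and the closeness rule are illustrative assumptions, not the calibrated Austrian model.

```python
import random

random.seed(1)
N = 200
agents = [{"age": random.randint(18, 40),
           "edu": random.choice([1, 2, 3]),   # intended education level
           "parity": 0} for _ in range(N)]

def closeness(a, b):
    # Socially closer when similar in age and intended education.
    return 1.0 / (1.0 + abs(a["age"] - b["age"]) + 2 * abs(a["edu"] - b["edu"]))

# Endogenous network: link each agent to its 5 closest peers.
network = {i: sorted((j for j in range(N) if j != i),
                     key=lambda j: -closeness(agents[i], agents[j]))[:5]
           for i in range(N)}

def step(base_rate=0.05, influence=0.10):
    """One simulated year: peers with children raise the transition hazard."""
    for i, a in enumerate(agents):
        peers_with_child = sum(agents[j]["parity"] > 0 for j in network[i])
        p = base_rate + influence * peers_with_child / len(network[i])
        if random.random() < p:
            a["parity"] += 1   # transition to (higher) parenthood

for year in range(10):
    step()
print(sum(a["parity"] > 0 for a in agents) / N)  # share having entered parenthood
```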
Abstract:
PURPOSE: We investigated the influence of beam modulation on treatment planning by comparing four available stereotactic radiosurgery (SRS) modalities: Gamma Knife Perfexion, Novalis Tx dynamic conformal arc (DCA) and dynamic multileaf collimation intensity-modulated radiotherapy (DMLC-IMRT), and CyberKnife. MATERIAL AND METHODS: Patients with arteriovenous malformations (n = 10) or acoustic neuromas (n = 5) were planned with the different treatment modalities. Paddick conformity index (CI), dose heterogeneity (DH), gradient index (GI) and beam-on time were used as dosimetric indices. RESULTS: Gamma Knife Perfexion achieved a high degree of conformity (CI = 0.77 ± 0.04) with limited low-dose spread (GI = 2.59 ± 0.10) surrounding an inhomogeneous dose distribution (DH = 0.84 ± 0.05), at the cost of treatment time (68.1 min ± 27.5). Novalis Tx DCA improved this inhomogeneity (DH = 0.30 ± 0.03) and treatment time (16.8 min ± 2.2) at the cost of conformity (CI = 0.66 ± 0.04), and Novalis Tx DMLC-IMRT improved the DCA conformity (CI = 0.68 ± 0.04) and inhomogeneity (DH = 0.18 ± 0.05) at the cost of low-dose spread (GI = 3.94 ± 0.92) and treatment time (21.7 min ± 3.4) (p < 0.01). CyberKnife achieved comparable conformity (CI = 0.77 ± 0.06) at the cost of low-dose spread (GI = 3.48 ± 0.47) surrounding a homogeneous (DH = 0.22 ± 0.02) dose distribution and treatment time (28.4 min ± 8.1) (p < 0.01). CONCLUSIONS: Gamma Knife Perfexion complies with all SRS constraints (high conformity while minimizing low-dose spread). Multiple focal entries (Gamma Knife Perfexion and CyberKnife) achieve better conformity than the high-definition MLC of Novalis Tx, at the cost of treatment time. Non-isocentric beams (CyberKnife) or IMRT beams (Novalis Tx DMLC-IMRT) spread more low dose than multiple isocenters (Gamma Knife Perfexion) or dynamic arcs (Novalis Tx DCA). Inverse planning and modulated fluences (Novalis Tx DMLC-IMRT and CyberKnife) deliver the most homogeneous treatment. Furthermore, linac-based systems (Novalis and CyberKnife) can perform image verification at the time of treatment delivery.
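For reference, the dosimetric indices compared above have standard published definitions (Paddick's conformity index and gradient index), sketched below. The volume inputs are illustrative numbers chosen to land near the values reported for Gamma Knife Perfexion, not data from the study.

```python
def paddick_ci(tv, piv, tv_piv):
    """Paddick conformity index: CI = TV_PIV^2 / (TV * PIV), where TV_PIV is
    the target volume covered by the prescription isodose, TV the target
    volume and PIV the prescription isodose volume."""
    return tv_piv ** 2 / (tv * piv)

def gradient_index(piv_half, piv):
    """Paddick gradient index: GI = PIV_50% / PIV, the volume of the
    half-prescription isodose over the prescription isodose volume."""
    return piv_half / piv

# Illustrative volumes in cm^3.
print(round(paddick_ci(tv=5.0, piv=5.6, tv_piv=4.65), 2))  # ~0.77, cf. Gamma Knife
print(round(gradient_index(piv_half=14.5, piv=5.6), 2))    # ~2.59, cf. Gamma Knife
```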
Abstract:
There is as yet no definitive theory for the mechanism by which the pattern of epidermal ridges on fingers, palms and soles, which forms friction ridge skin (FRS) patterns, is created. For a long time, growth forces in the embryonic epidermis have been believed to be involved in FRS formation. More recent evidence suggests that Merkel cells also play an important part in this process. Here we suggest a model for the formation of FRS patterns that links Merkel cells to the epidermal stress distribution. The Merkel cells are modeled as agents in an agent-based model; they move anisotropically, with the anisotropy created by the epidermal stress tensor. As a result, ridge patterns are created with pattern defects as they occur in real FRS patterns. As a consequence, we suggest why the topology of FRS patterns is indeed unique: the arrangement of pattern defects is sensitive to the initial configuration of Merkel cells.
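A minimal sketch of the modeling idea, under stated assumptions: agents take random steps biased along the principal axis of a local 2x2 stress tensor. The placeholder stress field and the bias strength are invented for illustration; the paper's full model also includes cell interactions and the resulting ridge formation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stress_tensor(x, y):
    # Placeholder smooth stress field; in the model this would come from
    # growth forces in the embryonic epidermis.
    return np.array([[1.0 + 0.5 * np.cos(y), 0.2],
                     [0.2, 1.0 + 0.5 * np.sin(x)]])

def anisotropic_step(pos, step=0.1, bias=0.8):
    """Move mostly along the principal stress direction, with some noise."""
    w, v = np.linalg.eigh(stress_tensor(*pos))
    axis = v[:, np.argmax(w)]                  # principal stress direction
    noise = rng.normal(size=2)
    noise /= np.linalg.norm(noise)
    d = bias * axis * rng.choice([-1, 1]) + (1 - bias) * noise
    return pos + step * d / np.linalg.norm(d)

pos = np.zeros(2)
for _ in range(100):
    pos = anisotropic_step(pos)
print(pos)  # endpoint of one cell's anisotropic random walk
```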
Abstract:
In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings.

While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they are not oriented toward helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what is available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools are not very scalable: a system-level method of analysis seldom works at the project level and vice versa.

In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded, and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective?

In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) roads are but one sub-system of a much larger 'Road Based Transportation System'; 2) the size and activity level of the overall system are determined by market forces; 3) the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation; and 4) the economic purpose of making road improvements is to minimize that total cost.

To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
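To make the accounting idea concrete, here is a minimal sketch of the total-cost comparison the method rests on: sum every cost of the road-based transportation system, then score a candidate improvement by the change in that total. The cost categories and dollar figures are illustrative placeholders, not data from the Iowa county model.

```python
def total_cost(system):
    """Total annualized cost of the road-based transportation system ($/yr)."""
    return sum(system.values())

baseline = {
    "road_construction_maintenance": 4.0e6,
    "vehicle_ownership_operation":   9.0e6,
    "travel_time":                   6.0e6,
    "crashes":                       1.5e6,
    "goods_in_transit_inventory":    0.8e6,
}

# A hypothetical paving project: road costs rise, but user costs fall more.
improved = dict(baseline,
                road_construction_maintenance=4.6e6,
                vehicle_ownership_operation=8.2e6,
                travel_time=5.7e6)

# Invest when the improvement reduces the total cost of the whole system.
print(total_cost(improved) - total_cost(baseline))  # negative => worthwhile
```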
Abstract:
In order to understand the development of non-genetically encoded actions during an animal's lifespan, it is necessary to analyze the dynamics and evolution of the learning rules producing behavior. Owing to the intrinsically stochastic and frequency-dependent nature of learning dynamics, these rules are often studied in evolutionary biology via agent-based computer simulations. In this paper, we show that stochastic approximation theory can help to qualitatively understand learning dynamics and formulate analytical models for the evolution of learning rules. We consider a population of individuals repeatedly interacting during their lifespan, in which the stage game faced by the individuals fluctuates according to an environmental stochastic process. Individuals adjust their behavioral actions according to learning rules belonging to the class of experience-weighted attraction learning mechanisms, which includes standard reinforcement and Bayesian learning as special cases. We use stochastic approximation theory to derive differential equations governing action play probabilities, which turn out to have qualitative features of mutator-selection equations. We then perform agent-based simulations to find the conditions where the deterministic approximation is closest to the original stochastic learning process for standard 2-action 2-player fluctuating games, where interaction between learning rules and preference reversal may occur. Finally, we analyze a simplified model for the evolution of learning in a producer-scrounger game, which shows that the exploration rate can interact in a non-intuitive way with other features of co-evolving learning rules. Overall, our analyses illustrate the usefulness of applying stochastic approximation theory to the study of animal learning.
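Since the abstract builds on the experience-weighted attraction (EWA) class, a minimal sketch of the standard EWA update (Camerer and Ho's formulation) may help: attractions decay at rate phi, the experience weight at rate rho, delta weights forgone payoffs, and play probabilities are a logit of attractions. The parameter values here are illustrative.

```python
import math

def ewa_update(A, n, chosen, payoffs, phi=0.9, rho=0.9, delta=0.5):
    """One EWA step. A: attractions per action; n: experience weight;
    payoffs: realized/forgone payoffs per action given the opponent's play."""
    n_new = rho * n + 1.0
    A_new = [(phi * n * A[k] +
              (delta + (1 - delta) * (k == chosen)) * payoffs[k]) / n_new
             for k in range(len(A))]
    return A_new, n_new

def choice_probs(A, lam=2.0):
    """Logit choice rule over attractions with sensitivity lam."""
    z = [math.exp(lam * a) for a in A]
    return [x / sum(z) for x in z]

# Two actions; the learner chose action 0 and observes both payoffs.
A, n = [0.0, 0.0], 1.0
A, n = ewa_update(A, n, chosen=0, payoffs=[1.0, 0.3])
print(choice_probs(A))  # reinforcement tilts play toward action 0
```

With delta = 0 this reduces to standard reinforcement learning (only the chosen action is reinforced), which is one of the special cases the abstract mentions.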
Abstract:
Based on previous work (Hemelrijk 1998; Puga-González, Hildenbrant & Hemelrijk 2009), we have developed an agent-based model and software, called A-KinGDom, which allows us to simulate the emergence of social structure in a group of non-human primates. The model includes dominance and affiliative interactions and incorporates two main innovations (preliminary dominance interactions and a kinship factor), which allow us to define four different attack and affiliative strategies. In accordance with these strategies, we compared the data obtained under four simulation conditions with the results obtained in a previous study (Dolado & Beltran 2012) involving empirical observations of a captive group of mangabeys (Cercocebus torquatus).
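As an illustration of the kind of dominance interaction such models build on, here is a minimal DomWorld-style sketch (after Hemelrijk 1998): the probability that an agent wins depends on relative dominance, and dominance values shift by the unexpected part of the outcome. The STEPDOM value and initial dominance values are illustrative assumptions, not A-KinGDom's actual parameters.

```python
import random

def dominance_interaction(dom, i, j, stepdom=0.5):
    """Update the dominance values of agents i and j after a contest."""
    p_i_wins = dom[i] / (dom[i] + dom[j])   # win chance from relative dominance
    win = 1.0 if random.random() < p_i_wins else 0.0
    change = stepdom * (win - p_i_wins)      # larger shift when an upset occurs
    dom[i] = max(dom[i] + change, 0.01)      # keep dominance values positive
    dom[j] = max(dom[j] - change, 0.01)

random.seed(2)
dom = {"a": 1.0, "b": 1.0, "c": 1.0}
for _ in range(200):
    i, j = random.sample(sorted(dom), 2)    # a random pair interacts
    dominance_interaction(dom, i, j)
print(dom)  # a dominance hierarchy self-organizes from identical agents
```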
Abstract:
The City of Marquette lies in the 65,000-acre Mississippi River watershed and is surrounded by steep bluffs. Though scenic, these bluffs make controlling water runoff during storm events a significant challenge. Flash flooding from the local watershed has plagued the city for decades. The people of Marquette have committed to preserving the water quality of key natural resources in the area, including Bloody Run Creek and its associated wetlands, by undertaking projects to control the spread of debris and sediment caused by excess runoff during area storm events. Following a July 2007 storm (over 8” of rain in 24 hours) that caused unprecedented flood damage, the City retained an engineering firm to study the area and provide recommendations to eliminate or greatly reduce uncontrolled runoff into the Bloody Run Creek wetland, infrastructure damage and personal property loss. Marquette has received an Iowa Great Places designation and has demonstrated its commitment to wetland preservation with the construction of Phase I of this water quality project. The Bench Area Storm Water Management Plan prepared by the City in 2008 made a number of recommendations to mitigate flash flooding by improving stormwater conveyance paths, detention, and infrastructure within the Bench area. Due to steep slopes and rocky geography, infiltration-based systems, though desirable, were not an option; surface-based systems were required instead. Runoff from the 240-acre watershed comes primarily from large, steep drainage areas to the south and west, flowing down to the Bench area along three hillside routes, designated South East, South Central and South West. Completion of Phase I, which included increasing the storage capacity of the upper pond, addressed the South East and South Central areas. The increased upper pond capacity will now allow Phase II to proceed. Phase II will address runoff from the South West drainage area, which engineers estimate produces as much water volume as the South Central and South East areas combined. Total costs for Phase I are $1.45 million, of which Marquette has invested $775,000 and IJOBS funding contributed $677,000. Phase II costs are estimated at $617,000. WIRB funding support of $200,000 would expedite project completion, lessen the long-term debt impact on the community, and aid in preserving Bloody Run Creek and the adjoining wetlands more quickly than Marquette could accomplish on its own.