901 results for Rumen evacuation
Abstract:
Very large spatially referenced datasets, for example those derived from satellite-based sensors that sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over short time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms respond within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly when maximum likelihood methods are used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. Recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the dual-core processors found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic datasets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake dataset.
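To make the scaling claim concrete, here is a standard Gaussian-process sketch (textbook material, not taken verbatim from the paper): for n observations y with covariance matrix K(\theta), the exact log-likelihood is

\log L(\theta) = -\tfrac{1}{2}\log\lvert K(\theta)\rvert - \tfrac{1}{2}\, y^{\top} K(\theta)^{-1} y - \tfrac{n}{2}\log 2\pi,

which requires O(n^2) memory to store K and O(n^3) time for its Cholesky factorisation. Vecchia [1988]-style approximations replace the joint density with a product of cheap conditionals over small neighbour sets N(i) (illustrative notation),

p(y_1, \dots, y_n) \;\approx\; \prod_{i=1}^{n} p\bigl(y_i \mid y_{N(i)}\bigr),

whose factors can be evaluated independently and hence in parallel across processor cores.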
Abstract:
Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or man-made disasters. Warning dissemination plays a key role in the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official' warning channels (e.g. sirens, mass media), unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) have proven significant in warning the public of the need to evacuate. Although post-evacuation studies identify evacuees as disseminators of the warning message, there has been no detailed study quantifying the effects of such behaviour on warning dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of this behaviour, as an unofficial channel, on the overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time before warning neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on the overall warning dissemination.
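A minimal Python sketch of this kind of agent-based warning-diffusion model may help fix ideas; everything below (the line topology, parameter names and update rule) is an illustrative assumption, not the paper's actual implementation:

import random

def simulate_warning(n=100, p_official=0.02, p_warn_neighbour=0.2,
                     warn_delay=2, steps=30, seed=0):
    """Toy ABS: n households on a line. An official channel reaches each
    unwarned household with probability p_official per step; households
    warned at least warn_delay steps ago warn each unwarned neighbour
    with probability p_warn_neighbour per step."""
    rng = random.Random(seed)
    warned_at = {}  # household index -> step at which it was first warned
    for t in range(steps):
        # Official channel (e.g. sirens, mass media).
        for h in range(n):
            if h not in warned_at and rng.random() < p_official:
                warned_at[h] = t
        # Unofficial channel: warned households pass the message on.
        for h, t0 in list(warned_at.items()):
            if t - t0 >= warn_delay:
                for nb in (h - 1, h + 1):
                    if (0 <= nb < n and nb not in warned_at
                            and rng.random() < p_warn_neighbour):
                        warned_at[nb] = t
    return len(warned_at) / n  # fraction warned by the end of the horizon

print(simulate_warning(p_warn_neighbour=0.0))  # official channel only
print(simulate_warning(p_warn_neighbour=0.2))  # plus neighbour warnings

Because neighbour warnings can only add to the warned set, raising p_warn_neighbour from zero can only increase the expected fraction warned within the horizon, which is the qualitative effect the paper quantifies.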
Abstract:
With the advent of Internet technologies, developers of algorithm animation systems (AAS) have shifted to building online systems, which offer platform independence and open accessibility compared with earlier systems. As a result, there is ongoing research into redesigning and re-evaluating AAS in order to transform them into task-oriented environments for algorithm design in online mode. The experimental study reported in the present paper contributes to this research.
Abstract:
The problem of adapting teaching systems to the teacher has not been extensively covered in the specialised literature. The authors present the client-server architecture of a Task-Oriented Environment for Design of Virtual Labs (TOEDVL). The paper focuses on the computational models supporting its base of tasks (BT) and on two groups of behavioural tutor models for planning training sessions. Detailed examples are presented.
Abstract:
In this paper, we propose an unsupervised methodology to automatically discover pairs of semantically related words by highlighting their local environment and evaluating their semantic similarity in local and global semantic spaces. This proposal differs from previous research in that it tries to take the best of two different methodologies, i.e. semantic space models and information extraction models. It can be applied to extract close semantic relations, it limits the search space, and it is unsupervised.
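As a hedged illustration (the paper's own similarity measure is not specified in this abstract), semantic-space models typically score a candidate word pair by the cosine similarity of their context vectors,

\mathrm{sim}(w_1, w_2) = \frac{\vec{v}_{w_1} \cdot \vec{v}_{w_2}}{\lVert \vec{v}_{w_1} \rVert \, \lVert \vec{v}_{w_2} \rVert},

which can be computed both in a global space built from the whole corpus and in a local space restricted to the words' shared environment.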
Abstract:
Computing the similarity between two protein structures is a crucial task in molecular biology and has been extensively investigated. Many protein structure comparison methods can be modeled as maximum weighted clique problems in specific k-partite graphs, referred to here as alignment graphs. In this paper we present both a new integer programming formulation for solving such clique problems and a dedicated branch and bound algorithm for solving the maximum cardinality clique problem. Both approaches have been integrated into VAST, a software for aligning protein 3D structures widely used at the National Center for Biotechnology Information, in place of its original clique solver, which uses the well-known Bron and Kerbosch algorithm (BK). Our computational results on real protein alignment instances show that our branch and bound algorithm is up to 116 times faster than BK.
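For orientation, a hedged sketch of the standard integer programming formulation of maximum weighted clique (generic textbook form; the paper's new formulation is not reproduced here): on a graph G = (V, E) with vertex weights w_v, introduce binary variables x_v indicating whether vertex v is selected, and solve

\max \sum_{v \in V} w_v\, x_v \quad \text{subject to} \quad x_u + x_v \le 1 \;\; \forall \{u, v\} \notin E, \qquad x_v \in \{0, 1\},

so that any feasible selection is pairwise adjacent, i.e. a clique. In a k-partite alignment graph, vertices within the same part are non-adjacent, so at most one vertex per part can be chosen.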
Abstract:
Rumen Rumenov Dangovski, Kalina Hristova Petrova - We consider the number of self-avoiding walks of fixed length on the integer lattice. We complete the analysis of the case of a strip of width one. Using combinatorial arguments, we obtain an exact formula for the number of walks on a strip bounded on the left and on the right, and we also study the formula asymptotically.
Abstract:
AMS subject classification: 68Q22, 90C90
Abstract:
2000 Mathematics Subject Classification: 68T50.
Abstract:
2000 Mathematics Subject Classification: 47H04, 65K10.
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during the disaster. Managing these factors appropriately is critical to reducing suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and showed that there is room for improvement for Mexican organisations in flood management.
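As a hedged illustration of the kind of optimisation at the core of such preparedness models (a generic facility-location sketch under assumed notation, not the thesis's actual formulation): with candidate facilities j, opening costs f_j, demand points i and assignment costs c_{ij}, one standard variant reads

\min \sum_{j} f_j\, y_j + \sum_{i,j} c_{ij}\, x_{ij} \quad \text{subject to} \quad \sum_{j} x_{ij} = 1 \;\forall i, \qquad x_{ij} \le y_j \;\forall i, j, \qquad x_{ij}, y_j \in \{0, 1\},

where y_j opens facility j and x_{ij} assigns demand point i to it; stock prepositioning and the number of participating actors would enter such a model as additional variables and constraints.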
Abstract:
The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results cautioned against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, and coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across any of the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and its statistical form that homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to the actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more time period for a revised hurricane forecast. A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated with existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes in regard to the timing of the evacuation decision was achieved.
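A hedged sketch of the optimal-stopping structure described here (generic dynamic-programming notation, not the essay's calibrated model): writing s_t for the forecast information available in period t, the household's value function satisfies

V_t(s_t) = \min\Bigl\{ C^{\mathrm{evac}}_t(s_t),\; \mathbb{E}\bigl[ V_{t+1}(s_{t+1}) \mid s_t \bigr] \Bigr\},

with evacuation optimal in period t exactly when the immediate evacuation cost C^{\mathrm{evac}}_t falls below the expected cost of waiting one more period for a revised forecast.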
Abstract:
Understanding who evacuates and who does not has been one of the cornerstones of research on the pre-impact phase of both natural and technological hazards. Its history is rich in descriptive illustrations focusing on lists of characteristics of those who flee to safety. Early models of evacuation focused almost exclusively on the relationship between whether warnings were heard and ultimately believed and evacuation behavior. How people came to believe these warnings, and even how they interpreted them, were not incorporated. In fact, the individual seemed almost removed from the picture, with analysis focusing exclusively on external measures. This study built and tested a more comprehensive model of evacuation that centers on the decision-making process, rather than on decision outcomes. The model focused on three important factors that alter and shape the evacuation decision-making landscape: individual-level indicators, which exist independently of the hazard itself and act as cultural lenses through which information is heard, processed and interpreted; hazard-specific variables, which relate directly to the specific hazard threat; and risk perception. The ultimate goal is to determine what factors influence the evacuation decision-making process. Using data collected for 1998's Hurricane Georges, logistic regression models were used to evaluate how well the three main factors help our understanding of how individuals come to their decision either to flee to safety during a hurricane or to remain in their homes. The results of the logistic regression were significant, emphasizing that the three broad types of factors tested in the model influence the decision-making process. Conclusions drawn from the data analysis focus on how decision-making frames differ for those who can be designated "evacuators" and for those in evacuation zones.
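For concreteness, a hedged sketch of the kind of logit specification this implies (the factor grouping is the study's; the notation is generic and assumed): the probability that household i evacuates is modelled as

\Pr(\text{evacuate}_i = 1) = \frac{1}{1 + \exp\bigl(-(\beta_0 + \beta_1^{\top} x^{\mathrm{ind}}_i + \beta_2^{\top} x^{\mathrm{haz}}_i + \beta_3\, r_i)\bigr)},

where x^{ind}_i collects the individual-level indicators, x^{haz}_i the hazard-specific variables, and r_i measured risk perception.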
Abstract:
Unplanned pregnancy is experienced by millions of women worldwide. It increases the risk of abortion-related morbidity and mortality, which represents a serious public health problem. This study aims to evaluate the advances and challenges of the implementation of Humanized Abortion Care at the Maternity School in Natal, state of Rio Grande do Norte. The research was evaluative, preceded by an evaluability study, and resulted in a case study. The intentional sample totaled 102 subjects (60 users, 39 professionals and 3 managers). The collection techniques included documental analysis, semi-structured interviews and observation with a field diary. The documental analysis was descriptive, while Bardin's Content Analysis was used for the semi-structured interviews and the field diary. The evaluability study found that Humanized Abortion Care is an evaluable program, with its logical model, matrix of indicators and evaluative questions prepared and agreed upon. The case study showed that users were satisfied with the problem-solving capacity of the service and with access to it; however, they pointed out inadequacies in the environment, in qualified listening and in reproductive planning. Professionals attributed the service's shortcomings to infrastructure and environment, considered insufficient and inadequate for humanized care, especially regarding patient accommodation, the lack of hospital beds, the reduced number of rooms in the surgical center and the lack of a laboratory inside the maternity hospital. Moreover, reproductive planning is not an institutionalized practice in the service, and there is no integration with other services or partnership with the community. The maternity hospital's board emphasizes that excessive patient demand is one of the reasons hindering the appropriate implementation of the technical standard. We conclude that, although users are satisfied with the problem-solving capacity of the service and its ease of access, there is room for improvement in qualified listening, in the creation of a system to promote teamwork, and in the implementation of an ombudsman service and satisfaction surveys. The right of shared choice between users and health professionals with regard to the uterine evacuation procedure did not prevail. Environment was the category most often mentioned as requiring change, being seen as a limiting factor for the development of humanized and welcoming practices. Health professionals have not established a periodic routine of planning practices, and such practices are not aligned with the technical standard. Incorporation of the guidelines and the availability of a plurality of methods and choices for family planning are required. There is no institutionalized referral and counter-referral system, nor partnerships with the community, which makes comprehensive care unviable. The standard needs to be included in managers' action plans as one of the priorities in the construction of care strategies for women's health, in order to enable, together with other initiatives, real integration among the safe abortion service, the primary care network and social organizations. In this way, respect for human rights and adequate humanized care, as a form of attention to and prevention of abortion, can be secured.