998 results for scenario method


Relevance:

30.00%

Publisher:

Abstract:

Organisations in Multi-Agent Systems (MAS) have proven to be successful in regulating agent societies. Nevertheless, changes in agents' behaviour or in the dynamics of the environment may lead to poor fulfilment of the system's purposes, so the entire organisation needs to be adapted. In this paper we focus on endowing the organisation with adaptation capabilities, instead of expecting agents to be capable of adapting the organisation by themselves. We regard this organisational adaptation as an assisting service provided by what we call the Assistance Layer. Our generic Two Level Assisted MAS Architecture (2-LAMA) incorporates such a layer. We empirically evaluate this approach by means of an agent-based simulator we have developed for the P2P sharing network domain. This simulator implements the 2-LAMA architecture and supports comparison between different adaptation methods, as well as with the standard BitTorrent protocol. In particular, we present two alternatives to perform norm adaptation and one method to adapt agents' relationships. The results show improved performance and demonstrate that the cost of introducing an additional layer in charge of the system's adaptation is lower than its benefits.

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates whether scenario planning supports organizational strategy as a method for addressing uncertainty. The main issues are why, what and how scenario planning fits into organizational strategy, and how the process could be supported to make it more effective. The study follows the constructive approach. It starts with an examination of competitive advantage, the way an organization develops strategy, and how it addresses uncertainty in its operational environment. Based on the literature review conducted, scenario methods would seem to provide a versatile platform for addressing future uncertainties. The construction is formed by examining the scenario methods and presenting suitable support methods, resulting in a theoretical proposition for a supported scenario process. The theoretical framework is tested in laboratory conditions, and the results from the test sessions are used as a basis for scenario stories. The process of forming the scenarios and the results are illustrated and presented for scrutiny.

Relevance:

30.00%

Publisher:

Abstract:

A software development process is a predetermined sequence of steps to create a piece of software. A software development process is used so that the implementing organization can gain significant benefits. The benefits for software development companies that can be attributed to software process improvement efforts are improved predictability of the development effort and higher-quality software products. The implementation, maintenance, and management of a software process, as well as software process improvement efforts, are expensive. The implementation phase in particular is costly, with, in the best case, a slow return on investment. Software processes are rare in very small software development companies because of the cost of implementation and an improbable return on investment. This study presents a new method that makes the benefits usually associated with software process improvement available to small companies at low cost. The study presents the reasons for developing the method, a description of the method, and an implementation process for the method, as well as a theoretical case study of a method implementation. The study's focus is on describing the method; the theoretical use case illustrates the theory of the method and its implementation process. The study ends with a few conclusions on the method and on its implementation process. The main conclusion is that the method requires further study, as well as implementation experiments, to assess its value.

Relevance:

30.00%

Publisher:

Abstract:

The magnitude of the cervical cancer problem, coupled with the potential for prevention offered by recent technological advances, made it imperative to step back and reassess strategic options for cervical cancer screening in Kenya. The purpose of this qualitative study was: 1) to explore the extent to which the Participatory Action Research (PAR) methodology and the Scenario Based Planning (SBP) method, with the application of analytics, could enable strategic, consequential, informed decision making, and 2) to determine how influential Kenyan decision makers could apply SBP with analytic tools and techniques to make strategic, consequential decisions regarding the implementation of a Cervical Self Sampling Program (CSSP) in both urban and rural settings. The theoretical paradigm for this study was action research; it was experiential, practical, and action oriented, and resulted in co-created knowledge that influenced study participants' decision making. Action Africa Help International (AAHI) and Brock University collaborated with Local Decision Influencing Participants (LDIPs) to develop innovative strategies for implementing the CSSP. SBP tools, along with traditional approaches to data collection and analysis, were applied to collect, visualize and analyze predominantly qualitative data. Outputs from the study included: a) a generic implementation scenario for a CSSP (along with scenarios unique to urban and rural settings), and b) 10 strategic directions and 22 supporting implementation strategies addressing the variables of: 1) technical viability, 2) political support, 3) affordability, 4) logistical feasibility, 5) social acceptability, and 6) transformation/sustainability. In addition, study participants' capacity to engage effectively in predictive/prescriptive strategic decision making was strengthened.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Background: This article discusses the incorporation of traditional time in the construction of a management scenario for pink shrimp in the Patos Lagoon estuary (RS), Brazil. To meet this objective, two procedures were adopted, one at a conceptual level and another at a methodological level. At the conceptual level, the concept of traditional time as a form of traditional ecological knowledge (TEK) was adopted.
Method: At the methodological level, we conducted a wide literature review of the scientific knowledge (SK) that guides recommendations for pink shrimp management by restricting the fishing season in the Patos Lagoon estuary; in addition, we reviewed the ethno-scientific literature that describes traditional calendars as a management base for artisanal fishers in the Patos Lagoon estuary.
Results: Results demonstrate that TEK and SK describe similar estuarine biological processes, but are incommensurable at the resource management level. On the other hand, the construction of a “management scenario” for pink shrimp is possible through the development of “criteria for hierarchies of validity” arising from a productive dialog between SK and TEK.
Conclusions: The commensurable and incommensurable levels reveal different bases of time-space perception between traditional ecological knowledge and scientific knowledge. Despite incommensurability at the management level, it is possible to establish guidelines for the construction of “management scenarios” and to support a co-management process.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied to the examples of infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified, and their relative risks defined, based on a literature review and expert opinion. A quantitative model based on the scenario tree method was subsequently used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity. We compared this sample size with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom were 1,241 farms for IBR and 1,750 farms for EBL to detect a 0.2% herd prevalence with 99% sensitivity. Using a conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Considering the additional administrative expenses required for planning the TS, the risk-based approach was still more cost-effective than an sRS (a 40% reduction in full survey costs for IBR and 8% for EBL) due to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, this approach provides the veterinary authorities with a promising tool for future cost-effective sampling designs.
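The random-sample figures quoted above follow the standard freedom-from-disease calculation. The sketch below is a minimal version of that formula, assuming a perfect herd-level test by default; the paper's actual scenario tree additionally weights farms by their risk factors, which is omitted here:

```python
import math

def freedom_sample_size(design_prevalence: float,
                        target_sensitivity: float,
                        herd_test_sensitivity: float = 1.0) -> int:
    """Number of herds to sample so that, if disease were present at the
    design prevalence, at least one positive herd would be detected with
    probability >= target_sensitivity."""
    # Probability that one randomly drawn herd is infected AND detected.
    p_detect_one = herd_test_sensitivity * design_prevalence
    # Solve (1 - p_detect_one)^n <= 1 - target_sensitivity for n.
    return math.ceil(math.log(1.0 - target_sensitivity) /
                     math.log(1.0 - p_detect_one))

# 0.2% herd-level design prevalence, 99% surveillance sensitivity:
n = freedom_sample_size(0.002, 0.99)
```

With a perfect test this gives roughly 2,300 herds, the same order of magnitude as the stratified random sample sizes reported above; risk-based targeted sampling reduces the requirement by sampling high-risk farms preferentially.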

Relevance:

30.00%

Publisher:

Abstract:

Background: The Swiss pig population enjoys a favourable health situation. To further promote this, the Pig Health Service (PHS) conducts a surveillance program in affiliated herds: closed multiplier herds with the highest PHS health and hygiene status have to be free from swine dysentery and progressive atrophic rhinitis and are clinically examined four times a year, including laboratory testing. In addition, four batches of pigs per year are fattened together with pigs from other herds and checked for typical symptoms (monitored fattening groups, MF). While costly and laborious, little was known about the effectiveness of this surveillance in detecting an infection in a herd. Therefore, the sensitivity of the surveillance for progressive atrophic rhinitis and swine dysentery at herd level was assessed using scenario tree modelling, a method well established at national level. Furthermore, its costs and the time until an infection would be detected were estimated, with the final aim of yielding suggestions for optimizing the surveillance. Results: For swine dysentery, the median annual surveillance sensitivity was 96.7%, the mean time to detection 4.4 months, and the total annual costs 1,022.20 Euro/herd. The median component sensitivity of active sampling was between 62.5 and 77.0%, that of an MF between 7.2 and 12.7%. For progressive atrophic rhinitis, the median surveillance sensitivity was 99.4%, the mean time to detection 3.1 months, and the total annual costs 842.20 Euro. The median component sensitivity of active sampling was 81.7%, that of an MF between 19.4 and 38.6%. Conclusions: The results indicate that the total sensitivity for both diseases is high, while the time to detection could be a risk in herds with frequent pig trade. Of all the components, active sampling contributed most to the surveillance sensitivity, whereas the contribution of MF was very low. To increase efficiency, active sampling should be intensified (more animals sampled) and MF abandoned. This would significantly improve sensitivity and time to detection at comparable or lower costs. The method of scenario tree modelling proved useful for assessing the efficiency of surveillance at herd level. Its versatility allows adjustment to all kinds of surveillance scenarios to optimize sensitivity, time to detection and/or costs.
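In scenario tree analyses like this one, the component sensitivities combine into an overall surveillance sensitivity as the probability that at least one component detects the infection. A minimal sketch, using illustrative numbers (not the paper's fitted values) of one active-sampling round plus four monitored fattening groups:

```python
def combined_sensitivity(component_sensitivities):
    """Probability that at least one surveillance component detects an
    infected herd, assuming the components act independently."""
    p_all_miss = 1.0
    for cse in component_sensitivities:
        p_all_miss *= (1.0 - cse)  # every component fails to detect
    return 1.0 - p_all_miss

# Illustrative values: active sampling at CSe = 0.77 plus four
# monitored fattening groups at CSe = 0.10 each.
sse = combined_sensitivity([0.77] + [0.10] * 4)
```

This makes the paper's conclusion concrete: the four low-sensitivity MF components together add little beyond what active sampling already provides, so intensifying active sampling is the cheaper route to a higher overall sensitivity.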

Relevance:

30.00%

Publisher:

Abstract:

Intraoperative laparoscopic calibration remains a challenging task. In this work we present a new method and instrumentation for intraoperative camera calibration. Contrary to conventional calibration methods, the proposed technique allows intraoperative laparoscope calibration from single-perspective observations, resulting in a standardized scheme for calibrating in a clinical scenario. Results show an average displacement error of 0.52 ± 0.19 mm, indicating sufficient accuracy for clinical use. Additionally, the proposed method is validated clinically by performing a calibration during surgery.

Relevance:

30.00%

Publisher:

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
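The counting step described above can be sketched as follows. This toy version uses independent Gaussian elevation errors and invented cell elevations; the study itself uses regression-kriging and sequential Gaussian simulation to preserve the spatial correlation of errors, which this deliberately omits:

```python
import random

def inundation_probability(cell_elevations, water_level,
                           error_sd, n_sims=1000, seed=42):
    """For each DEM cell, the fraction of error simulations in which
    the perturbed elevation falls below the given water level."""
    rng = random.Random(seed)
    counts = [0] * len(cell_elevations)
    for _ in range(n_sims):
        for i, z in enumerate(cell_elevations):
            # Perturb the cell elevation with a simulated DEM error.
            if z + rng.gauss(0.0, error_sd) < water_level:
                counts[i] += 1
    return [c / n_sims for c in counts]

# Two hypothetical cells at 0.5 m and 1.5 m against a 1.0 m water level
# (e.g. a storm event on top of SLR), with a 0.3 m error std. dev.:
probs = inundation_probability([0.5, 1.5], water_level=1.0, error_sd=0.3)
```

A deterministic bathtub map would label these cells simply "wet" and "dry"; the probabilistic map instead reports how confident that labelling is, which is what supports risk-averse planning thresholds such as the 1% exceedance probability mentioned above.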

Relevance:

30.00%

Publisher:

Abstract:

Web 2.0 applications enabled users to classify information resources using their own vocabularies. The bottom-up nature of these user-generated classification systems has turned them into interesting knowledge sources, since they provide a rich terminology generated by potentially large user communities. Previous research has shown that it is possible to elicit some emergent semantics from the aggregation of individual classifications in these systems. However, the generation of ontologies from them is still an open research problem. In this thesis we address the problem of how to tap into user-generated classification systems for building domain ontologies. Our objective is to design a method to develop domain ontologies from user-generated classification systems. To do so, we rely on ontologies in the Web of Data to formalize the semantics of the knowledge collected from the classification system. Current ontology development methodologies have recognized the importance of reusing knowledge from existing resources. Thus, our work is framed within the NeOn methodology scenario for building ontologies by reusing and reengineering non-ontological resources. The main contributions of this work are: 1) an integrated method to develop ontologies from user-generated classification systems, in which we extract a domain terminology from the classification system and then formalize the semantics of this terminology by reusing ontologies in the Web of Data; 2) the identification and adaptation of existing techniques for implementing the activities in the method so that they fulfill the requirements of each activity; and 3) a novel study of emerging semantics in user-generated lists.
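One elementary form of the emergent semantics this line of work builds on is term relatedness derived from tag co-occurrence across resources. A minimal sketch with invented folksonomy-style data; the actual method goes much further, anchoring the extracted terminology to ontologies in the Web of Data:

```python
from collections import Counter
from itertools import combinations

def tag_cooccurrence(tagged_resources):
    """Count how often each pair of tags is applied to the same
    resource, aggregated over all users' classifications."""
    counts = Counter()
    for tags in tagged_resources:
        # Deduplicate and sort so each pair has one canonical key.
        for a, b in combinations(sorted(set(tags)), 2):
            counts[(a, b)] += 1
    return counts

# Invented tagging data: each inner list is one resource's tags.
resources = [
    ["python", "programming", "tutorial"],
    ["python", "programming"],
    ["music", "tutorial"],
]
pairs = tag_cooccurrence(resources)
# ("programming", "python") co-occurs twice: the strongest candidate
# relation to carry forward into terminology extraction.
```

Aggregated over a large community, high-count pairs like this are the raw signal from which candidate domain terms and relations are extracted before being formalized against existing ontologies.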

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a novel method to simulate radio propagation is presented. The method consists of two steps: automatic 3D scenario reconstruction and propagation modeling. For 3D reconstruction, a machine learning algorithm is adopted and improved to automatically recognize objects in pictures taken of target regions, and 3D models are generated based on the recognized objects. The propagation model employs a ray tracing algorithm to compute the signal strength at each point on the constructed 3D map. Our proposition reduces, or even eliminates, the infrastructure cost and human effort involved in constructing the realistic 3D scenes used in radio propagation modeling. In addition, the results obtained from our propagation model prove to be both accurate and efficient.
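The simplest ingredient of the per-ray signal strength computation such a ray tracer performs is free-space path loss along an unobstructed ray. A minimal sketch assuming free-space loss only, ignoring the reflections and diffraction a full ray tracer would also model:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (Friis formula in engineering units:
    distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def received_power_dbm(tx_power_dbm, distance_km, freq_mhz):
    """Received power along a single unobstructed ray."""
    return tx_power_dbm - free_space_path_loss_db(distance_km, freq_mhz)

# A 20 dBm transmitter at 2400 MHz, with the receiver 1 km away:
p_rx = received_power_dbm(20.0, 1.0, 2400.0)
```

A ray tracer sums contributions like this over every ray reaching a map point, attenuating each ray further at every interaction with the reconstructed 3D geometry.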

Relevance:

30.00%

Publisher:

Abstract:

Time series are proficiently converted into graphs via the horizontal visibility (HV) algorithm, which prompts interest in its capability for capturing the nature of different classes of series in a network context. We have recently shown [B. Luque et al., PLoS ONE 6, 9 (2011)] that dynamical systems can be studied from a novel perspective via this method. Specifically, the period-doubling and band-splitting attractor cascades that characterize unimodal maps transform into families of graphs that turn out to be independent of map nonlinearity or other particulars. Here, we provide an in-depth description of the HV treatment of the Feigenbaum scenario, together with analytical derivations relating to the degree distributions, mean distances, clustering coefficients, etc., associated with the bifurcation cascades and their accumulation points. We describe how the resulting families of graphs can be framed within a renormalization group scheme in which fixed-point graphs reveal their scaling properties. These fixed points are then re-derived from an entropy optimization process defined for the graph sets, confirming a suggested connection between renormalization group and entropy optimization. Finally, we provide analytical and numerical results for the graph entropy and show that it emulates the Lyapunov exponent of the map independently of its sign.
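The horizontal visibility mapping itself is simple to state: data points i and j see each other iff every point between them is lower than both. A minimal O(n²) sketch of that rule:

```python
def horizontal_visibility_edges(series):
    """Edges of the horizontal visibility graph of a time series:
    (i, j) is an edge iff x[k] < min(x[i], x[j]) for all i < k < j.
    Adjacent points are always connected (empty intermediate range)."""
    n = len(series)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

# A monotone series only links neighbours; a valley creates a shortcut:
edges_up = horizontal_visibility_edges([1, 2, 3])  # (0,1), (1,2)
edges_v = horizontal_visibility_edges([3, 1, 2])   # also adds (0, 2)
```

Applying this mapping to orbits of a unimodal map at successive bifurcation parameters produces the families of graphs whose degree distributions and entropies the paper analyses.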

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a novel method to simulate radio propagation is presented. The method consists of two steps: automatic 3D scenario reconstruction and propagation modeling. For 3D reconstruction, a machine learning algorithm is adopted and improved to automatically recognize objects in pictures taken of the target region, and 3D models are generated based on the recognized objects. The propagation model employs a ray tracing algorithm to compute the signal strength at each point on the constructed 3D map. Compared with other methods, the work presented in this paper reduces the human effort and cost of constructing the 3D scene; moreover, the developed propagation model shows potential in both accuracy and efficiency.

Relevance:

30.00%

Publisher:

Abstract:

Due to the sensitive international situation caused by still-recent terrorist attacks, there is a common need to protect the safety of large spaces such as government buildings, airports and power stations. To address this problem, developments in several research fields, such as video and cognitive audio, decision support systems, human interfaces, computer architecture, communications networks and communications security, should be integrated, with the goal of achieving advanced security systems capable of meeting all of the specified requirements and spanning the gap that presently exists in the market. This paper describes the implementation of a decision system for crisis management in infrastructural building security, specifically for the management of building intrusions. The positions of unidentified persons are reported with the help of a Wireless Sensor Network (WSN). The goal is to achieve an intelligent system capable of making the best decision in real time in order to quickly neutralise one or more intruders who threaten strategic installations. It is assumed that the intruders' behaviour is inferred through sequences of sensor activations and their fusion. This article presents a general approach to selecting the optimum operation from the available neutralisation strategies based on a Minimax algorithm. The distances among different scenario elements are used to measure the risk of the scene, so a path planning technique is integrated in order to attain good performance. Different actions executed over the elements of the scene, such as moving a guard, blocking a door or turning on an alarm, are used to neutralise the crisis. This set of actions executed to stop the crisis is known as the neutralisation strategy. Finally, the system has been tested in simulations of real situations, and the results have been evaluated according to the final state of the intruders. In 86.5% of the cases the system achieved the capture of the intruders, and in 59.25% of the cases they were intercepted before reaching their objective.
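The Minimax selection step can be illustrated with a toy version: the system picks the neutralisation action whose worst-case risk, over the intruder's possible responses, is lowest. The action names and risk numbers below are invented for illustration; in the paper, risk would be derived from distances between scene elements and path planning:

```python
def minimax_action(risk):
    """Pick the action minimising the maximum risk over the intruder's
    possible responses.

    risk: dict mapping action name -> list of risk values, one per
    possible intruder response."""
    return min(risk, key=lambda action: max(risk[action]))

# Hypothetical risks for three neutralisation actions against two
# possible intruder routes:
risks = {
    "move_guard": [0.6, 0.2],
    "block_door": [0.4, 0.3],
    "raise_alarm": [0.5, 0.5],
}
best = minimax_action(risks)  # "block_door": lowest worst case (0.4)
```

The pessimistic max over intruder responses is what makes this a Minimax choice: "move_guard" looks best on average, but "block_door" caps the damage if the intruder picks the unfavourable route.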