841 results for Multi-agent simulators
Abstract:
Systematic studies that evaluate the quality of decision-making processes are relatively rare. Drawing on the literature on decision quality, this research develops a framework to assess the quality of decision-making processes for resolving boundary conflicts in the Philippines. The evaluation framework breaks the decision-making process down into three components (the decision procedure, the decision method, and the decision unit) and is applied to two ex-post cases (one resolved and one unresolved) and one ex-ante case. The evaluation results from the resolved and unresolved cases show that the choice of decision method plays a minor role in resolving boundary conflicts, whereas the choice of decision procedure is more influential. In the end, a decision unit can choose a simple method to resolve the conflict. The ex-ante case presents a follow-up intended to resolve the unresolved case through a changed decision-making process in which the associated decision unit plans to apply the spatial multi-criteria evaluation (SMCE) tool as a decision method. The evaluation results from the ex-ante case confirm that SMCE has the potential to enhance decision quality because: a) it provides high quality as a decision method in this changed process, and b) the weaknesses associated with the decision unit and the decision procedure of the unresolved case were found to be eliminated in this process.
Abstract:
To the Editor: It was with interest that I read the recent article by Zhang et al. published in Supportive Care in Cancer [1]. This paper highlighted the importance of radiodermatitis (RD) as an unresolved and distressing clinical issue in patients with cancer undergoing radiation therapy. However, I am concerned about a number of clinical and methodological issues within this paper: (i) the clinical and operational definition of prophylaxis and treatment of RD; (ii) the accuracy of the identification of trials; and (iii) the appropriateness of the conduct of the meta-analyses...
Abstract:
This paper presents an efficient method using a system-state sampling technique in Monte Carlo simulation for reliability evaluation of multi-area power systems at Hierarchical Level One (HLI). System-state sampling is one of the common methods used in Monte Carlo simulation, but its CPU time and memory requirements can be a problem. The combination of analytical and Monte Carlo methods, known as the hybrid method and presented in this paper, can enhance the efficiency of the solution. The load model in this study can be incorporated either by sampling or by enumeration. Both cases are examined in this paper by applying the methods to the Roy Billinton Test System (RBTS).
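As a rough illustration of the state-sampling idea described above, the following Python sketch estimates loss-of-load probability for a single area by sampling generator availability; the unit data and load level are hypothetical and are not the RBTS parameters.

```python
import random

def sample_lolp(units, load_mw, n_samples=200_000, seed=1):
    """Estimate loss-of-load probability by system-state sampling.

    units: list of (capacity_mw, forced_outage_rate) tuples.
    A unit is sampled as unavailable with probability equal to its FOR.
    """
    random.seed(seed)
    loss_count = 0
    for _ in range(n_samples):
        available = sum(cap for cap, f_or in units if random.random() >= f_or)
        if available < load_mw:
            loss_count += 1
    return loss_count / n_samples

# Hypothetical generating units (capacity in MW, forced outage rate)
units = [(40, 0.03), (40, 0.03), (20, 0.025), (20, 0.025), (10, 0.02)]
print(f"Estimated LOLP at 100 MW load: {sample_lolp(units, 100):.4f}")
```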
Abstract:
Grouping users in social networks is an important process that improves matching and recommendation activities in social networks. Data mining clustering methods can be used to group users in social networks. However, existing general-purpose clustering algorithms perform poorly on social network data due to the special nature of users' data in social networks. One main reason is the constraints that need to be considered when grouping users in social networks. Another is the need to capture a large amount of information about users, which imposes computational complexity on an algorithm. In this paper, we propose a scalable and effective constraint-based clustering algorithm based on a global similarity measure that takes into consideration the users' constraints and their importance in social networks. Each constraint's importance is calculated based on how often the constraint occurs in the dataset. Performance of the algorithm is demonstrated on a dataset obtained from an online dating website using internal and external evaluation measures. Results show that the proposed algorithm is able to increase the accuracy of matching users in social networks by 10% in comparison to other algorithms.
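The following Python sketch illustrates, under assumed attribute names and an illustrative weighting rule, the idea of a global similarity measure in which each constraint is weighted by how often it occurs in the dataset; it is not the paper's exact formulation.

```python
from collections import Counter

def constraint_weights(profiles, constraints):
    """Weight each constraint by how often it is specified in the dataset."""
    counts = Counter()
    for p in profiles:
        for c in constraints:
            if p.get(c) is not None:
                counts[c] += 1
    total = sum(counts.values()) or 1
    return {c: counts[c] / total for c in constraints}

def global_similarity(a, b, weights):
    """Weighted agreement over constrained attributes shared by both users."""
    score, norm = 0.0, 0.0
    for c, w in weights.items():
        if a.get(c) is not None and b.get(c) is not None:
            norm += w
            score += w * (1.0 if a[c] == b[c] else 0.0)
    return score / norm if norm else 0.0

# Hypothetical user profiles with illustrative constraint attributes
profiles = [
    {"smoker": "no", "religion": "none", "wants_children": "yes"},
    {"smoker": "no", "religion": None,   "wants_children": "yes"},
]
w = constraint_weights(profiles, ["smoker", "religion", "wants_children"])
print(global_similarity(profiles[0], profiles[1], w))
```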
Abstract:
The power system stabilizer (PSS) is one of the most important controllers in modern power systems for damping low-frequency oscillations. Many efforts have been dedicated to designing tuning methodologies and allocation techniques to obtain optimal damping behaviour of the system. Traditionally, the PSS is tuned mostly for local damping performance; however, to obtain globally optimal performance, its tuning needs to consider more variables. Furthermore, with growing system interconnection and increasing system complexity, new tools are required to achieve global tuning and coordination of PSSs and to reach a solution that is optimal in a global sense. Differential evolution (DE) is recognized as a simple and powerful global optimization technique that offers fast convergence as well as high computational efficiency. However, as with many other evolutionary algorithms (EAs), premature convergence of the population restricts the optimization capacity of DE. In this paper, a modified DE is proposed and applied to optimal PSS tuning for the 39-bus New England system. New operators are introduced to reduce the probability of premature convergence. To investigate the impact of system conditions on PSS tuning, multiple operating points are studied. Simulation results are compared with standard DE and particle swarm optimization (PSO).
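A minimal sketch of the standard DE/rand/1/bin scheme is given below to show the baseline algorithm being modified; the objective function, parameter bounds and the placement of the paper's anti-premature operators are illustrative assumptions.

```python
import numpy as np

def de_minimize(objective, bounds, pop_size=30, f=0.6, cr=0.9, generations=200, seed=0):
    """Standard DE/rand/1/bin; modified operators against premature
    convergence would be inserted where trial vectors are generated."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    cost = np.array([objective(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + f * (b - c), lo, hi)          # mutation
            cross = rng.random(len(bounds)) < cr               # binomial crossover
            cross[rng.integers(len(bounds))] = True
            trial = np.where(cross, mutant, pop[i])
            trial_cost = objective(trial)
            if trial_cost < cost[i]:                           # greedy selection
                pop[i], cost[i] = trial, trial_cost
    return pop[cost.argmin()], cost.min()

# Placeholder objective: in PSS tuning this would be, e.g., the damping-ratio
# deficit aggregated over the studied operating points.
objective = lambda x: np.sum((x - 0.3) ** 2)
bounds = np.array([[0.01, 50.0], [0.01, 1.0], [0.01, 1.0]])  # gain, T1, T2 (illustrative)
print(de_minimize(objective, bounds))
```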
'Going live': establishing the creative attributes of the live multi-camera television professional
Abstract:
In my capacity as a television professional and teacher specialising in multi-camera live television production for over 40 years, I was drawn to the conclusion that opaque or inadequately formed understandings of how creativity applies to the field of live television have impeded the development of pedagogies suitable for the teaching of live television in universities. In pursuit of this hypothesis, the thesis shows that television degrees were born out of film studies degrees, where intellectual creativity was aligned with single-camera production and the 'creative roles' of producers, directors and scriptwriters. At the same time, multi-camera live television production was subsumed under the 'mass communication' banner, leading to an understanding that roles other than producer and director are simply technical, and bereft of creative intent or acumen. The thesis goes on to show that this attitude to other television production personnel, for example the vision mixer, videotape operator and camera operator, relegates their roles to that of 'button pusher'. This has resulted in university teaching models with inappropriate resources and unsuitable teaching practices. As a result, the industry is struggling to find people with the skills to fill the demands of the multi-camera live television sector. In specific terms, the central hypothesis is pursued through the following sequenced approach. Firstly, the thesis outlines the problems and traces the origins of the misconception that intellectual creativity does not exist in live multi-camera television. Secondly, this more adequately conceptualised account of the origins of the misconceptions about live television and creativity is anchored to the field of examination by presenting the foundations of the roles involved in making live television programs using multi-camera production techniques. Thirdly, this more nuanced account of the field sets the stage for a thorough analysis of education and training in the industry, and of teaching models at Australian universities. The findings clearly establish that the pedagogical models are aimed at single-camera production, a position that de-emphasises the creative aspects of multi-camera live television production. Informed by an examination of theories of learning, qualitative interviews, professional reflective practice and observations, the roles of four multi-camera live production crew members (camera operator, vision mixer, EVS/videotape operator and director's assistant) demonstrate the existence of intellectual creativity during live production. Finally, supported by the theories of learning and by the development and explication of a successful teaching model, a new approach to teaching students how to work in live television is proposed and substantiated.
Abstract:
In this paper we give an overview of some very recent work on the stochastic simulation of multi-scaled systems involving chemical reactions, as well as presenting a new approach. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix of small numbers of key regulatory proteins and medium to large numbers of other molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge-Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques for the treatment of coupled slow and fast reactions in stochastic chemical kinetics and present a new approach which couples the three regimes mentioned above. We then apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion of the significance of this work.
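For context, the following is a minimal sketch of the stochastic simulation algorithm (Gillespie's direct method) on a toy reversible reaction; it does not implement the multi-regime coupling proposed in the paper, and the rate constants are illustrative.

```python
import math
import random

def gillespie_ssa(x, rates, stoich, t_end, seed=42):
    """Exact stochastic simulation algorithm (Gillespie direct method).

    x: initial copy numbers, rates(x): list of reaction propensities,
    stoich: state-change vectors, one per reaction.
    """
    random.seed(seed)
    t, trajectory = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = rates(x)
        a0 = sum(a)
        if a0 == 0:
            break
        t += -math.log(random.random()) / a0          # time to next reaction
        r, cumulative = random.random() * a0, 0.0
        for j, aj in enumerate(a):                    # pick which reaction fires
            cumulative += aj
            if r < cumulative:
                x = [xi + s for xi, s in zip(x, stoich[j])]
                break
        trajectory.append((t, list(x)))
    return trajectory

# Toy system: A <-> B with rate constants k1, k2 (illustrative, not LacZ/LacY)
k1, k2 = 1.0, 0.5
rates = lambda x: [k1 * x[0], k2 * x[1]]
stoich = [(-1, +1), (+1, -1)]
print(gillespie_ssa([100, 0], rates, stoich, t_end=5.0)[-1])
```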
Abstract:
A novel intelligent online demand-side management system is proposed for peak load management in low-voltage distribution networks. This method uses low-cost controllers with low-bandwidth two-way communication, installed in customers' premises and at distribution transformers, to manage the peak load while maximising customer satisfaction. A multi-objective decision-making process is proposed to select the load(s) to be delayed or controlled. The efficacy of the proposed control system is verified by simulation of three different feeder types.
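A toy weighted-scoring rule for choosing which controllable loads to delay is sketched below; the weights and load attributes are illustrative assumptions, not the proposed multi-objective decision process.

```python
def select_loads_to_delay(loads, overload_kw, w_comfort=0.7, w_power=0.3):
    """Rank controllable loads by a score that trades off customer
    inconvenience against the power relief each load provides, then delay
    loads until the transformer overload is covered."""
    # Lower inconvenience and higher power both make a load a better candidate.
    ranked = sorted(
        loads,
        key=lambda l: w_comfort * l["inconvenience"] - w_power * l["power_kw"],
    )
    delayed, relief = [], 0.0
    for load in ranked:
        if relief >= overload_kw:
            break
        delayed.append(load["name"])
        relief += load["power_kw"]
    return delayed, relief

# Hypothetical controllable loads at one customer's premises
loads = [
    {"name": "pool_pump",    "power_kw": 1.1, "inconvenience": 0.1},
    {"name": "water_heater", "power_kw": 3.6, "inconvenience": 0.4},
    {"name": "ev_charger",   "power_kw": 7.0, "inconvenience": 0.8},
]
print(select_loads_to_delay(loads, overload_kw=4.0))
```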
Abstract:
Articular cartilage is a complex structure in which fluid-swollen proteoglycans are constrained within a 3D network of collagen fibrils. Because of the complexity of the cartilage structure, the relationships between its mechanical behaviour at the macro-scale and its components at the micro-scale are not completely understood. The research objective in this thesis is to create a new model of articular cartilage that can be used to simulate, and obtain insight into, the micro-macro interaction and the mechanisms underlying its mechanical response during physiological function. The new model has two characteristics: i) it does not use a fibre-reinforced composite material idealisation, and ii) it provides a framework for probing the micro-mechanism of the fluid-solid interaction underlying the deformation of articular cartilage, using simple rules of repartition instead of constitutive/physical laws and intuitive curve-fitting. Even though there are various microstructural and mechanical behaviours that could be studied, the scope of this thesis is limited to osmotic pressure formation and distribution and their influence on cartilage fluid diffusion and percolation, which in turn govern the deformation of the compression-loaded tissue. The study can be divided into two stages. In the first stage, the distributions and concentrations of proteoglycans, collagen and water were investigated using histological protocols. Based on this, the structure of cartilage was conceptualised as microscopic osmotic units consisting of these constituents, distributed according to the histological results. These units were repeated three-dimensionally to form the structural model of articular cartilage. In the second stage, cellular automata were incorporated into the resulting matrix (lattice) to simulate the osmotic pressure of the fluid and the movement of water within and out of the matrix, following the osmotic pressure gradient in accordance with the chosen rule of repartition of the pressure. The outcome of this study is a new model of articular cartilage that can be used to simulate and study the micromechanical behaviour of cartilage under different conditions of health and loading. These behaviours are illuminated at the micro-scale level using the so-called neighbourhood rules developed in the thesis in accordance with the typical requirements of cellular automata modelling. Using these rules and the relevant boundary conditions to simulate pressure distribution and related fluid motion produced significant results that provide insight into the relationships between the osmotic pressure gradient, the associated fluid micro-movement, and the deformation of the matrix. For example, it could be concluded that:
1. It is possible to model articular cartilage with the agent-based model of cellular automata and the Margolus neighbourhood rule.
2. The concept of 3D interconnected osmotic units is a viable structural model for the extracellular matrix of articular cartilage.
3. Different rules of osmotic pressure advection lead to different patterns of deformation in the cartilage matrix, enabling insight into how this micro-mechanism influences macro-mechanical deformation.
4. When features such as the transition coefficient (representing permeability) are altered by changes in the concentrations of collagen and proteoglycans (i.e. degenerative conditions), the deformation process is affected.
5. The boundary conditions also influence the relationship between the osmotic pressure gradient and fluid movement at the micro-scale level.
The outcomes are important to cartilage research since they can be used to study micro-scale damage in the cartilage matrix. From this, we are able to monitor related diseases and their progression, leading to potential insight into drug-cartilage interaction for treatment. This innovative model is an incremental advance on attempts at creating computational modelling approaches to cartilage research and other fluid-saturated tissues and material systems.
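As a minimal illustration of a Margolus-neighbourhood cellular-automaton update, the sketch below relaxes each 2x2 block of a pressure lattice toward its local mean with the block partition offset alternating each step; the redistribution rule and lattice are placeholders for the repartition rules developed in the thesis.

```python
import numpy as np

def margolus_step(grid, step, mixing=0.5):
    """One Margolus-neighbourhood update: partition the lattice into 2x2
    blocks (the partition offset alternates each step) and move fluid within
    each block toward the block mean, a toy stand-in for a pressure
    repartition rule. The total fluid in each block is conserved."""
    offset = step % 2
    g = np.roll(grid, shift=(-offset, -offset), axis=(0, 1))
    h, w = g.shape
    for i in range(0, h - 1, 2):
        for j in range(0, w - 1, 2):
            block = g[i:i + 2, j:j + 2]
            block += mixing * (block.mean() - block)   # relax toward local equilibrium
    return np.roll(g, shift=(offset, offset), axis=(0, 1))

# Hypothetical 8x8 osmotic-pressure lattice with a single pressurised unit.
grid = np.zeros((8, 8))
grid[4, 4] = 1.0
for step in range(10):
    grid = margolus_step(grid, step)
print(grid.round(3))
```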
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problem of multi-device key management and single-sign-on architectures. One solution to this problem is the use of secure side-channels for authentication, including the visual channel as a vicinity proof. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
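The sketch below is not QR-Auth itself but a generic illustration of the underlying pattern: a short-lived challenge that would be rendered as a 2D barcode and answered with an HMAC over a key already shared between the user's devices. All names, parameters and the enrolment step are assumptions; the QR encoding itself is omitted.

```python
import hashlib
import hmac
import secrets
import time

def issue_challenge():
    """Server side: create a short-lived nonce that would be rendered as a
    QR code on the login screen."""
    return {"nonce": secrets.token_hex(16), "issued_at": time.time()}

def answer_challenge(challenge, device_key):
    """Enrolled device: scan the code and answer with an HMAC over the nonce
    using the key it already shares with the account."""
    return hmac.new(device_key, challenge["nonce"].encode(), hashlib.sha256).hexdigest()

def verify(challenge, response, device_key, max_age_s=60):
    """Server side: accept only fresh, correctly keyed responses."""
    if time.time() - challenge["issued_at"] > max_age_s:
        return False
    expected = hmac.new(device_key, challenge["nonce"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

device_key = secrets.token_bytes(32)        # shared during device enrolment
challenge = issue_challenge()
print(verify(challenge, answer_challenge(challenge, device_key), device_key))
```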
Abstract:
Theoretical foundations of higher-order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are treated as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can then be estimated reliably, with known levels at which it differs significantly from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization in different regions of the brain as the neural response develops. The methodology is applied to analyse the evoked EEG response to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha-band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
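A minimal ensemble bicoherence estimator at a single bifrequency, using the standard normalised-bispectrum definition over stimulus-locked epochs, is sketched below; it illustrates the estimator only, not the paper's full time-varying methodology, and the synthetic phase-coupled test signal is an assumption.

```python
import numpy as np

def bicoherence(epochs, fs, f1, f2, nfft=None):
    """Ensemble bicoherence at the bifrequency (f1, f2).

    epochs: array (n_epochs, n_samples) of stimulus-locked, windowed segments
    taken at the same offset from the stimulus.
    """
    epochs = np.asarray(epochs, dtype=float)
    n = nfft or epochs.shape[1]
    spectra = np.fft.rfft(epochs * np.hanning(epochs.shape[1]), n=n, axis=1)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k1, k2 = np.argmin(abs(freqs - f1)), np.argmin(abs(freqs - f2))
    x1, x2, x12 = spectra[:, k1], spectra[:, k2], spectra[:, k1 + k2]
    num = abs(np.mean(x1 * x2 * np.conj(x12))) ** 2          # squared bispectrum
    den = np.mean(abs(x1 * x2) ** 2) * np.mean(abs(x12) ** 2)  # normalisation
    return num / den

# Synthetic test: 8 Hz and 4 Hz components plus their phase-coupled 12 Hz sum.
fs, t = 256, np.arange(256) / 256
rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, size=(200, 2))
epochs = (np.sin(2 * np.pi * 8 * t + phases[:, :1])
          + np.sin(2 * np.pi * 4 * t + phases[:, 1:])
          + np.sin(2 * np.pi * 12 * t + phases[:, :1] + phases[:, 1:])
          + 0.5 * rng.standard_normal((200, 256)))
print(f"bicoherence at (8, 4) Hz: {bicoherence(epochs, fs, 8, 4):.2f}")
```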
Abstract:
In this work, the thermal expansion properties of carbon nanotube (CNT)-reinforced nanocomposites with CNT content ranging from 1 to 15 wt% were evaluated using a multi-scale numerical approach, in which the effects of two parameters, i.e., temperature and CNT content, were investigated extensively. For all CNT contents, the obtained results clearly revealed that within a wide low-temperature range (30 °C to 62 °C) thermal contraction is observed, while thermal expansion occurs in the high-temperature range (62 °C to 120 °C). It was found that at any specified CNT content the thermal expansion properties vary with temperature: as temperature increases, the thermal expansion rate increases linearly. However, at a specified temperature, the absolute value of the thermal expansion rate decreases nonlinearly as the CNT content increases. Moreover, the results provided by the present multi-scale numerical model were in good agreement with those obtained from the corresponding theoretical analyses and experimental measurements in this work, which indicates that this multi-scale numerical approach provides a powerful tool for evaluating the thermal expansion properties of any type of CNT/polymer nanocomposite, and therefore promotes understanding of the thermal behaviour of CNT/polymer nanocomposites for applications in temperature sensors, nanoelectronic devices, etc.
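As a small post-processing illustration (not the multi-scale model itself), the sketch below recovers a temperature-dependent thermal expansion rate from a hypothetical thermal strain curve that mimics the reported contraction-then-expansion trend; all numbers are assumptions.

```python
import numpy as np

# Hypothetical thermal strain data (micro-strain) from a multi-scale simulation:
# contraction below roughly 62 degC, expansion above it, for one CNT content.
temperature_c = np.array([30, 40, 50, 62, 80, 100, 120], dtype=float)
thermal_strain = np.array([0.0, -45.0, -80.0, -100.0, -10.0, 110.0, 260.0]) * 1e-6

# Instantaneous thermal expansion rate = d(strain)/d(temperature), per degC.
expansion_rate = np.gradient(thermal_strain, temperature_c)

for T, alpha in zip(temperature_c, expansion_rate):
    sign = "contraction" if alpha < 0 else "expansion"
    print(f"{T:5.0f} degC: {alpha * 1e6:+7.2f} micro-strain/degC ({sign})")
```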
Abstract:
Traffic congestion has a significant impact on the economy and the environment. Encouraging the use of multi-modal transport (public transport, bicycle, park’n’ride, etc.) has been identified by traffic operators as a good strategy for tackling congestion and its detrimental environmental impacts. A multi-modal and multi-objective trip planner provides users with various multi-modal options optimised on the objectives they prefer (cheapest, fastest, safest, etc.) and has the potential to reduce congestion on both a temporal and a spatial scale. The computation of multi-modal and multi-objective trips is a complicated mathematical problem, as it must integrate and utilise a diverse range of large data sets, including both road network information and public transport schedules, as well as optimising for a number of competing objectives, where fully optimising for one objective, such as travel time, can adversely affect other objectives, such as cost. The relationship between these objectives can also be quite subjective, as their priorities will vary from user to user. This paper first outlines the various data requirements and formats needed for the multi-modal multi-objective trip planner to operate, including static information about the physical infrastructure within Brisbane as well as real-time and historical data to predict traffic flow on the road network and the status of public transport. It then presents information on the graph data structures representing the road and public transport networks within Brisbane that are used in the trip planner to calculate optimal routes. This allows an investigation into the various shortest-path algorithms that have been researched over the last few decades, and provides a foundation for the construction of the multi-modal multi-objective trip planner through the development of innovative new algorithms that can operate on the large, diverse data sets and competing objectives.
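For background on the shortest-path machinery mentioned above, the sketch below runs Dijkstra's algorithm on a tiny hypothetical multi-modal graph with time and cost collapsed into a single weighted objective; a full multi-objective planner would instead maintain Pareto-optimal label sets and timetable-dependent edges.

```python
import heapq

def dijkstra(graph, source, target, w_time=1.0, w_cost=0.0):
    """Single-objective shortest path where each edge carries (time, cost);
    the two are collapsed with fixed weights into one scalar edge weight."""
    dist = {source: 0.0}
    queue = [(0.0, source, [source])]
    while queue:
        d, node, path = heapq.heappop(queue)
        if node == target:
            return d, path
        if d > dist.get(node, float("inf")):
            continue
        for nxt, time_min, cost in graph.get(node, []):
            nd = d + w_time * time_min + w_cost * cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical mini-network: walk, cycle and bus edges as (neighbour, minutes, dollars)
graph = {
    "home":     [("bus_stop", 5, 0.0), ("cbd", 40, 0.0)],   # walk / cycle
    "bus_stop": [("cbd", 15, 3.5)],                          # bus
}
print(dijkstra(graph, "home", "cbd", w_time=1.0, w_cost=2.0))
```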