876 results for Multi objective optimizations (MOO)
Abstract:
Sophisticated models of human social behaviour are fast becoming highly desirable in an increasingly complex and interrelated world. Here, we propose that rather than taking established theories from the physical sciences and naively mapping them into the social world, the advanced concepts and theories of social psychology should be taken as a starting point, and used to develop a new modelling methodology. In order to illustrate how such an approach might be carried out, we attempt to model the low elaboration attitude changes of a society of agents in an evolving social context. We propose a geometric model of an agent in context, where individual agent attitudes are seen to self-organise to form ideologies, which then serve to guide further agent-based attitude changes. A computational implementation of the model is shown to exhibit a number of interesting phenomena, including a tendency for a measure of the entropy in the system to decrease, and a potential for externally guiding a population of agents towards a new desired ideology.
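The ideology-formation dynamics described above can be illustrated with a toy agent-based simulation. The sketch below is a minimal stand-in, not the geometric model of the paper: agents hold 2-D attitude vectors, drift toward the mean attitude of their neighbourhood (a crude proxy for low-elaboration alignment with an emerging ideology), and a histogram-based entropy measure is tracked; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def attitude_entropy(attitudes, bins=10):
    """Shannon entropy of a histogram over the 2-D attitude space."""
    hist, _ = np.histogramdd(attitudes, bins=bins, range=[(-1, 1), (-1, 1)])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# 200 agents with random 2-D attitudes in [-1, 1]^2
attitudes = rng.uniform(-1, 1, size=(200, 2))
print("initial entropy:", attitude_entropy(attitudes))

for step in range(100):
    # Each agent drifts toward the mean attitude of its neighbourhood plus a
    # little noise: a crude proxy for low-elaboration alignment with an ideology.
    dists = np.linalg.norm(attitudes[:, None, :] - attitudes[None, :, :], axis=-1)
    neighbours = dists < 0.5
    local_mean = (neighbours[:, :, None] * attitudes[None, :, :]).sum(axis=1)
    local_mean /= neighbours.sum(axis=1, keepdims=True)
    attitudes += 0.05 * (local_mean - attitudes) + rng.normal(0.0, 0.01, attitudes.shape)
    attitudes = np.clip(attitudes, -1, 1)

print("final entropy:", attitude_entropy(attitudes))
```

With these settings the final entropy is typically lower than the initial entropy, mirroring the self-organisation reported in the abstract.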
Abstract:
A multi-faceted study is conducted with the objective of estimating the potential fiscal savings in annoyance- and sleep-disturbance-related health costs due to providing improved building acoustic design standards. This study uses balcony acoustic treatments in response to road traffic noise as an example. The study area is the State of Queensland in Australia, where regional road traffic noise mapping data is used in conjunction with standard dose–response curves to estimate the population exposure levels. The background and the importance of using the selected road traffic noise indicators are discussed. In order to achieve the objective, correlations between the mapping indicator (LA10 (18 hour)) and the dose–response curve indicators (Lden and Lnight) are established via analysis of a large database of road traffic noise measurement data. The existing noise exposure of the study area is used to estimate the fiscal reductions in health-related costs through the application of simple estimations of costs per person per year per degree of annoyance or sleep disturbance. The results demonstrate that balcony acoustic treatments may provide a significant benefit towards reducing the health-related costs of road traffic noise in a community.
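The estimation chain described above (noise mapping indicator → dose–response indicator → exposed population → cost) can be sketched in a few lines. In the sketch below, the LA10(18 hour)-to-Lden regression coefficients, the dose–response polynomial and the unit cost per annoyed person are illustrative placeholders, not the values derived in the study.

```python
# Minimal sketch of the annoyance cost-estimation chain (placeholder numbers).

def la10_to_lden(la10_18h, slope=1.0, offset=2.0):
    """Assumed linear mapping from LA10(18 hour) to Lden (illustrative coefficients)."""
    return slope * la10_18h + offset

def percent_highly_annoyed(lden):
    """Generic cubic dose-response curve for road traffic noise (illustrative)."""
    x = max(lden - 42.0, 0.0)
    return 9.868e-4 * x**3 - 1.436e-2 * x**2 + 0.5118 * x

def annual_annoyance_cost(population, la10_18h, cost_per_person=25.0):
    """Health-related annoyance cost per year for one exposure band."""
    lden = la10_to_lden(la10_18h)
    highly_annoyed = population * percent_highly_annoyed(lden) / 100.0
    return highly_annoyed * cost_per_person

# Exposure bands: (people exposed, LA10(18 hour) in dB(A)) -- synthetic example values
exposure_bands = [(120_000, 63.0), (45_000, 68.0), (8_000, 73.0)]
total = sum(annual_annoyance_cost(pop, level) for pop, level in exposure_bands)
print(f"estimated annoyance cost: ${total:,.0f} per year")
```

The fiscal saving from a balcony treatment would then be the difference between this total evaluated at the untreated and treated noise levels.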
Abstract:
Recent advances in computational geodynamics are applied to explore the link between Earth’s heat, its chemistry and its mechanical behaviour. Computational thermal-mechanical solutions now allow us to understand Earth patterns by solving the basic physics of heat transfer. This approach is currently used to solve basic convection patterns of terrestrial planets. Applying the same methodology at smaller scales delivers promising similarities between observed and predicted structures, which are often the sites of mineral deposits. The new approach involves a fully coupled solution of the energy, momentum and continuity equations of the system at all scales, allowing the prediction of fractures, shear zones and other typical geological patterns from a randomly perturbed initial state. The results of this approach link a global geodynamic mechanical framework, through regional-scale mineral deposits, down to the underlying micro-scale processes. Ongoing work includes the challenge of incorporating chemistry into the formulation.
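As an illustration of the "basic physics of heat transfer" ingredient only, the sketch below time-steps a 2-D conductive heat equation from a randomly perturbed initial state; it omits the coupled momentum and continuity equations, and the grid, boundary conditions and material parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Grid and material parameters (illustrative, non-dimensional)
nx, ny, dx = 64, 64, 1.0
kappa = 1.0
dt = 0.2 * dx**2 / (4.0 * kappa)        # well inside the explicit stability limit

T = rng.normal(0.0, 0.01, (ny, nx))     # randomly perturbed initial temperature
T[0, :], T[-1, :] = 1.0, 0.0            # hot base, cold surface

for _ in range(5000):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
    T += dt * kappa * lap               # energy equation: dT/dt = kappa * laplacian(T)
    T[0, :], T[-1, :] = 1.0, 0.0        # re-impose fixed-temperature boundaries

print("mean interior temperature:", float(T[1:-1, :].mean()))
```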
Abstract:
Particulate matter research is essential because of the well-known, significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction of particulate matter concentrations in the atmosphere. To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure particle mass with chemical composition and particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from these large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine overall trends and provided a rank ordering of the sites and the years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked. The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distributions and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
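Positive Matrix Factorisation resolves a samples-by-species concentration matrix into source contributions and source profiles. The sketch below uses scikit-learn's generic non-negative matrix factorisation as a simplified stand-in for PMF (which additionally weights each data point by its measurement uncertainty); the species list and data are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)

# Rows = samples (filter days), columns = measured species concentrations.
# Synthetic stand-in for an IBA elemental data set.
species = ["BC", "S", "Fe", "Zn", "Pb", "Na", "Cl"]
X = rng.gamma(shape=2.0, scale=1.0, size=(300, len(species)))

# Factorise X ~= G @ F: G holds source contributions per sample,
# F holds source profiles (fingerprints) over the measured species.
model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)      # (samples x sources)
F = model.components_           # (sources x species)

for i, profile in enumerate(F):
    top = [species[j] for j in np.argsort(profile)[::-1][:3]]
    print(f"source {i}: dominant species {top}")
```

In a real analysis the dominant species of each profile (e.g. Na and Cl for sea salt) are what allow the factors to be labelled as physical sources.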
Abstract:
This paper presents an alternative approach to image segmentation that uses the spatial distribution of edge pixels rather than pixel intensities. The segmentation is achieved by a multi-layered approach and is intended to find suitable landing areas for an aircraft emergency landing. We combine standard techniques (edge detectors) with newly developed algorithms (line expansion and a geometry test) to design an original segmentation algorithm. Our approach removes the dependency on environmental factors that traditionally influence lighting conditions, which in turn have a negative impact on pixel-based segmentation techniques. We present test outcomes on realistic visual data collected from an aircraft and report preliminary feedback on detection performance, demonstrating a consistent detection rate of over 97%.
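A minimal version of the edge-density idea can be sketched with OpenCV: detect edge pixels and flag image blocks that contain few of them as candidate flat areas. This is only a stand-in for the paper's line-expansion and geometry-test algorithms; the thresholds, block size and input file name are assumptions.

```python
import cv2
import numpy as np

def low_edge_regions(gray, block=64, max_edge_fraction=0.02):
    """Return top-left corners of blocks whose edge-pixel density is low.

    A crude stand-in for the paper's line-expansion and geometry tests:
    regions with few edge pixels are candidate flat landing areas.
    """
    edges = cv2.Canny(gray, 50, 150)
    candidates = []
    for y in range(0, gray.shape[0] - block + 1, block):
        for x in range(0, gray.shape[1] - block + 1, block):
            patch = edges[y:y + block, x:x + block]
            if np.count_nonzero(patch) / patch.size <= max_edge_fraction:
                candidates.append((x, y))
    return candidates

if __name__ == "__main__":
    # 'frame.png' is a placeholder name for one aerial frame.
    gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    if gray is not None:
        print(f"{len(low_edge_regions(gray))} candidate landing blocks found")
```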
Abstract:
Several tests have been devised in an attempt to detect behaviour modification due to training, supplements or diet in horses. These tests rely on subjective observations in combination with physiological measures, such as heart rate (HR) and plasma cortisol concentrations, but these measures do not definitively identify behavioural changes. The aim of the present studies was to develop an objective and relevant measure of horse reactivity. In Study 1, HR responses of six geldings confined to individual stalls to auditory stimuli, delivered over 6 days and designed to safely startle them, were studied to determine whether peak HR, unconfounded by physical exertion, was a reliable measure of reactivity. Both mean (±SEM) resting HR (39.5 ± 1.9 bpm) and peak HR (82 ± 5.5 bpm) in response to being startled were consistent across all horses over the 6 days. In Study 2, HR, plasma cortisol concentrations and speed of departure from an enclosure (reaction speed (RS)) of six mares were measured in response to a single stimulus presented daily over 6 days. Peak HR response (133 ± 4 bpm) was consistent over days for all horses, but RS increased (3.02 ± 0.72 m/s on Day 1, rising to 4.45 ± 0.53 m/s on Day 6; P = 0.005). There was no effect on plasma cortisol, so this variable was not studied further. In Study 3, using the six geldings from Study 1, the RS test was refined and a different startle stimulus was used each day. Again, there was no change in peak HR (97.2 ± 5.8 bpm) or RS (2.9 ± 0.2 m/s on Day 1 versus 3.0 ± 0.7 m/s on Day 6) over time. In the final study, mild sedation with acepromazine maleate (0.04 mg/kg BW i.v.) decreased peak HR in response to a startle stimulus when the horses (n = 8) were confined to a stall (P = 0.006), but not in an outdoor environment when the RS test was performed. However, RS was reduced by the mild sedation (P = 0.02). In conclusion, RS may be used as a practical and objective test to measure both reactivity and changes in reactivity in horses.
Abstract:
LiFePO4 is a commercially available battery material with good theoretical discharge capacity, excellent cycle life and increased safety compared with competing Li-ion chemistries. It has been the focus of considerable experimental and theoretical scrutiny in the past decade, resulting in LiFePO4 cathodes that perform well at high discharge rates. This scrutiny has raised several questions about the behaviour of LiFePO4 material during charge and discharge. In contrast to many other battery chemistries that intercalate homogeneously, LiFePO4 can phase-separate into highly and lowly lithiated phases, with intercalation proceeding by advancing an interface between these two phases. The main objective of this thesis is to construct mathematical models of LiFePO4 cathodes that can be validated against experimental discharge curves. This is in an attempt to understand some of the multi-scale dynamics of LiFePO4 cathodes that can be difficult to determine experimentally. The first section of this thesis constructs a three-scale mathematical model of LiFePO4 cathodes that uses a simple Stefan problem (which has been used previously in the literature) to describe the assumed phase-change. LiFePO4 crystals have been observed agglomerating in cathodes to form a porous collection of crystals and this morphology motivates the use of three size-scales in the model. The multi-scale model developed validates well against experimental data and this validated model is then used to examine the role of manufacturing parameters (including the agglomerate radius) on battery performance. The remainder of the thesis is concerned with investigating phase-field models as a replacement for the aforementioned Stefan problem. Phase-field models have recently been used in LiFePO4 and are a far more accurate representation of experimentally observed crystal-scale behaviour. They are based around the Cahn-Hilliard-reaction (CHR) IBVP, a fourth-order PDE with electrochemical (flux) boundary conditions that is very stiff and possesses multiple time and space scales. Numerical solutions to the CHR IBVP can be difficult to compute and hence a least-squares based Finite Volume Method (FVM) is developed for discretising both the full CHR IBVP and the more traditional Cahn-Hilliard IBVP. Phase-field models are subject to two main physicality constraints and the numerical scheme presented performs well under these constraints. This least-squares based FVM is then used to simulate the discharge of individual crystals of LiFePO4 in two dimensions. This discharge is subject to isotropic Li+ diffusion, based on experimental evidence that suggests the normally orthotropic transport of Li+ in LiFePO4 may become more isotropic in the presence of lattice defects. Numerical investigation shows that two-dimensional Li+ transport results in crystals that phase-separate, even at very high discharge rates. This is very different from results shown in the literature, where phase-separation in LiFePO4 crystals is suppressed during discharge with orthotropic Li+ transport. Finally, the three-scale cathodic model used at the beginning of the thesis is modified to simulate modern, high-rate LiFePO4 cathodes. High-rate cathodes typically do not contain (large) agglomerates and therefore a two-scale model is developed. The Stefan problem used previously is also replaced with the phase-field models examined in earlier chapters. 
The results from this model are then compared with experimental data and fit poorly, though a significant parameter regime could not be investigated numerically. Many-particle effects, however, are evident in the simulated discharges, matching the conclusions of recent literature. These effects result in crystals that are subject to local currents very different from the discharge rate applied to the cathode, which impacts the phase-separating behaviour of the crystals and raises questions about the validity of using cathodic-scale experimental measurements to determine crystal-scale behaviour.
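For readers unfamiliar with phase-field models, the sketch below time-steps a basic 1-D Cahn-Hilliard equation with an explicit scheme and periodic boundaries, showing the phase separation into Li-rich and Li-poor regions. It is not the least-squares finite volume method or the CHR flux boundary conditions developed in the thesis, and all parameters are illustrative.

```python
import numpy as np

n, dx = 128, 1.0
kappa, M = 1.0, 1.0                    # gradient-energy coefficient, mobility
dt = 0.01                              # small time step: the equation is stiff (4th order)
rng = np.random.default_rng(3)
c = 0.05 * rng.standard_normal(n)      # near-uniform state with small random noise

def lap(u):
    """Second difference with periodic boundaries."""
    return (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u) / dx**2

for _ in range(20000):
    mu = c**3 - c - kappa * lap(c)     # chemical potential of a double-well free energy
    c += dt * M * lap(mu)              # dc/dt = div(M grad mu)

# Phase separation shows up as c splitting toward the two wells near +/- 1.
print("order parameter range:", float(c.min()), float(c.max()))
```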
Abstract:
Objective: Effective management of multi-resistant organisms is an important issue for hospitals both in Australia and overseas. This study investigates the utility of Bayesian Network (BN) analysis for examining relationships between risk factors and colonisation with Vancomycin Resistant Enterococcus (VRE). Design: Bayesian Network analysis was performed using infection control data collected over a period of 36 months (2008-2010). Setting: Princess Alexandra Hospital (PAH), Brisbane. Outcome of interest: Number of new VRE isolates. Methods: A BN is a probabilistic graphical model that represents a set of random variables and their conditional dependencies via a directed acyclic graph (DAG). A BN enables multiple interacting agents to be studied simultaneously. The initial BN model was constructed based on the infectious disease physician's expert knowledge and current literature. Continuous variables were dichotomised using the third-quartile values of the 2008 data. The BN was used to examine the probabilistic relationships between VRE isolates and risk factors, and to establish which factors were associated with an increased probability of a high number of VRE isolates. Software: Netica (version 4.16). Results: Preliminary analysis revealed that VRE transmission and VRE prevalence were the most influential factors in predicting a high number of VRE isolates. Interestingly, several factors (hand hygiene and cleaning) known from the literature to be associated with VRE prevalence did not appear to be as influential as expected in this BN model. Conclusions: This preliminary work has shown that Bayesian Network analysis is a useful tool for examining clinical infection prevention issues, where there is often a web of factors that influence outcomes. This BN model can be restructured easily, enabling various combinations of agents to be studied.
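A minimal discrete Bayesian network of the kind described above can be sketched with the open-source pgmpy library (the study itself used Netica). The structure, node names and conditional probabilities below are illustrative assumptions, not the expert-elicited values from the model.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two risk factors feeding a single outcome node (all binary, 0 = low, 1 = high).
model = BayesianNetwork([("Transmission", "HighVRE"), ("Prevalence", "HighVRE")])

cpd_t = TabularCPD("Transmission", 2, [[0.7], [0.3]])
cpd_p = TabularCPD("Prevalence", 2, [[0.6], [0.4]])
cpd_v = TabularCPD(
    "HighVRE", 2,
    # Columns: (T=0,P=0), (T=0,P=1), (T=1,P=0), (T=1,P=1); rows: HighVRE = 0, 1
    [[0.95, 0.70, 0.60, 0.20],
     [0.05, 0.30, 0.40, 0.80]],
    evidence=["Transmission", "Prevalence"], evidence_card=[2, 2],
)
model.add_cpds(cpd_t, cpd_p, cpd_v)
assert model.check_model()

# Query the probability of a high number of VRE isolates given high transmission.
infer = VariableElimination(model)
print(infer.query(["HighVRE"], evidence={"Transmission": 1}))
```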
Abstract:
In this paper we analyse the effects of highway traffic flow parameters, such as vehicle arrival rate and density, on the performance of Amplify and Forward (AF) cooperative vehicular networks along a multi-lane highway under the free flow state. We derive analytical expressions for connectivity performance and verify them with Monte-Carlo simulations. When AF cooperative relaying is employed together with Maximum Ratio Combining (MRC) at the receivers, the average route error rate shows a 10-20 fold improvement compared to direct communication. A 4-8 fold increase in the maximum number of traversable hops can also be observed at different vehicle densities when AF cooperative communication is used to strengthen communication routes. However, the theoretical upper bound on the maximum number of hops promises higher performance gains.
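The gain from combining a direct path with an amplify-and-forward relay path can be illustrated with a generic Monte-Carlo sketch. This is not the paper's highway connectivity model: it simply compares outage probability for a single Rayleigh-faded direct link against MRC combining of the direct link and a standard AF relay-SNR bound, with illustrative SNR and threshold values.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 200_000
snr_mean, gamma_th = 10.0, 1.0                      # mean branch SNR and outage threshold (illustrative)

# Rayleigh fading: instantaneous SNR is exponentially distributed.
g_direct = rng.exponential(snr_mean, n_trials)
g_sr = rng.exponential(snr_mean, n_trials)          # source -> relay
g_rd = rng.exponential(snr_mean, n_trials)          # relay -> destination

# Standard harmonic-mean style bound for the amplify-and-forward relay path.
g_af = g_sr * g_rd / (g_sr + g_rd + 1.0)

# Maximum ratio combining adds the post-detection SNRs of the two paths.
g_mrc = g_direct + g_af

print("direct outage:  ", float(np.mean(g_direct < gamma_th)))
print("AF + MRC outage:", float(np.mean(g_mrc < gamma_th)))
```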
Abstract:
The Pacific Rim Real Estate Society (PRRES) has conducted four property case competitions from 2009 to 2012. The competition provides opportunities for undergraduate students to present their proposal on a given case study. All students were locked down with their four team members for five hours without external help to ensure a level playing field across participants. Students prepared their presentation and defended their arguments in front of experts from the property industry and academia. The aim of this paper is to reflect on the feedback received from stakeholders involved in the case competition. Besides exploring what students have gained from the competitions, this paper provides insight into the opportunities and challenges for the new format of competition to be introduced in 2013. Over the last four competitions, three universities participated in all four consecutive events, four universities took part in two events and another four universities competed only once. Some universities had a great advantage from previous experience of participating in similar international business competitions. Findings show that the students have benefited greatly from the event, including improving their problem-solving ability and other non-technical skills. Despite the aforementioned benefits, the PRRES closed-book case competition has proven not to be viable; thus, future competitions need to minimise travel and logistics costs.
Abstract:
Classifier selection is a problem encountered by multi-biometric systems that aim to improve performance through fusion of decisions. A particular decision fusion architecture that combines multiple instances (n classifiers) and multiple samples (m attempts at each classifier) has been proposed in previous work to achieve a controlled trade-off between false alarms and false rejects. Although analysis on text-dependent speaker verification has demonstrated better performance for fusion of decisions with favourable dependence compared to statistically independent decisions, the performance is not always optimal. Given a pool of instances, the best performance with this architecture is obtained for certain combinations of instances. Heuristic rules and diversity measures have been commonly used for classifier selection, but it is shown that optimal performance is achieved with the 'best combination performance' rule. As the search complexity for this rule increases exponentially with the addition of classifiers, a measure - the sequential error ratio (SER) - is proposed in this work that is specifically adapted to the characteristics of the sequential fusion architecture. The proposed measure can be used to select the classifier that is most likely to produce a correct decision at each stage. Error rates for fusion of text-dependent HMM-based speaker models using SER are compared with other classifier selection methodologies. SER is shown to achieve near-optimal performance for sequential fusion of multiple instances with or without the use of multiple samples. The methodology applies to multiple speech utterances for telephone- or internet-based access control, and to other systems such as multiple-fingerprint and multiple-handwriting-sample based identity verification systems.
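The cost of the 'best combination performance' rule can be illustrated with a toy exhaustive search over classifier subsets, evaluated with majority-vote fusion on synthetic, deliberately correlated decisions. The sketch below does not implement the proposed SER measure; it only shows the brute-force baseline whose combinatorial cost motivates it, and all decisions and accuracies are synthetic.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)

# Synthetic per-classifier binary decisions on a validation set (1 = accept).
# Rows are validation attempts; columns are candidate classifiers (instances).
n_trials, n_classifiers = 500, 6
truth = rng.integers(0, 2, n_trials)
base_noise = rng.random(n_trials)           # shared noise makes errors correlated
decisions = np.empty((n_trials, n_classifiers), dtype=int)
for j, acc in enumerate(np.linspace(0.80, 0.95, n_classifiers)):
    wrong = (0.5 * base_noise + 0.5 * rng.random(n_trials)) > acc
    decisions[:, j] = np.where(wrong, 1 - truth, truth)

def fused_error(cols):
    """Error rate of majority-vote fusion over the chosen classifier subset."""
    votes = decisions[:, list(cols)].sum(axis=1)
    fused = (2 * votes > len(cols)).astype(int)
    return float(np.mean(fused != truth))

# 'Best combination performance' rule: exhaustive search over every subset of a
# fixed size; the number of subsets grows combinatorially with the pool size.
best = min(combinations(range(n_classifiers), 3), key=fused_error)
print("best 3-classifier combination:", best, "error:", fused_error(best))
```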
Abstract:
Predicate encryption is a new primitive that supports flexible control over access to encrypted data. We study predicate encryption systems, evaluating a wide class of predicates. Our systems are more expressive than the existing attribute-hiding systems in the sense that the proposed constructions support not only all existing predicate evaluations but also arbitrary conjunctions and disjunctions of comparison and subset queries. Toward our goal, we propose encryption schemes supporting multi-inner-product predicate and provide formal security analysis. We show how to apply the proposed schemes to achieve all those predicate evaluations.
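The core trick behind inner-product predicate encryption is to encode predicates as vectors whose inner product with an attribute vector is zero exactly when the predicate holds, with conjunctions built by concatenating randomly weighted sub-vectors. The sketch below shows only this encoding over a prime field, not the encryption schemes or their attribute-hiding security; the modulus and helper names are illustrative.

```python
import secrets

P = 2**61 - 1  # a large prime modulus (illustrative)

def eq_attr(x):        # attribute side of an equality test
    return [1, x % P]

def eq_pred(a):        # predicate side of "attribute == a"
    return [a % P, P - 1]

def conj(pred_vecs):   # AND of several predicates via random scalar weights
    out = []
    for v in pred_vecs:
        r = secrets.randbelow(P - 1) + 1
        out.extend((r * c) % P for c in v)
    return out

def inner(x_vec, v_vec):
    return sum(a * b for a, b in zip(x_vec, v_vec)) % P

attrs = eq_attr(42) + eq_attr(7)             # record with two attributes
match = conj([eq_pred(42), eq_pred(7)])      # predicate: first == 42 AND second == 7
miss = conj([eq_pred(42), eq_pred(9)])

print(inner(attrs, match) == 0)   # True: both equalities hold
print(inner(attrs, miss) == 0)    # False (except with negligible probability)
```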
Abstract:
Needs assessment strategies can facilitate the prioritisation of resources. The aim was to develop a needs assessment tool for use with advanced cancer patients and caregivers, to prompt early intervention. A convenience sample of 103 health professionals viewed three videotaped consultations involving a simulated patient, his/her caregiver and a health professional, completed the Palliative Care Needs Assessment Tool (PC-NAT) and provided feedback on the clarity, content and acceptability of the PC-NAT. Face and content validity, acceptability and feasibility of the PC-NAT were confirmed. Kappa scores indicated adequate inter-rater reliability for the majority of domains; the patient spirituality domain and the caregiver physical and family and relationship domains had low reliability. The PC-NAT can be used by health professionals with a range of clinical expertise to identify individuals' needs, thereby enabling early intervention. Further psychometric testing and an evaluation assessing the impact of the systematic use of the PC-NAT on the quality of life, unmet needs and service utilisation of patients and caregivers are underway.
Abstract:
CubIT is a multi-user, large-scale presentation and collaboration framework installed at the Queensland University of Technology’s (QUT) Cube facility, an interactive installation made up of 48 multi-touch screens and very large projected display screens. CubIT was built to make the Cube facility accessible to QUT’s academic and student population. The system allows users to upload, interact with and share media content on the Cube’s very large display surfaces. CubIT implements a unique combination of features, including RFID authentication, content management through multiple interfaces, multi-user shared workspace support, drag-and-drop upload and sharing, dynamic state control between different parts of the system, and execution and synchronisation of the system across multiple computing nodes.