30 results for Orion DBMS, Database, Uncertainty, Uncertain values, Benchmark

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

Species distribution models (SDMs) are considered to exemplify pattern-based rather than process-based models of a species' response to its environment. Hence, when used to map species distribution, the purpose of SDMs can be viewed as interpolation, since species response is measured at a few sites in the study region and the aim is to interpolate species response at intermediate sites. Increasingly, however, SDMs are also being used to extrapolate species-environment relationships beyond the limits of the study region as represented by the training data. Regardless of whether SDMs are used for interpolation or extrapolation, the debate over how to implement them focuses on evaluating the quality of the SDM, both ecologically and mathematically. This paper proposes a framework that draws together useful tools previously employed to address uncertainty in habitat modelling. Alongside existing frameworks for addressing uncertainty in modelling more generally, we outline how these existing tools help inform the development of a broader framework for addressing uncertainty specifically when building habitat models. We focus on extrapolation rather than interpolation, where the emphasis on predictive performance is diluted by concerns for robustness and ecological relevance. We are cognisant of the dangers of excessively propagating uncertainty. Thus, although the framework provides a smorgasbord of approaches, it is intended that the exact menu selected for a particular application be small in size and target the most important sources of uncertainty. We conclude with some guidance on a strategic approach to identifying these important sources of uncertainty. Whilst various aspects of uncertainty in SDMs have previously been addressed, either as the main aim of a study or as a necessary element of constructing SDMs, this is the first paper to provide a more holistic view.

Relevance:

100.00%

Publisher:

Abstract:

In the medical and healthcare arena, patients' data is not just their own personal history but also a valuable large dataset for finding solutions for diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and major stakeholders such as physicians and their patients, the accessibility of such information should be dealt with in a way that preserves privacy and security. Thus, finding the best way to keep the data secure has become an important issue in the area of database security, and sensitive medical data should be encrypted in databases. There are many encryption/decryption techniques and algorithms for preserving privacy and security, and their performance is an important factor when medical data is managed in databases. Another important factor is that stakeholders should identify more cost-effective ways to reduce the total cost of ownership. As an alternative, DAS (Data as Service) is a popular outsourcing model that satisfies cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that incorporates Microsoft's built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas, with simulations carried out in two stages. The first stage uses six databases to measure the performance of plain text, Microsoft built-in encryption and AES-DAS. In particular, AES-DAS incorporates implementations of symmetric-key encryption, namely AES (Advanced Encryption Standard), and a bucket indexing processor using a Bloom filter. The results are categorised by character type, numeric type, range queries, range queries using the bucket index, and aggregate queries. The second stage is a scalability test from 5K to 2,560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS using the bucket index is around 3.32 times faster than plain AES-DAS with 70 partitions on 10K-record databases. Retrieving numeric-typed data takes less time than character-typed data in AES-DAS. The aggregation query response time in AES-DAS is not as consistent as that of the MS built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, query response time degrades rapidly. However, further investigation is needed to build on these simulations and to construct a secure EMR (Electronic Medical Record) more efficiently.
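
To make the bucket-indexing idea concrete, here is a minimal sketch of an order-preserving bucket index with a per-bucket Bloom filter. The partitioning rule, sizes and names below are illustrative assumptions, not the thesis' actual design, and the AES encryption of the stored values is omitted.

```python
# Minimal sketch of a bucket index with a Bloom filter, in the spirit
# of the AES-DAS model described above. All constants are assumptions.
import hashlib

NUM_BUCKETS = 70          # the thesis reports results for 70 partitions
DOMAIN_MAX = 10_000       # assumed upper bound of the indexed attribute
BLOOM_BITS = 1 << 16      # Bloom filter size in bits (illustrative)
NUM_HASHES = 3            # hash functions per inserted key

def bucket_of(value: int) -> int:
    """Order-preserving bucket id: the domain is split into equal-width
    partitions, so a range query maps to a contiguous set of buckets."""
    return min(value * NUM_BUCKETS // DOMAIN_MAX, NUM_BUCKETS - 1)

def _bloom_positions(key: str):
    """Derive NUM_HASHES bit positions for a key via salted SHA-256."""
    for salt in range(NUM_HASHES):
        digest = hashlib.sha256(f"{salt}:{key}".encode()).digest()
        yield int.from_bytes(digest[:4], "big") % BLOOM_BITS

class BloomIndex:
    """Answers 'possibly present' / 'definitely absent' for one bucket,
    letting the client skip whole buckets before any decryption."""
    def __init__(self):
        self.bits = bytearray(BLOOM_BITS // 8)

    def add(self, key: str) -> None:
        for p in _bloom_positions(key):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, key: str) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in _bloom_positions(key))

# A range query [lo, hi] is rewritten as buckets bucket_of(lo)..bucket_of(hi);
# the server returns only those ciphertexts and the client filters after
# decryption, consistent with the speed-up reported above.
idx = BloomIndex()
idx.add("patient:42")
print(bucket_of(4_321), idx.might_contain("patient:42"))  # 30 True
```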

Relevance:

100.00%

Publisher:

Abstract:

Malnutrition is a common problem in children with end-stage liver disease (ESLD), and accurate assessment of nutritional status is essential in managing these children. In a retrospective study, we compared nutritional assessment by anthropometry with that by body composition. We analyzed all consecutive measurements of total body potassium (TBK, n = 186) of children less than 3 years old with ESLD awaiting transplantation found in our database. The TBK values obtained by whole-body counting of 40K were compared with reference TBK values of healthy children. The prevalence of malnutrition, as assessed by weight (weight Z score < -2), was 28%, which was significantly lower (chi-square test, p < 0.0001) than the prevalence of malnutrition (76%) assessed by TBK (< 90% of expected TBK for age). These results demonstrated that body weight underestimated the nutritional deficit and stressed the importance of measuring body composition as part of assessing the nutritional status of children with ESLD.
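
A minimal sketch of the headline comparison, assuming the reported 28% and 76% both refer to the n = 186 measurements; the study's own counts may differ slightly from these rounded figures, and since both assessments were made on the same children a paired test could also be argued.

```python
# Chi-square comparison of the two prevalence estimates, mirroring the
# chi-square test reported in the abstract (counts are back-calculated
# from the reported percentages, so they are approximate).
from scipy.stats import chi2_contingency

n = 186
by_weight = round(0.28 * n)   # ~52 malnourished by weight Z score
by_tbk = round(0.76 * n)      # ~141 malnourished by TBK criterion

table = [
    [by_weight, n - by_weight],
    [by_tbk, n - by_tbk],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p well below 0.0001, as reported
```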

Relevance:

40.00%

Publisher:

Abstract:

CFD has been successfully used in the optimisation of aerodynamic surfaces for a given set of parameters such as Mach number and angle of attack. When carrying out a multidisciplinary design optimisation, however, one deals with situations where the parameters have some uncertainty attached. Any optimisation carried out for fixed values of the input parameters gives a design which may be totally unacceptable under off-design conditions. The challenge is to develop a robust design procedure which takes into account the fluctuations in the input parameters. In this work, we attempt this using a modified Taguchi approach, incorporated into an evolutionary algorithm with many features developed in-house. The method is tested on a UCAV design which simultaneously handles aerodynamics, electromagnetics and manoeuvrability. Results demonstrate that the method has considerable potential.
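
A minimal sketch of the robust-fitness idea in the spirit of a Taguchi-style approach: each candidate design is evaluated at a small set of perturbed operating points, and sensitivity is penalised alongside mean performance. The `performance` function, perturbation magnitudes and weighting are all illustrative stand-ins for the expensive CFD evaluation, not the paper's actual formulation.

```python
# Robust fitness over off-design conditions: combine the mean and the
# spread of performance across perturbed (Mach, angle-of-attack) points.
import statistics

# Illustrative +/- perturbations of the uncertain inputs (Mach, AoA).
PERTURBATIONS = [(-0.02, -0.5), (-0.02, +0.5), (+0.02, -0.5), (+0.02, +0.5)]

def performance(design, mach, aoa):
    """Placeholder for the expensive CFD objective (e.g. lift/drag)."""
    return -(design["sweep"] - 35.0) ** 2 - 10 * abs(mach - 0.8) - abs(aoa)

def robust_fitness(design, mach0=0.8, aoa0=2.0, k=1.0):
    """Mean performance minus k times its spread over off-design points;
    a design that only works at the nominal point scores poorly."""
    samples = [performance(design, mach0 + dm, aoa0 + da)
               for dm, da in PERTURBATIONS]
    return statistics.mean(samples) - k * statistics.pstdev(samples)

print(robust_fitness({"sweep": 35.0}))
```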

Relevance:

40.00%

Publisher:

Abstract:

One of the impediments to large-scale use of wind generation within power systems is its variable and uncertain real-time availability. Because of the low marginal cost of wind power, its output changes the merit order of power markets and influences the Locational Marginal Price (LMP). At large scales of wind penetration, LMP calculation cannot ignore this essentially variable and uncertain nature of wind power. This paper proposes an algorithm to estimate the LMP. The result of a conventional Monte Carlo simulation is taken as the benchmark against which accuracy is examined. A case study is conducted on a simplified south-east Australian power system, and the simulation results show the feasibility of the proposed method.
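
A minimal sketch of the conventional Monte Carlo benchmark for a single bus with no network constraints: sample wind availability, stack the merit order, and record the marginal unit's offer price. The unit data, demand and uniform wind distribution are illustrative assumptions, not the paper's system model.

```python
# Monte Carlo estimate of the expected LMP at one bus: wind (near-zero
# marginal cost) shifts the merit order, so the price-setting unit
# changes with the sampled wind output.
import random

# (capacity MW, marginal cost $/MWh) for conventional units.
UNITS = [(300, 20.0), (200, 45.0), (150, 80.0)]
WIND_CAPACITY, DEMAND = 100, 500  # MW

def clearing_price(wind_mw: float) -> float:
    residual = DEMAND - wind_mw          # wind is dispatched first
    for capacity, cost in sorted(UNITS, key=lambda u: u[1]):
        residual -= capacity
        if residual <= 0:
            return cost                  # this unit sets the price
    return UNITS[-1][1]                  # shortage: last unit caps the price

random.seed(1)
samples = [clearing_price(random.uniform(0, WIND_CAPACITY))
           for _ in range(10_000)]
print(f"expected LMP ~= {sum(samples) / len(samples):.2f} $/MWh")
```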

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Choosing the appropriate procurement system for construction projects is a complex and challenging task for clients, particularly when professional advice has not been sought. To assist with the decision-making process, a range of procurement selection tools and techniques have been developed by both academic and industry bodies. Public sector clients in Western Australia (WA) remain uncertain about the pairing of procurement method to a bespoke construction project and how this decision will ultimately impact upon project success. This paper examines how and why a public sector agency selected particular procurement methods.

Methodology/Approach: An analysis of two focus group workshops (with 18 senior project and policy managers involved with procurement selection) is reported upon.

Findings: The traditional lump sum (TLS) method is still the preferred procurement path, even though alternative forms such as design and construct or public-private partnerships could optimise the project outcome. Paradoxically, workshop participants agreed that alternative procurement forms should be considered, but an embedded culture of uncertainty avoidance invariably meant that TLS methods were selected. Senior managers felt that only a limited number of contractors have the resources and experience to deliver projects using the non-traditional methods considered.

Research limitations/implications: The research identifies a need to develop a framework that public sector clients can use to select an appropriate procurement method. A procurement framework should guide the decision-maker rather than provide a prescriptive solution. Learning from previous experiences with regard to procurement selection will further provide public sector clients with knowledge about how best to deliver their projects.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated that all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental net monetary benefit of $948 per catheter, but there was a 62% probability of error in this conclusion. Although the minocycline and rifampicin-coated catheters had the highest net monetary benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good-quality evidence in this area.
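
A minimal sketch of the net monetary benefit calculation that summarises this kind of analysis. The willingness-to-pay threshold below is an assumption for illustration; the inputs are the abstract's baseline figures per 1,000 catheters, expressed per catheter. The paper's probabilistic figure (AUD $948 for MR-coated catheters) comes from its full Markov model, not from this simple arithmetic.

```python
# Incremental net monetary benefit: NMB = WTP * delta_QALY - delta_cost.
WTP = 64_000.0  # assumed AUD per quality-adjusted life-year

def incremental_nmb(delta_qaly: float, delta_cost: float,
                    wtp: float = WTP) -> float:
    """Positive NMB favours the coated catheter at this WTP threshold."""
    return wtp * delta_qaly - delta_cost

# Baseline for MR-coated catheters: 1.6 QALYs gained and AUD $130,289
# saved per 1,000 catheters (a saving, hence a negative cost delta).
nmb = incremental_nmb(delta_qaly=1.6 / 1000, delta_cost=-130_289 / 1000)
print(f"baseline NMB per MR-coated catheter ~= AUD ${nmb:.0f}")
```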

Relevance:

30.00%

Publisher:

Abstract:

Unmanned Aerial Vehicles (UAVs) are emerging as an ideal platform for a wide range of civil applications such as disaster monitoring, atmospheric observation and outback delivery. However, the operation of UAVs is currently restricted to specially segregated regions of airspace outside of the National Airspace System (NAS). Mission Flight Planning (MFP) is an integral part of UAV operation that addresses some of the requirements (such as safety and the rules of the air) of integrating UAVs in the NAS. Automated MFP is a key enabler for a number of UAV operating scenarios as it aids in increasing the level of onboard autonomy. For example, onboard MFP is required to ensure continued conformance with the NAS integration requirements when there is an outage in the communications link. MFP is a motion planning task concerned with finding a path between a designated start waypoint and goal waypoint. This path is described with a sequence of 4-Dimensional (4D) waypoints (three spatial and one time dimension) or equivalently with a sequence of trajectory segments (or tracks). It is necessary to consider the time dimension as the UAV operates in a dynamic environment. Existing methods for generic motion planning, UAV motion planning and general vehicle motion planning cannot adequately address the requirements of MFP. The flight plan needs to optimise for multiple decision objectives including mission safety objectives, the rules of the air and mission efficiency objectives. Online (in-flight) replanning capability is needed as the UAV operates in a large, dynamic and uncertain outdoor environment. This thesis derives a multi-objective 4D search algorithm entitled Multi-Step A* (MSA*) based on the seminal A* search algorithm. MSA* is proven to find the optimal (least cost) path given a variable successor operator (which enables arbitrary track angle and track velocity resolution). Furthermore, it is shown to be of comparable complexity to multi-objective, vector neighbourhood based A* (Vector A*, an extension of A*). A variable successor operator enables the imposition of a multi-resolution lattice structure on the search space (which results in fewer search nodes). Unlike cell decomposition based methods, soundness is guaranteed with multi-resolution MSA*. MSA* is demonstrated through Monte Carlo simulations to be computationally efficient. It is shown that multi-resolution, lattice based MSA* finds paths of equivalent cost (less than 0.5% difference) to Vector A* (the benchmark) in a third of the computation time (on average). This is the first contribution of the research. The second contribution is the discovery of the additive consistency property for planning with multiple decision objectives. Additive consistency ensures that the planner is not biased (which results in a suboptimal path) by ensuring that the cost of traversing a track using one step equals that of traversing the same track using multiple steps. MSA* mitigates uncertainty through online replanning, Multi-Criteria Decision Making (MCDM) and tolerance. Each trajectory segment is modelled with a cell sequence that completely encloses the trajectory segment. The tolerance, measured as the minimum distance between the track and cell boundaries, is the third major contribution. Even though MSA* is demonstrated for UAV MFP, it is extensible to other 4D vehicle motion planning applications. Finally, the research proposes a self-scheduling replanning architecture for MFP.
This architecture replicates the decision strategies of human experts to meet the time constraints of online replanning. Based on a feedback loop, the proposed architecture switches between fast, near-optimal planning and optimal planning to minimise the need for hold manoeuvres. The derived MFP framework is original and shown, through extensive verification and validation, to satisfy the requirements of UAV MFP. As MFP is an enabling factor for operation of UAVs in the NAS, the presented work is both original and significant.
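
For orientation, here is a skeleton of the seminal A* search that MSA* builds on; it is not MSA* itself. The `successors` callback stands in for the variable successor operator, and the multiple decision objectives are assumed to be folded into a single additive step cost.

```python
# A* skeleton: nodes would be 4D waypoints in MFP; here a toy 2D grid.
import heapq

def a_star(start, goal, successors, heuristic):
    """Return the least-cost path from start to goal, or None."""
    open_heap = [(heuristic(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for nxt, step_cost in successors(node):
            g2 = g + step_cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(open_heap,
                               (g2 + heuristic(nxt, goal), g2, nxt, path + [nxt]))
    return None

# Toy 2D grid stand-in for the 4D waypoint lattice.
def grid_successors(p):
    x, y = p
    return [((x + dx, y + dy), 1.0)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

manhattan = lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1])
print(a_star((0, 0), (3, 2), grid_successors, manhattan))
```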

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, the issue of incorporating uncertainty into environmental modelling informed by imagery is explored by considering uncertainty in deterministic modelling, measurement uncertainty and uncertainty in image composition. Incorporating uncertainty in deterministic modelling is extended for use with imagery using the Bayesian melding approach. In the application presented, slope steepness is shown to be the main contributor to total uncertainty in the Revised Universal Soil Loss Equation. A spatial sampling procedure is also proposed to assist in implementing Bayesian melding, given the increased data size of models informed by imagery. Measurement error models are another approach to incorporating uncertainty when data is informed by imagery. These models for measurement uncertainty, considered in a Bayesian conditional independence framework, are applied to ecological data generated from imagery and shown to be appropriate and useful in certain situations. Measurement uncertainty is also considered in the context of change detection when two images are not co-registered. An approach for detecting change in two successive images is proposed that does not require the images to be co-registered. The procedure uses the Kolmogorov-Smirnov test on homogeneous segments of an image to detect change, with the homogeneous segments determined using a Bayesian mixture model of pixel values. Using the mixture model to segment an image also allows for uncertainty in the composition of an image. The thesis concludes by comparing several different Bayesian image segmentation approaches that allow for uncertainty regarding the allocation of pixels to different ground components. Each segmentation approach is applied to a data set of chlorophyll values and shown to have different benefits and drawbacks depending on the aims of the analysis.
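
A minimal sketch of the registration-free change check for one homogeneous segment: compare the distributions of pixel values at the two dates with a two-sample Kolmogorov-Smirnov test. The segment values below are simulated, and the upstream segmentation via the Bayesian mixture model is assumed to have already happened.

```python
# Two-sample KS test on one segment's pixel values at two dates; because
# only distributions are compared, pixel-to-pixel registration is not
# required.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
segment_t1 = rng.normal(0.42, 0.05, size=500)   # e.g. chlorophyll values
segment_t2 = rng.normal(0.55, 0.05, size=480)   # same segment, later image

stat, p = ks_2samp(segment_t1, segment_t2)
changed = p < 0.01
print(f"KS statistic = {stat:.3f}, p = {p:.3g}, change detected: {changed}")
```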

Relevance:

30.00%

Publisher:

Abstract:

From a ‘cultural science’ perspective, this paper traces one aspect of a more general shift, from the realist representational regime of modernity to the productive DIY systems of the internet era. It argues that collecting and archiving are transformed by this change. Modern museums – and also broadcast television – were based on determinist or ‘essence’ theory, while internet archives like YouTube (and the internet as an archive) are based on ‘probability’ theory. The paper works through the differences between modernist ‘essence’ and postmodern ‘probability’, starting from the obvious difference that in a museum each object is selected by experts for its intrinsic properties, while on the internet you don’t know what you will find. The status of individual objects is uncertain, although the productivity of the overall archive is unlimited. The paper links these differences with changes in contemporary culture – from a Newtonian to a quantum universe, progress to risk, institutional structure to evolutionary change, objectivity to uncertainty, identity to performance. Borrowing some of its methodology from science fiction, the paper uses examples from museums and online archives, ranging from the oldest stone tool in the world to the latest tribute vid on the net.

Relevance:

30.00%

Publisher:

Abstract:

The problem of decision making in an uncertain environment arises in many diverse contexts: deciding whether to keep a hard drive spinning in a netbook; choosing which advertisement to post to a website visitor; choosing how many newspapers to order so as to maximize profits; or choosing a route to recommend to a driver given limited and possibly out-of-date information about traffic conditions. All are sequential decision problems, since earlier decisions affect subsequent performance; and all require adaptive approaches, since they involve significant uncertainty. The key issue in effectively solving problems like these is known as the exploration/exploitation trade-off: if I am at a crossroads, when should I go in the most advantageous direction among those I have already explored, and when should I strike out in a new direction, in the hope that I will discover something better?
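
The passage does not commit to a particular algorithm; epsilon-greedy on a multi-armed bandit is the standard minimal illustration of the trade-off it describes, sketched here with invented arm rewards.

```python
# Epsilon-greedy bandit: with probability eps strike out in a new
# direction (explore); otherwise take the best direction found so far
# (exploit).
import random

def epsilon_greedy(true_means, steps=10_000, eps=0.1, seed=0):
    rng = random.Random(seed)
    counts = [0] * len(true_means)
    estimates = [0.0] * len(true_means)
    total = 0.0
    for _ in range(steps):
        if rng.random() < eps:
            arm = rng.randrange(len(true_means))        # explore
        else:
            arm = max(range(len(true_means)),
                      key=lambda a: estimates[a])       # exploit
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean
        total += reward
    return total / steps

# Average reward approaches the best arm's mean (0.9), less the small
# price paid for continued exploration.
print(f"average reward: {epsilon_greedy([0.2, 0.5, 0.9]):.3f}")
```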

Relevance:

30.00%

Publisher:

Abstract:

In dynamic and uncertain environments such as healthcare, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. The uncertainty stems from the unpredictability of users’ operational needs as well as their private incentives to misuse permissions. In Role Based Access Control (RBAC), a user’s legitimate access request may be denied because the need for it was not anticipated by the security administrator. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permission. This paper introduces a novel approach to access control under uncertainty and presents it in the context of RBAC. Taking insights from the field of economics, in particular the insurance literature, we propose a formal model where the value of resources is explicitly defined and an RBAC policy (capturing the predictable access needs) is used only as a reference point to determine the price each user has to pay for access, as opposed to representing hard and fast rules that are always rigidly applied.
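
A toy rendering of this idea, under loudly stated assumptions: the pricing rule, resource values and multipliers below are invented for illustration and are not the paper's formal model. The point is only the structure, in which the RBAC policy sets a cheap reference point rather than a hard rule, and off-policy access remains possible at a value-proportional price.

```python
# Hypothetical access-pricing sketch: in-policy requests are routine
# and cheap; unanticipated requests are priced against resource value.
RESOURCE_VALUE = {"patient_record": 100.0, "lab_result": 20.0}
RBAC_POLICY = {("nurse", "lab_result"), ("doctor", "patient_record"),
               ("doctor", "lab_result")}

def access_price(role: str, resource: str,
                 base_rate=0.01, risk_premium=0.25) -> float:
    """Price of one access: near-zero when the RBAC policy anticipated
    the need, a value-proportional premium when it did not."""
    value = RESOURCE_VALUE[resource]
    if (role, resource) in RBAC_POLICY:
        return base_rate * value          # anticipated, routine access
    return risk_premium * value           # unanticipated: user bears the risk

print(access_price("doctor", "lab_result"))     # 0.2  (in policy)
print(access_price("nurse", "patient_record"))  # 25.0 (off policy, priced)
```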

Relevance:

30.00%

Publisher:

Abstract:

Bioinformatics involves analyses of biological data such as DNA sequences, microarrays and protein-protein interaction (PPI) networks. Its two main objectives are the identification of genes or proteins and the prediction of their functions. Biological data often contain uncertain and imprecise information. Fuzzy theory provides useful tools for dealing with this type of information and hence has played an important role in analyses of biological data. In this thesis, we aim to develop new fuzzy techniques and apply them to DNA microarrays and PPI networks. We focus on three problems: (1) clustering of microarrays; (2) identification of disease-associated genes in microarrays; and (3) identification of protein complexes in PPI networks. The first part of the thesis aims to detect, by the fuzzy C-means (FCM) method, clustering structures in DNA microarrays corrupted by noise. Because of the presence of noise, some clustering structures found in random data may not have any biological significance. In this part, we propose to combine FCM with empirical mode decomposition (EMD) for clustering microarray data. The purpose of EMD is to reduce, preferably to remove, the effect of noise, resulting in what is known as denoised data. We call this method the fuzzy C-means method with empirical mode decomposition (FCM-EMD). We applied this method to yeast and serum microarrays, and silhouette values were used to assess the quality of clustering. The results indicate that the clustering structures of denoised data are more reasonable, implying that genes have tighter association with their clusters. Furthermore, we found that the estimation of the fuzzy parameter m, which is a difficult step, can be avoided to some extent by analysing denoised microarray data. The second part aims to identify disease-associated genes from DNA microarray data generated under different conditions, e.g., from patients and normal people. We developed a type-2 fuzzy membership (FM) function for identification of disease-associated genes. This approach is applied to diabetes and lung cancer data, and a comparison with the original FM test was carried out. Among the ten best-ranked genes of diabetes identified by the type-2 FM test, seven have been confirmed as diabetes-associated genes according to gene description information in GenBank and the published literature, and an additional gene is further identified. Among the ten best-ranked genes identified in the lung cancer data, seven are confirmed to be associated with lung cancer or its treatment. The type-2 FM-d values are significantly different, which makes the identifications more convincing than the original FM test. The third part of the thesis aims to identify protein complexes in large interaction networks. Identification of protein complexes is crucial to understanding the principles of cellular organisation and to predicting protein functions. In this part, we propose a novel method which combines the fuzzy clustering method and interaction probability to identify overlapping and non-overlapping community structures in PPI networks, and then to detect protein complexes in these sub-networks. Our method is based on both the fuzzy relation model and the graph model. We applied the method to several PPI networks and compared it with a popular protein complex identification method, the clique percolation method. For the same data, we detected more protein complexes. We also applied our method to two social networks; the results showed that it works well for detecting sub-networks and gives a reasonable understanding of these communities.
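
A minimal implementation of the fuzzy C-means step at the core of FCM-EMD; the EMD denoising is assumed to have been applied to the expression matrix upstream, and the toy data below is simulated. `m` is the fuzzifier parameter the thesis discusses.

```python
# Fuzzy C-means: each sample gets a soft membership in every cluster
# rather than a hard assignment.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Return (centres, membership matrix U) for data X (n_samples x d)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.dirichlet(np.ones(c), size=n)          # rows sum to 1
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        # squared distances from each sample to each centre
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1) + 1e-12
        U_new = 1.0 / (d2 ** (1 / (m - 1)))        # standard FCM update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centres, U

# Toy demo: two well-separated gene-expression-like clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 5)), rng.normal(3, 0.3, (20, 5))])
centres, U = fuzzy_c_means(X, c=2)
print(U.round(2)[:3])  # soft memberships of the first three "genes"
```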

Relevance:

30.00%

Publisher:

Abstract:

Children’s literature has conventionally and historically been concerned with identity and the often tortuous journey to becoming a subject who is generally older and wiser, a journey typically characterised by mishap, adventure, and detours. Narrative closure in children’s and young adult novels and films typically provides a point of self-realisation or self-actualisation, whereby the struggles of finding one’s “true” identity have been overcome. In this familiar coming-of-age narrative, there is often an underlying premise of an essential self that will emerge or be uncovered. This kind of narrative resolution reassures readers that things will work out for the best in the end, which is an enduring feature of children’s literature and part of liberal-humanism’s project of harmonious individuality. However, uncertainty is a constant that has always characterised the ways lives are lived, regardless of best-laid plans. Children’s literature provides a field of narrative knowledge whereby readers gain impressions of childhood and adolescence, or more specifically, knowledge of ways of being at a time in life which is marked by uncertainty. Despite the prevalence of children’s texts which continue to offer normative ways of being, in particular normative forms of gender behaviour, there are texts which resist the pull for characters to be “like everyone else” by exploring alternative subjectivities. Fiction, however, cannot be regarded as a source of evidence about the material realities of life; its strength lies in its affective and imaginative dimensions, which can nevertheless offer readers moments of reflection, recognition, or, in some cases, reality lessons. As a form of cultural production, contemporary children’s literature is highly responsive to social change and political debates, and is crucially implicated in shaping the values, attitudes and behaviours of children and young people.

Relevance:

30.00%

Publisher:

Abstract:

Increasingly, societies and their governments are facing important social issues that have science and technology as key features. A number of these socio-scientific issues have two features that distinguish them from the restricted contexts in which school science has traditionally been presented: some of their science is uncertain, and scientific knowledge is not the only knowledge involved. As a result, the concepts of uncertainty, risk and complexity become essential aspects of the science underlying these issues. In this chapter we discuss the nature and role of these concepts in the public understanding of science and consider their links with school science. We argue that these same concepts, and their role in contemporary scientific knowledge, need to be addressed in school science curricula. The new demands this urgent challenge places on content, pedagogy and assessment for science educators are outlined. These will be essential if the goal of science education for citizenship is to be achieved with our students, who will increasingly be required to make personal and collective decisions on issues involving science and technology.