440 results for ACCURATE


Relevance:

10.00%

Publisher:

Abstract:

The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge, at regular intervals during its motion. A method for automating the generation of the input file is presented which will allow for more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm x 5 cm, 10 cm x 10 cm and 20 cm x 20 cm), four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. Results of this study show agreement between the measured and the MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable the MC verification of patient treatments involving dynamic wedges.
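As a minimal sketch of the 3%/3 mm comparison criterion the abstract uses, the following code computes a 1-D gamma index per reference point (in the spirit of the standard Low et al. formulation); the profiles and tolerances here are illustrative values, not data from the study.

```python
def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dose_tol=0.03, dist_tol=3.0):
    """1-D gamma index per reference point.

    dose_tol is a fraction of the reference maximum (3% of absolute dose);
    dist_tol is the distance-to-agreement criterion in mm.
    A point passes the comparison when its gamma value is <= 1.
    """
    d_max = max(ref_dose)
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        best = float("inf")
        for ep, ed in zip(eval_pos, eval_dose):
            dd = (ed - rd) / (dose_tol * d_max)   # normalised dose difference
            dx = (ep - rp) / dist_tol             # normalised distance
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        gammas.append(best)
    return gammas

# hypothetical measured vs simulated profile that agrees within criteria
ref_pos, ref_dose = [0.0, 1.0, 2.0], [1.0, 0.8, 0.5]
eval_pos, eval_dose = [0.0, 1.0, 2.0], [1.01, 0.79, 0.51]
g = gamma_index(ref_pos, ref_dose, eval_pos, eval_dose)
pass_rate = sum(x <= 1.0 for x in g) / len(g)
```

A dedicated tool would evaluate this on dense 2-D or 3-D dose grids; the brute-force search over evaluation points is kept only for clarity.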

Relevance:

10.00%

Publisher:

Abstract:

Objective: to assess the accuracy of data linkage across the spectrum of emergency care in the absence of a unique patient identifier, and to use the linked data to examine service delivery outcomes in an emergency department setting. Design: automated data linkage and manual data linkage were compared to determine their relative accuracy. Data were extracted from three separate health information systems: ambulance, ED and hospital inpatients, then linked to provide information about the emergency journey of each patient. The linking was done manually through physical review of records and automatically using a data linking tool (Health Data Integration) developed by the CSIRO. Match rate and quality of the linking were compared. Setting: 10,835 patient presentations to a large, regional teaching hospital ED over a two-month period (August-September 2007). Results: comparison of the manual and automated linkage outcomes for each pair of linked datasets demonstrated a sensitivity of between 95% and 99%; a specificity of between 75% and 99%; and a positive predictive value of between 88% and 95%. Conclusions: our results indicate that automated linking provides a sound basis for health service analysis, even in the absence of a unique patient identifier. The use of an automated linking tool yields accurate data suitable for planning and service delivery purposes and enables the data to be linked regularly to examine service delivery outcomes.
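The three accuracy measures quoted above follow directly from a confusion-matrix comparison of the automated links against the manual (gold-standard) linkage. A minimal sketch, using illustrative counts rather than the study's data:

```python
def linkage_accuracy(tp, fp, tn, fn):
    """Accuracy of automated linkage vs a manual gold standard.

    tp: pairs linked by both methods; fp: linked only automatically;
    tn: linked by neither; fn: linked only manually.
    """
    sensitivity = tp / (tp + fn)   # fraction of true links found
    specificity = tn / (tn + fp)   # fraction of non-links correctly rejected
    ppv = tp / (tp + fp)           # fraction of automated links that are true
    return sensitivity, specificity, ppv

# hypothetical counts chosen to land in the ranges the abstract reports
sens, spec, ppv = linkage_accuracy(tp=950, fp=50, tn=900, fn=10)
```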

Relevance:

10.00%

Publisher:

Abstract:

The magneto-rheological (MR) fluid damper is a semi-active control device that has recently received increased attention from the vibration control community. However, the inherent nonlinear hysteresis of MR fluid dampers is one of the challenging aspects of using this device to achieve high system performance. The development of an accurate model is therefore necessary to take advantage of their unique characteristics. Research by others [3] has shown that a system of nonlinear differential equations can successfully be used to describe the hysteresis behavior of the MR damper. The focus of this paper is to develop an alternative method for modeling the damper in the form of a centre-average fuzzy inference system, where back-propagation learning rules are used to adjust the weights of the network. The inputs for the model are taken from experimental data. The resulting fuzzy inference system satisfactorily represents the behavior of the MR fluid damper with reduced computational requirements. Use of the neuro-fuzzy model increases the feasibility of real-time simulation.
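To make the centre-average idea concrete, the sketch below evaluates a tiny fuzzy inference system of that form: each rule fires with a Gaussian membership strength and the output is the strength-weighted average of the rule consequents (the quantities a back-propagation rule would adjust during training). The two-rule velocity-to-force mapping is entirely hypothetical, not the paper's trained model.

```python
from math import exp

def gaussian(x, centre, spread):
    """Gaussian membership function."""
    return exp(-((x - centre) / spread) ** 2)

def centre_average_fis(x, rules):
    """Centre-average defuzzification: weighted mean of rule consequents.

    Each rule is (membership centre, membership spread, consequent value).
    """
    num = den = 0.0
    for centre, spread, consequent in rules:
        w = gaussian(x, centre, spread)   # firing strength of this rule
        num += w * consequent
        den += w
    return num / den

# hypothetical two-rule model: negative velocity -> -100 N, positive -> +100 N
rules = [(-1.0, 0.5, -100.0), (1.0, 0.5, 100.0)]
force_zero = centre_average_fis(0.0, rules)   # symmetric rules cancel
force_pos = centre_average_fis(1.0, rules)    # dominated by the second rule
```

A real neuro-fuzzy damper model would use many rules over (displacement, velocity, voltage) inputs and fit the centres, spreads and consequents to experimental force data.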

Relevance:

10.00%

Publisher:

Abstract:

Intelligible and accurate risk-based decision-making requires a complex balance of information from different sources, appropriate statistical analysis of this information and consequent intelligent inference and decisions made on the basis of these analyses. Importantly, this requires an explicit acknowledgement of uncertainty in the inputs and outputs of the statistical model. The aim of this paper is to progress a discussion of these issues in the context of several motivating problems related to the wider scope of agricultural production. These problems include biosecurity surveillance design, pest incursion, environmental monitoring and import risk assessment. The information to be integrated includes observational and experimental data, remotely sensed data and expert information. We describe our efforts in addressing these problems using Bayesian models and Bayesian networks. These approaches provide a coherent and transparent framework for modelling complex systems, combining the different information sources, and allowing for uncertainty in inputs and outputs. While the theory underlying Bayesian modelling has a long and well established history, its application is only now becoming more possible for complex problems, due to increased availability of methodological and computational tools. Of course, there are still hurdles and constraints, which we also address through sharing our endeavours and experiences.
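As a minimal illustration of the kind of uncertainty-aware Bayesian reasoning described above, the following sketch applies Bayes' rule to a single binary biosecurity surveillance observation. The prior and the test's sensitivity and specificity are hypothetical values, not figures from the paper.

```python
def posterior_infested(prior, sensitivity, specificity, detected):
    """Bayes' rule for a binary surveillance test.

    Returns P(site infested | detection result), given the prior
    probability of infestation and the detector's sensitivity
    (true-positive rate) and specificity (true-negative rate).
    """
    if detected:
        num = sensitivity * prior
        den = num + (1.0 - specificity) * (1.0 - prior)
    else:
        num = (1.0 - sensitivity) * prior
        den = num + specificity * (1.0 - prior)
    return num / den

# hypothetical numbers: rare pest, fairly good detector
p_pos = posterior_infested(prior=0.01, sensitivity=0.9,
                           specificity=0.95, detected=True)
p_neg = posterior_infested(prior=0.01, sensitivity=0.9,
                           specificity=0.95, detected=False)
```

Even a positive detection leaves substantial uncertainty here because the pest is rare, which is exactly the kind of conclusion that an explicit treatment of input uncertainty makes transparent; a full Bayesian network would chain many such conditional relationships together.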

Relevance:

10.00%

Publisher:

Abstract:

A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators, and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require the data to be accurate, timely and lossless. However, because of random transmission delays and packet losses, the control performance of a control system may be severely degraded, and the control system rendered unstable. The main challenge of NCS design is to both maintain and improve stable control performance of an NCS. To achieve this, communication and control methodologies have to be designed. In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy communication requirements for NCSs, such as real-time communication and high-precision clock consistency requirements. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design.
To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft-real-time control applications are modelled by use of a Markov chain model in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern. By using a Markov chain model, we can accurately model the tradeoff between real-time performance and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve the tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
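The flavour of the Markov-chain delay analysis can be sketched in a few lines: given a transition matrix over channel states, the stationary distribution weights the per-state delays into a long-run mean. The two-state channel and per-state delays below are assumptions for illustration, not the thesis's actual 802.11 model.

```python
def stationary(P, iters=200):
    """Power-iterate a row-stochastic transition matrix to its
    stationary distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# hypothetical two-state channel: state 0 = prompt delivery,
# state 1 = contention backoff / retransmission
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)

# assumed per-state transmission delays in milliseconds
delays_ms = [1.0, 5.0]
mean_delay = sum(p * d for p, d in zip(pi, delays_ms))
```

A soft-real-time analysis would then compare the tail of this delay distribution, not just its mean, against the control loop's sampling deadline.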

Relevance:

10.00%

Publisher:

Abstract:

Video surveillance technology, based on Closed Circuit Television (CCTV) cameras, is one of the fastest growing markets in the field of security technologies. However, existing video surveillance systems are still not at a stage where they can be used for crime prevention. The systems rely heavily on human observers and are therefore limited by factors such as fatigue and monitoring capability over long periods of time. To overcome this limitation, it is necessary to have “intelligent” processes which are able to highlight the salient data and filter out normal conditions that do not pose a threat to security. In order to create such intelligent systems, an understanding of human behaviour, specifically suspicious behaviour, is required. One of the challenges in achieving this is that human behaviour can only be understood correctly in the context in which it appears. Although context has been exploited in the general computer vision domain, it has not been widely used in the automatic suspicious behaviour detection domain. It is therefore essential that context be formulated, stored and used by the system in order to understand human behaviour. Finally, since surveillance systems can be modelled as large-scale data stream systems, it is difficult to have a complete knowledge base. In this case, the systems need not only to continuously update their knowledge but also to be able to retrieve the extracted information related to the given context. To address these issues, a context-based approach for detecting suspicious behaviour is proposed. In this approach, contextual information is exploited in order to improve detection. The proposed approach utilises a data stream clustering algorithm to discover the behaviour classes and their frequencies of occurrence from the incoming behaviour instances. Contextual information is then used in addition to the above information to detect suspicious behaviour.
The proposed approach is able to detect observed, unobserved and contextual suspicious behaviour. Two case studies using video feeds taken from the CAVIAR dataset and the Z-block building, Queensland University of Technology, are presented in order to test the proposed approach. These experiments show that by using information about context, the proposed system is able to make more accurate detections, especially for those behaviours which are suspicious only in some contexts while being normal in others. Moreover, this information gives critical feedback to the system designers to refine the system. Finally, the proposed modified CluStream algorithm enables the system both to continuously update its knowledge and to effectively retrieve the information learned in a given context. The outcomes from this research are: (a) a context-based framework for automatically detecting suspicious behaviour which can be used by an intelligent video surveillance system in making decisions; (b) a modified CluStream data stream clustering algorithm which continuously updates the system knowledge and is able to retrieve contextually related information effectively; and (c) an update-describe approach which extends the capability of the existing human local motion features, called interest-point-based features, to the data stream environment.
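The core bookkeeping behind CluStream-style data stream clustering is the micro-cluster: an incrementally updated summary (count, linear sum, squared sum per feature) that lets the model absorb behaviour instances one at a time without storing them. A minimal sketch, with feature vectors and values that are purely illustrative:

```python
class MicroCluster:
    """Incremental summary of one behaviour cluster in a data stream.

    Maintains count, per-dimension linear sum and squared sum, which is
    enough to recover the centroid (and, with the squared sums, a radius)
    without retaining the individual instances.
    """

    def __init__(self, dim):
        self.n = 0
        self.linear_sum = [0.0] * dim
        self.squared_sum = [0.0] * dim

    def absorb(self, x):
        """Fold one feature vector into the summary."""
        self.n += 1
        for i, v in enumerate(x):
            self.linear_sum[i] += v
            self.squared_sum[i] += v * v

    def centroid(self):
        return [s / self.n for s in self.linear_sum]

# two hypothetical 2-D behaviour feature vectors from the stream
mc = MicroCluster(dim=2)
for point in [[1.0, 2.0], [3.0, 4.0]]:
    mc.absorb(point)
```

The full algorithm additionally assigns each arriving instance to its nearest micro-cluster, creates or merges clusters under a memory budget, and keeps timestamped snapshots so past contexts can be queried later.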

Relevance:

10.00%

Publisher:

Abstract:

Copyright protects much of the creative, cultural, educational, scientific and informational material generated by federal, State/Territory and local governments and their constituent departments and agencies. Governments at all levels develop, manage and distribute a vast array of materials in the form of documents, reports, websites, datasets and databases on CD or DVD and files that can be downloaded from a website. Under the Copyright Act 1968 (Cth), with few exceptions government copyright is treated the same as copyright owned by non-government parties insofar as the range of protected materials and the exclusive proprietary rights attaching to them are concerned. However, the rationale for recognizing copyright in public sector materials and vesting ownership of copyright in governments is fundamentally different to the main rationales underpinning copyright generally. The central justification for recognizing Crown copyright is to ensure that government documents and materials created for public administrative purposes are disseminated in an accurate and reliable form. Consequently, the exclusive rights held by governments as copyright owners must be exercised in a manner consistent with the rationale for conferring copyright ownership on them. Since Crown copyright exists primarily to ensure that documents and materials produced for use in the conduct of government are circulated in an accurate and reliable form, governments should exercise their exclusive rights to ensure that their copyright materials are made available for access and reuse, in accordance with any laws and policies relating to access to public sector materials. 
While copyright law vests copyright owners with extensive bundles of exclusive rights which can be exercised to prevent others making use of the copyright material, in the case of Crown copyright materials these rights should rarely be asserted by government to deviate from the general rule that Crown copyright materials will be available for “full and free reproduction” by the community at large.

Relevance:

10.00%

Publisher:

Abstract:

Techniques for the accurate measurement of ionising radiation have been evolving since Roentgen first discovered x-rays in 1895; until now experimental measurements of radiation fields in the three spatial dimensions plus time have not been successfully demonstrated. In this work, we embed an organic plastic scintillator in a polymer gel dosimeter to obtain the first quasi-4D experimental measurement of a radiation field.

Relevance:

10.00%

Publisher:

Abstract:

The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and assessment of the machine health. Effective diagnostics and prognostics are important aspects of CBM, allowing maintenance engineers to schedule a repair and to acquire replacement components before the components actually fail. Although a variety of prognostic methodologies have been reported recently, their application in industry is still relatively new and mostly focused on the prediction of specific component degradations. Furthermore, they require a sufficient number of fault indicators to accurately prognose component faults. Hence, sufficient usage of health indicators in prognostics for the effective interpretation of the machine degradation process is still required. Major challenges for accurate long-term prediction of remaining useful life (RUL) remain to be addressed. Therefore, continuous development and improvement of machine health management systems and accurate long-term prediction of machine remnant life are required in real industry applications. This thesis presents an integrated diagnostics and prognostics framework based on health state probability estimation for accurate and long-term prediction of machine remnant life. In the proposed model, prior empirical (historical) knowledge is embedded in the integrated diagnostics and prognostics system for classification of impending faults in the machine system and accurate probability estimation of discrete degradation stages (health states). The methodology assumes that machine degradation consists of a series of degraded states (health states) which effectively represent the dynamic and stochastic process of machine failure.
The estimation of discrete health state probabilities for the prediction of machine remnant life is performed using classification algorithms. To select the appropriate classifier for health state probability estimation in the proposed model, comparative intelligent diagnostic tests were conducted using five different classifiers applied to the progressive fault data of three different faults in a high pressure liquefied natural gas (HP-LNG) pump. As a result of this comparison study, support vector machines (SVMs) were employed in health state probability estimation for the prediction of machine failure in this research. The proposed prognostic methodology has been successfully tested and validated using a number of case studies, from simulation tests to real industry applications. The results from two actual failure case studies using simulations and experiments indicate that accurate estimation of health states is achievable and that the proposed method provides accurate long-term prediction of machine remnant life. In addition, the results of experimental tests show that the proposed model has the capability of providing early warning of abnormal machine operating conditions by identifying the transitional states of machine fault conditions. Finally, the proposed prognostic model is validated through two industrial case studies. The optimal number of health states, which can minimise the model training error without significant decrease of prediction accuracy, was also examined through several health states of bearing failure. The results were very encouraging and show that the proposed prognostic model based on health state probability estimation has the potential to be used as a generic and scalable asset health estimation tool in industrial machinery.
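One simple way a classifier's health-state probabilities can be turned into a single remnant-life estimate is as a probability-weighted expectation over the nominal remaining life of each discrete state. The four states and their lifetimes below are hypothetical placeholders, not the thesis's fitted model:

```python
def expected_rul(state_probs, state_ruls):
    """Expected remaining useful life from discrete health-state
    probabilities.

    state_probs: classifier's probability for each degradation state
                 (must sum to 1).
    state_ruls:  nominal remaining life (e.g. hours) assigned to each state.
    """
    assert abs(sum(state_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * r for p, r in zip(state_probs, state_ruls))

# four hypothetical degradation states: new -> worn -> degraded -> near-failure
probs = [0.1, 0.6, 0.25, 0.05]           # e.g. SVM posterior estimates
ruls = [1000.0, 600.0, 250.0, 50.0]      # assumed hours remaining per state
rul = expected_rul(probs, ruls)
```

Tracking how the probability mass shifts toward the later states over time is also what enables the early-warning behaviour described above.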

Relevance:

10.00%

Publisher:

Abstract:

The potential restriction to effective dispersal and gene flow caused by habitat fragmentation can apply to multiple levels of evolutionary scale: from the fragmentation of ancient supercontinents driving diversification and speciation on disjunct landmasses, to the isolation of proximate populations as a result of their inability to cross intervening unsuitable habitat. Investigating the role of habitat fragmentation in driving diversity within and among taxa can thus include inferences of phylogenetic relationships among taxa, assessments of intraspecific phylogeographic structure and analyses of gene flow among neighbouring populations. The proposed Gondwanan clade within the chironomid (non-biting midge) subfamily Orthocladiinae (Diptera: Chironomidae) represents a model system for investigating the role that population fragmentation and isolation have played at different evolutionary scales. A pilot study by Krosch et al (2009) identified several highly divergent lineages restricted to ancient rainforest refugia and limited gene flow among proximate sites within a refuge for one member of this clade, Echinocladius martini Cranston. This study provided a framework for investigating the evolutionary history of this taxon and its relatives more thoroughly. Populations of E. martini were sampled in the Paluma bioregion of northeast Queensland to investigate patterns of fine-scale within- and among-stream dispersal and gene flow within a refuge more rigorously. Data were incorporated from Krosch et al (2009) and additional sites were sampled up- and downstream of the original sites. Analyses of genetic structure revealed strong natal site fidelity and high genetic structure among geographically proximate streams. Little evidence was found for regular headwater exchange among upstream sites, but there was distinct evidence for rare adult flight among sites on separate stream reaches.
Overall, however, the distribution of shared haplotypes implied that both larval and adult dispersal was largely limited to the natal stream channel. Patterns of regional phylogeographic structure were examined in two related austral orthoclad taxa – Naonella forsythi Boothroyd from New Zealand and Ferringtonia patagonica Sæther and Andersen from southern South America – to provide a comparison with patterns revealed in their close relative E. martini. Both taxa inhabit tectonically active areas of the southern hemisphere that have also experienced several glaciation events throughout the Plio-Pleistocene that are thought to have affected population structure dramatically in many taxa. Four highly divergent lineages estimated to have diverged since the late Miocene were revealed in each taxon, mirroring patterns in E. martini; however, there was no evidence for local geographical endemism, implying substantial range expansion post-diversification. The differences in pattern evident among the three related taxa were suggested to have been influenced by variation in the responses of closed forest habitat to climatic fluctuations during interglacial periods across the three landmasses. Phylogeographic structure in E. martini was resolved at a continental scale by expanding upon the sampling design of Krosch et al (2009) to encompass populations in southeast Queensland, New South Wales and Victoria. Patterns of phylogeographic structure were consistent with expectations and several previously unrecognised lineages were revealed from central- and southern Australia that were geographically endemic to closed forest refugia. Estimated divergence times were congruent with the timing of Plio-Pleistocene rainforest contractions across the east coast of Australia. This suggested that dispersal and gene flow of E. martini among isolated refugia was highly restricted and that this taxon was susceptible to the impacts of habitat change. 
Broader phylogenetic relationships among taxa considered to be members of this Gondwanan orthoclad group were resolved in order to test expected patterns of evolutionary affinities across the austral continents. The inferred phylogeny and estimated divergence times did not accord with expected patterns based on the geological sequence of break-up of the Gondwanan supercontinent and implied instead several transoceanic dispersal events post-vicariance. Difficulties in appropriate taxonomic sampling and accurate calibration of molecular phylogenies notwithstanding, the sampling regime implemented in the current study has been the most intensive yet performed for austral members of the Orthocladiinae and unsurprisingly has revealed both novel taxa and phylogenetic relationships within and among described genera. Several novel associations between life stages are made here for both described and previously unknown taxa. Investigating evolutionary relationships within and among members of this clade of proposed Gondwanan orthoclad taxa has demonstrated that a complex interaction between historical population fragmentation and dispersal at several levels of evolutionary scale has been important in driving diversification in this group. While interruptions to migration, colonisation and gene flow driven by population fragmentation have clearly contributed to the development and maintenance of much of the diversity present in this group, long-distance dispersal has also played a role in influencing diversification of continental biotas and facilitating gene flow among disjunct populations.

Relevance:

10.00%

Publisher:

Abstract:

A computational fluid dynamics (CFD) analysis has been performed for a flat plate photocatalytic reactor using the CFD code FLUENT. Under the simulated conditions (Reynolds number, Re, around 2650), a detailed time-accurate computation shows the different stages of flow evolution and the effects of the finite length of the reactor in creating flow instability, which is important for improving the performance of the reactor for storm and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on reactor hydrodynamics and configuration. This study aims to investigate the role of different parameters in the optimisation of the reactor design for improved performance. In this regard, further modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat plate photocatalytic reactor.
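The quoted flow regime follows from the standard definition Re = ρUL/μ. A quick sketch, with water-like property values and a characteristic velocity and length chosen purely to illustrate how Re ≈ 2650 (the transitional regime for channel flow) arises; these are not the study's actual reactor dimensions:

```python
def reynolds_number(density, velocity, length, viscosity):
    """Re = rho * U * L / mu (SI units: kg/m^3, m/s, m, Pa.s)."""
    return density * velocity * length / viscosity

# illustrative water-like values landing near the quoted Re ~ 2650
re = reynolds_number(density=998.0,      # kg/m^3
                     velocity=0.0266,    # m/s
                     length=0.1,         # m, characteristic channel scale
                     viscosity=1.002e-3) # Pa.s
```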

Relevance:

10.00%

Publisher:

Abstract:

Information overload has become a serious issue for web users. Personalisation can provide effective solutions to overcome this problem. Recommender systems are one popular personalisation tool to help users deal with this issue. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affects the performance of recommender systems and other personalisation systems. In Web 2.0, emerging user information provides new possible solutions for profiling users. Folksonomy, or tag information, is a typical kind of Web 2.0 information. Folksonomy implies users' topic interests and opinion information. It has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solve the tag quality problem and profile users accurately. Harvesting the wisdom of crowds and experts, three new user profiling approaches are proposed: a folksonomy-based user profiling approach, a taxonomy-based user profiling approach, and a hybrid user profiling approach based on folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user- and item-based collaborative filtering approaches, combined with content filtering methods, are proposed to make recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites.
The experimental results demonstrate that the proposed user profiling and recommendation approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on advanced cloud computing techniques such as Hadoop, MapReduce and Cascading. The scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users solve information overload issues by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better usage of taxonomy information given by experts and folksonomy information contributed by users in Web 2.0.
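The recommendation step described above can be sketched in its simplest form: represent both the user profile and each item as vectors over the same tag/topic space and rank items by cosine similarity. The three-topic vectors and item names below are hypothetical, and a real system would combine this content score with collaborative-filtering neighbourhoods:

```python
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def recommend(user_profile, items):
    """Rank items by similarity to a tag/topic-based user profile."""
    scored = sorted(items.items(),
                    key=lambda kv: cosine(user_profile, kv[1]),
                    reverse=True)
    return [name for name, _ in scored]

# hypothetical profile and items over three tag topics
profile = [1.0, 0.0, 0.5]
items = {"book_a": [0.9, 0.1, 0.4],   # close to the profile
         "book_b": [0.0, 1.0, 0.0]}   # orthogonal to the profile
ranking = recommend(profile, items)
```

The tag-noise problems the thesis tackles (synonyms, ambiguity, personal tags) show up here as distorted vector components, which is why cleaning the folksonomy before building these vectors matters.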

Relevance:

10.00%

Publisher:

Abstract:

In the analysis of medical images for computer-aided diagnosis and therapy, segmentation is often required as a preliminary step. Medical image segmentation is a complex and challenging task due to the complex nature of the images. The brain has a particularly complicated structure and its precise segmentation is very important for detecting tumors, edema, and necrotic tissues in order to prescribe appropriate therapy. Magnetic Resonance Imaging is an important diagnostic imaging technique utilized for early detection of abnormal changes in tissues and organs. It possesses good contrast resolution for different tissues and is, thus, preferred over Computerized Tomography for brain study. Therefore, the majority of research in medical image segmentation concerns MR images. As the core of this research, a set of MR images has been segmented using standard image segmentation techniques to isolate a brain tumor from the other regions of the brain. Subsequently, the resultant images from the different segmentation techniques were compared with each other and analyzed by professional radiologists to find the segmentation technique which is the most accurate. Experimental results show that Otsu's thresholding method is the most suitable image segmentation method to segment a brain tumor from a Magnetic Resonance Image.
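Otsu's method, the technique the abstract singles out, chooses the grey level that maximises the between-class variance of the image histogram. A minimal sketch on a toy 8-level bimodal histogram (dark background vs a bright "tumour" peak); the histogram is illustrative, not data from the study:

```python
def otsu_threshold(hist):
    """Otsu's method: return the grey level t maximising between-class
    variance, with levels <= t treated as background."""
    total = sum(hist)
    total_mean = sum(i * h for i, h in enumerate(hist)) / total
    best_t, best_var = 0, 0.0
    w0 = cum = 0.0
    for t, h in enumerate(hist):
        w0 += h / total        # background class weight up to level t
        cum += t * h / total   # cumulative first moment
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = cum / w0                       # background mean
        m1 = (total_mean - cum) / w1        # foreground mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# toy bimodal histogram over 8 grey levels
hist = [10, 8, 1, 0, 0, 1, 9, 11]
t = otsu_threshold(hist)
```

On a real MR slice the same computation runs over a 256-bin intensity histogram, after which pixels above the threshold form the candidate tumour mask.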

Relevance:

10.00%

Publisher:

Abstract:

This study uses dosimetry film measurements and Monte Carlo simulations to investigate the accuracy of type-a (pencil-beam) dose calculations for predicting the radiation doses delivered during stereotactic radiotherapy treatments of the brain. It is shown that when evaluating doses in a water phantom, the type-a algorithm provides dose predictions which are accurate to within clinically relevant criteria, gamma(3%,3mm), but these predictions are nonetheless subtly different from the results of evaluating doses from the same fields using radiochromic film and Monte Carlo simulations. An analysis of a clinical meningioma treatment suggests that when predicting stereotactic radiotherapy doses to the brain, the inaccuracies of the type-a algorithm can be exacerbated by inadequate evaluation of the effects of nearby bone or air, resulting in dose differences of up to 10% for individual fields. The results of this study indicate the possible advantage of using Monte Carlo calculations, as well as measurements with high-spatial resolution media, to verify type-a predictions of dose delivered in cranial treatments.

Relevance:

10.00%

Publisher:

Abstract:

As dictated by s 213 of the Body Corporate and Community Management Act 1997 (Qld), the seller of a proposed lot is required to provide the buyer with a disclosure statement before the contract is entered into. Where the seller subsequently becomes aware that information contained in the disclosure statement was inaccurate when the contract was entered into or the disclosure statement would not be accurate if now given as a disclosure statement, the seller must, within 14 days, give the buyer a further statement rectifying the inaccuracies in the disclosure statement. Provided the contract has not been settled, where a further statement varies the disclosure statement to such a degree that the buyer would be materially prejudiced if compelled to complete the contract, the buyer may cancel the contract by written notice given to the seller within 14 days, or a longer period as agreed between the parties, after the seller gives the buyer the further statement. The term ‘material prejudice’ was considered by Wilson J in Wilson v Mirvac Queensland Pty Ltd.