994 results for REASONING OVER INCONSISTENCY


Relevance: 20.00%

Abstract:

"For every complex problem there is a solution that is simple, neat and wrong (M.L. Mencken, US writer and social commentator). Nowhere is this quote more apt than when applied to finding over-simplified solutions to the complex problem of looking after the safety and well-being of vulnerable children. The easiest formula is, of course, to ‘rescue children from dysfunctional families’, a line taken recently in the monograph by the right wing think tank, Centre for Independent Studies (Sammut & O’Brien 2009). It is reasoning with fatal flaws. This commentary provides a timely reminder of the strong arguments which lie behind the national and international shift to supporting children and families through universal and specialist community-based services, rather than weighting all resources into statutory child protection interventions. A brief outline of the value of developing the resources to support children in their families, and the problems with 'rescuing' children through the child protection system are discussed.

Relevance: 20.00%

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by user profiles. Learning to use user profiles effectively is one of the most challenging tasks in developing an IF system. With document selection criteria better defined according to users' needs, filtering large streams of information can be more efficient and effective. Term-based approaches to learning user profiles have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches struggle with polysemy and synonymy, which often lead to an information overload problem. More recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency patterns. The measures used by data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering, and they can lead to a mismatch problem.

This thesis uses rough set-based (term-based) reasoning and pattern mining as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage minimizes information overload by filtering out the most likely irrelevant information based on the user profiles; a novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory. The second stage (pattern mining) addresses the information mismatch problem and is precision-oriented: a new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system improved significantly.

The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed system significantly outperforms other IF systems, including the traditional Rocchio model, state-of-the-art term-based models such as BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
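The two-stage control flow described above can be illustrated with a minimal sketch. The profile terms, pattern weights and threshold below are hypothetical stand-ins: stage one discards documents whose term-based score falls below a threshold (standing in for the rough-set topic filter), and stage two re-ranks the survivors by pattern evidence, here simplified to co-occurring term pairs. The thesis derives its threshold from rough set decision theory and its patterns from a mined taxonomy; both are fixed by hand here purely to show the flow.

from itertools import combinations

# Hypothetical user-profile term weights and pattern weights.
profile_terms = {"filtering": 0.9, "profile": 0.7, "pattern": 0.8}
profile_patterns = {("pattern", "taxonomy"): 1.0, ("profile", "user"): 0.8}
THRESHOLD = 0.5  # stand-in for the rough-set decision threshold

def stage1_score(tokens):
    # Term-based topic score: sum of profile weights for terms present.
    return sum(w for t, w in profile_terms.items() if t in tokens)

def stage2_score(tokens):
    # Pattern score: reward co-occurring term pairs from the taxonomy.
    pairs = set(combinations(sorted(set(tokens)), 2))
    return sum(w for p, w in profile_patterns.items() if p in pairs)

def filter_stream(docs):
    # Stage 1: drop likely-irrelevant documents (reduces overload).
    survivors = [d for d in docs if stage1_score(d.split()) >= THRESHOLD]
    # Stage 2: rank the much smaller survivor set by pattern evidence.
    return sorted(survivors, key=lambda d: stage2_score(d.split()), reverse=True)

docs = ["user profile learning for information filtering",
        "pattern taxonomy model for text mining",
        "weather report for tomorrow"]
for doc in filter_stream(docs):
    print(f"{stage2_score(doc.split()):.2f}  {doc}")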

Relevance: 20.00%

Abstract:

Triage is a process critical to the effective management of modern emergency departments. Triage systems aim not only to ensure clinical justice for the patient, but also to provide an effective tool for departmental organisation, monitoring and evaluation. Over the last 20 years, triage systems have been standardised in a number of countries and efforts made to ensure consistency of application. However, the ongoing crowding of emergency departments resulting from access block and increased demand has led to calls for a review of systems of triage. In addition, international variance in triage systems limits the capacity for benchmarking. The aim of this paper is to provide a critical review of the literature on emergency department triage in order to inform the direction of future research. While education, guidelines and algorithms have been shown to reduce triage variation, significant inconsistency remains in triage assessment, arising from the diversity of factors determining the urgency of any individual patient. It is timely to accept this diversity, to identify what is agreed, and to consider what may be agreeable. It is time to develop and test an International Triage Scale (ITS) supported by an international collaborative approach towards a triage research agenda. This agenda would seek to further develop application and moderating tools and to use the scales for international benchmarking and research programmes.

Relevance: 20.00%

Abstract:

Furniture- and appliance-related injuries in children under 5 years of age account for an estimated 180 emergency presentations annually in Queensland. Injuries occur when children push or pull items over, climb and fall off furniture, or climb and tip the item over. Children under 2 years of age tend to injure themselves by pulling items over onto themselves. Children over 2 years of age are more likely to be injured after climbing the item and either falling off or tipping the item over onto themselves. Tip-over injuries (where the item falls over and injures the child) in children under 5 years of age account for an estimated 115 emergency presentations annually in Queensland. The item most commonly associated with a tip-over injury is a television (with or without its cabinet). Prevention requires better design and selection of furniture with inherent stability, coupled with mechanisms to install or fix less stable items.

Relevance: 20.00%

Abstract:

Nonlinearity, uncertainty and subjectivity are the three predominant characteristics of contractor prequalification, which make the process more of an art than a scientific evaluation. A fuzzy neural network (FNN) model, amalgamating fuzzy set and neural network theories, has been developed with the aim of improving the objectiveness of contractor prequalification. Through FNN theory, the fuzzy rules used by the prequalifiers can be identified and the corresponding membership functions transformed. Eighty-five cases with detailed decision criteria and rules for prequalifying Hong Kong civil engineering contractors were collected. These cases were used for training (calibrating) and testing the FNN model. The performance of the FNN model was compared with the original results produced by the prequalifiers and with those generated by a general feedforward neural network (GFNN, i.e. a crisp neural network). Contractors' ranking orders, model efficiency (R2) and mean absolute percentage error (MAPE) were examined during the testing phase. The results indicate the applicability of the neural network approach to contractor prequalification and the benefits of the FNN model over the GFNN model. The FNN is a practical approach for modelling contractor prequalification.
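For readers unfamiliar with the two comparison metrics named above, here is a minimal sketch of model efficiency (R2) and mean absolute percentage error (MAPE). The score vectors are purely illustrative, not the study's data.

def r_squared(actual, predicted):
    # Model efficiency: 1 - SS_res / SS_tot.
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def mape(actual, predicted):
    # Mean absolute percentage error, in percent.
    return 100 / len(actual) * sum(abs((a - p) / a) for a, p in zip(actual, predicted))

prequalifier_scores = [72.0, 65.5, 88.0, 54.0]  # hypothetical prequalifier ratings
fnn_scores          = [70.5, 67.0, 86.0, 56.5]  # hypothetical FNN outputs

print(f"R^2  = {r_squared(prequalifier_scores, fnn_scores):.3f}")
print(f"MAPE = {mape(prequalifier_scores, fnn_scores):.2f}%")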

Relevance: 20.00%

Abstract:

Purpose: Increased physical activity in colorectal cancer patients is related to improved recurrence-free and overall survival. Psychological distress after cancer may place patients at risk of reduced physical activity but, paradoxically, may also act as a motivator for positive lifestyle change. The relationship between psychological distress and physical activity after cancer over time has not been described.

Methods: A prospective survey of 1966 colorectal cancer survivors (57% response) assessed the psychological distress variables of anxiety, depression, somatisation and cancer threat appraisal as predictors of physical activity at 5, 12, 24 and 36 months post-diagnosis; 978 respondents had valid data for all time points.

Results: Higher somatisation was associated with greater physical inactivity (relative risk ratio (RRR) = 1.12; 95% CI = [1.1, 1.2]) and insufficient physical activity (RRR = 1.05; [0.90, 1.0]). Respondents with a more positive appraisal of their cancer were significantly (p = 0.031) less likely to be inactive (RRR = 0.95; [0.90, 1.0]) or insufficiently active (RRR = 0.96). Fatigued and obese respondents and current smokers were more inactive. Respondents whose somatisation increased between two time periods were less likely to increase their physical activity over the same period (p < 0.001). Respondents with higher anxiety at one time period were less likely to have increased their activity at the next assessment (p = 0.004). There was no association between depression and physical activity.

Conclusions: Cancer survivors who experience somatisation and anxiety are at greater risk of physical inactivity. The lack of a clear relationship between higher psychological distress and increased physical activity argues against distress as a motivator to exercise in these patients.

Relevance: 20.00%

Abstract:

One of the major challenges facing a present-day game development company is the removal of bugs from complex virtual environments. This work presents an approach for measuring the correctness of synthetic scenes generated by the rendering system of a 3D application, such as a computer game. Our approach builds a database of labelled point clouds representing the spatio-temporal colour distribution of the objects present in a sequence of bug-free frames. This is done by converting the positions that pixels take over time into the equivalent 3D points with associated colours. Once the space of labelled points is built, each new image produced from the same game by any rendering system can be analysed by measuring its visual inconsistency in terms of distance from the database. Objects within the scene can be relocated (manually or by the application engine); the algorithm is nonetheless able to perform the image analysis in terms of the 3D structure and colour distribution of samples on the surface of the object. We applied our framework to the publicly available game RacingGame developed for Microsoft® XNA®. Preliminary results show how this approach can be used to detect a variety of visual artifacts generated by the rendering system in a professional-quality game engine.
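A minimal sketch of the consistency measure described above: sampled position-plus-colour points from a new frame are compared against a database built from bug-free frames, using nearest-neighbour distance. The points, the joint position/colour metric and the idea of a tuned threshold are illustrative assumptions, not the paper's exact formulation.

import math

def dist(p, q):
    # Euclidean distance over position and colour jointly (illustrative choice).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def inconsistency(frame_points, database_points):
    # Mean distance from each new point to its nearest bug-free point.
    return sum(min(dist(p, q) for q in database_points)
               for p in frame_points) / len(frame_points)

# Hypothetical bug-free samples: (x, y, z, r, g, b), colours in [0, 1].
database = [(0.0, 0.0, 0.0, 0.8, 0.1, 0.1),
            (1.0, 0.0, 0.0, 0.8, 0.1, 0.1)]
# New frame: one point close to the database, one with a wrong colour.
new_frame = [(0.05, 0.0, 0.0, 0.8, 0.1, 0.1),
             (1.0, 0.0, 0.0, 0.1, 0.9, 0.1)]

score = inconsistency(new_frame, database)
print(f"visual inconsistency: {score:.3f}")  # flag the frame above a tuned threshold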

Relevance: 20.00%

Abstract:

Technological and societal change, along with organisational and market change (driven by contracting-out and privatisation), are "creating a new generation of infrastructures" [1]. While inter-organisational contractual arrangements can improve maintenance efficiency through consistent and repeatable patterns of action, unanticipated difficulties in implementation can reduce the performance of these arrangements. When faced with unsatisfactory performance of contracting-out arrangements, government organisations may choose to adapt and change these arrangements over time with the aim of improving performance. This paper enhances our understanding of 'next generation infrastructures' by examining adaptation of the organisational arrangements for the maintenance of these assets in a case study spanning 20 years.

Relevance: 20.00%

Abstract:

This paper presents preliminary results in establishing a strategy for predicting Zenith Tropospheric Delay (ZTD) and relative ZTD (rZTD) between Continuously Operating Reference Stations (CORS) in near real-time. It is anticipated that the predicted ZTD or rZTD can assist network-based Real-Time Kinematic (RTK) performance over long inter-station distances, ultimately enabling a cost-effective method of delivering precise positioning services to sparsely populated regional areas, such as Queensland. This research first investigates two ZTD solutions: 1) the post-processed IGS ZTD solution and 2) the near real-time ZTD solution. The near real-time solution is obtained through the GNSS processing software package (Bernese) deployed for this project. The predictability of the near real-time Bernese solution is analysed and compared against the post-processed IGS solution, which acts as the benchmark. The predictability analyses were conducted with prediction times of 15, 30, 45 and 60 minutes to determine the error as a function of timeliness. The predictability of ZTD and rZTD is characterised by using the previously estimated ZTD as the predicted ZTD for the current epoch. This research has shown that both the ZTD and rZTD prediction errors are random in nature; the standard deviation (STD) grows from a few millimetres to sub-centimetre level as the prediction interval ranges from 15 to 60 minutes. Additionally, the rZTD predictability shows very little dependency on the length of the tested baselines, up to 1000 kilometres. Finally, the comparison of the near real-time Bernese solution with the IGS solution has shown a slight degradation in prediction accuracy: the less accurate NRT solution has an STD error of 1 cm within a delay of 50 minutes, although some larger errors of up to 10 cm are observed.
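The prediction scheme characterised above is persistence: the previously estimated ZTD is reused as the prediction for the current epoch, and the error statistics are taken over the resulting differences. A minimal sketch, with a hypothetical ZTD series in metres:

import statistics

# Hypothetical epoch-by-epoch ZTD estimates, in metres.
ztd = [2.401, 2.403, 2.402, 2.406, 2.405, 2.409]

# Persistence prediction: reuse the previous estimate for the current epoch.
errors = [ztd[i] - ztd[i - 1] for i in range(1, len(ztd))]

print(f"mean error: {statistics.mean(errors) * 1000:+.2f} mm")
print(f"STD:        {statistics.stdev(errors) * 1000:.2f} mm")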

Relevance: 20.00%

Abstract:

Computer forensics is the process of gathering and analysing evidence from computer systems to aid in the investigation of a crime. Typically, such investigations are undertaken by human forensic examiners using purpose-built software to discover evidence from a computer disk. This process is a manual one, and the time it takes a forensic examiner to conduct an investigation is proportional to the storage capacity of the computer's disk drives. The heterogeneity and complexity of the data formats stored on modern computer systems compound the problems posed by the sheer volume of data. The decision to undertake a computer forensic examination of a computer system is a decision to commit significant quantities of a human examiner's time; where there is no prior knowledge of the information contained on the system, this commitment of time and energy occurs with little idea of the potential benefit to the investigation. The key contribution of this research is the design and development of an automated process to describe a computer system and its activity for the purposes of a computer forensic investigation; the term proposed for this process is computer profiling. A model of a computer system and its activity has been developed over the course of this research. Using this model, a computer system under investigation can be automatically described in terms useful to a forensic investigator. The computer profiling process is resilient to attempts to disguise malicious computer activity. This resilience is achieved by detecting inconsistencies in the information used to infer the apparent activity of the computer. The practicality of the computer profiling process has been demonstrated by a proof-of-concept software implementation. The model and the prototype implementation utilising the model were tested with data from real computer systems, and the resilience of the process to attempts to disguise malicious activity has been demonstrated in practical experiments with the same prototype.
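A minimal sketch of the kind of inconsistency detection described above: the same fact, here a file's last-modified time, is inferred from two independent sources, and disagreement between them is flagged as possible tampering. The source names, file names and timestamps are all hypothetical, not the thesis's model.

# Two hypothetical, independent sources of the same fact: a file's
# last-modified time from the file system and from an application log.
filesystem_times = {"report.doc": "2009-03-01T10:15", "notes.txt": "2009-03-02T09:00"}
log_times        = {"report.doc": "2009-03-05T14:30", "notes.txt": "2009-03-02T09:00"}

for name, fs_time in filesystem_times.items():
    log_time = log_times.get(name)
    if log_time is not None and log_time != fs_time:
        # Disagreement between sources is the inconsistency of interest.
        print(f"inconsistency: {name} (file system: {fs_time}, log: {log_time})")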

Relevance: 20.00%

Abstract:

This paper examines the development of student functional thinking during a teaching experiment conducted in two classrooms with a total of 45 children whose average age was nine years and six months. The teaching comprised four lessons taught by a researcher, with a second researcher and the classroom teacher acting as participant observers. The lessons were designed to enable students to build mental representations and explore the use of function tables, focusing on the relationship between input and output numbers with the intention of extracting the algebraic nature of the arithmetic involved. All lessons were videotaped. The results indicate that elementary students are not only capable of developing functional thinking but also of communicating their thinking both verbally and symbolically.
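As an illustration of the function tables mentioned above, a minimal sketch with a hypothetical rule; the abstract does not specify the rules actually used in the lessons.

# Hypothetical rule relating input to output: output = 2 * input + 1.
def rule(n):
    return 2 * n + 1

print("input | output")
for n in range(1, 6):
    print(f"{n:5d} | {rule(n):6d}")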

Relevance: 20.00%

Abstract:

Mirroring trends in other developed countries, levels of household debt in Australia have risen markedly in recent years. As one example, the total amount lent by banks to individuals rose from $175.5 billion in August 1995 to $590.5 billion in August 2005.[1] Consumer groups and media commentators here have long raised concerns about the risks of increasing household debt and over-commitment, linking these issues at least in part to irresponsible lending practices. More recently, the Reserve Bank Governor has also expressed concerns about the ability of some households to manage if personal or economic circumstances change.[2]