28 results for "real world mathematics"
Abstract:
The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models; index numbers; data envelopment analysis (DEA); and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, there have been a number of excellent advanced-level books published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the 2nd Edition maintains its uniqueness: (1) It is a well-written introduction to the field. (2) It outlines, discusses and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation. (3) It provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
Abstract:
This paper discusses a multi-layer feedforward (MLF) neural network incident detection model that was developed and evaluated using field data. In contrast to published neural network incident detection models which relied on simulated or limited field data for model development and testing, the model described in this paper was trained and tested on a real-world data set of 100 incidents. The model uses speed, flow and occupancy data measured at dual stations, averaged across all lanes and only from time interval t. The off-line performance of the model is reported under both incident and non-incident conditions. The incident detection performance of the model is reported based on a validation-test data set of 40 incidents that were independent of the 60 incidents used for training. The false alarm rates of the model are evaluated based on non-incident data that were collected from a freeway section which was video-taped for a period of 33 days. A comparative evaluation between the neural network model and the incident detection model in operation on Melbourne's freeways is also presented. The results of the comparative performance evaluation clearly demonstrate the substantial improvement in incident detection performance obtained by the neural network model. The paper also presents additional results that demonstrate how improvements in model performance can be achieved using variable decision thresholds. Finally, the model's fault-tolerance under conditions of corrupt or missing data is investigated and the impact of loop detector failure/malfunction on the performance of the trained model is evaluated and discussed. The results presented in this paper provide a comprehensive evaluation of the developed model and confirm that neural network models can provide fast and reliable incident detection on freeways. (C) 1997 Elsevier Science Ltd. All rights reserved.
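The variable-decision-threshold idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's MLF model: the scores, labels, and threshold values below are invented, and the detector simply flags any interval whose model output exceeds the threshold, trading detection rate against false alarm rate.

```python
# Sketch: how a variable decision threshold trades detection rate
# against false alarm rate for an interval-based incident detector.
# Scores and labels are hypothetical, not from the paper's data set.

def evaluate(scores, labels, threshold):
    """scores: model outputs in [0, 1]; labels: 1 = incident interval.
    Returns (detection_rate, false_alarm_rate)."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
    pos = sum(labels)
    neg = len(labels) - pos
    return tp / pos, fp / neg

# Toy outputs for eight time intervals, three of which contain incidents.
scores = [0.9, 0.8, 0.4, 0.7, 0.2, 0.3, 0.6, 0.1]
labels = [1,   1,   1,   0,   0,   0,   0,   0]

low = evaluate(scores, labels, 0.3)    # permissive: catches everything, more false alarms
high = evaluate(scores, labels, 0.75)  # conservative: fewer false alarms, missed incidents
```

Raising the threshold moves the operating point along exactly the trade-off curve the paper exploits: here the permissive threshold detects all incidents at a 60% false alarm rate, while the conservative one eliminates false alarms at the cost of one missed incident.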
Long-term clozapine treatment identifies significant improvements in clinical and functioning scales
Abstract:
The majority of clinical drug trials only cover a small number of variables over a short period of time on a small group of people. The objective of this study was to track a large group of people over a long period of time, using a diverse range of variables with a naturalistic design to assess the 'real world' use of clozapine. Fifty-three people with treatment-resistant schizophrenia were recruited into a 2-year study which assessed the subjects using the following scales: Positive and Negative Syndrome Scale (PANSS), Clinical Global Impression Scale (CGI), Life Skills Profile (LSP), and Role Functioning Scale (RFS). Discharge, leave, and ward movement rates were also monitored. All subjects were inpatients at a tertiary psychiatric facility. Thirty-three percent of the group was discharged. Seventy-three percent moved to less cost-intensive wards, and the leave rate increased by 105%. Sixty-seven percent of the study group were identified as responders by the 24-month time point. Twenty-four percent of the group had their CGI scores reduced to 2 or better (p = 0.0001). Significant improvements were identified in the RFS (p = 0.02) and LSP (p = 0.0001). Long-term clozapine treatment has identified a significant group of responders on a variety of measures.
Abstract:
We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
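The suboptimality measure described above can be made concrete with a toy example. The sketch below is hypothetical: it uses a simple greedy heuristic (pick the site covering the most unrepresented land types at each step) as a stand-in for the heuristic algorithms, and the site data are invented; the study's actual algorithms and data sets are not reproduced.

```python
# Sketch: a greedy reserve-selection heuristic over hypothetical sites,
# plus the suboptimality measure (percentage increase over the optimal
# result) used to gauge heuristic reliability.

def greedy_reserve(sites):
    """sites: dict of site id -> set of land types present.
    Greedily selects sites until every land type is represented."""
    uncovered = set().union(*sites.values())
    selected = []
    while uncovered:
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        selected.append(best)
        uncovered -= sites[best]
    return selected

def suboptimality(heuristic_cost, optimal_cost):
    """Percentage increase of the heuristic result over the optimum."""
    return 100.0 * (heuristic_cost - optimal_cost) / optimal_cost

# Hypothetical data: four candidate sites covering five land types A-E.
sites = {
    1: {"A", "B"},
    2: {"C", "D"},
    3: {"B", "C", "E"},
    4: {"A", "D", "E"},
}
picked = greedy_reserve(sites)  # two sites suffice here
```

Counting selected sites (or summing their areas) gives the cost that the paper compares across algorithms; a heuristic needing 3 sites where the optimum is 2 would score a suboptimality of 50%.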
Abstract:
The current study was designed to confirm that female drivers sit closer to the steering wheel than do male drivers and to investigate whether this expected difference in sitting position is attributable to differences in the physical dimensions of men and women. Driver body dimensions and multiple measures of sitting distance from the steering wheel were collected from a sample of 150 men and 150 women. The results confirmed that on average, women sit closer to the steering wheel than men do and that this difference is accounted for by variations in body dimensions, especially height. This result suggests that driver height may provide a good surrogate for sitting distance from the steering wheel when investigating the role of driver position in real-world crash outcomes. The potential applications of this research include change to vehicle design that allows independent adjustment of the relative distance among the driver's seat, the steering wheel, and the floor pedals.
Abstract:
We develop a test of evolutionary change that incorporates a null hypothesis of homogeneity, which encompasses time invariance in the variance and autocovariance structure of residuals from estimated econometric relationships. The test framework is based on examining whether shifts in spectral decomposition between two frames of data are significant. Rejection of the null hypothesis will point not only to weak nonstationarity but to shifts in the structure of the second-order moments of the limiting distribution of the random process. This would indicate that the second-order properties of any underlying attractor set have changed in a statistically significant way, pointing to the presence of evolutionary change. A demonstration of the test's applicability to a real-world macroeconomic problem is accomplished by applying the test to the Australian Building Society Deposits (ABSD) model.
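The core idea of comparing spectral decompositions across two frames of data can be sketched numerically. This is a deliberately simplified stand-in, not the paper's test: the statistic below is just the largest difference between the two frames' periodograms, with no critical values or significance machinery, and the data are simulated.

```python
# Sketch: detect a shift in second-order (spectral) structure between
# two frames of residuals by comparing their periodograms. A simplified
# illustration only; the paper's actual test statistic is not reproduced.
import numpy as np

def periodogram(x):
    """Periodogram of a demeaned series: |FFT|^2 / n at each frequency."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.abs(np.fft.rfft(x)) ** 2 / len(x)

def spectral_shift(frame_a, frame_b):
    """Largest absolute periodogram difference between two equal-length frames."""
    return float(np.max(np.abs(periodogram(frame_a) - periodogram(frame_b))))

rng = np.random.default_rng(0)
stable = rng.standard_normal(256)
inflated = 3.0 * rng.standard_normal(256)  # same process, variance tripled in SD

same = spectral_shift(stable, rng.standard_normal(256))  # homogeneous frames
changed = spectral_shift(stable, inflated)               # second moments shifted
```

Under homogeneity the two periodograms agree up to sampling noise; a change in the variance-autocovariance structure inflates the spectrum and the statistic, which is the kind of shift the test is built to flag.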
Abstract:
An important feature of some conceptual modelling grammars is the constructs they provide to allow database designers to show that real-world things may or may not possess a particular attribute or relationship. In the entity-relationship model, for example, the fact that a thing may not possess an attribute can be represented by using a special symbol to indicate that the attribute is optional. Similarly, the fact that a thing may or may not be involved in a relationship can be represented by showing the minimum cardinality of the relationship as zero. Whether these practices should be followed, however, is a contentious issue. An alternative approach is to eliminate optional attributes and relationships from conceptual schema diagrams by using subtypes that have only mandatory attributes and relationships. In this paper, we first present a theory that led us to predict that optional attributes and relationships should be used in conceptual schema diagrams only when users of the diagrams require a surface-level understanding of the domain being represented by the diagrams. When users require a deep-level understanding, however, optional attributes and relationships should not be used because they undermine users' abilities to grasp important domain semantics. We describe three experiments which we then undertook to test our predictions. The results of the experiments support our predictions.
Abstract:
A randomized controlled trial was carried out to measure the cost-effectiveness of realtime teledermatology compared with conventional outpatient dermatology care for patients from urban and rural areas. One urban and one rural health centre were linked to a regional hospital in Northern Ireland by ISDN at 128 kbit/s. Over two years, 274 patients required a hospital outpatient dermatology referral: 126 patients (46%) were randomized to a telemedicine consultation and 148 (54%) to a conventional hospital outpatient consultation. Of those seen by telemedicine, 61% were registered with an urban practice, compared with 71% of those seen conventionally. The clinical outcomes of the two types of consultation were similar - almost half the patients were managed after a single consultation with the dermatologist. The observed marginal cost per patient of the initial realtime teledermatology consultation was £52.85 for those in urban areas and £59.93 per patient for those from rural areas. The observed marginal cost of the initial conventional consultation was £47.13 for urban patients and £48.77 for rural patients. The total observed costs of teledermatology were higher than the costs of conventional care in both urban and rural areas, mainly because of the fixed equipment costs. Sensitivity analysis using a real-world scenario showed that in urban areas the average costs of the telemedicine and conventional consultations were about equal, while in rural areas the average cost of the telemedicine consultation was less than that of the conventional consultation.
Abstract:
Read-only-memory-based (ROM-based) quantum computation (QC) is an alternative to oracle-based QC. It has the advantages of being less magical, and being more suited to implementing space-efficient computation (i.e., computation using the minimum number of writable qubits). Here we consider a number of small (one- and two-qubit) quantum algorithms illustrating different aspects of ROM-based QC. They are: (a) a one-qubit algorithm to solve the Deutsch problem; (b) a one-qubit binary multiplication algorithm; (c) a two-qubit controlled binary multiplication algorithm; and (d) a two-qubit ROM-based version of the Deutsch-Jozsa algorithm. For each algorithm we present experimental verification using nuclear magnetic resonance ensemble QC. The average fidelities for the implementation were in the ranges 0.9-0.97 for the one-qubit algorithms, and 0.84-0.94 for the two-qubit algorithms. We conclude with a discussion of future prospects for ROM-based quantum computation. We propose a four-qubit algorithm, using Grover's iterate, for solving a miniature real-world problem relating to the lengths of paths in a network.
Abstract:
Measures of eye activity, such as blink rate and scanning patterns, have been used extensively as psychophysiological indices of mental workload. In a review of measures derived from spontaneous eye activity it is shown that different measures are differentially sensitive to specific aspects of mental workload. A less well-known measure of non-spontaneous eye activity, the blink reflex, is also reviewed. Experiments using discrete punctate stimuli and continuous tasks analogous to real-world systems show that blink reflexes are modulated by attention and that this modulation reflects modality-specific attentional engagement. Future research should examine the utility of the blink reflex according to the desirable properties of sensitivity, diagnosticity, validity, reliability, ease of use, unobtrusiveness, and operator acceptance.
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I describe, also, how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices - predictions that subsequently were supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceived to be associated with our using ontological theories to underpin research on conceptual modelling.
Abstract:
Like many states and territories, South Australia has a legacy of marine reserves considered to be inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected randomly or less in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
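The comparison of an observed selection frequency against a predicted chance-level distribution can be sketched as follows. This is a hypothetical illustration, not the paper's method: it assumes, for the sake of the example, that under the null each run selects a planning unit independently with a known baseline probability, so the null selection count is binomial; the run count, baseline probability, and observed count below are invented.

```python
# Sketch: is a planning unit selected more often across reserve-selection
# runs than chance alone predicts? Assumes a binomial null with a known
# baseline selection probability (an illustrative assumption).
from math import comb

def binomial_tail(n, p, k):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing a selection
    frequency at least as extreme as the one observed."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_runs = 100        # hypothetical number of MARXAN runs
baseline_p = 0.20   # assumed chance-level selection probability per run
observed = 35       # unit selected in 35 of the 100 runs

p_value = binomial_tail(n_runs, baseline_p, observed)
```

A small tail probability marks the unit as selected more than expected by chance, i.e., a valuable contributor to efficient reserve systems; units whose observed frequency sits at or below the bulk of the null distribution are the 'selected randomly or less' sites discussed above.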
Abstract:
The potential applications of macrocycles in chemistry and at its interfaces with biology and physics continue to emerge, one of which is as receptors for small molecules and ions. This review illustrates these applications with examples from the last ten years employing complexation as the binding mechanism; some of the systems presented have already found real-world sensor applications. Nevertheless, challenges remain in designing more selective and sensitive receptors for guests.