971 results for Binary Coded Decimal


Relevance:

10.00%

Publisher:

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
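
As a minimal illustration of two of the core Part I topics mentioned above, discrete convolution and the Discrete Fourier Transform, the following Python sketch filters a noisy two-tone signal and reads off its dominant frequencies. It uses NumPy and is not taken from the book or its lecture notes; the signal, sampling rate and filter length are arbitrary choices for illustration.

```python
import numpy as np

# A short test signal: two sinusoids (50 Hz and 120 Hz) plus white noise.
fs = 1000                      # sampling frequency (Hz), an arbitrary choice
t = np.arange(0, 1, 1 / fs)    # one second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
x += 0.2 * np.random.randn(len(t))

# Discrete convolution with a 5-point moving average, a very simple FIR low-pass filter.
h = np.ones(5) / 5
y = np.convolve(x, h, mode="same")

# Discrete Fourier Transform of the real signal: the spectrum peaks near 50 Hz and 120 Hz.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
peak_freqs = freqs[np.argsort(np.abs(X))[-2:]]
print("dominant frequencies (Hz):", sorted(peak_freqs.round()))
```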

Relevance:

10.00%

Publisher:

Abstract:

Aim. This paper elucidates the nature of metaphor and the conditions necessary for its use as an analytic device in qualitative research, and describes how the use of metaphor assisted in the analytic processes of a grounded theory study of nephrology nursing expertise. Background. The use of metaphor is pervasive in everyday thought, language and action. It is an important means for the comprehension and management of everyday life, and makes challenging or problematic concepts easier to explain. Metaphors are also pervasive in quantitative and qualitative research for the same reason. In both everyday life and in research, their use may be implicit or explicit. Methods. The study, using grounded theory methodology, took place in one renal unit in New South Wales, Australia, between 1999 and 2000 and included six non-expert and 11 expert nurses. It involved simultaneous data collection and analysis using participant observation, semi-structured interviews and review of nursing documentation. Findings. A three-stage skills-acquisitive process was identified in which an orchestral metaphor was used to explain the relationships between stages and to satisfactorily capture the data coded within each stage. Conclusion. Metaphors create images, clarify and add depth to meanings and, if used appropriately and explicitly in qualitative research, can capture data at highly conceptual levels. Metaphors also assist in explaining the relationship between findings in a clear and coherent manner. © 2005 Blackwell Publishing Ltd.

Relevance:

10.00%

Publisher:

Abstract:

This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification, and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
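
To make the role of the VC dimension in generalization concrete, the short sketch below evaluates one standard VC-type bound on the gap between true and empirical error. The specific constants vary between texts, and this particular form is an illustrative assumption rather than a bound quoted from Anthony and Bartlett's book.

```python
import math

def vc_generalization_gap(n, d, delta=0.05):
    """A common VC-type bound on |true error - empirical error|.

    n: number of training samples, d: VC dimension of the hypothesis class,
    delta: failure probability. The constants follow one textbook form and
    are illustrative only; they are not taken from the book described above.
    """
    return math.sqrt((d * (math.log(2 * n / d) + 1) + math.log(4 / delta)) / n)

# The gap shrinks roughly like sqrt(d / n): more data or a simpler class helps.
for n in (1_000, 10_000, 100_000):
    print(f"n={n:>6}  gap bound ~ {vc_generalization_gap(n, d=10):.3f}")
```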

Relevance:

10.00%

Publisher:

Abstract:

Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that not all generalizations preserve the nice property of Bayes consistency. We provide a necessary and sufficient condition for consistency which applies to a large class of multiclass classification methods. The approach is illustrated by applying it to some multiclass methods proposed in the literature.

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube—the one-inclusion graph. The first main result of this paper is a density bound of n∙choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
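
The density bound quoted above can be checked numerically. In the sketch below (not taken from the paper), choose(m, ≤k) denotes the sum of binomial coefficients C(m, 0) + ... + C(m, k); the printed values stay strictly below d, as the result asserts.

```python
from math import comb

def binom_leq(m, k):
    """choose(m, <= k): the sum C(m, 0) + C(m, 1) + ... + C(m, k)."""
    return sum(comb(m, i) for i in range(k + 1))

def one_inclusion_density(n, d):
    """n * choose(n-1, <= d-1) / choose(n, <= d), shown in the paper to be < d."""
    return n * binom_leq(n - 1, d - 1) / binom_leq(n, d)

# Numerical spot checks: each printed value is strictly below the corresponding d.
for n, d in [(10, 3), (50, 5), (200, 8)]:
    print(f"n={n}, d={d}: density bound = {one_inclusion_density(n, d):.4f}")
```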

Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of binary classification where the classifier can, for a particular cost, choose not to classify an observation. Just as in the conventional classification problem, minimization of the sample average of the cost is a difficult optimization problem. As an alternative, we propose the optimization of a certain convex loss function φ, analogous to the hinge loss used in support vector machines (SVMs). Its convexity ensures that the sample average of this surrogate loss can be efficiently minimized. We study its statistical properties. We show that minimizing the expected surrogate loss—the φ-risk—also minimizes the risk. We also study the rate at which the φ-risk approaches its minimum value. We show that fast rates are possible when the conditional probability P(Y=1|X) is unlikely to be close to certain critical values.
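
For intuition about the target of such surrogate losses, the Bayes-optimal decision rule for classification with a reject option is the classical Chow rule: abstain whenever both labels are riskier than paying the rejection cost. The sketch below implements that plug-in rule on given conditional probability estimates; it is not the convex surrogate φ analysed in the abstract, only the rule whose risk the φ-risk minimizer targets.

```python
import numpy as np

def chow_rule(eta, reject_cost):
    """Bayes-optimal prediction with a reject option (Chow's rule).

    eta: estimates of P(Y = 1 | X) for each observation.
    reject_cost: cost d of abstaining, with misclassification cost 1 and d < 1/2.
    Returns +1 or -1 predictions, with 0 meaning "reject".
    """
    eta = np.asarray(eta, dtype=float)
    predict = np.where(eta >= 0.5, 1, -1)
    # Abstain when even the better label is too risky: min(eta, 1 - eta) > d.
    return np.where(np.minimum(eta, 1 - eta) > reject_cost, 0, predict)

# Confident points (0.05, 0.95) get labels; ambiguous ones (0.45, 0.55) are rejected.
print(chow_rule([0.05, 0.45, 0.55, 0.95], reject_cost=0.2))   # -> [-1  0  0  1]
```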

Relevance:

10.00%

Publisher:

Abstract:

Binary classification is a well studied special case of the classification problem. Statistical properties of binary classifiers, such as consistency, have been investigated in a variety of settings. Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that one can lose consistency in generalizing a binary classification method to deal with multiple classes. We study a rich family of multiclass methods and provide a necessary and sufficient condition for their consistency. We illustrate our approach by applying it to some multiclass methods proposed in the literature.
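
One common way to generalize a binary method to multiple classes is the one-vs-rest reduction sketched below: train one binary classifier per class and predict by the largest score. Whether such a construction remains Bayes consistent depends on the underlying surrogate loss, which is exactly what the condition studied here characterises. The example uses scikit-learn on synthetic data and is purely illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

# A synthetic 3-class problem.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One-vs-rest: one hinge-loss binary classifier per class; predict the argmax score.
ovr = OneVsRestClassifier(LinearSVC(max_iter=10000)).fit(X_tr, y_tr)
print("one-vs-rest accuracy:", round(ovr.score(X_te, y_te), 3))
```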

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d=VC(F) bound on the graph density of a subgraph of the hypercube—the one-inclusion graph. The first main result of this report is a density bound of n∙choose(n-1, ≤d-1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d=1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.

Relevance:

10.00%

Publisher:

Abstract:

Reducing CO2 emissions and tackling social exclusion are two key elements of UK transport strategy. Despite intensive research on each theme, little effort has so far been made to link emissions and social exclusion, and current knowledge on each theme is largely limited to urban areas; little research is available on these themes for rural areas. This research addresses this gap in the literature by analysing 157 weekly activity-travel diaries collected from three case study areas, with differing levels of area accessibility and area mobility options, located in rural Northern Ireland. Individual weekly CO2 emission levels from personal travel diaries (both hot exhaust emissions and cold-start emissions) were calculated using average speed models for the different modes of transport. The socio-spatial patterns associated with CO2 emissions were identified using a general linear model, whereas binary logistic regression analyses were conducted to identify mode choice behaviour and activity patterns. This research found that the groups emitting significantly lower levels of CO2 included individuals living in an area with a higher level of accessibility and mobility, and non-car, non-working, and low-income older people. However, the evidence also shows that although certain groups (e.g. those working, and those residing in an area with a lower level of accessibility) emitted higher levels of CO2, their rate of participation in activities was found to be significantly lower compared with their counterparts. Based on these findings, this research highlights the need for both soft (e.g. teleworking) and physical (e.g. accessibility planning) policy measures in rural areas in order to meet the government's stated CO2 reduction targets while at the same time enhancing social inclusion.
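
A minimal sketch of the kind of calculation described above is given below: per-trip hot exhaust CO2 from an average-speed emission curve, summed over a week of diary entries. The emission-factor function and its coefficients are hypothetical placeholders, not the models or values used in the study, and cold-start emissions are omitted for brevity.

```python
def hot_exhaust_ef(speed_kmh, a=3000.0, b=80.0, c=0.5):
    """Hypothetical average-speed emission factor in g CO2 per km.

    Real average-speed models publish mode- and vehicle-specific coefficients;
    a, b and c here are illustrative placeholders only.
    """
    return a / speed_kmh + b + c * speed_kmh

def weekly_car_co2_kg(diary):
    """Sum hot exhaust emissions over a week of (mode, distance_km, avg_speed_kmh) trips.

    Only car trips are counted in this sketch; walking and cycling are zero-emission,
    and bus or other modes would need their own emission curves.
    """
    grams = sum(dist * hot_exhaust_ef(speed)
                for mode, dist, speed in diary if mode == "car")
    return grams / 1000.0

diary = [("car", 12, 45), ("car", 3, 25), ("walk", 1.5, 5), ("bus", 8, 30)]
print(round(weekly_car_co2_kg(diary), 2), "kg CO2 from car trips this week")
```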

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a framework for evaluating information retrieval of medical records. We use the BLULab corpus, a large collection of real-world de-identified medical records. The collection has been hand-coded by clinical terminologists using the ICD-9 medical classification system. The ICD codes are used to devise queries and relevance judgements for this collection. Results of initial test runs using a baseline IR system are provided. The queries and relevance judgements are available online to aid further research in medical IR. Please visit: http://koopman.id.au/med_eval.
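
Given queries and relevance judgements of the kind described, standard rank-based measures can be computed directly from a system's ranked output. The sketch below computes precision at 10 for a single query; the query identifier, document identifiers and judgements are hypothetical and are not drawn from the BLULab collection.

```python
def precision_at_k(ranked_doc_ids, relevant_ids, k=10):
    """Fraction of the top-k retrieved documents that are judged relevant."""
    top_k = ranked_doc_ids[:k]
    return sum(1 for doc_id in top_k if doc_id in relevant_ids) / k

# Hypothetical relevance judgements (query -> set of relevant record ids) and a run.
qrels = {"q-icd9-428": {"rec12", "rec40", "rec77"}}
run = {"q-icd9-428": ["rec12", "rec09", "rec77", "rec31", "rec40",
                      "rec55", "rec02", "rec18", "rec63", "rec91"]}

for query, ranking in run.items():
    print(query, "P@10 =", precision_at_k(ranking, qrels[query]))   # 0.3 here
```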

Relevance:

10.00%

Publisher:

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic textures described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm, which was proposed in the compressed sensing literature. The reconstruction errors are then computed and, based on these, we detect the abnormal items. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it is shown to outperform current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
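
The sketch below illustrates the reconstruction-error idea on stand-in feature vectors: learn an overcomplete dictionary from normal data, sparse-code a new observation, and flag it when its reconstruction error is large. It uses scikit-learn's dictionary learning and a Lasso solver in place of the Dantzig Selector, and random vectors in place of LBP-TOP descriptors, so it is a simplified stand-in for the method described, not a reimplementation of it.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Stand-ins for LBP-TOP descriptors of normal training patches (feature dimension 64).
normal_features = rng.normal(size=(300, 64))

# Learn an overcomplete dictionary (128 atoms > 64 dimensions) from normal data only.
dico = DictionaryLearning(n_components=128, alpha=1.0, max_iter=20,
                          random_state=0).fit(normal_features)
D = dico.components_                      # shape (128, 64)

def reconstruction_error(x, D, alpha=0.1):
    """Sparse-code x over the dictionary (Lasso here; the paper uses the
    Dantzig Selector) and return the squared reconstruction error."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000).fit(D.T, x)
    return float(np.sum((x - D.T @ lasso.coef_) ** 2))

# A large error relative to a threshold calibrated on normal data flags an anomaly.
x_new = rng.normal(size=64)
print("reconstruction error:", round(reconstruction_error(x_new, D), 3))
```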

Relevance:

10.00%

Publisher:

Abstract:

High levels of sitting have been linked with poor health outcomes. Previously, a pragmatic MTI accelerometer cut-point (100 counts/min) has been used to estimate sitting, but data on the accuracy of this cut-point are unavailable. PURPOSE: To ascertain whether the 100 counts/min cut-point accurately isolates sitting from standing activities. METHODS: Participants fitted with an MTI accelerometer were observed performing a range of sitting, standing, light and moderate activities. 1-min epoch MTI data were matched to the observed activities, then re-categorised as either sitting or not using the 100 counts/min cut-point. Self-reported demographics and current physical activity were collected. Generalized estimating equation (GEE) analyses for repeated measures with a binary logistic model, corrected for age, gender and BMI, were conducted to ascertain the odds of the MTI data being misclassified. RESULTS: Data were from 26 healthy subjects (8 men; 50% aged <25 years; mean (SD) BMI 22.7 (3.8) kg/m2). The mode of both the MTI sitting and standing data was 0 counts/min, with 46% of sitting activities and 21% of standing activities recording 0 counts/min. The GEE was unable to accurately isolate sitting from standing activities using the 100 counts/min cut-point, since all sitting activities were incorrectly predicted as standing (p=0.05). To further explore the sensitivity of MTI data to delineate sitting from standing, the upper limit of the 95% confidence interval of the mean for the sitting activities (46 counts/min) was used to re-categorise the data; this resulted in the GEE correctly classifying 49% of sitting and 69% of standing activities. Using the 100 counts/min cut-point, the data were re-categorised into a combined ‘sit/stand’ category and tested against other light activities: 88% of sit/stand and 87% of light activities were accurately predicted. Using Freedson’s moderate cut-point of 1952 counts/min, the GEE accurately predicted 97% of light vs. 90% of moderate activities. CONCLUSION: The distributions of MTI-recorded sitting and standing data overlap considerably; as such, the 100 counts/min cut-point did not accurately isolate sitting from other static standing activities. The 100 counts/min cut-point more accurately predicted sit/stand vs. other movement-orientated activities.
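
The re-categorisation step itself is a simple threshold on epoch counts. The sketch below applies a counts-per-minute cut-point to synthetic epoch data and reports how often sitting and standing epochs are separated; the count distributions are invented for illustration and are not the MTI data from this study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-min epoch counts: sitting and standing both cluster near zero,
# which is the overlap that makes a single cut-point struggle in practice.
sitting_counts = rng.poisson(lam=20, size=200)
standing_counts = rng.poisson(lam=60, size=200)

def classified_as_sitting(counts, cut_point=100):
    """Label an epoch as sitting when its counts fall below the cut-point."""
    return np.asarray(counts) < cut_point

sensitivity = classified_as_sitting(sitting_counts).mean()       # sitting kept as sitting
specificity = (~classified_as_sitting(standing_counts)).mean()   # standing excluded
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```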

Relevance:

10.00%

Publisher:

Abstract:

Modelling events in densely crowded environments remains challenging, due to the diversity of events and the noise in the scene. We propose a novel approach for anomalous event detection in crowded scenes using dynamic textures described by the Local Binary Patterns from Three Orthogonal Planes (LBP-TOP) descriptor. The scene is divided into spatio-temporal patches from which LBP-TOP based dynamic textures are extracted. We apply hierarchical Bayesian models to detect the patches containing unusual events. Our method is an unsupervised approach, and it does not rely on object tracking or background subtraction. We show that our approach outperforms existing state-of-the-art algorithms for anomalous event detection on the UCSD dataset.
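
For reference, the basic single-plane local binary pattern underlying the LBP-TOP descriptor is easy to state in code: each neighbour of a pixel contributes one bit according to whether it is at least as bright as the centre. The sketch below computes the 8-neighbour code for one 3x3 patch; LBP-TOP repeats this on the XY, XT and YT planes of a spatio-temporal volume and concatenates the three histograms. The neighbour ordering and bit order shown are one common convention, not necessarily the exact one used in this work.

```python
import numpy as np

def lbp_code(patch3x3):
    """8-neighbour local binary pattern code for the centre of a 3x3 patch.

    Each neighbour contributes one bit: 1 if it is >= the centre pixel value.
    The result is an integer in 0..255, usually accumulated into a histogram.
    """
    p = np.asarray(patch3x3)
    centre = p[1, 1]
    # Neighbours in a fixed clockwise order starting from the top-left corner.
    neighbours = [p[0, 0], p[0, 1], p[0, 2], p[1, 2],
                  p[2, 2], p[2, 1], p[2, 0], p[1, 0]]
    bits = [int(v >= centre) for v in neighbours]
    return sum(bit << i for i, bit in enumerate(bits))

print(lbp_code([[10, 20, 30],
                [40, 50, 60],
                [70, 80, 90]]))   # -> 120 with this ordering convention
```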

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To examine the relationship between contact lens (CL) case contamination and various potential predictive factors. METHODS: Seventy-four subjects were fitted with lotrafilcon B (CIBA Vision) CLs on a daily wear basis for 1 month. Subjects were randomly assigned one of two polyhexamethylene biguanide (PHMB) preserved disinfecting solutions with the corresponding regular lens case. Clinical evaluations were conducted at lens delivery and after 1 month, when cases were collected for microbial culture. A CL care non-compliance score was determined through administration of a questionnaire, and the volume of solution used was calculated for each subject. Data were examined using backward stepwise binary logistic regression. RESULTS: 68% of cases were contaminated; 35% were moderately or heavily contaminated and 36% contained gram-negative bacteria. Case contamination was significantly associated with subjective dryness symptoms (OR 4.22, CI 1.37–13.01) (P<0.05). There was no association between contamination and subject age, ethnicity, gender, average wearing time, amount of solution used, non-compliance score, CL power or subjective redness (P>0.05). The effect of lens care system on case contamination approached significance (P=0.07). Failure to rinse the case with disinfecting solution following CL insertion (OR 2.51, CI 0.52–12.09) and not air-drying the case (OR 2.31, CI 0.39–13.35) were positively associated with contamination, but did not reach statistical significance. CONCLUSIONS: Our results suggest that case contamination may influence subjective comfort. It is difficult to predict the development of case contamination from a variety of clinical factors; the efficacy of CL solutions, bacterial resistance to disinfection and biofilm formation are likely to play a role. Further evaluation of these factors will improve our understanding of the development of case contamination and its clinical impact.
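
As an illustration of the kind of analysis reported (a binary logistic regression with odds ratios and 95% confidence intervals), the sketch below fits such a model with statsmodels on synthetic data. The predictors loosely mirror those named in the abstract, but the data, coefficients and resulting odds ratios are invented and carry no clinical meaning.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200

# Synthetic predictors loosely mirroring those in the study (all values invented).
df = pd.DataFrame({
    "dryness": rng.integers(0, 2, n),       # subjective dryness symptoms (0/1)
    "rinse_case": rng.integers(0, 2, n),    # rinses case after lens insertion (0/1)
    "wearing_time": rng.normal(12, 2, n),   # average wearing time (hours/day)
})
# Synthetic outcome: contamination made more likely by dryness, less by rinsing.
logit_p = -0.5 + 1.2 * df["dryness"] - 0.3 * df["rinse_case"]
df["contaminated"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

exog = sm.add_constant(df[["dryness", "rinse_case", "wearing_time"]])
model = sm.Logit(df["contaminated"], exog).fit(disp=0)

# Odds ratios with 95% confidence intervals, the form of result quoted above.
odds = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
odds.columns = ["OR", "2.5%", "97.5%"]
print(odds.round(2))
```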

Relevance:

10.00%

Publisher:

Abstract:

Evasive change-of-direction manoeuvres (agility skills) are a fundamental ability in rugby union. In this study, we explored the attributes of agility skill execution as they relate to effective attacking strategies in rugby union. Seven Super 14 games were coded using variables that assessed team patterns and individual movement characteristics during attacking ball carries. The results indicated that tackle-breaks are a key determinant of try-scoring ability and team success in rugby union. The ability of the attacking ball carrier to receive the ball at high speed, with at least two body lengths of space from the defence line, against an isolated defender promoted tackle-breaks. Furthermore, the execution of a side-step evasive manoeuvre at a change-of-direction angle of 20–60° and a distance of one to two body lengths from the defence, and then straightening the running line following the initial direction change at an angle of 20–60°, was associated with tackle-breaks. This study provides critical insight into the attributes of agility skill execution that are associated with effective ball carries in rugby union.