930 results for Binary Coded Decimal


Relevance:

10.00%

Publisher:

Abstract:

Intelligent agents are an advanced technology utilized in Web Intelligence. When searching for information in a distributed Web environment, information is retrieved by multiple agents on the client side and fused on the broker side. Current information fusion techniques rely on the cooperation of agents to provide statistics. Such techniques are computationally expensive and unrealistic in the real world. In this paper, we introduce a model that uses a world ontology constructed from the Dewey Decimal Classification to acquire user profiles. By searching with specific and exhaustive user profiles, information fusion techniques no longer rely on the statistics provided by agents. The model has been successfully evaluated using the large INEX data set, which simulates the distributed Web environment.
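
As a rough illustration of the profile idea, here is a minimal Python sketch: a user profile represented as normalised weights over Dewey Decimal Classification subject codes drawn from documents a user has viewed. The subject codes, the helper name build_profile, and the weighting scheme are hypothetical, not the paper's actual ontology construction.

from collections import Counter

# Sketch: a user profile as normalised weights over Dewey Decimal
# Classification (DDC) subject codes taken from viewed documents.
# The codes below are illustrative placeholders.
def build_profile(viewed_doc_subjects):
    counts = Counter(viewed_doc_subjects)
    total = sum(counts.values())
    return {subject: n / total for subject, n in counts.items()}

profile = build_profile(["006.3", "006.3", "025.04", "004.6"])
print(profile)  # {'006.3': 0.5, '025.04': 0.25, '004.6': 0.25}

A broker holding such a profile can rank and merge results against it directly, which is how searching with a rich profile removes the need for per-agent statistics.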

Relevance:

10.00%

Publisher:

Abstract:

International film festivals play a vital role in shaping filmmakers' careers. This paper presents some initial findings from a current major research project, highlighting the significance of particular festivals' programming of emerging female directors from developing nations. Some filmmakers showcased at festivals actively privilege the voices of women in their films as a means of commenting on pressing cultural and political issues. Ironically, other filmmakers do not subscribe to the label of "feminist" or "woman filmmaker", even if their films represent a strongly coded woman's point of view. Tensions also inevitably arise when scrutinising women filmmakers from developing nations within a first-world film festival context: the expectations of the researcher, the festival, film critics and audiences must be negotiated against the original intentions of the filmmaker. This paper explores the significance of women filmmakers in attendance at the Brisbane International Film Festival (2009) and the International Film Festival Rotterdam (2010).

Relevance:

10.00%

Publisher:

Abstract:

This article presents the results of a study on the association between measured air pollutants and the respiratory health of resident women and children in Lao PDR, one of the least developed countries in Southeast Asia. The study, commissioned by the World Health Organisation, included PM10, CO and NO2 measurements made inside 181 dwellings in nine districts within two provinces in Lao PDR over a 5-month period (12/05–04/06), and respiratory health information (via questionnaires and peak expiratory flow rate (PEFR) measurements) for all residents of the same dwellings. Adjusted odds ratios were calculated separately for each health outcome using binary logistic regression. There was a strong and consistent positive association between both NO2 and CO and almost all questionnaire-based health outcomes for both women and children. Women in dwellings with higher measured NO2 had more than triple the odds of almost all of the health outcomes, and higher concentrations of NO2 and CO were significantly associated with lower PEFR. This study supports a growing literature confirming the role of indoor air pollution in the burden of respiratory disease in developing countries. The results will directly support changes in health and housing policy in Lao PDR.
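
For readers unfamiliar with adjusted odds ratios, the Python sketch below shows the general shape of such an analysis using statsmodels; the variable names and synthetic data are placeholders, not the study's dataset or covariate set.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per resident, a binary symptom
# outcome, two measured pollutants, and an adjustment covariate.
rng = np.random.default_rng(0)
n = 181
df = pd.DataFrame({
    "symptom": rng.integers(0, 2, n),  # 1 = respiratory symptom reported
    "no2": rng.normal(40.0, 10.0, n),  # measured NO2
    "co": rng.normal(5.0, 2.0, n),     # measured CO
    "age": rng.integers(18, 80, n),
})

# Binary logistic regression; the adjusted odds ratio for each term
# is exp(coefficient), holding the other terms fixed.
model = smf.logit("symptom ~ no2 + co + age", data=df).fit(disp=False)
print(np.exp(model.params))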

Relevance:

10.00%

Publisher:

Abstract:

Background: In the last decade, there has been increasing interest in the health effects of sedentary behavior, which is often assessed using self-report sitting-time questions. The aim of this qualitative study was to document older adults' understanding of sitting-time questions from the International Physical Activity (PA) Questionnaire (IPAQ) and the PA Scale for the Elderly (PASE). Methods: Australian community-dwelling adults aged 65+ years answered the IPAQ and PASE sitting questions in face-to-face semi-structured interviews. The IPAQ uses one open-ended question to assess sitting on a weekday in the last 7 days 'at work, at home, while doing coursework and during leisure time'; the PASE uses a three-part closed question about daily leisure-time sitting in the last 7 days. Participants expressed their thoughts out loud while answering each question and were then probed about their responses. Interviews were recorded, transcribed and coded into themes. Results: The mean age of the 28 male and 27 female participants was 73 years (range 65-89). The most frequently reported activity was watching TV. For both questionnaires, many participants had difficulty understanding which activities to report. Some had difficulty understanding which activities should be classified as 'leisure-time sitting'. Some assumed they were being asked to report only the activities provided as examples. Most reported activities they normally do, rather than those performed on a day in the previous week. Participants used a variety of strategies to select 'a day' for which they reported their sitting activities and to calculate sitting time on that day; consequently, many different ways of estimating sitting time were used. Participants had particular difficulty reporting their daily sitting time when their schedules were not consistent across days. Some participants declared the IPAQ sitting question too difficult to answer. Conclusion: The accuracy of older adults' self-reported sitting time is questionable given the challenges they have in answering sitting-time questions. Their responses may be more accurate if our recommendations are incorporated: clarifying the sitting domains, providing examples relevant to older adults, and suggesting strategies for formulating responses. Future quantitative studies should include objective criterion measures to assess the validity and reliability of these questions.

Relevance:

10.00%

Publisher:

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
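
As a minimal taste of the Part I material, the following Python/NumPy sketch (the tooling is our choice, not the book's) verifies numerically that linear convolution in the time domain equals multiplication of DFTs in the frequency domain:

import numpy as np

# Two short discrete-time signals.
x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, -1.0, 0.5])

# Direct (time-domain) linear convolution.
y_time = np.convolve(x, h)

# Frequency-domain route: zero-pad both signals to the full output
# length, multiply their DFTs, and invert.
n = len(x) + len(h) - 1
y_freq = np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)).real

print(np.allclose(y_time, y_freq))  # True: the two methods agree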

Relevance:

10.00%

Publisher:

Abstract:

Aim. This paper elucidates the nature of metaphor and the conditions necessary for its use as an analytic device in qualitative research, and describes how the use of metaphor assisted in the analytic processes of a grounded theory study of nephrology nursing expertise. Background. The use of metaphor is pervasive in everyday thought, language and action. It is an important means for the comprehension and management of everyday life, and makes challenging or problematic concepts easier to explain. Metaphors are also pervasive in quantitative and qualitative research for the same reason. In both everyday life and research, their use may be implicit or explicit. Methods. The study, which used grounded theory methodology, took place in one renal unit in New South Wales, Australia between 1999 and 2000 and included six non-expert and 11 expert nurses. It involved simultaneous data collection and analysis using participant observation, semi-structured interviews and review of nursing documentation. Findings. A three-stage skills-acquisitive process was identified, in which an orchestral metaphor was used to explain the relationships between stages and to satisfactorily capture the data coded within each stage. Conclusion. Metaphors create images, clarify and add depth to meanings and, if used appropriately and explicitly in qualitative research, can capture data at highly conceptual levels. Metaphors also assist in explaining the relationship between findings in a clear and coherent manner. © 2005 Blackwell Publishing Ltd.

Relevance:

10.00%

Publisher:

Abstract:

This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification, and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
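
To give a flavour of the scale-sensitive results mentioned above, a typical large-margin bound has the following shape (a hedged sketch of the general form only, not the book's exact statement or constants): with probability at least 1 - delta over an i.i.d. sample of size n, every f in the class F satisfies

\Pr[\operatorname{sign}(f(X)) \neq Y] \;\le\; \widehat{\mathrm{er}}_{\gamma}(f) + O\!\left(\sqrt{\frac{\mathrm{fat}_{F}(\gamma/16)\,\log^{2} n + \log(1/\delta)}{n}}\right),

where \widehat{\mathrm{er}}_{\gamma}(f) is the fraction of sample points classified with margin below \gamma and \mathrm{fat}_{F} is the fat-shattering (scale-sensitive Vapnik-Chervonenkis) dimension at the given scale.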

Relevance:

10.00%

Publisher:

Abstract:

Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that not all generalizations preserve the nice property of Bayes consistency. We provide a necessary and sufficient condition for consistency which applies to a large class of multiclass classification methods. The approach is illustrated by applying it to some multiclass methods proposed in the literature.

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of the concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this paper is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, where choose(m, ≤k) denotes the sum of the binomial coefficients choose(m, i) for i = 0, …, k. This positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.

Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of binary classification where the classifier can, for a particular cost, choose not to classify an observation. Just as in the conventional classification problem, minimization of the sample average of the cost is a difficult optimization problem. As an alternative, we propose the optimization of a certain convex loss function φ, analogous to the hinge loss used in support vector machines (SVMs). Its convexity ensures that the sample average of this surrogate loss can be efficiently minimized. We study its statistical properties. We show that minimizing the expected surrogate loss—the φ-risk—also minimizes the risk. We also study the rate at which the φ-risk approaches its minimum value. We show that fast rates are possible when the conditional probability P(Y=1|X) is unlikely to be close to certain critical values.
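
For concreteness, one surrogate of the kind described, written as a hedged sketch (the exact thresholds and slopes are given in the paper; the form below is a standard generalized hinge for rejection cost d in (0, 1/2)):

\phi_d(z) =
\begin{cases}
1 - \frac{1-d}{d}\, z, & z < 0,\\
1 - z, & 0 \le z < 1,\\
0, & z \ge 1.
\end{cases}

The steeper slope for z < 0 encodes that a confident wrong answer costs more than a rejection; convexity in z is what makes empirical minimization tractable, exactly as with the SVM hinge loss.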

Relevance:

10.00%

Publisher:

Abstract:

Binary classification is a well studied special case of the classification problem. Statistical properties of binary classifiers, such as consistency, have been investigated in a variety of settings. Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that one can lose consistency in generalizing a binary classification method to deal with multiple classes. We study a rich family of multiclass methods and provide a necessary and sufficient condition for their consistency. We illustrate our approach by applying it to some multiclass methods proposed in the literature.
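
As background for the consistency notion at stake (standard definitions, stated here as a sketch): writing R(f) = \Pr(f(X) \neq Y) for the classification risk and R^{*} = \inf_{f} R(f) for the Bayes risk over all measurable f, a method producing classifiers \hat{f}_{n} from n samples is Bayes consistent when

R(\hat{f}_{n}) \;\to\; R^{*} \quad \text{in probability as } n \to \infty.

The condition in the paper characterizes exactly which multiclass generalizations of a binary method retain this property.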

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of the concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this report is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.

Relevance:

10.00%

Publisher:

Abstract:

The reduction of CO2 emissions and social exclusion are two key elements of UK transport strategy. Despite intensive research on each theme, little effort has so far been made to link emissions and social exclusion, and current knowledge on both themes is largely limited to urban areas; little research is available for rural areas. This research addresses this gap in the literature by analysing 157 weekly activity-travel diaries collected from three case study areas in rural Northern Ireland with differing levels of area accessibility and area mobility options. Individual weekly CO2 emission levels (both hot exhaust and cold-start emissions) were calculated from the personal travel diaries using average speed models for different modes of transport. The socio-spatial patterns associated with CO2 emissions were identified using a general linear model, while binary logistic regression analyses were conducted to identify mode choice behaviour and activity patterns. The groups that emitted significantly lower levels of CO2 included individuals living in an area with higher levels of accessibility and mobility, and non-car-owning, non-working, and low-income older people. However, this research also shows that although certain groups (e.g. those working, and those residing in an area with a lower level of accessibility) emitted higher levels of CO2, their rate of participation in activities was significantly lower than that of their counterparts. Based on these findings, this research highlights the need for both soft (e.g. teleworking) and physical (e.g. accessibility planning) policy measures in rural areas in order to meet the government's stated CO2 reduction targets while at the same time enhancing social inclusion.
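
The Python sketch below illustrates the shape of an average-speed emission calculation of the kind applied to the diaries; the emission-factor curve and cold-start excess are invented placeholders, not the study's calibrated values:

# Hypothetical average-speed emission model for one diary trip.
def hot_emission_g(distance_km, avg_speed_kmh):
    # Placeholder U-shaped factor: grams per km rise at very low and
    # very high average speeds, with a minimum near 60 km/h.
    factor_g_per_km = 180.0 + 0.05 * (avg_speed_kmh - 60.0) ** 2
    return distance_km * factor_g_per_km

def trip_emission_g(distance_km, avg_speed_kmh, cold_start=False):
    # Cold starts add a fixed excess on top of hot exhaust emissions.
    cold_start_excess_g = 90.0 if cold_start else 0.0
    return hot_emission_g(distance_km, avg_speed_kmh) + cold_start_excess_g

# A 12 km commute at an average 45 km/h, starting from cold.
print(trip_emission_g(12.0, 45.0, cold_start=True))

A weekly total would sum trip_emission_g over every trip in an individual's diary, with a factor curve per mode of transport.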

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a framework for evaluating information retrieval of medical records. We use the BLULab corpus, a large collection of real-world de-identified medical records. The collection has been hand-coded by clinical terminologists using the ICD-9 medical classification system. The ICD codes are used to devise queries and relevance judgements for this collection. Results of initial test runs using a baseline IR system are provided. The queries and relevance judgements are available online to aid further research in medical IR. Please visit: http://koopman.id.au/med_eval.
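
A toy Python sketch of how hand-assigned ICD-9 codes can generate both queries and relevance judgements (the code descriptions, records, and output format are illustrative placeholders; only the general recipe comes from the paper):

# Each record was hand-coded with ICD-9 codes; a code's textual
# description can serve as a query, and every record carrying that
# code is judged relevant to it.
icd9_descriptions = {
    "428.0": "congestive heart failure",
    "250.00": "type 2 diabetes mellitus",
}
records = {"doc1": {"428.0"}, "doc2": {"250.00", "428.0"}, "doc3": set()}

for qid, (code, description) in enumerate(icd9_descriptions.items(), 1):
    print(f"query {qid}: {description}")
    for doc_id, codes in records.items():
        # TREC-style qrels line: query-id, iteration, doc-id, relevance.
        print(qid, 0, doc_id, int(code in codes))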

Relevance:

10.00%

Publisher:

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic textures described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute its sparse coefficients using the Dantzig Selector algorithm, which was proposed in the compressed sensing literature. The reconstruction errors are then computed, and abnormal items are detected on that basis. Our approach can detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it outperforms current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
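
A hedged Python sketch of the detection step follows. The dictionary and test descriptor are random placeholders, and scikit-learn's Lasso is substituted for the Dantzig Selector as a readily available L1 solver (both produce sparse coefficients, but they are different programs):

import numpy as np
from sklearn.linear_model import Lasso

# D stands in for an overcomplete basis learnt from normal LBP-TOP
# descriptors; y stands in for the descriptor of a new observation.
rng = np.random.default_rng(0)
n_features, n_atoms = 59, 200          # 59-bin LBP histogram, 200 atoms
D = rng.standard_normal((n_features, n_atoms))
y = rng.standard_normal(n_features)

# Sparse coefficients for y over the dictionary D.
coder = Lasso(alpha=0.1, fit_intercept=False, max_iter=5000)
coder.fit(D, y)

# Large reconstruction error means y is poorly explained by the
# normal-only basis, i.e. a candidate unusual event.
reconstruction = D @ coder.coef_
error = np.linalg.norm(y - reconstruction)
is_unusual = error > 5.0               # threshold tuned on normal data
print(error, is_unusual)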