962 results for multiple components
Abstract:
Mistuning a harmonic produces an exaggerated change in its pitch. This occurs because the component becomes inconsistent with the regular pattern that causes the other harmonics (constituting the spectral frame) to integrate perceptually. These pitch shifts were measured when the fundamental (F0) component of a complex tone (nominal F0 frequency = 200 Hz) was mistuned by +8% and -8%. The pitch-shift gradient was defined as the difference between these values, and its magnitude was used as a measure of frame integration. An independent and random perturbation (spectral jitter) was applied simultaneously to most or all of the frame components. The gradient magnitude declined gradually as the degree of jitter increased from 0% to ±40% of F0. The component adjacent to the mistuned target made the largest contribution to the gradient, but more distant components also contributed. The stimuli were passed through an auditory model, and the height of the F0-period peak in the averaged summary autocorrelation function correlated well with the gradient magnitude. The fit improved when the weighting on more distant channels was attenuated by a factor of three per octave. The results are consistent with a grouping mechanism that computes a weighted average of periodicity strength across several components. © 2006 Elsevier B.V. All rights reserved.
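The channel-weighting rule reported in this abstract (attenuation by a factor of three per octave of distance, followed by a weighted average of periodicity strength) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function names and inputs are assumptions.

```python
import numpy as np

def channel_weights(channel_cfs_hz, target_cf_hz, attenuation_per_octave=3.0):
    # Weight each auditory channel by its octave distance from the target
    # component, attenuated by the stated factor per octave (illustrative).
    octaves = np.abs(np.log2(np.asarray(channel_cfs_hz, dtype=float) / target_cf_hz))
    return attenuation_per_octave ** (-octaves)

def weighted_periodicity(periodicity_strengths, weights):
    # Weighted average of per-channel periodicity strength, as in the
    # grouping mechanism the abstract's results are consistent with.
    w = np.asarray(weights, dtype=float)
    s = np.asarray(periodicity_strengths, dtype=float)
    return float(np.sum(w * s) / np.sum(w))
```

A channel centred on the 200-Hz target gets weight 1; a channel one octave away gets weight 1/3, two octaves away 1/9.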
Abstract:
In Statnotes 24 and 25, multiple linear regression, a statistical method that examines the relationship between a single dependent variable (Y) and two or more independent variables (X), was described. The principal objective of such an analysis was to determine which of the X variables had a significant influence on Y and to construct an equation that predicts Y from the X variables. ‘Principal components analysis’ (PCA) and ‘factor analysis’ (FA) are also methods of examining the relationships between different variables, but they differ from multiple regression in that no distinction is made between dependent and independent variables, all variables being treated essentially the same. Originally, PCA and FA were regarded as distinct methods, but in recent times they have been combined into a single analysis, PCA often being the first stage of an FA. The basic objective of a PCA/FA is to examine the relationships between the variables, or the ‘structure’ of the variables, and to determine whether these relationships can be explained by a smaller number of ‘factors’. This statnote describes the use of PCA/FA in the analysis of the differences between the DNA profiles of different MRSA strains introduced in Statnote 26.
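The core PCA idea described here (replacing many correlated variables with a few orthogonal components that account for the maximum variance) can be sketched in a few lines. This is a minimal illustration of the general method, not the Statnote's own analysis or data.

```python
import numpy as np

def pca(X, n_components):
    # Principal components via eigendecomposition of the covariance matrix.
    # Returns component scores and the fraction of total variance each
    # retained component explains.
    Xc = X - X.mean(axis=0)                  # centre each variable
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]        # sort descending by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Xc @ eigvecs[:, :n_components]
    explained = eigvals[:n_components] / eigvals.sum()
    return scores, explained

# Toy data: 200 observations of 4 variables driven by 2 latent factors,
# so two components should explain nearly all the variance.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
X = latent @ loadings.T + 0.05 * rng.normal(size=(200, 4))
scores, explained = pca(X, 2)
```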
Abstract:
It is well known that optic flow - the smooth transformation of the retinal image experienced by a moving observer - contains valuable information about the three-dimensional layout of the environment. From psychophysical and neurophysiological experiments, specialised mechanisms responsive to components of optic flow (sometimes called complex motion) such as expansion and rotation have been inferred. However, it remains unclear (a) whether the visual system has mechanisms for processing the component of deformation and (b) whether there are multiple mechanisms that function independently from each other. Here, we investigate these issues using random-dot patterns and a forced-choice subthreshold summation technique. In experiment 1, we manipulated the size of a test region that was permitted to contain signal and found substantial spatial summation for signal components of translation, expansion, rotation, and deformation embedded in noise. In experiment 2, little or no summation was found for the superposition of orthogonal pairs of complex motion patterns (e.g. expansion and rotation), consistent with probability summation between pairs of independent detectors. Our results suggest that optic-flow components are detected by mechanisms that are specialised for particular patterns of complex motion.
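Probability summation between independent detectors, the interpretation offered for experiment 2, has a simple standard form: the observer detects if at least one detector responds, so the combined probability is one minus the product of the individual miss probabilities. The sketch below encodes that rule for illustration; it is not the authors' model code.

```python
def probability_summation(p_detectors):
    # Probability that at least one of several independent detectors
    # responds: 1 minus the product of the individual miss probabilities.
    # Little or no gain from superimposing two patterns is consistent
    # with this rule rather than with a single pooled detector.
    miss = 1.0
    for p in p_detectors:
        miss *= (1.0 - p)
    return 1.0 - miss
```

For example, two independent detectors each at 50% detection yield a combined 75%, a much smaller gain than linear summation within one mechanism would produce.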
Abstract:
PCA/FA is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses, including the study of the pattern of variation between individual entities such as patients with particular disorders, and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of ‘factors’ or PCs that account for the maximum variance in the data and hence may explain important trends among the variables. An increasingly important application of the method is in the ‘validation’ of questionnaires that attempt to relate subjective aspects of a patient's experience to more objective measures of vision.
Abstract:
Rheumatoid arthritis (RA) is associated with excess cardiovascular risk, and there is a need to assess that risk. However, individual lipid levels may be influenced by disease activity and drug use, whereas lipid ratios may be more robust. A cross-sectional cohort of 400 consecutive patients was used to establish factors that influenced individual lipid levels and lipid ratios in RA, using multiple regression models. A further longitudinal cohort of 550 patients with RA was used to confirm these findings, using generalized estimating equations. Cross-sectionally, higher C-reactive protein (CRP) levels correlated with lower levels of total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), and high-density lipoprotein cholesterol (HDL-C) (P = .015), whereas lipid ratios did not correlate with CRP. The findings were broadly replicated in the longitudinal data. In summary, the effects of inflammation on individual lipid levels may lead to underestimation of lipid-associated cardiovascular disease (CVD) risk in RA; thus lipid ratios may be more appropriate for CVD risk stratification in RA.
Abstract:
Objective: To investigate the dynamics of communication within the primary somatosensory neuronal network. Methods: Multichannel EEG responses evoked by median nerve stimulation were recorded from six healthy participants. We investigated the directional connectivity of the evoked responses by assessing the Partial Directed Coherence (PDC) among five neuronal nodes (brainstem, thalamus and three in the primary sensorimotor cortex), which had been identified using the Functional Source Separation (FSS) algorithm. We analyzed directional connectivity separately in the low (1-200 Hz, LF) and high (450-750 Hz, HF) frequency ranges. Results: LF forward connectivity showed peaks at 16, 20, 30 and 50 ms post-stimulus. An estimate of the strength of connectivity was modulated by feedback involving cortical and subcortical nodes. In HF, forward connectivity showed peaks at 20, 30 and 50 ms, with no apparent feedback-related strength changes. Conclusions: In this first non-invasive study in humans, we documented directional connectivity across the subcortical and cortical somatosensory pathway, discriminating transmission properties within the LF and HF ranges. Significance: The combined use of FSS and PDC in a simple protocol such as median nerve stimulation sheds light on how high- and low-frequency components of the somatosensory evoked response are functionally interrelated in sustaining somatosensory perception in healthy individuals. Thus, these components may potentially be explored as biomarkers of pathological conditions. © 2012 International Federation of Clinical Neurophysiology.
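Partial Directed Coherence itself has a compact standard definition: from the lag coefficient matrices of a fitted multivariate autoregressive (MVAR) model, form the spectral matrix A(f) and normalise each column. The sketch below shows that computation only (model fitting is not shown); array shapes and names are illustrative, not the study's code.

```python
import numpy as np

def partial_directed_coherence(ar_coeffs, freqs, fs):
    # ar_coeffs: shape (p, n, n), the lag-r coefficient matrices of a
    # fitted MVAR model on n signals. Returns PDC of shape
    # (len(freqs), n, n); entry [k, i, j] is the normalised directed
    # influence from node j to node i at frequency freqs[k].
    p, n, _ = ar_coeffs.shape
    pdc = np.empty((len(freqs), n, n))
    for k, f in enumerate(freqs):
        # A(f) = I - sum_r A_r * exp(-i 2*pi*f*r / fs)
        A = np.eye(n, dtype=complex)
        for r in range(p):
            A -= ar_coeffs[r] * np.exp(-2j * np.pi * f * (r + 1) / fs)
        denom = np.sqrt((np.abs(A) ** 2).sum(axis=0))  # per-column norm
        pdc[k] = np.abs(A) / denom
    return pdc
```

A useful sanity check on the normalisation: for every frequency and every source node j, the squared PDC values summed over all target nodes equal 1.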
Abstract:
Formulating complex queries is hard, especially when users cannot understand all the data structures of multiple complex knowledge bases. We see a gap between simplistic but user-friendly tools and formal query languages. Building on an example comparison search, we propose an approach in which reusable search components take an intermediary role between the user interface and formal query languages.
Abstract:
This paper describes the knowledge elicitation and knowledge representation aspects of a system being developed to help with the design and maintenance of relational data bases. The domain is large and contains multiple experts, but any given expert's knowledge of this large domain is only partial. The paper discusses the methods and techniques used for knowledge elicitation, which was based on a "broad and shallow" approach at first, moving to a "narrow and deep" one later, and describes the models used for knowledge representation, which were based on a layered "generic and variants" approach. © 1995.
Abstract:
While the literature has suggested the possibility of breach being composed of multiple facets, no previous study has investigated this possibility empirically. This study examined the factor structure of typical component forms in order to develop a multiple-component-form measure of breach. Two studies were conducted. In study 1 (N = 420), multi-item measures based on causal indicators representing promissory obligations were developed for the five potential component forms (delay, magnitude, type/form, inequity and reciprocal imbalance). Exploratory factor analysis showed that the five components loaded onto one higher-order factor, namely psychological contract breach, suggesting that breach is composed of different aspects rather than types of breach. Confirmatory factor analysis provided further evidence for the proposed model. In addition, the model achieved high construct reliability and showed good construct, convergent, discriminant and predictive validity. Study 2 data (N = 189), used to validate the study 1 results, compared the multiple-component measure with an established multi-item measure of breach (rather than a single item as in study 1) and also tested for discriminant validity with an established multi-item measure of violation. Findings replicated those of study 1. The findings have important implications for considering alternative, more comprehensive and elaborate ways of assessing breach.
Abstract:
Progress on advanced active and passive photonic components that are required for high-speed optical communications over hollow-core photonic bandgap fiber at wavelengths around 2 μm is described in this paper. Single-frequency lasers capable of operating at 10 Gb/s and covering a wide spectral range are realized. A comparison is made between waveguide and surface normal photodiodes with the latter showing good sensitivity up to 15 Gb/s. Passive waveguides, 90° optical hybrids, and arrayed waveguide grating with 100-GHz channel spacing are demonstrated on a large spot-size waveguide platform. Finally, a strong electro-optic effect using the quantum confined Stark effect in strain-balanced multiple quantum wells is demonstrated and used in a Mach-Zehnder modulator capable of operating at 10 Gb/s.
Abstract:
Measurement and variation control of geometrical Key Characteristics (KCs), such as flatness and gap of joint faces and coaxiality of cabin sections, is a crucial issue in large-component assembly in the aerospace industry. Aiming to control geometrical KCs and to attain the best-fit posture, an optimization algorithm based on KCs for large-component assembly is proposed. This approach treats posture best fit, a key activity in Measurement Aided Assembly (MAA), as a two-phase optimization problem. In the first phase, the global measurement coordinate systems of the digital model and the shop floor are unified with minimum error based on singular value decomposition, and the current posture of the components being assembled is optimally solved in terms of minimum variation of all reference points. In the second phase, the best posture of the movable component is determined by minimizing the variation of multiple KCs, subject to the constraint that every KC conforms to its product specification. Optimal models and process procedures for these two-phase problems based on Particle Swarm Optimization (PSO) are proposed. In each model, every posture to be calculated is modeled as a 6-dimensional particle (three translation and three rotation parameters). Finally, an example in which two cabin sections of a satellite mainframe structure are assembled is used to verify the effectiveness of the proposed approach, models and algorithms. The experimental results show the approach is promising and will provide a foundation for further study and application. © 2013 The Authors.
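The first phase described here, unifying two measurement coordinate systems with minimum error via singular value decomposition, is the classic least-squares rigid registration problem. A minimal sketch using the standard SVD (Kabsch) construction is given below; it illustrates the general technique, not the paper's specific implementation, and the point sets are hypothetical reference points.

```python
import numpy as np

def best_fit_transform(P, Q):
    # Least-squares rigid transform (R, t) mapping point set P onto Q,
    # i.e. minimising ||(P @ R.T + t) - Q|| over rotations R and
    # translations t. P, Q: (n, 3) arrays of corresponding points.
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

Given noise-free corresponding points, the routine recovers the exact rotation and translation between the two frames; with measurement noise it returns the minimum-error fit.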
Abstract:
Multiple system atrophy (MSA) is a rare neurodegenerative disorder associated with parkinsonism, ataxia, and autonomic dysfunction. Its pathology is primarily subcortical, comprising vacuolation, neuronal loss, gliosis, and α-synuclein-immunoreactive glial cytoplasmic inclusions (GCI). To quantify cerebellar pathology in MSA, the density and spatial pattern of the pathological changes were studied in α-synuclein-immunolabelled sections of the cerebellar hemisphere in 10 MSA and 10 control cases. In MSA, densities of Purkinje cells (PC) were decreased and vacuoles in the granule cell layer (GL) increased compared with controls. In six MSA cases, GCI were present in cerebellar white matter. In the molecular layer (ML) and GL of MSA, vacuoles were clustered, the clusters exhibiting a regular distribution parallel to the edge of the folia. Purkinje cells were randomly or regularly distributed, with large gaps between surviving cells. Densities of glial cells and surviving neurons in the ML, and of surviving cells and vacuoles in the GL, were negatively correlated, consistent with gliosis and vacuolation in response to neuronal loss. Principal components analysis (PCA) suggested that vacuole densities in the ML and vacuole density and cell losses in the GL were the main sources of neuropathological variation among cases. The data suggest that: (1) cell losses and vacuolation of the GL and loss of PC were the most significant pathological changes in the cases studied, (2) pathological changes were topographically distributed, and (3) cerebellar pathology could influence cerebral function in MSA via the cerebello-dentato-thalamic tract.
Abstract:
The purpose of this study was to explain how exemplary service providers in luxury hotels provide consistently excellent service. Using a case study framework, the study investigated the service providers' strategies and concepts of service delivery, the importance and implementation of organizational and individual controls, and the role of training and learning. The study identified barriers to service provision and characteristics of the exemplary individuals that affect their ability to deliver luxury service. This study sought to better understand how exemplary service providers learn, think about, and do their work. The sample population of three Five-Diamond-Award winning resorts was selected for its potential for learning about the phenomenon of interest. The results demonstrate that exemplary service providers possess individual characteristics that are enhanced by the organizations for which they work. Exemplary service providers are often exemplary communicators who are emotionally generous and genuinely enjoy helping and serving others. Exemplary service organizations treat their employees as they treat their customers, as suggested by the Service-Profit Chain (Heskett, Sasser & Schlesinger, 1997). Further, they have systems and standards to guarantee satisfactory service experiences for every guest. They also encourage their service providers to personalize their service delivery and to seek opportunities to delight their guests, using a combination of controls, traditions and cultural values. Several customer service theories are discussed in relation to whether or not they were supported by the data. The study concluded that the delivery of exemplary service is a complex phenomenon that requires successful interactions between guests, service providers and the organization. A Model of Exemplary Service Delivery is presented and discussed that demonstrates the components of service quality as shown in the data.
The model can be used by practitioners seeking to create, enhance, or evaluate their service quality, and by researchers seeking insights into the complex concepts in service quality research. Implications for future research are discussed.
Abstract:
The large upfront investments required for game development pose a severe barrier to the wider uptake of serious games in education and training. There is also a lack of well-established methods and tools that support game developers in preserving and enhancing games' pedagogical effectiveness. The RAGE project, a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components that aim to support the pedagogical qualities of serious games. In order to easily deploy and integrate these game components in a multitude of game engines, platforms and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall system's concept and its practical benefits. First, the Emotion Detection component uses the learners' webcams to capture their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning analytics data processing, which allows instructors to track and inspect learners' progress without having to deal with the required statistics computations. Third, a set of language processing components supports the analysis of learners' textual inputs, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for data storage - e.g. for player data or game world data - across multiple software components. The presented components are exemplary of the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
Abstract:
The Short Term Assessment of Risk and Treatability (START) is a structured judgement tool used to inform risk estimation for multiple adverse outcomes. In research, risk estimates outperform the tool's strength and vulnerability scales for violence prediction. Little is known about what its component parts contribute to the assignment of risk estimates, and how those estimates fare in the prediction of non-violent adverse outcomes compared with the structured components. START assessment and outcomes data from a secure mental health service (N = 84) were collected. Binomial and multinomial regression analyses determined the contribution of selected elements of the START structured domain and recent adverse risk events to risk estimates and outcomes prediction for violence, self-harm/suicidality, victimisation, and self-neglect. START vulnerabilities and lifetime history of violence predicted the violence risk estimate; self-harm and victimisation estimates were predicted only by corresponding recent adverse events. Recent adverse events uniquely predicted all corresponding outcomes, with the exception of self-neglect, which was predicted by the strength scale. Only for victimisation did the risk estimate outperform prediction based on the START components and recent adverse events. In the absence of recent corresponding risk behaviour, restrictions imposed on the basis of START-informed risk estimates could be unwarranted and may be unethical.