869 results for Error-correcting codes (Information theory)
Abstract:
Like classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) built on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, BSDT/NNAMM optimal likelihood and posterior probabilities are analyzed analytically and used to generate ROCs, modified (posterior) mROCs, and optimal overall likelihoods and posteriors. It is shown that, for describing basic discrimination experiments in psychophysics within the BSDT, a ‘neural space’ can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; that the BSDT’s isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; that the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and that near-neutral bias values are the observer’s natural choice. The uniformity (no-priming) hypothesis, concerning the ‘in-mind’ distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The sensitivity, bias, ROC spaces, and decision spaces of the BSDT and classic SDT are compared.
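For readers unfamiliar with how ROC curves arise from a sensitivity/bias parameterization, the sketch below generates a classic equal-variance Gaussian SDT ROC (hit rate versus false-alarm rate as the decision criterion sweeps). It illustrates only the standard SDT construction the abstract compares against, not the BSDT/NNAMM model itself; the d' value and criterion grid are illustrative assumptions.

```python
# Minimal sketch: classic equal-variance Gaussian SDT ROC curve.
# This shows the standard SDT construction referenced above, not BSDT/NNAMM;
# d_prime and the criterion grid are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def sdt_roc(d_prime, criteria):
    """Hit and false-alarm rates as the decision criterion sweeps."""
    hits = norm.sf(criteria - d_prime / 2.0)  # P(respond "signal" | signal)
    fas = norm.sf(criteria + d_prime / 2.0)   # P(respond "signal" | noise)
    return fas, hits

criteria = np.linspace(-4, 4, 81)
fas, hits = sdt_roc(d_prime=1.5, criteria=criteria)
for f, h in zip(fas[::20], hits[::20]):
    print(f"FA = {f:.3f}  Hit = {h:.3f}")
```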
Abstract:
Forward error correction (FEC) plays a vital role in coherent optical systems employing multi-level modulation. However, much of coding theory assumes that additive white Gaussian noise (AWGN) is dominant, whereas coherent optical systems have significant phase noise (PN) in addition to AWGN. This changes the error statistics and impacts FEC performance. In this paper, we propose a novel semianalytical method for dimensioning binary Bose-Chaudhuri-Hocquenghem (BCH) codes for systems with PN. Our method involves extracting statistics from pre-FEC bit error rate (BER) simulations. We use these statistics to parameterize a bivariate binomial model that describes the distribution of bit errors. In this way, we relate pre-FEC statistics to post-FEC BER and BCH codes. Our method is applicable to pre-FEC BER around 10⁻³ and any post-FEC BER. Using numerical simulations, we evaluate the accuracy of our approach for a target post-FEC BER of 10⁻⁵. Codes dimensioned with our bivariate binomial model meet the target to within 0.2 dB of signal-to-noise ratio.
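As a point of comparison for the bivariate model described above, the common first-order estimate assumes independent bit errors: for a t-error-correcting BCH(n, k) code, the post-FEC BER follows from a binomial tail. The sketch below implements that i.i.d. baseline, which the paper's model refines to account for phase-noise-correlated errors; the code parameters and pre-FEC BER used here are illustrative assumptions.

```python
# Minimal sketch: i.i.d.-error baseline for relating pre-FEC BER p to post-FEC
# BER for a t-error-correcting BCH(n, k) code under bounded-distance decoding.
# The paper's bivariate binomial model refines this for phase-noise-correlated
# errors; n, t and p below are illustrative assumptions.
from math import comb

def post_fec_ber_iid(n, t, p):
    """Approximate post-FEC BER: blocks with more than t errors fail,
    leaving roughly i residual bit errors out of n."""
    ber = 0.0
    for i in range(t + 1, n + 1):
        ber += (i / n) * comb(n, i) * p**i * (1 - p)**(n - i)
    return ber

print(post_fec_ber_iid(n=1023, t=8, p=1e-3))  # rough post-FEC BER estimate
```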
Abstract:
The author investigates the impact of the global financial crisis that began in 2008 on the forecasting error for earnings per share (EPS). There is plentiful evidence dating back to the 1980s that analysts give systematically more favourable values in their EPS forecasts than reality, i.e. they are generally optimistic. Other investigations have supported the idea that the EPS forecasting error grows in uncertain environments, while further research has shown that analysts under-react to negative information in their forecasts. The financial crisis brought a flood of negative information for analysts to consider in such forecasts, while also increasing the level of uncertainty for the entire economy. The article investigates the impact of the financial crisis on the EPS forecasting error, distinguishing the period in which the crisis was merely negative information from the period in which, as a consequence of the crisis, uncertainty had increased significantly across the entire economy.
Abstract:
The theoretical foundation of this study comes from the significant recurrence throughout the leadership literature of two distinct behaviors: task orientation and relationship orientation. Task orientation and relationship orientation are assumed to be generic behaviors that are universally observed and applied in organizations, even though they may be uniquely enacted in organizations across cultures. The lack of empirical evidence supporting these assumptions provided the impetus to hypothesize and empirically confirm the universal application of task orientation and relationship orientation and the generalizability of their measurement in a cross-cultural setting. Task orientation and relationship orientation are operationalized through consideration and initiation of structure, two well-established theoretical leadership constructs. Multiple-group mean and covariance structures (MACS) analyses are used simultaneously to validate the generalizability of the two hypothesized constructs across the 12 cultural groups and to assess whether the similarities and differences discovered are measurement and scaling artifacts or reflect true cross-cultural differences. The data were collected by the author and others as part of a larger international research project and comprise 2341 managers from 12 countries/regions. The results provide compelling evidence that task orientation and relationship orientation, reliably and validly operationalized through consideration and initiation of structure, are generalizable across the countries/regions sampled. But the results also reveal significant differences in the perception of these behaviors, suggesting that some aspects of task orientation and relationship orientation are strongly affected by cultural influences. These similarities and differences reflect directly interpretable, error-free effects among the constructs at the behavioral level. Thus, task orientation and relationship orientation can demonstrate different relations among cultures, yet still be defined equivalently across the cultures studied. The differences found in this study are true differences and may contain information about the cultural influences characterizing each cultural context (i.e. group). The nature of such influences should be examined before the results can be meaningfully interpreted. To examine the effects of cultural characteristics on the constructs, additional hypotheses on the constructs' latent parameters can be tested across groups. Construct-level tests are illustrated with hypothetical examples in light of the study's results. The study contributes significantly to the theoretical understanding of the nature and generalizability of psychological constructs. The theoretical and practical implications of embedding context into a unified theory of task-oriented and relationship-oriented leader behavior are proposed. Limitations and contributions are also discussed.
Abstract:
Concept evaluation in the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage comes mainly from experts' judgments, which are subjective and imprecise. Managing this subjectivity so as to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method that combines information entropy theory with rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle vagueness in a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria, and composite performance values based on rough numbers are calculated to rank the candidate design concepts. The results of a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity throughout the decision-making process.
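As a crisp (non-rough) illustration of the entropy-weighting step described above, the sketch below computes classical Shannon-entropy-based criteria weights from a decision matrix of expert scores. The paper's method additionally wraps the judgments in rough numbers to handle vagueness, which this sketch omits; the matrix values are illustrative assumptions.

```python
# Minimal sketch: classical Shannon-entropy criteria weighting, the crisp
# analogue of the rough-number-based entropy step described above.
# The decision matrix (alternatives x criteria) is an illustrative assumption.
import numpy as np

def entropy_weights(X):
    """Criteria with more dispersion across alternatives (lower entropy)
    receive larger weights. Assumes all scores are strictly positive."""
    P = X / X.sum(axis=0)                      # normalize each criterion column
    k = 1.0 / np.log(X.shape[0])
    e = -k * (P * np.log(P)).sum(axis=0)       # entropy per criterion
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

X = np.array([[7.0, 5.0, 9.0],
              [6.0, 8.0, 4.0],
              [8.0, 6.0, 7.0]])                # 3 concepts scored on 3 criteria
print(entropy_weights(X))
```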
Abstract:
Developing a theoretical framework for pervasive information environments is an enormous goal. This paper aims to provide a small step towards that goal. The following pages report on our initial investigations to devise a framework that will continue to support locative, experiential and evaluative data from ‘user feedback’ in an increasingly pervasive information environment. We loosely outline this framework by developing a methodology capable of moving from rapid deployment of software and hardware technologies towards the goal of a realistic immersive experience of pervasive information. We propose various technical solutions and address a range of problems, such as information capture, through a novel model of sensing, processing, visualization and cognition.
Abstract:
This thesis attempts to provide deeper historical and theoretical grounding for sense-making, thereby illustrating its applicability to practical information-seeking research. In Chapter One I trace the philosophical origins of Brenda Dervin’s theory known as “sense-making,” reaching beyond current scholarship that locates the origins of sense-making in twentieth-century Phenomenology and Communication theory, and find a rich ontological, epistemological, and etymological heritage that dates back to the Pre-Socratics. After exploring sense-making’s Greek roots, I examine sense-making’s philosophical undercurrents in Hegel’s Phenomenology of Spirit (1807), where he also returns to the simplicity of the Greeks for his concept of sense. In Chapter Two I explore sense-making methodology and find, in light of the Greek and Hegelian dialectic, a dialogical bridge connecting sense-making’s theory with pragmatic uses. This bridge between Dervin’s situation and use occupies a distinct position in sense-making theory. Moreover, building upon Brenda Dervin’s model of sense-making, I use her gap and bridge metaphors to discuss the dialectic and dialogic components of sense-making. The purpose of Chapter Three is pragmatic: to gain insight into the online information-seeking needs, experiences, and motivation of first-degree relatives (FDRs) of breast cancer survivors through the lens of sense-making. This research analyses four questions: 1) information-seeking behavior among FDRs of cancer survivors compared with survivors and with undiagnosed, non-related online cancer information seekers in the general population; 2) the types of information sought and the places where it is sought; 3) the barriers or gaps and satisfaction rates FDRs face in their cancer information quest; and 4) the types and degrees of cancer information and resources FDRs want and use in their information search for themselves and other family members. An online survey instrument designed to investigate these questions was developed and pilot tested. Via an email communication, the Susan Love Breast Cancer Research Foundation distributed 322,000 invitations to its membership to complete the survey; from March 24th to April 5th, 10,692 women agreed to take the survey, with 8,804 volunteers actually completing it. Of the 8,804 respondents, 95% of FDRs have searched for cancer information online, and 84% of FDRs use the Internet as a sense-making tool for additional information to supplement what they have received from doctors or nurses. FDRs report needing much more information than either survivors or family/friends in ten out of fifteen categories related to breast and ovarian cancer. When searching for cancer information online, FDRs also rank higher than either survivors or friends and family on several of sense-making’s emotional levels: uncertainty, confusion, frustration, doubt, and disappointment. The sense-making process has existed in theory and praxis since the early Greeks. In applying sense-making’s theory to a contemporary problem, the survey reveals unaddressed situations and gaps in FDRs’ information search process. FDRs are a highly motivated group of online information seekers whose needs are largely unaddressed as a result of gaps in available online information targeted to their specific needs. Since FDRs represent a quarter of the population, further research addressing their specific online information needs and experiences is necessary.
Abstract:
We propose a family of local CSS stabilizer codes as possible candidates for self-correcting quantum memories in 3D. The construction is inspired by the classical Ising model on a Sierpinski carpet fractal, which acts as a classical self-correcting memory. Our models are naturally defined on fractal subsets of a 4D hypercubic lattice with Hausdorff dimension less than 3. Though this does not imply that these models can be realized with local interactions in R^3, we also discuss this possibility. The X and Z sectors of the code are dual to one another, and we show that there exists a finite temperature phase transition associated with each of these sectors, providing evidence that the system may robustly store quantum information at finite temperature.
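For context on the CSS construction used above: the X and Z stabilizer sectors of a CSS code are specified by two classical parity-check matrices satisfying H_X H_Z^T = 0 (mod 2), which guarantees that X- and Z-type stabilizers commute. The sketch below checks that condition for the familiar 7-qubit Steane code (the self-dual case H_X = H_Z), which is unrelated to the fractal codes of the abstract but illustrates the commutation check the construction rests on.

```python
# Minimal sketch: the CSS commutation condition H_X @ H_Z.T = 0 (mod 2),
# checked for the 7-qubit Steane code, where H_X = H_Z is the [7,4] Hamming
# parity-check matrix. The paper's fractal codes are different; this only
# illustrates the generic CSS construction.
import numpy as np

H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])  # [7,4] Hamming parity-check matrix

H_X, H_Z = H, H
commutes = np.all((H_X @ H_Z.T) % 2 == 0)
print("X and Z stabilizers commute:", bool(commutes))  # True
```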
Abstract:
Perspective taking is a crucial ability that guides our social interactions. In this study, we show how the specific patterns of errors of brain-damaged patients in perspective taking tasks can help us further understand the factors contributing to perspective taking abilities. Previous work (e.g., Samson, Apperly, Chiavarino, & Humphreys, 2004; Samson, Apperly, Kathirgamanathan, & Humphreys, 2005) distinguished two components of perspective taking: the ability to inhibit our own perspective and the ability to infer someone else’s perspective. We assessed these components using a new nonverbal false belief task which provided different response options to detect three types of response strategies that participants might be using: a complete and spared belief reasoning strategy, a reality-based response selection strategy in which participants respond from their own perspective, and a simplified mentalising strategy in which participants avoid responding from their own perspective but rely on inaccurate cues to infer the other person’s belief. One patient, with a self-perspective inhibition deficit, almost always used the reality-based response strategy; in contrast, the other patient, with a deficit in taking other perspectives, tended to use the simplified mentalising strategy without necessarily transposing her own perspective. We discuss the extent to which the pattern of performance of both patients could relate to their executive function deficits and how it can inform us about the cognitive and neural components involved in belief reasoning.
Abstract:
This paper outlines a formal and systematic approach to explication of the role of structure in information organization. It presents a preliminary set of constructs that are useful for understanding the similarities and differences that obtain across information organization systems. This work seeks to provide necessary groundwork for development of a theory of structure that can serve as a lens through which to observe patterns across systems of information organization.
Abstract:
Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to an ambiguous definition of the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customized. IMUs are subject to significant error accumulation, but they can be used for statistical parametrisation of an Eulerian-Lagrangian model, for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at a frequency of 4 Hz from a spherical instrumented particle (111 mm diameter and 2383 kg/m³ density) in a series of entrainment threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximated at 44 and 51 mg for slopes of 0.026 and 0.037 respectively. The saddle inertial acceleration entrainment threshold was at 32 and 25 mg for slopes of 0.044 and 0.057 respectively. For the evaluation of the complete Eulerian-Lagrangian model, two prototype sensors are presented: an idealised (spherical) sensor with a diameter of 90 mm and an ellipsoidal sensor with axes of 100, 70 and 30 mm. Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrize sediment movement, but they do not contain positional information. The two sensors (spherical and ellipsoidal) were tested in a series of entrainment experiments, similar to the evaluation of the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories, at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
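To make the kinetic-energy bookkeeping concrete, the sketch below computes the translational plus rotational kinetic energy of a rigid sphere from velocity and angular-velocity estimates of the kind obtainable (with drift) by integrating IMU accelerations and gyro rates. The diameter and density match the 111 mm prototype described above; the velocity and spin values are illustrative assumptions, not experimental results.

```python
# Minimal sketch: kinetic energy of a rigid spherical particle from velocity
# and angular velocity, e.g. as estimated by integrating IMU output (which in
# practice accumulates drift). Sphere size/density follow the 111 mm prototype
# in the abstract; the velocity and spin values are illustrative assumptions.
import numpy as np

d = 0.111                                   # diameter, m
rho = 2383.0                                # density, kg/m^3
r = d / 2.0
m = rho * (4.0 / 3.0) * np.pi * r**3        # mass, kg
I = 0.4 * m * r**2                          # moment of inertia, solid sphere

v = np.array([0.35, 0.0, 0.05])             # translational velocity, m/s (assumed)
omega = np.array([0.0, 6.0, 0.0])           # angular velocity, rad/s (assumed)

E_trans = 0.5 * m * np.dot(v, v)
E_rot = 0.5 * I * np.dot(omega, omega)
print(f"E_trans = {E_trans:.4f} J, E_rot = {E_rot:.4f} J, "
      f"total = {E_trans + E_rot:.4f} J")
```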
Abstract:
Atomic charge transfer-counter polarization effects determine most of the fundamental infrared CH intensities of simple hydrocarbons: methane, ethylene, ethane, propyne, cyclopropane and allene. The quantum theory of atoms in molecules/charge-charge flux-dipole flux (QTAIM/CCFDF) model predicted the values of 30 CH intensities ranging from 0 to 123 km mol⁻¹ with a root mean square (rms) error of only 4.2 km mol⁻¹, without including a specific equilibrium atomic charge term. Sums of the contributions from terms involving charge flux and/or dipole flux averaged 20.3 km mol⁻¹, about ten times larger than the average charge contribution of 2.0 km mol⁻¹. The only notable exceptions are the CH stretching and bending intensities of acetylene and two of the propyne vibrations for hydrogens bound to sp-hybridized carbon atoms. Calculations were carried out at four quantum levels: MP2/6-311++G(3d,3p), MP2/cc-pVTZ, QCISD/6-311++G(3d,3p) and QCISD/cc-pVTZ. The QCISD results are the most accurate of the four, with root mean square errors of 4.7 and 5.0 km mol⁻¹ for the 6-311++G(3d,3p) and cc-pVTZ basis sets, respectively. These values are close to the estimated aggregate experimental error of the hydrocarbon intensities, 4.0 km mol⁻¹. The atomic charge transfer-counter polarization effect is much larger than the charge effect at all four quantum levels. Charge transfer-counter polarization effects are expected to also be important in the vibrations of more polar molecules, for which equilibrium charge contributions can be large.
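For orientation on the partition named above, and only as a schematic (proportionality constants and unit conversions omitted): in the CCFDF picture each dipole-moment derivative with respect to a normal coordinate splits into charge (C), charge-flux (CF) and dipole-flux (DF) contributions, and the infrared intensity of mode i is proportional to the squared derivative, so it decomposes into pure and cross terms:

```latex
% Schematic CCFDF decomposition (constants omitted);
% C = equilibrium charge, CF = charge flux, DF = dipole (counter-polarization) flux.
A_i \;\propto\; \left|\frac{\partial \vec{p}}{\partial Q_i}\right|^{2},
\qquad
\frac{\partial \vec{p}}{\partial Q_i}
  = \left(\frac{\partial \vec{p}}{\partial Q_i}\right)^{\!C}
  + \left(\frac{\partial \vec{p}}{\partial Q_i}\right)^{\!CF}
  + \left(\frac{\partial \vec{p}}{\partial Q_i}\right)^{\!DF}
```

Expanding the square yields the pure C², CF² and DF² terms plus the C·CF, C·DF and CF·DF cross terms; the abstract's "contributions from terms involving charge flux and/or dipole flux" collect every term other than the pure charge term.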