80 results for Error-correcting codes (Information theory)
Abstract:
We investigate the role of information in the internationalization of small and medium enterprises (SMEs). Information internalization is a fundamental antecedent to SME internationalization and is increasingly being facilitated by recent trends. We offer a conceptual explanation and related propositions on information internalization, emphasizing hurdle rate theory for ascertaining the acceptability of firms' internationalization projects.
Abstract:
We show that quantum feedback control can be used as a quantum-error-correction process for errors induced by a weak continuous measurement. In particular, when the error model is restricted to one, perfectly measured, error channel per physical qubit, quantum feedback can act to perfectly protect a stabilizer codespace. Using the stabilizer formalism we derive an explicit scheme, involving feedback and an additional constant Hamiltonian, to protect an (n-1)-qubit logical state encoded in n physical qubits. This works for both Poisson (jump) and white-noise (diffusion) measurement processes. Universal quantum computation is also possible in this scheme. As an example, we show that detected-spontaneous-emission error correction with a driving Hamiltonian can greatly reduce the amount of redundancy required to protect a state from that which has been previously postulated [e.g., Alber et al., Phys. Rev. Lett. 86, 4402 (2001)].
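The abstract leaves the feedback protocol to the paper itself, but the stabilizer bookkeeping it rests on is easy to illustrate. Below is a minimal sketch of syndrome extraction in the binary symplectic representation of Pauli operators, using the three-qubit bit-flip code (stabilizers ZZI, IZZ) as a stand-in toy example; the code and error set are illustrative choices, not the scheme derived in the paper.

```python
# Minimal sketch of stabilizer syndrome extraction via the binary
# symplectic representation.  Toy code: three-qubit bit-flip code.
import numpy as np

def pauli_to_symplectic(p):
    """Map a Pauli string, e.g. 'ZZI', to its (x|z) binary vector."""
    x = np.array([c in 'XY' for c in p], dtype=int)
    z = np.array([c in 'ZY' for c in p], dtype=int)
    return np.concatenate([x, z])

def commutes(p, q):
    """Two Paulis commute iff their symplectic inner product is 0 (mod 2)."""
    n = len(p) // 2
    return (p[:n] @ q[n:] + p[n:] @ q[:n]) % 2 == 0

stabilizers = [pauli_to_symplectic(s) for s in ('ZZI', 'IZZ')]

# Syndrome of a single bit-flip error on each qubit; each error
# anticommutes with a distinct subset of stabilizers, so the
# syndromes are distinct and the error is identifiable.
for err in ('XII', 'IXI', 'IIX'):
    e = pauli_to_symplectic(err)
    syndrome = tuple(0 if commutes(s, e) else 1 for s in stabilizers)
    print(err, '->', syndrome)
```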
Abstract:
This paper presents a method for estimating the posterior probability density of the cointegrating rank of a multivariate error correction model. A second contribution is the careful elicitation of the prior for the cointegrating vectors, derived from a prior on the cointegrating space. This prior arises naturally from treating the cointegrating space as the parameter of interest in inference, and overcomes problems previously encountered in Bayesian cointegration analysis. Using this new prior and the Laplace approximation, an estimator for the posterior probability of the rank is given. The approach performs well compared with information criteria in Monte Carlo experiments. (C) 2003 Elsevier B.V. All rights reserved.
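The Bayesian machinery described above (the prior on the cointegrating space and the Laplace approximation to the rank posterior) is specific to the paper and is not reproduced here. As a frequentist point of comparison for the same task, the sketch below selects the cointegrating rank of a simulated bivariate system using Johansen's trace test via statsmodels; the data-generating process and all numbers are illustrative.

```python
# Rank selection for an error correction model via Johansen's trace
# test -- a classical stand-in for the paper's Bayesian rank posterior.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import select_coint_rank

rng = np.random.default_rng(0)
T = 500
# Two I(1) series sharing one common stochastic trend,
# so the true cointegrating rank is 1.
trend = np.cumsum(rng.normal(size=T))
y1 = trend + rng.normal(scale=0.5, size=T)
y2 = 0.7 * trend + rng.normal(scale=0.5, size=T)
data = np.column_stack([y1, y2])

res = select_coint_rank(data, det_order=0, k_ar_diff=1, method='trace')
print(res.rank)   # expected: 1
```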
Abstract:
Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set-theoretic notation, as it would have been difficult, if not impossible, to formulate such an objection at the algorithmic level.
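A toy rendering of the distinction drawn above may help. Under the axiom of foundation a set cannot be a member of itself, so a context cannot literally be an item it contains; letting the item serve as a label for the context sidesteps this. The names and structures below are purely illustrative.

```python
# Well-founded memory binding: the item labels the context rather
# than being identical to it.
item = "list_item_7"

# Disallowed reading: context identical to one of its own members.
# context = {item, context}   # not constructible; violates foundation

# Allowed reading: the context is a distinct object carrying the
# item's label.
context = frozenset({("labelled_by", item), ("position", "mid_list")})
binding = frozenset({item, context})
print(item in binding, context in binding)   # True True
```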
Abstract:
We propose two quantum error-correction schemes which increase the maximum storage time for qubits in a system of cold-trapped ions, using a minimal number of ancillary qubits. Both schemes consider only the errors introduced by the decoherence due to spontaneous emission from the upper levels of the ions. Continuous monitoring of the ion fluorescence is used in conjunction with selective coherent feedback to eliminate these errors immediately following spontaneous emission events.
Abstract:
Experimental data for E. coli debris size reduction during high-pressure homogenisation at 55 MPa are presented. A mathematical model based on grinding theory is developed to describe the data. The model is based on first-order breakage and compensation conditions. It does not require any assumption of a specified distribution for debris size and can be used given information on the initial size distribution of whole cells and the disruption efficiency during homogenisation. The number of homogeniser passes is incorporated into the model and used to describe the size reduction of non-induced stationary and induced E. coli cells during homogenisation. Fitting the model equations to the data gave an excellent fit (>98.7% of variance explained for both fermentations), confirming the model's potential for predicting size reduction during high-pressure homogenisation. This study provides a means to optimise both homogenisation and disc-stack centrifugation conditions for recombinant product recovery. (C) 1997 Elsevier Science Ltd.
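The abstract does not reproduce the model equations, so the sketch below only illustrates the flavour of first-order breakage kinetics in the number of passes: assume the mean debris size decays exponentially with pass number towards a limiting size, and fit the rate constant. The functional form, parameter values, and data are invented for illustration, not taken from the paper.

```python
# Toy first-order breakage fit: d(N) = d_inf + (d0 - d_inf) * exp(-k*N),
# where N is the homogeniser pass number.
import numpy as np
from scipy.optimize import curve_fit

def mean_size(N, d0, d_inf, k):
    """Exponential decay of mean debris size with pass number."""
    return d_inf + (d0 - d_inf) * np.exp(-k * N)

# Hypothetical data: mean debris size (um) after 0..5 passes.
passes = np.arange(6)
size_um = np.array([1.60, 1.05, 0.78, 0.62, 0.55, 0.51])

p_opt, _ = curve_fit(mean_size, passes, size_um, p0=(1.6, 0.5, 0.5))
d0, d_inf, k = p_opt
print(f"d0={d0:.2f} um, d_inf={d_inf:.2f} um, k={k:.2f} per pass")
```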
Abstract:
The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality-restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
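The three estimators compared in the abstract are easy to sketch on simulated data: ML probit, Bayes with a flat prior, and Bayes with a flat prior truncated so a slope coefficient is positive. The random-walk Metropolis sampler and the data-generating process below are illustrative choices, not the authors' specification.

```python
# ML probit vs. Bayesian probit with flat and sign-restricted priors.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([-0.3, 0.8])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

# (i) Maximum likelihood.
ml = sm.Probit(y, X).fit(disp=0)
print("ML:", ml.params)

def log_like(beta):
    p = np.clip(norm.cdf(X @ beta), 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def metropolis(log_post, beta0, steps=20000, scale=0.1):
    """Random-walk Metropolis; returns post-burn-in draws."""
    beta, lp, draws = beta0.copy(), log_post(beta0), []
    for _ in range(steps):
        prop = beta + scale * rng.normal(size=beta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        draws.append(beta.copy())
    return np.array(draws[steps // 2:])

# (ii) Flat prior: posterior proportional to the likelihood.
flat = metropolis(log_like, np.zeros(2))

# (iii) Flat prior truncated to enforce a positive slope.
def log_post_trunc(beta):
    return log_like(beta) if beta[1] > 0 else -np.inf

trunc = metropolis(log_post_trunc, np.array([0.0, 0.1]))
print("flat prior:", flat.mean(axis=0), "| truncated:", trunc.mean(axis=0))
```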
Abstract:
After outlining some relevant background information about the NT crocodile farming industry and explaining the purpose of our survey of NT crocodile farmers conducted in the first half of 2005, this paper reports the results of the survey. The survey information is supplemented by data from secondary sources. This report covers the location of respondents; the size of crocodile farms; farmers' stated knowledge of and attitudes towards the NT Crocodile Management Plan; the involvement of farms in the harvesting of crocodile eggs and the costs involved; views of crocodile farmers about whether the NT Crocodile Management Plan encourages landholders to conserve crocodiles, and their perceptions of the benefits to landholders; predicted production trends and trends in the number of farms operating in the NT; and the economic characteristics of crocodile farms producing in the NT, including the economic advantages and disadvantages of crocodile farming in the NT. Concluding comments provide, amongst other things, an overview of the structure of the crocodile farming industry in the NT gleaned from data available from the NT Government's Department of Business, Industry and Resource Development.
Abstract:
An important feature of some conceptual modelling grammars is the constructs they provide to allow database designers to show that real-world things may or may not possess a particular attribute or relationship. In the entity-relationship model, for example, the fact that a thing may not possess an attribute can be represented by using a special symbol to indicate that the attribute is optional. Similarly, the fact that a thing may or may not be involved in a relationship can be represented by showing the minimum cardinality of the relationship as zero. Whether these practices should be followed, however, is a contentious issue. An alternative approach is to eliminate optional attributes and relationships from conceptual schema diagrams by using subtypes that have only mandatory attributes and relationships. In this paper, we first present a theory that led us to predict that optional attributes and relationships should be used in conceptual schema diagrams only when users of the diagrams require a surface-level understanding of the domain being represented. When users require a deep-level understanding, however, optional attributes and relationships should not be used, because they undermine users' ability to grasp important domain semantics. We then describe three experiments undertaken to test our predictions. The results of the experiments support our predictions.
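The two modelling styles contrasted in the abstract can be transposed into type declarations. In the sketch below, a class with an optional field mirrors the optional-attribute style, while the subtype pair underneath mirrors the mandatory-attributes-only style; the domain (employees and company cars) is invented for illustration.

```python
# Optional attribute vs. subtypes with mandatory attributes.
from dataclasses import dataclass
from typing import Optional

# Style 1: one entity with an optional attribute.
@dataclass
class Employee:
    name: str
    company_car: Optional[str]   # None when the employee has no car

# Style 2: optionality eliminated via subtypes.
@dataclass
class EmployeeBase:
    name: str

@dataclass
class EmployeeWithCar(EmployeeBase):
    company_car: str             # mandatory in this subtype

@dataclass
class EmployeeWithoutCar(EmployeeBase):
    pass                         # absence of a car is explicit in the type
```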
Abstract:
Codes $C_1,\dots,C_M$ of length $n$ over $\mathbb{F}_q$ and an $M \times N$ matrix $A$ over $\mathbb{F}_q$ define a matrix-product code $C = [C_1 \cdots C_M] \cdot A$ consisting of all matrix products $[c_1 \cdots c_M] \cdot A$. This generalizes the $(u \mid u+v)$-, $(u+v+w \mid 2u+v \mid u)$-, $(a+x \mid b+x \mid a+b+x)$-, $(u+v \mid u-v)$-, etc. constructions. We study matrix-product codes using linear algebra. This provides a basis for a unified analysis of $|C|$, of $d(C)$, the minimum Hamming distance of $C$, and of $C^\perp$. It also reveals an interesting connection with MDS codes. We determine $|C|$ when $A$ is non-singular. To lower-bound $d(C)$, we need $A$ to be 'non-singular by columns' (NSC). We investigate NSC matrices. We show that Generalized Reed-Muller codes are iterative NSC matrix-product codes, generalizing the construction of Reed-Muller codes, as are the ternary 'Main Sequence codes'. We obtain a simpler proof of the minimum Hamming distance of such families of codes. If $A$ is square and NSC, $C^\perp$ can be described using $C_1^\perp,\dots,C_M^\perp$ and a transformation of $A$. This yields $d(C^\perp)$. Finally we show that an NSC matrix-product code is a generalized concatenated code.
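The construction is concrete enough to sketch directly. Specialising to $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ over $\mathbb{F}_2$ recovers the classical $(u \mid u+v)$ construction; taking $C_1$ as the $[4,3]$ even-weight code and $C_2$ as the $[4,1]$ repetition code yields the $[8,4,4]$ extended Hamming code. The brute-force distance check below is for illustration only.

```python
# Matrix-product code C = [C1 C2] . A over F2 with the (u|u+v) matrix.
import itertools
import numpy as np

G1 = np.array([[1, 0, 0, 1],
               [0, 1, 0, 1],
               [0, 0, 1, 1]])   # generator of the [4,3] even-weight code
G2 = np.array([[1, 1, 1, 1]])   # generator of the [4,1] repetition code
A = np.array([[1, 1],
              [0, 1]])          # non-singular by columns (NSC)

def codewords(G):
    k = G.shape[0]
    return [np.mod(m @ G, 2) for m in itertools.product([0, 1], repeat=k)]

# Block j of a matrix-product codeword is sum_i A[i, j] * c_i (mod 2).
code = []
for c1 in codewords(G1):
    for c2 in codewords(G2):
        blocks = [(A[0, j] * c1 + A[1, j] * c2) % 2 for j in range(2)]
        code.append(np.concatenate(blocks))

d = min(int(w.sum()) for w in code if w.any())
print(len(code), d)   # 16 codewords, minimum distance 4
```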
Abstract:
In population pharmacokinetic studies, the precision of parameter estimates depends on the population design. Methods based on the Fisher information matrix have been developed and extended to population studies to evaluate and optimize designs. In this paper we propose simple programming tools to evaluate population pharmacokinetic designs. This involved the development of an expression for the Fisher information matrix for nonlinear mixed-effects models, including estimation of the variance of the residual error. We implemented this expression as a generic function for two software applications: S-PLUS and MATLAB. The evaluation of population designs based on two pharmacokinetic examples from the literature is shown to illustrate the efficiency and simplicity of this theoretical approach. Although no method for optimizing the design is provided, these functions can be used to select and compare population designs among a large set of candidates, avoiding extensive simulation.
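The paper's expression covers nonlinear mixed-effects models; the sketch below is a much-simplified version of the same idea with no random effects, scoring a sampling design by the log-determinant of the Fisher information matrix for a one-compartment oral-absorption model with additive residual error. Parameter values, sampling times, and the model itself are invented for illustration.

```python
# Design evaluation via a simplified Fisher information matrix:
# FIM = J^T J / sigma^2, with J the sensitivity matrix.
import numpy as np

def conc(t, ka, ke, V, dose=100.0):
    """One-compartment model with first-order absorption."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def fim(times, theta, sigma=0.1, h=1e-6):
    """Sensitivities by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    J = np.empty((len(times), len(theta)))
    for j in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[j] += h
        tm[j] -= h
        J[:, j] = (conc(times, *tp) - conc(times, *tm)) / (2 * h)
    return J.T @ J / sigma**2

theta = (1.5, 0.2, 20.0)                     # ka (1/h), ke (1/h), V (L)
design_a = np.array([0.5, 1.0, 4.0, 12.0])   # spread-out sampling times (h)
design_b = np.array([0.25, 0.5, 0.75, 1.0])  # early-only design

for d in (design_a, design_b):
    sign, logdet = np.linalg.slogdet(fim(d, theta))
    print(d, "log det FIM =", round(logdet, 2))
# The spread-out design should dominate the early-only design.
```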
Abstract:
In this paper, we develop a theory for the diffusion and flow of pure sub-critical adsorbates in microporous activated carbon over a wide range of pressure, from very low pressure up to pressures at which capillary condensation occurs. The theory requires no fitting parameter; the only information needed for prediction is the complete pore size distribution of the activated carbon. Various interesting behaviours of permeability versus loading are observed, such as a maximum in permeability at high loading (at about 0.8-0.9 relative pressure). The theory is tested against the diffusion and flow of benzene through a commercial activated carbon, and the agreement is found to be very good, particularly given that the model contains no fitting parameter. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
Internationalisation occurs when the firm expands its selling, production, or other business activities into international markets. Many enterprises, especially small- and medium-size firms (SMEs), are internationalising today at an unprecedented rate. Managers are strategically using information to achieve degrees of internationalisation previously considered the domain of large firms. We extend existing explanations of firm internationalisation by examining the nature and fundamental, antecedent role of internalising appropriate information and translating it into relevant knowledge. Based on case studies of internationalising firms, we advance a conceptualisation of information internalisation and knowledge creation within the firm as it achieves internationalisation readiness. In the process, we offer several propositions intended to guide future research. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by examining how those factors affect the system's stability. The research builds on a previously developed two-stage model of information system change, whereby an information system is either in a stable state of evolution, in which its functionality is evolving, or in a state of revolution, in which it is being replaced because it no longer provides the functionality its users expect. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (the generation of the language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
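The volatility index itself is defined in the underlying paper rather than in the abstract, so the sketch below treats it as a given per-system measurement and only illustrates the kind of association test described: regressing the index on system characteristics such as size and age, across a portfolio of 40 systems. All data are invented.

```python
# Illustrative association tests between a (given) volatility index
# and system characteristics, across 40 hypothetical systems.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_systems = 40
size_kloc = rng.lognormal(mean=3.0, sigma=0.8, size=n_systems)
age_years = rng.uniform(1, 10, size=n_systems)
# Hypothetical index: larger, older systems drift toward instability.
volatility = 0.02 * size_kloc + 0.05 * age_years \
             + rng.normal(0, 0.3, n_systems)

for name, x in (("size", size_kloc), ("age", age_years)):
    r = stats.linregress(x, volatility)
    print(f"{name}: slope={r.slope:.3f}, p={r.pvalue:.3g}")
```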