66 results for Error-correcting codes (Information theory)
Abstract:
Information costs play a key role in determining the relative efficiency of alternative organisational structures. The choice of locations at which information is stored in a firm is an important determinant of its information costs. A specific example of information use is modelled in order to explore what factors determine whether information should be stored centrally or locally and if it should be replicated at different sites. This provides insights into why firms are structured hierarchically, with some decisions and tasks being performed centrally and others at different levels of decentralisation. The effects of new information technologies are also discussed. These can radically alter the patterns and levels of information costs within a firm and so can cause substantial changes in organisational structure.
Abstract:
Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends this research to all of the available issues of two leading IS journals with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt and represented as a mind-map. In the period 1990-2009, 681 incidents of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than prior research. In addition, codes were assigned to all issues of MIS Quarterly from commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective or a finding; this represented approximately 34 per cent of codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. Possibly the most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications.
It offers potential benefits to practitioners by providing insight into the use of intuition in IS management, for example by emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.
Abstract:
In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. Such a timing mismatch may cause rank deficiency of the conventional space-time codes and, thus, performance degradation. One efficient way to overcome this issue is the use of delay-tolerant space-time codes (DT-STCs). The existing DT-STCs are designed assuming that the transmitter has no knowledge about the channels. In this paper, we show how the performance of DT-STCs can be improved by utilizing some feedback information. A general framework for designing DT-STCs with limited feedback is first proposed, allowing for flexible system parameters such as the number of transmit/receive antennas, modulated symbols, and the length of codewords. Then, a new design method is proposed by combining Lloyd's algorithm and the stochastic gradient-descent algorithm to obtain an optimal codebook of STCs, particularly for systems with a linear minimum-mean-square-error receiver. Finally, simulation results confirm the performance of the newly designed DT-STCs with limited feedback.
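The Lloyd step at the heart of such codebook designs can be illustrated on a toy scalar source. The sketch below uses a plain Euclidean distortion rather than the pairwise-error metric a DT-STC design would optimise, and every name in it is hypothetical:

```python
import numpy as np

def lloyd_codebook(samples, num_codewords, iters=50, seed=0):
    """Generic Lloyd iteration: alternate a nearest-codeword partition
    of the training samples with a centroid update of each codeword."""
    rng = np.random.default_rng(seed)
    # Initialise the codebook from randomly chosen training samples.
    codebook = samples[rng.choice(len(samples), num_codewords, replace=False)]
    for _ in range(iters):
        # Partition: assign each sample to its nearest codeword.
        dists = np.abs(samples[:, None] - codebook[None, :])
        labels = dists.argmin(axis=1)
        # Update: move each codeword to the centroid of its region.
        for k in range(num_codewords):
            members = samples[labels == k]
            if len(members):
                codebook[k] = members.mean()
    return np.sort(codebook)

rng = np.random.default_rng(1)
samples = rng.normal(size=4000)   # toy Gaussian source, not channel data
cb = lloyd_codebook(samples, 4)
```

A real design would replace the Euclidean distortion with a code-performance metric and, as the paper does, interleave gradient-descent refinement of the codewords.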
Abstract:
This paper describes benchmark testing of six two-dimensional (2D) hydraulic models (DIVAST, DIVASTTVD, TUFLOW, JFLOW, TRENT and LISFLOOD-FP) in terms of their ability to simulate surface flows in a densely urbanised area. The models are applied to a 1·0 km × 0·4 km urban catchment within the city of Glasgow, Scotland, UK, and are used to simulate a flood event that occurred at this site on 30 July 2002. An identical numerical grid describing the underlying topography is constructed for each model, using a combination of airborne laser altimetry (LiDAR) fused with digital map data, and used to run a benchmark simulation. Two numerical experiments were then conducted to test the response of each model to topographic error and uncertainty over friction parameterisation. While all the models tested produce plausible results, subtle differences between particular groups of codes give considerable insight into both the practice and science of urban hydraulic modelling. In particular, the results show that the terrain data available from modern LiDAR systems are sufficiently accurate and resolved for simulating urban flows, but such data need to be fused with digital map data of building topology and land use to gain maximum benefit from the information contained therein. When such terrain data are available, uncertainty in friction parameters becomes a more dominant factor than topographic error for typical problems. The simulations also show that flows in urban environments are characterised by numerous transitions to supercritical flow and numerical shocks. However, the effects of these are localised and they do not appear to affect overall wave propagation. In contrast, inertia terms are shown to be important in this particular case, but the specific characteristics of the test site may mean that this does not hold more generally.
Abstract:
Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared-difference, and their correlation. However, when the model predictions are spatially distributed across a landscape the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface. 
The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent from a non-spatial validation. © 2007 Elsevier B.V. All rights reserved.
Abstract:
The extent to which the four-dimensional variational data assimilation (4DVAR) is able to use information about the time evolution of the atmosphere to infer the vertical spatial structure of baroclinic weather systems is investigated. The singular value decomposition (SVD) of the 4DVAR observability matrix is introduced as a novel technique to examine the spatial structure of analysis increments. Specific results are illustrated using 4DVAR analyses and SVD within an idealized 2D Eady model setting. Three different aspects are investigated. The first aspect considers correcting errors that result in normal-mode growth or decay. The results show that 4DVAR performs well at correcting growing errors but not decaying errors. Although it is possible for 4DVAR to correct decaying errors, the assimilation of observations can be detrimental to a forecast because 4DVAR is likely to add growing errors instead of correcting decaying errors. The second aspect shows that the singular values of the observability matrix are a useful tool to identify the optimal spatial and temporal locations for the observations. The results show that the ability to extract the time-evolution information can be maximized by placing the observations far apart in time. The third aspect considers correcting errors that result in nonmodal rapid growth. 4DVAR is able to use the model dynamics to infer some of the vertical structure. However, the specification of the case-dependent background error variances plays a crucial role.
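The SVD diagnostic can be mimicked in a few lines on a toy linear system. The matrices below are invented for illustration and bear no relation to the Eady model itself; the point is only that the singular spectrum of the stacked observation operator reveals which initial-condition directions the observations constrain:

```python
import numpy as np

# Toy analogue of the 4DVAR observability matrix: rows stack the
# observation operator applied to the tangent-linear propagator at each
# observation time; columns span the initial-condition space.
# All numerical values here are invented for illustration.
M = np.array([[1.0, 0.5],      # tangent-linear propagator over one step
              [0.0, 1.2]])
H = np.array([[1.0, 0.0]])     # observe only the first state variable

obs_matrix = np.vstack([H, H @ M, H @ M @ M])   # obs at t0, t1, t2

U, s, Vt = np.linalg.svd(obs_matrix, full_matrices=False)
# Large singular values mark initial-condition directions (rows of Vt)
# that the observation network constrains well; small ones mark
# directions the assimilation can barely correct.
```

Placing observations further apart in time changes the stacked propagators and hence the singular spectrum, which is the mechanism behind the paper's observation-placement result.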
Abstract:
The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed for such simple polyatomic molecules as may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated from a given force field and from given G-matrix elements, etc. The programme has been used on up to 5 × 5 secular equations, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme, and the possibility of reversing the direction of calculation, are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
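The secular-equation step of such a programme reduces to an eigenvalue problem, which modern code can sketch in a few lines. The G and F matrices below are illustrative numbers in assumed-consistent CGS units, not the constants of any real molecule:

```python
import numpy as np

# Secular-equation sketch: harmonic frequencies follow from the
# eigenvalues lambda_i of the G*F product, |GF - lambda*I| = 0, with
# lambda = (2*pi*c*nu)^2 when G and F are in consistent CGS units.
C_CM_S = 2.99792458e10          # speed of light, cm/s

def harmonic_wavenumbers(G, F):
    lam = np.linalg.eigvals(G @ F)        # eigenvalues of the GF matrix
    lam = np.sort(lam.real)[::-1]         # real and positive for PD G, F
    return np.sqrt(lam) / (2 * np.pi * C_CM_S)

# Illustrative 2x2 inverse-kinetic-energy (G) and force-constant (F)
# matrices; a 5x5 problem works identically.
G = np.array([[1.05, -0.06],
              [-0.06, 2.10]])
F = np.array([[5.0e5, 0.2e5],
              [0.2e5, 0.8e5]])
w = harmonic_wavenumbers(G, F)
```

"Reversing the direction of calculation" then amounts to adjusting F until the computed eigenvalues reproduce the observed frequencies.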
Abstract:
Targeted observations are generally taken in regions of high baroclinicity, but often show little impact. One plausible explanation is that important dynamical information, such as upshear tilt, is not extracted from the targeted observations by the data assimilation scheme and used to correct initial condition error. This is investigated by generating pseudo targeted observations which contain a singular vector (SV) structure that is not present in the background field or routine observations, i.e. assuming that the background has an initial condition error with tilted growing structure. Experiments were performed for a single case-study with varying numbers of pseudo targeted observations. These were assimilated by the Met Office four-dimensional variational (4D-Var) data assimilation scheme, which uses a 6 h window for observations and background-error covariances calculated using the National Meteorological Centre (NMC) method. The forecasts were run using the operational Met Office Unified Model on a 24 km grid. The results presented clearly demonstrate that a 6 h window 4D-Var system is capable of extracting baroclinic information from a limited set of observations and using it to correct initial condition error. To capture the SV structure well (projection of 0.72 in total energy), 50 sondes over an area of 1 × 10⁶ km² were required. When the SV was represented by only eight sondes along an example targeting flight track covering a smaller area, the projection onto the SV structure was lower; the resulting forecast perturbations showed an SV structure with increased tilt and reduced initial energy. The total energy contained in the perturbations decreased as the SV structure was less well described by the set of observations (i.e. as fewer pseudo observations were assimilated). The assimilated perturbation had lower energy than the SV unless the pseudo observations were assimilated with the dropsonde observation errors halved from operational values.
Copyright © 2010 Royal Meteorological Society
Abstract:
Slantwise convective available potential energy (SCAPE) is a measure of the degree to which the atmosphere is unstable to conditional symmetric instability (CSI). It has, until now, been defined by parcel theory in which the atmosphere is assumed to be nonevolving and balanced, that is, two-dimensional. When applying this two-dimensional theory to three-dimensional evolving flows, these assumptions can be interpreted as an implicit assumption that a timescale separation exists between a relatively rapid timescale for slantwise ascent and a slower timescale for the development of the system. An approximate extension of parcel theory to three dimensions is derived and it is shown that calculations of SCAPE based on the assumption of relatively rapid slantwise ascent can be qualitatively in error. For a case study example of a developing extratropical cyclone, calculations of SCAPE along trajectories determined without assuming the existence of the timescale separation show large SCAPE values for parcels ascending from the warm sector and along the warm front. These parcels ascend into the cloud head, within which there is some evidence consistent with the release of CSI from observational and model cross sections. This region of high SCAPE was not found for calculations along the relatively rapidly ascending trajectories determined by assuming the existence of the timescale separation.
Abstract:
This article assesses the extent to which sampling variation affects findings about Malmquist productivity change derived using data envelopment analysis (DEA), in the first stage by calculating productivity indices and in the second stage by investigating the farm-specific change in productivity. Confidence intervals for Malmquist indices are constructed using Simar and Wilson's (1999) bootstrapping procedure. The main contribution of this article is to account in the second stage for the information provided by the first-stage bootstrap. The DEA SEs of the Malmquist indices given by bootstrapping are employed in an innovative heteroscedastic panel regression, using a maximum likelihood procedure. The application is to a sample of 250 Polish farms over the period 1996 to 2000. The confidence intervals' results suggest that the second half of the 1990s for Polish farms was characterized not so much by productivity regress but rather by stagnation. As for the determinants of farm productivity change, we find that the integration of the DEA SEs in the second-stage regression is significant in explaining a proportion of the variance in the error term. Although our heteroscedastic regression results differ from those of the standard OLS, in terms of significance and sign, they are consistent with theory and previous research.
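The resampling idea behind those confidence intervals can be sketched generically. The snippet below implements only a plain percentile bootstrap on hypothetical index values; Simar and Wilson's (1999) procedure additionally uses a smoothed resampling scheme tailored to DEA efficiency scores:

```python
import numpy as np

def bootstrap_ci(values, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic:
    resample with replacement, recompute the statistic, and take the
    alpha/2 and 1 - alpha/2 quantiles of the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    n = len(values)
    boots = np.array([stat(rng.choice(values, n, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical Malmquist indices for a small farm sample (values near
# 1 indicate stagnation rather than productivity regress).
indices = np.array([0.97, 1.01, 0.95, 1.03, 0.99, 0.98, 1.00, 0.96])
lo, hi = bootstrap_ci(indices)
```

The bootstrap standard deviations produced this way are what the second-stage heteroscedastic regression would take as its error-variance information.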
Abstract:
Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills for successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. Consistent with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potential adaptive role of vigilance for threat as a way of allocating a limited capacity to interpret others' mental states to obtain as much information as possible about potential danger in the social environment.
Abstract:
In 1997, the UK implemented the world's first commercial digital terrestrial television system. Under the ETS 300 744 standard, the chosen modulation method, COFDM, is assumed to be multipath resilient. Previous work has shown that this is not necessarily the case. It has been shown that the local oscillator required for demodulation from intermediate frequency to baseband must be very accurate. This paper shows that under multipath conditions, standard methods for obtaining local oscillator phase lock may not be adequate. This paper demonstrates a set of algorithms, designed for use with a simple local oscillator circuit, which allow correction for local oscillator phase offset so as to maintain a low bit error rate in the presence of multipath.
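A common building block for such algorithms is pilot-aided phase estimation: the receiver averages the conjugate product of received and known pilot carriers to obtain a single phase-offset estimate. The sketch below assumes a flat (multipath-free) channel and synthetic data, so it illustrates only the basic correction step, not the paper's multipath-robust algorithms:

```python
import numpy as np

def estimate_phase_offset(received_pilots, known_pilots):
    """Estimate a common local-oscillator phase offset as the angle of
    the summed conjugate product of received and transmitted pilots."""
    return np.angle(np.sum(received_pilots * np.conj(known_pilots)))

rng = np.random.default_rng(0)
known = np.exp(1j * rng.uniform(0, 2 * np.pi, 64))   # unit-magnitude pilots
true_offset = 0.3                                     # radians (assumed value)
noise = 0.01 * (rng.normal(size=64) + 1j * rng.normal(size=64))
received = known * np.exp(1j * true_offset) + noise

est = estimate_phase_offset(received, known)
corrected = received * np.exp(-1j * est)              # de-rotate the carriers
```

Under multipath, each carrier acquires its own channel phase, which is why a single common estimate of this kind breaks down and per-carrier or tracking-loop methods are needed.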
Abstract:
A novel optimising controller is designed that leads a slow process from a sub-optimal operational condition to the steady-state optimum in a continuous way based on dynamic information. Using standard results from optimisation theory and discrete optimal control, the solution of a steady-state optimisation problem is achieved by solving a receding-horizon optimal control problem which uses derivative and state information from the plant via a shadow model and a state-space identifier. The paper analyses the steady-state optimality of the procedure, develops algorithms with and without control rate constraints, and applies the procedure to a high-fidelity simulation study of a distillation column optimisation.
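The rate-constrained move at each receding-horizon step can be caricatured in a few lines. This toy replaces the plant, shadow model and state-space identifier with a known quadratic steady-state objective, so it shows only how a rate-limited descent drives the input continuously toward the steady-state optimum; all names and numbers are invented:

```python
import numpy as np

def receding_horizon_step(u, grad, step=0.1, rate_limit=0.05):
    """One receding-horizon move: take a gradient step on the
    steady-state objective, clipped by a control-rate constraint, and
    apply only this first move before re-solving at the next sample."""
    delta = -step * grad(u)
    delta = np.clip(delta, -rate_limit, rate_limit)
    return u + delta

# Toy steady-state objective J(u) = (u - 2)^2, optimum at u* = 2.
grad = lambda u: 2 * (u - 2.0)
u = np.array([0.0])
for _ in range(200):          # repeated application mimics the receding horizon
    u = receding_horizon_step(u, grad)
```

In the paper's setting the gradient would come from the shadow model and identified state-space dynamics rather than from a known objective.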
Abstract:
Twenty-first-century challenges facing agriculture include climate change, threats to food security for a growing population and downward economic pressures on rural livelihoods. Addressing these challenges will require innovation in extension theory, policy and education, at a time when the dominance of the state in the provision of knowledge and information services to farmers and rural entrepreneurs continues to decline. This paper suggests that extension theory is catching up with and helping us to understand innovative extension practice, and therefore provides a platform for improving rural development policies and strategies. Innovation is now less likely to be spoken of as something to be passed on to farmers than as a continuing process of creativity and adaptation that can be nurtured and sustained. Innovation systems and innovation platforms are concepts that recognise the multiple factors that lead to farmers' developing, adapting and applying new ideas, and the importance of linking all actors in the value chain to ensure producers can access appropriate information and advice for decision making at all stages in the production process. Concepts of social learning, group development and solidarity, social capital, collective action and empowerment all help to explain, and therefore to apply more effectively, group extension approaches in building confidence and sustaining innovation. A challenge facing educators is to ensure the curricula for aspiring extension professionals in our higher education institutions are regularly reviewed and kept up to date with current and future developments in theory, policy and practice.