931 results for Space-time codes (STCs)
Abstract:
This work considers a nonlinear time-varying system described by a state representation, with input u and state x. A given set of functions v, which is not necessarily the original input u of the system, is the (new) input candidate. The main result provides necessary and sufficient conditions for the existence of a local classical state-space representation with input v. These conditions rely on integrability tests based on a derived flag. As a byproduct, one obtains a sufficient condition for differential flatness of nonlinear systems. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
The question raised by researchers in the field of mathematical biology regarding the existence of error-correcting codes in the structure of DNA sequences is answered positively. It is shown, for the first time, that DNA sequences, such as protein-coding sequences, targeting sequences and internal sequences, can be identified as codewords of BCH codes over Galois fields.
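To make the abstract's central notion concrete, the sketch below checks whether a DNA string is a codeword of a linear code over GF(4) by computing its syndrome against a parity-check matrix. The base-to-symbol map and the single-parity-check code used here are illustrative assumptions, not the paper's BCH construction; only the field arithmetic and the "zero syndrome means codeword" test carry over.

```python
# DNA bases mapped to GF(4) = {0, 1, x, x+1}; the labelling A->0, C->1,
# G->2, T->3 is an assumption for illustration.
BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

# GF(4) arithmetic: addition is bitwise XOR; multiplication from the field table.
GF4_MUL = [
    [0, 0, 0, 0],
    [0, 1, 2, 3],
    [0, 2, 3, 1],
    [0, 3, 1, 2],
]

def gf4_add(a, b):
    return a ^ b

def syndrome(seq, H):
    """Syndrome H c over GF(4); an all-zero syndrome means seq is a codeword."""
    c = [BASE[b] for b in seq]
    syn = []
    for row in H:
        s = 0
        for h, ci in zip(row, c):
            s = gf4_add(s, GF4_MUL[h][ci])
        syn.append(s)
    return syn

# Toy [4, 3] single-parity-check code over GF(4) -- a stand-in for a BCH
# parity-check matrix, which would have more rows and a cyclic structure.
H = [[1, 1, 1, 1]]

print(syndrome("ACGT", H))  # [0]: 0 XOR 1 XOR 2 XOR 3 = 0, so a codeword
print(syndrome("ACGA", H))  # [3]: nonzero syndrome, not a codeword
```

A real BCH test would use a generator polynomial over an extension field, but the decision rule is the same zero-syndrome check shown here.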
Abstract:
The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
Abstract:
Real-time security assessment is one of the fundamental modules of electricity markets. Typically, when a contingency occurs, the security assessment and enhancement module must be ready for action within about 20 minutes to meet the real-time requirement. The recent California blackout again highlighted the importance of system security. This paper proposes an approach to power system security assessment and enhancement based on information provided from a pre-defined system parameter space. The proposed scheme opens up an efficient way for real-time security assessment and enhancement in a competitive electricity market for the single-contingency case.
Abstract:
An order-of-magnitude gain in sensitivity is described for using quasar spectra to investigate possible time or space variation in the fine-structure constant alpha. Applied to a sample of 30 absorption systems spanning redshifts 0.5 < z < 1.6, we derive limits on variations in alpha over a wide range of epochs. For the whole sample, Delta alpha/alpha = (-1.1 +/- 0.4) x 10^-5. This deviation is dominated by measurements at z > 1, where Delta alpha/alpha = (-1.9 +/- 0.5) x 10^-5. For z < 1, Delta alpha/alpha = (-0.2 +/- 0.4) x 10^-5. While this is consistent with a time-varying alpha, further work is required to explore possible systematic errors in the data, although careful searches have so far revealed none.
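Subsample results like those quoted above are conventionally combined by inverse-variance weighting, sketched below with the paper's z < 1 and z > 1 values as inputs. Note this is illustrative arithmetic only: the combined value need not reproduce the quoted whole-sample number, since the paper weights individual absorbers rather than the two subsample averages.

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of independent estimates and its error."""
    w = [1.0 / s**2 for s in sigmas]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    sigma = (1.0 / sum(w)) ** 0.5
    return mean, sigma

# Delta alpha/alpha for z < 1 and z > 1, as quoted in the abstract.
mean, sigma = weighted_mean([-0.2e-5, -1.9e-5], [0.4e-5, 0.5e-5])
print(f"Delta alpha/alpha = {mean:.2e} +/- {sigma:.2e}")
```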
Abstract:
This note considers continuous-time Markov chains whose state space consists of an irreducible class, C, and an absorbing state which is accessible from C. The purpose is to provide results on mu-invariant and mu-subinvariant measures where absorption occurs with probability less than one. In particular, the well-known premise that the mu-invariant measure, m, for the transition rates be finite is replaced by the more natural premise that m be finite with respect to the absorption probabilities. The relationship between mu-invariant measures and quasi-stationary distributions is discussed. (C) 2000 Elsevier Science Ltd. All rights reserved.
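For reference, the standard definitions behind these terms, as used in the quasi-stationarity literature (stated here from general background, not quoted from the paper): for a transition rate matrix Q = (q_ij) and mu >= 0, a measure m = (m_j : j in C) is mu-invariant for Q over C if

```latex
\sum_{i \in C} m_i \, q_{ij} = -\mu\, m_j, \qquad j \in C,
```

and mu-subinvariant if the equality is relaxed to

```latex
\sum_{i \in C} m_i \, q_{ij} \le -\mu\, m_j, \qquad j \in C.
```

When mu > 0 and m is normalizable, such measures are the candidates for quasi-stationary distributions, which is the relationship the note discusses.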
Abstract:
Image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach for solving the problem is to minimize an error functional using an iterative method. In this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple and low-cost technique (quadrant-searching) to reduce the search space and consequently to obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. Comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. For this purpose, we initially performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to verify the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
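The quadrant-searching idea reduces to a four-way comparison, sketched below: evaluate the error functional for a trial inclusion placed in each quadrant and keep the quadrant with the smallest error as the starting guess. The error functional here is a synthetic stand-in (squared distance to a hidden "true" inclusion centre); in the paper it would come from a BEM solution of the EIT forward problem.

```python
# Trial inclusion centres, one per quadrant of a unit-ish domain (assumed layout).
QUADRANT_CENTRES = {
    "Q1": (0.5, 0.5), "Q2": (-0.5, 0.5),
    "Q3": (-0.5, -0.5), "Q4": (0.5, -0.5),
}

def quadrant_search(error_functional):
    """Return the quadrant whose trial inclusion minimizes the error functional."""
    errors = {q: error_functional(c) for q, c in QUADRANT_CENTRES.items()}
    best = min(errors, key=errors.get)
    return best, errors

# Synthetic example: the true inclusion sits in the third quadrant.
true_centre = (-0.4, -0.6)
err = lambda c: (c[0] - true_centre[0])**2 + (c[1] - true_centre[1])**2
best, errors = quadrant_search(err)
print(best)  # the quadrant closest to the true inclusion
```

The winning quadrant then seeds the iterative minimization, which is what keeps the full search out of distant local minima.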
Abstract:
Constructing a veridical spatial map by touch poses at least two problems for a perceptual system. First, as the hand is moved through space, the locations of features may be displaced if there is an uncorrected lag between the moment the hand encounters a feature and the time that feature is encoded on a spatial map. Second, due to the sequential nature of the process, some form of memory, which itself may be subject to spatial distortions, is required for integration of spatial samples. We investigated these issues using a task involving active haptic exploration with a stylus swept back and forth in the horizontal plane at the wrist. Remembered locations of tactile targets were shifted towards the medial axis of the forearm, suggesting a central tendency in haptic spatial memory, while evidence for a displacement of perceived locations in the direction of sweep motion was consistent with processing delays.
Abstract:
When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
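The augmentation device described above can be sketched as follows: a time-varying constraint r_t' b_t = q_t on the coefficients is appended to the observation equation as an extra, nearly noise-free measurement before each Kalman update. This is a sketch under stated assumptions, not the paper's code: the random-walk state equation, the simulated data, and the constraint "coefficients sum to one" are all illustrative choices.

```python
import numpy as np

def constrained_kf_step(b, P, x, y, r, q, sig2_y, sig2_state, sig2_c=1e-8):
    """One Kalman step for random-walk coefficients with an augmented observation."""
    k = len(b)
    P = P + sig2_state * np.eye(k)     # time update: b_t = b_{t-1} + w_t
    Z = np.vstack([x, r])              # augmented design: data row + constraint row
    obs = np.array([y, q])
    R = np.diag([sig2_y, sig2_c])      # tiny sig2_c keeps S numerically invertible
    S = Z @ P @ Z.T + R
    K = P @ Z.T @ np.linalg.inv(S)     # Kalman gain
    b = b + K @ (obs - Z @ b)
    P = P - K @ Z @ P
    return b, P

rng = np.random.default_rng(0)
true_b = np.array([0.3, 0.7])          # satisfies the constraint b1 + b2 = 1
b, P = np.zeros(2), 10.0 * np.eye(2)
for _ in range(100):
    x = rng.normal(size=2)
    y = x @ true_b + 0.1 * rng.normal()
    b, P = constrained_kf_step(b, P, x, y, np.ones(2), 1.0, 0.1**2, 1e-6)
print(b, b.sum())
```

Because the constraint row re-enters the filter at every step, the estimates track a constraint that changes through time just as easily as the fixed one used here.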
Abstract:
Background and objective: The time course of cardiopulmonary alterations after pulmonary embolism has not been clearly demonstrated, nor has the role of systemic inflammation in the pathogenesis of the disease. This study aimed to evaluate, over 12 h, the effects of pulmonary embolism caused by polystyrene microspheres on haemodynamics, lung mechanics, gas exchange and interleukin-6 production. Methods: Ten large white pigs (weight 35-42 kg) had arterial and pulmonary catheters inserted, and pulmonary embolism was induced in five pigs by injection of polystyrene microspheres (diameter approximately 300 mu m) until mean pulmonary arterial pressure reached twice the baseline value. Five other animals received only saline. Haemodynamic and respiratory data and pressure-volume curves of the respiratory system were collected. Bronchoscopy was performed before and 12 h after embolism, when the animals were euthanized. Results: The embolism group developed hypoxaemia that was not corrected with high oxygen fractions, as well as higher values of dead space and airway resistance and lower respiratory compliance. Acute haemodynamic alterations included pulmonary arterial hypertension with preserved systemic arterial pressure and cardiac index. These derangements persisted until the end of the experiments. Plasma interleukin-6 concentrations were similar in both groups; however, an increase in core temperature and a nonsignificantly higher concentration of bronchoalveolar lavage proteins were found in the embolism group. Conclusion: Acute pulmonary embolism induced by polystyrene microspheres in pigs produces hypoxaemia lasting 12 h and a high dead space associated with high airway resistance and low compliance. There were no plasma systemic markers of inflammation, but a higher core temperature and a trend towards higher bronchoalveolar lavage proteins were found. Eur J Anaesthesiol 27:67-76. (C) 2010 European Society of Anaesthesiology.
Abstract:
Introduction: This ex vivo study evaluated the heat release, time required, and cleaning efficacy of the MTwo (VDW, Munich, Germany) and ProTaper Universal Retreatment (Dentsply/Maillefer, Ballaigues, Switzerland) systems and hand instrumentation in the removal of filling material. Methods: Sixty single-rooted human teeth with a single straight canal were obturated with gutta-percha and a zinc oxide and eugenol-based cement and randomly allocated to 3 groups (n = 20). After 30-day storage at 37 degrees C and 100% humidity, the root fillings were removed using ProTaper UR, MTwo R, or hand files. Heat release, time required, and cleaning efficacy data were analyzed statistically (analysis of variance and the Tukey test, alpha = 0.05). Results: None of the techniques removed the root fillings completely. Filling material removal with ProTaper UR was faster but caused more heat release. MTwo R produced less heat release than the other techniques but was the least efficient in removing gutta-percha/sealer. Conclusions: ProTaper UR and MTwo R caused the greatest and lowest temperature increases on the root surface, respectively; regardless of the type of instrument, more heat was released in the cervical third. ProTaper UR needed less time to remove fillings than MTwo R. All techniques left filling debris in the root canals. (J Endod 2010;36:1870-1873)
Abstract:
Objectives: The purpose of this in vitro study was to evaluate the Vickers hardness (VHN) of Light Core (Bisco) composite resin after root reinforcement, according to the light exposure time, region of intracanal reinforcement and lateral distance from the light-transmitting fibre post. Methods: Forty-five 17-mm long roots were used. Twenty-four hours after obturation, the root canals were emptied to a depth of 12 mm and the root dentine was artificially flared to produce a 1 mm space between the fibre post and the canal walls. The roots were bulk restored with the composite resin, which was photoactivated through the post for 40 s (G1, control), 80 s (G2) or 120 s (G3). Twenty-four hours after post cementation, the specimens were sectioned transversely into three slices at depths of 2, 6 and 10 mm, corresponding to the coronal, middle and apical regions of the reinforced root. Composite VHN was measured as the average of three indentations (100 g/15 s) in each region at lateral distances of 50, 200 and 350 mu m from the cement/post interface. Results: Three-way analysis of variance (alpha = 0.05) indicated that the factors time, region and distance influenced the hardness and that the interaction time x region was statistically significant (p = 0.0193). Tukey's test showed that the mean VHN values for G1 (76.37 +/- 8.58) and G2 (74.89 +/- 6.28) differed significantly from that for G3 (79.5 +/- 5.18). Conclusions: Composite resin hardness was significantly lower in deeper regions of root reinforcement and in lateral areas distant from the post. Overall, a light exposure time of 120 s provided higher composite hardness than the shorter times (40 and 80 s). (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
A scheme is presented to incorporate a mixed potential integral equation (MPIE) using Michalski's formulation C with the method of moments (MoM) for analyzing the scattering of a plane wave from conducting planar objects buried in a dielectric half-space. The robust complex image method with a two-level approximation is used to calculate the Green's functions for the half-space. To further speed up the computation, an interpolation technique for filling the matrix is employed. While the induced current distributions on the object's surface are obtained in the frequency domain, the corresponding time domain responses are calculated via the inverse fast Fourier transform (FFT). The complex natural resonances of targets are then extracted from the late-time response using the generalized pencil-of-function (GPOF) method. We investigate the pole trajectories as we vary the distance between strips and the depth and orientation of single, buried strips. The variation from the pole position of a single strip in a homogeneous dielectric medium was only a few percent for most of these parameter variations.
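The pole-extraction step can be illustrated with the basic matrix-pencil idea underlying GPOF: for noiseless samples y[n] = sum_k a_k z_k^n, the eigenvalues of pinv(Y0) @ Y1, built from a shifted Hankel pair, recover the poles z_k. This is the textbook exact-data case, not the paper's full implementation, which works on noisy late-time scattered-field data and uses an SVD-based rank truncation.

```python
import numpy as np

def pencil_poles(y, M):
    """Recover M exponential poles from samples y[0..N-1] (exact-data matrix pencil)."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    # Hankel data matrix with M + 1 columns; Y0/Y1 are its unshifted/shifted parts.
    Y = np.array([[y[i + j] for j in range(M + 1)] for i in range(N - M)])
    Y0, Y1 = Y[:, :M], Y[:, 1:]
    return np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)

# Two decaying modes with poles 0.9 and 0.5 (made-up illustrative signal).
n = np.arange(12)
poles = pencil_poles(0.9**n + 0.5**n, M=2)
print(np.sort(poles.real))
```

In the scattering application the recovered z_k would be converted to complex natural resonances s_k = ln(z_k)/dt for the sampling interval dt.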
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time, with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem differs fundamentally from the view selection problem under a disk-space constraint. In this paper the view selection problem under the maintenance-time constraint is investigated. Two efficient heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to well-solved optimization problems. As a result, an approximate solution of the known optimization problem gives a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
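One natural baseline for the problem described above can be sketched as a knapsack-style greedy rule: materialize the view with the best ratio of query-time benefit to maintenance time until the maintenance-time bound is exhausted. The view names, benefits, and costs below are made-up numbers; the paper's algorithms use more refined heuristic functions and reductions, and greedy-by-ratio is only a simple approximation.

```python
def select_views(views, bound):
    """views: dict name -> (benefit, maintenance_cost). Greedy by benefit/cost ratio."""
    chosen, used = [], 0.0
    # Rank candidate views by query-time saving per unit of maintenance time.
    ranked = sorted(views.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    for name, (benefit, cost) in ranked:
        if used + cost <= bound:   # skip any view that would exceed the bound
            chosen.append(name)
            used += cost
    return chosen, used

views = {
    "v_sales_by_region": (100.0, 5.0),
    "v_daily_totals":    (60.0, 2.0),
    "v_full_join":       (200.0, 20.0),
    "v_top_customers":   (30.0, 1.0),
}
chosen, used = select_views(views, bound=10.0)
print(chosen, used)
```

The maintenance-time constraint is what makes this a knapsack-like selection rather than the benefit-per-space greedy used under a disk-space budget.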
Abstract:
This note presents a method of evaluating the distribution of a path integral for Markov chains on a countable state space.