931 results for Space-time trellis codes
Abstract:
This note considers continuous-time Markov chains whose state space consists of an irreducible class, C, and an absorbing state which is accessible from C. The purpose is to provide results on mu-invariant and mu-subinvariant measures where absorption occurs with probability less than one. In particular, the well-known premise that the mu-invariant measure, m, for the transition rates be finite is replaced by the more natural premise that m be finite with respect to the absorption probabilities. The relationship between mu-invariant measures and quasi-stationary distributions is discussed. (C) 2000 Elsevier Science Ltd. All rights reserved.
Abstract:
Image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach for solving the problem is to minimize an error functional using an iterative method. In this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple and low-cost technique (quadrant-searching) to reduce the search space and consequently to obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. By comparing the four values of the error functional it is possible to draw conclusions about the internal electric contrast. For this purpose, we initially performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to verify the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
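The quadrant-searching idea reduces to evaluating one error functional per candidate quadrant and keeping the best. A minimal sketch of that comparison, assuming a hypothetical `forward_model` stand-in for the BEM solver and a simple least-squares error functional (both are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def quadrant_search(measured, forward_model, domain_radius=1.0):
    """Pick the quadrant whose trial inclusion best explains the data.

    `forward_model(center)` is a hypothetical stand-in for a BEM solver
    returning boundary potentials for a large inclusion at `center`.
    Quadrants are ordered counterclockwise: (+,+), (-,+), (-,-), (+,-).
    """
    offset = domain_radius / 2
    centers = [(offset, offset), (-offset, offset),
               (-offset, -offset), (offset, -offset)]
    # Least-squares error functional for each trial contrast distribution.
    errors = [np.sum((measured - forward_model(c)) ** 2) for c in centers]
    best = int(np.argmin(errors))
    return best, errors
```

The winning quadrant then seeds the iterative minimization with an initial contrast distribution already on the right side of the domain.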
Abstract:
Constructing a veridical spatial map by touch poses at least two problems for a perceptual system. First, as the hand is moved through space, the locations of features may be displaced if there is an uncorrected lag between the moment the hand encounters a feature and the time that feature is encoded on a spatial map. Second, due to the sequential nature of the process, some form of memory, which itself may be subject to spatial distortions, is required for integration of spatial samples. We investigated these issues using a task involving active haptic exploration with a stylus swept back and forth in the horizontal plane at the wrist. Remembered locations of tactile targets were shifted towards the medial axis of the forearm, suggesting a central tendency in haptic spatial memory, while evidence for a displacement of perceived locations in the direction of sweep motion was consistent with processing delays.
Abstract:
When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
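The augmentation step can be sketched as follows: at each time t the constraint R_t b_t = r_t is stacked under the ordinary observation as an extra, (almost) noiseless row before the usual Kalman update. The shapes, priors and variances below are illustrative assumptions, not the paper's exact specification:

```python
import numpy as np

def kalman_constrained(y, Z, R, r, T=None, Q=None, obs_var=1.0, c_var=1e-6):
    """Kalman filter for y_t = Z_t b_t + e_t with the time-varying
    constraint R_t b_t = r_t appended to the observation equation as an
    extra row; c_var gives that row a tiny variance so the innovation
    covariance stays well-conditioned (a hard constraint would use 0).
    """
    n, k = len(y), Z.shape[1]
    T = np.eye(k) if T is None else T            # coefficient transition
    Q = np.zeros((k, k)) if Q is None else Q     # state noise covariance
    a, P = np.zeros(k), np.eye(k) * 1e6          # vague prior
    est = []
    for t in range(n):
        Zt = np.vstack([Z[t], R[t]])             # augmented design matrix
        yt = np.array([y[t], r[t]])              # augmented observation
        Ht = np.diag([obs_var, c_var])
        F = Zt @ P @ Zt.T + Ht                   # innovation covariance
        K = P @ Zt.T @ np.linalg.inv(F)          # Kalman gain
        a = a + K @ (yt - Zt @ a)                # measurement update
        P = P - K @ Zt @ P
        est.append(a.copy())
        a, P = T @ a, T @ P @ T.T + Q            # time update
    return np.array(est)
```

With a static coefficient vector this reduces to recursive restricted least squares; a time-varying T and Q give the general state-space case estimated by numerical optimisation as in the paper.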
Abstract:
Background and objective The time course of cardiopulmonary alterations after pulmonary embolism has not been clearly demonstrated, nor has the role of systemic inflammation in the pathogenesis of the disease. This study aimed to evaluate over 12 h the effects of pulmonary embolism caused by polystyrene microspheres on haemodynamics, lung mechanics and gas exchange and on interleukin-6 production. Methods Ten large white pigs (weight 35-42 kg) had arterial and pulmonary catheters inserted, and pulmonary embolism was induced in five pigs by injection of polystyrene microspheres (diameter similar to 300 mu m) until mean pulmonary arterial pressure reached twice its baseline value. The five other animals received only saline. Haemodynamic and respiratory data and pressure-volume curves of the respiratory system were collected. Bronchoscopy was performed before and 12 h after embolism, when the animals were euthanized. Results The embolism group developed hypoxaemia that was not corrected with high oxygen fractions, as well as higher values of dead space and airway resistance and lower respiratory compliance. Acute haemodynamic alterations included pulmonary arterial hypertension with preserved systemic arterial pressure and cardiac index. These derangements persisted until the end of the experiments. Plasma interleukin-6 concentrations were similar in both groups; however, an increase in core temperature and a nonsignificantly higher concentration of bronchoalveolar lavage proteins were found in the embolism group. Conclusion Acute pulmonary embolism induced by polystyrene microspheres in pigs produces hypoxaemia lasting 12 h and a high dead space associated with high airway resistance and low compliance. There were no plasma systemic markers of inflammation, but a higher core temperature and a trend towards higher bronchoalveolar lavage proteins were found. Eur J Anaesthesiol 27:67-76 (C) 2010 European Society of Anaesthesiology.
Abstract:
Introduction: This ex vivo study evaluated the heat release, time required, and cleaning efficacy of the MTwo (VDW, Munich, Germany) and ProTaper Universal Retreatment systems (Dentsply/Maillefer, Ballaigues, Switzerland) and hand instrumentation in the removal of filling material. Methods: Sixty single-rooted human teeth with a single straight canal were obturated with gutta-percha and zinc oxide-eugenol-based cement and randomly allocated to 3 groups (n = 20). After 30-day storage at 37 degrees C and 100% humidity, the root fillings were removed using ProTaper UR, MTwo R, or hand files. Heat release, time required, and cleaning efficacy data were analyzed statistically (analysis of variance and the Tukey test, alpha = 0.05). Results: None of the techniques removed the root fillings completely. Filling material removal with ProTaper UR was faster but caused more heat release. MTwo R produced less heat release than the other techniques but was the least efficient in removing gutta-percha/sealer. Conclusions: ProTaper UR and MTwo R caused the greatest and lowest temperature increase on the root surface, respectively; regardless of the type of instrument, more heat was released in the cervical third. ProTaper UR needed less time to remove fillings than MTwo R. All techniques left filling debris in the root canals. (J Endod 2010;36:1870-1873)
Abstract:
Objectives: The purpose of this in vitro study was to evaluate the Vickers hardness (VHN) of a Light Core (Bisco) composite resin after root reinforcement, according to the light exposure time, region of intracanal reinforcement and lateral distance from the light-transmitting fibre post. Methods: Forty-five 17-mm long roots were used. Twenty-four hours after obturation, the root canals were emptied to a depth of 12 mm and the root dentine was artificially flared to produce a 1 mm space between the fibre post and the canal walls. The roots were bulk restored with the composite resin, which was photoactivated through the post for 40 s (G1, control), 80 s (G2) or 120 s (G3). Twenty-four hours after post cementation, the specimens were sectioned transversely into three slices at depths of 2, 6 and 10 mm, corresponding to the coronal, middle and apical regions of the reinforced root. Composite VHN was measured as the average of three indentations (100 g/15 s) in each region at lateral distances of 50, 200 and 350 mu m from the cement/post interface. Results: Three-way analysis of variance (alpha = 0.05) indicated that the factors time, region and distance influenced the hardness and that the interaction time x region was statistically significant (p = 0.0193). Tukey's test showed that the mean VHN values for G1 (76.37 +/- 8.58) and G2 (74.89 +/- 6.28) differed significantly from that for G3 (79.5 +/- 5.18). Conclusions: Composite resin hardness was significantly lower in deeper regions of root reinforcement and in lateral areas distant from the post. Overall, a light exposure time of 120 s provided higher composite hardness than the shorter times (40 and 80 s). (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
A scheme is presented to incorporate a mixed potential integral equation (MPIE) using Michalski's formulation C with the method of moments (MoM) for analyzing the scattering of a plane wave from conducting planar objects buried in a dielectric half-space. The robust complex image method with a two-level approximation is used for the calculation of the Green's functions for the half-space. To further speed up the computation, an interpolation technique for filling the matrix is employed. While the induced current distributions on the object's surface are obtained in the frequency domain, the corresponding time domain responses are calculated via the inverse fast Fourier transform (FFT). The complex natural resonances of targets are then extracted from the late time response using the generalized pencil-of-function (GPOF) method. We investigate the pole trajectories as we vary the distance between strips and the depth and orientation of single buried strips. The variation from the pole position of a single strip in a homogeneous dielectric medium was only a few percent for most of these parameter variations.
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem is totally different from the view selection problem under the disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient, heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
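A benefit-per-maintenance-time greedy rule is one simple instance of the kind of heuristic function described above; the data layout and the ratio metric here are illustrative assumptions, not the paper's two algorithms:

```python
def select_views(views, maint_bound):
    """Greedy view selection under a maintenance-time constraint.

    `views` maps a view name to a hypothetical pair
    (query_benefit, maintenance_time).  Repeatedly pick the feasible
    view with the highest benefit per unit of maintenance time until
    no further view fits within `maint_bound`.
    """
    chosen, used = [], 0.0
    remaining = dict(views)
    while remaining:
        # Views that still fit in the remaining maintenance budget.
        feasible = {v: b / m for v, (b, m) in remaining.items()
                    if used + m <= maint_bound}
        if not feasible:
            break
        best = max(feasible, key=feasible.get)   # best benefit/time ratio
        chosen.append(best)
        used += remaining.pop(best)[1]
    return chosen, used
```

A greedy ratio rule of this kind yields a feasible (not necessarily optimal) set, which matches the paper's strategy of reducing the problem to well-solved optimization problems whose approximate solutions remain feasible for the original one.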
Abstract:
This note presents a method of evaluating the distribution of a path integral for Markov chains on a countable state space.
Abstract:
This paper presents a method of evaluating the expected value of a path integral for a general Markov chain on a countable state space. We illustrate the method with reference to several models, including birth-death processes and the birth, death and catastrophe process. (C) 2002 Elsevier Science Inc. All rights reserved.
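For a chain absorbed from a finite transient class C, the expected path integral reduces to one linear solve: the vector u of expectations satisfies (-Q_C) u = f, where Q_C is the rate matrix restricted to C. A sketch of this reduction (the finite-state truncation is an assumption; the paper treats general countable chains), illustrated with a two-state pure-death chain where f = 1 recovers expected absorption times:

```python
import numpy as np

def expected_path_integral(Q_C, f):
    """E_i[ integral of f(X_t) dt up to the absorption time ], for each
    starting state i in the transient class C, where Q_C is the
    transition-rate matrix restricted to C.  Solves (-Q_C) u = f."""
    return np.linalg.solve(-np.asarray(Q_C, float), np.asarray(f, float))

# Pure-death chain on states {1, 2}: state i is left at rate i,
# state 1 falls to the absorbing state 0, state 2 falls to state 1.
Q_C = [[-1.0, 0.0],
       [2.0, -2.0]]
u = expected_path_integral(Q_C, [1.0, 1.0])   # f = 1: mean absorption times
```

Here u = [1.0, 1.5]: from state 1 the mean holding time is 1, and from state 2 it is 1/2 plus the mean time from state 1.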
Abstract:
We reinterpret the state space dimension equations for geometric Goppa codes. An easy consequence is that if deg G <= (n-2)/2 or deg G >= (n-2)/2 + 2g, then the state complexity of C_L(D, G) is equal to the Wolf bound. For deg G in [(n-1)/2, (n-3)/2 + 2g], we use Clifford's theorem to give a simple lower bound on the state complexity of C_L(D, G). We then derive two further lower bounds on the state space dimensions of C_L(D, G) in terms of the gonality sequence of F/F_q. (The gonality sequence is known for many of the function fields of interest for defining geometric Goppa codes.) One of the gonality bounds uses previous results on the generalised weight hierarchy of C_L(D, G) and one follows in a straightforward way from first principles; often they are equal. For Hermitian codes both gonality bounds are equal to the DLP lower bound on state space dimensions. We conclude by using these results to calculate the DLP lower bound on state complexity for Hermitian codes.
Abstract:
Image segmentation is a ubiquitous task in medical image analysis, which is required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often requires user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to efficiently interact with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, where the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and total analysis time. This contributes to a more efficient use of the existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire-based algorithm.
Abstract:
In this work, we present a neural network (NN) based method designed for 3D rigid-body registration of FMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, which are comprised in a small cubic neighborhood located at the first octant of a 3D Fourier space (including the DC component), are then fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations, using DC neighborhoods of different sizes. The mean absolute registration errors are approximately 0.030 mm in translations and 0.030 deg in rotations, for the typical motion amplitudes encountered in FMRI studies. The construction of the training set and the learning stage are fast, requiring 90 s and 1 to 12 s, respectively, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of FMRI registration can be of great interest in the future. For instance, NNs relying on limited K-space data (possibly in navigation echoes) can be a valid solution to the problem of prospective (in-frame) FMRI registration.
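Extracting the kind of input features described above, a small cube of Fourier coefficients at the first octant with the DC component included, can be sketched as follows; the cube size and the real/imaginary flattening are illustrative assumptions:

```python
import numpy as np

def fourier_features(volume, size=3):
    """Flatten the real and imaginary parts of the size**3 cube of
    Fourier coefficients at the first octant (DC component included)
    into one feature vector, the kind of low-frequency input the six
    parameter-estimating networks would consume."""
    F = np.fft.fftn(volume)[:size, :size, :size]  # first-octant cube
    return np.concatenate([F.real.ravel(), F.imag.ravel()])
```

With the default 3x3x3 neighborhood this yields 54 features per volume; the first entry is the DC term, i.e. the sum of all voxel intensities.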
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space to support an efficient method of neighbour modelling. The genetic algorithm produces special patterns with resemblances to those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and by modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent, and easy to interpret, set of model parameters for distinct operating conditions.