929 results for space time code
Abstract:
Background and objective: The time course of cardiopulmonary alterations after pulmonary embolism has not been clearly demonstrated, nor has the role of systemic inflammation in the pathogenesis of the disease. This study aimed to evaluate, over 12 h, the effects of pulmonary embolism caused by polystyrene microspheres on haemodynamics, lung mechanics and gas exchange, and on interleukin-6 production. Methods: Ten Large White pigs (weight 35-42 kg) had arterial and pulmonary catheters inserted, and pulmonary embolism was induced in five pigs by injection of polystyrene microspheres (diameter approximately 300 μm) until the mean pulmonary arterial pressure reached twice its baseline value. Five other animals received only saline. Haemodynamic and respiratory data and pressure-volume curves of the respiratory system were collected. Bronchoscopy was performed before and 12 h after embolism, when the animals were euthanized. Results: The embolism group developed hypoxaemia that was not corrected by high oxygen fractions, as well as higher dead space and airway resistance and lower respiratory compliance. Acute haemodynamic alterations included pulmonary arterial hypertension with preserved systemic arterial pressure and cardiac index. These derangements persisted until the end of the experiments. Plasma interleukin-6 concentrations were similar in both groups; however, an increase in core temperature and a nonsignificantly higher concentration of bronchoalveolar lavage proteins were found in the embolism group. Conclusion: Acute pulmonary embolism induced by polystyrene microspheres in pigs produces hypoxaemia lasting 12 h and a high dead space associated with high airway resistance and low compliance. There were no plasma systemic markers of inflammation, but a higher core temperature and a trend towards higher bronchoalveolar lavage protein levels were found. Eur J Anaesthesiol 27:67-76 © 2010 European Society of Anaesthesiology.
Abstract:
Introduction: This ex vivo study evaluated the heat release, time required and cleaning efficacy of the Mtwo (VDW, Munich, Germany) and ProTaper Universal Retreatment systems (Dentsply/Maillefer, Ballaigues, Switzerland) and of hand instrumentation in the removal of filling material. Methods: Sixty single-rooted human teeth with a single straight canal were obturated with gutta-percha and a zinc oxide-eugenol-based cement and randomly allocated to 3 groups (n = 20). After 30-day storage at 37 °C and 100% humidity, the root fillings were removed using ProTaper UR, Mtwo R or hand files. Heat release, time required and cleaning efficacy data were analyzed statistically (analysis of variance and the Tukey test, α = 0.05). Results: None of the techniques removed the root fillings completely. Filling material removal with ProTaper UR was faster but caused more heat release. Mtwo R produced less heat release than the other techniques but was the least efficient in removing gutta-percha/sealer. Conclusions: ProTaper UR and Mtwo R caused the greatest and the lowest temperature increase on the root surface, respectively; regardless of the type of instrument, more heat was released in the cervical third. ProTaper UR needed less time to remove fillings than Mtwo R. All techniques left filling debris in the root canals. (J Endod 2010;36:1870-1873)
Abstract:
Objectives: The purpose of this in vitro study was to evaluate the Vickers hardness (VHN) of a Light Core (Bisco) composite resin after root reinforcement, according to the light exposure time, the region of intracanal reinforcement and the lateral distance from the light-transmitting fibre post. Methods: Forty-five 17-mm long roots were used. Twenty-four hours after obturation, the root canals were emptied to a depth of 12 mm and the root dentine was artificially flared to produce a 1 mm space between the fibre post and the canal walls. The roots were bulk restored with the composite resin, which was photoactivated through the post for 40 s (G1, control), 80 s (G2) or 120 s (G3). Twenty-four hours after post cementation, the specimens were sectioned transversely into three slices at depths of 2, 6 and 10 mm, corresponding to the coronal, middle and apical regions of the reinforced root. Composite VHN was measured as the average of three indentations (100 g/15 s) in each region at lateral distances of 50, 200 and 350 μm from the cement/post interface. Results: Three-way analysis of variance (α = 0.05) indicated that the factors time, region and distance influenced the hardness and that the interaction time × region was statistically significant (p = 0.0193). Tukey's test showed that the mean VHN values for G1 (76.37 ± 8.58) and G2 (74.89 ± 6.28) differed significantly from that for G3 (79.5 ± 5.18). Conclusions: Composite resin hardness was significantly lower in deeper regions of root reinforcement and in lateral areas distant from the post. Overall, a light exposure time of 120 s provided higher composite hardness than the shorter times (40 and 80 s). © 2008 Elsevier Ltd. All rights reserved.
Abstract:
The catalytic properties of enzymes are usually evaluated by measuring and analyzing reaction rates. However, analyzing the complete time course can be advantageous because it contains additional information about the properties of the enzyme. Moreover, for systems that are not at steady state, the analysis of time courses is the preferred method. One of the major barriers to the wide application of time courses is that it may be computationally more difficult to extract information from these experiments. Here the basic approach to analyzing time courses is described, together with some examples of the essential computer code needed to implement these analyses. A general method that can be applied to both steady-state and non-steady-state systems is recommended. © 2001 Academic Press.
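The basic approach described above, fitting kinetic parameters directly to a full progress curve, can be sketched in a few lines. The example below is only an illustration, not the authors' code: it assumes simple Michaelis-Menten kinetics, integrates the substrate curve with an explicit Euler step, and recovers the parameters by a least-squares grid search (the function names, the grid and the integration step are all assumptions of the sketch).

```python
def simulate_progress_curve(vmax, km, s0, t_end, dt=0.01):
    """Integrate the Michaelis-Menten rate law dS/dt = -Vmax*S/(Km + S)
    with a simple explicit Euler scheme; returns (times, concentrations)."""
    times, concs = [0.0], [s0]
    s, t = s0, 0.0
    while t < t_end:
        s = max(s - dt * vmax * s / (km + s), 0.0)  # substrate cannot go negative
        t += dt
        times.append(t)
        concs.append(s)
    return times, concs

def fit_progress_curve(obs, s0, t_end, vmax_grid, km_grid):
    """Least-squares grid search over (Vmax, Km): simulate each candidate
    curve and keep the one with the smallest sum of squared errors."""
    best = None
    for vmax in vmax_grid:
        for km in km_grid:
            _, model = simulate_progress_curve(vmax, km, s0, t_end)
            sse = sum((m - d) ** 2 for m, d in zip(model, obs))
            if best is None or sse < best[0]:
                best = (sse, vmax, km)
    return best[1], best[2]
```

In practice a proper nonlinear least-squares routine and a stiff ODE integrator would replace the grid search and the Euler step; the sketch only shows why time-course fitting needs numerical integration inside the objective function.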
Abstract:
A scheme is presented to incorporate a mixed potential integral equation (MPIE) using Michalski's formulation C with the method of moments (MoM) for analyzing the scattering of a plane wave from conducting planar objects buried in a dielectric half-space. The robust complex image method with a two-level approximation is used for the calculation of the Green's functions for the half-space. To further speed up the computation, an interpolation technique for filling the matrix is employed. While the induced current distributions on the object's surface are obtained in the frequency domain, the corresponding time domain responses are calculated via the inverse fast Fourier transform (FFT). The complex natural resonances of targets are then extracted from the late-time response using the generalized pencil-of-function (GPOF) method. We investigate the pole trajectories as we vary the distance between strips and the depth and orientation of single buried strips. The variation from the pole position of a single strip in a homogeneous dielectric medium was only a few percent for most of these parameter variations.
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time under the constraint that the total maintenance time for these materialized views stays within a given bound. This view selection problem is quite different from the view selection problem under a disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. © 2001 Elsevier Science B.V. All rights reserved.
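A common heuristic shape for this kind of constrained selection is a greedy benefit-per-cost rule. The sketch below is an illustrative stand-in, not one of the paper's two algorithms: it assumes each candidate view contributes an independent query-time saving and maintenance cost, which ignores the view-dependency interactions a real materialized-view selection would have to model.

```python
def select_views(views, maintenance_bound):
    """Greedy heuristic for view selection under a maintenance-time bound.

    `views` maps a view name to (query_time_saving, maintenance_time).
    Views are considered in decreasing order of saving per unit of
    maintenance time; a view is materialized only if it still fits
    within the total maintenance-time bound.
    """
    chosen, used = [], 0.0
    remaining = dict(views)
    while remaining:
        # pick the view with the best benefit/cost ratio among the rest
        name = max(remaining, key=lambda v: remaining[v][0] / remaining[v][1])
        saving, cost = remaining.pop(name)
        if used + cost <= maintenance_bound:
            chosen.append(name)
            used += cost
    return chosen
```

Like most knapsack-style greedy rules, this only approximates the optimum, which matches the paper's framing of reducing the problem to well-solved optimization problems with approximation guarantees.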
Abstract:
This note presents a method of evaluating the distribution of a path integral for Markov chains on a countable state space.
Abstract:
This paper presents a method of evaluating the expected value of a path integral for a general Markov chain on a countable state space. We illustrate the method with reference to several models, including birth-death processes and the birth, death and catastrophe process. © 2002 Elsevier Science Inc. All rights reserved.
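For a finite (or truncated) birth-death chain this kind of expectation can be computed by first-step analysis: the vector v with v_i = E_i[∫ f(X_t) dt up to absorption] solves the linear system Q v = -f on the transient states, with v = 0 at the absorbing state. The sketch below is an illustrative finite-state version of that idea, not the paper's method; the reflecting top state and the indexing convention (v[0] corresponds to state 1) are assumptions of the example.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(a[r][k]))
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            m = a[r][k] / a[k][k]
            for c in range(k, n):
                a[r][c] -= m * a[k][c]
            b[r] -= m * b[k]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (b[k] - sum(a[k][c] * x[c] for c in range(k + 1, n))) / a[k][k]
    return x

def expected_path_integral(birth, death, f, n):
    """v_i = E_i[integral of f(X_t) dt until hitting state 0] for a
    birth-death chain on {0, ..., n} with absorbing state 0 and a
    reflecting top state: build Q on states 1..n and solve Q v = -f."""
    a = [[0.0] * n for _ in range(n)]
    rhs = [0.0] * n
    for i in range(1, n + 1):
        b_i = birth(i) if i < n else 0.0   # no births out of the top state
        d_i = death(i)
        a[i - 1][i - 1] = -(b_i + d_i)
        if i >= 2:
            a[i - 1][i - 2] = d_i          # death to state i-1 (v_0 = 0 drops out)
        if i < n:
            a[i - 1][i] = b_i              # birth to state i+1
        rhs[i - 1] = -f(i)
    return solve(a, rhs)
```

As a sanity check, for a pure death chain with death rate i from state i and f ≡ 1, the path integral is the absorption time, whose expectation from state k is 1/k + 1/(k-1) + ... + 1.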
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. © 2002 Elsevier Science B.V. All rights reserved.
Abstract:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
Abstract:
Abstract: Just as there is a significant number of legal offshore havens in the globalized world, there is not even a global consensus about what «corruption» is. What counts as «illegal corruption» in one country may be legal in another. Moreover, large-scale global corruption stands above the law and above democratic States, and not all democratic States are States under the «Rule of Law». Therefore, the solution is a law that is global in time and space: democratic, free and true law. Until human beings reach a consensus on what «corruption» really is, the discussion will not go further than a caricature. Another problem with «corruption» is that it is very difficult to establish the imputation of crimes, including «corruption», to certain «companies» or corporations (e.g. in Portugal). There is a juridical problem in the wording of Article 11 of the Portuguese Penal Code.
Abstract:
Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice, as challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to interact efficiently with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, where the user input is mapped to a non-Cartesian space and this information is used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and in terms of total analysis time. This contributes to a more efficient use of the existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire-based algorithm.
Abstract:
In this work, we present a neural network (NN) based method designed for 3D rigid-body registration of FMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, which are comprised in a small cubic neighborhood located at the first octant of a 3D Fourier space (including the DC component), are fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations, using DC neighborhoods of different sizes. The mean absolute registration errors are of approximately 0.030 mm in translations and 0.030 degrees in rotations, for the typical motion amplitudes encountered in FMRI studies. The construction of the training set and the learning stage are fast, requiring 90 s and 1 to 12 s, respectively, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of FMRI registration can be of great interest in the future. For instance, NNs relying on limited k-space data (possibly in navigator echoes) can be a valid solution to the problem of prospective (in-frame) FMRI registration.
Abstract:
Embedded systems are increasingly complex and dynamic, imposing progressively higher development time and costs. Tuning a particular system for deployment is thus becoming more demanding, all the more so for systems which have to adapt themselves to evolving requirements and changing service requests. In this perspective, run-time monitoring of the system behaviour becomes an important requirement, allowing the actual scheduling progress and resource utilization to be captured dynamically. For this to succeed, operating systems need to expose their internal behaviour and state, making it available to external applications, and a run-time monitoring mechanism must be available. However, such a mechanism can impose a burden on the system itself if not used wisely. In this paper we explore this problem and propose a framework intended to provide this run-time mechanism while achieving code separation, run-time efficiency and flexibility for the final developer.
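The trade-off described above, exposing scheduling events without burdening the system, is often addressed with a lightweight probe that is a near-no-op when monitoring is disabled. The class below is a minimal sketch of that idea, not the paper's framework; the event names, the enable flag and the utilization computation are illustrative assumptions.

```python
import time
from collections import defaultdict

class RuntimeMonitor:
    """Minimal run-time monitoring hook: tasks report scheduling events
    through a single call site, kept separate from the task code itself.
    When disabled, the probe returns immediately, so instrumented code
    pays almost nothing."""

    def __init__(self, enabled=True):
        self.enabled = enabled
        self.events = defaultdict(list)   # task name -> [(event, timestamp), ...]

    def probe(self, task, event):
        if not self.enabled:              # cheap early exit keeps overhead low
            return
        self.events[task].append((event, time.monotonic()))

    def utilization(self, task):
        """Total busy time for a task: sum of (stop - start) timestamp pairs,
        assuming events arrive as alternating 'start'/'stop' markers."""
        stamps = [t for e, t in self.events[task] if e in ("start", "stop")]
        return sum(stamps[i + 1] - stamps[i]
                   for i in range(0, len(stamps) - 1, 2))
```

A real framework would push the event buffer out of the monitored system (for example over a trace port) and timestamp inside the kernel, but the structure, a guarded probe plus offline aggregation, stays the same.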
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space to provide an efficient method of neighbour modelling. The genetic algorithm produces special patterns with resemblances to those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, a modelling of the time evolution is performed by adopting a distance measure and modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent, and easy to interpret, set of model parameters for distinct operating conditions.
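The hexagonal neighbourhood structure can be made concrete with axial coordinates, where every cell has exactly six neighbours. The sketch below illustrates only that representation and an interface-counting fitness of the kind described above; the evolution loop is a single-individual hill climb, a deliberate simplification of the paper's genetic algorithm, and all names are assumptions of the example.

```python
import random

# Axial-coordinate offsets of the six neighbours in a hexagonal tessellation.
HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def interface_fitness(grid):
    """Count hex edges joining cells of the two different object types
    (0 and 1); a fitness of this shape rewards interface between them."""
    score = 0
    for (q, r), kind in grid.items():
        for dq, dr in HEX_NEIGHBOURS:
            other = grid.get((q + dq, r + dr))
            if other is not None and other != kind:
                score += 1
    return score // 2   # each edge was counted from both of its endpoints

def evolve(grid, generations=50, rng=None):
    """Point-mutation hill climb (a degenerate GA with population one):
    flip one cell's type and keep the change if fitness does not drop."""
    rng = rng or random.Random(0)
    cells = list(grid)
    best = interface_fitness(grid)
    for _ in range(generations):
        cell = rng.choice(cells)
        grid[cell] ^= 1
        new = interface_fitness(grid)
        if new >= best:
            best = new
        else:
            grid[cell] ^= 1   # revert the rejected mutation
    return grid, best
```

A full GA would add a population, crossover and selection on top of this mutation operator, but the hexagonal neighbour bookkeeping and the interface-driven fitness are the parts specific to the tessellation.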