919 results for Method of moments algorithm
Abstract:
Background and Purpose. There has been considerable debate about the use of predicted oxygen consumption to calculate pulmonary vascular resistance using the Fick principle. We therefore comparatively analyzed predicted oxygen consumption in infants and children in specific age groups, using different methods (formulas), in an attempt to better understand the usefulness and limitations of predictions. Methods and Results. Four models (LaFarge & Miettinen, Bergstra et al., Lindahl, and Lundell et al.) were used to predict oxygen consumption in 200 acyanotic patients with congenital cardiac defects aged 0-2.0, > 2.0-4.0, > 4.0-6.0, and > 6.0-8.75 years (median 2.04 years). Significant differences were observed between the age groups (P < .001) and between the methods (P < .001), not related to diagnoses. Differences between methods were more pronounced in the first age group (P < .01). In patients aged 0-2.0 years, the lowest values of oxygen consumption (corresponding to the highest estimates of pulmonary vascular resistance) were obtained with the method of Lindahl; above this age, with any method except that of Lundell et al. Conclusions. Although measuring oxygen consumption is always preferable, a rational use of predictions, using different methods, may be of help in situations where measurements are definitely not possible.
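Once a predicted oxygen consumption value is chosen, it enters the Fick calculation sketched below. The function names and haemodynamic values are illustrative only, not taken from the study:

```python
def o2_content(hb_g_dl, sat):
    """Oxygen content of blood in mL O2 per litre (dissolved O2 neglected).

    About 1.36 mL O2 bind per gram of haemoglobin; x10 converts per-dL to per-L."""
    return 1.36 * hb_g_dl * sat * 10.0

def pvr_wood_units(vo2_ml_min, hb_g_dl, sat_pv, sat_pa, mpap_mmhg, pcwp_mmhg):
    """Fick principle: pulmonary flow Qp = VO2 / (CpvO2 - CpaO2),
    then PVR = (mPAP - PCWP) / Qp in Wood units."""
    avo2_diff = o2_content(hb_g_dl, sat_pv) - o2_content(hb_g_dl, sat_pa)
    qp_l_min = vo2_ml_min / avo2_diff
    return (mpap_mmhg - pcwp_mmhg) / qp_l_min
```

With a predicted VO2 of 150 mL/min and typical saturations this yields a PVR of a few Wood units; an underestimated VO2 inflates the PVR estimate, which is exactly the sensitivity the study probes.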
Abstract:
The role of catecholamines in the control of the GnRH pulse generator is unclear, as studies have relied on peripheral or intracerebroventricular injections, which lack specificity in relation to the anatomical site of action. Direct brain site infusions have been used; however, these are limited by the ability to accurately target small brain regions. One such area of interest in the control of GnRH is the median eminence and arcuate nucleus within the medial basal hypothalamus. Here we describe a method of stereotaxically targeting this area in a large animal (sheep) and an infusion system to deliver drugs into unrestrained, conscious animals. To test our technique, we infused the dopamine agonist quinpirole, or vehicle, into the medial basal hypothalamus of ovariectomised ewes. Quinpirole significantly suppressed LH pulsatility only in animals with injectors located close to the lateral median eminence. This in vivo result supports the hypothesis that dopamine inhibits GnRH secretion by presynaptic inhibition in the lateral median eminence. In addition, infusion of quinpirole into the medial basal hypothalamus suppressed prolactin secretion, providing in vivo evidence consistent with the hypothesis that there are stimulatory autoreceptors on tubero-infundibular dopamine neurons. (C) 1997 Elsevier Science B.V.
Abstract:
An algorithm for explicit integration of structural dynamics problems with multiple time steps is proposed that averages accelerations to obtain subcycle states at a nodal interface between regions integrated with different time steps. With integer time step ratios, the resulting subcycle updates at the interface sum to give the same effect as a central difference update over a major cycle. The algorithm is shown to have good accuracy, and stability properties in linear elastic analysis similar to those of constant velocity subcycling algorithms. The implementation of a generalised form of the algorithm with non-integer time step ratios is presented. (C) 1997 by John Wiley & Sons, Ltd.
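For reference, the single-rate central-difference update that the averaged subcycle updates must reproduce over a major cycle can be sketched for one linear oscillator. This is a minimal illustration; the interface and multi-time-step machinery are omitted:

```python
def central_difference(omega, dt, n_steps, u0=1.0, v0=0.0):
    """Explicit central-difference integration of u'' = -omega**2 * u.

    u_{n+1} = 2*u_n - u_{n-1} + a_n * dt**2, with a Taylor start-up step."""
    a0 = -omega ** 2 * u0
    u_prev = u0 - v0 * dt + 0.5 * a0 * dt ** 2   # u at t = -dt
    u = u0
    for _ in range(n_steps):
        a = -omega ** 2 * u
        u_prev, u = u, 2.0 * u - u_prev + a * dt ** 2
    return u
```

After one full period (n_steps * dt = 2*pi/omega) the displacement returns close to u0. Stability requires omega * dt < 2, which is the per-region time-step limit that motivates subcycling in the first place.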
Abstract:
Objective: To assess the onset (%) of patella stabilizer muscles during maximal isometric contraction (MIC) exercises in individuals with and without signs of patellofemoral pain syndrome (PFPS) in open (OKC) and closed (CKC) kinetic chain exercises. Method: Assessments were carried out on 22 women: ten with no complaints of anterior knee pain, and 12 with PFPS signs, during MIC in OKC and CKC with the knee flexed at 90 degrees. The onset of the electromyographic activity of the vastus medialis obliquus (VMO), vastus lateralis obliquus (VLO) and vastus lateralis longus (VLL) was identified by means of an algorithm in the Myosystem Br 1 software. The statistical analysis used the Chi-square test and Student's t test, both with a significance level of 5%. Results: The VMO and VLO muscles presented a greater onset compared to the VLL during OKC exercises for both groups, and for the PFPS group in CKC. No differences were observed between the groups. Conclusion: CKC and OKC exercises seem to benefit the synchronism of the patella stabilizer musculature, and can be recommended in physiotherapeutic treatment programs.
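Onset detection in an EMG trace is commonly done with a baseline-threshold rule. The sketch below (baseline mean plus k standard deviations, sustained for min_run samples) is a generic stand-in, not necessarily the algorithm implemented in the Myosystem Br 1 software:

```python
def emg_onset_index(signal, baseline_len=100, k=3.0, min_run=10):
    """Index where rectified EMG first exceeds baseline mean + k*SD
    for at least `min_run` consecutive samples, or None if never."""
    base = signal[:baseline_len]
    mean = sum(base) / len(base)
    var = sum((x - mean) ** 2 for x in base) / len(base)
    thresh = mean + k * var ** 0.5
    run = 0
    for i, x in enumerate(signal[baseline_len:], start=baseline_len):
        run = run + 1 if x > thresh else 0
        if run >= min_run:
            return i - min_run + 1
    return None
```

The sustained-run requirement guards against isolated noise spikes being reported as muscle activation.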
Abstract:
Numerical modeling of the eddy currents induced in the human body by the pulsed field gradients in MRI presents a difficult computational problem. It requires an efficient and accurate computational method for high spatial resolution analyses with a relatively low input frequency. In this article, a new technique is described which allows the finite difference time domain (FDTD) method to be efficiently applied over a very large frequency range, including low frequencies. This is not the case in conventional FDTD-based methods. A method of implementing streamline gradients in FDTD is presented, as well as comparative analyses which show that correct source injection in the FDTD simulation plays a crucial role in obtaining accurate solutions. In particular, making use of the derivative of the input source waveform is shown to provide distinct benefits in accuracy over direct source injection. In the method, no alterations to the properties of either the source or the transmission media are required. The method is essentially frequency independent and the source injection method has been verified against examples with analytical solutions. Results are presented showing the spatial distribution of gradient-induced electric fields and eddy currents in a complete body model.
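The derivative-injection idea can be illustrated on a minimal 1-D normalized FDTD grid. Grid size, source position and Gaussian parameters below are arbitrary illustrative choices, not the article's body-model setup:

```python
import math

def fdtd_1d_derivative_source(n_cells=400, n_steps=100, src=100):
    """1-D free-space FDTD in normalized units at Courant number 1.

    The time *derivative* of a Gaussian is added as a soft source,
    echoing the derivative-injection idea (illustrative only)."""
    ez = [0.0] * n_cells
    hy = [0.0] * n_cells
    t0, spread = 40.0, 8.0
    for t in range(n_steps):
        for k in range(n_cells - 1):            # update H from E
            hy[k] += ez[k + 1] - ez[k]
        for k in range(1, n_cells):             # update E from H
            ez[k] += hy[k] - hy[k - 1]
        # soft source: d/dt of exp(-(t - t0)^2 / (2*spread^2))
        ez[src] += -(t - t0) / spread ** 2 * math.exp(-((t - t0) ** 2) / (2.0 * spread ** 2))
    return ez
```

The injected pulse propagates one cell per step in each direction, so after the run the field is non-zero only inside the causal cone around the source cell.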
Abstract:
Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We shall illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
A brief description of the main features of the health planning technique developed by the "Centro de Estudios del Desarrollo" (CENDES) in Venezuela, and proposed by the Pan-American Health Organization for use in Latin America, is presented. This presentation is followed by an appraisal of the planning method, with comments both on its positive aspects and on its negative points. Comments are also made on other recent WHO/PAHO publications on health planning. In conclusion, the CENDES technique is considered a health planning method of great potential, especially for use in underdeveloped areas, the success of its application depending upon the ability of the health planners to introduce the modifications necessary to adapt it to local circumstances.
Abstract:
5th European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2008); 8th World Congress on Computational Mechanics (WCCM8)
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the large optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the Poisson intensity-dependent noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise by simultaneously considering different spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be separately tuned in the space and time dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm. A comparison with several state-of-the-art algorithms is also presented.
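The paper's Bayesian space-time filter is not reproduced here, but the flavour of handling Poisson intensity-dependent noise can be suggested with a much cruder stand-in: an Anscombe variance-stabilizing transform followed by a plain temporal average over the frame stack. All names below are illustrative:

```python
import math

def anscombe(x):
    """Variance-stabilizing transform: Poisson noise becomes ~unit-variance Gaussian."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inv_anscombe(y):
    """Naive algebraic inverse of the Anscombe transform."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

def denoise_stack(frames):
    """Stabilize each Poisson-noisy frame, average over time, then invert.

    A crude stand-in for the paper's anisotropic 3-D space-time filter:
    here only the temporal correlation is exploited."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[inv_anscombe(sum(anscombe(frames[t][i][j]) for t in range(n)) / n)
             for j in range(w)] for i in range(h)]
```

The real method instead tunes the spatial and temporal smoothing separately, which matters for FLIP sequences where intensity decays over time.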
Abstract:
This paper presents a methodology for distribution network reconfiguration in the presence of outages, in order to choose the reconfiguration with the lowest power losses. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling for system component outage parameters. Fuzzy membership functions of system component outage parameters are obtained from statistical records. A hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a logical programming algorithm is applied to obtain all possible reconfigurations for every system state. In order to evaluate the line flows and bus voltages and to identify any overloading and/or voltage violation, a distribution power flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study that considers a real distribution network.
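The Monte Carlo state-sampling step can be sketched as follows. This toy version draws crisp outage probabilities only, ignoring the fuzzy membership functions of the full hybrid method, and all names are illustrative:

```python
import random

def monte_carlo_unavailability(outage_probs, n_draws=20000, seed=1):
    """Fraction of sampled system states with at least one component out.

    Each draw is one 'system state'; the full methodology would then run
    the reconfiguration and power-flow analysis on every sampled state."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_draws)
               if any(rng.random() < p for p in outage_probs))
    return hits / n_draws
```

For independent components the estimate converges to 1 - prod(1 - p_i), e.g. 0.28 for outage probabilities 0.1 and 0.2.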
Abstract:
Bread is consumed worldwide, thus contributing to the regular ingestion of certain inorganic species such as chloride. Chloride affects blood pressure when associated with sodium intake and may increase the incidence of stomach ulcers. Its routine control should thus be established by means of quick and low-cost procedures. This work reports a double-channel flow injection analysis (FIA) system with a new chloride sensor for the analysis of bread. All solutions are prepared in water and the necessary ionic strength adjustments are made on-line. The body of the indicating electrode is made from a silver needle of 0.8 mm i.d. with an external layer of silver chloride. These devices were constructed with different lengths; electrodes of 1.0 to 3.0 cm presented better analytical performance. The calibration curves under optimum conditions displayed Nernstian behaviour, with average slopes of 56 mV per decade and sampling rates of 60 samples h-1. The method was applied to analyze several kinds of bread, namely pão de trigo, pão integral, pão de centeio, pão de mistura, broa de milho, pão sem sal, pão meio sal, pão-de-leite, and pão de água. The accuracy and precision of the potentiometric method were ascertained by comparison with a spectrophotometric method of continuous segmented flow. Both methods were validated against ion-chromatography procedures.
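Checking Nernstian behaviour amounts to regressing the measured EMF against the logarithm of concentration (or, more rigorously, activity) and comparing the slope with theory. A minimal sketch; function name and calibration data are assumed for illustration:

```python
import math

def nernstian_slope(concs, emfs):
    """Least-squares slope of EMF (mV) versus log10(concentration).

    The theoretical Nernstian magnitude is about 59.2 mV per decade for a
    monovalent ion at 25 degC (negative sign for an anion such as chloride);
    the electrodes in this work averaged 56 mV per decade."""
    xs = [math.log10(c) for c in concs]
    n = float(len(xs))
    mx, my = sum(xs) / n, sum(emfs) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, emfs))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A slope close to the theoretical value over several decades is what qualifies the calibration as Nernstian.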
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space, providing an efficient method of neighbour modelling. The genetic algorithm produces special patterns resembling those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and a Fourier-domain description from the perspective of fractional calculus. The results reveal a consistent, and easy to interpret, set of model parameters for distinct operating conditions.
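Hexagonal tessellation makes neighbour modelling cheap because every cell has exactly six equidistant neighbours, with no corner-versus-edge distinction as in square grids. In axial coordinates (an assumed representation, not stated in the abstract) this is:

```python
# axial-coordinate offsets to the six neighbours of a hexagonal cell
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_neighbours(q, r):
    """Six equidistant neighbours of cell (q, r) in axial coordinates."""
    return [(q + dq, r + dr) for dq, dr in HEX_DIRS]
```

A fitness evaluation over the grid then only needs one uniform six-way loop per cell.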
Abstract:
OBJECTIVE To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS In 2010, an evaluation was performed of the results of investigating the causes of death classified as IDCD, in accordance with chapter 18 of the International Classification of Diseases (ICD-10), by the Mortality Information System. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10 except chapter 18, and were used to redistribute the remaining, uninvestigated ill-defined causes by sex and age. The IDCD redistribution coefficient was compared with two usual redistribution methods: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS Of the 97,314 deaths from ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes had a higher representation among the reclassified ill-defined causes, unlike infectious diseases, neoplasms, and genitourinary diseases, which had higher proportions among the defined causes reported. External causes represented 9.3% of the reclassified ill-defined causes. Correcting mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, whereas the IDCD redistribution coefficient corrected the different causes of death with differentiated weights. CONCLUSIONS The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes.
Therefore, the redistribution of the remaining ill-defined causes based on the investigation allows for more appropriate estimates of the mortality risk due to specific causes.
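The proposed coefficient amounts to a proportional redistribution computed from the investigated, reclassified deaths rather than from all originally defined causes. A minimal sketch with hypothetical counts:

```python
def redistribution_coefficients(reclassified_counts):
    """Share of investigated ill-defined deaths reassigned to each cause."""
    total = sum(reclassified_counts.values())
    return {cause: n / total for cause, n in reclassified_counts.items()}

def redistribute(remaining_idcd, coeffs, observed_counts):
    """Add the remaining (uninvestigated) ill-defined deaths to each
    cause in proportion to its redistribution coefficient."""
    return {cause: observed_counts.get(cause, 0) + remaining_idcd * c
            for cause, c in coeffs.items()}
```

The two usual methods differ only in which counts feed `redistribution_coefficients`: all defined causes (total coefficient) or all defined causes minus external ones (non-external coefficient).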
Abstract:
This paper introduces a new unsupervised hyperspectral unmixing method conceived for linear but highly mixed hyperspectral data sets, in which the simplex of minimum volume, usually estimated by purely geometrically based algorithms, is far away from the true simplex associated with the endmembers. The proposed method, an extension of our previous studies, resorts to a statistical framework. The abundance fraction prior is a mixture of Dirichlet densities, thus automatically enforcing the constraints on the abundance fractions imposed by the acquisition process, namely, nonnegativity and sum-to-one. A cyclic minimization algorithm is developed in which: 1) the number of Dirichlet modes is inferred based on the minimum description length principle; 2) a generalized expectation maximization algorithm is derived to infer the model parameters; and 3) a sequence of augmented Lagrangian-based optimizations is used to compute the signatures of the endmembers. Experiments on simulated and real data are presented to show the effectiveness of the proposed algorithm in unmixing problems beyond the reach of the geometrically based state-of-the-art competitors.
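The Dirichlet prior is what enforces the abundance constraints automatically: any draw is nonnegative and sums to one by construction. A minimal sampling sketch using the standard Gamma construction (illustrative, not the paper's inference machinery):

```python
import random

def dirichlet_sample(alphas, rng):
    """Draw abundance fractions from a Dirichlet distribution.

    Standard construction: normalize independent Gamma(alpha_i, 1) draws;
    the result is nonnegative and sums to one by design."""
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]
```

A mixture of such Dirichlet densities, as used in the paper, can additionally capture multi-modal abundance distributions.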
Abstract:
The paper presents an automated RFDSCA synthesis procedure. The algorithm determines several RFDSCA circuits from the top-level system specifications, all with the same maximum performance. The genetic synthesis tool optimizes a fitness function proportional to the RFDSCA quality factor and uses the ε-concept and maximin sorting scheme to achieve a set of solutions well distributed along a non-dominated front. To confirm the results of the algorithm, three RFDSCAs were simulated in SpectreRF and one of them was implemented and tested. The design used a 0.25 μm BiCMOS process. All the results (synthesized, simulated and measured) are very close, which indicates that the genetic synthesis method is a very useful tool for designing optimum-performance RFDSCAs.
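The non-dominated front that the maximin sorting scheme spreads solutions along can be illustrated with a straightforward Pareto filter. This is a generic sketch for maximized objectives, not the epsilon/maximin machinery itself:

```python
def dominates(a, b):
    """a dominates b: no worse in every objective and strictly better
    in at least one (maximization convention)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated(points):
    """Keep only the points on the non-dominated (Pareto) front."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Maximin-style sorting then ranks front members by how well spaced they are, which is what yields a well-distributed set of candidate circuits.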