970 results for Computational methods


Relevância:

20.00%

Publicador:

Resumo:

Introduction Bioelectrical impedance analysis (BIA) is a useful field measure to estimate total body water (TBW). No prediction formulae have been developed or validated against a reference method in patients with pancreatic cancer. The aim of this study was to assess the agreement between three prediction equations for the estimation of TBW in cachectic patients with pancreatic cancer. Methods Resistance was measured at frequencies of 50 and 200 kHz in 18 outpatients (10 males and 8 females, age 70.2 +/- 11.8 years) with pancreatic cancer from two tertiary Australian hospitals. Three published prediction formulae were used to calculate TBW: TBWs, developed in surgical patients, and TBWca-uw and TBWca-nw, developed in underweight and normal-weight patients with end-stage cancer, respectively. Results There was no significant difference in the TBW estimated by the three prediction equations: TBWs 32.9 +/- 8.3 L, TBWca-nw 36.3 +/- 7.4 L, TBWca-uw 34.6 +/- 7.6 L. At a population level, there is agreement between the predictions of TBW in patients with pancreatic cancer estimated from the three equations. The best combination of low bias and narrow limits of agreement was observed when TBW was estimated from the equation developed in the underweight cancer patients relative to the normal-weight cancer patients. When no established BIA prediction equation exists, practitioners should use an equation developed in a population with similar critical characteristics such as diagnosis, weight loss, body mass index and/or age. Conclusions Further research is required to determine the accuracy of the BIA prediction technique against a reference method in patients with pancreatic cancer.
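As context for how such BIA prediction equations are typically structured, here is a minimal sketch of the common impedance-index form TBW = a + b·(height²/R) + c·weight. The coefficients, and the form itself, are illustrative assumptions, not the published TBWs or TBWca equations from the abstract:

```python
# Minimal sketch of a BIA-style TBW prediction (hypothetical coefficients,
# NOT the published TBWs / TBWca-uw / TBWca-nw equations).
def predict_tbw(height_cm, resistance_ohm, weight_kg,
                a=1.0, b=0.45, c=0.12):
    """Impedance-index form: TBW = a + b*(height^2/R) + c*weight (litres)."""
    impedance_index = height_cm ** 2 / resistance_ohm  # cm^2 per ohm
    return a + b * impedance_index + c * weight_kg

tbw = predict_tbw(height_cm=170, resistance_ohm=500, weight_kg=60)
```

Published equations differ mainly in which extra covariates (sex, age, frequency) enter and in the fitted coefficients, which is why agreement must be checked population by population.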


Numerical modeling of the eddy currents induced in the human body by the pulsed field gradients in MRI presents a difficult computational problem. It requires an efficient and accurate computational method for high spatial resolution analyses with a relatively low input frequency. In this article, a new technique is described which allows the finite difference time domain (FDTD) method to be efficiently applied over a very large frequency range, including low frequencies. This is not the case in conventional FDTD-based methods. A method of implementing streamline gradients in FDTD is presented, as well as comparative analyses which show that the correct source injection in the FDTD simulation plays a crucial role in obtaining accurate solutions. In particular, making use of the derivative of the input source waveform is shown to provide distinct benefits in accuracy over direct source injection. In the method, no alterations to the properties of either the source or the transmission media are required. The method is essentially frequency independent and the source injection method has been verified against examples with analytical solutions. Results are presented showing the spatial distribution of gradient-induced electric fields and eddy currents in a complete body model.
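The source-injection idea can be illustrated with a minimal 1-D FDTD loop that drives the grid with the derivative of a Gaussian waveform, as the abstract advocates. The grid size, Courant factor, pulse parameters and injection point are arbitrary illustrative choices, not the article's body-model setup:

```python
import numpy as np

# Minimal 1-D FDTD sketch (normalized units). The soft source injects the
# DERIVATIVE of a Gaussian pulse, the approach the abstract reports as more
# accurate than direct source injection. All parameters are illustrative.
nz, nt = 200, 400
ez = np.zeros(nz)           # electric field on integer grid points
hy = np.zeros(nz - 1)       # magnetic field on the staggered half-grid
src, t0, spread = nz // 2, 60.0, 15.0
S = 0.5                     # Courant factor (< 1 for stability in 1-D)

for n in range(nt):
    hy += S * (ez[1:] - ez[:-1])           # update H from the curl of E
    ez[1:-1] += S * (hy[1:] - hy[:-1])     # update E from the curl of H
    t = n - t0
    ez[src] += -t / spread * np.exp(-t**2 / (2 * spread**2))  # d/dt Gaussian
```

Injecting the derivative keeps the propagated field free of the spurious DC offset that a raw Gaussian soft source can leave on the grid.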


Motivation: A major issue in cell biology today is how distinct intracellular regions of the cell, like the Golgi apparatus, maintain their unique composition of proteins and lipids. The cell differentially separates Golgi resident proteins from proteins that move through the organelle to other subcellular destinations. We set out to determine whether we could distinguish these two types of transmembrane proteins using computational approaches. Results: A new method has been developed to predict Golgi membrane proteins based on their transmembrane domains. To establish the prediction procedure, we took the hydrophobicity values and frequencies of different residues within the transmembrane domains into consideration. A simple linear discriminant function was developed with a small number of parameters derived from a dataset of Type II transmembrane proteins of known localization. It discriminates between proteins destined for the Golgi apparatus and other locations (post-Golgi) with success rates of 89.3% and 85.2%, respectively, on our redundancy-reduced data sets.
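A toy version of such a discriminant, scoring a transmembrane domain (TMD) by its mean Kyte-Doolittle hydrophobicity and its length, might look as follows. The weight vector and bias are invented for illustration, not the published parameters:

```python
# Toy linear discriminant on transmembrane-domain features. The Kyte-Doolittle
# hydropathy scale is real; the weights and bias are hypothetical stand-ins
# for the fitted parameters of the published function.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def discriminant(tmd, w_hydro=0.9, w_len=-0.15, bias=1.0):
    """Score > 0 -> call 'Golgi', else 'post-Golgi' (illustrative only)."""
    mean_hydro = sum(KD[aa] for aa in tmd) / len(tmd)
    return w_hydro * mean_hydro + w_len * len(tmd) + bias

call = 'Golgi' if discriminant('LLVVFLAGLVGLSLLLL') > 0 else 'post-Golgi'
```

Length matters here because Golgi TMDs tend to be shorter than plasma-membrane TMDs, which is one of the features such classifiers exploit.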


We describe remarkable success in controlling the dengue vectors Aedes aegypti (L.) and Aedes albopictus (Skuse) in 6 communes with 11,675 households and 49,647 people in the northern provinces of Haiphong, Hung Yen, and Nam Dinh in Vietnam. The communes were selected for high-frequency use of large outdoor concrete tanks and wells. These containers were found to be the source of 49.6-98.4% of Ae. aegypti larvae and were amenable to treatment with local Mesocyclops, mainly M. woutersi Van der Velde, M. aspericornis (Daday) and M. thermocyclopoides Harada. Knowledge, attitude, and practice surveys were performed to determine whether the communities viewed dengue and dengue hemorrhagic fever as a serious health threat; to determine their knowledge of the etiology and their attitudes and practices regarding control methods, including Mesocyclops; and to determine their receptivity to various information methods. On the basis of the knowledge, attitude, and practice data, the community-based dengue control program comprised a system of local leaders, health volunteer teachers, and schoolchildren, supported by health professionals. Recycling of discards for economic gain was enhanced where appropriate, and this, plus 37 clean-up campaigns, removed small containers unsuitable for Mesocyclops treatment. A previously successful eradication at Phan Boi village (Hung Yen province) was extended to 7 other villages forming Di Su commune (1,750 households) in the current study. Complete control was also achieved in Nghia Hiep (Hung Yen province) and in Xuan Phong (Nam Dinh province); control efficacy was greater than or equal to 99.7% in the other 3 communes (Lac Vien in Haiphong, Nghia Dong, and Xuan Kien in Nam Dinh). Although tanks and wells were the key container types for Ae. aegypti productivity, discarded materials were the source of 51% of the standing crop of Ae. albopictus.
Aedes albopictus larvae were eliminated from the 3 Nam Dinh communes, and 86-98% control was achieved in the other 3 communes. Variable dengue attack rates made the clinical and serological comparison of control and untreated communes problematic, but these data indicate that clinical surveillance by itself is inadequate to monitor dengue transmission.


A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations.
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
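The flavour of the SDP approach can be sketched with backward induction on a toy metapopulation whose state is simply the number of occupied patches. The actions, their effects on colonization, and the transition probabilities below are all hypothetical, and the real model's state space (patch areas, corridors, occupancy patterns) is far richer:

```python
import numpy as np

# Toy stochastic dynamic program: pick, at each of 30 time steps, the action
# minimizing eventual extinction probability. States = occupied-patch count;
# actions boost the per-step recolonization probability. Numbers are invented.
states = [0, 1, 2, 3]
actions = {'none': 0.00, 'enlarge': 0.05, 'corridor': 0.08}

def step_probs(s, boost):
    """Toy birth-death transition kernel over the occupied-patch count."""
    p_up = min(1.0, 0.2 + boost) if 0 < s < 3 else 0.0
    p_down = 0.15 if s > 0 else 0.0
    return {s - 1: p_down, s: 1.0 - p_up - p_down, s + 1: p_up}

V = np.array([1.0, 0.0, 0.0, 0.0])   # V[s] = extinction prob; state 0 absorbing
policy = {}
for t in range(30):                   # backward induction over the horizon
    newV = V.copy()
    for s in states[1:]:
        vals = {a: sum(p * V[s2] for s2, p in step_probs(s, b).items() if p > 0)
                for a, b in actions.items()}
        best = min(vals, key=vals.get)
        policy[s], newV[s] = best, vals[best]
    V = newV
```

Even in this toy, the optimal action is state-dependent (boosting colonization is pointless in the fully occupied state), which mirrors the abstract's finding that no single strategy is universally best.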


We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
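A fully parametric stand-in can illustrate how the logistic and proportional-hazards pieces combine into one likelihood: the failure-type probability follows a logistic model in a covariate, and each component density is a proportional-hazards-scaled exponential (replacing the unspecified baseline hazards of the semi-parametric method). All parameter values are hypothetical:

```python
import math

# Parametric illustration of the mixture competing-risks likelihood:
#   P(type 1 | x)      = logistic(b0 + b1*x)
#   hazard_j(t | x)    = lam_j * exp(beta_j * x)   (exponential baseline)
# Exponential baselines and all numbers below are illustrative assumptions.
def log_lik(data, b0, b1, lam1, beta1, lam2, beta2):
    ll = 0.0
    for t, cause, x in data:            # (failure time, failure type 1/2, covariate)
        p1 = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))        # logistic mixing prob
        h1 = lam1 * math.exp(beta1 * x)                    # PH-scaled hazards
        h2 = lam2 * math.exp(beta2 * x)
        f1 = h1 * math.exp(-h1 * t)                        # component densities
        f2 = h2 * math.exp(-h2 * t)
        ll += math.log(p1 * f1 if cause == 1 else (1 - p1) * f2)
    return ll

sample = [(1.2, 1, 0.0), (0.7, 2, 1.0), (2.5, 1, 1.0)]     # toy uncensored data
ll = log_lik(sample, b0=0.0, b1=0.5, lam1=0.8, beta1=0.3, lam2=1.1, beta2=-0.2)
```

The ECM algorithm of the paper maximizes a likelihood of this shape while leaving the baseline hazards as unspecified step functions rather than exponentials.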


There has been a resurgence of interest in the mean trace length estimator of Pahl for window sampling of traces. The estimator has been dealt with by Mauldon and by Zhang and Einstein in recent publications. The estimator is a very useful one in that it is non-parametric. However, despite some discussion regarding the statistical distribution of the estimator, neither the recent works nor the original work by Pahl provide a rigorous basis for the determination of a confidence interval for the estimator, or of a confidence region for the estimator and the corresponding estimator of trace spatial intensity in the sampling window. This paper shows, by consideration of a simplified version of the problem but without loss of generality, that the estimator is in fact the maximum likelihood estimator (MLE) and that it can be considered essentially unbiased. As the MLE, it possesses the least variance of all estimators, and confidence intervals or regions should therefore be available through application of classical ML theory. It is shown that valid confidence intervals can in fact be determined. The results of the work and the calculations of the confidence intervals are illustrated by example. (C) 2003 Elsevier Science Ltd. All rights reserved.


A number of authors concerned with the analysis of rock jointing have used the idea that the joint areal or diametral distribution can be linked to the trace length distribution through a theorem attributed to Crofton. This brief paper seeks to demonstrate why Crofton's theorem need not be used to link moments of the trace length distribution captured by scan line or areal mapping to the moments of the diametral distribution of joints represented as disks and that it is incorrect to do so. The valid relationships for areal or scan line mapping between all the moments of the trace length distribution and those of the joint size distribution for joints modeled as disks are recalled and compared with those that might be applied were Crofton's theorem assumed to apply. For areal mapping, the relationship is fortuitously correct but incorrect for scan line mapping.


There are several competing methods commonly used to solve energy-grained master equations describing gas-phase reactive systems. When it comes to selecting an appropriate method for any particular problem, there is little guidance in the literature. In this paper we directly compare several variants of spectral and numerical integration methods from the point of view of the computer time required to calculate the solution and the range of temperature and pressure conditions under which the methods are successful. The test case used in the comparison is an important reaction in combustion chemistry and incorporates reversible and irreversible bimolecular reaction steps as well as isomerizations between multiple unimolecular species. While the numerical integration of the ODE with a stiff ODE integrator is not the fastest method overall, it is the fastest method applicable to all conditions.
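The stiffness that drives this comparison can be illustrated on a toy three-level rate system dp/dt = K p with widely separated rates, integrated with a backward Euler step of the implicit, stiff-stable kind a stiff ODE package employs internally. The rate matrix is invented and much smaller than a real energy-grained problem:

```python
import numpy as np

# Toy "master equation" dp/dt = K p with rates spanning ~5 orders of magnitude
# (the source of stiffness). Columns of K sum to zero, so total population is
# conserved. Backward Euler: (I - dt*K) p_new = p_old, stable at large dt.
K = np.array([[-1.0e3,  1.0e1,  0.0    ],
              [ 1.0e3, -1.1e1,  5.0e-2 ],
              [ 0.0,    1.0,   -5.0e-2 ]])

p = np.array([1.0, 0.0, 0.0])       # all population starts in level 0
dt, nsteps = 0.1, 100               # dt far exceeds the fastest timescale 1e-3
A = np.eye(3) - dt * K
for _ in range(nsteps):
    p = np.linalg.solve(A, p)       # implicit step: solve rather than multiply
```

An explicit scheme at this dt would blow up immediately (it needs dt below the reciprocal of the fastest rate), which is why stiff integrators, although not the cheapest per step, work across all temperature and pressure regimes.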


The aim of this study was to compare accumulated oxygen deficit data derived using two different exercise protocols, with the aim of producing a less time-consuming test specifically for use with athletes. Six road and four track male endurance cyclists performed two series of cycle ergometer tests. The first series involved five 10 min sub-maximal cycle exercise bouts, a VO2peak test and a 115% VO2peak test. Data from these tests were used to estimate the accumulated oxygen deficit according to the calculations of Medbo et al. (1988). In the second series of tests, participants performed a 15 min incremental cycle ergometer test followed, 2 min later, by a 2 min variable resistance test in which they completed as much work as possible while pedalling at a constant rate. Analysis revealed that the accumulated oxygen deficit calculated from the first series of tests was higher (P < 0.02) than that calculated from the second series: 52.3 +/- 11.7 and 43.9 +/- 6.4 ml·kg⁻¹, respectively (mean +/- s). Other significant differences between the two protocols were observed for VO2peak, total work and maximal heart rate; all were higher during the modified protocol (P
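The Medbo et al. (1988) calculation follows this pattern: regress steady-state VO2 on power over the sub-maximal bouts, extrapolate the regression to the supramaximal power to get the O2 demand, and subtract the measured accumulated uptake. The numbers below are invented for illustration:

```python
import numpy as np

# Sketch of the accumulated oxygen deficit (AOD) calculation in the style of
# Medbo et al. (1988). All measurements are invented illustration values.
power = np.array([100, 150, 200, 250, 300])     # W, five sub-maximal bouts
vo2   = np.array([1.5, 2.0, 2.5, 3.0, 3.5])     # L/min, steady-state uptake
slope, intercept = np.polyfit(power, vo2, 1)    # linear O2 cost of power

supra_power  = 400                              # W, supramaximal (115% VO2peak) bout
duration_min = 2.5                              # min, time to exhaustion
o2_demand    = (slope * supra_power + intercept) * duration_min   # L, extrapolated
o2_measured  = 8.0                              # L, accumulated uptake during bout
aod          = o2_demand - o2_measured          # L, the oxygen deficit
```

Dividing `aod` by body mass gives the ml·kg⁻¹ figures quoted in the abstract; the two protocols differ mainly in how the O2 demand line is established.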


The natural variability of soils makes knowledge of their properties complex in the development of geotechnical designs, and the undrained shear strength is an important parameter in stability analyses of soft soils. Non-conventional laboratory fall-cone and vane tests, field vane and piezocone tests, and unconfined compression and unconsolidated-undrained triaxial tests were used to measure the undrained strength of a soft marine clay layer located on the central Brazilian coastal plain. The laboratory tests were performed on undisturbed samples collected with stationary-piston samplers in a borehole close to the field tests. The site was investigated preliminarily by standard penetration soundings, and the stratigraphic profile is presented by means of computational modeling. Tests were also performed for physical characterization (grain-size analysis, moisture content, liquid and plastic limits, specific gravity of the grains) and mineralogical characterization (X-ray diffraction), as well as consolidation tests to obtain the stress history and to classify the quality of the undisturbed samples. The undrained strength values obtained from the laboratory tests were compared with the continuous strength profile determined empirically from the piezocone test, with the cone factor Nkt calibrated against the field vane test, and showed good agreement, the natural variability of the soil having a predominant influence, through sample quality, on the variation between results. The strength values obtained from the laboratory fall-cone and vane tests were compared with each other and showed good compatibility. Both, however, showed poor agreement when compared with the field vane test.
The strength results obtained from the unconfined compression and triaxial tests showed good compatibility with the laboratory fall-cone results, which was not the case for the laboratory vane results. Comparing the strength normalized by the overconsolidation stress obtained by the various methods with some empirical correlations from the international literature, good agreement with the correlations of Mesri (1975) and Jamiolkowski et al. (1985) was observed for soil samples with a plasticity index above 60%. The non-conventional tests proved reliable, which, together with their simplicity and speed of execution, justifies their adoption in Brazilian geotechnical investigation practice as an alternative method to complement and support estimates of the undrained strength of soft soils.
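The cone-factor calibration amounts to back-calculating Nkt so that the piezocone strength profile su = (qt − σv0)/Nkt reproduces the field vane strength at the same depth, then applying that Nkt elsewhere in the layer. The stresses below are illustrative values, not the study's data:

```python
# Sketch of piezocone undrained-strength estimation with a vane-calibrated
# cone factor Nkt: su = (qt - sigma_v0) / Nkt. All numbers are illustrative.
def su_piezocone(qt_kpa, sigma_v0_kpa, nkt):
    """Undrained strength (kPa) from corrected cone resistance qt."""
    return (qt_kpa - sigma_v0_kpa) / nkt

def calibrate_nkt(qt_kpa, sigma_v0_kpa, su_vane_kpa):
    """Back-calculate Nkt from a field vane strength at the same depth."""
    return (qt_kpa - sigma_v0_kpa) / su_vane_kpa

nkt = calibrate_nkt(qt_kpa=350.0, sigma_v0_kpa=80.0, su_vane_kpa=18.0)
su = su_piezocone(qt_kpa=420.0, sigma_v0_kpa=95.0, nkt=nkt)  # deeper point
```

In practice Nkt is fitted over many depths (typical soft-clay values fall roughly between 10 and 20), and its scatter is one way the soil's natural variability shows up in the comparison.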


A numerical comparison is performed between three methods of third order with the same structure, namely BSC, Halley’s and Euler–Chebyshev’s methods. As the behavior of an iterative method applied to a nonlinear equation can be highly sensitive to the starting points, the numerical comparison is carried out, allowing for complex starting points and for complex roots, on the basins of attraction in the complex plane. Several examples of algebraic and transcendental equations are presented.
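A basin-of-attraction computation of the kind described can be sketched for Halley's method on f(z) = z³ − 1 over a grid of complex starting points; each start is labelled by the cube root of unity it converges to. Grid resolution and iteration caps are arbitrary choices:

```python
import numpy as np

# Basins of attraction for Halley's third-order method on f(z) = z^3 - 1,
# iterated from a grid of complex starting points. Grid size is illustrative.
def halley_root(z, iters=50, tol=1e-10):
    for _ in range(iters):
        f, fp, fpp = z**3 - 1, 3 * z**2, 6 * z
        denom = 2 * fp**2 - f * fpp
        if abs(denom) < 1e-30:         # degenerate step (e.g. near z = 0)
            break
        z = z - 2 * f * fp / denom     # Halley update
        if abs(z**3 - 1) < tol:
            break
    return z

roots = np.exp(2j * np.pi * np.arange(3) / 3)        # cube roots of unity
grid = [complex(x, y) for x in np.linspace(-2, 2, 9)
                      for y in np.linspace(-2, 2, 9)]
basin = [int(np.argmin([abs(halley_root(z) - r) for r in roots]))
         for z in grid if abs(z) > 1e-9]             # skip the singular origin
```

Refining the grid and colouring each point by its basin index (and shading by iteration count) produces the basin plots used to compare BSC, Halley's and Euler-Chebyshev's methods.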


Abstract. Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Correctness of GUI code is therefore essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their most important aspects. This paper describes our approach to reverse engineering an abstract model of a user interface directly from the GUI's legacy code. We also present results from a case study. These results are encouraging and give evidence that the goal of reverse engineering user interfaces can be met with further work on this technique.


In plant breeding programs that aim to obtain cultivars with nitrogen (N) use efficiency, the focus is on selection methods and experimental procedures that present low cost, fast response and high repeatability, and that can be applied to a large number of cultivars. Thus, the objectives of this study were to classify maize cultivars regarding their use efficiency and response to N in a breeding program, and to validate the methodology with contrasting doses of the nutrient. The experimental design was a randomized block with the treatments arranged in a split-plot scheme, with three replicates, five N doses (0, 30, 60, 120 and 200 kg ha-1) in the plots, and six cultivars in the subplots. We compared a method examining efficiency and response (ER) at two contrasting doses of N; analysis of variance, mean comparisons and regression analysis were then performed. In conclusion, the use efficiency and response method based on two N levels classifies the cultivars in the same way as the regression analysis, and it is appropriate for plant breeding routine. It is therefore necessary to identify the levels of N required to discriminate maize cultivars under conditions of low and high N availability in plant breeding programs that aim to obtain efficient and responsive cultivars. Moreover, analysis of the genotype × environment interaction in experiments with contrasting doses is always required, even when the interaction is not significant.
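A common way to implement such a two-dose ER classification is to take yield at the low dose as efficiency and the per-kg yield gain between the two doses as response, then split cultivars at the respective means. The doses echo two of the study's levels, but the yields below are invented for illustration:

```python
# Sketch of a two-dose efficiency/response (ER) classification of cultivars.
# Yields (t/ha) are invented; the dose pair mirrors two of the study's levels.
low_n, high_n = 30, 200                      # kg ha-1, contrasting N doses
yields = {                                   # cultivar: (yield @ low, yield @ high)
    'A': (5.2, 8.9), 'B': (4.1, 8.5), 'C': (5.5, 7.0),
    'D': (3.8, 6.0), 'E': (5.0, 8.0), 'F': (4.2, 6.4),
}

eff = {c: y_lo for c, (y_lo, _) in yields.items()}                 # yield under low N
resp = {c: (y_hi - y_lo) / (high_n - low_n)                        # t/ha per kg N
        for c, (y_lo, y_hi) in yields.items()}
mean_eff = sum(eff.values()) / len(eff)
mean_resp = sum(resp.values()) / len(resp)

er_class = {c: ('efficient' if eff[c] >= mean_eff else 'inefficient') + '/'
               + ('responsive' if resp[c] >= mean_resp else 'non-responsive')
            for c in yields}
```

The resulting quadrants (efficient/responsive, efficient/non-responsive, and so on) are what the abstract reports as matching the full regression-based classification.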