5 results for Acceleration data structure

in Repositório da Produção Científica e Intelectual da Unicamp


Relevance: 80.00%

Abstract:

The validation of an analytical procedure must be certified through the determination of parameters known as figures of merit. For first-order data, accuracy, precision, robustness, and bias are determined in the same way as in univariate calibration methods. Linearity, sensitivity, signal-to-noise ratio, adjustment, selectivity, and confidence intervals require different approaches that are specific to multivariate data. Selectivity and signal-to-noise ratio are more critical and can only be estimated through calculation of the net analyte signal. In second-order calibration, some different approaches are necessary because of the data structure.
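Where the abstract refers to calculating the net analyte signal, a minimal sketch of the standard projection-based definition is given below, assuming a matrix of pure-component spectra is available; the function and variable names are illustrative and are not taken from the paper.

```python
import numpy as np

def net_analyte_signal(pure_spectra: np.ndarray, analyte_idx: int) -> np.ndarray:
    """Net analyte signal (NAS) of one analyte: the part of its pure spectrum
    orthogonal to the space spanned by the other (interfering) components."""
    r_k = pure_spectra[:, analyte_idx]                      # spectrum of the analyte of interest
    R_other = np.delete(pure_spectra, analyte_idx, axis=1)  # spectra of the interferents
    # Projector onto the orthogonal complement of the interferent space
    projector = np.eye(pure_spectra.shape[0]) - R_other @ np.linalg.pinv(R_other)
    return projector @ r_k

# Selectivity and signal-to-noise figures of merit can then be derived from
# the norm of the NAS vector relative to the norm of the full spectrum.
```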

Relevance: 30.00%

Abstract:

In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits, depending on the quantification assay. A further complication arises when these continuous repeated measures exhibit heavy-tailed behavior. For such data structures, we propose a robust censored linear model based on the multivariate Student's t-distribution. To account for the autocorrelation among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation-maximization-type algorithm is developed for computing the maximum likelihood estimates, yielding as by-products the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus-AIDS (HIV-AIDS) study and several simulation studies.
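The damped exponential correlation structure mentioned above is commonly parameterised as Corr(e_ij, e_ik) = φ1^(|t_ij − t_ik|^φ2); the short sketch below builds that matrix for one subject's irregular observation times. The function name and parameter values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dec_correlation(times: np.ndarray, phi1: float, phi2: float) -> np.ndarray:
    """Damped exponential correlation (DEC) matrix for irregularly spaced times:
    Corr(e_i, e_j) = phi1 ** (|t_i - t_j| ** phi2), with 0 < phi1 < 1 and phi2 >= 0.
    phi2 = 1 recovers a continuous-time AR(1) structure."""
    lags = np.abs(times[:, None] - times[None, :])
    corr = phi1 ** (lags ** phi2)
    np.fill_diagonal(corr, 1.0)  # zero lag always has correlation 1
    return corr

# Example: correlation matrix for viral load measured at irregular visit times
visit_times = np.array([0.0, 1.5, 4.0, 9.0])
R = dec_correlation(visit_times, phi1=0.6, phi2=0.8)
```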

Relevance: 30.00%

Abstract:

Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and in the methodologies selected. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28–220 keV were superimposed on a geometric DNA model composed of 2.7 × 10⁶ nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy ETH. The SB frequencies and complexities in nucleosomes as a function of incident electron energy were obtained. SBs were classified into higher-order clusters, such as single- and double-strand breaks (SSBs and DSBs), based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy-transfer-based SB definitions give SB yields similar to the energy-deposit-based definition when ETH ≈ 10.79 eV, but deviate significantly for higher ETH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes presenting a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and on their implementations. The authors show that, for the four studied models, yields differ by up to 54% for SSBs and by up to 32% for DSBs, depending on the incident electron energy and on the models being compared. MCTS simulations make it possible to compare direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms, such as the DNA model, dose distribution, SB definition, and DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or other calculations.
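The grouping of strand breaks into SSBs and DSBs from inter-SB distances and affected strands can be sketched roughly as follows; the 10-bp separation threshold, class names, and data layout are illustrative assumptions, not the specific definitions compared in the paper.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class StrandBreak:
    bp_index: int   # position along the DNA, in base pairs
    strand: int     # 0 or 1

def classify_breaks(breaks: List[StrandBreak],
                    max_bp_separation: int = 10) -> Tuple[list, list]:
    """Group strand breaks whose positions lie within max_bp_separation base
    pairs of each other, then label each cluster as a DSB if both strands are
    affected, otherwise as an SSB."""
    breaks = sorted(breaks, key=lambda b: b.bp_index)
    clusters, current = [], []
    for b in breaks:
        if current and b.bp_index - current[-1].bp_index > max_bp_separation:
            clusters.append(current)
            current = []
        current.append(b)
    if current:
        clusters.append(current)
    labels = ["DSB" if len({b.strand for b in cluster}) == 2 else "SSB"
              for cluster in clusters]
    return clusters, labels

# Example: two nearby breaks on opposite strands plus one isolated break
breaks = [StrandBreak(100, 0), StrandBreak(104, 1), StrandBreak(500, 0)]
_, labels = classify_breaks(breaks)   # -> ["DSB", "SSB"]
```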

Relevance: 30.00%

Abstract:

The p23 protein is a chaperone widely involved in protein homeostasis, best known as an Hsp90 co-chaperone, since it also controls the Hsp90 chaperone cycle. Human p23 comprises a β-sheet domain, responsible for interacting with Hsp90, and a charged C-terminal region whose function is unclear but which appears to be natively unfolded. p23 can undergo caspase-dependent proteolytic cleavage to form p19 (p23₁₋₁₄₂), which is involved in apoptosis, whereas p23 has anti-apoptotic activity. To better elucidate the function of the human p23 C-terminal region, we comparatively studied full-length human p23 and three C-terminal truncation mutants: p23₁₋₁₁₇, p23₁₋₁₃₁, and p23₁₋₁₄₂. Our data indicate that p23 and p19 have distinct characteristics, whereas the other two truncations behave similarly, with some differences from p23 and p19. We found that part of the C-terminal region can fold into an α-helix conformation and contributes slightly to p23 thermal stability, suggesting that the C-terminal region interacts with the β-sheet domain. As a whole, our results suggest that the C-terminal region of p23 is critical for its structure-function relationship. A mechanism in which the human p23 C-terminal region behaves as an activation/inhibition module for different p23 activities is proposed.

Relevance: 30.00%

Abstract:

Size distributions in woody plant populations have been used to assess their regeneration status, under the assumption that size structures with reverse-J shapes represent stable populations. We present an empirical evaluation of this assumption using five woody species from the Cerrado. Using count data for all plants of these five species over a 12-year period, we analyzed size distribution by (a) plotting frequency distributions and fitting them to a negative exponential curve and (b) calculating the Gini coefficient. To look for a relationship between size structure and future trends, we considered the size structures from the first census year. We then analyzed changes in numbers over time and performed a simple population viability analysis, which gives the mean population growth rate, its variance, and the probability of extinction within a given time period. Neither the frequency distributions nor the Gini coefficient was able to predict future trends in population numbers. We therefore recommend that managers should not use measures of size structure as a basis for management decisions without carrying out more appropriate demographic studies.
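A minimal sketch of the Gini coefficient computation for a vector of plant sizes, using the standard ordered-values formula; the example size vector is invented for illustration and is not the authors' data.

```python
import numpy as np

def gini_coefficient(sizes: np.ndarray) -> float:
    """Gini coefficient of size inequality: 0 when all individuals have the same
    size, approaching 1 when a few large individuals dominate the distribution."""
    x = np.sort(np.asarray(sizes, dtype=float))
    n = x.size
    total = x.sum()
    # Standard formula based on the values sorted in ascending order
    return (2.0 * np.sum(np.arange(1, n + 1) * x) - (n + 1) * total) / (n * total)

# Example: a reverse-J shaped population (many small, few large stems)
sizes = np.array([1, 1, 1, 2, 2, 3, 5, 8, 13])
g = gini_coefficient(sizes)
```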