54 results for RANDOM-ENERGY-MODEL

from QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 100.00%

Abstract:

We present experimental results on benchmark problems in 3D cubic lattice structures with the Miyazawa-Jernigan energy function for two local search procedures that utilise the pull-move set: (i) population-based local search (PLS) that traverses the energy landscape with greedy steps towards (potential) local minima followed by upward steps up to a certain level of the objective function; (ii) simulated annealing with a logarithmic cooling schedule (LSA). The parameter settings for PLS are derived from short LSA runs executed in pre-processing, and the procedure utilises tabu lists generated for each member of the population. In terms of the total number of energy function evaluations both methods perform equally well; however, PLS has the potential of being parallelised with an expected speed-up in the region of the population size. Furthermore, both methods require a significantly smaller number of function evaluations when compared to Monte Carlo simulations with kink-jump moves. (C) 2009 Elsevier Ltd. All rights reserved.
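The logarithmic cooling schedule at the heart of LSA can be sketched generically. The toy sketch below uses Metropolis acceptance with T(k) = gamma / ln(k + 2); the 1-D objective `f` and move `step` are illustrative stand-ins for the lattice-protein energy and the pull-move neighbourhood, not the paper's setup:

```python
import math
import random

def logarithmic_sa(energy, neighbour, x0, steps=5000, gamma=1.0, seed=0):
    """Generic simulated annealing with the logarithmic cooling schedule
    T(k) = gamma / ln(k + 2) used by LSA-type procedures. `energy` and
    `neighbour` are problem-specific; in the paper's setting they would be
    the Miyazawa-Jernigan lattice energy and the pull-move set."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = gamma / math.log(k + 2)            # logarithmic cooling
        y = neighbour(x, rng)
        ey = energy(y)
        # Metropolis rule: accept downhill always, uphill with prob exp(-dE/T).
        if ey <= e or rng.random() < math.exp(-(ey - e) / t):
            x, e = y, ey
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Toy stand-in landscape: a 1-D multimodal function over the integers.
f = lambda x: (x - 7) ** 2 + 5 * math.sin(x)
step = lambda x, rng: x + rng.choice([-1, 1])
best_x, best_e = logarithmic_sa(f, step, x0=50)
```

Because the best-so-far state is tracked separately, the returned solution never degrades even when late uphill moves are accepted.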

Relevance: 90.00%

Abstract:

Many studies have shown that with increasing LET of ionizing radiation the RBE (relative biological effectiveness) for dsb (double strand break) induction remains around 1.0 despite the increase in the RBE for cell killing. This has been attributed to an increase in the complexity of lesions, classified as dsb with current techniques, at multiply damaged sites. This study determines the molecular weight distributions of DNA from Chinese hamster V79 cells irradiated with X-rays or 110 keV/μm alpha-particles. Two running conditions for pulsed-field gel electrophoresis were chosen to give optimal separation of fragments either in the 225 kbp-5.7 Mbp range or the 0.3 kbp-225 kbp range. Taking the total fraction of DNA migrating into the gel as a measure of fragmentation, the RBE for dsb induction was less than 1.0 for both molecular weight regions studied. The total yields of dsb were 8.2 × 10^-9 dsb/Gy/bp for X-rays and 7.8 × 10^-9 dsb/Gy/bp for alpha-particles, measured using a random breakage model. Analysis of the RBE of alpha-particles versus molecular weight gave a different response. In the 0.4 Mbp-5.7 Mbp region the RBE was less than 1.0; however, below 0.4 Mbp the RBE increased above 1.0. The frequency distributions of fragment sizes were found to differ from those predicted by a model assuming random breakage along the length of the DNA, and the differences were greater for alpha-particles than for X-rays. An excess of fragments induced by a single-hit mechanism was found in the 8-300 kbp region; for X-rays and alpha-particles these corresponded to an extra 0.8 × 10^-9 and 3.4 × 10^-9 dsb/bp/Gy, respectively. Thus for every alpha-particle track that induces a dsb there is a 44% probability of inducing a second break within 300 kbp, and for electron tracks the probability is 10%. This study shows that the distribution of damage from a high-LET alpha-particle track is significantly different from that observed with low-LET X-rays. In particular, it suggests that the fragmentation patterns of irradiated DNA may be related to the higher-order chromatin repeating structures found in intact cells.
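Under the random breakage model used above, breaks are distributed uniformly and independently, so the mass fraction of DNA in fragments shorter than s has a closed form that can be checked by simulation. A minimal sketch (the dose, genome length, and resulting break density below are illustrative assumptions, not the study's values):

```python
import math
import random

def mass_fraction_below(mu, s):
    """Analytic mass fraction of DNA residing in fragments shorter than s bp
    under random breakage with break density mu (breaks/bp):
    F(s) = 1 - (1 + mu*s) * exp(-mu*s)."""
    return 1.0 - (1.0 + mu * s) * math.exp(-mu * s)

def simulate_mass_fraction(mu, s, genome=50_000_000, seed=1):
    """Monte Carlo check: scatter breaks uniformly along one long molecule and
    measure the mass fraction carried by fragments shorter than s. With few
    breaks the estimate is noisy, as in a real low-dose experiment."""
    rng = random.Random(seed)
    cuts = sorted(rng.uniform(0, genome) for _ in range(int(mu * genome)))
    edges = [0.0] + cuts + [float(genome)]
    short = sum(b - a for a, b in zip(edges, edges[1:]) if b - a < s)
    return short / genome

# Illustrative numbers only: a yield of 8e-9 dsb/Gy/bp at a hypothetical 100 Gy.
mu = 8e-9 * 100
frac = simulate_mass_fraction(mu, s=225_000)
```

Deviations of measured fragment-size distributions from this F(s), as reported in the abstract, are exactly what signals non-random (clustered) breakage.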

Relevance: 90.00%

Abstract:

In studies of radiation-induced DNA fragmentation and repair, analytical models may provide rapid and easy-to-use methods to test simple hypotheses regarding the breakage and rejoining mechanisms involved. The random breakage model, according to which lesions are distributed uniformly and independently of each other along the DNA, has been the model most used to describe spatial distribution of radiation-induced DNA damage. Recently several mechanistic approaches have been proposed that model clustered damage to DNA. In general, such approaches focus on the study of initial radiation-induced DNA damage and repair, without considering the effects of additional (unwanted and unavoidable) fragmentation that may take place during the experimental procedures. While most approaches, including measurement of total DNA mass below a specified value, allow for the occurrence of background experimental damage by means of simple subtractive procedures, a more detailed analysis of DNA fragmentation necessitates a more accurate treatment. We have developed a new, relatively simple model of DNA breakage and the resulting rejoining kinetics of broken fragments. Initial radiation-induced DNA damage is simulated using a clustered breakage approach, with three free parameters: the number of independently located clusters, each containing several DNA double-strand breaks (DSBs), the average number of DSBs within a cluster (multiplicity of the cluster), and the maximum allowed radius within which DSBs belonging to the same cluster are distributed. Random breakage is simulated as a special case of the DSB clustering procedure. When the model is applied to the analysis of DNA fragmentation as measured with pulsed-field gel electrophoresis (PFGE), the hypothesis that DSBs in proximity rejoin at a different rate from that of sparse isolated breaks can be tested, since the kinetics of rejoining of fragments of varying size may be followed by means of computer simulations. 
The problem of how to account for background damage from experimental handling is also carefully considered. We have shown that the conventional procedure of subtracting the background damage from the experimental data may lead to erroneous conclusions during the analysis of both initial fragmentation and DSB rejoining. Despite its relative simplicity, the method presented allows both the quantitative and qualitative description of radiation-induced DNA fragmentation and subsequent rejoining of double-stranded DNA fragments. (C) 2004 by Radiation Research Society.
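The three-parameter clustered-breakage idea (number of clusters, mean multiplicity, maximum cluster radius) can be sketched as follows; the exponential multiplicity draw and uniform in-cluster placement are assumptions for illustration, not the authors' exact procedure:

```python
import random

def clustered_breaks(n_clusters, mean_multiplicity, radius, genome, seed=2):
    """Sketch of clustered DSB generation: each cluster centre is placed
    uniformly on the genome, and its DSBs fall within `radius` bp of it.
    The three parameters mirror the free parameters described in the
    abstract; the multiplicity draw below is an assumed stand-in."""
    rng = random.Random(seed)
    breaks = []
    for _ in range(n_clusters):
        centre = rng.uniform(0, genome)
        k = max(1, int(rng.expovariate(1.0 / mean_multiplicity)))  # assumed draw
        for _ in range(k):
            pos = min(max(centre + rng.uniform(-radius, radius), 0.0), genome)
            breaks.append(pos)
    return sorted(breaks)

def fragments(breaks, genome):
    """Fragment lengths between consecutive breaks (and the molecule ends)."""
    edges = [0.0] + breaks + [float(genome)]
    return [b - a for a, b in zip(edges, edges[1:])]

# Random breakage recovered as the special case: multiplicity 1, radius 0.
bks = clustered_breaks(200, 1, 0, genome=5_000_000)
frs = fragments(bks, 5_000_000)
```

Feeding the resulting fragment lists through a simulated PFGE size window, and rejoining nearby break ends at a separate rate, would reproduce the kind of kinetics analysis the abstract describes.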

Relevance: 90.00%

Abstract:

Multiuser diversity (MUDiv) is one of the central concepts in multiuser (MU) systems. In particular, MUDiv allows for scheduling among users in order to eliminate the negative effects of unfavorable channel fading conditions of some users on the system performance. Scheduling, however, consumes energy (e.g., for making users' channel state information available to the scheduler). This extra usage of energy, which could potentially be used for data transmission, can be very wasteful, especially if the number of users is large. In this paper, we answer the question of how much MUDiv is required for energy-limited MU systems. Focusing on uplink MU wireless systems, we develop MU scheduling algorithms which aim at maximizing the MUDiv gain. Toward this end, we introduce a new realistic energy model which accounts for scheduling energy and describes the distribution of the total energy between scheduling and data transmission stages. Using the fact that such energy distribution can be controlled by varying the number of active users, we optimize this number by either i) minimizing the overall system bit error rate (BER) for a fixed total energy of all users in the system or ii) minimizing the total energy of all users for fixed BER requirements. We find that for a fixed number of available users, the achievable MUDiv gain can be improved by activating only a subset of users. Using asymptotic analysis and numerical simulations, we show that our approach benefits from MUDiv gains higher than that achievable by the generic greedy access algorithm, which is the optimal scheduling method for energy-unlimited systems. © 2010 IEEE.
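The scheduling-versus-transmission energy trade-off can be illustrated with a deliberately simple toy model, not the paper's: polling more users buys more diversity but leaves less energy for the selected transmission, so an interior optimum for the number of active users appears.

```python
import math

def best_active_users(n_available, e_total, e_sched):
    """Toy model of the scheduling/data energy split: polling n users costs
    n * e_sched, leaving e_data = e_total - n * e_sched for the selected
    user's transmission. Picking the best of n i.i.d. unit-mean exponential
    channel gains yields an average gain of about H_n = 1 + 1/2 + ... + 1/n
    (a standard order-statistics result), so H_n * e_data is a crude proxy
    for the post-scheduling received energy. All numbers are illustrative."""
    def harmonic(n):
        return sum(1.0 / k for k in range(1, n + 1))
    best_n, best_val = 1, -math.inf
    for n in range(1, n_available + 1):
        e_data = e_total - n * e_sched
        if e_data <= 0:
            break  # no energy left for data transmission
        val = harmonic(n) * e_data
        if val > best_val:
            best_n, best_val = n, val
    return best_n

n_star = best_active_users(n_available=64, e_total=10.0, e_sched=0.2)
```

With zero scheduling cost the model activates everyone, recovering the energy-unlimited greedy behaviour the abstract mentions; with a positive cost it activates only a subset.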

Relevance: 90.00%

Abstract:

Approximate execution is a viable technique for energy-constrained environments, provided that applications have the mechanisms to produce outputs of the highest possible quality within the given energy budget. We introduce a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows users to express the relative importance of computations for the quality of the end result, as well as minimum quality requirements. The significance-aware runtime system uses an application-specific analytical energy model to identify the degree of concurrency and approximation that maximizes quality while meeting user-specified energy constraints. Evaluation on a dual-socket 8-core server shows that the proposed framework predicts the optimal configuration with high accuracy, enabling energy-constrained executions that result in significantly higher quality compared to loop perforation, a compiler approximation technique.
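The runtime's decision reduces to a constrained search: score each (concurrency, approximation) configuration with the analytical energy model and keep the feasible one of highest predicted quality. A minimal sketch with made-up stand-in models (the linear energy and quality functions below are illustrative assumptions):

```python
def pick_configuration(configs, energy_model, quality_model, budget):
    """Sketch of a significance-aware configuration search: enumerate candidate
    (cores, approximation-degree) pairs, estimate each one's energy with an
    application-specific analytical model, and keep the feasible configuration
    with the highest predicted quality."""
    best = None
    for cfg in configs:
        if energy_model(cfg) <= budget:
            if best is None or quality_model(cfg) > quality_model(best):
                best = cfg
    return best

# Illustrative stand-in models (not the paper's): more cores cost more energy,
# and `approx` (0..1, fraction of work run approximately) trades quality for energy.
energy = lambda c: c["cores"] * 2.0 + 40.0 * (1.0 - c["approx"])
quality = lambda c: 1.0 - 0.6 * c["approx"]
grid = [{"cores": n, "approx": a / 10} for n in (2, 4, 8) for a in range(11)]
cfg = pick_configuration(grid, energy, quality, budget=30.0)
```

Because the search is over an explicit grid, tightening the budget simply shifts the chosen configuration toward higher approximation, which is the graceful quality loss the abstract describes.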

Relevance: 90.00%

Abstract:

BACKGROUND: Cardiovascular disease (CVD) represents a leading cause of mortality worldwide, especially in the elderly. Lowering the number of CVD deaths requires preventive strategies targeted on the elderly.

OBJECTIVE: The objective was to generate evidence on the association between WHO dietary recommendations and mortality from CVD, coronary artery disease (CAD), and stroke in the elderly aged ≥60 y.

DESIGN: We analyzed data from 10 prospective cohort studies from Europe and the United States comprising a total sample of 281,874 men and women free from chronic diseases at baseline. Components of the Healthy Diet Indicator (HDI) included saturated fatty acids, polyunsaturated fatty acids, mono- and disaccharides, protein, cholesterol, dietary fiber, and fruit and vegetables. Cohort-specific HRs adjusted for sex, education, smoking, physical activity, and energy and alcohol intakes were pooled by using a random-effects model.

RESULTS: During 3,322,768 person-years of follow-up, 12,492 people died of CVD. An increase of 10 HDI points (complete adherence to an additional WHO guideline) was, on average, not associated with CVD mortality (HR: 0.94; 95% CI: 0.86, 1.03), CAD mortality (HR: 0.99; 95% CI: 0.85, 1.14), or stroke mortality (HR: 0.95; 95% CI: 0.88, 1.03). However, after stratification of the data by geographic region, adherence to the HDI was associated with reduced CVD mortality in the southern European cohorts (HR: 0.87; 95% CI: 0.79, 0.96; I² = 0%) and in the US cohort (HR: 0.85; 95% CI: 0.83, 0.87; I² = not applicable).

CONCLUSION: Overall, greater adherence to the WHO dietary guidelines was not significantly associated with CVD mortality, but the results varied across regions. Clear inverse associations were observed in elderly populations in southern Europe and the United States.
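Pooling cohort-specific hazard ratios with a random-effects model, as done above, is commonly implemented with the DerSimonian-Laird estimator. A self-contained sketch on hypothetical cohort HRs (illustration only, not the study's data):

```python
import math

def dersimonian_laird(log_effects, variances):
    """Random-effects pooling with the DerSimonian-Laird estimator: compute
    Cochran's Q under fixed-effect weights, estimate the between-study
    variance tau^2, then pool with weights 1/(v_i + tau^2). Effects are on
    the log scale; returns the pooled hazard ratio with a 95% CI."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, log_effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, log_effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_effects) - 1)) / c)   # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Hypothetical cohort-level HRs and standard errors (illustration only).
log_hrs = [math.log(h) for h in (0.91, 1.02, 0.88, 0.97)]
variances = [s * s for s in (0.05, 0.08, 0.06, 0.04)]
hr, lo, hi = dersimonian_laird(log_hrs, variances)
```

When tau² is estimated as zero (as for the I² = 0% southern European stratum), the weights reduce to the fixed-effect weights.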

Relevance: 90.00%

Abstract:

Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks, and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, as well as the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. Also, a comparison with loop perforation (a well-known compile-time approximation technique), shows that the proposed framework results in significantly higher quality for the same energy budget.

Relevance: 80.00%

Abstract:

New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0–26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0–10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific 14C reservoir age information to provide a single global marine mixed-layer calibration from 10.5–26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).
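The random walk model treats the underlying calibration curve as a latent process whose increments are Gaussian. The IntCal construction estimates it with MCMC and also handles calendar-age uncertainty; the same modelling idea in its simplest filtering form, on synthetic data with hypothetical numbers, looks like this:

```python
import random

def random_walk_filter(obs, obs_var, walk_var, init, init_var):
    """Minimal 1-D Kalman filter whose latent state follows a random walk:
    each step predicts by inflating the variance with walk_var, then updates
    with one noisy observation. A toy analogue of estimating an underlying
    curve from noisy 14C determinations."""
    m, v = init, init_var
    estimates = []
    for y, r in zip(obs, obs_var):
        v += walk_var                # predict: random-walk step adds variance
        k = v / (v + r)              # Kalman gain
        m += k * (y - m)             # update toward the observation
        v *= (1.0 - k)
        estimates.append(m)
    return estimates

# Synthetic demo (hypothetical numbers): a slowly drifting "true" 14C age
# observed with measurement noise of standard deviation 30.
rng = random.Random(3)
true = [5000.0 + 0.9 * t for t in range(100)]
obs = [x + rng.gauss(0.0, 30.0) for x in true]
est = random_walk_filter(obs, [900.0] * 100, walk_var=25.0,
                         init=obs[0], init_var=900.0)
```

The filtered estimate tracks the trend while averaging out measurement noise, which is the sense in which the random walk model "takes into account the uncertainty" in the observations.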

Relevance: 80.00%

Abstract:

The equilibrium polymerization of sulfur is investigated by Monte Carlo simulations. The potential energy model is based on density functional results for the cohesive energy, structural, and vibrational properties as well as reactivity of sulfur rings and chains [Part I, J. Chem. Phys. 118, 9257 (2003)]. Liquid samples of 2048 atoms are simulated at temperatures 450 K ≤ T ≤ 850 K and P = 0, starting from monodisperse S8 molecular compositions. Thermally activated bond breaking processes lead to an equilibrium population of unsaturated atoms that can change the local pattern of covalent bonds and allow the system to approach equilibrium. The concentration of unsaturated atoms and the kinetics of bond interchanges is determined by the energy ΔE_b required to break a covalent bond. Equilibrium with respect to the bond distribution is achieved for 15 ≤ ΔE_b ≤ 21 kcal/mol over a wide temperature range (T ≥ 450 K), within which polymerization occurs readily, with entropy from the bond distribution overcompensating the increase in enthalpy. There is a maximum in the polymerized fraction at a temperature T_max that depends on ΔE_b. This fraction decreases at higher temperature because broken bonds and short chains proliferate and, for T ≤ T_max, because entropy is less important than enthalpy. The molecular size distribution is described well by a Zimm-Schulz function, plus an isolated peak for S8. Large molecules are almost exclusively open chains. Rings tend to have fewer than 24 atoms, and only S8 is present in significant concentrations at all T. The T dependence of the density and the dependence of polymerization fraction and degree on ΔE_b give estimates of the polymerization temperature T_f = 450 ± 20 K. (C) 2003 American Institute of Physics.
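The Boltzmann factor exp(-ΔE_b/RT) that controls the population of unsaturated atoms can be illustrated with a drastically simplified Metropolis model of independent two-state bonds (an assumption for illustration only; the actual simulation evolves a 2048-atom liquid with a density-functional-based potential):

```python
import math
import random

R_KCAL = 1.987e-3  # gas constant, kcal/(mol*K)

def broken_fraction(delta_eb, temp, n_bonds=50_000, sweeps=40, seed=4):
    """Toy Metropolis estimate of the equilibrium fraction of broken bonds
    when breaking one costs delta_eb (kcal/mol) at temperature temp (K).
    Bonds are treated as independent two-state units, so the result only
    illustrates the Boltzmann scaling exp(-delta_eb / RT) that governs the
    concentration of unsaturated atoms."""
    rng = random.Random(seed)
    beta = 1.0 / (R_KCAL * temp)
    p_break = math.exp(-beta * delta_eb)   # uphill acceptance probability
    broken = [False] * n_bonds
    for _ in range(sweeps):
        for i in range(n_bonds):
            if broken[i]:
                broken[i] = False          # rejoining is downhill: always accept
            elif rng.random() < p_break:
                broken[i] = True           # breaking accepted with Boltzmann prob
    return sum(broken) / n_bonds

f_hot = broken_fraction(15.0, 850.0)   # upper end of the simulated T range
f_cold = broken_fraction(15.0, 450.0)  # lower end
```

Even at 850 K and the lower bond energy of 15 kcal/mol, the broken fraction stays around 10⁻⁴, which is why rare thermally activated bond-breaking events are the bottleneck for equilibrating the bond distribution.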

Relevance: 80.00%

Abstract:

A new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0-24 cal kyr BP (Before Present, 0 cal BP = AD 1950). The new calibration data set for terrestrial samples extends from 0-26 cal kyr BP, but with much higher resolution beyond 11.4 cal kyr BP than IntCal98. Dendrochronologically-dated tree-ring samples cover the period from 0-12.4 cal kyr BP. Beyond the end of the tree rings, data from marine records (corals and foraminifera) are converted to the atmospheric equivalent with a site-specific marine reservoir correction to provide terrestrial calibration from 12.4-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a coherent statistical approach based on a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The tree-ring data sets, sources of uncertainty, and regional offsets are discussed here. The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed in brief, but details are presented in Hughen et al. (this issue a). We do not make a recommendation for calibration beyond 26 cal kyr BP at this time; however, potential calibration data sets are compared in another paper (van der Plicht et al., this issue).

Relevance: 80.00%

Abstract:

This systematic review aimed to examine if an association exists between dietary glycaemic index (GI) and glycaemic load (GL) intake and breast cancer risk. A systematic search was conducted in Medline and Embase and identified 14 relevant studies up to May 2008. Adjusted relative risk estimates comparing breast cancer risk for the highest versus the lowest category of GI/GL intake were extracted from relevant studies and combined in meta-analyses using a random-effects model. Combined estimates from six cohort studies show non-significant increased breast cancer risks for premenopausal women (relative risk (RR) 1.14, 95% CI 0.95-1.38) and postmenopausal women (RR 1.11, 95% CI 0.99-1.25) consuming the highest versus the lowest category of GI intake. Evidence of heterogeneity hindered analyses of GL and premenopausal risk, although most studies did not observe any significant association. Pooled cohort study results indicated no association between postmenopausal risk and GL intake (RR 1.03, 95% CI 0.94-1.12). Our findings do not provide strong support of an association between dietary GI and GL and breast cancer risk. © 2008 Cancer Research UK.



Relevance: 80.00%

Abstract:

Aims/hypothesis Glomerular hyperfiltration is a well established phenomenon occurring early in some patients with type 1 diabetes. However, there is no consistent answer regarding whether hyperfiltration predicts later development of nephropathy. We performed a systematic review and meta-analysis of observational studies that compared the risk of developing diabetic nephropathy in patients with and without glomerular hyperfiltration and also explored the impact of baseline GFR.

Methods A systematic review and meta-analysis was carried out. Cohort studies in type 1 diabetic participants were included if they contained data on the development of incipient or overt nephropathy with baseline measurement of GFR and presence or absence of hyperfiltration.

Results We included ten cohort studies following 780 patients. After a study median follow-up of 11.2 years, 130 patients had developed nephropathy. Using a random-effects model, the pooled odds of progression to a minimum of microalbuminuria in patients with hyperfiltration was 2.71 (95% CI 1.20-6.11) times that of patients with normofiltration. There was moderate heterogeneity (heterogeneity test p=0.05, measure of degree of inconsistency = 48%) and some evidence of funnel plot asymmetry, possibly due to publication bias. The pooled weighted mean difference in baseline GFR was 13.8 ml/min/1.73 m² (95% CI 5.0-22.7) greater in the group progressing to nephropathy than in those not progressing (heterogeneity test p<0.01).

Conclusions/interpretation In published studies, individuals with glomerular hyperfiltration were at increased risk of progression to diabetic nephropathy using study level data. Further larger studies are required to explore this relationship and the role of potential confounding variables.

Relevance: 80.00%

Abstract:

The IntCal04 and Marine04 radiocarbon calibration curves have been updated from 12 cal kBP (cal kBP is here defined as thousands of calibrated years before AD 1950), and extended to 50 cal kBP, utilizing newly available data sets that meet the IntCal Working Group criteria for pristine corals and other carbonates and for quantification of uncertainty in both the 14C and calendar timescales as established in 2002. No change was made to the curves from 0-12 cal kBP. The curves were constructed using a Markov chain Monte Carlo (MCMC) implementation of the random walk model used for IntCal04 and Marine04. The new curves were ratified at the 20th International Radiocarbon Conference in June 2009 and are available in the Supplemental Material at www.radiocarbon.org.