984 results for Validated Computations


Relevance: 20.00%

Publisher:

Abstract:

Stable wakefulness requires orexin/hypocretin neurons (OHNs) and OHR2 receptors. OHNs sense diverse environmental cues and control arousal accordingly. For unknown reasons, OHNs contain multiple excitatory transmitters, including OH peptides and glutamate. To analyze their cotransmission within computational frameworks for control, we optogenetically stimulated OHNs and examined the resulting outputs (spike patterns) in a downstream arousal regulator, the histamine neurons (HANs). OHR2s were essential for sustained HAN outputs. OHR2-dependent HAN output increased linearly during constant OHN input, suggesting that the OHN→HAN(OHR2) module may function as an integral controller. OHN stimulation evoked OHR2-dependent slow postsynaptic currents, resembling those evoked by mid-nanomolar OH concentrations. Conversely, glutamate-dependent output transiently communicated OHN input onset, peaking rapidly and then decaying alongside OHN→HAN glutamate currents. Blocking glutamate-driven spiking did not affect OH-driven spiking, and vice versa, suggesting isolation (low cross-modulation) of the two outputs. Therefore, in arousal regulators, cotransmitters may translate distinct features of OHN activity into parallel, nonredundant control signals for downstream effectors.
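The integral-versus-transient division of labour described above can be illustrated with a toy simulation. All parameters and functional forms below are illustrative, not taken from the paper: the OH/OHR2-like channel accumulates OHN drive over time, while a glutamate-like channel reports only input onsets.

```python
import math

def han_output(ohn_input, dt=0.01, k_int=1.0, k_glu=5.0, tau_glu=0.2):
    """Toy two-channel readout of an OHN spike-rate input.

    OH/OHR2 channel: integrates the input (integral controller).
    Glutamate channel: responds to input increments, then decays.
    """
    integral, glu, prev = 0.0, 0.0, 0.0
    oh_trace, glu_trace = [], []
    for u in ohn_input:
        integral += k_int * u * dt          # ramps linearly for constant input
        glu += k_glu * max(u - prev, 0.0)   # jumps at input onset
        glu *= math.exp(-dt / tau_glu)      # then decays back toward zero
        prev = u
        oh_trace.append(integral)
        glu_trace.append(glu)
    return oh_trace, glu_trace

# Constant input: the OH-driven trace ramps linearly while the
# glutamate-driven trace peaks at onset and decays.
oh, glu = han_output([1.0] * 100)
```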

Relevance: 20.00%

Publisher:

Abstract:

The analytic continuation needed for the extraction of transport coefficients requires, in principle, the correlator as a continuous function of the Euclidean time variable. We report on progress towards the continuum limit for 2-point correlator measurements in thermal SU(3) gauge theory, with specific attention paid to scale setting. In particular, we improve upon the determination of the critical lattice coupling and the critical temperature of pure SU(3) gauge theory, estimating r0Tc ≃ 0.7470(7) after a continuum extrapolation. As an application, we discuss the determination of the heavy-quark momentum diffusion coefficient from a correlator of colour-electric fields attached to a Polyakov loop.
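For orientation, the quoted r0Tc can be converted to a physical critical temperature once a value for the Sommer scale r0 is assumed; the sketch below uses the conventional r0 ≈ 0.5 fm, which is an assumption, not a value taken from this work.

```python
HBARC_MEV_FM = 197.327  # hbar * c in MeV * fm

def tc_in_mev(r0_tc, r0_fm):
    """Tc = (r0 * Tc) / r0, with the fm^-1 -> MeV conversion via hbar*c."""
    return r0_tc / r0_fm * HBARC_MEV_FM

# With the assumed r0 = 0.5 fm, r0*Tc = 0.7470 gives roughly 295 MeV.
tc = tc_in_mev(0.7470, 0.5)
```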

Relevance: 20.00%

Publisher:

Abstract:

Using explicitly-correlated coupled-cluster theory with single and double excitations, the intermolecular distances and interaction energies of the T-shaped imidazole···benzene and pyrrole···benzene complexes have been computed in a large augmented correlation-consistent quadruple-zeta basis set, with additional corrections for connected triple excitations and for remaining basis-set-superposition errors. The results of these computations are used to assess other methods such as Møller–Plesset perturbation theory (MP2), spin-component-scaled MP2 theory, dispersion-weighted MP2 theory, interference-corrected explicitly-correlated MP2 theory, dispersion-corrected double-hybrid density-functional theory (DFT), DFT-based symmetry-adapted perturbation theory, the random-phase approximation, explicitly-correlated ring-coupled-cluster-doubles theory, and double-hybrid DFT with a correlation energy computed in the random-phase approximation.
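The supermolecular interaction energies discussed above follow the standard definition, and the counterpoise scheme (monomers evaluated in the full dimer basis) is the usual remedy for basis-set-superposition error. The numbers in the usage lines are purely illustrative, not results from the paper.

```python
def interaction_energy(e_complex, e_mono_a, e_mono_b):
    """Supermolecular interaction energy: E_int = E_AB - E_A - E_B."""
    return e_complex - e_mono_a - e_mono_b

def counterpoise_corrected(e_complex, e_a_dimer_basis, e_b_dimer_basis):
    """Boys-Bernardi counterpoise correction: evaluate each monomer in the
    full dimer basis so the basis-set-superposition error cancels."""
    return e_complex - e_a_dimer_basis - e_b_dimer_basis

# Illustrative numbers (arbitrary units): the counterpoise-corrected value is
# less attractive because monomer energies drop (variationally) in the
# larger dimer basis.
e_int = interaction_energy(-10.50, -5.00, -5.00)      # -0.50
e_cp = counterpoise_corrected(-10.50, -5.02, -5.03)   # about -0.45
```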

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE Reliable tools to predict long-term outcome among patients with well compensated advanced liver disease due to chronic HCV infection are lacking. DESIGN Risk scores for mortality and for cirrhosis-related complications were constructed with Cox regression analysis in a derivation cohort and evaluated in a validation cohort, both including patients with chronic HCV infection and advanced fibrosis. RESULTS In the derivation cohort, 100/405 patients died during a median 8.1 (IQR 5.7-11.1) years of follow-up. Multivariate Cox analyses showed that age (HR=1.06, 95% CI 1.04 to 1.09, p<0.001), male sex (HR=1.91, 95% CI 1.10 to 3.29, p=0.021), platelet count (HR=0.91, 95% CI 0.87 to 0.95, p<0.001) and log10 aspartate aminotransferase/alanine aminotransferase ratio (HR=1.30, 95% CI 1.12 to 1.51, p=0.001) were independently associated with mortality (C statistic=0.78, 95% CI 0.72 to 0.83). In the validation cohort, 58/296 patients with cirrhosis died during a median of 6.6 (IQR 4.4-9.0) years. Among patients with estimated 5-year mortality risks <5%, 5-10% and >10%, the observed 5-year mortality rates in the derivation cohort and validation cohort were 0.9% (95% CI 0.0 to 2.7) and 2.6% (95% CI 0.0 to 6.1), 8.1% (95% CI 1.8 to 14.4) and 8.0% (95% CI 1.3 to 14.7), and 21.8% (95% CI 13.2 to 30.4) and 20.9% (95% CI 13.6 to 28.1), respectively (C statistic in validation cohort = 0.76, 95% CI 0.69 to 0.83). The risk score for cirrhosis-related complications additionally incorporated HCV genotype (C statistic = 0.80, 95% CI 0.76 to 0.83 in the derivation cohort; 0.74, 95% CI 0.68 to 0.79 in the validation cohort). CONCLUSIONS Prognosis of patients with chronic HCV infection and compensated advanced liver disease can be accurately assessed with risk scores that include readily available objective clinical parameters.
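The reported hazard ratios determine the relative part of such a risk score (the Cox linear predictor). The sketch below reconstructs it from the abstract's numbers only; the baseline hazard, the exact covariate units (e.g. the platelet-count increment behind HR=0.91), and the published score's scaling are not given here, so this is an illustration rather than the validated score.

```python
import math

# Hazard ratios from the derivation cohort, as quoted in the abstract.
HR_AGE, HR_MALE, HR_PLT, HR_LOG_RATIO = 1.06, 1.91, 0.91, 1.30

def linear_predictor(age, male, platelet_count, ast, alt):
    """Relative Cox linear predictor: sum of log(HR) * covariate.
    Covariate units follow the paper and are not fully specified in the
    abstract, so the absolute scale here is illustrative."""
    lp = math.log(HR_AGE) * age
    lp += math.log(HR_MALE) * (1.0 if male else 0.0)
    lp += math.log(HR_PLT) * platelet_count       # protective: HR < 1
    lp += math.log(HR_LOG_RATIO) * math.log10(ast / alt)
    return lp

# Older age and male sex raise the score; a higher platelet count lowers it.
base = linear_predictor(age=50, male=False, platelet_count=150, ast=80, alt=80)
```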

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE The pararectus approach has been validated for managing acetabular fractures. We hypothesised that it might be an alternative approach for performing periacetabular osteotomy (PAO). METHODS Using four cadaver specimens, we randomly performed PAO through either the pararectus or a modified Smith-Petersen (SP) approach. We assessed technical feasibility and safety. Furthermore, we monitored fragment mobility using a surgical navigation system and compared mobility between approaches. The navigation system's accuracy was tested by cross-examination with validated preoperative planning software. RESULTS The pararectus approach is technically feasible, allowing adequate exposure, safe osteotomies and excellent control of the structures at risk. Fragment mobility was equal to that achieved through the SP approach. Validation of these measurements yielded a mean difference of <1 mm, without statistical significance. CONCLUSION These experimental data suggest that the pararectus approach might be an alternative for performing PAO. Clinical validation is necessary to confirm these promising preliminary results.

Relevance: 20.00%

Publisher:

Abstract:

Transcriptional enhancers are genomic DNA sequences that contain clustered transcription factor (TF) binding sites. When combinations of TFs bind to enhancer sequences they act together with basal transcriptional machinery to regulate the timing, location and quantity of gene transcription. Elucidating the genetic mechanisms responsible for differential gene expression, including the role of enhancers, during embryological and postnatal development is essential to an understanding of evolutionary processes and disease etiology. Numerous methods are in use to identify and characterize enhancers. Several high-throughput methods generate large datasets of enhancer sequences with putative roles in embryonic development. However, few enhancers have been deleted from the genome to determine their roles in the development of specific structures, such as the limb. Manipulation of enhancers at their endogenous loci, such as the deletion of such elements, leads to a better understanding of the regulatory interactions, rules and complexities that contribute to faithful and variant gene transcription – the molecular genetic substrate of evolution and disease. To understand the endogenous roles of two distinct enhancers known to be active in the mouse embryo limb bud we deleted them from the mouse genome. I hypothesized that deletion of these enhancers would lead to aberrant limb development. The enhancers were selected because of their association with p300, a protein associated with active transcription, and because the human enhancer sequences drive distinct lacZ expression patterns in limb buds of embryonic day (E) 11.5 transgenic mice. To confirm that the orthologous mouse enhancers, mouse 280 and 1442 (M280 and M1442, respectively), regulate expression in the developing limb we generated stable transgenic lines, and examined lacZ expression. In M280-lacZ mice, expression was detected in E11.5 fore- and hindlimbs in a region that corresponds to digits II-IV. 
M1442-lacZ mice exhibited lacZ expression in the posterior and anterior margins of the fore- and hindlimbs, overlapping digits I and V and several wrist bones. We generated mice lacking the M280 and M1442 enhancers by gene targeting. Intercrosses between M280+/- mice and between M1442+/- mice generated M280 and M1442 null mice, which are born at expected Mendelian ratios and manifest no gross limb malformations. Quantitative real-time PCR of mutant E11.5 limb buds indicated that significant changes in the transcriptional output of enhancer-proximal genes accompanied the deletion of both M280 and M1442. In neonatal null mice we observed that all limb bones are present in their expected positions, an observation also confirmed by histology of E18.5 distal limbs. Fine-scale measurement of E18.5 digit bone lengths found no differences between mutant and control embryos. Furthermore, when the developmental progression of cartilaginous elements was analyzed in M280 and M1442 embryos from E13.5-E15.5, no transient developmental defects were detected. These results demonstrate that M280 and M1442 are not required for mouse limb development. Though M280 is not required for embryonic limb development, it is required for the development and/or maintenance of body size: adult M280 null mice are significantly smaller than control littermates. These studies highlight the importance of experiments that manipulate enhancers in situ to understand their contribution to development.
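The statement that null mice were born at expected Mendelian ratios is typically checked with a chi-square goodness-of-fit test against the 1:2:1 genotype ratio expected from a heterozygote intercross. The litter counts below are hypothetical; the abstract does not report them.

```python
def chi_square_mendelian(n_wt, n_het, n_null):
    """Chi-square goodness-of-fit statistic against the 1:2:1 genotype ratio
    expected from a heterozygote intercross (df = 2)."""
    total = n_wt + n_het + n_null
    expected = [0.25 * total, 0.50 * total, 0.25 * total]
    observed = [n_wt, n_het, n_null]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts close to 1:2:1; compare the statistic with the df=2,
# alpha=0.05 critical value of 5.99 before claiming a deviation.
chi2 = chi_square_mendelian(24, 51, 25)
```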

Relevance: 20.00%

Publisher:

Abstract:

The number of well-dated pollen diagrams in Europe has increased considerably over the last 30 years and many of them have been submitted to the European Pollen Database (EPD). This allows for the construction of increasingly precise maps of Holocene vegetation change across the continent. Chronological information in the EPD has been expressed in uncalibrated radiocarbon years, and most chronologies to date are based on this time scale. Here we present new chronologies for most of the datasets stored in the EPD based on calibrated radiocarbon years. Age information associated with pollen diagrams is often derived from the pollen stratigraphy itself or from other sedimentological information. We reviewed these chronological tie points and assigned uncertainties to them. The steps taken to generate the new chronologies are described and the rationale for a new classification system for age uncertainties is introduced. The resulting chronologies are fit for most continental-scale questions. They may not provide the best age model for particular sites, but may be viewed as general purpose chronologies. Taxonomic particularities of the data stored in the EPD are explained. An example is given of how the database can be queried to select samples with appropriate age control as well as the suitable taxonomic level to answer a specific research question.
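A minimal version of the age-model idea described above can be sketched under simplifying assumptions: linear interpolation between calibrated tie points, with a crude carry-over of tie-point uncertainties. Real EPD chronologies use more elaborate models and the uncertainty classification introduced in the paper; the depths, ages and uncertainties below are hypothetical.

```python
def interpolate_age(tie_points, depth):
    """Linear age-depth model.

    tie_points: sorted (depth_cm, age_cal_bp, uncertainty_yr) tuples.
    Returns (age, uncertainty) at the requested depth; the larger of the two
    bracketing tie-point uncertainties is carried over as a crude estimate.
    """
    for (d0, a0, u0), (d1, a1, u1) in zip(tie_points, tie_points[1:]):
        if d0 <= depth <= d1:
            frac = (depth - d0) / (d1 - d0)
            return a0 + frac * (a1 - a0), max(u0, u1)
    raise ValueError("depth lies outside the dated interval")

# Hypothetical core: surface at -50 cal BP, two dated levels downcore.
tie_points = [(0, -50, 5), (100, 2000, 80), (250, 6800, 150)]
age, unc = interpolate_age(tie_points, 50)   # midway between the first two tie points
```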

Relevance: 20.00%

Publisher:

Abstract:

Culture and mesocosm experiments are often carried out at high initial nutrient concentrations, yielding high biomass concentrations that in turn often lead to a substantial build-up of dissolved organic matter (DOM). In such experiments, DOM can reach concentrations much higher than typically observed in the open ocean. To the extent that DOM includes organic acids and bases, it contributes to the alkalinity of the seawater contained in the experimental device. Our analysis suggests that whenever substantial amounts of DOM are produced during the experiment, the standard computer programmes used to compute CO2 fugacity can significantly underestimate the true fCO2 when the computation is based on total alkalinity (AT) and total dissolved inorganic carbon (CT). Unless the effect of DOM alkalinity can be accounted for, this might lead to significant errors in the interpretation of the system under consideration with respect to the experimentally applied CO2 perturbation. Errors in the inferred fCO2 can misguide the development of parameterisations used in simulations with global carbon cycle models under future CO2 scenarios. Overdetermination of the CO2 system in experimental ocean acidification studies is proposed to safeguard against possibly large errors in estimated fCO2.
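The direction of the bias can be demonstrated with a deliberately simplified carbonate-system solver (carbonate alkalinity only, rough equilibrium constants; a real calculation would use CO2SYS-style constants and the full alkalinity definition). Treating DOM alkalinity as if it were carbonate alkalinity lowers the computed fCO2.

```python
import math

# Rough surface-seawater constants (mol/kg and mol/(kg*atm)); illustrative
# values, not fitted to a particular temperature and salinity.
K0, K1, K2 = 3.0e-2, 1.4e-6, 1.1e-9

def fco2_from_alk_ct(alk_carb, ct):
    """Bisect [H+] so the carbonate alkalinity matches, then return fCO2 (atm).
    Alkalinity here is carbonate alkalinity only: [HCO3-] + 2[CO3--]."""
    lo, hi = 1e-10, 1e-4                      # bracket [H+] between pH 10 and 4
    for _ in range(100):
        h = math.sqrt(lo * hi)
        alk = ct * (K1 * h + 2.0 * K1 * K2) / (h * h + K1 * h + K1 * K2)
        if alk > alk_carb:
            lo = h                            # too alkaline: need more H+
        else:
            hi = h
    h = math.sqrt(lo * hi)
    co2 = ct * h * h / (h * h + K1 * h + K1 * K2)
    return co2 / K0

ct = 2.1e-3          # total dissolved inorganic carbon, mol/kg
at = 2.35e-3         # measured AT, of which 50 umol/kg is DOM alkalinity
a_org = 5.0e-5
naive = fco2_from_alk_ct(at, ct)              # DOM alkalinity mistaken for carbonate
corrected = fco2_from_alk_ct(at - a_org, ct)  # DOM contribution removed
# naive < corrected: ignoring DOM alkalinity underestimates the true fCO2
```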

Relevance: 20.00%

Publisher:

Abstract:

There is interest in performing full-core pin-by-pin computations for present nuclear reactors. In this type of problem, the use of a transport approximation like the diffusion equation requires the introduction of correction parameters. Interface discontinuity factors (IDFs) can improve the diffusion solution to nearly reproduce a transport solution. Nevertheless, calculating accurate pin-by-pin IDFs requires knowledge of the heterogeneous neutron flux distribution, which depends on the boundary conditions of the pin-cell as well as on the local variables along the nuclear reactor operation. As a consequence, it is impractical to compute them for each possible configuration. An alternative for generating accurate pin-by-pin IDFs is to calculate reference values using zero-net-current boundary conditions and afterwards to synthesize their dependence on the main neighborhood variables. In this way the factors can be accurately computed during fine-mesh diffusion calculations by correcting the reference values as a function of the actual environment of the pin-cell in the core. In this paper we propose a parameterization of the pin-by-pin interface discontinuity factors that allows the implementation of a cross-section library able to treat the neighborhood effect. First results are presented for typical PWR configurations.
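The synthesis step described above, correcting a zero-net-current reference IDF as a function of the pin-cell's actual environment, can be sketched as a reference value times a low-order polynomial in a neighborhood variable. The variable chosen here (an interface current-to-flux ratio) and all coefficients are hypothetical; the paper's actual parameterization is not given in the abstract.

```python
class IdfLibrary:
    """Reference interface discontinuity factors plus an environment correction.

    idf_ref is computed once with zero-net-current boundary conditions; at run
    time it is corrected with a polynomial in a neighborhood variable (here a
    hypothetical interface current-to-flux ratio j/phi).
    """

    def __init__(self, idf_ref, coeffs):
        self.idf_ref = idf_ref
        self.coeffs = coeffs  # polynomial coefficients for orders 1..n

    def idf(self, j_over_phi):
        correction = 1.0
        for order, c in enumerate(self.coeffs, start=1):
            correction += c * j_over_phi ** order
        return self.idf_ref * correction

# Illustrative coefficients; zero net current recovers the reference value.
lib = IdfLibrary(idf_ref=1.05, coeffs=[0.8, -2.0])
idf_reflective = lib.idf(0.0)   # reference condition
idf_leaking = lib.idf(0.1)      # corrected for a net interface current
```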

Relevance: 20.00%

Publisher:

Abstract:

Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid-80s there has been significant progress in the development of parallelizing compilers for logic programming (and, more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, and techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers), etc.
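Since work-stealing schedulers are mentioned as one of the dynamic task-allocation techniques, a minimal sketch of the underlying data structure may help: each worker owns a deque, works LIFO at its own end for locality, and idle workers steal FIFO from the opposite end. Real schedulers use lock-free deques; a lock keeps this sketch simple and correct.

```python
import collections
import threading

class WorkStealingDeque:
    """Per-worker task deque: the owner pushes/pops at one end (LIFO), thieves
    steal from the other end (FIFO, the oldest and typically largest tasks)."""

    def __init__(self):
        self._tasks = collections.deque()
        self._lock = threading.Lock()

    def push(self, task):          # owner only
        with self._lock:
            self._tasks.append(task)

    def pop(self):                 # owner only: newest task first
        with self._lock:
            return self._tasks.pop() if self._tasks else None

    def steal(self):               # any idle worker: oldest task first
        with self._lock:
            return self._tasks.popleft() if self._tasks else None

q = WorkStealingDeque()
for task in ("t1", "t2", "t3"):
    q.push(task)
owner_gets = q.pop()     # "t3": newest, best cache locality for the owner
thief_gets = q.steal()   # "t1": oldest, likely the largest remaining work
```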