855 results for Large-scale databases


Relevance:

100.00%

Publisher:

Abstract:

As part of an international intercomparison project, a set of single column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed a systematic comparison of the behavior of the different models under a consistent implementation of the WTG and DGW methods, and a systematic comparison of the two methods across models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both the WTG and DGW methods. Some of the models reproduce the reference state, while others sustain a large-scale circulation that results in substantially lower or higher precipitation than in the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength, whereas SCMs display a wider range of behaviors; some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and the corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always reach a precipitating equilibrium state. The greatest sensitivity to initial moisture conditions occurs in those WTG simulations with multiple stable equilibria, which reach a dry equilibrium state when initialized dry and a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SST, and in some models their existence is sensitive to parameters in the WTG calculation.
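
The WTG balance described above can be made concrete with a short sketch: the large-scale vertical velocity is diagnosed so that adiabatic cooling or warming relaxes the column temperature toward the reference profile. This is a minimal illustration of the general technique, not the study's implementation; the relaxation timescale, the profiles, and the boundary-layer treatment below are assumed placeholder values.

```python
import numpy as np

# Minimal sketch of the weak temperature gradient (WTG) diagnostic:
# the large-scale vertical velocity w is chosen so that vertical
# advection relaxes the column potential temperature toward the
# reference profile over a timescale tau. All values are illustrative.

tau = 3.0 * 3600.0                       # relaxation timescale [s] (assumed)
z = np.linspace(0.0, 15e3, 60)           # height levels [m]
theta_ref = 300.0 + 4.0e-3 * z           # reference potential temperature [K]
theta = theta_ref + 0.5 * np.sin(np.pi * z / 15e3)  # perturbed column [K]

dtheta_dz = np.gradient(theta_ref, z)    # static stability of the reference state

# WTG balance: w * dtheta/dz = (theta - theta_ref) / tau
w_wtg = (theta - theta_ref) / (tau * dtheta_dz)

# Inside an assumed boundary layer, interpolate w linearly to zero at the surface
z_bl = 1.5e3
bl = z < z_bl
w_wtg[bl] = w_wtg[np.argmax(z >= z_bl)] * z[bl] / z_bl

print(w_wtg.max())  # peak large-scale ascent [m/s]
```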

Relevance:

100.00%

Publisher:

Abstract:

A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg-de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its "distance" to the observations. Because the weights vary strongly, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method works satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
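
The importance-resampling step described above is straightforward to sketch for a toy scalar state under an assumed Gaussian observation error; the ensemble size, observation value, and error variance below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble of scalar states (stand-in for, e.g., KdV model fields)
ensemble = rng.normal(0.0, 1.0, size=500)

y_obs = 0.8        # a single observation (assumed)
r_var = 0.2**2     # observation-error variance (assumed Gaussian)

# Bayes's theorem: each member is weighted by its likelihood, i.e. its
# "distance" to the observation measured against the observation error.
log_w = -0.5 * (y_obs - ensemble) ** 2 / r_var
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Importance resampling: duplicate high-weight members and drop low-weight
# ones, so the ensemble carries equal weights again after the analysis.
idx = rng.choice(ensemble.size, size=ensemble.size, p=w)
analysis = ensemble[idx]

print(analysis.mean(), analysis.var())  # posterior mean and variance estimates
```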

Relevance:

100.00%

Publisher:

Abstract:

As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state, systematically comparing the WTG and DGW methods across models and the behavior of the different models under each method. The sensitivity to SST depends on both the large-scale parameterization method and the choice of cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs, using either the WTG or DGW method, show an increase of precipitation with SST, while the sensitivities of the SCMs are not always monotonic. CRMs using either method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles that are smoother and less top-heavy than those of the WTG simulations. These large-scale parameterization methods provide a useful tool for identifying the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
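
For contrast with the WTG diagnostic sketched earlier, the following is a minimal sketch of one common form of the DGW method: the large-scale pressure velocity ω is obtained by solving ε d²ω/dp² = (k² R_d/p)(T_v − T_v,ref) as a two-point boundary value problem with ω = 0 at the model top and surface. The damping rate, horizontal wavenumber, and virtual temperature anomaly below are placeholder assumptions, not the intercomparison's settings.

```python
import numpy as np

# Minimal sketch of the damped gravity wave (DGW) diagnostic: solve
#   eps * d2(omega)/dp2 = (k^2 * Rd / p) * (Tv - Tv_ref)
# for omega(p) with omega = 0 at top and surface. Values are illustrative.

Rd = 287.0                    # gas constant for dry air [J/kg/K]
eps = 1.0 / 86400.0           # momentum damping rate [1/s] (assumed)
k = 2 * np.pi / 2.0e6         # horizontal wavenumber [1/m] (assumed)

p = np.linspace(100e2, 1000e2, 91)   # pressure levels [Pa]
dp = p[1] - p[0]
tv_anom = 1.0 * np.sin(np.pi * (p - p[0]) / (p[-1] - p[0]))  # Tv - Tv_ref [K]

n = p.size
rhs = (k**2 * Rd / p) * tv_anom / eps

# Second-order finite differences with omega = 0 at both boundaries
A = np.zeros((n, n))
for i in range(1, n - 1):
    A[i, i - 1] = 1.0 / dp**2
    A[i, i] = -2.0 / dp**2
    A[i, i + 1] = 1.0 / dp**2
A[0, 0] = A[-1, -1] = 1.0
rhs[0] = rhs[-1] = 0.0

omega = np.linalg.solve(A, rhs)   # large-scale pressure velocity [Pa/s]
print(omega.min())                # strongest ascent (negative omega)
```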

Relevance:

100.00%

Publisher:

Abstract:

A comparison tool has been developed by mapping global GPS total electron content (TEC) and wide-coverage ionospheric scintillation measurements together onto geomagnetic latitude/magnetic local time coordinates. Using this tool, a comparison between large-scale ionospheric irregularities and scintillations is pursued during a geomagnetic storm. Irregularities such as storm enhanced density (SED), the midlatitude trough, and polar cap patches are clearly identified in the TEC maps. Clear scintillations appeared at the edges of these irregularities, but their behaviors differed. Phase scintillations (σ_φ) were almost always larger than amplitude scintillations (S4) at the edges of these irregularities, associated with bursty flows or flow reversals with large density gradients. An unexpected scintillation feature appeared inside the modeled auroral oval, where S4 was much larger than σ_φ, most likely caused by particle precipitation around the exiting polar cap patches.
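
For reference, the two indices compared above are conventionally computed per receiver-satellite link over short windows: S4 as the normalized standard deviation of signal intensity, and σ_φ as the standard deviation of detrended carrier phase. The sketch below uses assumed values for the sampling rate, window length, and detrending choice (a polynomial fit standing in for the usual high-pass filter).

```python
import numpy as np

def s4_index(intensity):
    """Amplitude scintillation: normalized std of signal intensity."""
    m = intensity.mean()
    return np.sqrt(max((intensity**2).mean() - m**2, 0.0) / m**2)

def sigma_phi(phase):
    """Phase scintillation: std of detrended carrier phase [rad]."""
    x = np.arange(phase.size)
    trend = np.polyval(np.polyfit(x, phase, 3), x)  # assumed detrending
    return (phase - trend).std()

# Toy 60 s window at an assumed 50 Hz sampling rate
rng = np.random.default_rng(1)
t = np.arange(3000) / 50.0
intensity = 1.0 + 0.3 * rng.standard_normal(3000)   # fluctuating amplitude
phase = 0.02 * t + 0.4 * rng.standard_normal(3000)  # slow trend + fluctuations

print(s4_index(intensity), sigma_phi(phase))
```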

Relevance:

100.00%

Publisher:

Abstract:

The landfall of Cyclone Catarina on the Brazilian coast in March 2004 became known as the first documented hurricane in the South Atlantic Ocean, prompting a new view of how large-scale features can contribute to tropical transition. The aim of this paper is to put the large-scale circulation associated with Catarina's transition into a climate perspective. This is discussed in the light of a robust pattern of spatial correlations between thermodynamic and dynamic variables of importance for hurricane formation. A discussion of how transition mechanisms respond to the present-day circulation is presented. These associations help explain why Catarina formed in a region previously thought to be hurricane-free. Catarina developed over a large-scale area of thermodynamically favourable air/sea temperature contrast. This aspect explains the paradox that such a rare system developed when the sea surface temperature was slightly below average. But, although thermodynamics played an important role, it is apparent that Catarina would not have formed without the key dynamic interplay triggered by high-latitude blocking. The blocking was associated with an extreme positive phase of the Southern Annular Mode (SAM), both hemispherically and locally, and the area where Catarina developed is found to be more cyclonic during the positive phase of the SAM. A conceptual model is developed and a 'South Atlantic index' is introduced as a useful diagnostic of potential conditions leading to tropical transition in the area, where large-scale indices indicate trends towards more favourable atmospheric conditions for tropical cyclone formation. Copyright (c) 2008 Royal Meteorological Society

Relevance:

100.00%

Publisher:

Abstract:

Autosomal recessive spastic paraplegia with thinning of the corpus callosum (ARHSP-TCC) is a complex form of HSP initially described in Japan but subsequently reported to have a worldwide distribution, with a particularly high frequency in multiple families from the Mediterranean basin. We recently showed that ARHSP-TCC is commonly associated with mutations in SPG11/KIAA1840 on chromosome 15q. We have now screened a collection of new patients, mainly originating from Italy and Brazil, to further ascertain the spectrum of mutations in SPG11, enlarge the range of ethnic origins of SPG11 patients, determine the relative frequency at the level of single countries (i.e., Italy), and establish whether there are one or more common mutations. In 25 index cases we identified 32 mutations, of which 22 are novel, including 9 nonsense, 3 small deletions, 4 insertions, 1 in/del, 1 small duplication, 1 missense, 2 splice-site, and, for the first time, a large genomic rearrangement. This brings the total number of SPG11-mutated patients in the SPATAX collection to 111 cases in 44 families and 17 isolated cases, from 16 countries, all assessed using homogeneous clinical criteria. While expanding the spectrum of mutations in SPG11, this larger series also corroborates the notion that even within an apparently homogeneous population a molecular diagnosis cannot be achieved without full gene sequencing. (C) 2008 Wiley-Liss, Inc.

Relevance:

100.00%

Publisher:

Abstract:

The relationship between the structure and function of biological networks is a fundamental issue in systems biology. In particular, the structure of protein-protein interaction networks is related to important biological functions, and the resilience of these networks to node loss is one expression of that relationship. In this work, we investigated how such resilience is determined by the large-scale features of the respective networks. Four species are taken into account, namely the yeast Saccharomyces cerevisiae, the worm Caenorhabditis elegans, the fly Drosophila melanogaster, and Homo sapiens. We adopted two entropy-related measurements (degree entropy and dynamic entropy) to quantify the overall robustness of these networks. We verified that while the networks exhibit similar structural variations under random node removal, they differ significantly when subjected to intentional attacks (hub removal); as a matter of fact, more complex species tended to exhibit more robust networks. More specifically, we quantified how six important measurements of network topology (clustering coefficient, average degree of neighbors, average shortest path length, diameter, assortativity coefficient, and slope of the power-law degree distribution) correlate with the two entropy measurements. Our results revealed that the fraction of hubs and the average neighbor degree contribute significantly to the resilience of networks. In addition, topological analysis of the removed hubs indicated that the presence of alternative paths between the proteins connected to hubs tends to reinforce resilience. The performed analysis helps to explain how resilience arises in networks and can be applied to the development of protein network models.
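
As a minimal sketch of the kind of robustness probe described above, the snippet below tracks the Shannon entropy of the degree distribution (one common reading of "degree entropy") while nodes are removed either at random or in decreasing-degree order (hub attack). A synthetic scale-free graph stands in for the protein interaction data.

```python
import numpy as np
import networkx as nx

def degree_entropy(g):
    """Shannon entropy of the degree distribution."""
    degs = np.array([d for _, d in g.degree()])
    _, counts = np.unique(degs, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

def remove_and_track(g, order, steps=10):
    """Remove half of all nodes in `order`, recording entropy each step."""
    g = g.copy()
    chunk = len(order) // (2 * steps)
    entropies = []
    for i in range(steps):
        g.remove_nodes_from(order[i * chunk:(i + 1) * chunk])
        entropies.append(degree_entropy(g))
    return entropies

rng = np.random.default_rng(2)
g = nx.barabasi_albert_graph(2000, 3, seed=2)   # synthetic scale-free stand-in

nodes = list(g.nodes())
random_order = list(rng.permutation(nodes))                      # random failure
hub_order = sorted(nodes, key=lambda v: g.degree(v), reverse=True)  # hub attack

print("random removal:", remove_and_track(g, random_order))
print("hub attack:   ", remove_and_track(g, hub_order))
```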

Relevance:

100.00%

Publisher:

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models, aimed at improving our understanding of brain functions, are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards based on the CUDA platform contain graphical processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads, and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model the neuron. Communication among neurons located on different GPUs is coordinated by the CPU. Compared with a modern quad-core CPU, we obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards carrying two GPUs each. Copyright (C) 2010 John Wiley & Sons, Ltd.
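
The per-thread workload described above amounts to integrating the Hodgkin-Huxley equations for a single neuron. The serial sketch below shows that inner loop with the classical squid-axon parameters and forward-Euler stepping; the CUDA version would run one such update per thread, with the CPU exchanging activity between GPUs, an orchestration not reproduced here.

```python
import numpy as np

# Classical Hodgkin-Huxley single-neuron update (the ODE system that,
# in the paper's setting, each CUDA thread integrates for its neuron).
# Forward Euler is used for brevity; dt and i_ext are illustrative.

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387            # mV

def rates(v):
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def step(v, n, m, h, i_ext, dt=0.01):
    an, bn, am, bm, ah, bh = rates(v)
    i_ion = gNa * m**3 * h * (v - ENa) + gK * n**4 * (v - EK) + gL * (v - EL)
    v += dt * (i_ext - i_ion) / C
    n += dt * (an * (1 - n) - bn * n)
    m += dt * (am * (1 - m) - bm * m)
    h += dt * (ah * (1 - h) - bh * h)
    return v, n, m, h

v, n, m, h = -65.0, 0.317, 0.052, 0.596       # approximate resting state
for _ in range(10000):                         # 100 ms at dt = 0.01 ms
    v, n, m, h = step(v, n, m, h, i_ext=10.0)  # constant driving current
print(v)
```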

Relevance:

100.00%

Publisher:

Abstract:

Enantiomerically pure (R)- and (S)-gamma-hydroxy-organochalcogenides are prepared using poly-[R]-3-hydroxybutanoate (PHB) as the starting material. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.

Relevance:

100.00%

Publisher:

Abstract:

The Solar Heat Integration NEtwork (SHINE) is a European research school in which 13 PhD students in solar thermal technologies are funded by the EU Marie Curie program. It has five PhD course modules as well as workshops and seminars dedicated to PhD students both within the project and outside of it. The SHINE research activities focus on large solar heating systems and new applications: on district heating, industrial processes and new storage systems. The scope of this paper is on systems for district heating, for which there are five PhD students, three at universities and two at companies. The PhD students all started during the early part of 2014 and their initial work has concentrated on literature studies and on setting up models and data collection to be used for validation purposes. The PhD students will complete their studies in 2017-18.

Relevance:

100.00%

Publisher:

Abstract:

Analyses of circulating metabolites in large prospective epidemiological studies could lead to improved prediction and better biological understanding of coronary heart disease (CHD). We performed a mass spectrometry-based non-targeted metabolomics study for association with incident CHD events in 1,028 individuals (131 events; 10-year median follow-up), with validation in 1,670 individuals (282 events; 3.9-year median follow-up). Four metabolites were replicated and independent of main cardiovascular risk factors: lysophosphatidylcholine 18:1 (hazard ratio [HR] per standard deviation [SD] increment = 0.77, P-value < 0.001), lysophosphatidylcholine 18:2 (HR = 0.81, P-value < 0.001), monoglyceride 18:2 (MG 18:2; HR = 1.18, P-value = 0.011), and sphingomyelin 28:1 (HR = 0.85, P-value = 0.015). Together they contributed to moderate improvements in discrimination and re-classification in addition to traditional risk factors (C-statistic: 0.76 vs. 0.75; NRI: 9.2%). MG 18:2 was associated with CHD independently of triglycerides. Lysophosphatidylcholines were negatively associated with body mass index, C-reactive protein, and with less evidence of subclinical cardiovascular disease in an additional 970 participants; a reverse pattern was observed for MG 18:2. MG 18:2 showed an enrichment (P-value = 0.002) of significant associations with CHD-associated SNPs (P-value = 1.2×10⁻⁷ for association with rs964184 in the ZNF259/APOA5 region) and a weak but positive causal effect (odds ratio = 1.05 per SD increment in MG 18:2, P-value = 0.05) on CHD, as suggested by Mendelian randomization analysis. In conclusion, we identified four lipid-related metabolites with evidence for clinical utility, as well as a causal role in CHD development.
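
As a note on the reporting convention used above, a hazard ratio "per SD increment" is obtained by fitting a Cox proportional hazards model to a z-scored exposure. The sketch below illustrates this with a hypothetical cohort; the data frame, column names, and covariate are invented, and the lifelines package is just one option for such a fit.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1000

# Hypothetical cohort: follow-up time, event indicator, one metabolite, one covariate
df = pd.DataFrame({
    "time": rng.exponential(10.0, n),        # years of follow-up
    "event": rng.integers(0, 2, n),          # incident CHD yes/no
    "metabolite": rng.normal(5.0, 2.0, n),   # raw concentration
    "age": rng.normal(60.0, 8.0, n),
})

# z-score the exposure so exp(coef) reads directly as the HR per SD increment
df["metabolite_sd"] = (df["metabolite"] - df["metabolite"].mean()) / df["metabolite"].std()

cph = CoxPHFitter()
cph.fit(df[["time", "event", "metabolite_sd", "age"]],
        duration_col="time", event_col="event")
print(cph.hazard_ratios_["metabolite_sd"])   # HR per SD increment
```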