171 results for E5


Relevance:

10.00%

Publisher:

Abstract:

Genetics plays a crucial role in human aging, with up to 30% of the variation in survival to the mid-80s attributable to genetic variation. Survival to older ages likely entails an even greater genetic contribution. There is increasing evidence that genes implicated in age-related diseases, such as cancer and neuronal disease, affect human life span. We selected the 10 most promising late-onset Alzheimer's disease (LOAD) susceptibility genes identified through several recent large genome-wide association studies (GWAS). These 10 LOAD genes (APOE, CLU, PICALM, CR1, BIN1, ABCA7, MS4A6A, CD33, CD2AP, and EPHA1) were tested for association with human aging in our dataset (1385 samples with documented age at death [AAD]; age range: 58-108 years; mean age at death: 80.2) using the most significant single nucleotide polymorphisms (SNPs) from the previous studies. Apart from the APOE locus (rs2075650), which showed compelling evidence of association with human life span (p = 5.27 × 10^-4), none of the other LOAD gene loci demonstrated significant evidence of association. In addition to examining the known LOAD genes, we carried out analyses using age at death as a quantitative trait. No genome-wide significant SNPs were discovered. Increasing sample size and statistical power will be imperative to detect genuine aging-associated variants in the future. In this report, we also discuss issues relating to the analysis of GWAS data from different centers and the bioinformatic approach required to distinguish spurious genome-wide significant signals from real SNP associations.
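The single-SNP tests described here follow a standard design: regress the phenotype (age at death) on genotype dosage under an additive model and compare the p-value against the genome-wide threshold. A minimal sketch of that quantitative-trait analysis, using simulated data and hypothetical parameter values rather than the study's cohort:

```python
# Sketch of a single-SNP quantitative-trait association test: age at death
# regressed on genotype dosage under an additive model. All data below are
# simulated and the effect size, MAF, and noise scale are illustrative
# assumptions; this is not the study's pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1385                                  # cohort size reported above
maf = 0.35                                # assumed minor-allele frequency
genotype = rng.binomial(2, maf, size=n)   # 0/1/2 copies of the minor allele
aad = 80.2 + 0.5 * genotype + rng.normal(0.0, 9.0, size=n)  # simulated AAD

# Additive linear model: AAD ~ genotype dosage
slope, intercept, rval, p_value, se = stats.linregress(genotype, aad)
print(f"beta = {slope:.3f} years/allele, p = {p_value:.2e}")

# Declare genome-wide significance only below the conventional 5e-8 cutoff.
print("genome-wide significant:", p_value < 5e-8)
```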

Relevance:

10.00%

Publisher:

Abstract:

Serum PEDF levels (mean (S.D.)) were increased in 96 Type 2 diabetic vs. 54 non-diabetic subjects: 5.3 (2.8) vs. 3.2 (2.0) μg/ml, p
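The reported p-value is truncated above. Purely as an illustration of the comparison being made (not a reproduction of the paper's analysis), a Welch's t-test can be recomputed from the published summary statistics:

```python
# Welch's t-test from summary statistics (mean, S.D., n per group), as an
# illustrative recomputation of the group comparison reported above.
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=5.3, std1=2.8, nobs1=96,   # Type 2 diabetic subjects
    mean2=3.2, std2=2.0, nobs2=54,   # non-diabetic subjects
    equal_var=False,                 # Welch's correction for unequal variances
)
print(f"t = {t:.2f}, p = {p:.1e}")
```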

Relevance:

10.00%

Publisher:

Abstract:

High power lasers have proven capable of producing high-energy γ-rays, charged particles and neutrons, and of inducing many kinds of nuclear reactions. At ELI, studies with high power lasers will enter for the first time into new domains of power and intensity: 10 PW and 10^23 W/cm^2. While the development of laser-based radiation sources is the main focus of the ELI-Beamlines pillar of ELI, at ELI-NP the studies that will benefit from High Power Laser System pulses will focus on Laser Driven Nuclear Physics (this TDR, acronym LDNP, associated with the E1 experimental area), High Field Physics and QED (associated with the E6 area), and fundamental research opened by the unique combination of the two 10 PW laser pulses with a gamma beam provided by the Gamma Beam System (associated with the E7 area). The scientific case of the LDNP TDR encompasses studies of laser-induced nuclear reactions, aiming at a better understanding of nuclear properties and of nuclear reaction rates in laser-plasmas, as well as at the development of radiation-source characterization methods based on nuclear techniques. As an example of the proposed studies: the promise of achieving solid-state-density bunches of (very) heavy ions accelerated to about 10 MeV/nucleon through the RPA mechanism will be exploited to produce neutron-rich nuclei of high astrophysical relevance around the N~126 waiting point, using the sequential fission-fusion scheme, complementary to any other existing or planned method of producing radioactive nuclei.
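For context on the quoted intensity, the standard relation between peak intensity and the normalized vector potential a_0 can be evaluated at 10^23 W/cm^2; the wavelength λ ≈ 0.8 μm assumed below is typical of Ti:sapphire systems and is not stated in the text:

```latex
a_0 \;\simeq\; 0.85\,\lambda[\mu\mathrm{m}]\,\sqrt{\frac{I}{10^{18}\,\mathrm{W\,cm^{-2}}}}
\;\approx\; 0.85 \times 0.8 \times \sqrt{10^{5}}
\;\approx\; 2.2 \times 10^{2}
\qquad \text{at } I = 10^{23}\,\mathrm{W\,cm^{-2}}.
```

An a_0 of order 10^2 is deep in the relativistic regime (a_0 ≫ 1), which is what makes mechanisms such as RPA accessible.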

The studies will be implemented predominantly in the E1 area of ELI-NP. However, many of them can, in a first stage, be performed in the E5 and/or E4 areas, where higher-repetition-rate laser pulses are available and the harsh X-ray and electromagnetic pulse (EMP) environments are less damaging than in E1.

A number of options discussed throughout the document have an important impact on the budget and the resources needed. Depending on the TDR review and subsequent project decisions, they may be taken into account for space reservation, while their detailed design and implementation are postponed.

The present TDR is the result of contributions from several institutions engaged in nuclear physics and high power laser research. A significant part of the proposed equipment can be designed, and afterwards built, only in close collaboration with (or by subcontracting to) some of these institutions. A Memorandum of Understanding (MOU) is currently under preparation with each of these key partners, as well as with others interested in participating in the design or in the future experimental program.

Relevance:

10.00%

Publisher:

Abstract:

Acute massive pulmonary embolism (PE) is a life-threatening event. Before the era of cardiopulmonary bypass, acute pulmonary embolectomy was historically attempted in patients with severe hemodynamic compromise. Klippel-Trenaunay syndrome (KTS) carries a significant lifelong risk of major thromboembolic events. We present two young patients with Klippel-Trenaunay syndrome who survived surgical embolectomy after massive PE and cardiopulmonary resuscitation, with good postoperative recovery. Even though the role of surgical embolectomy in massive PE is not clearly defined, with current technology it can be life-saving and can lead to complete recovery, especially in young patients such as those described in this study.

Relevance:

10.00%

Publisher:

Abstract:

Sustainable fisheries management is still an open problem, and viability theory offers an alternative for determining resource-management policies that guarantee sustainability, once the constraints defining the sustainable states of the system have been specified. The population dynamics of the Peruvian anchoveta were modeled using a Thomson–Bell age-structured model with discrete catches, coupled with a Ricker recruitment model, in semestral steps over the years 1963–1984. A desirable set of sustainable states was also defined, associated with the stock and catch levels that satisfy previously defined ecological, economic and social constraints. On this basis, we computed the set of stock states for which there exists a sequence of catches that keeps the stock in a sustainable state (a set called the viability kernel), together with a family of sets of viable catches, corresponding to all catch levels that can be applied to each stock state so that the stock remains within the viability kernel, that is, stays in a sustainable state. A sufficient condition for a non-empty viability kernel was found: the social quota (the minimum catch required to keep the fishery operating) must be below a landing of 915,800 t per semester. The historical catch series was compared with the catches obtained from viability theory for the period 1963–1984, showing overfishing from late 1968 onward, which led to the collapse of the fishery during the 1972–1973 El Niño. Based on the viability results, five fishery-management strategies (E1–E5) were defined for the Peruvian anchoveta, and we conclude that the medium viable precautionary strategy (E5) could have prevented the collapse of the anchoveta fishery while maintaining acceptable catch levels. Moreover, the ICES precautionary strategy (E2) did not ensure the sustainability of the stock during El Niño periods. We also conclude that a one-year fishing ban after the collapse would have been necessary for the stock to return to the viability kernel, enabling sustainable management thereafter. Viability theory, with the viability kernel and the associated viable catches, proved to be a useful tool for designing management strategies that ensure the sustainability of fishery resources.
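As a rough sketch of the viability-kernel computation described above (using a one-dimensional biomass proxy with Ricker dynamics rather than the age-structured Thomson–Bell model, and illustrative parameter values throughout), the kernel can be approximated by starting from the constraint set and iteratively discarding states from which no admissible catch keeps the successor inside the set:

```python
# Viability-kernel approximation on a 1-D stock grid. Ricker parameters,
# biomass bounds, and the catch range are illustrative assumptions; only
# the social quota of 915,800 t/semester comes from the text above.
import numpy as np

r, K = 1.2, 12e6              # hypothetical Ricker productivity and capacity (t)
quota = 915_800               # minimum "social" semestral catch (t)
b_min, b_max = 2e6, 20e6      # hypothetical constraint set on biomass (t)
grid = np.linspace(0.0, b_max, 401)       # biomass states
catches = np.linspace(quota, 4e6, 60)     # admissible catch levels

def step(b, h):
    """One semestral transition: remove catch h, then apply Ricker growth."""
    s = max(b - h, 0.0)
    return s * np.exp(r * (1.0 - s / K))

def in_set(x, mask):
    """Nearest-grid-point membership test (crude but adequate for a sketch)."""
    i = int(round(x / b_max * (len(grid) - 1)))
    return 0 <= i < len(grid) and bool(mask[i])

viable = (grid >= b_min) & (grid <= b_max)    # start from the constraint set
for _ in range(100):                          # fixed-point iteration
    new = np.array([m and any(in_set(step(b, h), viable) for h in catches)
                    for b, m in zip(grid, viable)])
    if np.array_equal(new, viable):
        break
    viable = new

kernel = grid[viable]
print(f"approximate viability kernel: [{kernel.min():.3g}, {kernel.max():.3g}] t")
```

For each state retained in the kernel, the catches h that keep the successor inside it are exactly the viable catches associated with that state.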

Relevance:

10.00%

Publisher:

Abstract:

Former owner: Gilles, Albert (1873-1959)

Relevance:

10.00%

Publisher:

Abstract:

Former owner: Gilles, Albert (1873-1959)

Relevance:

10.00%

Publisher:

Abstract:

Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to completely control the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
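The core of the Monte Carlo test technique [Dufour (2002)] is to replace the asymptotic null distribution with simulated replications of the statistic and use a rank-based p-value. A toy sketch for an AR(1) zero-coefficient test; the maximized MC test additionally maximizes this p-value over the nuisance-parameter space, which is not shown here:

```python
# Local Monte Carlo test: simulate the statistic under H0 and use the
# rank-based p-value p = (1 + #{S_i >= S_0}) / (N + 1), which is exact
# when alpha * (N + 1) is an integer. Toy AR(1) setting, not the full VAR.
import numpy as np

rng = np.random.default_rng(1)

def t_stat(y):
    """Absolute t-statistic for H0: phi = 0 in y_t = phi * y_{t-1} + e_t."""
    x, z = y[:-1], y[1:]
    phi = x @ z / (x @ x)
    resid = z - phi * x
    se = np.sqrt(resid @ resid / (len(x) - 1) / (x @ x))
    return abs(phi / se)

T, N = 80, 999                      # sample size, number of MC replications
y_obs = rng.normal(size=T)          # stand-in for observed data
s0 = t_stat(y_obs)

# Replications of the statistic under H0 (i.i.d. Gaussian errors here).
sims = np.array([t_stat(rng.normal(size=T)) for _ in range(N)])
p_mc = (1 + np.sum(sims >= s0)) / (N + 1)
print(f"MC p-value: {p_mc:.3f}")
```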

Relevance:

10.00%

Publisher:

Abstract:

This paper constructs and estimates a sticky-price, Dynamic Stochastic General Equilibrium model with heterogeneous production sectors. Sectors differ in price stickiness, capital-adjustment costs and production technology, and use output from one another as material and investment inputs, following an Input-Output Matrix and Capital Flow Table that represent the U.S. economy. By relaxing the standard assumption of symmetry, this model allows different sectoral dynamics in response to monetary policy shocks. The model is estimated by Simulated Method of Moments using sectoral and aggregate U.S. time series. Results indicate (1) substantial heterogeneity in price stickiness across sectors, with quantitatively larger differences between services and goods than previously found in micro studies that focus on final goods alone, (2) a strong sensitivity to monetary policy shocks on the part of construction and durable manufacturing, and (3) similar quantitative predictions at the aggregate level by the multi-sector model and a standard model that assumes symmetry across sectors.
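Simulated Method of Moments picks parameters so that moments of model-simulated series match the corresponding data moments. A toy sketch with a trivial AR(1) standing in for the multi-sector model; all names and values below are illustrative:

```python
# Toy SMM: minimize the distance between simulated and data moments.
# The "model" is an AR(1), not the multi-sector DSGE model above.
import numpy as np
from scipy.optimize import minimize

def simulate(theta, T=2000, seed=3):
    """Simulate y_t = rho * y_{t-1} + sigma * e_t with a fixed seed."""
    rho, sigma = theta
    e = np.random.default_rng(seed).normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + sigma * e[t]
    return y

def moments(y):
    """Target moments: variance and first-order autocorrelation."""
    return np.array([y.var(), np.corrcoef(y[:-1], y[1:])[0, 1]])

data = simulate((0.8, 1.0), seed=42)         # stand-in for observed data
m_data = moments(data)

def smm_loss(theta):
    g = moments(simulate(theta)) - m_data    # moment gap
    return g @ g                             # identity weighting matrix

res = minimize(smm_loss, x0=[0.5, 0.5], method="Nelder-Mead")
print("SMM estimates (rho, sigma):", res.x)
```

Holding the simulation seed fixed across parameter evaluations (common random numbers) keeps the objective well-behaved for derivative-free optimization.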

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the theoretical and empirical implications of monetary policy making by committee under three different voting protocols. The protocols are a consensus model, where a super-majority is required for a policy change; an agenda-setting model, where the chairman controls the agenda; and a simple majority model, where policy is determined by the median member. These protocols give preeminence to different aspects of the actual decision-making process and capture the observed heterogeneity in formal procedures across central banks. The models are estimated by Maximum Likelihood using interest rate decisions by the committees of five central banks, namely the Bank of Canada, the Bank of England, the European Central Bank, the Swedish Riksbank, and the U.S. Federal Reserve. For all central banks, results indicate that the consensus model is statistically superior to the alternative models. This suggests that despite institutional differences, committees share unwritten rules and informal procedures that deliver observationally equivalent policy decisions.
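Schematically, the three protocols map members' preferred interest rates to a committee decision as follows; the super-majority threshold, step size, and tie-breaking choices are illustrative assumptions, not the paper's estimated specification:

```python
# Sketches of the three voting protocols compared above.
from statistics import median

def simple_majority(prefs):
    """Median-member rule: policy is the median preferred rate."""
    return median(prefs)

def consensus(prefs, status_quo, share=2/3, step=0.25):
    """Keep the status quo unless a super-majority wants to move it."""
    for direction in (+1, -1):
        votes = sum((p - status_quo) * direction > 0 for p in prefs)
        if votes >= share * len(prefs):
            return status_quo + direction * step
    return status_quo

def agenda_setting(prefs, status_quo, chair_index=0):
    """Chair proposes their preferred rate; it passes if a majority
    prefers it to the status quo, else the status quo stands."""
    proposal = prefs[chair_index]
    votes = sum(abs(p - proposal) < abs(p - status_quo) for p in prefs)
    return proposal if votes > len(prefs) / 2 else status_quo

prefs = [4.25, 4.50, 4.50, 4.75, 5.00]          # members' preferred rates
print(simple_majority(prefs))                   # 4.50 (median member)
print(consensus(prefs, status_quo=4.50))        # 4.50 (no super-majority to move)
print(agenda_setting(prefs, status_quo=4.50))   # 4.50 (chair's 4.25 proposal fails)
```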

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we study the macroeconomic implications of sectoral heterogeneity and, in particular, heterogeneity in price setting, through the lens of a highly disaggregated multi-sector model. The model incorporates several realistic features and is estimated using a mix of aggregate and sectoral U.S. data. The frequencies of price changes implied by our estimates are remarkably consistent with those reported in micro-based studies, especially for non-sale prices. The model is used to study (i) the contribution of sectoral characteristics to the observed cross-sectional heterogeneity in sectoral output and inflation responses to a monetary policy shock, (ii) the implications of sectoral price rigidity for aggregate output and inflation dynamics and for cost pass-through, and (iii) the role of sectoral shocks in explaining sectoral prices and quantities.
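One standard way to relate a "frequency of price changes" to a sticky-price model is the Calvo mapping between the per-period reset probability and the implied price duration; this is a generic illustration, since the abstract does not spell out the pricing mechanism:

```latex
\text{Per-period reset probability } 1-\theta
\;\Longrightarrow\;
\mathbb{E}[\text{price duration}] = \frac{1}{1-\theta},
\qquad
\text{e.g. } \theta = 0.75 \text{ per quarter} \;\Rightarrow\; 4 \text{ quarters between changes on average.}
```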