973 results for Typographical Errors


Relevance: 10.00%

Abstract:

Only a few cases of classical phenylketonuria (PKU) in premature infants have been reported. Treatment of these patients is challenging due to the lack of a phenylalanine-free amino acid solution for parenteral infusion. The boy was born at 27 weeks of gestation with a weight of 1000 g (P10). He received parenteral nutrition with a protein intake of 3 g/kg/day. On day 7 he was diagnosed with classical PKU (genotype IVS10-11G>A/IVS12+1G>A) due to a highly elevated phenylalanine (Phe) level in newborn screening (2800 micromol/L). His plasma Phe level peaked at 3696 micromol/L. Phe intake was stopped for 4 days, during which the boy received intravenous glucose and lipids as well as small amounts of Phe-free formula by nasogastric tube. Due to a deficit of essential amino acids and insufficient growth, parenteral nutrition rich in branched-chain amino acids and relatively poor in Phe was added in order to promote protein synthesis without Phe overload. Under this regimen, Phe plasma levels normalized by day 19, when intake of natural protein was started. The boy now has a corrected age of 2 years and shows normal growth parameters and psychomotor development. Despite a long period of highly elevated Phe levels in the postnatal period, our patient shows good psychomotor development. The management of premature infants with PKU depends on the child's tolerance of enteral nutrition; it demands intensive follow-up by an experienced team and a dedicated dietician. Appropriate Phe-free parenteral nutrition would be necessary, especially in case of gastrointestinal complications of prematurity.

Relevance: 10.00%

Abstract:

Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) the organizational aspects of GWAMAs and (ii) QC at the study-file level, the meta-level across studies, and the meta-analysis output level. Real-world examples highlight issues experienced, and solutions developed, by the GIANT Consortium, which has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details on the use of EasyQC, a powerful and flexible software package. Precise timings will be greatly influenced by consortium size; for consortia of a size comparable to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
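
As an illustration of the kind of study-file-level check the protocol describes, the sketch below flags variants whose reported p-values disagree with those implied by their effect estimates (a "P-Z" check) and drops impossible allele frequencies. It is a generic Python sketch, not EasyQC itself, and the column names (BETA, SE, P, EAF) are assumptions about the study-file layout.

```python
import numpy as np
import pandas as pd
from scipy.stats import norm

def eaf_filter(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with impossible allele frequencies or missing statistics."""
    ok = (df["EAF"] > 0.0) & (df["EAF"] < 1.0)
    return df[ok].dropna(subset=["BETA", "SE", "P"])

def pz_check(df: pd.DataFrame, tol: float = 0.1) -> pd.DataFrame:
    """Flag rows whose reported P deviates from the P implied by BETA/SE."""
    p_implied = 2.0 * norm.sf(np.abs(df["BETA"] / df["SE"]))
    diff = np.abs(np.log10(p_implied) - np.log10(df["P"]))
    return df[diff > tol]

# Tiny stand-in for a study summary-statistics file; the last row carries a
# deliberately inconsistent P to show what the check catches.
study = pd.DataFrame({
    "EAF":  [0.21, 0.48, 0.95, 0.33],
    "BETA": [0.05, -0.11, 0.02, 0.40],
    "SE":   [0.02, 0.03, 0.01, 0.05],
    "P":    [0.0124, 2.4e-4, 0.0455, 0.5],
})
print(pz_check(eaf_filter(study)))  # only the inconsistent row is reported
```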

Relevance: 10.00%

Abstract:

A 3D in vitro model of rat organotypic brain cell cultures in aggregates was used to investigate neurotoxicity mechanisms in glutaric aciduria type I (GA-I). Glutarate (GA) or 3-hydroxyglutarate (3OHGA), at 1 mM, was repeatedly added to the culture media at two different time points. In cultures treated with 3OHGA, we observed an increase in lactate in the medium, pointing to a possible inhibition of the Krebs cycle and respiratory chain. We further observed that 3OHGA, and to a lesser extent GA, induced an increase in ammonia production with a concomitant decrease in glutamine concentrations, which may suggest an inhibition of the astrocytic enzyme glutamine synthetase. These previously unreported findings may uncover a pathogenic mechanism in this disease that has deleterious effects on early stages of brain development. By immunohistochemistry we showed that 3OHGA increased non-apoptotic cell death. At the cellular level, 3OHGA, and to a lesser extent GA, led to cell swelling and loss of astrocytic fibers, whereas a loss of oligodendrocytes was only observed for 3OHGA. We conclude that 3OHGA was the most toxic metabolite in our model of GA-I: it induced deleterious effects on glial cells, increased ammonia production, and resulted in accentuated cell death of non-apoptotic origin.

Relevance: 10.00%

Abstract:

The aim of this study is to quantify the prevalence and types of rare chromosome abnormalities (RCAs) in Europe for 2000-2006 inclusive, and to describe prenatal diagnosis rates and pregnancy outcome. Data held by the European Surveillance of Congenital Anomalies database were analysed for all cases from 16 population-based registries in 11 European countries diagnosed prenatally or before 1 year of age and delivered between 2000 and 2006. Cases were all unbalanced chromosome abnormalities and included live births, fetal deaths from 20 weeks gestation, and terminations of pregnancy for fetal anomaly. There were 10,323 cases with a chromosome abnormality, giving a total birth prevalence rate of 43.8/10,000 births. Of these, 7,335 cases had trisomy 21, 18 or 13, giving individual prevalence rates of 23.0, 5.9 and 2.3/10,000 births, respectively (53, 13 and 5% of all reported chromosome errors, respectively). In all, 473 cases (5%) had a sex chromosome trisomy and 778 (8%) had 45,X, giving prevalence rates of 2.0 and 3.3/10,000 births, respectively. There were 1,737 RCA cases (17%), giving a prevalence of 7.4/10,000 births; these included triploidy, other trisomies, marker chromosomes, unbalanced translocations, deletions and duplications. There was wide variation between the registries in both the overall prenatal diagnosis rate of RCA, an average of 65% (range 5-92%), and the prevalence of RCA (range 2.4-12.9/10,000 births). In all, 49% were liveborn. The data provide the prevalence of families currently requiring specialised genetic counselling services in the perinatal period for these conditions and, for some, long-term care.
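
The prevalence arithmetic in the abstract can be reproduced directly; in the sketch below the total number of births is backed out from the overall rate and is therefore an approximation rather than a registry figure.

```python
# Reproducing the reported prevalence rates. The births denominator is
# implied by 10,323 cases at an overall rate of 43.8/10,000 births.
cases_total, rate_total = 10_323, 43.8          # rate per 10,000 births
births = cases_total / rate_total * 10_000      # roughly 2.36 million births

def per_10000(cases: int) -> float:
    return cases / births * 10_000

for label, n in [("sex chromosome trisomy", 473),
                 ("45,X", 778),
                 ("rare chromosome abnormalities", 1_737)]:
    print(f"{label}: {per_10000(n):.1f} per 10,000 births")  # 2.0, 3.3, 7.4
```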

Relevance: 10.00%

Abstract:

BACKGROUND: Multiple interventions were made to optimize the medication process in our intensive care unit (ICU): (1) transcriptions from the medical order form to the administration plan were eliminated by merging both into a single document; (2) the new form was built in a logical sequence and was highly structured to promote completeness and standardization of information; (3) frequently used drug names, approved units, and fixed routes were pre-printed; and (4) physicians and nurses were trained in the correct use of the new form. This study aimed to evaluate the impact of these interventions on clinically significant types of medication errors. METHODS: Eight types of medication errors were measured by prospective chart review before and after the interventions in the ICU of a public tertiary care hospital. We used an interrupted time-series design to control for secular trends. RESULTS: Over 85 days, 9,298 lines of drug prescription and/or administration to 294 patients, corresponding to 754 patient-days, were collected and analysed for the three series before and the three series following the intervention. The global error rate decreased from 4.95 to 2.14% (-56.8%, P < 0.001). CONCLUSIONS: The safety of the medication process in our ICU was improved by simple and inexpensive interventions. In addition to optimizing the prescription-writing process, the documentation of intravenous preparation, and the scheduling of administration, the elimination of transcription, combined with the training of users, contributed to reducing errors and carried substantial potential to increase safety.
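
For readers unfamiliar with the design, the sketch below shows the segmented-regression form that underlies a simple interrupted time-series analysis. The data are simulated and the variable names are illustrative; nothing here reproduces the study's actual series.

```python
# Segmented regression for an interrupted time series: a pre-existing trend
# (t), a level shift at the intervention (post), and a post-intervention
# change in slope (t_after). All values below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
t = np.arange(12)                        # six periods before, six after
post = (t >= 6).astype(int)              # indicator for the intervention
rate = 5.0 - 0.05 * t - 2.5 * post + rng.normal(0, 0.3, 12)  # error rate, %

df = pd.DataFrame({"rate": rate, "t": t, "post": post,
                   "t_after": np.where(post == 1, t - 6, 0)})
model = smf.ols("rate ~ t + post + t_after", data=df).fit()
print(model.params)  # 'post' estimates the immediate level change
```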

Relevance: 10.00%

Abstract:

Expectations about the future are central to the determination of current macroeconomic outcomes and the formulation of monetary policy. Recent literature has explored ways of supplementing the benchmark of rational expectations with explicit models of expectations formation that rely on econometric learning. Some apparently natural policy rules turn out to imply expectational instability of private agents' learning. We use the standard New Keynesian model to illustrate this problem and survey the key results on interest-rate rules that deliver both uniqueness and stability of equilibrium under econometric learning. We then consider practical concerns such as measurement errors in private expectations, observability of variables, and learning of the structural parameters required for policy. We also discuss some recent applications, including policy design under perpetual learning, estimated models with learning, recurrent hyperinflations, and macroeconomic policy to combat liquidity traps and deflation.
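
A minimal illustration of econometric learning and expectational (in)stability, the central concepts above: agents estimate the mean of a self-referential process with a decreasing-gain recursion, which converges when the expectational feedback is weak and drifts away when it is too strong. The model and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_learning(alpha: float, mu: float = 2.0, periods: int = 2000,
                      seed: int = 0) -> float:
    """Agents learn the mean of p_t = mu + alpha * E[p_t] + noise."""
    rng = np.random.default_rng(seed)
    belief = 0.0                              # estimate of the mean of p
    for t in range(1, periods + 1):
        p = mu + alpha * belief + rng.normal(0, 0.1)
        belief += (p - belief) / t            # decreasing-gain update
    return belief

print(simulate_learning(alpha=0.5))  # E-stable: ends near mu/(1-alpha) = 4.0
print(simulate_learning(alpha=1.1))  # E-unstable: drifts away from the REE
```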

Relevance: 10.00%

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
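
For concreteness, the sketch below shows the bare bones of a random-walk Metropolis-Hastings sampler of the sort used to estimate a posterior density by MCMC. The toy target (the posterior of a Gaussian mean under a flat prior) merely stands in for the RBC model's posterior; nothing here implements the paper's VARMA error specification.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(1.5, 1.0, size=200)   # simulated observations

def log_posterior(theta: float) -> float:
    # Flat prior plus Gaussian likelihood with known unit variance.
    return -0.5 * np.sum((data - theta) ** 2)

theta, draws = 0.0, []
for _ in range(10_000):
    proposal = theta + rng.normal(0, 0.2)   # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal                    # accept; otherwise keep theta
    draws.append(theta)

print(np.mean(draws[2000:]))  # posterior mean, close to the sample mean
```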

Relevance: 10.00%

Abstract:

As the tasks carried out by mobile agents grow in complexity, it becomes increasingly important that they do not lose the work already done; we need to know at all times that execution is proceeding favourably. This project describes the development of a fault-tolerance component, from the initial idea through to its implementation. We analyse the problem and design a solution. The component is intended to mask the failure of an agent by detecting the failure and then resuming execution from the point at which it was interrupted, while following the design methodology for mobile agents on lightweight platforms.
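
As a rough illustration of the failure-masking idea, the sketch below checkpoints an agent's progress so that execution can resume from the last completed step after a detected failure. The checkpoint file and task list are illustrative; a real mobile-agent platform would persist state through the platform itself.

```python
# Checkpoint/restore sketch: progress is saved after each unit of work, so a
# rerun after a crash resumes from the interruption point, not from scratch.
import json
import os

CHECKPOINT = "agent_checkpoint.json"  # illustrative local stand-in

def load_checkpoint() -> int:
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as fh:
            return json.load(fh)["next_step"]
    return 0

def save_checkpoint(step: int) -> None:
    with open(CHECKPOINT, "w") as fh:
        json.dump({"next_step": step}, fh)

def run_agent(tasks) -> None:
    start = load_checkpoint()           # resume from the last saved step
    for i in range(start, len(tasks)):
        tasks[i]()                      # perform the unit of work
        save_checkpoint(i + 1)          # persist progress immediately

tasks = [(lambda k=k: print(f"step {k} done")) for k in range(5)]
run_agent(tasks)
```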

Relevance: 10.00%

Abstract:

This work presents a system to detect and classify binary objects according to their shape. In the first step of the procedure, a filter is applied to extract the object's contour. From the shape points, a BSM descriptor with highly descriptive, universal and invariant features is obtained. In the second phase of the system, the descriptor information is learned and classified using AdaBoost and Error-Correcting Output Codes. Public databases, in both grayscale and colour, have been used to validate the implementation of the designed system. In addition, the system provides an interactive interface in which different image-processing methods can be applied.
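
The learning stage described above can be approximated with off-the-shelf components, as in the sketch below: an AdaBoost classifier wrapped in Error-Correcting Output Codes via scikit-learn. The BSM descriptor itself is not implemented here; the random matrix X merely stands in for descriptor vectors already extracted from object contours.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))      # stand-in 64-D shape descriptors
y = rng.integers(0, 5, size=300)    # five shape classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ecoc = OutputCodeClassifier(AdaBoostClassifier(n_estimators=50),
                            code_size=2.0, random_state=0)
ecoc.fit(X_tr, y_tr)
# Accuracy is chance-level on these random placeholders; with real
# descriptors the ECOC codewords add robustness to binary-learner errors.
print("accuracy:", ecoc.score(X_te, y_te))
```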

Relevance: 10.00%

Abstract:

Spatial econometrics has been criticized by some economists because some model specifications have been driven by data-analytic considerations rather than having a firm foundation in economic theory. In particular this applies to the so-called W matrix, which is integral to the structure of endogenous and exogenous spatial lags and to spatial error processes, and which is almost the sine qua non of spatial econometrics. Moreover, it has been suggested that the significance of a spatially lagged dependent variable involving W may be misleading, since it may simply be picking up the effects of omitted spatially dependent variables, incorrectly suggesting the existence of a spillover mechanism. In this paper we review the theoretical and empirical rationale for network dependence and spatial externalities as embodied in spatially lagged variables, arguing that failing to acknowledge their presence at the least leads to biased inference, can be a cause of inconsistent estimation, and leads to an incorrect understanding of true causal processes.
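
To make the W matrix concrete, the sketch below builds a row-standardized contiguity matrix for four regions arranged along a line and computes the spatial lag Wy it induces; the neighbourhood structure is purely illustrative.

```python
import numpy as np

# Binary contiguity for regions 0-1-2-3 on a line: 1 marks a shared border.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)   # row-standardize so each row sums to 1

y = np.array([2.0, 4.0, 6.0, 8.0])  # outcome in each region
spatial_lag = W @ y                 # each region's neighbour average
print(spatial_lag)                  # [4. 4. 6. 6.]
```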

Relevance: 10.00%

Abstract:

‘Modern’ Phillips curve theories predict that inflation is an integrated, or near-integrated, process. However, inflation appears bounded above and below in developed economies, and so cannot be ‘truly’ integrated; it is more likely stationary around a shifting mean. If agents believe inflation is integrated, as in the ‘modern’ theories, then they are making systematic errors about the statistical process of inflation. An alternative theory of the Phillips curve is developed that is consistent with the ‘true’ statistical process of inflation. It is demonstrated that United States inflation data are consistent with the alternative theory but not with the existing ‘modern’ theories.
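
The statistical distinction at issue can be illustrated with a unit-root test, as in the sketch below: an augmented Dickey-Fuller test applied to a truly integrated series and to a series that is stationary around a single mean shift. The simulated series are illustrative; the paper's argument concerns actual US inflation data.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 400
random_walk = np.cumsum(rng.normal(0, 0.5, n))         # truly integrated, I(1)
means = np.where(np.arange(n) < 200, 2.0, 5.0)         # one shift in the mean
stationary_shift = means + rng.normal(0, 0.5, n)       # otherwise stationary

# The random walk should fail to reject the unit root; how the test treats
# the mean-shift series illustrates why breaks complicate inference about
# whether inflation is integrated or stationary around a shifting mean.
for name, series in [("random walk", random_walk),
                     ("stationary with mean shift", stationary_shift)]:
    stat, pvalue = adfuller(series)[:2]
    print(f"{name}: ADF statistic {stat:.2f}, p-value {pvalue:.3f}")
```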

Relevance: 10.00%

Abstract:

In this paper we investigate the ability of a number of different ordered probit models to predict ratings based on firm-specific data on business and financial risks. We investigate models based on momentum, drift and ageing, and compare them against alternatives that take into account the initial rating of the firm and its previous actual rating. Using data on US bond-issuing firms rated by Fitch over the years 2000 to 2007, we compare the performance of these models in predicting the rating in-sample and out-of-sample using root mean squared errors, Diebold-Mariano tests of forecast performance, and contingency tables. We conclude that initial and previous states have a substantial influence on rating prediction.
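
As a concrete example of the model class, the sketch below fits an ordered probit with statsmodels' OrderedModel. The regressors are simulated stand-ins for firm-specific risk measures, with a previous-rating variable included in the spirit of the paper's preferred specifications.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({"leverage": rng.normal(size=n),       # risk stand-in
                  "prev_rating": rng.integers(0, 4, n)})  # previous state
# Latent creditworthiness cut into four ordered rating categories.
latent = 0.8 * X["prev_rating"] - 0.5 * X["leverage"] + rng.normal(size=n)
rating = pd.cut(latent, bins=[-np.inf, 0, 1.5, 3, np.inf],
                labels=[0, 1, 2, 3]).astype(int)

model = OrderedModel(rating, X, distr="probit").fit(method="bfgs", disp=False)
print(model.params)  # slopes plus estimated category thresholds
```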

Relevance: 10.00%

Abstract:

Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
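
For readers who want to see the VECM machinery, the sketch below fits a two-variable vector error correction model with statsmodels on simulated cointegrated data; the series are placeholders, not the paper's output, consumption, investment and hours data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
n = 300
trend = np.cumsum(rng.normal(0, 1, n))      # shared stochastic trend
data = pd.DataFrame({"y1": trend + rng.normal(0, 0.5, n),
                     "y2": 0.7 * trend + rng.normal(0, 0.5, n)})

# One cointegrating relation, one lagged difference, constant in the
# cointegrating relation ("ci").
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.beta)   # estimated cointegrating vector
```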

Relevance: 10.00%

Abstract:

Accurate chromosome segregation during mitosis is temporally and spatially coordinated by fidelity-monitoring checkpoint systems. Deficiencies in these checkpoint systems can lead to chromosome segregation errors and aneuploidy, and promote tumorigenesis. Here, we report that the TRAF-interacting protein (TRAIP), a ubiquitously expressed nucleolar E3 ubiquitin ligase important for cellular proliferation, is localized close to mitotic chromosomes. Its knockdown in HeLa cells by RNA interference (RNAi) decreased the time of early mitosis progression from nuclear envelope breakdown (NEB) to anaphase onset and increased the percentages of chromosome alignment defects in metaphase and lagging chromosomes in anaphase compared with those of control cells. The decrease in progression time was corrected by the expression of wild-type but not a ubiquitin-ligase-deficient form of TRAIP. TRAIP-depleted cells bypassed taxol-induced mitotic arrest and displayed significantly reduced kinetochore levels of MAD2 (also known as MAD2L1) but not of other spindle checkpoint proteins in the presence of nocodazole. These results imply that TRAIP regulates the spindle assembly checkpoint, MAD2 abundance at kinetochores and the accurate cellular distribution of chromosomes. The TRAIP ubiquitin ligase activity is functionally required for the spindle assembly checkpoint control.

Relevance: 10.00%

Abstract:

We study the impact of anticipated fiscal policy changes in a Ramsey economy where agents form long-horizon expectations using adaptive learning. We extend the existing framework by introducing distortionary taxes as well as elastic labour supply, which makes agents' decisions non-predetermined but more realistic. We find that the dynamic responses to anticipated tax changes under learning exhibit oscillatory behaviour that can be interpreted as self-fulfilling waves of optimism and pessimism emerging from systematic forecast errors. Moreover, we demonstrate that these waves can have important implications for the welfare consequences of fiscal reforms. (JEL: E32, E62, D84)
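
A toy version of the learning mechanism can make the source of systematic forecast errors visible, as in the sketch below: beliefs are updated by a constant-gain recursion, so an anticipated shift in fundamentals at a known date still produces a spike in forecast errors when it takes effect. The full Ramsey dynamics and welfare analysis are not implemented here; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
gain, alpha = 0.1, 0.5            # constant gain; expectational feedback
belief, errors = 2.0, []
for t in range(100):
    mu = 1.0 if t < 50 else 0.5   # announced shift takes effect at t = 50
    outcome = mu + alpha * belief + rng.normal(0, 0.05)
    errors.append(outcome - belief)     # agents' forecast error
    belief += gain * (outcome - belief)  # constant-gain belief update

# Errors hover near zero before the change, then spike at t = 50 and decay
# only gradually as beliefs adjust toward the new fixed point.
print(np.round(errors[48:56], 3))
```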