201 results for Construct
Abstract:
BACKGROUND: The purpose of the optic nerve sheath diameter (ONSD) research group project is to establish an individual patient-level database from high-quality studies of ONSD ultrasonography for the detection of raised intracranial pressure (ICP), and to perform a systematic review and an individual patient data meta-analysis (IPDMA), which will provide a cutoff value to help physicians make decisions and encourage further research. Previous meta-analyses were able to assess the diagnostic accuracy of ONSD ultrasonography in detecting raised ICP but failed to determine a precise cutoff value. Thus, the ONSD research group was founded to synthesize data from several recent studies on the subject and to provide evidence on the diagnostic accuracy of ONSD ultrasonography in detecting raised ICP. METHODS: This IPDMA will be conducted in several phases. First, we will systematically search for eligible studies. To be eligible, studies must have compared ONSD ultrasonography to invasive intracranial devices, the current reference standard for diagnosing raised ICP. Subsequently, we will assess the quality of the included studies with the QUADAS-2 tool, and then collect and validate individual patient data. The objectives of the primary analyses will be to assess the diagnostic accuracy of ONSD ultrasonography and to determine a precise cutoff value for detecting raised ICP. Second, we will construct a logistic regression model to assess whether patient and study characteristics influence diagnostic accuracy. DISCUSSION: We believe that this IPDMA will provide the most reliable basis for assessing the diagnostic accuracy of ONSD ultrasonography for detecting raised ICP and for providing a cutoff value. We also hope that the creation of the ONSD research group will encourage further study. TRIAL REGISTRATION: PROSPERO registration number: CRD42012003072.
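The protocol does not state how the cutoff will be derived from the pooled data; a common choice for this kind of analysis is ROC analysis with the Youden index. The sketch below illustrates that approach on entirely simulated data (the variable names and the method itself are our assumptions, not the group's prespecified analysis):

```python
# Minimal sketch of cutoff selection via the Youden index (J = sensitivity + specificity - 1).
# Illustrative only: data are simulated placeholders and this is not necessarily the
# analysis prespecified by the ONSD research group.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
raised_icp = rng.integers(0, 2, 500)                        # hypothetical invasive-ICP reference (0/1)
onsd_mm = 5.0 + 0.8 * raised_icp + rng.normal(0, 0.5, 500)  # hypothetical ONSD measurements (mm)

fpr, tpr, thresholds = roc_curve(raised_icp, onsd_mm)
j = tpr - fpr                          # Youden's J at each candidate threshold
cutoff = thresholds[np.argmax(j)]      # threshold maximizing sensitivity + specificity - 1
print(f"Youden-optimal ONSD cutoff: {cutoff:.2f} mm")
```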
Abstract:
Schizotypy refers to a set of personality traits thought to reflect the subclinical expression of the signs and symptoms of schizophrenia. Here, we review the cognitive and brain functional profile associated with high questionnaire scores in schizotypy. We discuss empirical evidence from the domains of perception, attention, memory, imagery and representation, language, and motor control. Perceptual deficits occur early and across various modalities. Whilst the neural mechanisms underlying visual impairments may be linked to magnocellular dysfunction, further effects may be seen downstream in higher cognitive functions. Cognitive deficits are observed in inhibitory control, selective and sustained attention, and incidental learning and memory. In concordance with the cognitive nature of many of the aberrations of schizotypy, higher levels of schizotypy are associated with enhanced vividness of mental imagery and better performance on mental rotation tasks. Language deficits seem most pronounced in higher-level processes. Finally, higher levels of schizotypy are associated with reduced performance on oculomotor tasks, resembling the impairments seen in schizophrenia. Some of these deficits are accompanied by reduced brain activation, akin to the pattern of hypoactivations in schizophrenia spectrum individuals. We conclude that schizotypy is a construct with apparent phenomenological overlap with schizophrenia and stable inter-individual differences that covary with performance on a wide range of perceptual, cognitive and motor tasks known to be impaired in schizophrenia. The importance of these findings lies not only in providing a fine-grained neurocognitive characterisation of a personality constellation known to be associated with real-life impairments, but also in generating hypotheses concerning the aetiology of schizophrenia.
Abstract:
The gacA gene of the biocontrol strain Pseudomonas fluorescens CHA0 codes for a response regulator which, together with the sensor kinase GacS (=LemA), is required for the production of exoenzymes and secondary metabolites involved in biocontrol, including hydrogen cyanide (HCN). A gacA multicopy suppressor was isolated from a cosmid library of strain CHA0 and identified as the infC-rpmI-rplT operon, which encodes the translation initiation factor IF3 and the ribosomal proteins L35 and L20. The efficiency of suppression was about 30%, as determined by the use of a GacA-controlled reporter construct, i.e. a translational hcnA'-'lacZ fusion. Overexpression of the rsmA gene (coding for a global translational repressor) reversed the suppressive effect of the amplified infC operon. This finding suggests that some product(s) of the infC operon can compete with RsmA at the level of translation in P. fluorescens CHA0 and that important biocontrol traits can be regulated at this level.
Abstract:
Gene expression often cycles between active and inactive states in eukaryotes, yielding variable or noisy gene expression in the short term, while slow epigenetic changes may lead to silencing or variegated expression. Understanding how cells control these effects will be of paramount importance for constructing biological systems with predictable behavior. Here we find that a human matrix attachment region (MAR) genetic element controls the stability and heritability of gene expression in cell populations. Mathematical modeling indicated that the MAR controls the probability of long-term transitions between active and inactive expression, thus reducing silencing effects and increasing the reactivation of silent genes. Single-cell short-term assays revealed persistent expression and reduced expression noise in MAR-driven genes, while stochastic bursts of expression occurred without this genetic element. The MAR thus confers a more deterministic behavior on an otherwise stochastic process, providing a means towards more reliable expression of engineered genetic systems.
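The abstract does not reproduce the model; one minimal way to picture the reported effect is a two-state (active/inactive) Markov switching process in which a MAR-like element lowers the silencing rate and raises the reactivation rate. The sketch below is our illustration under that assumption, with arbitrary rates:

```python
# Two-state random-telegraph sketch of active/inactive gene expression switching.
# The model and the rates k_on (reactivation) and k_off (silencing) are our illustrative
# assumptions, not the paper's fitted model; a MAR-like element is represented here
# simply as a lower k_off and a higher k_on.
import numpy as np

def active_fraction(k_on, k_off, n_steps=100_000, seed=0):
    """Fraction of time spent in the active state of a discrete-time two-state chain."""
    rng = np.random.default_rng(seed)
    state, active = 1, 0  # start in the active state
    for _ in range(n_steps):
        active += state
        if state == 1:
            state = 0 if rng.random() < k_off else 1
        else:
            state = 1 if rng.random() < k_on else 0
    return active / n_steps  # converges to k_on / (k_on + k_off)

print("without MAR:", active_fraction(k_on=0.01, k_off=0.05))  # mostly silent (~1/6 active)
print("with MAR   :", active_fraction(k_on=0.05, k_off=0.01))  # mostly active (~5/6 active)
```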
Abstract:
Retroviral vectors have many favorable properties for gene therapies, but their use remains limited by safety concerns and/or by the relatively low titers of some of the safer self-inactivating (SIN) derivatives. In this study, we evaluated whether increased production of SIN retroviral vectors can be achieved with matrix attachment region (MAR) epigenetic regulators. Two MAR elements of human origin were found to increase and to stabilize the expression of the green fluorescent protein transgene in stably transfected HEK-293 packaging cells. Introduction of one of these MAR elements into retroviral vector-producing plasmids yielded higher expression of the viral vector RNA. Consistently, viral titers obtained from transient transfection of MAR-containing plasmids were increased up to sixfold as compared with the parental construct, when evaluated in different packaging cell systems and transfection conditions. Thus, use of MAR elements opens new perspectives for the efficient generation of gene therapy vectors.
Abstract:
The book of Joshua is at the very center of the recent discussion about the existence of a coherent Deuteronomistic redaction of Deut to 2 Kings during the exilic period. This article analyses the beginning (Josh 1.1-9) and the end (Josh 23 and 24) of Josh. Josh 1.1-2,5-7 and chapter 23 belong to the dtr edition. Josh 23 was followed by Judg 2.6ff. During the Persian period, Deuteronomists and priests intended to publish one Law for the whole community. There was probably a discussion whether the Torah should be a Penta- or a Hexateuch. This discussion may explain a text such as Josh 24, which clearly tries to construct a Hexateuch (cf. also Gen. 50.25; Exod 13.19; Josh 24.32). But since the Torah is about foundations, the main theological trends agreed that it should end with the death of Moses.
Abstract:
Recently, graph theory and complex networks have been widely used as a means to model the functionality of the brain. Among the different neuroimaging techniques available for constructing brain functional networks, electroencephalography (EEG), with its high temporal resolution, is a useful instrument for the analysis of functional interdependencies between different brain regions. Alzheimer's disease (AD) is a neurodegenerative disease which leads to substantial cognitive decline and, eventually, dementia in aged people. To achieve a deeper insight into the behavior of functional cerebral networks in AD, here we study their synchronizability in 17 newly diagnosed AD patients compared to 17 healthy control subjects in a no-task, eyes-closed condition. The cross-correlation of artifact-free EEGs was used to construct the brain functional networks. The extracted networks were then tested for their synchronization properties by calculating the eigenratio of the Laplacian matrix of the connection graph, i.e., the largest eigenvalue divided by the second smallest one. In AD patients, we found an increase in the eigenratio, i.e., a decrease in the synchronizability of brain networks across the delta, alpha, beta, and gamma EEG frequencies within a wide range of network costs. The finding indicates a disruption of functional brain networks in early AD.
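For concreteness, here is a compact sketch of the eigenratio computation described above; the thresholding scheme (keeping the strongest fraction of links as a proxy for "network cost") is a simplified stand-in for the paper's procedure, and the input is random placeholder data:

```python
# Sketch: functional network from EEG cross-correlations, then synchronizability via the
# Laplacian eigenratio lambda_max / lambda_2 (a larger ratio means lower synchronizability).
# Assumes the thresholded graph is connected (otherwise lambda_2 ~ 0 and the ratio blows up).
import numpy as np

def eigenratio(eeg, cost=0.3):
    """eeg: channels x samples array; keep the strongest `cost` fraction of possible links."""
    n = eeg.shape[0]
    corr = np.abs(np.corrcoef(eeg))           # pairwise correlation magnitudes
    np.fill_diagonal(corr, 0.0)
    k = int(cost * n * (n - 1) / 2)           # number of edges to keep
    thr = np.sort(corr[np.triu_indices(n, 1)])[-k]
    adj = (corr >= thr).astype(float)
    lap = np.diag(adj.sum(axis=1)) - adj      # graph Laplacian L = D - A
    eig = np.linalg.eigvalsh(lap)             # eigenvalues in ascending order
    return eig[-1] / eig[1]

eeg = np.random.default_rng(0).normal(size=(19, 2048))  # placeholder 19-channel recording
print(eigenratio(eeg))
```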
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered in estimating the uncertainty. We propose to also use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which the single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the approximate multiscale finite-volume (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
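A rough sketch of this workflow, using k-means as a stand-in for the DKM clustering and inverse-distance weighting as one possible reading of the Global Error Model's "linear interpolation" (the paper's exact choices are not reproduced here); responses and exact solutions are simulated placeholders:

```python
# Sketch of medoid-based error modeling: cluster approximate responses, pretend to run the
# exact model on the medoids only, then correct all realizations with the medoid errors.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
approx = rng.normal(size=(200, 50))     # hypothetical approximate breakthrough curves

km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(approx)
labels = km.labels_
medoid_idx = []
for c in range(10):
    members = np.where(labels == c)[0]
    dists = np.linalg.norm(approx[members] - km.cluster_centers_[c], axis=1)
    medoid_idx.append(members[np.argmin(dists)])   # member closest to centroid = medoid
medoid_idx = np.array(medoid_idx)

exact_at_medoids = approx[medoid_idx] + 0.1        # stand-in for the exact flow simulations
errors = exact_at_medoids - approx[medoid_idx]     # one error curve per cluster

local_corrected = approx + errors[labels]          # Local Error Model: inherit own medoid's error

d = np.linalg.norm(approx[:, None, :] - approx[medoid_idx][None, :, :], axis=2)
w = 1.0 / (d + 1e-12)
w /= w.sum(axis=1, keepdims=True)
global_corrected = approx + w @ errors             # Global Error Model: blend of all medoid errors
```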
Abstract:
The construct of cognitive errors is clinically relevant for cognitive therapy of mood disorders. Beck's universality hypothesis postulates the relevance of negative cognitions in all subtypes of mood disorders, as well as of positive cognitions in manic states. This hypothesis has rarely been empirically addressed for patients presenting with bipolar affective disorder (BD). In-patients (n = 30) presenting with BD were interviewed, as were 30 participants of a matched control group. A valid and reliable observer-rated methodology for cognitive errors was applied to the session transcripts. Overall, patients make more cognitive errors than controls. When manic and depressive patients were compared, parts of the universality hypothesis were confirmed: manic symptoms are related to both positive and negative cognitive errors. These results are discussed with regard to the main assumptions of the cognitive model for depression, adding an argument for extending it to the BD diagnostic group while taking its specificities in terms of cognitive errors into consideration. Clinical implications for cognitive therapy of BD are suggested.
Abstract:
PURPOSE: To develop a breathhold method for black-blood viability imaging of the heart that may facilitate identifying the endocardial border. MATERIALS AND METHODS: Three stimulated-echo acquisition mode (STEAM) images were obtained almost simultaneously during the same acquisition using three different demodulation values. Two of the three images were used to construct a black-blood image of the heart. The third image was a T1-weighted viability image that enabled detection of hyperintense infarcted myocardium after contrast agent administration. The three STEAM images were combined into one composite black-blood viability image of the heart. The composite STEAM images were compared to conventional inversion-recovery (IR) delayed hyperenhanced (DHE) images in nine human subjects studied on a 3T MRI scanner. RESULTS: STEAM images showed black-blood characteristics and a significant improvement in the blood-infarct signal-difference-to-noise ratio (SDNR) when compared to the IR-DHE images (34 ± 4.1 vs. 10 ± 2.9, mean ± standard deviation (SD), P < 0.002). There was sufficient myocardium-infarct SDNR in the STEAM images to accurately delineate infarcted regions. The extracted infarcts demonstrated good agreement with the IR-DHE images. CONCLUSION: The STEAM black-blood property allows for better delineation of the blood-infarct border, which would enhance the fast and accurate measurement of infarct size.
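The SDNR figure of merit quoted in the results is, in its generic form (a standard definition; the paper's exact noise-measurement convention is not given here):

\[ \mathrm{SDNR} = \frac{\left| S_{\text{blood}} - S_{\text{infarct}} \right|}{\sigma_{\text{noise}}} \]

i.e., the signal difference between the two tissues divided by the image noise, so the roughly threefold higher STEAM value reflects much stronger blood-infarct contrast at comparable noise.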
Abstract:
Rotator cuff disease is the most common pathology causing shoulder pain, with an overall prevalence rate of 30%. There is a significant association between increasing age and the presence of rotator cuff tears. Spontaneous healing of clinically relevant tears has not been observed. Although satisfactory pain relief is possible without rotator cuff tendon healing, functional outcome is better for healed repairs. Operative treatment must be considered in the context of reasonable expectations for success. Repairability and the healing potential of the surgically repaired tendon construct are important considerations in the indication for surgery. There is no difference in outcomes between the arthroscopic and the open techniques.
Abstract:
Executive Summary The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation conducted to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on the ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to only a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio returns distribution that second-order stochastically dominates those of virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
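A sketch of the second-order check just described: compare the absolute Lorenz curves (cumulative quantile integrals, equivalently scaled expected shortfalls) of two return samples pointwise. The return series are simulated placeholders and the discretization is our own:

```python
# Pointwise absolute-Lorenz-curve comparison as a test of second-order stochastic dominance.
# L(p) = integral_0^p F^{-1}(u) du; X dominates Y (for risk-averse investors) if
# L_X(p) >= L_Y(p) for all p. Sample data below are simulated placeholders.
import numpy as np

def absolute_lorenz(returns, quantiles):
    """Approximate L(p) from the sorted sample: (1/n) * cumulative sum of order statistics."""
    x = np.sort(returns)
    n = len(x)
    cum = np.concatenate(([0.0], np.cumsum(x))) / n
    return np.interp(quantiles * n, np.arange(n + 1), cum)

rng = np.random.default_rng(0)
agg = rng.normal(0.06, 0.10, 1000)      # hypothetical returns, aggregated-measure portfolio
single = rng.normal(0.05, 0.15, 1000)   # hypothetical returns, single-measure portfolio

p = np.linspace(0.01, 0.99, 99)
dominates = np.all(absolute_lorenz(agg, p) >= absolute_lorenz(single, p))
print("aggregated portfolio SSD-dominates single-measure portfolio:", dominates)
```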
Abstract:
Introduction: Clinical examination and electroencephalography (EEG) have been recommended to predict functional recovery in comatose survivors of cardiac arrest (CA); however, their prognostic value in patients treated with induced hypothermia (IH) has not been evaluated. Hypothesis: We aimed to validate the prognostic ability of clinical examination and EEG in predicting the outcome of patients comatose after CA treated with IH, and sought to derive a score with high predictive value for poor functional outcome in this setting. Methods: We prospectively studied 100 consecutive comatose survivors of CA treated with IH. Repeated neurological examination and EEG were performed early after passive rewarming and off sedation. Mortality was assessed at hospital discharge, and functional outcome at 3 to 6 months with the Cerebral Performance Categories (CPC), dichotomized as good (CPC 1-2) vs. poor (CPC 3-5). Independent predictors of outcome were identified by multivariable logistic regression and used to assess the prognostic value of a Reproducible Electro-clinical Prognosticators of Outcome Score (REPOS). Results: All patients with a good outcome (20/100) had a reactive EEG background. Incomplete recovery of brainstem reflexes, myoclonus, time to return of spontaneous circulation (ROSC) > 25 min, and unreactive EEG background were all independent predictors of death and severe disability, and were added to construct the REPOS. Using a cutoff of 0 or 1 variables for good vs. 2 to 4 for poor outcome, the REPOS had a positive predictive value of 1.00 (95% CI: 0.92-1.00), a negative predictive value of 0.43 (95% CI: 0.29-0.58), and an accuracy of 0.81 for poor functional recovery at 3 to 6 months. Conclusions: In comatose survivors of CA treated with IH, a prognostic score including clinical and EEG examination was highly predictive of death and poor functional outcome at 3 to 6 months. Lack of EEG background reactivity strongly predicted poor neurological recovery after CA. Our findings show that clinical and electrophysiological studies are effective in predicting long-term outcome of comatose survivors after CA and IH, and suggest that EEG improves early prognostic assessment in the setting of therapeutic cooling.
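To make the scoring rule concrete, here is a minimal sketch of how a four-predictor score with the reported 0-1 vs. 2-4 cutoff would be applied; the one-point-per-predictor weighting is our reading of the abstract, not a verified specification of the REPOS:

```python
# Hypothetical application of a REPOS-like score: one point per predictor present,
# with >= 2 points predicting poor outcome (per the cutoff reported in the abstract).
def repos(incomplete_brainstem_reflexes: bool, myoclonus: bool,
          rosc_over_25_min: bool, unreactive_eeg: bool) -> str:
    score = sum([incomplete_brainstem_reflexes, myoclonus,
                 rosc_over_25_min, unreactive_eeg])
    return "poor outcome predicted" if score >= 2 else "good outcome possible"

print(repos(False, True, True, False))  # 2 predictors present -> poor outcome predicted
```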
Abstract:
Caspase-cleaved amyloid precursor protein (APPcc) and SET are increased and mislocalized in the neuronal cytoplasm in Alzheimer disease (AD) brains. SET translocated to the cytoplasm can induce tau hyperphosphorylation. To elucidate the putative relationships between mislocalized APPcc and SET, we studied their levels and distribution in the hippocampus of 5 controls, 3 Down syndrome and 10 Alzheimer patients. In the Down syndrome and Alzheimer patients, APPcc and SET levels were increased in CA1, and the frequency of both cytoplasmic localizations was high in CA1 and low in CA4. As the increase in APPcc is already present at early stages of AD, we overexpressed APPcc in CA1 and dentate gyrus neurons of adult mice with a lentiviral construct. APPcc overexpression in CA1, but not in the dentate gyrus, induced endogenous SET translocation and tau hyperphosphorylation. These data suggest that the increase in APPcc in CA1 neurons could be an early event leading to the translocation of SET and the progression of AD through tau hyperphosphorylation.
Abstract:
General Summary Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and making use of the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first one, corresponding to the first two chapters, investigates the link between trade and the environment; the second one, covering the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, which can be classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Therefore, popular fears about the trade effects of differences in environmental regulations might be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labor that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose the worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects).
We find that the positive scale (+9.5%) and the negative technique (-12.5%) effects are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual and this no-trade world allows us (under the omission of price effects) to compute a static first-order trade effect. The latter now increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, this effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour accordingly across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are reduced by 90% with respect to the worst case, but that they could still be reduced by another 80% if emissions were to be minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework by Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to formally write present productivity as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new estimates of the orders of magnitude of the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
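For the third contribution, the center-of-mass computation can be sketched as follows: map each city to 3-D Cartesian coordinates, average them with economic weights, and convert the result back to geographic coordinates. The cities and weights below are purely illustrative:

```python
# GDP-weighted center of gravity of point masses on a sphere. The weighted mean lies
# inside the globe; we report the surface point in the same direction (the thesis's
# exact projection convention is not reproduced here).
import numpy as np

def center_of_gravity(lat_deg, lon_deg, weights):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    xyz = np.column_stack([np.cos(lat) * np.cos(lon),   # unit vectors for each city
                           np.cos(lat) * np.sin(lon),
                           np.sin(lat)])
    g = np.average(xyz, axis=0, weights=weights)        # weighted mean in 3-D
    g_lat = np.degrees(np.arctan2(g[2], np.hypot(g[0], g[1])))
    g_lon = np.degrees(np.arctan2(g[1], g[0]))
    return g_lat, g_lon

# three hypothetical "cities": latitudes, longitudes, economic weights
print(center_of_gravity([48.9, 40.7, 35.7], [2.4, -74.0, 139.7], [1.0, 1.5, 1.2]))
```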