168 results for Reliable Theoretical Procedures


Relevance: 20.00%

Publisher:

Abstract:

Background: Colorectal cancer (CRC) can be cured when diagnosed in its early or precancerous (adenoma) stages. Mostly because of poor compliance with invasive screening procedures, detection rates for adenomas and early CRCs remain low, and the available non-invasive screening tests have low sensitivity and specificity. There is therefore a large unmet need for a cost-effective, reliable and non-invasive test to screen for early neoplastic and pre-neoplastic lesions. Objective: To develop a routine screening test based on a nucleic acid multi-gene assay performed on peripheral blood mononuclear cells (PBMCs) that can detect early CRCs and adenomas. Methods: 116 patients (mean age: 55 years; range: 18 to 74 years; female/male ratio 0.98) were included in this pilot, non-blinded, colonoscopy-controlled study. Colonoscopy revealed 21 patients with CRC, 30 patients with adenomas larger than 1 cm, 24 patients with inflammatory bowel disease (IBD) and 41 patients with no neoplastic or inflammatory lesions. Blood samples were taken from each patient on the day of colonoscopy and PBMCs were purified. Total RNA was extracted following standard procedures. Multiplex RT-qPCR was applied to 92 candidate biomarkers. Several univariate and multivariate statistical methods were applied to these candidates, and 57 biomarkers with significant p-values (<0.01, Wilcoxon test) were selected, including ADAMTS1, MMP9, CXCL10, CXCR4, VEGFA and CDH1. Two distinct biomarker signatures were defined: one to separate patients without neoplastic lesions from those with cancer (the COLOX 1 test) and one to separate them from those with adenoma (the COLOX 2 test). Results: The COLOX 1 and 2 tests successfully separated patients without neoplastic lesions from those with CRC (sensitivity 70%, specificity 90%, AUC 0.88) and from those with adenomas larger than 1 cm (sensitivity 61%, specificity 80%, AUC 0.80), respectively. Six of the 24 patients in the IBD group had a positive COLOX 1 test. Conclusion: The two COLOX tests demonstrated acceptable sensitivity and high specificity for detecting CRCs and adenomas larger than 1 cm. The false-positive COLOX 1 results in IBD patients could be due to the chronic inflammatory state. A prospective, multicenter, pivotal study is under way to confirm these promising results in a larger cohort.
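
The abstract does not give the implementation of the univariate screen; as a minimal, hedged sketch of the kind of per-marker comparison described (a two-group Wilcoxon rank-sum test with the stated p < 0.01 threshold), on simulated expression values rather than the study data:

```python
# Minimal sketch of a univariate biomarker screen: for each candidate, compare PBMC
# expression in lesion-free vs. CRC patients with a Wilcoxon rank-sum test and keep
# markers with p < 0.01. Marker names and the alpha threshold mirror the abstract;
# the data below are simulated for illustration only.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
markers = ["ADAMTS1", "MMP9", "CXCL10", "CXCR4", "VEGFA", "CDH1"]
controls = {m: rng.normal(0.0, 1.0, 41) for m in markers}   # 41 lesion-free patients
cancers  = {m: rng.normal(0.8, 1.0, 21) for m in markers}   # 21 CRC patients

selected = []
for m in markers:
    stat, p = ranksums(controls[m], cancers[m])
    if p < 0.01:
        selected.append((m, round(p, 4)))

print(selected)
```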

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To assess the suitability of a hot-wire anemometer infant monitoring system (Florian, Acutronic Medical Systems AG, Hirzel, Switzerland) for measuring flow and tidal volume (Vt) proximal to the endotracheal tube during high-frequency oscillatory ventilation. DESIGN: In vitro model study. SETTING: Respiratory research laboratory. SUBJECT: In vitro lung model simulating moderate to severe respiratory distress. INTERVENTION: The lung model was ventilated with a SensorMedics 3100A ventilator. Vt was recorded from the monitor display (Vt-disp) and compared with the gold standard (Vt-adiab), which was calculated from pressure changes inside the model using the adiabatic gas equation. MEASUREMENTS AND MAIN RESULTS: A range of Vt (1-10 mL), frequencies (5-15 Hz), pressure amplitudes (10-90 cm H2O), inspiratory times (30% to 50%), and FiO2 (0.21-1.0) was used. Accuracy was determined using modified Bland-Altman plots (95% limits of agreement). An exponential decrease in Vt was observed with increasing oscillatory frequency. Mean ΔVt-disp was 0.6 mL (limits of agreement, -1.0 to 2.1) with a linear frequency dependence. Mean ΔVt-disp was -0.2 mL (limits of agreement, -0.5 to 0.1) with increasing pressure amplitude and -0.2 mL (limits of agreement, -0.3 to -0.1) with increasing inspiratory time. Humidity and heating did not affect the error, whereas increasing FiO2 from 0.21 to 1.0 increased the mean error by 6.3% (±2.5%). CONCLUSIONS: The Florian infant hot-wire flowmeter and monitoring system provides reliable measurements of Vt at the airway opening during high-frequency oscillatory ventilation when used at frequencies of 8-13 Hz. Its bedside application could improve the monitoring of patients receiving high-frequency oscillatory ventilation, foster a better understanding of the physiologic consequences of different high-frequency oscillatory ventilation strategies, and thereby optimize treatment.
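
The accuracy analysis rests on Bland-Altman 95% limits of agreement (mean difference ± 1.96 SD of the paired differences). The study data are not reproduced here; a minimal sketch of that calculation on illustrative paired measurements:

```python
# Minimal sketch of a Bland-Altman analysis (95% limits of agreement), as used in the
# abstract: mean difference between displayed and reference tidal volume +/- 1.96 SD.
# The arrays below are illustrative values, not the study's measurements.
import numpy as np

vt_disp  = np.array([2.1, 3.4, 5.0, 6.8, 8.1, 9.6])   # Vt read from the monitor (mL)
vt_adiab = np.array([2.0, 3.1, 4.6, 6.3, 7.4, 8.9])   # reference Vt from the adiabatic gas equation (mL)

diff = vt_disp - vt_adiab
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} mL, 95% limits of agreement = [{loa_low:.2f}, {loa_high:.2f}] mL")
```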

Relevance: 20.00%

Publisher:

Abstract:

Only a small percentage of neurodegenerative diseases such as Alzheimer's disease and Parkinson's disease is directly attributable to familial forms. The etiology of the far more abundant sporadic forms appears to involve both genetic and environmental factors. Environmental compounds are now studied extensively for their possible contribution to neurodegeneration. Some chemicals have been found to reproduce the symptoms of known neurodegenerative diseases; others may predispose to the onset of neurodegeneration or exacerbate distinct pathogenic processes of these diseases. In any case, in vitro studies performed with models of various degrees of complexity have shown that many environmental compounds have the potential to cause neurodegeneration through a variety of pathways similar to those described in neurodegenerative diseases. Since the population is exposed to a huge number of potentially neurotoxic compounds, there is an important need for rapid and efficient procedures for hazard evaluation. Xenobiotics elicit a cascade of reactions that, most of the time, involve numerous interactions between the different brain cell types. A reliable in vitro model for detecting environmental toxins that may contribute to neurodegenerative diseases should therefore allow maximal cell-cell interactions and multiparametric endpoint determination. The combined use of in vitro models and new analytical approaches based on "omics" technologies should help to map toxicity pathways and advance our understanding of the possible role of xenobiotics in the etiology of neurodegenerative diseases.

Relevance: 20.00%

Publisher:

Abstract:

Over the past 20 years, the theory of robust estimation has become an important topic in mathematical statistics. We discuss some basic concepts of this theory with the help of simple examples. Furthermore, we describe a subroutine library for the application of robust statistical procedures, developed with the support of the Swiss National Science Foundation.
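
The library itself is not shown in the abstract; as a minimal, hedged illustration of the kind of robust procedure it concerns, here is an iteratively reweighted Huber M-estimate of location (the tuning constant 1.345 and the convergence tolerance are conventional choices, not taken from the paper):

```python
# Minimal sketch of a Huber M-estimate of location via iteratively reweighted averaging.
# Residuals beyond c * (MAD-based scale) are downweighted, so a few gross outliers barely
# move the estimate, unlike the ordinary mean.
import numpy as np

def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - mu))      # MAD-based scale estimate
    if scale == 0:
        return mu
    for _ in range(max_iter):
        r = (x - mu) / scale
        w = np.where(np.abs(r) <= c, 1.0, c / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

data = [9.8, 10.1, 10.0, 9.9, 10.2, 47.0]       # one gross outlier
print(huber_location(data), np.mean(data))      # robust estimate stays near 10, the mean does not
```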

Relevance: 20.00%

Publisher:

Abstract:

This Ph.D. dissertation studies the work motivation of employees in the delivery of public services. The question of work motivation in public services is not new, but it has become central for governments now facing unprecedented public debt. The objective of this research is twofold. First, we want to see whether the work motivation of employees in public services is a continuum (intrinsic and extrinsic motivations cannot coexist) or a bi-dimensional construct (intrinsic and extrinsic motivations coexist simultaneously). Research in the public administration literature has focused on the concept of public service motivation and has considered motivation to be uni-dimensional (Perry and Hondeghem 2008). However, no study has yet tackled both types of motivation, intrinsic and extrinsic, at the same time. This dissertation proposes, in Part I, a theoretical assessment and an empirical test of a global work motivational structure, using a self-constructed Swiss dataset of employees from three public services: the education sector, the security sector and the public administrative services sector. Our findings suggest that work motivation in public services is not uni-dimensional but bi-dimensional; intrinsic and extrinsic motivations coexist simultaneously and can be positively correlated (Amabile et al. 1994). Our findings show that intrinsic motivation is as important as extrinsic motivation; thus, the assumption that employees in public services are less attracted by extrinsic rewards is not confirmed for this sample. Another important finding concerns the public service motivation concept, which, as theoretically predicted, represents the major motivational dimension of employees in the delivery of public services. Second, the theory of public service motivation assumes that employees in public services engage in activities that go beyond their self-interest, but never uses this construct as a determinant of their pro-social behavior. At the same time, several studies (Gregg et al. 2011; Georgellis et al. 2011) provide evidence of the pro-social behavior of employees in public services. However, they do not identify which type of motivation is at the origin of this behavior; they only assume it is intrinsically motivated. We analyze the pro-social behavior of employees in public services and use public service motivation as a determinant of their pro-social behavior. We add other determinants highlighted by the theory of pro-social behavior (Bénabou and Tirole 2006), by Le Grand (2003) and by fit theories (Besley and Ghatak 2005). We test these determinants in Part II and identify, for each sector of activity, their positive or negative impact on the pro-social behavior of Swiss employees. Contrary to expectations, we find, for this sample, that both intrinsic and extrinsic factors have a positive impact on pro-social behavior; no crowding-out effect is identified. We confirm Le Grand's (2003) hypothesis about the positive impact of the opportunity cost on pro-social behavior. Our results suggest a mix of action-oriented and output-oriented altruism among employees in public services. These results are relevant when designing incentive schemes for employees in the delivery of public services.

Relevance: 20.00%

Publisher:

Abstract:

Using Monte Carlo simulations and reanalyzing the data of a validation study of the AEIM emotional intelligence test, we demonstrated that an atheoretical approach and the use of weak statistical procedures can result in biased validity estimates. These procedures included stepwise regression (and, more generally, the failure to include important theoretical controls), extreme-scores analysis, and ignoring heteroscedasticity as well as measurement error. The authors of the AEIM test responded by offering more complete information about their analyses, allowing us to examine further the perils of ignoring theory and correct statistical procedures. In this paper we show, with extended analyses, that the AEIM test is invalid.
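
The original simulations are not reproduced in the abstract; a minimal sketch of the general phenomenon the critique relies on, under simulated data and an arbitrary number of candidate predictors: atheoretically selecting the best-fitting predictor from noise and reporting its in-sample fit inflates apparent validity.

```python
# Minimal Monte Carlo sketch of how an atheoretical 'pick the best predictor' step
# (a one-step stand-in for stepwise selection) inflates apparent validity: y is pure
# noise, yet the selected predictor shows a sizeable in-sample correlation on average.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_candidates, n_sims = 100, 30, 2000
best_r = []
for _ in range(n_sims):
    X = rng.normal(size=(n_obs, n_candidates))
    y = rng.normal(size=n_obs)                      # outcome unrelated to every predictor
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_candidates)])
    best_r.append(np.abs(r).max())                  # keep only the 'winning' predictor

print(f"mean |r| of the selected predictor: {np.mean(best_r):.2f}")  # clearly > 0 despite no true effect
```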

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: Atrial fibrillation is a very common heart arrhythmia, associated with a five-fold increase in the risk of embolic stroke. Treatment strategies encompass palliative drugs or surgical procedures, all of which can restore sinus rhythm. Unfortunately, the atria often fail to recover their mechanical function and patients therefore require lifelong anticoagulation therapy. A motorless volume-displacing device (Atripump) based on artificial muscle technology, positioned on the external surface of the atrium, could avoid the need for oral anticoagulation and its haemorrhagic complications. An animal study was conducted to assess the haemodynamic effects that such a pump could provide. METHODS: Atripump is a dome-shaped, silicone-coated nitinol actuator sewn onto the external surface of the atrium. It is driven by a pacemaker-like control unit. Five non-anticoagulated sheep were selected for this experiment. The right atrium was surgically exposed and the device sutured and connected. Haemodynamic parameters and intracardiac ultrasound (ICUS) data were recorded in each animal under three conditions: baseline, atrial fibrillation (AF), and Atripump-assisted AF (aaAF). RESULTS: In two animals, after 20 min of AF, small thrombi appeared in the right atrial appendage and were washed out once the pump was turned on. Assistance also enhanced the atrial ejection fraction: 31% at baseline, 5% during AF, and 20% under aaAF. Right atrial systolic surfaces (cm2) were 5.2 ± 0.3 at baseline, 6.2 ± 0.1 during AF, and 5.4 ± 0.3 under aaAF. CONCLUSION: This compact and reliable pump seems to restore the atrial "kick" and prevent embolic events. It could avoid long-term anticoagulation therapy and open new perspectives in the care of end-stage heart failure.

Relevance: 20.00%

Publisher:

Abstract:

Gene expression often cycles between active and inactive states in eukaryotes, yielding variable or noisy gene expression in the short term, while slow epigenetic changes may lead to silencing or variegated expression. Understanding how cells control these effects will be of paramount importance for constructing biological systems with predictable behaviours. Here we find that a human matrix attachment region (MAR) genetic element controls the stability and heritability of gene expression in cell populations. Mathematical modeling indicated that the MAR controls the probability of long-term transitions between active and inactive expression, thus reducing silencing effects and increasing the reactivation of silent genes. Single-cell short-term assays revealed persistent expression and reduced expression noise in MAR-driven genes, whereas stochastic bursts of expression occurred without this genetic element. The MAR thus confers a more deterministic behaviour on an otherwise stochastic process, providing a means towards more reliable expression of engineered genetic systems.
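
The modeling is not detailed in the abstract; as a minimal, hedged sketch of the kind of two-state (active/inactive) switching model it alludes to, where a lower silencing probability stands in for the MAR effect and keeps more of the population stably expressing. All rates are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a two-state (active <-> inactive) gene-expression switching model.
# Each cell flips state per generation with probabilities p_silence and p_reactivate;
# lowering p_silence (standing in for the MAR effect) keeps a larger fraction of the
# population in the active, expressing state over time.
import numpy as np

def fraction_active(p_silence, p_reactivate, n_cells=10_000, n_generations=100, seed=0):
    rng = np.random.default_rng(seed)
    active = np.ones(n_cells, dtype=bool)           # start with every cell expressing
    for _ in range(n_generations):
        flip_off = active & (rng.random(n_cells) < p_silence)
        flip_on = ~active & (rng.random(n_cells) < p_reactivate)
        active = (active & ~flip_off) | flip_on
    return active.mean()

print("without MAR-like element:", fraction_active(p_silence=0.05, p_reactivate=0.01))
print("with MAR-like element:   ", fraction_active(p_silence=0.01, p_reactivate=0.05))
```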

Relevance: 20.00%

Publisher:

Abstract:

This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning requires extensive quantitative information along with the means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering before focusing on particular numerical probability assignments. Analyses are proposed which intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations indicates that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible use of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limited degree to which these developments can actually be applied in practice.
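
As a minimal, hedged illustration of the article's point that relative rather than absolute values can suffice: the source-level likelihood ratio LR = P(E | Hp) / P(E | Hd) is unchanged when both probability assignments are scaled by the same unknown factor. The numbers below are purely illustrative and not taken from the case discussed.

```python
# Minimal sketch of a source-level likelihood ratio: probability of the observed
# correspondence given the prosecution proposition (Hp) vs. the defence proposition (Hd).
# Scaling both assignments by a common unknown factor leaves the LR unchanged, which is
# the sense in which relative values can be sufficient. All numbers are illustrative.
p_e_given_hp = 0.8      # assumed probability of the findings if the shoe of interest left the mark
p_e_given_hd = 0.008    # assumed probability if another shoe from the relevant population left it

lr = p_e_given_hp / p_e_given_hd
print(f"LR = {lr:.0f}")                                  # ~100: findings support Hp over Hd

scale = 0.37                                             # common, unknown multiplicative uncertainty
print(p_e_given_hp * scale / (p_e_given_hd * scale))     # identical LR
```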

Relevance: 20.00%

Publisher:

Abstract:

The number of fluoroscopy-guided procedures in cardiology is increasing over time, and it is appropriate to ask whether technological progress or changes in technique are influencing patient exposure. The aim of this study was to examine whether patient dose has been decreasing over the years. Patient dose data for more than 7700 procedures were collected from two cardiology centres. A steady increase in patient dose over the years was observed in both centres for the two cardiology procedures included in this study. A significant increase in dose was also observed after the installation of a flat-panel detector. The increasing use of radial access may lead to an increase in patient exposure. Overall, the monitoring of dose data showed a considerable increase in patient exposure over time. Actions have to be taken towards dose reduction in both centres.

Relevance: 20.00%

Publisher:

Abstract:

The optimization of the extremity dosimetry of medical staff in nuclear medicine was the aim of Work Package 4 (WP4) of the ORAMED project, a Collaborative Project (2008-2011) supported by the European Commission within its 7th Framework Programme. Hand doses and dose distributions across the hands of medical staff working in nuclear medicine departments were evaluated through an extensive measurement programme involving 32 hospitals in Europe and 139 monitored workers. The study included the most frequently used radionuclides: (99m)Tc- and (18)F-labelled radiopharmaceuticals for diagnostics, and (90)Y-labelled Zevalin® and DOTATOC for therapy. Furthermore, Monte Carlo simulations were performed in different predefined scenarios to evaluate separately the efficacy of different radiation protection measures by comparing hand dose distributions according to various parameters. The present work gives recommendations based on the results obtained from both measurements and simulations. It concludes with nine practical recommendations regarding the positioning of dosemeters for appropriate skin-dose monitoring and the most effective protection measures for reducing personnel exposure.

Relevance: 20.00%

Publisher:

Abstract:

1. Identifying areas suitable for recolonization by threatened species is essential to support efficient conservation policies. Habitat suitability models (HSMs) predict species' potential distributions, but the quality of their predictions should be carefully assessed when the species-environment equilibrium assumption is violated. 2. We studied the Eurasian otter Lutra lutra, whose numbers are recovering in southern Italy. To produce widely applicable results, we chose standard HSM procedures and assessed the models' capacity to predict the suitability of a recolonization area. We used two fieldwork datasets: presence-only data, used in Ecological Niche Factor Analyses (ENFA), and presence-absence data, used in a Generalized Linear Model (GLM). In addition to cross-validation, we independently evaluated the models with data from a recolonization event, providing presences on a previously unoccupied river. 3. Three of the models successfully predicted the suitability of the recolonization area, but the GLM built with data from before the recolonization disagreed with these predictions, missing the recolonized river's suitability and describing the otter's niche poorly. Our results highlight three points of relevance to modelling practice: (1) absences may prevent the models from correctly identifying areas suitable for a species' spread; (2) the selection of variables may lead to randomness in the predictions; and (3) the Area Under the Curve (AUC), a commonly used validation index, was not well suited to evaluating model quality, whereas the Boyce Index (CBI), based on presence data only, better reflected the models' fit to the recolonization observations. 4. For species with unstable spatial distributions, presence-only models may work better than presence-absence methods in making reliable predictions of suitable areas for expansion. An iterative modelling process, using new occurrences from each step of the species' spread, may also help in progressively reducing errors. 5. Synthesis and applications. Conservation plans depend on reliable models of the species' suitable habitats. In non-equilibrium situations, such as for threatened or invasive species, models could be affected negatively by the inclusion of absence data when predicting areas of potential expansion. In such cases, presence-only methods will provide a better basis for productive conservation management practices.
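
The GLM mentioned in point 2 is not specified beyond being a presence-absence habitat suitability model; a minimal, hedged sketch of that type of model (a logistic regression of presence on environmental covariates, then prediction of suitability for new sites). Neither the data nor the covariate names come from the study.

```python
# Minimal sketch of a presence-absence habitat suitability model of the GLM type:
# logistic regression of otter presence on environmental covariates, then prediction of
# suitability for candidate (e.g. recolonization) sites. All data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_sites = 200
X = np.column_stack([
    rng.uniform(0, 1, n_sites),     # e.g. riparian vegetation cover (hypothetical covariate)
    rng.uniform(0, 1, n_sites),     # e.g. water availability index (hypothetical covariate)
])
logit = -2.0 + 3.0 * X[:, 0] + 2.0 * X[:, 1]             # assumed true response for the simulation
presence = rng.random(n_sites) < 1 / (1 + np.exp(-logit))

glm = LogisticRegression().fit(X, presence)
new_sites = np.array([[0.9, 0.8], [0.1, 0.2]])           # candidate recolonization sites
print(glm.predict_proba(new_sites)[:, 1])                # predicted habitat suitability
```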

Relevance: 20.00%

Publisher:

Abstract:

General Summary

Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. The thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, covering the last three chapters, addresses economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investment and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration; the computed spillover effects between different sectors indicate how cluster formation might enhance productivity. The last chapter is not about how to better understand the world but about how to measure it, and it was a great pleasure to work on. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows our results to be visualized in Google Earth. A short summary of each of the five chapters is provided below.

The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank, and use a gravity-type framework to isolate the two effects. Our study covers 48 countries, 29 Southern and 19 Northern, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and of similar magnitude. However, when looking at world trade, the effects become very small because of the high share of North-North trade, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated.

The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". We first construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. We then use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects). We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. In a first experiment, we construct a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (neglecting price effects) to compute a static first-order trade effect. The latter increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, this effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical emission levels are obtained by reallocating labour across sectors within each country (under the country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are reduced by 90% with respect to the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings of this chapter are consistent with those of chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past.

Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to write present productivity formally as a function of past productivity and of other contemporaneous and past control variables.

The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Dynamic panel techniques allow us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, positive cross-sector and negative own-sector externalities appear to be present in manufacturing, while financial services display strong positive own-sector effects.

The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of the center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia.

To sum up, this thesis makes three main contributions. First, it provides new estimates of the orders of magnitude of the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
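
The last chapter's calculation is described only as applying the physical concept of the center of mass to data on the world's largest cities; a minimal, hedged sketch of one standard way to compute such a weighted center on the globe (convert coordinates to 3D Cartesian vectors, take the weighted mean, project back to latitude/longitude). The city coordinates and economic weights below are invented for illustration.

```python
# Minimal sketch of a weighted center of gravity on the globe: convert each city's
# lat/lon to a 3D unit vector, average the vectors weighted by economic mass, and map
# the resulting vector back to lat/lon. The three cities and weights are illustrative.
import numpy as np

def center_of_gravity(lat_deg, lon_deg, weights):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    w = np.asarray(weights, dtype=float)
    cx, cy, cz = (w @ x) / w.sum(), (w @ y) / w.sum(), (w @ z) / w.sum()
    cog_lat = np.degrees(np.arctan2(cz, np.hypot(cx, cy)))
    cog_lon = np.degrees(np.arctan2(cy, cx))
    return cog_lat, cog_lon

# Illustrative inputs: rough coordinates of three large urban economies and made-up weights.
lats, lons, gdp = [40.7, 48.9, 35.7], [-74.0, 2.3, 139.7], [3.0, 2.0, 2.5]
print(center_of_gravity(lats, lons, gdp))
```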