843 results for Residual-Based Cointegration Test
Abstract:
Integrating single nucleotide polymorphism (SNP) p-values from genome-wide association studies (GWAS) across genes and pathways is a strategy to improve statistical power and gain biological insight. Here, we present Pascal (Pathway scoring algorithm), a powerful tool for computing gene and pathway scores from SNP-phenotype association summary statistics. For gene score computation, we implemented analytic and efficient numerical solutions to calculate test statistics. We examined in particular the sum and the maximum of chi-squared statistics, which measure the average and the strongest association signals per gene, respectively. For pathway scoring, we use a modified Fisher method, which offers not only a significant power improvement over more traditional enrichment strategies, but also eliminates the problem of arbitrary threshold selection inherent in any binary-membership-based pathway enrichment approach. We demonstrate the marked increase in power by analyzing summary statistics from dozens of large meta-studies for various traits. Our extensive testing indicates that our method not only excels in rigorous type I error control, but also results in more biologically meaningful discoveries.
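The gene-score statistics described above can be sketched roughly as follows. This is an illustrative simplification, not Pascal's actual implementation: under independence of SNPs the sum score is chi-squared with df equal to the number of SNPs, whereas Pascal accounts for linkage disequilibrium between SNPs, which this sketch ignores. The function name is invented for illustration.

```python
import numpy as np
from scipy.stats import chi2, combine_pvalues

def gene_scores(snp_pvalues):
    """Convert per-SNP p-values to 1-df chi-squared statistics and form
    the two gene-level statistics: the sum (average signal) and the
    maximum (strongest signal)."""
    stats = chi2.isf(np.asarray(snp_pvalues), df=1)  # inverse survival function
    return stats.sum(), stats.max()

# Example gene with three SNPs.
sum_stat, max_stat = gene_scores([0.01, 0.20, 0.50])

# For comparison, a classical (unmodified) Fisher combination of the
# same p-values, as used at the pathway level.
fisher_stat, fisher_p = combine_pvalues([0.01, 0.20, 0.50], method="fisher")
```

The max score is dominated by the single most associated SNP, while the sum aggregates weak signals spread across the gene, matching the two regimes contrasted in the abstract.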
Abstract:
As part of a wider research effort assessing destabilizing and triggering factors to model cliff dynamics along the Dieppe shoreline in Upper Normandy, this study aims at testing boat-based mobile LiDAR capabilities by scanning 3D point clouds of the unstable coastal cliffs. Two acquisition campaigns were performed, in September 2012 and September 2013, scanning (1) a 30-km-long shoreline and (2) the same test cliffs under different environmental conditions and device settings. The potential of the collected data for 3D modelling, change detection and landslide monitoring was then assessed. By scanning during favourable meteorological and marine conditions and close to the coast, mobile LiDAR devices are able to scan a long shoreline quickly with a median point spacing of up to 10 cm. The acquired data are sufficiently detailed to map geomorphological features smaller than 0.5 m². Furthermore, our capability to detect rockfalls and erosion deposits (> 1 m³) is confirmed, since the classical approach of computing differences between sequential acquisitions reveals many cliff collapses between Pourville and Quiberville and only sparse changes between Dieppe and Belleville-sur-Mer. These different change rates result from different rockfall susceptibilities. Finally, we also confirmed the capability of boat-based mobile LiDAR to monitor single large changes, characterizing the Dieppe landslide geometry with two main active scarps, retrogression of up to 40 m and about 100,000 m³ of eroded material.
Abstract:
The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and concise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series using a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape-space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantage of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method and then on a financial data set to test its general applicability. A comparison to a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for experts not versed in data mining.
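The combination of a shape-space representation with a fuzzy linguistic query can be sketched in miniature as follows. This is an illustrative reconstruction, not the authors' method: the "shape space" here is reduced to a sliding-window linear fit (level and slope), and the linguistic term "steep increase" is given a simple ramp-shaped fuzzy membership; window size and membership bounds are invented for illustration.

```python
import numpy as np

def shape_space(series, window=5):
    """Fit a line in each sliding window; return (intercept, slope)
    per window as a minimal shape-space representation."""
    x = np.arange(window)
    feats = []
    for i in range(len(series) - window + 1):
        slope, intercept = np.polyfit(x, series[i:i + window], 1)
        feats.append((intercept, slope))
    return np.array(feats)

def steep_increase(slope, lo=0.5, hi=2.0):
    """Ramp fuzzy membership for the linguistic term 'steep increase'."""
    return float(np.clip((slope - lo) / (hi - lo), 0.0, 1.0))

# Flat segment followed by a rising one: the change point is at index 20.
series = np.concatenate([np.zeros(20), np.arange(20.0)])
feats = shape_space(series)
hits = [i for i, (_, s) in enumerate(feats) if steep_increase(s) > 0.3]
```

Querying with a different linguistic term (e.g. "sudden drop") would only require a different membership function over the same shape-space features, which is the flexibility the abstract emphasizes.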
Abstract:
This paper presents the results of a pilot test of an e-portfolio in a virtual classroom, carried out in the 2005-2006 academic year with students of the Doctorate in Information Society at the Open University of Catalonia. The electronic portfolio is a strategy for competence-based assessment. This experience contrasts e-portfolio types in which students simply display their work without interaction with an interactive portfolio system that applies competence-based learning theories. In such a competency-based system, the real process of learning takes place: the portfolio is not merely a basic biographical document but becomes a genuine space for learning built around a competence model. The paper brings out new ideas and possibilities: competence-based learning promotes closer relationships between universities and companies and redesigns the pedagogic act.
Abstract:
Background: HIV-1 infection increases plasma levels of inflammatory markers, and combination antiretroviral therapy (cART) does not restore these markers to normal levels. Since intensification of cART with raltegravir reduced CD8 T-cell activation in the Discor-Ral and IntegRal studies, we evaluated the effect of raltegravir intensification on several soluble inflammation markers in these studies. Methods: Longitudinal plasma samples (0–48 weeks) from the IntegRal (n = 67; 22 control and 45 intensified individuals) and Discor-Ral (44 individuals with CD4 T-cell counts < 350 cells/µl; 14 control and 30 intensified) studies were assayed for 25 markers. Mann-Whitney, Wilcoxon and Spearman tests and linear mixed models were used for analysis. Results: At baseline, different inflammatory markers were strongly associated with HCV co-infection, lower CD4 counts and cART regimen (being higher in PI-treated individuals), but correlated poorly with markers of residual viral replication. Although raltegravir intensification reduced inflammation in individuals with lower CD4 T-cell counts, no effect of intensification was observed on plasma markers of inflammation in a global analysis. An association was found, however, between reductions in immune activation and plasma levels of the coagulation marker D-dimer, which decreased exclusively in intensified patients on protease inhibitor (PI)-based cART regimens (P = 0.040). Conclusions: The inflammatory profile in treated HIV-infected individuals showed a complex association with HCV co-infection, CD4 T-cell levels and the cART regimen. Raltegravir intensification specifically reduced D-dimer levels in PI-treated patients, highlighting the link between cART composition and residual viral replication; however, raltegravir had little effect on other inflammatory markers.
Abstract:
Synchronous machines with an AC converter are used mainly in large drives, for example in ship propulsion and in rolling-mill drives in the steel industry. These motors are used because of their high efficiency, high overload capacity and good performance in the field-weakening area. Present-day drives for electrically excited synchronous motors are equipped with position sensors, and most such drives will remain so in the future. Drives of this kind, with good dynamics, are mainly used in the metal industry. Drives without a position sensor can be used e.g. in ship propulsion and in large pump and blower drives; nowadays, these drives are equipped with a position sensor, too. The tendency is to avoid a position sensor if possible, since a sensor reduces the reliability of the drive and increases costs (the latter is not very significant for large drives). A new control technique for a synchronous motor drive combines Direct Flux Linkage Control (DFLC), based on a voltage model, with a supervising method (e.g. a current model). This combination is called the Direct Torque Control (DTC) method. In a position-sensorless drive, the DTC can be implemented by using other supervising methods that keep the stator flux linkage origin centered. In this thesis, a method for observing the drift of the real stator flux linkage in a DTC drive is introduced, and it is shown how this method can be used as a supervising method that keeps the stator flux linkage origin centered. In the position-sensorless case, a synchronous motor can be started up under DTC control when the method presented in this thesis for determining the initial rotor position is used. The load characteristics of such a drive are, however, not very good at low rotational speeds.
Furthermore, continuous operation at zero speed and at low rotational speeds is not possible, which is partly due to problems related to the flux linkage estimate. For operation in the low-speed area, a stator current control method based on the DFLC modulator (DMCC) is presented. With the DMCC, it is possible to start up and operate a synchronous motor at zero speed and at low rotational speeds in general. The DMCC is necessary in situations where high torque (e.g. nominal torque) is required at the starting moment, or if the motor runs for several seconds at zero speed or in a low speed range (up to 2 Hz). The behaviour of the described methods is shown with test results. The test results are presented for a direct flux linkage and torque controlled test drive system with a 14.5 kVA, four-pole salient-pole synchronous motor with a damper winding and electric excitation. The static accuracy of the drive is verified by measuring the torque in static load operation, and the dynamics of the drive are demonstrated in load transient tests. The performance of the drive concept presented in this work is sufficient e.g. for ship propulsion and for large pump drives. Furthermore, the developed methods are almost independent of the machine parameters.
Abstract:
Residual CuSO4 was incorporated into the mass used for cement preparation. Residual CuSO4 at 0.25, 0.50, 0.75 and 1.00% was added to a cement mass with a 1:0.5:5 ratio of cement, lime and sand. The sulfate was mixed separately with lime and water to induce metal precipitation. The hardened test bodies were submerged in Milli-Q water for three months; no Cu was detected in the water by atomic absorption spectrometry. The best proportion for mechanical resistance and porosity is 0.50%, and the resulting cement is adequate for non-structural objects.
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting visual analysis to add consistency to decisions made about the existence of a functional relation, without losing sight of the need for a methodological evaluation of which stimuli and which reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications involve comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns in terms of the presence and type of effect, comprising ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present and not when it is absent, an option based on projecting the split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data-variability measure used in the quantification for assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
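The split-middle option mentioned above can be sketched as follows. This is an illustrative reconstruction under simple assumptions (a line through the medians of the two baseline halves, projected into the treatment phase), not the authors' published code; function and variable names are invented.

```python
import numpy as np

def split_middle_projection(baseline, n_treatment):
    """Fit the split-middle trend of a baseline phase and project it
    across the following treatment phase."""
    y = np.asarray(baseline, dtype=float)
    half = len(y) // 2
    # Median time point and median value of each baseline half.
    x1, y1 = np.median(np.arange(half)), np.median(y[:half])
    x2, y2 = np.median(np.arange(half, len(y))), np.median(y[half:])
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    x_treat = np.arange(len(y), len(y) + n_treatment)
    return slope * x_treat + intercept

baseline = [2, 3, 4, 5, 6, 7]
projected = split_middle_projection(baseline, n_treatment=2)
treatment = np.array([10, 11])
# Treatment points exceeding the projected baseline trend.
above = int((treatment > projected).sum())
```

Comparing how many treatment measurements fall above (or below) the projected trend is the kind of projected-baseline-versus-actual-treatment comparison the abstract describes; the exploratory-data-analysis variant would additionally widen the projection by a data-variability band.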
Abstract:
This work presents a study of the elimination of anticancer drugs, a group of pollutants considered recalcitrant during conventional activated sludge wastewater treatment, using a biological treatment based on the fungus Trametes versicolor. A 10-L fluidized-bed bioreactor inoculated with this fungus was set up in order to evaluate the removal of 10 selected anticancer drugs in real hospital wastewater. Almost all the tested anticancer drugs were completely removed from the wastewater by the end of the batch experiment (8 d), with the exception of Ifosfamide and Tamoxifen. These two recalcitrant compounds, together with Cyclophosphamide, were selected for further studies to test their degradability by T. versicolor under optimal growth conditions. Cyclophosphamide and Ifosfamide remained unaltered during batch experiments at both high and low concentrations, whereas Tamoxifen exhibited a decrease in concentration over the course of the treatment. Two positional isomers of a hydroxylated form of Tamoxifen were identified during this experiment using high-resolution mass spectrometry based on ultra-high-performance liquid chromatography coupled to an Orbitrap detector (LTQ-Velos Orbitrap). Finally, the identified transformation products of Tamoxifen were monitored in the bioreactor run with real hospital wastewater.
Abstract:
In this work we describe both a chromatographic purification procedure and a spot test for the enzyme peroxidase (POD; EC 1.11.1.7). The enzyme was obtained from crude extracts of sweet potatoes, and the chromatographic purification procedure resulted in several fractions. Therefore, a simple, fast and economical spot test for monitoring peroxidase during the purification procedure was developed. The spot test is based on the reaction of hydrogen peroxide and guaiacol, which is catalyzed by peroxidase, yielding the colored tetraguaiacol.
Abstract:
Metaheuristic methods have become increasingly popular approaches to solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest, especially in the evolutionary computation community. Several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima sets additional requirements for the design of algorithms to be effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is not generally thoroughly understood, the research relies heavily on experimental studies to find out the properties of different approaches. To achieve reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varying problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists.
As part of this thesis, such a framework for generating tunable test functions for experimentally evaluating different multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may invalidate the advantages gained from the global phase.
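For readers unfamiliar with the baseline algorithm the thesis builds on, a minimal sketch of canonical Differential Evolution (the DE/rand/1/bin variant, with no niching or local-search hybridization, minimizing a 2-D sphere function) might look as follows; all parameter values are ordinary textbook defaults, not the thesis's settings.

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, gens=100, seed=0):
    """Minimal DE/rand/1/bin minimizer; bounds is a list of (lo, hi) per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: base vector plus scaled difference of two others.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one mutant gene through.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy one-to-one selection.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = fit.argmin()
    return pop[best], fit[best]

x_best, f_best = de_rand_1_bin(lambda v: float((v ** 2).sum()),
                               [(-5, 5), (-5, 5)])
```

The greedy one-to-one selection is the algorithm's only "local" pressure; the niching and hybrid variants studied in the thesis modify exactly this interplay so that the population can retain several optima instead of collapsing onto one.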
Abstract:
Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice in Spain or internationally. The aim of this paper was to assess the hospital costs associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death. Both aggregate and mean costs per patient were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages.
Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major components of costs. This cost analysis is a baseline study that will provide a useful source of information for future studies on cost-effectiveness and on the budget impact of different therapeutic innovations in Spain.
Abstract:
Cefdinir has a broad spectrum of activity and high prescription rates; hence its counterfeiting seems imminent. We propose a simple, fast, selective and non-extractive spectrophotometric method for the content assay of cefdinir in formulations. The method is based on complexation of cefdinir with Fe under reducing conditions in a buffered medium (pH 11) to form a magenta-colored donor-acceptor complex (λmax = 550 nm; apparent molar absorptivity = 3720 L mol-1 cm-1). No other cephalosporins, penicillins or common excipients interfere under the test conditions. Beer's law is obeyed in the concentration range 8-160 µg mL-1.
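As a quick sanity check on the figures reported above, Beer's law (A = ε·c·l) links the apparent molar absorptivity to the absorbance expected over the stated working range. The molar mass of cefdinir (~395.4 g/mol) and the 1-cm path length are our assumptions for this sketch, not values stated in the abstract.

```python
# Beer's law: A = epsilon * c * l.
EPSILON = 3720.0   # L mol^-1 cm^-1, from the abstract
MW = 395.4         # g/mol, molar mass of cefdinir (assumed)
PATH = 1.0         # cm, cell path length (assumed)

def absorbance(conc_ug_per_ml):
    """Expected absorbance for a given cefdinir concentration in ug/mL."""
    molar = conc_ug_per_ml * 1e-3 / MW  # ug/mL -> g/L -> mol/L
    return EPSILON * molar * PATH

low, high = absorbance(8), absorbance(160)
```

Both ends of the stated 8-160 µg mL-1 interval give absorbances within a routinely measurable range, consistent with the reported Beer's-law compliance.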
Abstract:
Difenoconazole residues in strawberry fruit cultivated in pots were determined using solid-liquid extraction with low-temperature partition (SLE/LTP) for sample preparation and gas chromatography with electron capture detection (GC/ECD) for analysis. The optimized method presented excellent recovery from fortified samples and good reproducibility (average recovery ≥ 98%; CV < 15%). Linearity of response was demonstrated (r = 0.995), with a detection limit of 9 µg kg-1. The method was successfully applied to the determination of difenoconazole residues in strawberries. Based on these results, the fungicide dissipates quickly, but the residual concentration increases after multiple applications.
Abstract:
Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim of selecting only the most promising ones for experimental testing. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. A further criterion set for the virtual screening tools was their applicability to scaffold hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method for use with two or more tools. In the second part, five ligand-based virtual screening tools were evaluated, and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the same binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluating the scaffold-hopping ability of virtual screening tools.