909 results for comparison methods


Relevance: 40.00%

Abstract:

DNA G-quadruplexes are among the targets being actively explored for anti-cancer therapy by inhibiting them with small molecules. This computational study was conducted to predict the binding strengths and orientations of a set of novel dimethyl-amino-ethyl-acridine (DACA) analogues that were designed and synthesized in our laboratory but did not diffract under synchrotron light. The crystal structure of the DNA G-quadruplex (TGGGGT)4 (PDB: 1O0K) was used as the target for their binding properties in our studies. We used both the force field (FF) and QM/MM derived atomic charge schemes simultaneously to compare the predicted drug binding modes and their energetics. This study evaluates the comparative performance of fixed point charge based Glide XP docking and the quantum polarized ligand docking schemes. These results will provide insights into the effects of including or ignoring drug-receptor interfacial polarization events in molecular docking simulations, which in turn will aid the rational selection of computational methods at different levels of theory in future drug design programs. Plenty of molecular modelling tools and methods currently exist for modelling drug-receptor, protein-protein, or DNA-protein interactions at different levels of complexity. Yet the capacity of such tools to describe various physico-chemical properties more accurately is the next step ahead in current research. In particular, the use of the most accurate quantum mechanics (QM) methods is severely restricted by their computationally demanding nature. Although the use of massively parallel supercomputing environments has resulted in tremendous improvements in molecular mechanics (MM) calculations such as molecular dynamics, QM methods are still capable of dealing with only tens to hundreds of atoms. One efficient strategy that utilizes the strengths of both MM and QM is the QM/MM hybrid approach. Lately, attempts have been directed towards deploying several different QM methods to improve force field based simulations, albeit with practical restrictions in place. One such method incorporates charge polarization events at the drug-receptor interface, which are not explicitly present in the MM FF.
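
To make the role of interfacial polarization concrete, the following minimal Python sketch (not the Glide XP or QPLD implementation; all charges and coordinates are hypothetical) evaluates the Coulomb part of a ligand-receptor contact once with fixed force-field charges and once with QM/MM-style polarized charges, showing how the electrostatic contribution to a docking score shifts.

    # Illustrative sketch only: fixed vs. polarized point charges in a Coulomb sum.
    import numpy as np

    COULOMB_KCAL = 332.0636  # Coulomb constant, kcal*Angstrom/(mol*e^2)

    def coulomb_energy(q_lig, xyz_lig, q_rec, xyz_rec, dielectric=4.0):
        # Pairwise Coulomb interaction between ligand and receptor atoms (kcal/mol).
        d = np.linalg.norm(xyz_lig[:, None, :] - xyz_rec[None, :, :], axis=-1)
        return COULOMB_KCAL * np.sum(np.outer(q_lig, q_rec) / (dielectric * d))

    # Hypothetical atoms: two ligand atoms stacked 3.4 Angstrom above two receptor atoms.
    xyz_lig = np.array([[0.0, 0.0, 3.4], [1.4, 0.0, 3.4]])
    xyz_rec = np.array([[0.0, 0.0, 0.0], [1.4, 0.0, 0.0]])
    q_lig_ff = np.array([0.25, -0.10])   # fixed FF charges (illustrative)
    q_lig_qm = np.array([0.38, -0.21])   # QM/MM-polarized charges (illustrative)
    q_rec = np.array([-0.57, -0.49])

    print("fixed-charge E_elec:", coulomb_energy(q_lig_ff, xyz_lig, q_rec, xyz_rec))
    print("polarized   E_elec:", coulomb_energy(q_lig_qm, xyz_lig, q_rec, xyz_rec))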

Relevance: 40.00%

Abstract:

There are many published methods available for creating keyphrases for documents. Previous work in the field has shown that in a significant proportion of cases author-selected keyphrases are not appropriate for the document they accompany: often the keyphrases are not updated when the focus of a paper changes, or they are more classificatory than explanatory. This motivates the use of automated methods to improve the use of keyphrases. The published methods are all evaluated using different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of algorithms in future work but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were term frequency, inverse document frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate performance and quality of results, and to provide a future benchmark. It is shown that, with the comparison metric used for this study, term frequency and inverse document frequency were the best algorithms, followed by the synonym-based approach. Further work in the area is required to determine an appropriate (or more appropriate) comparison metric.
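
As a concrete illustration of two of the baselines compared here, the following minimal Python sketch ranks terms by term frequency weighted by inverse document frequency over a toy three-document corpus; candidate-phrase extraction, the C-Value, the NC-Value and the synonym-based method are not reproduced.

    # Illustrative sketch only: TF and IDF scoring over a toy corpus.
    import math
    from collections import Counter

    corpus = [
        "automatic keyphrase extraction from scientific documents",
        "evaluation of keyphrase extraction methods on a common corpus",
        "term frequency and inverse document frequency baselines",
    ]
    doc = corpus[1]

    def tf_scores(document):
        tokens = document.lower().split()
        counts = Counter(tokens)
        return {t: c / len(tokens) for t, c in counts.items()}

    def idf_scores(documents):
        n = len(documents)
        vocab = {t for d in documents for t in d.lower().split()}
        return {t: math.log(n / sum(t in d.lower().split() for d in documents))
                for t in vocab}

    tf = tf_scores(doc)
    idf = idf_scores(corpus)
    ranked = sorted(tf, key=lambda t: tf[t] * idf.get(t, 0.0), reverse=True)
    print(ranked[:5])  # top candidate terms by TF*IDF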

Relevance: 40.00%

Abstract:

A traditional plate count method and real-time PCR assays based on SYBR Green I and TaqMan technologies, using a specific primer pair and probe for amplification of the iap gene, were used for the quantitative assay of Listeria monocytogenes in seven-step decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58×10^7 cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed by surface-plating 0.1 ml of each sample on Palcam agar. The lowest detectable level for this method was 1.58×10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to quantification of L. monocytogenes in the decimal serial dilution series of nutrient broth and milk samples, 3.16×10 to 3.16×10^5 copies per reaction (equal to 1.58×10^3 to 1.58×10^7 cfu/ml of L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable steps for the plate count and both molecular assays were similar to the inoculation levels.
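
A minimal Python sketch of the standard-curve quantification step, assuming hypothetical Ct values for the purified-DNA standards (the study's measured values are not reproduced here): fit Ct against log10(copy number), then estimate the copy number of an unknown from its Ct.

    # Illustrative sketch only: qPCR quantification from a standard curve.
    import numpy as np

    copies_std = np.array([4, 4e1, 4e2, 4e3, 4e4, 4e5])      # iap-gene copies per reaction
    ct_std = np.array([36.1, 32.8, 29.4, 26.1, 22.7, 19.4])  # hypothetical Ct values

    slope, intercept = np.polyfit(np.log10(copies_std), ct_std, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0                  # amplification efficiency

    ct_unknown = 27.5
    copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
    print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, unknown ~{copies_unknown:.0f} copies")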

Relevance: 40.00%

Abstract:

In this study, we compare two different cyclone-tracking algorithms to detect North Atlantic polar lows, which are very intense mesoscale cyclones. Both approaches include spatial filtering, detection, tracking and constraints specific to polar lows. The first method uses digital bandpass-filtered mean sea level pressure (MSLP) fields in the spatial range of 200–600 km and is especially designed for polar lows. The second method also uses a bandpass filter but is based on the discrete cosine transform (DCT) and can be applied to MSLP and vorticity fields. The latter was originally designed for cyclones in general and has been adapted to polar lows for this study. Both algorithms are applied to the same regional climate model output fields from October 1993 to September 1995, produced by dynamical downscaling of the NCEP/NCAR reanalysis data. Comparisons between these two methods show that different filters lead to different numbers and locations of tracks. The DCT is more precise in scale separation than the digital filter, and the results of this study suggest that it is better suited for the bandpass filtering of MSLP fields. The detection and tracking parts also influence the numbers of tracks, although less critically. After a selection process that applies criteria to identify tracks of potential polar lows, differences between both methods are still visible, though the major systems are identified in both.
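
A minimal Python sketch of the DCT bandpass-filtering step, assuming an illustrative grid and a synthetic MSLP field; the published algorithms add polar-low-specific detection, tracking and selection criteria on top of this.

    # Illustrative sketch only: 200-600 km bandpass filter of a 2-D field via the DCT.
    import numpy as np
    from scipy.fft import dctn, idctn

    ny, nx, dx = 200, 240, 25.0                   # grid points and spacing (km), illustrative
    mslp = 1010.0 + np.random.randn(ny, nx)       # stand-in for a model MSLP field (hPa)

    spec = dctn(mslp, norm="ortho")
    ky = np.arange(ny)[:, None] / (2 * ny * dx)   # wavenumbers (cycles per km)
    kx = np.arange(nx)[None, :] / (2 * nx * dx)
    k = np.hypot(kx, ky)                          # total wavenumber
    wavelength = 1.0 / np.maximum(k, 1e-12)       # km; k=0 mean maps far outside the band

    band = (wavelength >= 200.0) & (wavelength <= 600.0)
    mslp_bp = idctn(np.where(band, spec, 0.0), norm="ortho")
    print("band-passed field std dev:", mslp_bp.std())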

Relevance: 40.00%

Abstract:

This paper presents practical approaches to the problem of sample size re-estimation in the case of clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates. Copyright © 2012 John Wiley & Sons, Ltd.
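
For orientation, the sketch below shows standard ingredients that such a re-estimation builds on, not the specific procedures of this paper: Schoenfeld's formula for the required number of events under proportional hazards, converted to a number of patients using a blinded (pooled) estimate of the event probability. All numerical inputs are illustrative.

    # Illustrative sketch only: events via Schoenfeld's formula, then a blinded conversion
    # to patients; not the re-estimation procedures proposed in the paper.
    import math
    from scipy.stats import norm

    def required_events(hazard_ratio, alpha=0.05, power=0.9):
        # Events for a two-sided log-rank test with 1:1 allocation (Schoenfeld's formula).
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return 4 * z ** 2 / math.log(hazard_ratio) ** 2

    target_hr = 0.75               # design hazard ratio (illustrative)
    blinded_event_prob = 0.55      # pooled event probability estimated at review (illustrative)

    events = required_events(target_hr)
    patients = events / blinded_event_prob
    print(f"~{events:.0f} events, ~{patients:.0f} patients at the blinded review")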

Relevance: 40.00%

Abstract:

Recently, in order to accelerate drug development, trials that use adaptive seamless designs such as phase II/III clinical trials have been proposed. Phase II/III clinical trials combine traditional phases II and III into a single trial that is conducted in two stages. Using stage 1 data, an interim analysis is performed to answer phase II objectives and, after collection of stage 2 data, a final confirmatory analysis is performed to answer phase III objectives. In this paper we consider phase II/III clinical trials in which, at stage 1, several experimental treatments are compared to a control and the apparently most effective experimental treatment is selected to continue to stage 2. Although these trials are attractive because the confirmatory analysis includes phase II data from stage 1, the inference methods used for trials that compare a single experimental treatment to a control and do not have an interim analysis are no longer appropriate. Several methods for analysing phase II/III clinical trials have been developed. These methods are recent and so there is little literature on extensive comparisons of their characteristics. In this paper we review and compare the various methods available for constructing confidence intervals after phase II/III clinical trials.
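
A minimal simulation sketch (not one of the reviewed methods) of why ordinary confidence intervals are no longer appropriate after stage 1 selection: picking the apparently best of several arms and then using a naive 95% interval for its effect gives coverage below the nominal level. All settings are illustrative.

    # Illustrative sketch only: coverage of a naive 95% CI after selecting the best arm.
    import numpy as np

    rng = np.random.default_rng(1)
    k, se, true_effects = 4, 1.0, np.zeros(4)    # k experimental arms, equal true effects
    n_sim, cover = 20000, 0

    for _ in range(n_sim):
        est = rng.normal(true_effects, se)       # stage-1 effect estimates vs control
        best = est.argmax()                      # selected treatment
        lo, hi = est[best] - 1.96 * se, est[best] + 1.96 * se
        cover += lo <= true_effects[best] <= hi

    print("naive coverage after selection:", cover / n_sim)   # noticeably below 0.95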

Relevance: 40.00%

Abstract:

In an adaptive seamless phase II/III clinical trial, interim analysis data are used for treatment selection, enabling resources to be focused on comparison of the more effective treatment(s) with a control. In this paper, we compare two methods recently proposed to enable use of short-term endpoint data for decision-making at the interim analysis. The comparison focuses on the power and the probability of correctly identifying the most promising treatment. We show that the choice of method depends on how well short-term data predict the best treatment, which may be measured by the correlation between treatment effects on short- and long-term endpoints.
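
A minimal simulation sketch of this dependence, with illustrative numbers rather than the paper's formal model: the true short- and long-term treatment effects of three experimental arms are drawn with correlation rho, selection is made on noisy short-term estimates, and the probability of selecting the arm with the best long-term effect is estimated for several values of rho.

    # Illustrative sketch only: probability of correct selection as a function of the
    # correlation between short- and long-term treatment effects.
    import numpy as np

    rng = np.random.default_rng(2)
    n_arms, se, n_sim = 3, 0.3, 20000

    for rho in (0.2, 0.5, 0.8):
        correct = 0
        for _ in range(n_sim):
            z1 = rng.normal(size=n_arms)
            z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.normal(size=n_arms)
            delta_long = 0.3 * z1               # true long-term effects of the arms
            delta_short = 0.3 * z2              # correlated true short-term effects
            est_short = delta_short + se * rng.normal(size=n_arms)  # interim estimates
            correct += est_short.argmax() == delta_long.argmax()
        print(f"rho={rho}: P(select arm with best long-term effect) ~ {correct / n_sim:.2f}")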

Relevance: 40.00%

Abstract:

As part of an international intercomparison project, a set of single column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column that is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed a systematic comparison of the behavior of different models under a consistent implementation of the WTG and DGW methods, and a systematic comparison of the two methods in models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both the WTG and DGW methods. Some of the models reproduce the reference state, while others sustain a large-scale circulation that results in either substantially lower or higher precipitation than in the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivity to the initial moisture conditions occurs in some WTG simulations, which exhibit multiple stable equilibria: a dry equilibrium state when initialized dry and a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SST. In some models, the existence of multiple equilibria is sensitive to some parameters in the WTG calculations.
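
A minimal Python sketch of the core WTG diagnostic, under the common formulation in which the large-scale vertical velocity removes the potential-temperature anomaly from the reference profile over a relaxation time; the boundary-layer treatment, vertical smoothing and the profiles themselves are illustrative and differ between the participating models.

    # Illustrative sketch only: WTG large-scale vertical velocity from a temperature anomaly.
    import numpy as np

    z = np.linspace(1000.0, 15000.0, 60)                    # free-tropospheric levels (m)
    theta_ref = 300.0 + 0.004 * z                            # reference potential temperature (K)
    theta = theta_ref + 0.5 * np.sin(np.pi * z / 15000.0)    # column with a warm anomaly (K)

    tau = 2.0 * 3600.0                                       # WTG relaxation time scale (s)
    dtheta_ref_dz = np.gradient(theta_ref, z)                # static stability (K/m)

    # Balance: w * dtheta_ref/dz = (theta - theta_ref) / tau
    w_wtg = (theta - theta_ref) / (tau * dtheta_ref_dz)      # large-scale ascent (m/s)
    print("peak WTG vertical velocity (m/s):", w_wtg.max())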

Relevance: 40.00%

Abstract:

As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison of the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
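
A minimal Python sketch of the DGW diagnostic, assuming the commonly used damped hydrostatic gravity-wave form eps * d2(omega)/dp2 = k^2 * Rd * Tv'/p with omega = 0 at the surface and model top; the wavenumber, damping rate and anomaly profile are illustrative, and details of the implementation differ between the participating models.

    # Illustrative sketch only: solve the assumed DGW equation for the pressure velocity.
    import numpy as np

    Rd = 287.0                          # gas constant for dry air (J kg-1 K-1)
    k = 2 * np.pi / 5.0e6               # horizontal wavenumber for a 5000-km wavelength (m-1)
    eps = 1.0 / (6 * 3600.0)            # momentum damping rate (s-1), illustrative

    p = np.linspace(100e2, 1000e2, 91)                             # pressure levels (Pa)
    tv_anom = 1.0 * np.sin(np.pi * (p - p[0]) / (p[-1] - p[0]))    # warm Tv anomaly (K)

    dp = p[1] - p[0]
    n = len(p)
    A = np.zeros((n, n))
    rhs = k**2 * Rd * tv_anom / (eps * p)
    A[0, 0] = A[-1, -1] = 1.0           # omega = 0 at model top and surface
    rhs[0] = rhs[-1] = 0.0
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = 1.0 / dp**2
        A[i, i] = -2.0 / dp**2

    omega = np.linalg.solve(A, rhs)     # Pa/s; negative values denote large-scale ascent
    print("strongest ascent (Pa/s):", omega.min())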

Relevance: 40.00%

Abstract:

Sampling protocols for detecting Salmonella on poultry differ among various countries. In the United States, the U.S. Department of Agriculture Food Safety and Inspection Service dictates that whole broiler carcasses should be rinsed with 400 ml of 1% buffered peptone water, whereas in the European Union 25-g samples composed of neck skin from three carcasses are evaluated. The purpose of this study was to evaluate a whole carcass rinse (WCR) and a neck skin excision (NS) procedure for Salmonella and Escherichia coli isolation from the same broiler carcass. Carcasses were obtained from three broiler processing plants. The skin around the neck area was aseptically removed and bagged separately from the carcass, and microbiological analysis was performed. The corresponding carcass was bagged and a WCR sample was evaluated. No significant difference (alpha <= 0.05) in Salmonella prevalence was found between the samples processed by the two methods, but both procedures produced many false-negative Salmonella results. Prechill, 37% (66 carcasses), 28% (50 carcasses), and 51% (91 carcasses) of the 180 carcasses examined were positive for Salmonella by WCR, NS, and both procedures combined, respectively. Postchill, 3% (5 carcasses), 7% (12 carcasses), and 10% (17 carcasses) of the 177 carcasses examined were positive for Salmonella by the WCR, NS, and combination of both procedures, respectively. Prechill, E. coli plus coliform counts were 3.0 and 2.6 log CFU/ml by the WCR and NS methods, respectively. Postchill, E. coli plus coliform counts were 1.7 and 1.4 log CFU/ml by the WCR and NS methods, respectively.
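
The prevalence arithmetic reported above can be reproduced directly from the prechill counts; the short Python sketch below does so and also counts the carcasses each single method missed relative to the combination. The study's statistical test is not reproduced here.

    # Illustrative sketch only: prechill prevalence percentages from the reported counts.
    wcr_pos, ns_pos, combined_pos, n = 66, 50, 91, 180

    for label, pos in (("WCR", wcr_pos), ("NS", ns_pos), ("combined", combined_pos)):
        print(f"{label}: {pos}/{n} = {100 * pos / n:.0f}% Salmonella positive")

    # Positives found by the combination but missed by a single method.
    print("missed by WCR:", combined_pos - wcr_pos, "carcasses")
    print("missed by NS :", combined_pos - ns_pos, "carcasses")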

Relevance: 40.00%

Abstract:

In the present study, we compared 2 methods for collecting ixodid ticks on the verges of animal trails in a primary Amazon forest area in northern Brazil. (i) Dragging: this method was based on passing a 1-m² white flannel over the vegetation and checking the flannel for caught ticks every 5-10 m. (ii) Visual search: this method consisted of looking for questing ticks on the tips of leaves of the vegetation bordering animal trails in the forest. A total of 103 adult ticks belonging to 4 Amblyomma species were collected by the visual search method on 5 collecting dates, while only 44 adult ticks belonging to 3 Amblyomma species were collected by dragging on 5 other collecting dates. These values were statistically different (Mann-Whitney test, P = 0.0472). On the other hand, dragging was more efficient for subadult ticks, since no larva or nymph was collected by visual search, whereas 18 nymphs and 7 larvae were collected by dragging. The visual search method proved to be suitable for collecting adult ticks in the Amazon forest; however, field studies should include a second method, such as dragging, in order to maximize the collection of subadult ticks. Indeed, these 2 methods can be performed by a single investigator at the same time, while he/she walks along an animal trail in the forest. (C) 2010 Elsevier GmbH. All rights reserved.
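
A minimal Python sketch of the statistical comparison, using hypothetical per-date counts that sum to the reported totals of 103 and 44 adult ticks; the paper does not report the per-date breakdown, so the exact P value will differ from 0.0472.

    # Illustrative sketch only: Mann-Whitney comparison of per-date adult tick counts.
    from scipy.stats import mannwhitneyu

    visual_search = [25, 18, 22, 20, 18]   # adult ticks per date, hypothetical split of 103
    dragging = [10, 8, 12, 7, 7]           # adult ticks per date, hypothetical split of 44

    stat, p = mannwhitneyu(visual_search, dragging, alternative="two-sided")
    print(f"U = {stat}, P = {p:.4f}")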

Relevance: 40.00%

Abstract:

This study evaluates the forecasts of three nonlinear methods (Markov Switching Autoregressive Model, Logistic Smooth Transition Autoregressive Model, and Autometrics with Dummy Saturation) for Brazilian monthly industrial production and tests whether they are more accurate than those of naive predictors, such as the autoregressive model of order p and the double differencing device. The results show that step dummy saturation and the Logistic Smooth Transition Autoregressive Model can be superior to the double differencing device, but the linear autoregressive model is more accurate than all the other methods analyzed.

Relevance: 40.00%

Abstract:

This work assesses the forecasts of three nonlinear methods (Markov Switching Autoregressive Model, Logistic Smooth Transition Autoregressive Model, and Autometrics with Dummy Saturation) for the Brazilian monthly industrial production and tests if they are more accurate than those of naive predictors such as the autoregressive model of order p and the double differencing device. The results show that the step dummy saturation and the logistic smooth transition autoregressive can be superior to the double differencing device, but the linear autoregressive model is more accurate than all the other methods analyzed.
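
A minimal Python sketch of the two naive benchmarks only (the nonlinear models are not reproduced): a one-step-ahead forecast from an AR(p) model fitted by least squares, and the double differencing device, whose forecast is y[T+1] = 2*y[T] - y[T-1]. The series is simulated; the study uses Brazilian monthly industrial production.

    # Illustrative sketch only: AR(p) and double-differencing one-step-ahead forecasts.
    import numpy as np

    rng = np.random.default_rng(0)
    y = np.cumsum(rng.normal(0.2, 1.0, size=200))   # illustrative trending series

    def ar_forecast(series, p=3):
        # Fit y_t = c + a1*y_{t-1} + ... + ap*y_{t-p} by OLS and forecast one step ahead.
        X = np.column_stack([series[p - i - 1:len(series) - i - 1] for i in range(p)])
        X = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
        return coef[0] + coef[1:] @ series[-1:-p - 1:-1]

    print("AR(3) forecast     :", ar_forecast(y, p=3))
    print("double differencing:", 2 * y[-1] - y[-2])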

Relevance: 40.00%

Abstract:

An analytical procedure based on a manual dynamic headspace solid-phase microextraction (HS-SPME) method and the conventional liquid–liquid extraction (LLE) method were compared for their effectiveness in the extraction and quantification of volatile compounds from commercial whiskey samples. Seven extraction solvents covering a wide range of polarities and two SPME fibre coatings were evaluated. The highest extracted amounts were achieved using dichloromethane (CH2Cl2) in the LLE method (LLE_CH2Cl2) and a CAR/PDMS fibre (SPME_CAR/PDMS) in HS-SPME. Each method was used to determine the responses of 25 analytes from whiskeys and calibration standards, in order to provide sensitivity comparisons between the two methods. Calibration curves were established in a synthetic whiskey, and the linear correlation coefficients (r) were greater than 0.9929 for LLE_CH2Cl2 and 0.9935 for SPME_CAR/PDMS for all target compounds. Recoveries greater than 80% were achieved. For most compounds, precision (expressed as relative standard deviation, R.S.D.) was very good, with R.S.D. values lower than 14.78% for the HS-SPME method and 19.42% for the LLE method. The detection limits ranged from 0.13 to 19.03 μg L−1 for the SPME procedure and from 0.50 to 12.48 μg L−1 for LLE. A tentative study was made to estimate the contribution of specific compounds to the aroma of a whiskey on the basis of their odour activity values (OAV). Ethyl octanoate, followed by isoamyl acetate and isobutyl alcohol, was found to be the most potent odour-active compound.
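
A minimal Python sketch of the quantification steps described above for a single hypothetical analyte: a linear calibration fit, a detection limit taken here as 3.3*s/slope (a common convention, not necessarily the one used in the paper), and an odour activity value computed as concentration divided by odour threshold. All numbers are illustrative.

    # Illustrative sketch only: calibration fit, detection limit and OAV for one analyte.
    import numpy as np

    conc = np.array([5.0, 25.0, 50.0, 100.0, 200.0])    # standard concentrations (ug/L)
    area = np.array([0.9, 4.6, 9.3, 18.4, 37.1])        # peak areas (arbitrary units)

    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]
    resid_sd = np.std(area - (slope * conc + intercept), ddof=2)
    lod = 3.3 * resid_sd / slope                        # assumed LOD convention

    sample_conc = (12.0 - intercept) / slope            # from a sample's peak area of 12.0
    oav = sample_conc / 2.0                             # odour threshold of 2 ug/L, illustrative
    print(f"r = {r:.4f}, LOD ~ {lod:.2f} ug/L, sample ~ {sample_conc:.1f} ug/L, OAV ~ {oav:.1f}")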