15 results for Experimental methods

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

70.00%

Publisher:

Abstract:

Mixed methods research involves the combined use of quantitative and qualitative methods in the same research study, and it is becoming increasingly important in several scientific areas. The aim of this paper is to review and compare through a mixed methods multiple-case study the application of this methodology in three reputable behavioural science journals: the Journal of Organizational Behavior, Addictive Behaviors and Psicothema. A quantitative analysis was carried out to review all the papers published in these journals during the period 2003-2008 and classify them into two blocks: theoretical and empirical, with the latter being further subdivided into three subtypes (quantitative, qualitative and mixed). A qualitative analysis determined the main characteristics of the mixed methods studies identified, in order to describe in more detail the ways in which the two methods are combined based on their purpose, priority, implementation and research design. From the journals selected, a total of 1,958 articles were analysed, the majority of which corresponded to empirical studies, with only a small number referring to research that used mixed methods. Nonetheless, mixed methods research does appear in all the behavioural science journals studied within the period selected, showing a range of designs, where the sequential equal weight mixed methods research design seems to stand out.

Relevance:

60.00%

Publisher:

Abstract:

Background: We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a ‘reference set’ of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. Results: The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. Conclusions: This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence.
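The nucleotide-level sensitivity and specificity quoted above can be illustrated with a short sketch. This is purely illustrative: the interval format, function names and example coordinates are assumptions, not the EGASP evaluation code.

```python
# Minimal sketch of nucleotide-level sensitivity/specificity for gene
# predictions, in the spirit of the EGASP evaluation (not the actual
# EGASP pipeline; interval format and names are illustrative).

def coding_bases(exons):
    """Expand (start, end) coding exons (1-based, inclusive) into a set of positions."""
    bases = set()
    for start, end in exons:
        bases.update(range(start, end + 1))
    return bases

def nucleotide_accuracy(reference_exons, predicted_exons):
    ref = coding_bases(reference_exons)
    pred = coding_bases(predicted_exons)
    tp = len(ref & pred)          # bases coding in both
    fn = len(ref - pred)          # coding bases missed
    fp = len(pred - ref)          # bases wrongly called coding
    sensitivity = tp / (tp + fn) if ref else 0.0
    specificity = tp / (tp + fp) if pred else 0.0   # precision at the nucleotide level
    return sensitivity, specificity

if __name__ == "__main__":
    reference = [(100, 199), (300, 399)]    # annotated coding exons
    prediction = [(100, 189), (310, 420)]   # a predictor's output
    sn, sp = nucleotide_accuracy(reference, prediction)
    print(f"Sn = {sn:.2f}, Sp = {sp:.2f}")
```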

Relevance:

60.00%

Publisher:

Abstract:

A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data and application is made to the yeast chromosome III sequence and to C. elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
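As an illustration of the decomposition step, the following sketch fits a mixture of two normals to window scores and takes the weight of the higher-mean component as the coding density estimate. The synthetic scores stand in for a real coding statistic; the paper's specific statistic and fitting details are not reproduced here.

```python
# Minimal sketch of the two-normal decomposition: given a coding statistic
# computed over many sequence windows, fit a mixture of two normals and
# estimate the coding density from the weight of the "coding" component.
# The synthetic scores below are placeholders for a real coding statistic.
import numpy as np
from sklearn.mixture import GaussianMixture

def coding_density_estimate(scores):
    """Decompose window scores into two normals; return the weight of the
    component with the higher mean, taken to be the coding fraction."""
    gm = GaussianMixture(n_components=2, random_state=0).fit(scores.reshape(-1, 1))
    coding_component = int(np.argmax(gm.means_.ravel()))
    return gm.weights_[coding_component]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic corpus: 70% noncoding windows (low scores), 30% coding (high scores).
    noncoding = rng.normal(loc=0.0, scale=1.0, size=700)
    coding = rng.normal(loc=2.5, scale=1.0, size=300)
    scores = np.concatenate([noncoding, coding])
    print(f"Estimated coding density: {coding_density_estimate(scores):.2f}")  # ~0.30
```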

Relevance:

60.00%

Publisher:

Abstract:

This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity, or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand, studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call "plausibility", including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.

Relevance:

60.00%

Publisher:

Abstract:

The rationale and historical background of tachistoscopic presentation in visual hemifields are set out, and some fundamental aspects concerning the subjects, apparatus, stimuli, responses and procedures of experimental designs using this technique are discussed.

Relevance:

60.00%

Publisher:

Abstract:

Broadband access is a key factor for economic and social development. However, providing broadband to rural areas is not attractive to private telecommunications operators due to its low or zero return on investment. To deal with broadband provision in rural areas, different governance systems based on private and public cooperation have appeared. This paper not only identifies and defines public and private cooperation models but also assesses their impact on overcoming the digital divide in rural areas. The results show that the policy of publicly owned infrastructure under private management had positive effects on reducing the broadband digital divide and was applied to the areas with the largest digital divide; subsidies to private operators had positive effects only on reducing the broadband digital divide; and public infrastructure with public management programmes had neither effect. The results, obtained using quasi-experimental methods, suggest the importance of incentives and control mechanisms in broadband universal service provision plans.
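The abstract does not name the quasi-experimental estimator used. As one hedged illustration of that family of methods, the sketch below computes a difference-in-differences effect from invented pre/post coverage figures; it is not the paper's analysis.

```python
# Hedged sketch: the abstract reports quasi-experimental evaluation without
# naming the estimator. A difference-in-differences (DiD) comparison is one
# standard choice; the data below are invented for illustration only.
import numpy as np

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DiD effect = (change in treated areas) - (change in control areas)."""
    return (np.mean(treated_post) - np.mean(treated_pre)) - \
           (np.mean(control_post) - np.mean(control_pre))

if __name__ == "__main__":
    # Broadband coverage (fraction of households) in rural municipalities,
    # before and after a hypothetical public-infrastructure programme.
    treated_pre, treated_post = [0.31, 0.28, 0.35], [0.62, 0.58, 0.66]
    control_pre, control_post = [0.33, 0.30, 0.36], [0.45, 0.41, 0.47]
    effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
    print(f"Estimated programme effect: {effect:+.2f}")
```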

Relevance:

30.00%

Publisher:

Abstract:

We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, based on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), to a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of the computational work and the quality of the optical flow estimation.
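A minimal sketch of the unidirectional coarse-to-fine idea behind MR/OPT follows; the toy energy, grid transfer and optimizer are generic stand-ins rather than the paper's variational optical-flow models.

```python
# Hedged sketch of the unidirectional (coarse-to-fine) multilevel idea:
# solve on a coarse grid, prolong the solution to the next finer grid and
# use it as the starting point there. Not the paper's MR/OPT implementation.
import numpy as np
from scipy.ndimage import zoom
from scipy.optimize import minimize

def pyramid(image, levels):
    """Image pyramid from fine (index 0) to coarse (index -1)."""
    grids = [image]
    for _ in range(levels - 1):
        grids.append(zoom(grids[-1], 0.5, order=1))
    return grids

def energy(u_flat, target, smoothness=0.1):
    """Toy quadratic energy: data term plus first-difference smoothness."""
    u = u_flat.reshape(target.shape)
    data = np.sum((u - target) ** 2)
    reg = smoothness * (np.sum(np.diff(u, axis=0) ** 2) + np.sum(np.diff(u, axis=1) ** 2))
    return data + reg

def coarse_to_fine(image, levels=2):
    grids = pyramid(image, levels)
    u = np.zeros_like(grids[-1])                    # start on the coarsest grid
    for level in reversed(range(levels)):
        target = grids[level]
        if u.shape != target.shape:                 # prolong to the finer grid
            u = zoom(u, np.array(target.shape) / np.array(u.shape), order=1)
        u = minimize(energy, u.ravel(), args=(target,), method="L-BFGS-B").x.reshape(target.shape)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((16, 16))
    print("final energy:", round(float(energy(coarse_to_fine(img).ravel(), img)), 4))
```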

Relevance:

30.00%

Publisher:

Abstract:

Background: Non-invasive monitoring of respiratory muscle function is an area of increasing research interest, resulting in the appearance of new monitoring devices, one of these being piezoelectric contact sensors. The present study was designed to test whether the use of piezoelectric contact (non-invasive) sensors could be useful in respiratory monitoring, in particular in measuring the timing of diaphragmatic contraction. Methods: Experiments were performed in an animal model: three pentobarbital-anesthetized mongrel dogs. The motion of the thoracic cage (TM signal) was acquired by means of a piezoelectric contact sensor placed on the costal wall. This signal was compared with direct measurements of the diaphragmatic muscle length (DL signal), made by sonomicrometry. Furthermore, to assess diaphragmatic function, other respiratory signals were acquired: respiratory airflow (FL signal) and transdiaphragmatic pressure (DP signal). Diaphragm contraction time was estimated with these four signals. Using the DL signal as reference, the contraction times estimated with the other three signals were compared with it. Results: The contraction time estimated with the TM signal tends to give a reading 0.06 seconds lower than the measure made with the DL signal (-0.21 and 0.00 for the FL and DP signals, respectively), with a standard deviation of 0.05 seconds (0.08 and 0.06 for the FL and DP signals, respectively). Correlation coefficients indicated a close link between the contraction time estimated with the TM signal and that estimated with the DL signal (a Pearson correlation coefficient of 0.98, a reliability coefficient of 0.95, a slope of 1.01 and a Spearman's rank-order coefficient of 0.98). In general, correlation coefficients and the mean and standard deviation of the differences were better in the inspiratory load respiratory tests than in the spontaneous ventilation tests. Conclusion: The technique presented in this work provides a non-invasive method to assess the timing of diaphragmatic contraction in canines, using a piezoelectric contact sensor placed on the costal wall.
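The agreement statistics reported above (bias, standard deviation of the differences, Pearson correlation) can be sketched as follows; the breath-by-breath contraction times are invented, and only the type of comparison mirrors the study.

```python
# Hedged sketch of the agreement analysis: given breath-by-breath contraction
# times estimated from a reference signal (DL) and from the piezoelectric
# sensor signal (TM), compute the bias, the standard deviation of the
# differences and the Pearson correlation. Numbers are invented.
import numpy as np

def agreement(reference, estimate):
    reference, estimate = np.asarray(reference), np.asarray(estimate)
    diff = estimate - reference
    r = np.corrcoef(reference, estimate)[0, 1]
    return diff.mean(), diff.std(ddof=1), r

if __name__ == "__main__":
    dl_times = [1.02, 0.95, 1.10, 1.05, 0.98, 1.12]   # contraction times (s), reference
    tm_times = [0.97, 0.90, 1.03, 0.99, 0.93, 1.05]   # contraction times (s), sensor
    bias, sd, r = agreement(dl_times, tm_times)
    print(f"bias = {bias:+.3f} s, SD = {sd:.3f} s, Pearson r = {r:.3f}")
```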

Relevance:

30.00%

Publisher:

Abstract:

We study the statistical properties of three estimation methods for a model of learning that is often fitted to experimental data: quadratic deviation measures without unobserved heterogeneity, and maximum likelihood with and without unobserved heterogeneity. After discussing identification issues, we show that the estimators are consistent and provide their asymptotic distribution. Using Monte Carlo simulations, we show that ignoring unobserved heterogeneity can lead to seriously biased estimations in samples which have the typical length of actual experiments. Better small sample properties are obtained if unobserved heterogeneity is introduced. That is, rather than estimating the parameters for each individual, the individual parameters are considered random variables, and the distribution of those random variables is estimated.
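To illustrate the Monte Carlo point rather than the paper's specific learning model, the following sketch uses a generic random-intercept logit: a pooled estimator that ignores the individual intercepts recovers an attenuated slope.

```python
# Hedged sketch: ignoring unobserved heterogeneity biases the estimates.
# A generic random-intercept logit (not the paper's learning model) makes
# the point: the pooled slope is attenuated towards zero.
import numpy as np
from sklearn.linear_model import LogisticRegression

def simulate(n_subjects=200, n_periods=50, beta=1.0, sigma_alpha=2.0, seed=0):
    rng = np.random.default_rng(seed)
    alpha = rng.normal(0.0, sigma_alpha, size=n_subjects)    # unobserved heterogeneity
    x = rng.normal(size=(n_subjects, n_periods))
    p = 1.0 / (1.0 + np.exp(-(alpha[:, None] + beta * x)))
    y = (rng.random(p.shape) < p).astype(int)
    return x.ravel().reshape(-1, 1), y.ravel()

if __name__ == "__main__":
    X, y = simulate()
    pooled = LogisticRegression(C=1e6, max_iter=1000).fit(X, y)   # ignores heterogeneity
    print(f"true beta = 1.00, pooled estimate = {pooled.coef_[0, 0]:.2f} (attenuated)")
```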

Relevance:

30.00%

Publisher:

Abstract:

We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, based on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), to a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of the computational work and the quality of the optical flow estimation.
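As a complement to the coarse-to-fine sketch shown earlier, the following illustrates the FMG/OPT ingredient of treating the prolonged coarse-grid correction as a search direction scaled by a backtracking line search; the quadratic objective is a stand-in, not the paper's optical-flow energy.

```python
# Hedged sketch: treat a coarse-grid correction as a search direction on the
# fine grid and scale it with a backtracking (Armijo) line search.
# Objective and gradient are generic quadratics, not the paper's energies.
import numpy as np

def objective(u, target):
    return 0.5 * np.sum((u - target) ** 2)

def gradient(u, target):
    return u - target

def armijo_step(u, direction, target, t=1.0, beta=0.5, c=1e-4):
    """Backtrack until the Armijo sufficient-decrease condition holds."""
    f0, g0 = objective(u, target), gradient(u, target)
    while objective(u + t * direction, target) > f0 + c * t * g0.dot(direction):
        t *= beta
    return t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random(64)
    u = np.zeros(64)
    correction = target - u + 0.3 * rng.normal(size=64)  # stands in for a prolonged coarse-grid correction
    t = armijo_step(u, correction, target)
    u_new = u + t * correction
    print(f"step = {t:.3f}, energy {objective(u, target):.3f} -> {objective(u_new, target):.3f}")
```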

Relevance:

30.00%

Publisher:

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine in a pilot study the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to the improvement of the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human issues have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and without any side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
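A hedged sketch of the classification step follows, using a small feed-forward network on synthetic per-recording features; the study's actual speech features and network architecture are not reproduced.

```python
# Hedged sketch: a small feed-forward neural network separating AD from
# control subjects on per-recording speech features. Features and labels
# are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_class, n_features = 40, 6          # e.g. pause rate, pitch stats, fractal dimension...
controls = rng.normal(0.0, 1.0, size=(n_per_class, n_features))
patients = rng.normal(0.8, 1.0, size=(n_per_class, n_features))   # shifted to be separable
X = np.vstack([controls, patients])
y = np.array([0] * n_per_class + [1] * n_per_class)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```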

Relevance:

30.00%

Publisher:

Abstract:

Background: To examine the effect of anastomosis on experimental carcinogenesis in the colon of rats. Methods: Forty-three 10-week-old male and female Sprague-Dawley rats were operated on by performing an end-to-side ileorectostomy. Group A: 16 rats received no treatment. Group B: 27 rats received 18 weekly subcutaneous injections of 1,2-dimethylhydrazine (DMH) at a dose of 21 mg/kg body weight, from the eighth day after the intervention. Animals were sacrificed between weeks 25 and 27. The number of tumours, their localization, size and microscopic characteristics were recorded. A paired chi-squared analysis was performed comparing tumoral induction in the perianastomotic zone with that in the rest of the colon in contact with faeces. Results: No tumours appeared in the dimethylhydrazine-free group. The percentage tumoral area was greater in the perianastomotic zone than in the rest of the colon in contact with faeces (p = 0.014). Conclusion: We found a cocarcinogenic effect due to the creation of an anastomosis, when using an experimental model of colonic carcinogenesis induced by DMH in rats.
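The paired chi-squared comparison can be illustrated with McNemar's test on discordant pairs; the counts below are invented and the study's exact analysis may differ.

```python
# Hedged sketch of a paired chi-squared comparison: for each treated rat,
# record whether tumours appeared in the perianastomotic zone and in the rest
# of the colon, then apply McNemar's test to the discordant pairs.
# The counts are invented for illustration.
from scipy.stats import chi2

def mcnemar(b, c):
    """b, c: discordant pair counts (tumour in one zone but not the other)."""
    statistic = (abs(b - c) - 1) ** 2 / (b + c)   # continuity-corrected
    return statistic, chi2.sf(statistic, df=1)

if __name__ == "__main__":
    b = 11   # hypothetical: tumour in perianastomotic zone only
    c = 2    # hypothetical: tumour in the rest of the colon only
    stat, p = mcnemar(b, c)
    print(f"McNemar chi2 = {stat:.2f}, p = {p:.3f}")
```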

Relevance:

30.00%

Publisher:

Abstract:

Background: Information about the composition of regulatory regions is of great value for designing experiments to functionally characterize gene expression. The multiplicity of available applications to predict transcription factor binding sites in a particular locus contrasts with the substantial computational expertise that is demanded to manipulate them, which may constitute a potential barrier for the experimental community. Results: CBS (Conserved regulatory Binding Sites, http://compfly.bio.ub.es/CBS) is a public platform of evolutionarily conserved binding sites and enhancers predicted in multiple Drosophila genomes that is furnished with published chromatin signatures associated with transcriptionally active regions and other experimental sources of information. The rapid access to this novel body of knowledge through a user-friendly web interface enables non-expert users to identify the binding sequences available for any particular gene, transcription factor, or genome region. Conclusions: The CBS platform is a powerful resource that provides tools for data mining of individual sequences and groups of co-expressed genes, together with epigenomic information, to conduct regulatory screenings in Drosophila.

Relevance:

30.00%

Publisher:

Abstract:

Background: The enzyme fatty acid synthase (FASN) is highly expressed in many human carcinomas and its inhibition is cytotoxic to human cancer cells. The use of FASN inhibitors has been limited until now by anorexia and weight loss, which is associated with the stimulation of fatty acid oxidation. Materials and Methods: The in vitro effect of (-)-epigallocatechin-3-gallate (EGCG) on fatty acid metabolism enzymes, on apoptosis and on cell signalling was evaluated. In vivo, the effect of EGCG on animal body weight was addressed. Results: EGCG inhibited FASN activity, induced apoptosis and caused a marked decrease of human epidermal growth factor receptor 2 (HER2), phosphatidylinositol 3-kinase (PI3K)/AKT and extracellular signal-regulated kinase (ERK) 1/2 proteins in breast cancer cells. EGCG did not induce a stimulatory effect on carnitine palmitoyltransferase 1 (CPT-1) activity in vitro (84% of control), or on animal body weight in vivo (99% of control). Conclusion: EGCG is a FASN inhibitor with anticancer activity which does not exhibit cross-activation of fatty acid oxidation and does not induce weight loss, suggesting its potential use as an anticancer drug.

Relevance:

30.00%

Publisher:

Abstract:

Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often based on theoretical descriptions of the methods. An engineer, however, requires some evidence from experimental evaluations in order to make the appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyze a set of representative state-of-the-art techniques for the problem we are dealing with, namely, the road passenger transportation problem. This is an optimization problem in which drivers should be assigned to transport services, fulfilling some constraints and minimizing some cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on our experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
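As a hedged illustration of the core assignment underlying this problem, the sketch below solves a plain cost-minimizing driver-to-service assignment; the cost matrix is invented, and real instances add constraints (driving-time rules, breaks, vehicle compatibility) that this simple formulation ignores.

```python
# Hedged sketch of the driver-to-service assignment at the heart of the road
# passenger transportation problem, solved as a plain linear assignment.
# Costs are invented; the surveyed techniques handle much richer constraints.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j]: cost of assigning driver i to service j (e.g. hours + penalties)
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])

drivers, services = linear_sum_assignment(cost)
for d, s in zip(drivers, services):
    print(f"driver {d} -> service {s} (cost {cost[d, s]})")
print("total cost:", cost[drivers, services].sum())
```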