856 results for REPRODUCIBILITY OF RESULTS


Relevance: 90.00%

Abstract:

Binding, D.; Couch, M.A.; Sujatha, K.S.; Webster, M.F. (2003) 'Experimental and numerical simulation of dough kneading in filled geometries', Journal of Food Engineering 58, pp. 111-123.

Relevance: 90.00%

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 90.00%

Abstract:

BACKGROUND: The Framingham Heart Study (FHS), founded in 1948 to examine the epidemiology of cardiovascular disease, is among the most comprehensively characterized multi-generational studies in the world. Many collected phenotypes have substantial genetic contributors, yet most genetic determinants remain to be identified. Using single nucleotide polymorphisms (SNPs) from a 100K genome-wide scan, we examine the associations of common polymorphisms with phenotypic variation in this community-based cohort and provide a full-disclosure, web-based resource of results for future replication studies.

METHODS: Adult participants (n = 1345) of the largest 310 pedigrees in the FHS, many biologically related, were genotyped with the 100K Affymetrix GeneChip. These genotypes were used to assess their contribution to 987 phenotypes collected in FHS over 56 years of follow-up, including: cardiovascular risk factors and biomarkers; subclinical and clinical cardiovascular disease; cancer and longevity traits; and traits in pulmonary, sleep, neurology, renal, and bone domains. We conducted genome-wide variance components linkage and population-based and family-based association tests.

RESULTS: The participants were white of European descent and from the FHS Original and Offspring Cohorts (examination 1 Offspring mean age 32 ± 9 years, 54% women). This overview summarizes the methods, selected findings and limitations of the results presented in the accompanying series of 17 manuscripts. The presented association results are based on 70,897 autosomal SNPs meeting the following criteria: minor allele frequency ≥ 10%, genotype call rate ≥ 80%, Hardy-Weinberg equilibrium p-value ≥ 0.001, and satisfying Mendelian consistency. Linkage analyses are based on 11,200 SNPs and short-tandem repeats. Results of phenotype-genotype linkages and associations for all autosomal SNPs are posted on the NCBI dbGaP website at http://www.ncbi.nlm.nih.gov/projects/gap/cgi-bin/study.cgi?id=phs000007.

CONCLUSION: We have created a full-disclosure resource of results, posted on the dbGaP website, from a genome-wide association study in the FHS. Because we used three analytical approaches to examine the association and linkage of 987 phenotypes with thousands of SNPs, our results must be considered hypothesis-generating and need to be replicated. Results from the FHS 100K project with NCBI web posting provide a resource for investigators to identify high-priority findings for replication.
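The SNP inclusion criteria described above amount to a simple per-marker filter. The sketch below is purely illustrative: the field names and example records are invented, not drawn from the FHS dataset.

```python
# Hypothetical sketch of the reported SNP quality-control filter:
# MAF >= 10%, genotype call rate >= 80%, HWE p-value >= 0.001,
# and Mendelian consistency. Example data is fabricated.

def passes_qc(maf, call_rate, hwe_p, mendelian_consistent):
    """Return True if a SNP meets the reported inclusion criteria."""
    return (maf >= 0.10
            and call_rate >= 0.80
            and hwe_p >= 0.001
            and mendelian_consistent)

snps = [
    {"id": "rs0001", "maf": 0.25, "call_rate": 0.95, "hwe_p": 0.40, "mendel_ok": True},
    {"id": "rs0002", "maf": 0.05, "call_rate": 0.99, "hwe_p": 0.30, "mendel_ok": True},  # fails MAF
]
kept = [s["id"] for s in snps
        if passes_qc(s["maf"], s["call_rate"], s["hwe_p"], s["mendel_ok"])]
```

Note that the thresholds are inclusive ("greater than or equal to" in the abstract), so boundary values pass.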

Relevance: 90.00%

Abstract:

In a recent paper, Structural Analysis of Network Traffic Flows, we analyzed the set of origin-destination traffic flows from the Sprint-Europe and Abilene backbone networks. This report presents the complete set of results from analyzing data from both networks. The results in this report are specific to the Sprint-1 and Abilene datasets studied in the above paper. The following results are presented, each for both the Sprint-1 and Abilene datasets: (1) rows of the principal matrix (V); (2) the set of eigenflows; and (3) classification of the eigenflows.
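The "principal matrix" and "eigenflows" named above come from a PCA/SVD of the origin-destination traffic matrix. A minimal sketch on synthetic data (not the authors' code or datasets) is:

```python
# Sketch: decompose a traffic matrix X (time x OD-flow) with SVD.
# Columns of U scaled by the singular values are the "eigenflows"
# (temporal patterns); rows of V give each OD flow's loadings on
# the principal components. Data here is random, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
t, p = 100, 8                      # 100 time bins, 8 synthetic OD flows
X = rng.normal(size=(t, p))        # stand-in for measured traffic volumes
Xc = X - X.mean(axis=0)            # centre each flow's time series

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenflows = U * s                 # one temporal pattern per component
V = Vt.T                           # rows of V = per-flow loadings

# Fraction of total variance captured by the first two components:
explained = (s[:2] ** 2).sum() / (s ** 2).sum()
```

The decomposition is exact: multiplying the eigenflows back by Vt reconstructs the centred traffic matrix.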

Relevance: 90.00%

Abstract:

BACKGROUND: Measurement of CD4+ T-lymphocytes (CD4) is a crucial parameter in the management of HIV patients, particularly in determining eligibility to initiate antiretroviral treatment (ART). A number of technologies exist for CD4 enumeration, with considerable variation in cost, complexity, and operational requirements. We conducted a systematic review of the performance of technologies for CD4 enumeration.

METHODS AND FINDINGS: Studies were identified by searching the electronic databases MEDLINE and EMBASE using a pre-defined search strategy. Data on test accuracy and precision included bias and limits of agreement with a reference standard, and misclassification probabilities around CD4 thresholds of 200 and 350 cells/μl over a clinically relevant range. The secondary outcome measure was test imprecision, expressed as % coefficient of variation. Thirty-two studies evaluating 15 CD4 technologies were included, of which fewer than half presented data on bias and misclassification compared to the same reference technology. At CD4 counts <350 cells/μl, bias ranged from -35.2 to +13.1 cells/μl, while at counts >350 cells/μl, bias ranged from -70.7 to +47 cells/μl, compared to the BD FACSCount as a reference technology. Misclassification around the threshold of 350 cells/μl ranged from 1-29% for upward classification, resulting in under-treatment, and 7-68% for downward classification, resulting in over-treatment. Fewer than half of these studies reported within-laboratory precision or reproducibility of the CD4 values obtained.

CONCLUSIONS: A wide range of bias and percent misclassification around treatment thresholds was reported for the CD4 enumeration technologies included in this review, with few studies reporting assay precision. The lack of standardised methodology on test evaluation, including the use of different reference standards, is a barrier to assessing relative assay performance and could hinder the introduction of new point-of-care assays in countries where they are most needed.
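Bias, limits of agreement and threshold misclassification of the kind tabulated in such reviews can be computed as below. The counts are fabricated for illustration, and the 1.96-SD limits follow the usual Bland-Altman convention, which is an assumption here since the review compares studies that used differing methods.

```python
# Illustrative accuracy metrics for a CD4 device against a reference:
# mean bias, 95% limits of agreement, and misclassification around
# the 350 cells/ul treatment threshold. All counts are synthetic.
import statistics

reference = [180, 250, 320, 400, 500, 610]   # reference CD4 counts (cells/ul)
test_dev  = [170, 262, 300, 415, 480, 640]   # device under test

diffs = [t - r for t, r in zip(test_dev, reference)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

THRESHOLD = 350
# upward misclassification: reference below threshold, test at/above it
up = sum(1 for t, r in zip(test_dev, reference) if r < THRESHOLD and t >= THRESHOLD)
# downward misclassification: reference at/above threshold, test below it
down = sum(1 for t, r in zip(test_dev, reference) if r >= THRESHOLD and t < THRESHOLD)
```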

Relevance: 90.00%

Abstract:

BACKGROUND: In recent years, large bibliographic databases have made much of the published literature of biology available for searches. However, the capabilities of the search engines integrated into these databases for text-based bibliographic searches are limited. To enable searches that deliver the results expected by comparative anatomists, an underlying logical structure known as an ontology is required.

DEVELOPMENT AND TESTING OF THE ONTOLOGY: Here we present the Mammalian Feeding Muscle Ontology (MFMO), a multi-species ontology focused on anatomical structures that participate in feeding and other oral/pharyngeal behaviors. A unique feature of the MFMO is that a simple, computable definition of each muscle, which includes its attachments and innervation, is true across mammals. This construction mirrors the logical foundation of comparative anatomy and permits searches using language familiar to biologists. Further, it provides a template for muscles that will be useful in extending any anatomy ontology. The MFMO was developed to support the Feeding Experiments End-User Database Project (FEED, https://feedexp.org/), a publicly available, online repository for physiological data collected from in vivo studies of feeding (e.g., mastication, biting, swallowing) in mammals. Currently the MFMO is integrated into FEED and also into two literature-specific implementations of Textpresso, a text-mining system that facilitates powerful searches of a corpus of scientific publications. We evaluate the MFMO by asking questions that test the ability of the ontology to return appropriate answers (competency questions). We compare the results of queries of the MFMO to results from similar searches in PubMed and Google Scholar.

RESULTS AND SIGNIFICANCE: Our tests demonstrate that the MFMO is competent to answer queries formed in the common language of comparative anatomy, but PubMed and Google Scholar are not. Overall, our results show that by incorporating anatomical ontologies into searches, an expanded and anatomically comprehensive set of results can be obtained. The broader scientific and publishing communities should consider taking up the challenge of semantically enabled search capabilities.

Relevance: 90.00%

Abstract:

As announced in the November 2000 issue of MathStats&OR [1], one of the projects supported by the Maths, Stats & OR Network funds is an international survey of research into pedagogic issues in statistics and OR. I am taking the lead on this and report here on the progress made during the first year. A paper giving some background to the project and describing initial thinking on how it might be implemented was presented at the 53rd session of the International Statistical Institute in Seoul, Korea, in August 2001, in a session on The future of statistics education research [2]. It sounded easy. I considered myself something of an expert on surveys, having lectured on the topic for many years and having helped students and others who were doing surveys, particularly with the design of their questionnaires. Surely all I had to do was draft a few questions, send them electronically to colleagues in statistical education who would be only too happy to respond, and summarise their responses? I should have learnt from my experience of advising all those students who thought that doing a survey was easy and to whom I had to explain that their ideas were too ambitious. There are several inter-related stages in survey research, and it is important to think about these before rushing into the collection of data. In the case of the survey in question, this planning stage revealed several challenges. Surveys are usually done for a purpose, so even before planning how to do them it is advisable to think about the final product and the dissemination of results. This is the route I followed.

Relevance: 90.00%

Abstract:

This study presents a CFD analysis constructed around PHYSICA, an open framework for multi-physics computational continuum mechanics modelling, to investigate water movement in unsaturated porous media. The modelling environment is based on a cell-centred finite-volume discretisation technique. A number of test cases are performed in order to validate the correct implementation of Richards' equation for compressible and incompressible fluids. The pressure-head form of the equation is used together with the constitutive relationships between pressure, volumetric water content and hydraulic conductivity described by the Haverkamp and van Genuchten models. The flow problems presented are associated with infiltration into initially dry soils with homogeneous or layered geologic settings. Comparison of results with problems selected from the literature shows good agreement and validates the approach and the implementation.
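As an illustration of the van Genuchten constitutive relationship mentioned above, the sketch below maps pressure head to volumetric water content. The parameter values are generic, textbook-style numbers loosely typical of a sandy soil, not the paper's test-case values.

```python
# van Genuchten water-retention sketch: theta(h) for pressure head h (cm),
# with negative h indicating unsaturated conditions. Parameters are
# illustrative defaults, not taken from the paper.

def theta_vg(h_cm, theta_r=0.045, theta_s=0.43, alpha=0.145, n=2.68):
    """Volumetric water content at pressure head h_cm (cm of water)."""
    if h_cm >= 0.0:
        return theta_s                                  # saturated zone
    m = 1.0 - 1.0 / n                                   # Mualem constraint
    Se = (1.0 + (alpha * abs(h_cm)) ** n) ** (-m)       # effective saturation
    return theta_r + (theta_s - theta_r) * Se
```

The curve is monotone: water content falls from the saturated value theta_s toward the residual value theta_r as the soil dries (h more negative), which is the behaviour the infiltration test cases rely on.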

Relevance: 90.00%

Abstract:

Very short-lived halocarbons are significant sources of reactive halogen in the marine boundary layer, and likely in the upper troposphere and lower stratosphere. Quantifying ambient concentrations in the surface ocean and atmosphere is essential for understanding the atmospheric impact of these trace gas fluxes. Despite the body of literature increasing substantially over recent years, calibration issues complicate the comparison of results and limit the utility of building larger-scale databases that would enable further development of the science (e.g. sea-air flux quantification, model validation, etc.). With this in mind, thirty-one scientists from both atmospheric and oceanic halocarbon communities in eight nations gathered in London in February 2008 to discuss the scientific issues and plan an international effort toward developing common calibration scales (http://tinyurl.com/c9cg58). Here, we discuss the outputs from this meeting, suggest the compounds that should be targeted initially, identify opportunities for beginning calibration and comparison efforts, and make recommendations for ways to improve the comparability of previous and future measurements.

Relevance: 90.00%

Abstract:

Ecohydrodynamics investigates the hydrodynamic constraints on ecosystems across different temporal and spatial scales. Ecohydrodynamics plays a pivotal role in the structure and functioning of marine ecosystems; however, the lack of integrated complex flow models for deep-water ecosystems beyond the coastal zone prevents further synthesis in these settings. We present a hydrodynamic model for one of Earth's most biologically diverse deep-water ecosystems, cold-water coral reefs. The Mingulay Reef Complex (western Scotland) is an inshore seascape of cold-water coral reefs formed by the scleractinian coral Lophelia pertusa. We applied single-image edge detection and composite front maps using satellite remote sensing to detect oceanographic fronts and peaks of chlorophyll a values that likely affect food supply to corals and other suspension-feeding fauna. We also present a high-resolution 3D ocean model that incorporates salient aspects of the regional and local oceanography. Model validation using in situ current speed, direction and sea elevation data confirmed the model's realistic representation of spatial and temporal aspects of circulation at the reef complex, including a tidally driven current regime, eddies, and downwelling phenomena. This novel combination of 3D hydrodynamic modelling and remote sensing in deep-water ecosystems improves our understanding of the temporal and spatial scales of ecological processes occurring in marine systems. The modelled information has been integrated into a 3D GIS, providing a user interface for visualization and interrogation of results that allows wider ecological application of the model and can provide valuable input for marine biodiversity and conservation applications.
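Front detection on a satellite chlorophyll or temperature field amounts to flagging locations where the spatial gradient is large. The toy stand-in below uses a plain gradient-magnitude threshold on a synthetic field; it is not the specific single-image edge detection algorithm used in the paper.

```python
# Generic gradient-based "front" detection on a synthetic 2D field.
# A sharp jump between columns simulates an oceanographic front.
import numpy as np

field = np.zeros((6, 6))
field[:, 3:] = 1.0                       # step "front" between columns 2 and 3
gy, gx = np.gradient(field)              # central differences in y and x
grad_mag = np.hypot(gx, gy)              # gradient magnitude
front = grad_mag > 0.25                  # threshold flags the front pixels
```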

Relevance: 90.00%

Abstract:

It is standard clinical practice to use a combination of two or more antimicrobial agents to treat an infection caused by Pseudomonas aeruginosa. The antibiotic combinations are usually selected empirically, with methods to determine the antimicrobial effect of the combination, such as the time-kill assay, rarely used as they are time-consuming and labour-intensive to perform. Here, we report a modified time-kill assay, based on the reduction of the tetrazolium salt 2,3-bis[2-methoxy-4-nitro-5-sulfophenyl]-2H-tetrazolium-5-carboxanilide (XTT), that allows simple, inexpensive and more rapid determination of the in vitro activity of antibiotic combinations against P. aeruginosa. The assay was used to determine the in vitro activity of ceftazidime and tobramycin in combination against P. aeruginosa isolates from cystic fibrosis patients, and the results obtained were compared with those from conventional viable count time-kill assays. There was good agreement in the interpretation of results obtained by the XTT and conventional viable count assays, with similar growth curves apparent and the most effective concentration combinations determined by both methods identical for all isolates tested. The XTT assay clearly indicated whether an antibiotic combination had a synergistic, indifferent or antagonistic effect and could, therefore, provide a useful method for rapidly determining the activity of a large number of antibiotic combinations against clinical isolates. (C) 2004 Elsevier B.V. All rights reserved.
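The synergy/indifference/antagonism call made from time-kill data is conventionally based on the change in log10 CFU/ml of the combination relative to the most active single agent; the 2-log10 cutoffs below follow that common convention and are not taken verbatim from this paper.

```python
# Classify an antibiotic combination from viable counts (CFU/ml) at a
# fixed time point, using the conventional >=2 log10 criterion:
# >=2 log10 below the best single agent = synergy,
# >=2 log10 above it = antagonism, otherwise indifference.
import math

def interpret_combination(cfu_combo, cfu_drug_a, cfu_drug_b):
    """Classify a combination from endpoint viable counts."""
    best_single = min(cfu_drug_a, cfu_drug_b)
    delta = math.log10(cfu_combo) - math.log10(best_single)
    if delta <= -2.0:
        return "synergy"
    if delta >= 2.0:
        return "antagonism"
    return "indifference"
```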

Relevance: 90.00%

Abstract:

Epoxides and phosphites are often used as additives to stabilize the properties of polymers, including bisphenol A polycarbonate (BPA-PC). We describe density functional (DF) calculations of the reactions of cyclohexene oxide (CHO, cyclohexane epoxide) and phosphites with chain segments of BPA-PC, with the aim of identifying possible reaction paths and energy barriers. The reactions of CHO with the OH-terminated PC chains and with the carbonate group are exothermic, although there is an energy barrier in each case of more than 10 kcal/mol. A comparison of results for different CHO isomers demonstrates the importance of steric effects. The reactions between the same groups of the PC chain and the phosphites 2-[2,4-bis(tert-butyl)phenoxy]-5,5-dimethyl-1,3,2-dioxaphosphorinane (BPDD) and trimethyl phosphite (TMP), and their phosphonate isomers, are characterized by large energy barriers.

Relevance: 90.00%

Abstract:

Transparency in the nonprofit sector and in foundations, as an element that enhances stakeholders' confidence in the organization, has been documented by several studies in recent decades. Transparency can be considered in various fields and through different channels. In our study we focus on the organizational and economic transparency of foundations, as shown through the voluntary information published on their websites. We review previous theoretical studies in order to place foundations within the framework of the social economy. This theoretical framework centres on the accountability that foundations provide in relation to their social function and their management, especially given the recent focus on information transparency via the website. Within this framework we constructed an index to quantify the voluntary information shown on each foundation's website. The index was developed ad hoc for this study and applied to a group of large corporate foundations. Applying it yields two kinds of results, at a descriptive level and at an inferential level. We analyzed the statistical correlation between the economic transparency and the organizational transparency offered on the website, using quantified variables in a multiple linear regression. This empirical analysis allows us to draw conclusions about the level of transparency these organizations offer in relation to their organizational and financial information, as well as to explain the relation between the two.
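A multiple linear regression of the kind described can be sketched as below. All scores are fabricated placeholders, and the variable names are invented for illustration; they are not the study's actual index values or variables.

```python
# Illustrative OLS fit: regress an economic-transparency score on
# organizational-transparency variables, then report R^2.
import numpy as np

# Each row: [org_transparency, website_usability]; target: economic transparency
X = np.array([[2.0, 1.0], [3.0, 2.0], [4.0, 2.5], [5.0, 3.0], [6.0, 4.0]])
y = np.array([1.9, 3.1, 3.9, 5.2, 6.0])

A = np.column_stack([np.ones(len(X)), X])          # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # ordinary least squares
y_hat = A @ coef
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```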

Relevance: 90.00%

Abstract:

Aim. This paper is a report of a study to describe how treatment fidelity is being enhanced and monitored, using a model from the National Institutes of Health Behavior Change Consortium. Background. The objective of treatment fidelity is to minimize errors in interpreting research trial outcomes, and to ascribe those outcomes directly to the intervention at hand. Treatment fidelity procedures are included in trials of complex interventions to account for inferences made from study outcomes. Monitoring treatment fidelity can help improve study design, maximize reliability of results, increase statistical power, determine whether theory-based interventions are responsible for observed changes, and inform the research dissemination process. Methods. Treatment fidelity recommendations from the Behavior Change Consortium were applied to the SPHERE study (Secondary Prevention of Heart DiseasE in GeneRal PracticE), a randomized controlled trial of a complex intervention. Procedures to enhance and monitor intervention implementation included standardizing training sessions, observing intervention consultations, structuring patient recall systems, and using written practice and patient care plans. The research nurse plays an important role in monitoring intervention implementation. Findings. Several methods of applying treatment fidelity procedures to monitoring interventions are possible. The procedure used may be determined by availability of appropriate personnel, fiscal constraints, or time limits. Complex interventions are not straightforward and necessitate a monitoring process at trial stage. Conclusion. The Behavior Change Consortium’s model of treatment fidelity is useful for structuring a system to monitor the implementation of a complex intervention, and helps to increase the reliability and validity of evaluation findings.

Relevance: 90.00%

Abstract:

The present report investigates the role of formate species as potential reaction intermediates for the WGS reaction (CO + H2O → CO2 + H2) over a Pt-CeO2 catalyst. A combination of operando techniques, i.e., in situ diffuse reflectance FT-IR (DRIFT) spectroscopy and mass spectrometry (MS) during steady-state isotopic transient kinetic analysis (SSITKA), was used to relate the exchange of the reaction product CO2 to that of surface formate species. The data presented here suggest that a switchover from a non-formate to a formate-based mechanism could take place over a very narrow temperature range (as low as 60 K) over our Pt-CeO2 catalyst. This observation clearly stresses the need to avoid extrapolating conclusions to the case of results obtained under even slightly different experimental conditions. The occurrence of a low-temperature mechanism, possibly redox or Mars-van Krevelen-like, that deactivates above 473 K because of ceria over-reduction is suggested as a possible explanation for the switchover, similarly to the case of the CO-NO reaction over Cu-, Pd- and Rh-CeZrOx (see Kaspar and co-workers [1-3]). (c) 2006 Elsevier B.V. All rights reserved.