916 results for quantitative data


Relevance: 70.00%

Abstract:

Dissertation presented to obtain the degree of Doctor in Informatics at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance: 70.00%

Abstract:

Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in Western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and, consequently, water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted so that model outputs (runoff volume, peak runoff rate and sediment concentration) closely matched the sparse observed data. A modelling grid cell resolution of 150 m gave appropriate results at an acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, together with experience reported in the literature, was able to compensate in part for the limited calibration data. Several scenarios (actual, recommended and excessive manure applications, and point-source pollution from swine operations) could be compared with the model, using a relative ranking rather than quantitative predictions.
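
As a hedged illustration of the kind of rainfall-runoff relationship such a calibration adjusts, the sketch below applies the SCS curve-number equation (the runoff method commonly associated with AGNPS-type models) with one curve number per rainfall range; the curve-number values are hypothetical and this is not the study's calibration.

```python
# Minimal sketch: SCS curve-number runoff, the kind of relationship an
# AGNPS-style calibration adjusts; the curve numbers below are hypothetical.

def scs_runoff_mm(rain_mm: float, cn: float) -> float:
    """Runoff depth (mm) from the SCS curve-number equation (metric form)."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm + 0.8 * s)

# One curve number per rainfall range, mirroring the three calibration ranges
# (< 25 mm, 25-60 mm, > 60 mm); values are illustrative only.
CN_BY_RANGE = {"low": 78.0, "mid": 82.0, "high": 86.0}

def runoff_for_event(rain_mm: float) -> float:
    if rain_mm < 25:
        cn = CN_BY_RANGE["low"]
    elif rain_mm <= 60:
        cn = CN_BY_RANGE["mid"]
    else:
        cn = CN_BY_RANGE["high"]
    return scs_runoff_mm(rain_mm, cn)

print(round(runoff_for_event(40.0), 1))  # runoff depth (mm) for a 40 mm event
```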

Relevance: 70.00%

Abstract:

Cedrela odorata L. (Meliaceae) occurs in the Atlantic forest, the Amazon rain forest, riparian forests across the country, and wetlands, environments that demand species adapted to their water conditions. Studies in ecological wood anatomy have demonstrated that variation in climatic factors directly influences wood anatomical structure, and that fragmentation of natural habitats causes an edge effect that alters the abiotic conditions of a site and consequently interferes with its vegetation. A comparative analysis of 20 quantitative anatomical features of the wood was performed on populations of Cedrela odorata growing in the interior and at the edge of a swamp forest, and a granulometric analysis of the soil was carried out. The quantitative data were submitted to the nonparametric Mann-Whitney test, which showed a statistically significant decrease in the means of eleven wood features for the specimens growing at the edge of the swamp forest.
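
A minimal sketch of the feature-by-feature comparison described above, using SciPy's Mann-Whitney U test; the measurements and the chosen feature (vessel diameter) are hypothetical.

```python
# Hypothetical vessel-diameter measurements (µm) for interior and edge trees;
# the Mann-Whitney U test compares the distributions without assuming normality.
from scipy.stats import mannwhitneyu

interior = [95.2, 102.4, 98.7, 110.3, 101.5, 99.8, 104.1]
edge = [88.1, 90.5, 85.9, 92.3, 87.4, 91.0, 86.6]

stat, p_value = mannwhitneyu(interior, edge, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")  # p < 0.05 -> significant difference
```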

Relevance: 70.00%

Abstract:

This seminar is a research discussion around a very interesting problem, which may be a good basis for a WAISfest theme. A little over a year ago Professor Alan Dix came to tell us of his plans for a magnificent adventure: to walk all the way round Wales, 1,000 miles, 'Alan Walks Wales'. The walk was a personal journey, but also a technological and community one, exploring the needs of the walker and the people along the way. Whilst walking he recorded his thoughts in an audio diary, took lots of photos, wrote a blog and collected data from the tech instruments he was wearing. As a result Alan has extensive quantitative data (bio-sensing and location) and qualitative data (text, images and some audio). There are challenges in analysing the individual kinds of data, including merging similar data streams, entity identification, time-series and textual data mining, dealing with provenance, and ontologies for paths and journeys. There are also challenges for author and third-party annotation, and for linking the datasets and visualising the merged narrative or facets of it.
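
As a hedged sketch of the stream-merging challenge mentioned above, the snippet below aligns two hypothetical time-stamped streams (GPS fixes and heart-rate samples) on nearest timestamps with pandas; the column names, timestamps and tolerance are illustrative, not Alan's actual data.

```python
# Sketch: aligning two time-stamped sensor streams on nearest timestamps.
import pandas as pd

gps = pd.DataFrame({
    "time": pd.to_datetime(["2013-05-01 09:00:05", "2013-05-01 09:00:35"]),
    "lat": [52.416, 52.417],
    "lon": [-4.083, -4.082],
})
heart_rate = pd.DataFrame({
    "time": pd.to_datetime(["2013-05-01 09:00:00", "2013-05-01 09:00:30"]),
    "bpm": [102, 110],
})

# merge_asof requires both frames sorted on the key; match within 10 seconds.
merged = pd.merge_asof(gps.sort_values("time"), heart_rate.sort_values("time"),
                       on="time", direction="nearest",
                       tolerance=pd.Timedelta("10s"))
print(merged)
```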

Relevance: 70.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 70.00%

Abstract:

Advances in biomedical signal acquisition systems for motion analysis have led to low-cost and ubiquitous wearable sensors, which can be used to record movement data in different settings. This implies the potential availability of large amounts of quantitative data. It is therefore crucial to identify and extract the clinically relevant information from the large amount of available data. This quantitative and objective information can be an important aid to clinical decision making. Data mining is the process of discovering such information in databases through data processing, selection of informative data, and identification of relevant patterns. The databases considered in this thesis store motion data from wearable sensors (specifically accelerometers) and clinical information (clinical data, scores, tests). The main goal of this thesis is to develop data mining tools that can provide quantitative information to the clinician in the field of movement disorders. The thesis focuses on motor impairment in Parkinson's disease (PD). Different databases related to Parkinson's subjects at different stages of the disease were considered. Each database is characterized by the data recorded during a specific motor task performed by different groups of subjects. The data mining techniques used in this thesis are feature selection (used to find relevant information and to discard useless or redundant data), classification, clustering, and regression. The aims were to identify subjects at high risk for PD, to characterize the differences between early PD subjects and healthy ones, to characterize PD subtypes, and to automatically assess the severity of symptoms in the home setting.
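
A minimal sketch of a feature-selection plus classification pipeline of the kind described, run on synthetic accelerometer-derived features; it is not the thesis code, and the feature counts and labels are hypothetical.

```python
# Sketch: feature selection followed by classification on synthetic features.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 20))        # 80 subjects, 20 movement features (synthetic)
y = rng.integers(0, 2, size=80)      # 0 = control, 1 = PD (synthetic labels)

pipeline = make_pipeline(
    SelectKBest(score_func=f_classif, k=5),  # keep the 5 most informative features
    RandomForestClassifier(n_estimators=200, random_state=0),
)
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```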

Relevance: 70.00%

Abstract:

BACKGROUND: Flavobacterium psychrophilum is the agent of Bacterial Cold Water Disease and Rainbow Trout Fry Syndrome, two diseases leading to high mortality. Pathogen detection is mainly carried out using cultures, and more rapid and sensitive methods are needed. RESULTS: We describe a qPCR technique based on the single-copy gene rpoC, which encodes the β' subunit of the DNA-dependent RNA polymerase. Its detection limit was 20 gene copies and its quantification limit 10³ gene copies per reaction. Tests on spiked spleens with known concentrations of F. psychrophilum (10⁶ to 10¹ cells per reaction) showed no cross-reactions between the spleen tissue and the primers and probe. Screening of water samples and spleens from symptomless and infected fish indicated that the pathogen was already present before the outbreaks, but F. psychrophilum was only quantifiable in spleens from diseased fish. CONCLUSIONS: This qPCR can be used as a highly sensitive and specific method to detect F. psychrophilum in different sample types without the need for culturing. It allows reliable detection and quantification of F. psychrophilum in samples with low pathogen densities. Quantitative data on F. psychrophilum abundance could be useful for investigating risk factors linked to infections and as an early-warning system prior to potentially devastating outbreaks.
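
As a hedged sketch of how gene-copy numbers are typically derived from qPCR quantification-cycle (Cq) values via a log-linear standard curve; the slope and intercept below are hypothetical, not the assay's actual calibration.

```python
# Sketch: converting qPCR Cq values to copy numbers with a standard curve.
SLOPE = -3.32       # slope implying ~100% amplification efficiency (hypothetical)
INTERCEPT = 38.5    # Cq expected for a single copy (hypothetical)

def copies_from_cq(cq: float) -> float:
    """Copies per reaction, assuming Cq = SLOPE * log10(copies) + INTERCEPT."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

def efficiency(slope: float) -> float:
    """Amplification efficiency implied by the standard-curve slope."""
    return 10 ** (-1.0 / slope) - 1.0

print(round(copies_from_cq(28.5)))   # roughly 1,000 copies per reaction
print(f"{efficiency(SLOPE):.1%}")    # roughly 100% efficiency
```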

Relevance: 70.00%

Abstract:

The long-term integrity of protected areas (PAs), and hence the maintenance of related ecosystem services (ES), depends on the support of local people. In the present study, local people's perceptions of ecosystem services from PAs and the factors that govern local preferences for PAs are assessed. Fourteen study villages were randomly selected from three different protected forest areas and one control site along the southern coast of Côte d'Ivoire. Data were collected through a mixed-method approach, including qualitative semi-structured interviews and a household survey based on hypothetical choice scenarios. Local people's perceptions of ecosystem service provision were examined through qualitative content analysis, while the relationship between people's preferences and potential explanatory factors was analyzed with multinomial models. This study shows that rural villagers do perceive a number of different ecosystem services as benefits from PAs in Côte d'Ivoire. The results based on quantitative data also suggest that local preferences for PAs and related ecosystem services are driven by the PAs' management rules, age, and people's dependence on natural resources.
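
A minimal sketch of a multinomial model relating stated PA preferences to household factors, using statsmodels; the data are synthetic and the variable names (age, resource dependence, rule strictness) are hypothetical stand-ins for the survey variables.

```python
# Sketch: multinomial logit of a three-way PA-scenario choice on household factors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "age": rng.integers(18, 70, size=n),
    "resource_dependence": rng.uniform(0, 1, size=n),  # share of income from forest
    "strict_rules": rng.integers(0, 2, size=n),        # PA with strict access rules
})
df["choice"] = rng.integers(0, 3, size=n)              # synthetic choice among 3 scenarios

X = sm.add_constant(df[["age", "resource_dependence", "strict_rules"]])
model = sm.MNLogit(df["choice"], X).fit(disp=False)
print(model.summary())
```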

Relevance: 70.00%

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 fields to 5,019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70-5,019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor's Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi processes identified 9 factors that were not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate the content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly high for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working-memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
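
A hedged sketch of the pooled error-rate arithmetic behind figures expressed as errors per 10,000 fields; the study counts below are hypothetical, not the pooled values reported above.

```python
# Sketch: per-study and pooled error rates, expressed per 10,000 fields.
studies = [
    {"errors": 120, "fields_inspected": 250_000},    # e.g. double data entry (hypothetical)
    {"errors": 4_300, "fields_inspected": 60_000},   # e.g. record abstraction (hypothetical)
]

total_errors = sum(s["errors"] for s in studies)
total_fields = sum(s["fields_inspected"] for s in studies)

for s in studies:
    rate = s["errors"] / s["fields_inspected"] * 10_000
    print(f"{rate:.1f} errors per 10,000 fields")

print(f"pooled: {total_errors / total_fields * 10_000:.1f} errors per 10,000 fields")
```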

Relevance: 70.00%

Abstract:

The present dataset contains the source data for Figure 5a of Schilling et al., 2009. Cell fate decisions are regulated by the coordinated activation of signalling pathways such as the extracellular signal-regulated kinase (ERK) cascade, but the contributions of individual kinase isoforms are mostly unknown. The authors combined quantitative data from erythropoietin-induced pathway activation in primary erythroid progenitor (colony-forming unit erythroid stage, CFU-E) cells with mathematical modelling to predict, and experimentally confirm, a distributive ERK phosphorylation mechanism in CFU-E cells. The authors found evidence that double-phosphorylated ERK1 attenuates proliferation beyond a certain activation level, whereas activated ERK2 enhances proliferation with saturation kinetics. CFU-E cells were stimulated with the indicated Epo concentrations for 7 min, and phosphorylation levels were determined by quantitative immunoblotting.
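
As an illustration of what a distributive mechanism means (the kinase releases ERK between the two phosphorylation steps, so the mono-phosphorylated form accumulates before the double-phosphorylated one), the sketch below integrates a toy two-step mass-action scheme; it is not the authors' published model, and the rate constants are hypothetical.

```python
# Sketch: toy distributive two-step phosphorylation ERK -> pERK -> ppERK.
from scipy.integrate import solve_ivp

k1, k2, k_dephos = 0.5, 0.5, 0.1   # 1/min, hypothetical rate constants

def rhs(t, y):
    erk, p_erk, pp_erk = y
    return [
        -k1 * erk + k_dephos * p_erk,
        k1 * erk - k2 * p_erk - k_dephos * p_erk + k_dephos * pp_erk,
        k2 * p_erk - k_dephos * pp_erk,
    ]

sol = solve_ivp(rhs, (0, 7), [1.0, 0.0, 0.0], t_eval=[0, 1, 3, 5, 7])
for t, pp in zip(sol.t, sol.y[2]):
    print(f"t = {t:.0f} min, ppERK fraction = {pp:.2f}")
```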

Relevance: 70.00%

Abstract:

Since pork barrel funds are crucial for buying off voters, competition among legislators over their distribution has been considered one of the main factors producing congressional political dynamism and shaping congressional institutions. This paper aims to test the theory of pork barrel distribution in the Philippines through OLS regression on quantitative data from the 12th Congress. The results show that some attributes of legislators are statistically significant in estimating pork barrel allocations, but they do not support the hypothesis that legislators' proximity to leaders is a determining factor in the distribution.
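
A minimal sketch of the kind of OLS specification described (allocations regressed on legislator attributes), using statsmodels on synthetic data; the variable names (seniority, majority-bloc membership, proximity to leaders) and values are hypothetical.

```python
# Sketch: OLS of pork barrel allocation on legislator attributes (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 220
df = pd.DataFrame({
    "seniority_terms": rng.integers(1, 4, size=n),
    "majority_bloc": rng.integers(0, 2, size=n),
    "proximity_to_leaders": rng.uniform(0, 1, size=n),
})
df["allocation_m_pesos"] = (
    60 + 5 * df["seniority_terms"] + 8 * df["majority_bloc"] + rng.normal(0, 10, size=n)
)

X = sm.add_constant(df[["seniority_terms", "majority_bloc", "proximity_to_leaders"]])
result = sm.OLS(df["allocation_m_pesos"], X).fit()
print(result.summary())
```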

Relevance: 70.00%

Abstract:

Effective transcript profiling in animal systems requires isolation of homogeneous tissue or cells followed by faithful mRNA amplification. Linear amplification based on cDNA synthesis and in vitro transcription is reported to maintain the representation of mRNA levels; however, quantitative data demonstrating this, as well as a description of the inherent limitations, have been lacking. We show that published protocols produce a template-independent product in addition to amplifying real target mRNA, thus reducing the specific activity of the final product. We describe a modified amplification protocol that minimizes the generation of template-independent product and can therefore generate the desired microgram quantities of message-derived material from 100 ng of total RNA. Application of a second, nested round of cDNA synthesis and in vitro transcription reduces the required starting material to 2 ng of total RNA. Quantitative analysis of these products on Caenorhabditis elegans Affymetrix GeneChips shows that this amplification does not reduce overall sensitivity and has only minor effects on fidelity.

Relevance: 70.00%

Abstract:

BACKGROUND: Researchers evaluating angiomodulating compounds as part of scientific projects or pre-clinical studies are often confronted with the limitations of the applied animal models. Rough and insufficient early-stage compound assessment without reliable quantification of the vascular response contributes, at least partially, to the low rate of transition to the clinic. OBJECTIVE: To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data on drug effects in a non-biased and time-efficient way. APPROACH & RESULTS: Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters, including "graph energy" and "distance to farthest node". The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The use of a reference point (vascular parameters prior to amputation) is unique to the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. CONCLUSIONS: The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited to basic research and preclinical investigations.
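
An illustrative sketch (not Skelios) of a skeleton-based metric of the kind described: it skeletonizes a toy binary vessel mask, builds a pixel graph, and reports the largest graph eccentricity as a "distance to farthest node"-style measure, alongside a vessel area density.

```python
# Sketch: skeletonize a toy vessel mask and compute graph-based metrics.
import numpy as np
import networkx as nx
from skimage.morphology import skeletonize

mask = np.zeros((64, 64), dtype=bool)
mask[10:54, 30:34] = True   # hypothetical vertical vessel
mask[30:34, 10:54] = True   # hypothetical horizontal branch

skeleton = skeletonize(mask)

# Build an 8-connected pixel graph of the skeleton.
graph = nx.Graph()
pixels = list(zip(*np.nonzero(skeleton)))
graph.add_nodes_from(pixels)
for r, c in pixels:
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr or dc) and (r + dr, c + dc) in graph:
                graph.add_edge((r, c), (r + dr, c + dc))

# Eccentricity needs a connected graph, so use the largest component.
largest = max(nx.connected_components(graph), key=len)
eccentricities = nx.eccentricity(graph.subgraph(largest))
print("distance to farthest node (pixels):", max(eccentricities.values()))
print("vessel area density:", mask.sum() / mask.size)
```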

Relevance: 70.00%

Abstract:

Deep-water benthic ostracodes from the Pliocene-Pleistocene interval of ODP Leg 107, Hole 654A (Tyrrhenian Sea) were studied. From a total of 106 samples, 40 species considered autochthonous were identified. Detailed investigations established the biostratigraphic distribution of the most frequent ostracode taxa. The extinction levels of Agrenocythere pliocenica (a psychrospheric ostracode) in Hole 654A and in some Italian land sections lead to the conclusion that the removal of psychrospheric conditions took place in the Mediterranean Sea during or after the time interval corresponding to the Small Gephyrocapsa Zone (upper part of the early Pleistocene), and not at the beginning of the Quaternary, as previously stated. Based on a reduced matrix of quantitative data comprising 63 samples and 20 ostracode variables, four varimax assemblages were extracted by Q-mode factor analysis. Six factors and eight varimax assemblages were recognized from the Q-mode factor analysis of the quantitative data comprising 162 samples and 47 benthic foraminifer variables. The stratigraphic distributions of the varimax assemblages of the two faunistic groups were plotted against the calcareous plankton biostratigraphic scheme and compared in order to trace the relationship between the benthic foraminifer and ostracode varimax assemblages. The general results show that the two populations, belonging to quite different taxa, display almost coeval changes along the Pliocene-Pleistocene sequence of Hole 654A, essentially induced by paleoenvironmental modifications. Mainly on the basis of the benthic foraminifer assemblages (which are quantitatively better represented than the ostracode assemblages), it is possible to identify such modifications as variations in sedimentation depth and in bottom oxygen content.
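
A hedged sketch of a varimax-rotated factor analysis run in Q-mode fashion (factoring the transposed samples-by-taxa matrix so that samples act as the variables); it is a rough approximation of the approach named above, not the Leg 107 analysis, and the abundance data are random.

```python
# Sketch: Q-mode-style factor analysis with varimax rotation on synthetic abundances.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
abundances = rng.random((63, 20))                     # 63 samples x 20 ostracode taxa
abundances /= abundances.sum(axis=1, keepdims=True)   # convert to relative abundances

# Q-mode: factor the transposed matrix so samples, not taxa, are the variables.
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
fa.fit(abundances.T)

loadings = fa.components_.T        # one row per sample, one column per varimax factor
dominant_factor = loadings.argmax(axis=1)
print(loadings.shape)              # (63, 4)
print(dominant_factor[:10])        # varimax assemblage assigned to the first 10 samples
```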

Relevance: 70.00%

Abstract:

The Securities and Exchange Commission (SEC) in the United States mandated a new digital reporting system for US companies in late 2008. The new generation of information provision was dubbed 'interactive data' by Chairman Cox (SEC, 2006a). Despite the promise of its name, we find that in the development of the project retail investors are invoked as calculative actors rather than engaged in dialogue. Similarly, the potential for the underlying technology to be applied in ways that encourage new forms of accountability appears to have been forfeited in the interests of enrolling company filers. We theorise the activities of the SEC, and in particular of its chairman at the time, Christopher Cox, over a three-year period, both prior to and following the 'credit crisis'. We argue that individuals and institutions play a central role in advancing the socio-technical project that is constituted by interactive data. We adopt insights from ANT (Callon, 1986; Latour, 1987, 2005b) and governmentality (Miller, 2008; Miller and Rose, 2008) to show how regulators and the proponents of the technology have acted as spokespersons for the interactive data technology and the retail investor. We examine the way in which calculative accountability has been privileged in the SEC's construction of the retail investor as concerned with atomised, quantitative data (Kamuf, 2007; Roberts, 2009; Tsoukas, 1997). We find that the possibilities for democratising effects of digital information on the Internet have not been realised in the interactive data project, and that the project contains risks for the very investors the SEC claims to seek to protect.