961 results for Data Utility


Relevance: 30.00%

Abstract:

Many eukaryotic organisms are polyploid. However, despite their importance, evolutionary inference of polyploid origins and modes of inheritance has been limited by a need for analyses of allele segregation at multiple loci using crosses. The increasing availability of sequence data for nonmodel species now allows the application of established approaches for the analysis of genomic data in polyploids. Here, we ask whether approximate Bayesian computation (ABC), applied to realistic traditional and next-generation sequence data, allows correct inference of the evolutionary and demographic history of polyploids. Using simulations, we evaluate the robustness of evolutionary inference by ABC for tetraploid species as a function of the number of individuals and loci sampled, and the presence or absence of an outgroup. We find that ABC adequately retrieves the recent evolutionary history of polyploid species on the basis of both old and new sequencing technologies. The application of ABC to sequence data from diploid and polyploid species of the plant genus Capsella confirms its utility. Our analysis strongly supports an allopolyploid origin of C. bursa-pastoris about 80 000 years ago. This conclusion runs contrary to previous findings based on the same data set but using an alternative approach, and is in agreement with recent findings based on whole-genome sequencing. Our results indicate that ABC is a promising and powerful method for revealing the evolution of polyploid species, without the need to attribute alleles to a homeologous chromosome pair. The approach can readily be extended to more complex scenarios involving higher ploidy levels.
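The core of ABC is simple to state in code. Below is a minimal rejection-ABC sketch in Python; the paper's actual coalescent simulator, priors, and summary statistics are not specified here, so `simulate_data`, the uniform prior bounds, and the tolerance `eps` are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_data(theta, n=100):
    # Stand-in for a coalescent simulator of sequence data under a
    # polyploid demographic model; here just noisy draws whose mean
    # depends on the parameter of interest (illustrative only).
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    # Summary statistics; a real analysis would use e.g. numbers of
    # segregating sites, pairwise diversity, and so on.
    return np.array([x.mean(), x.std()])

observed = summary(simulate_data(2.0))   # pretend "observed" data
accepted = []
eps = 0.2                                # acceptance tolerance

for _ in range(100_000):
    theta = rng.uniform(0.0, 5.0)        # draw a parameter from the prior
    s = summary(simulate_data(theta))
    if np.linalg.norm(s - observed) < eps:   # keep draws whose simulated
        accepted.append(theta)               # summaries match the data

# The accepted draws approximate the posterior distribution of theta.
posterior_mean = np.mean(accepted)
```

The appeal for polyploids is visible in the structure: only forward simulation is required, so no likelihood (and no assignment of alleles to homeologous chromosomes) ever has to be written down.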

Relevance: 30.00%

Abstract:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction for one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject and anisotropy of the skull was approximated from the structural information. A method for incorporation of anisotropy in the forward model and its use in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation by 4-17 mm for conductivity changes deep in the brain and those due to epilepsy, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
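The sensitivity-matrix reconstruction step is compact to sketch. A minimal Python version, assuming a precomputed Jacobian `J` (sensitivity of boundary voltages to element conductivities, which in practice comes from the FEM forward model) and zeroth-order Tikhonov regularisation; the dimensions and regularisation parameter are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions: m boundary measurements, n FEM elements.
m, n = 128, 1024
J = rng.normal(size=(m, n))   # stand-in sensitivity (Jacobian) matrix
dv = rng.normal(size=m)       # measured boundary-voltage changes

lam = 1e-2                    # Tikhonov regularisation parameter

# Linearised EIT: dv ~ J @ dsigma. Solve the regularised normal
# equations (J^T J + lam I) dsigma = J^T dv for the conductivity image.
dsigma = np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ dv)
```

The study's comparison amounts to computing `J` from either the anisotropic or the isotropic forward model and observing how the reconstructed `dsigma` differs.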

Relevance: 30.00%

Abstract:

Reaching a consensus in terms of interchangeability and utility (i.e., disease detection/monitoring) of a medical device is the eventual aim of repeatability and agreement studies. The aim of the tolerance and relative utility indices described in this report is to provide a methodology to compare changes in clinical measurement noise between different populations (repeatability) or measurement methods (agreement), so as to highlight problematic areas. No longitudinal data are required to calculate these indices. Both indices establish a metric of least to most affected across all parameters to facilitate comparison. If validated, these indices may prove useful tools when combining reports and forming the consensus required in the validation process for software updates and new medical devices.
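The report's index formulas are not reproduced here, but the agreement statistics they build on are standard. As a point of reference only, a minimal Python sketch of the usual Bland-Altman-style quantities (these are the conventional limits of agreement, not the proposed tolerance or relative utility indices; the paired data are simulated):

```python
import numpy as np

rng = np.random.default_rng(2)

# Paired measurements of the same quantity by two methods (simulated).
a = rng.normal(10.0, 1.0, size=50)
b = a + rng.normal(0.2, 0.5, size=50)   # method B: small bias + extra noise

diff = b - a
bias = diff.mean()                      # mean between-method difference
sd = diff.std(ddof=1)

# 95% limits of agreement (Bland-Altman). Narrower limits indicate less
# measurement noise, the quantity the proposed indices set out to compare.
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
```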

Relevance: 30.00%

Abstract:

Background. Molecular tests for breast cancer (BC) risk assessment have been reimbursed by health insurers in Switzerland since the beginning of 2015. The main current role of these tests is to help oncologists decide on the usefulness of adjuvant chemotherapy in patients with early-stage endocrine-sensitive and human epidermal growth factor receptor 2 (HER2)-negative BC. These gene expression signatures aim at predicting the risk of recurrence in this subgroup. One of them (OncotypeDx/OT) also predicts the rate of distant metastases with or without the addition of cytotoxic chemotherapy to endocrine therapy. The clinical utility of these tests, in addition to existing so-called "clinico-pathological" prognostic and predictive criteria (e.g. stage, grade, biomarker status), is still debated. We report a single-center, one-year experience of the use of one molecular test (OT) in clinical decision making. Methods. We extracted from the CHUV Breast Cancer Center database the total number of BC cases with estrogen-receptor-positive (ER+), HER2-negative early breast cancer (node-negative (pN0) disease or micrometastases in up to 3 lymph nodes) operated on between September 2014 and August 2015. For the cases from this group in which a molecular test had been requested by the tumor board, we collected the clinicopathologic parameters, the initial tumor board decision, and the final adjuvant systemic therapy decision. Results. A molecular test (OT) was done in 12.2% of patients with ER+, HER2-negative early BC. The median age was 57.4 years and the median invasive tumor size was 1.7 cm. These patients were classified by OT testing (Recurrence Score) into low-, intermediate-, and high-risk groups in 27.2%, 63.6% and 9% of cases, respectively. Treatment recommendations changed in 18.2% of cases, predominantly from chemotherapy plus endocrine therapy to endocrine therapy alone. Of 8 patients originally recommended chemotherapy, 25% were recommended endocrine treatment alone after receiving the Recurrence Score result. Conclusions. Though reimbursed by health insurers since January 2015, molecular tests are used moderately in our institution, as per the decision of the multidisciplinary tumor board. They are mainly used to obtain complementary confirmation supporting a decision of no chemotherapy. The OncotypeDx Recurrence Score results were in the intermediate group in 66% of the 9 tested cases but contributed to avoiding chemotherapy in 2 patients during the last 12 months.

Relevance: 30.00%

Abstract:

Data is the most important asset of a company in the information age. Other assets, such as technology, facilities or products, can be copied or reverse-engineered, and employees can be brought over, but data remains unique to every company. As data management topics slowly move from unknown unknowns to known unknowns, tools to evaluate and manage data properly are being developed and refined. Many projects are in progress today to develop various maturity models for evaluating information and data management practices. These maturity models come in many shapes and sizes: from short and concise ones meant for a quick assessment, to complex ones that call for an expert assessment by experienced consultants. In this paper several of them, made not only by external inter-organizational groups and authors but also developed internally at a Major Energy Provider Company (MEPC), are juxtaposed and thoroughly analyzed. Apart from analyzing the available maturity models related to Data Management, this paper also selects the one with the most merit and describes and analyzes its use in performing a maturity assessment at MEPC. The utility of maturity models is two-fold: descriptive and prescriptive. Besides recording the current maturity of Data Management practices through the assessments, this maturity model is also used to chart the way forward. Thus, after the current situation is presented, analysis and recommendations on how to improve it, based on the definitions of the higher levels of maturity, are given. Generally, the main trend observed was the widening of the Data Management field to include more business and “soft” areas (as opposed to technical ones) and the change of focus towards the business value of data, while assuming that the underlying IT systems for managing data are “ideal”, that is, left to the purely technical disciplines to design and maintain. This trend is present not only in Data Management but in other technological areas as well, where more and more attention is given to innovative use of technology, while acknowledging that the strategic importance of IT as such is diminishing.

Relevance: 30.00%

Abstract:

The major objective of the thesis is essentially to evolve and apply certain computational procedures to evaluate the structure and properties of some simple polyatomic molecules, making use of spectroscopic data available in the literature. Although interest in such analyses has dwindled in recent times, there remains tremendous scope and utility in attempting such calculations, as the precision and reliability of experimental techniques in spectroscopy have increased vastly owing to the enormous sophistication of the instruments used for these measurements. In the present thesis an attempt is made to extract the maximum amount of information regarding the geometrical structure and interatomic forces of simple molecules from the experimental data on the microwave and infrared spectra of these molecules.
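As one concrete example of the kind of inference involved (a standard textbook relation, not a result specific to this thesis): for a diatomic molecule, the rotational constant extracted from a microwave spectrum fixes the moment of inertia, and hence the bond length:

```latex
B = \frac{h}{8\pi^{2} c I}, \qquad I = \mu r^{2}
\quad\Longrightarrow\quad
r = \sqrt{\frac{h}{8\pi^{2} c \mu B}} ,
```

where $B$ is the rotational constant in cm$^{-1}$, $\mu$ the reduced mass, and $r$ the equilibrium bond length; polyatomic molecules require analogous but more elaborate treatments of several moments of inertia and force constants.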

Relevance: 30.00%

Abstract:

Satellite remote sensing is used effectively in monitoring the ocean surface and its overlying atmosphere. Technical growth in the field of satellite sensors has made satellite measurement an inevitable part of oceanographic and atmospheric research. Among the ocean observing sensors, ocean colour sensors make use of the visible band of the electromagnetic spectrum (shorter wavelengths). The use of shorter wavelengths ensures the fine spatial resolution needed to depict the oceanographic and atmospheric characteristics of regions with significant spatio-temporal variability. The region off the southwest coast of India is one such area, showing very significant spatio-temporal oceanographic and atmospheric variability due to the seasonally reversing surface winds and currents. Consequently, the region is enriched with features like upwelling, sinking, eddies, fronts, etc. Among them, upwelling brings nutrient-rich waters from subsurface layers to the surface layers. During this process primary production is enhanced, which ocean colour sensors register as high values of Chl a. The vertical attenuation coefficient of incident solar radiation (Kd) and Aerosol Optical Depth (AOD) are two further parameters provided by ocean colour sensors. Kd also undergoes significant seasonal variability due to changes in the content of Chl a in the water column. Moreover, Kd is affected by sediment transport in the upper layers, as the region experiences land drainage resulting from copious rainfall. The wide variability of wind speed and direction may also influence aerosol sources and transport, and consequently AOD. The present doctoral thesis concentrates on the utility of Chl a, Kd and AOD provided by satellite ocean colour sensors for understanding oceanographic and atmospheric variability off the southwest coast of India. The thesis is divided into six chapters with further subdivisions.
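For reference, Kd is defined through the exponential decay of downwelling irradiance with depth (this is the standard definition, not a formulation specific to the thesis):

```latex
E_d(z) = E_d(0^-)\, e^{-K_d z}
\quad\Longrightarrow\quad
K_d = -\frac{1}{z}\,\ln\!\frac{E_d(z)}{E_d(0^-)} ,
```

where $E_d(z)$ is the downwelling irradiance at depth $z$; higher Chl a or suspended sediment loads increase $K_d$, which is why it tracks the seasonal variability described above.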

Relevance: 30.00%

Abstract:

Compositional data naturally arises from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis, to methods based upon the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper will demonstrate work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes available very powerful statistical facilities at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little used in archaeometric applications involving compositional data. The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed.
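The library itself is written in R; as a language-neutral illustration of the log-ratio approach it packages, here is a minimal centred log-ratio (clr) transform followed by ordinary PCA, sketched in Python with invented three-part compositions:

```python
import numpy as np

# Invented 3-part chemical compositions (each row sums to 1).
X = np.array([
    [0.70, 0.20, 0.10],
    [0.65, 0.25, 0.10],
    [0.30, 0.50, 0.20],
    [0.28, 0.52, 0.20],
])

# Centred log-ratio (clr) transform: log of each part relative to the
# geometric mean of its row, removing the unit-sum constraint that
# invalidates naive multivariate analysis of raw compositions.
logX = np.log(X)
clr = logX - logX.mean(axis=1, keepdims=True)

# Standard PCA on the clr-transformed data via SVD.
clr_centered = clr - clr.mean(axis=0)
U, s, Vt = np.linalg.svd(clr_centered, full_matrices=False)
scores = U * s   # principal component scores, one row per artefact
```

Clustering the resulting scores is then the usual route to candidate provenance groups, exactly the workflow the paper's R library is designed to support.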

Relevance: 30.00%

Abstract:

Initial bacterial colonization, including colonization with health-positive bacteria such as bifidobacteria and lactobacilli, is necessary for the normal development of intestinal innate and adaptive immune defenses. The predominance of beneficial bacteria in the gut microflora of breast-fed infants is thought to be, at least in part, supported by the metabolism of the complex mixture of oligosaccharides present in human breast milk, whereas a more adult-type intestinal microbiota is found in formula-fed infants. Inadequate gut colonization (dysbiosis) may lead to an increased risk of infectious, allergic, and autoimmune disorders later in life. The addition of appropriate amounts of selected prebiotics to infant formulas can enhance the growth of bifidobacteria or lactobacilli in the colonic microbiota and, thereby, might produce beneficial effects. Among the substrates considered as prebiotics are the oligosaccharides inulin, fructo-oligosaccharides, galacto-oligosaccharides, and lactulose. There are some reports that such prebiotics have beneficial effects on various markers of health. For example, primary prevention trials in infants have provided promising data on prevention of infections and atopic dermatitis. Additional well-designed prospective clinical trials and mechanistic studies are needed to advance knowledge further in this promising field. (J Pediatr 2009;155:S61-70).

Relevance: 30.00%

Abstract:

This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
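Optimal Brain Damage ranks weights by an estimate of how much the training loss would rise if each weight were zeroed, using a diagonal Hessian approximation. A minimal Python sketch on a toy linear model (standing in for a trained network; for a linear least-squares model the Hessian diagonal used below is exact, whereas LeCun et al.'s procedure computes it layer by layer through backpropagation):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "trained model": y = X w fitted by least squares, with some of
# the true generating weights equal to zero (good pruning candidates).
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10) * (rng.random(10) > 0.3)
y = X @ w_true
w = np.linalg.lstsq(X, y, rcond=None)[0]

# Diagonal of the Hessian of the squared-error loss: sum_i x_ik^2.
H_diag = (X ** 2).sum(axis=0)

# OBD saliency: second-order estimate of the loss increase caused by
# deleting weight k, s_k = H_kk * w_k^2 / 2.
saliency = 0.5 * H_diag * w ** 2

# Prune the weights with the smallest saliencies; in practice the
# network is then retrained and the prune/retrain cycle repeated.
prune = np.argsort(saliency)[:3]
w[prune] = 0.0
```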

Relevance: 30.00%

Abstract:

This study focuses on the wealth-protective effects of socially responsible firm behavior by examining the association between corporate social performance (CSP) and financial risk for an extensive panel data sample of S&P 500 companies between the years 1992 and 2009. In addition, the link between CSP and investor utility is investigated. The main findings are that corporate social responsibility is negatively but weakly related to systematic firm risk and that corporate social irresponsibility is positively and strongly related to financial risk. The fact that both conventional and downside risk measures lead to the same conclusions adds convergent validity to the analysis. However, the risk-return trade-off appears to be such that no clear utility gain or loss can be realized by investing in firms characterized by different levels of social and environmental performance. Overall volatility conditions of the financial markets are shown to play a moderating role in the nature and strength of the CSP-risk relationship.
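Conventional risk here means overall return volatility; downside measures penalise only below-target outcomes. A standard example (the generic definition, not necessarily the exact measure computed in the study) is the downside deviation relative to a target return $\tau$:

```latex
\sigma_d = \sqrt{\,\mathbb{E}\!\left[\min(r - \tau,\, 0)^{2}\right]\,} ,
```

where $r$ is the firm's stock return; the finding that both $\sigma_d$-type and conventional volatility measures yield the same CSP-risk conclusions is what the authors describe as convergent validity.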

Relevance: 30.00%

Abstract:

When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
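In the common Bayesian framing such reviews adopt, fusing an out-of-sequence measurement $z_\tau$ taken at an earlier time $\tau < t$ reduces to the following update (a standard formulation; the notation is mine, not taken from the paper):

```latex
p(x_t \mid Z^{t}, z_\tau) \;\propto\; p(z_\tau \mid x_t, Z^{t})\; p(x_t \mid Z^{t}),
\qquad
p(z_\tau \mid x_t, Z^{t}) \;=\; \int p(z_\tau \mid x_\tau)\, p(x_\tau \mid x_t, Z^{t})\, \mathrm{d}x_\tau ,
```

where $Z^{t}$ denotes the in-sequence measurements received up to time $t$. The inner integral retrodicts the current state back to time $\tau$, and it is in approximating this term that the various out-of-sequence algorithms differ; the proposed architecture keeps this step separate from data association so that solutions to the two problems can be combined freely.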

Relevance: 30.00%

Abstract:

This technique paper describes a novel method for quantitatively and routinely identifying auroral breakup following substorm onset using the Time History of Events and Macroscale Interactions During Substorms (THEMIS) all-sky imagers (ASIs). Substorm onset is characterised by a brightening of the aurora that is followed by auroral poleward expansion and auroral breakup. This breakup can be identified by a sharp increase in the auroral intensity i(t) and in the time derivative of auroral intensity i'(t). Utilising both i(t) and i'(t), we have developed an algorithm for identifying the time interval and spatial location of auroral breakup during the substorm expansion phase within the field of view of ASI data, based solely on quantifiable characteristics of the optical auroral emissions. We compare the time interval determined by the algorithm to independently identified auroral onset times from three previously published studies. In each case the time interval determined by the algorithm is within error of the onset independently identified by the prior studies. We further show the utility of the algorithm by comparing the breakup intervals determined using the automated algorithm to an independent list of substorm onset times. We demonstrate that up to 50% of the breakup intervals characterised by the algorithm are within the uncertainty of the times identified in the independent list. The quantitative description and routine identification of an interval of auroral brightening during the substorm expansion phase provides a foundation for unbiased statistical analysis of the aurora, and a new scientific tool for identifying the processes leading to auroral substorm onset.
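The detection idea, intensity and its time derivative both exceeding thresholds, is simple to sketch. A minimal Python version on a synthetic intensity series; the cadence, rise profile, and 5-sigma threshold choices are illustrative assumptions, not the THEMIS ASI calibration used in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic auroral intensity time series: quiet background with a
# sharp brightening (the "breakup") partway through.
t = np.arange(0.0, 600.0, 3.0)          # 3 s imager cadence, seconds
i = 100 + rng.normal(0, 2, t.size)
i[120:] += 80 * (1 - np.exp(-(t[120:] - t[120]) / 10.0))

di = np.gradient(i, t)                  # time derivative i'(t)

# Flag breakup where both i(t) and i'(t) exceed thresholds derived
# from quiet-time statistics (here the first 100 samples).
i_thresh = i[:100].mean() + 5 * i[:100].std()
di_thresh = 5 * di[:100].std()
breakup = (i > i_thresh) & (di > di_thresh)

onset_index = np.argmax(breakup)        # first sample flagged
onset_time = t[onset_index]
```

Applying the same quantitative criteria to every event in an ASI keogram, rather than identifying onsets by eye, is what makes the resulting breakup lists suitable for unbiased statistics.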

Relevance: 30.00%

Abstract:

There is increasing recognition that agricultural landscapes meet multiple societal needs and demands beyond the provision of economic and environmental goods and services. Accordingly, there have been significant calls for the inclusion of societal, amenity and cultural values in agri-environmental landscape indicators to assist policy makers in monitoring the wider impacts of land-based policies. However, capturing the amenity and cultural values that rural agrarian areas provide, by use of such indicators, presents significant challenges. The EU social awareness of landscape indicator represents a new class of generalized social indicator using a top-down methodology to capture the social dimensions of landscape without reference to the specific structural and cultural characteristics of individual landscapes. This paper reviews this indicator in the context of existing agri-environmental indicators and their differing design concepts. Using a stakeholder consultation approach in five case study regions, the potential and limitations of the indicator are evaluated, with a particular focus on its perceived meaning, utility and performance in the context of different user groups and at different geographical scales. This analysis supplements previous EU-wide assessments through regional-scale assessment of the limitations and potentialities of the indicator and the need for further data collection. The evaluation finds that the perceived meaning of the indicator does not vary with scale but, in common with all mapped indicators, the usefulness of the indicator to different user groups does change with the scale of presentation. The indicator is viewed as most useful when presented at the scale of governance at which end users operate. The relevance of the different sub-components of the indicator is also found to vary across regions.

Relevance: 30.00%

Abstract:

Second language acquisition researchers often face particular challenges when attempting to generalize study findings to the wider learner population. For example, language learners constitute a heterogeneous group, and it is not always clear how a study’s findings may generalize to other individuals who may differ in terms of language background and proficiency, among many other factors. In this paper, we provide an overview of how mixed-effects models can be used to help overcome these and other issues in the field of second language acquisition. We outline the benefits of mixed-effects models and provide a practical example of how mixed-effects analyses can be conducted. Mixed-effects models provide second language researchers with a powerful statistical tool in the analysis of a variety of different types of data.
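A minimal example of fitting such a model, here in Python with statsmodels rather than the R lme4 workflow more common in the field; the column names (`accuracy`, `proficiency`, `participant`) and the simulated learner data are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Simulated learner data: 40 participants x 20 items, with a
# by-participant random intercept on top of a proficiency effect.
n_subj, n_items = 40, 20
participant = np.repeat(np.arange(n_subj), n_items)
proficiency = np.repeat(rng.normal(0, 1, n_subj), n_items)
subj_intercept = np.repeat(rng.normal(0, 0.5, n_subj), n_items)
accuracy = (0.6 + 0.2 * proficiency + subj_intercept
            + rng.normal(0, 0.3, n_subj * n_items))

df = pd.DataFrame({"accuracy": accuracy,
                   "proficiency": proficiency,
                   "participant": participant})

# Linear mixed model: fixed effect of proficiency, random intercept
# per participant, so between-learner variability is modelled rather
# than averaged away.
model = smf.mixedlm("accuracy ~ proficiency", df, groups=df["participant"])
result = model.fit()
print(result.summary())
```

The random intercept is exactly where the heterogeneity discussed above enters the model: each learner gets their own baseline, and the fixed effect is estimated against that backdrop rather than assuming identical participants.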