1000 results for linkitetty data


Relevance:

20.00%

Publisher:

Abstract:

Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are given primary attention because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics, such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
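As a sketch of the kind of conjugate Bayesian update such parameter estimation relies on, the following combines a Beta prior on the background prevalence of GSR particles with new survey counts; the prior parameters and counts are hypothetical, chosen only for illustration:

```python
# Conjugate Beta-Binomial update: prior knowledge plus new count data.
# All numbers are hypothetical, for illustration only.

def beta_binomial_update(alpha, beta, successes, failures):
    """Return the posterior Beta parameters after binomial observations."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Prior belief about background prevalence (prior mean 0.10)
prior_a, prior_b = 1.0, 9.0

# Hypothetical survey: 3 of 100 sampled individuals carried particles
post_a, post_b = beta_binomial_update(prior_a, prior_b, 3, 97)
posterior_mean = beta_mean(post_a, post_b)  # (1 + 3) / (10 + 100)
```

The posterior mean lands between the prior mean (0.10) and the raw sample proportion (0.03), which is precisely the combination of prior knowledge and new experimental data the abstract describes.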

Abstract:

To date, state-of-the-art seismic material parameter estimates from multi-component sea-bed seismic data are based on the assumption that the sea-bed consists of a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments that are characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations in the parameter estimation by their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, while the corresponding Q-estimates appear to be biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this is expected to be of interest and importance for civil and ocean engineering purposes.
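The substitution the abstract describes, replacing a real elastic modulus by a complex-valued viscoelastic equivalent, is often written in the low-loss convention M* = M(1 + i/Q); a minimal sketch with hypothetical soft-sediment values:

```python
# Complex viscoelastic modulus from a real modulus and quality factor Q,
# using the common low-loss approximation M* = M * (1 + i/Q).
# The numeric values are hypothetical soft-sediment figures.

def viscoelastic_modulus(m_elastic, q):
    """Complex-valued equivalent of a real elastic modulus for given Q."""
    return complex(m_elastic, m_elastic / q)

mu = 0.5e9   # shear modulus in Pa (hypothetical)
q_s = 10     # strong attenuation, the Q threshold quoted in the abstract
mu_star = viscoelastic_modulus(mu, q_s)  # 5.0e8 + 5.0e7j
```

The smaller Q is, the larger the imaginary (lossy) part relative to the real part, which is why Q-values of 10 or less distort a purely elastic estimation so strongly.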

Abstract:

Iowa Affirmative Action Data is developed annually to assist employers in creating affirmative action plans and in evaluating the inclusion of women and minorities in their workforces. Data for the previous calendar year break out the following by gender and minority

Abstract:

A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data and application is made to the yeast chromosome III sequence and to C. elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
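The decomposition step can be sketched in plain Python: score each window, fit a two-component Gaussian mixture by EM, and read the coding density off the weight of the high-scoring component. The synthetic 'coding statistic' values below are invented for illustration; a real application would score actual sequence windows:

```python
import math
import random

def _pdf(x, mu, sd):
    """Normal density at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_gaussians(values, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (w, mu1, sd1, mu2, sd2), where w is the weight of the
    component initialised on the lower half of the data."""
    xs = sorted(values)
    n = len(xs)
    lo, hi = xs[: n // 2], xs[n // 2 :]
    mu1, mu2 = sum(lo) / len(lo), sum(hi) / len(hi)
    sd1 = sd2 = (xs[-1] - xs[0]) / 4 or 1.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            p1 = w * _pdf(x, mu1, sd1)
            p2 = (1 - w) * _pdf(x, mu2, sd2)
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means, and standard deviations
        s1 = sum(r)
        w = s1 / n
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / s1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / (n - s1)
        sd1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / s1)
        sd2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / (n - s1))
    return w, mu1, sd1, mu2, sd2

# Synthetic window scores: 30% noncoding-like, 70% coding-like windows
random.seed(1)
scores = [random.gauss(0.0, 1.0) for _ in range(300)] + \
         [random.gauss(4.0, 1.0) for _ in range(700)]
w, *_ = em_two_gaussians(scores)
coding_density = 1 - w  # mass of the high-mean (coding) component
```

With components as well separated as in this synthetic example, EM recovers the 0.7 coding fraction used to generate the data to within a few percent.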

Abstract:

In rodents and nonhuman primates subjected to spinal cord lesion, neutralizing the neurite growth inhibitor Nogo-A has been shown to promote regenerative axonal sprouting and functional recovery. The goal of the present report was to re-examine the data on the recovery of primate manual dexterity using refined behavioral analyses and further statistical assessments, representing secondary outcome measures from the same manual dexterity test. Thirteen adult monkeys were studied; seven received an anti-Nogo-A antibody whereas a control antibody was infused into the other monkeys. Monkeys were trained to perform the modified Brinkman board task requiring opposition of index finger and thumb to grasp food pellets placed in vertically and horizontally oriented slots. Two parameters were quantified before and following spinal cord injury: (i) the standard 'score' as defined by the number of pellets retrieved within 30 s from the two types of slots; (ii) the newly introduced 'contact time' as defined by the duration of digit contact with the food pellet before successful retrieval. After the lesion, the hand was severely impaired in all monkeys; this was followed by progressive functional recovery. Remarkably, anti-Nogo-A antibody-treated monkeys recovered faster and significantly better than control antibody-treated monkeys, considering both the score for vertical and horizontal slots (Mann-Whitney test: P = 0.05 and 0.035, respectively) and the contact time (P = 0.008 and 0.005, respectively). Detailed analysis of the lesions excluded the possibility that this conclusion may have been caused by differences in lesion properties between the two groups of monkeys.
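The Mann-Whitney comparison used for the score and contact-time endpoints ranks the pooled observations and sums the ranks of one group; a minimal sketch with hypothetical recovery scores (not the study's data):

```python
# Rank-based Mann-Whitney U statistic for two independent samples.
# The sample values below are hypothetical, for illustration only.

def mann_whitney_u(a, b):
    """Return the U statistic for sample a versus sample b (ties mid-ranked)."""
    combined = sorted(a + b)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    r_a = sum(ranks[x] for x in a)
    return r_a - len(a) * (len(a) + 1) / 2

# Hypothetical post-lesion recovery scores: treated vs. control
treated = [14, 17, 18, 20, 21]
control = [9, 11, 12, 15, 16]
u = mann_whitney_u(treated, control)  # 23 of a possible 5 * 5 = 25
```

A U close to the maximum, as here, means the treated animals' values rank almost uniformly above the controls', which is the pattern behind the small P-values reported above.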

Abstract:

Background: Systematic approaches for identifying proteins involved in different types of cancer are needed. Experimental techniques such as microarrays are being used to characterize cancer, but validating their results can be a laborious task. Computational approaches are used to prioritize among genes putatively involved in cancer, usually based on further analyzing experimental data. Results: We implemented a systematic method using the PIANA software that predicts cancer involvement of genes by integrating heterogeneous datasets. Specifically, we produced lists of genes likely to be involved in cancer by relying on: (i) protein-protein interactions; (ii) differential expression data; and (iii) structural and functional properties of cancer genes. The integrative approach that combines multiple sources of data achieved positive predictive values ranging from 23% (on a list of 811 genes) to 73% (on a list of 22 genes), outperforming the use of any of the data sources alone. We analyze a list of 20 cancer gene predictions, finding that most of them have been recently linked to cancer in the literature. Conclusion: Our approach to identifying and prioritizing candidate cancer genes can be used to produce lists of genes likely to be involved in cancer. Our results suggest that differential expression studies yielding high numbers of candidate cancer genes can be filtered using protein interaction networks.

Abstract:

NovoTTF-100A (TTF) is a portable device delivering low-intensity, intermediate-frequency, alternating electric fields using noninvasive, disposable scalp electrodes. TTF interferes with tumor cell division, and it has been approved by the US Food and Drug Administration (FDA) for the treatment of recurrent glioblastoma (rGBM) based on data from a phase III trial. This presentation describes the updated survival data 2 years after completing recruitment. Adults with rGBM (KPS ≥ 70) were randomized (stratified by surgery and center) to either continuous TTF (20-24 h/day, 7 days/week) or efficacious chemotherapy based on best physician choice (BPC). The primary endpoint was overall survival (OS), and secondary endpoints were PFS6, 1-year survival, and QOL. Patients were randomized (28 US and European centers) to either TTF alone (n = 120) or BPC (n = 117). Patient characteristics were balanced, median age was 54 years (range, 23-80 years), and median KPS was 80 (range, 50-100). One quarter of the patients had debulking surgery, and over half of the patients were at their second or later recurrence. OS in the intent-to-treat (ITT) population was equivalent in TTF versus BPC patients (median OS, 6.6 vs. 6.0 months; n = 237; p = 0.26; HR = 0.86). With a median follow-up of 33.6 months, long-term survival in the TTF group was higher than that in the BPC group at 2, 3, and 4 years of follow-up (9.3% vs. 6.6%; 8.4% vs. 1.4%; 8.4% vs. 0.0%, respectively). Analysis of patients who received at least one treatment course demonstrated a survival benefit for TTF patients compared to BPC patients (median OS, 7.8 vs. 6.0 months; n = 93 vs. n = 117; p = 0.012; HR = 0.69). In this group, 1-year survival was 28% vs. 20%, and PFS6 was 26.2% vs. 15.2% (p = 0.034). TTF, a noninvasive, novel cancer treatment modality, shows significant therapeutic efficacy with promising long-term survival results. The impact of TTF was more pronounced when comparing only patients who received the minimal treatment course. A large-scale phase III trial in newly diagnosed GBM is ongoing.

Abstract:

Shrews of the genus Sorex are characterized by a Holarctic distribution, and relationships among extant taxa have never been fully resolved. Phylogenies have been proposed based on morphological, karyological, and biochemical comparisons, but these analyses often produced controversial and contradictory results. Phylogenetic analyses of partial mitochondrial cytochrome b gene sequences (1011 bp) were used to examine the relationships among 27 Sorex species. The molecular data suggest that Sorex comprises two major monophyletic lineages, one restricted mostly to the New World and one with a primarily Palearctic distribution. Furthermore, several sister-species relationships are revealed by the analysis. Based on the split between the Soricinae and Crocidurinae subfamilies, we used a 95% confidence interval for both the calibration of a molecular clock and the subsequent calculation of major diversification events within the genus Sorex. Our analysis does not support an unambiguous acceleration of the molecular clock in shrews, the estimated rate being similar to other estimates of mammalian mitochondrial clocks. In addition, the data presented here indicate that estimates from the fossil record greatly underestimate divergence dates among Sorex taxa.
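The clock calibration and divergence dating above rest on the standard relation t = K / (2r) between pairwise sequence distance K, per-lineage substitution rate r, and split time t; the numbers below are hypothetical, not the paper's estimates:

```python
# Divergence time under a strict molecular clock: t = K / (2 * r),
# where K is the pairwise distance (substitutions per site) and r is
# the per-lineage substitution rate. Values are hypothetical.

def divergence_time_myr(k, rate_per_myr):
    """Million years since two lineages split."""
    return k / (2 * rate_per_myr)

k = 0.10      # 10% pairwise cytochrome b divergence (hypothetical)
rate = 0.01   # substitutions per site per lineage per Myr (hypothetical)
t_split = divergence_time_myr(k, rate)  # about 5 Myr
```

The factor of 2 accounts for substitutions accumulating independently along both lineages since the split; calibrating r from a dated node (here, the Soricinae-Crocidurinae split) is what lets the same relation date the other nodes.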

Abstract:

The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

Abstract:

The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.

Abstract:

ETHNOPHARMACOLOGICAL RELEVANCE: "Reverse pharmacology", also called the "bedside-to-bench" or "field to pharmacy" approach, is a research process starting with documentation of clinical outcome as observed by patients with different therapeutic regimens. The treatment most significantly associated with cure is selected for future studies: first, clinical safety and efficacy; then in vivo and in vitro studies. Some clinical data, i.e. details on patient status and progress, can be collected during ethnobotanical surveys; they will help clinical researchers and, once effectiveness and safety are established, will also help users of traditional medicine make safer and more effective choices. To gather clinical data successfully, ethnopharmacologists need to be backed by an appropriate team of specialists in medicine and epidemiology. Ethnopharmacologists can also gather important data on traditional medicine safety. MATERIALS AND METHODS: The first step is to create a consensus on the meaning of "clinical data", their interest and importance. An understanding of why "a cure is not a proof of effectiveness" is a starting point to avoid faulty interpretation of the clinical observations. RESULTS: Experience showed that, with the "bedside-to-bench" approach, a treatment derived from a traditional recipe can be scientifically validated (in terms of safety and effectiveness) at a cost of less than a million euros, thus providing an end-product that is affordable, available and sustainable. CONCLUSIONS: With rigorous clinical study results, medicinal plant users gain the possibility to refine health strategies. The field surveyor may gain a better relationship with the population, once she/he is seen as bringing information useful for the quality of care in the community.

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for the successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since in reality, these parameters are known to be frequency-dependent and complex and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure that is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
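The frequency dependence of the dielectric permittivity discussed here is commonly modelled with a single Debye relaxation, ε(ω) = ε∞ + (εs − ε∞) / (1 + iωτ); a minimal sketch with hypothetical water-bearing-sediment parameters:

```python
import math

# Single Debye relaxation for the complex relative permittivity.
# eps_s, eps_inf, and tau below are hypothetical sediment values.

def debye_permittivity(freq_hz, eps_s, eps_inf, tau):
    """Complex relative permittivity at the given frequency."""
    omega = 2 * math.pi * freq_hz
    return eps_inf + (eps_s - eps_inf) / (1 + 1j * omega * tau)

# At 100 MHz, within the GPR band, the real part has dropped below the
# static value eps_s and the negative imaginary part encodes loss.
eps_100mhz = debye_permittivity(100e6, eps_s=25.0, eps_inf=15.0, tau=1e-9)
```

A non-dispersive inversion scheme implicitly assumes this function is flat over the GPR band; how far that assumption can be stretched is exactly what the second part of the thesis evaluates.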

Abstract:

One of the most important issues in portland cement concrete pavement research today is surface characteristics. The issue is one of balancing surface texture construction with the need for durability, skid resistance, and noise reduction. The National Concrete Pavement Technology Center at Iowa State University, in conjunction with the Federal Highway Administration, American Concrete Pavement Association, International Grinding and Grooving Association, Iowa Highway Research Board, and other states, has entered into a three-part National Surface Characteristics Program to resolve the balancing problem. As a portion of Part 2, this report documents the construction of 18 separate pavement surfaces for use in the first level of testing for the national project. It identifies the testing to be done and the limitations observed in the construction process. The results of the actual tests will be included in the subsequent national study reports.