996 results for Lossless data hiding


Relevance: 20.00%

Publisher:

Abstract:

Fourth Quarterly County information for Census of Employment & Wage, Statewide


First Quarterly County information for Census of Employment & Wage, County


First Quarterly County information for Census of Employment & Wage, Statewide


Second Quarterly County information for Census of Employment & Wage, County


Second Quarterly County information for Census of Employment & Wage, Statewide


Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models, in the form of Bayesian networks, address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. It complements Part I by offering means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are the primary focus because they allow the scientist to combine their prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics, such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
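As a minimal sketch of the kind of Bayesian estimation the abstract describes, combining prior knowledge with newly acquired experimental data, a conjugate Beta-Binomial update might look like the following. All numbers here are hypothetical illustrations, not figures from the paper:

```python
# Minimal sketch of conjugate Bayesian updating (Beta prior, Binomial data).
# Hypothetical scenario: prior belief about the background rate of individuals
# carrying GSR particles, updated with a new survey.

def beta_binomial_update(alpha, beta, successes, trials):
    """Return posterior Beta parameters after observing Binomial data."""
    return alpha + successes, beta + trials - successes

def beta_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Prior: roughly 5% of the population carries background GSR particles,
# encoded as Beta(1, 19) with prior mean 0.05 (an assumed value).
a0, b0 = 1.0, 19.0

# New (hypothetical) experiment: 4 of 60 sampled individuals carry particles.
a1, b1 = beta_binomial_update(a0, b0, successes=4, trials=60)

print(beta_mean(a0, b0))   # prior mean: 0.05
print(beta_mean(a1, b1))   # posterior mean: 5/80 = 0.0625
```

The posterior mean sits between the prior mean (0.05) and the raw data frequency (4/60), which is exactly the prior-plus-data compromise the abstract refers to.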


To date, state-of-the-art seismic material parameter estimates from multi-component sea-bed seismic data have been based on the assumption that the sea-bed consists of a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments that are characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations of the parameter estimation with their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, although the corresponding Q-estimates appear biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this result is expected to be of interest and importance for civil and ocean engineering purposes.
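The substitution the abstract describes, a real elastic modulus replaced by its complex viscoelastic equivalent, can be sketched with the common low-loss approximation M* = M(1 + i/Q). The modulus value below is illustrative, not taken from the study:

```python
# Sketch: replace a real-valued elastic modulus by its complex viscoelastic
# equivalent using the quality factor Q (low-loss approximation M* = M(1 + i/Q)).
# The numbers are illustrative assumptions, not values from the paper.

def viscoelastic_modulus(m_elastic, q):
    """Complex modulus for a medium with quality factor Q."""
    return complex(m_elastic, m_elastic / q)

# Soft sea-bed sediment: assumed shear modulus ~50 MPa, strong attenuation Q = 10.
mu = viscoelastic_modulus(50e6, q=10)
print(mu)                      # (50000000+5000000j)
print(mu.imag / mu.real)       # loss factor 1/Q = 0.1
```

For Q-values of 10 or less the imaginary part is at least 10% of the real part, which is why ignoring it (the purely elastic assumption) produces the significant estimation errors the abstract reports.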


Iowa Affirmative Action Data is developed annually to assist employers in creating affirmative action plans and evaluating the inclusion of women and minorities in their workforces. Data for the previous calendar year break out the following by gender and minority


A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data and application is made to the yeast chromosome III sequence and to C.elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
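The decomposition step the abstract describes, splitting the distribution of a coding statistic into two normal components and reading the coding density off the mixture, can be sketched with a small EM fit. The data below are synthetic (an assumed well-separated mixture), not yeast or C.elegans windows:

```python
# Sketch: fit a two-component 1-D normal mixture to a "coding statistic" by EM
# and estimate coding density as the mixing weight of the coding component.
# Synthetic data, not the paper's sequences; component parameters are assumptions.
import numpy as np

def fit_two_normals(x, iters=200):
    """Plain EM for a 1-D two-component Gaussian mixture; returns (w, mu, sd)."""
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()], dtype=float)
    sd = np.array([x.std(), x.std()])
    for _ in range(iters):
        # E-step: responsibility of each component for each window.
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reweighted proportions, means, and standard deviations.
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sd

rng = np.random.default_rng(0)
# Synthetic windows: 30% "coding" (statistic centred at 4), 70% "noncoding" (at 0).
x = np.concatenate([rng.normal(4, 1, 300), rng.normal(0, 1, 700)])

w, mu, sd = fit_two_normals(x)
coding_density = w[np.argmax(mu)]   # weight of the higher-mean component
print(coding_density)               # roughly 0.3, the true coding fraction
```

A real application would use a discriminative coding statistic (e.g. codon usage bias) per window; the mixture weight then estimates the fraction of windows that are coding.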


In rodents and nonhuman primates subjected to spinal cord lesion, neutralizing the neurite growth inhibitor Nogo-A has been shown to promote regenerative axonal sprouting and functional recovery. The goal of the present report was to re-examine the data on the recovery of primate manual dexterity using refined behavioral analyses and further statistical assessments, representing secondary outcome measures from the same manual dexterity test. Thirteen adult monkeys were studied; seven received an anti-Nogo-A antibody, whereas the other six were infused with a control antibody. Monkeys were trained to perform the modified Brinkman board task, which requires opposition of the index finger and thumb to grasp food pellets placed in vertically and horizontally oriented slots. Two parameters were quantified before and following spinal cord injury: (i) the standard 'score', defined as the number of pellets retrieved within 30 s from the two types of slots; and (ii) the newly introduced 'contact time', defined as the duration of digit contact with the food pellet before successful retrieval. After the lesion the hand was severely impaired in all monkeys; this was followed by progressive functional recovery. Remarkably, anti-Nogo-A antibody-treated monkeys recovered faster and significantly better than control antibody-treated monkeys, considering both the score for vertical and horizontal slots (Mann-Whitney test: P = 0.05 and 0.035, respectively) and the contact time (P = 0.008 and 0.005, respectively). Detailed analysis of the lesions excluded the possibility that this conclusion was caused by differences in lesion properties between the two groups of monkeys.
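The Mann-Whitney test used to compare the two treatment groups reduces to counting, over all treated/control pairs, how often the treated score exceeds the control score. A minimal sketch of the U statistic, with hypothetical scores rather than the study's data (a real analysis would also need the p-value, e.g. via scipy.stats.mannwhitneyu):

```python
# Sketch of the Mann-Whitney U statistic for comparing two independent groups.
# The scores below are hypothetical illustrations, not the study's measurements.

def mann_whitney_u(a, b):
    """U for sample a: count of pairs (x, y) with x > y; ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical Brinkman-board scores (pellets retrieved in 30 s).
anti_nogo = [12, 15, 16, 18, 20]   # treated monkeys
control   = [8, 9, 10, 11, 14]     # control-antibody monkeys

u = mann_whitney_u(anti_nogo, control)
print(u)   # 24.0 out of a maximum of 25 pairs
```

A U close to its maximum (here 5 × 5 = 25) indicates that treated scores almost always exceed control scores, which is the direction of effect the abstract reports.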


Background: Systematic approaches for identifying proteins involved in different types of cancer are needed. Experimental techniques such as microarrays are being used to characterize cancer, but validating their results can be a laborious task. Computational approaches are used to prioritize among genes putatively involved in cancer, usually by further analysis of experimental data. Results: We implemented a systematic method using the PIANA software that predicts cancer involvement of genes by integrating heterogeneous datasets. Specifically, we produced lists of genes likely to be involved in cancer by relying on: (i) protein-protein interactions; (ii) differential expression data; and (iii) structural and functional properties of cancer genes. The integrative approach that combines multiple sources of data achieved positive predictive values ranging from 23% (on a list of 811 genes) to 73% (on a list of 22 genes), outperforming the use of any of the data sources alone. We analyze a list of 20 cancer gene predictions, finding that most of them have recently been linked to cancer in the literature. Conclusion: Our approach to identifying and prioritizing candidate cancer genes can be used to produce lists of genes likely to be involved in cancer. Our results suggest that differential expression studies yielding high numbers of candidate cancer genes can be filtered using protein interaction networks.
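The positive predictive values quoted in the abstract are simply the fraction of predicted genes that are confirmed, PPV = TP / (TP + FP). A minimal sketch, with confirmed/unconfirmed counts chosen to be consistent with the 73% figure for the 22-gene list (the split 16/6 is an assumption; the paper reports only the percentage):

```python
# Sketch of the positive predictive value used to score a prediction list:
# PPV = TP / (TP + FP), the fraction of predicted cancer genes confirmed.
# The 16-confirmed / 6-unconfirmed split is assumed for illustration.

def positive_predictive_value(true_positives, false_positives):
    """Fraction of positive predictions that are correct."""
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(16, 6)   # a 22-gene list with 16 confirmed
print(round(ppv, 2))                     # 0.73, matching the abstract's 73%
```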


NovoTTF-100A (TTF) is a portable device delivering low-intensity, intermediate-frequency, alternating electric fields using noninvasive, disposable scalp electrodes. TTF interferes with tumor cell division, and it has been approved by the US Food and Drug Administration (FDA) for the treatment of recurrent glioblastoma (rGBM) based on data from a phase III trial. This presentation describes the updated survival data 2 years after completing recruitment. Adults with rGBM (KPS ≥ 70) were randomized (stratified by surgery and center) to either continuous TTF (20-24 h/day, 7 days/week) or efficacious chemotherapy based on best physician choice (BPC). The primary endpoint was overall survival (OS), and secondary endpoints were PFS6, 1-year survival, and QOL. Patients were randomized (28 US and European centers) to either TTF alone (n = 120) or BPC (n = 117). Patient characteristics were balanced, median age was 54 years (range, 23-80 years), and median KPS was 80 (range, 50-100). One quarter of the patients had debulking surgery, and over half of the patients were at their second or later recurrence. OS in the intent-to-treat (ITT) population was equivalent in TTF versus BPC patients (median OS, 6.6 vs. 6.0 months; n = 237; p = 0.26; HR = 0.86). With a median follow-up of 33.6 months, long-term survival in the TTF group was higher than that in the BPC group at 2, 3, and 4 years of follow-up (9.3% vs. 6.6%; 8.4% vs. 1.4%; 8.4% vs. 0.0%, respectively). Analysis of patients who received at least one treatment course demonstrated a survival benefit for TTF patients compared to BPC patients (median OS, 7.8 vs. 6.0 months; n = 93 vs. n = 117; p = 0.012; HR = 0.69). In this group, 1-year survival was 28% vs. 20%, and PFS6 was 26.2% vs. 15.2% (p = 0.034). TTF, a noninvasive, novel cancer treatment modality, shows significant therapeutic efficacy with promising long-term survival results. The impact of TTF was more pronounced when comparing only patients who received the minimal treatment course. A large-scale phase III trial in newly diagnosed GBM is ongoing.


Shrews of the genus Sorex are characterized by a Holarctic distribution, and relationships among extant taxa have never been fully resolved. Phylogenies have been proposed based on morphological, karyological, and biochemical comparisons, but these analyses often produced controversial and contradictory results. Phylogenetic analyses of partial mitochondrial cytochrome b gene sequences (1011 bp) were used to examine the relationships among 27 Sorex species. The molecular data suggest that Sorex comprises two major monophyletic lineages, one restricted mostly to the New World and one with a primarily Palearctic distribution. Furthermore, several sister-species relationships are revealed by the analysis. Based on the split between the Soricinae and Crocidurinae subfamilies, we used a 95% confidence interval for both the calibration of a molecular clock and the subsequent calculation of major diversification events within the genus Sorex. Our analysis does not support an unambiguous acceleration of the molecular clock in shrews, the estimated rate being similar to other estimates of mammalian mitochondrial clocks. In addition, the data presented here indicate that estimates from the fossil record greatly underestimate divergence dates among Sorex taxa.
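The molecular-clock step the abstract describes, calibrating a substitution rate on a dated split and then dating other splits, amounts to t = d / (2r), where d is pairwise sequence divergence and r the per-lineage rate. A minimal sketch with illustrative numbers (the calibration age and divergences below are assumptions, not the paper's estimates):

```python
# Sketch of a molecular-clock calculation: a calibration split of known age
# yields a substitution rate, which then dates other splits.
# All numbers are illustrative assumptions, not the paper's values.

def rate_from_calibration(pairwise_divergence, split_age_myr):
    """Per-lineage rate (subst/site/Myr): divergence accrues on both lineages."""
    return pairwise_divergence / (2.0 * split_age_myr)

def divergence_time(pairwise_divergence, rate):
    """Age of a split (Myr) given pairwise divergence and per-lineage rate."""
    return pairwise_divergence / (2.0 * rate)

# Hypothetical calibration: 20% cytochrome b divergence at a 20-Myr-old split.
r = rate_from_calibration(0.20, 20.0)    # 0.005 subst/site/Myr per lineage

# Date a hypothetical Sorex species pair showing 8% divergence.
print(divergence_time(0.08, r))          # 8.0 Myr
```

The factor of 2 reflects that divergence accumulates independently along both descendant lineages; omitting it would halve all rate estimates and double all dates.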


The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.


The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.
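The "at most one pilot symbol per transmit antenna per coherence interval" rule of thumb translates directly into a pilot overhead that shrinks as the channel varies more slowly. A minimal sketch of that overhead arithmetic (the antenna count and coherence lengths are illustrative assumptions):

```python
# Sketch: pilot overhead implied by the abstract's rule of thumb of at most
# one pilot symbol per transmit antenna per channel coherence interval.
# Antenna counts and coherence lengths below are illustrative assumptions.

def pilot_overhead(num_tx_antennas, coherence_symbols):
    """Fraction of each coherence interval spent on pilot symbols."""
    return num_tx_antennas / coherence_symbols

# 4 transmit antennas: fast fading (short coherence) vs slow fading.
print(pilot_overhead(4, 20))    # 0.2  -> 20% of symbols are pilots
print(pilot_overhead(4, 200))   # 0.02 ->  2% of symbols are pilots
```

In fast fading the coherence interval is short, so even this minimal pilot budget consumes a noticeable fraction of the transmitted symbols, which is where joint processing of pilots and data offers the largest gain.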