956 results for Applied identity-based encryption


Relevance: 30.00%

Abstract:

Excitation-continuous music instrument control patterns are often not explicitly represented in current sound synthesis techniques when applied to automatic performance. Both physical model-based and sample-based synthesis paradigms would benefit from a flexible and accurate instrument control model, enabling the improvement of naturalness and realism. We present a framework for modeling bowing control parameters in violin performance. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing control parameter signals. We model the temporal contour of bow velocity, bow pressing force, and bow-bridge distance as sequences of short cubic Bézier curve segments. Considering different articulations, dynamics, and performance contexts, a number of note classes are defined. Contours of bowing parameters in a performance database are analyzed at the note level by following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. As a result, contour analysis of the bowing parameters of each note yields an optimal representation vector that is sufficient for reconstructing the original contours with high fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures suitable for both the analysis and synthesis of bowing parameter contours. Using the estimated models, synthetic contours can be generated through a bow planning algorithm able to reproduce constraints imposed by the finite length of the bow. Rendered contours are successfully used in two preliminary synthesis frameworks: digital waveguide-based bowed string physical modeling and sample-based spectral-domain synthesis.
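The piecewise representation described in this abstract can be sketched in a few lines; the function names, the uniform sampling, and the segment encoding below are illustrative assumptions, not the authors' implementation:

```python
def bezier_cubic(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    s = 1.0 - t
    return s**3 * p0 + 3 * s**2 * t * p1 + 3 * s * t**2 * p2 + t**3 * p3

def contour(segments, samples_per_segment=16):
    """Piece a bowing-parameter contour (e.g. bow velocity over a note)
    together from a sequence of cubic segments, each a 4-tuple of control values."""
    out = []
    for p0, p1, p2, p3 in segments:
        out.extend(bezier_cubic(p0, p1, p2, p3, i / (samples_per_segment - 1))
                   for i in range(samples_per_segment))
    return out
```

Because a cubic Bézier interpolates its first and last control points, consecutive segments can be chained into a continuous contour simply by matching endpoint values.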

Relevance: 30.00%

Abstract:

Single-trial analysis of human electroencephalography (EEG) has been recently proposed for better understanding the contribution of individual subjects to a group-analysis effect as well as for investigating single-subject mechanisms. Independent Component Analysis (ICA) has been repeatedly applied to concatenated single-trial responses and at a single-subject level in order to extract those components that resemble activities of interest. More recently we have proposed a single-trial method based on topographic maps that determines which voltage configurations are reliably observed at the event-related potential (ERP) level taking advantage of repetitions across trials. Here, we investigated the correspondence between the maps obtained by ICA versus the topographies that we obtained by the single-trial clustering algorithm that best explained the variance of the ERP. To do this, we used exemplar data provided from the EEGLAB website that are based on a dataset from a visual target detection task. We show there to be robust correspondence both at the level of the activation time courses and at the level of voltage configurations of a subset of relevant maps. We additionally show the estimated inverse solution (based on low-resolution electromagnetic tomography) of two corresponding maps occurring at approximately 300 ms post-stimulus onset, as estimated by the two aforementioned approaches. The spatial distribution of the estimated sources significantly correlated and had in common a right parietal activation within Brodmann's Area (BA) 40. Despite their differences in terms of theoretical bases, the consistency between the results of these two approaches shows that their underlying assumptions are indeed compatible.
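The comparison in this abstract hinges on correlating voltage configurations obtained by the two approaches. A minimal sketch of such a spatial correlation between two scalp maps (the use of a plain Pearson correlation here is an assumption for illustration):

```python
import math

def topo_correlation(map_a, map_b):
    """Pearson spatial correlation between two scalp voltage configurations,
    each given as a vector of per-electrode voltages."""
    n = len(map_a)
    mean_a = sum(map_a) / n
    mean_b = sum(map_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(map_a, map_b))
    sd_a = math.sqrt(sum((a - mean_a) ** 2 for a in map_a))
    sd_b = math.sqrt(sum((b - mean_b) ** 2 for b in map_b))
    return cov / (sd_a * sd_b)
```

A correlation near 1 would indicate that an ICA component map and a cluster topography describe essentially the same voltage configuration up to scaling.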

Relevance: 30.00%

Abstract:

We propose a novel multifactor dimensionality reduction method for epistasis detection in small or extended pedigrees, FAM-MDR. It combines features of the Genome-wide Rapid Association using Mixed Model And Regression approach (GRAMMAR) with Model-Based MDR (MB-MDR). We focus on continuous traits, although the method is general and can be used for outcomes of any type, including binary and censored traits. When comparing FAM-MDR with Pedigree-based Generalized MDR (PGMDR), which is a generalization of Multifactor Dimensionality Reduction (MDR) to continuous traits and related individuals, FAM-MDR was found to outperform PGMDR in terms of power, in most of the considered simulated scenarios. Additional simulations revealed that PGMDR does not appropriately deal with multiple testing and consequently gives rise to overly optimistic results. FAM-MDR adequately deals with multiple testing in epistasis screens and is in contrast rather conservative, by construction. Furthermore, simulations show that correcting for lower order (main) effects is of utmost importance when claiming epistasis. As Type 2 Diabetes Mellitus (T2DM) is a complex phenotype likely influenced by gene-gene interactions, we applied FAM-MDR to examine data on glucose area-under-the-curve (GAUC), an endophenotype of T2DM for which multiple independent genetic associations have been observed, in the Amish Family Diabetes Study (AFDS). This application reveals that FAM-MDR makes more efficient use of the available data than PGMDR and can deal with multi-generational pedigrees more easily. In conclusion, we have validated FAM-MDR and compared it to PGMDR, the current state-of-the-art MDR method for family data, using both simulations and a practical dataset. FAM-MDR is found to outperform PGMDR in that it handles the multiple testing issue more correctly, has increased power, and efficiently uses all available information.

Relevance: 30.00%

Abstract:

Artifacts are present in most electroencephalography (EEG) recordings, making it difficult to interpret or analyze the data. In this paper, a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to raw EEG data. A synchrony measure is then applied to both the raw and the cleaned data in order to compare the improvement of the classification rate. Two classifiers are used: linear discriminant analysis and neural networks. In both cases, the classification rate improves by about 20%.
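The abstract does not name the synchrony measure used; as one common choice for EEG, the phase-locking value (PLV) between two channels can be sketched as follows:

```python
import cmath

def phase_locking_value(phases_a, phases_b):
    """Phase-locking value of two phase time series (in radians):
    1 means the phase difference is constant across samples,
    0 means the phase differences are uniformly spread on the circle."""
    n = len(phases_a)
    return abs(sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))) / n
```

In a pipeline like the one described, the PLV computed between channel pairs (on raw versus cleaned data) would serve as the feature fed to the classifiers.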

Relevance: 30.00%

Abstract:

A discussion is presented of daytime sky imaging and of techniques that may be applied to the analysis of full-color sky images to infer cloud macrophysical properties. Two different types of sky-imaging systems developed by the authors are described, one of which has been developed into a commercially available instrument. Retrievals of fractional sky cover from automated processing methods are compared to human retrievals, both from direct observations and from visual analyses of sky images. Although some uncertainty exists in fractional sky cover retrievals from sky images, for the commercially available sky imager this uncertainty is no greater than that attached to human observations. Thus, applying automatic digital image processing techniques to sky images is a useful way to complement, or even replace, traditional human observations of sky cover and, potentially, cloud type. Additionally, the possibility of inferring other cloud parameters, such as cloud brokenness and solar obstruction, further enhances the usefulness of sky imagers.
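One widely used way to retrieve fractional sky cover from full-color images is a per-pixel red-to-blue ratio threshold; the abstract does not state the paper's exact classifier, and the threshold value below is purely illustrative:

```python
def fractional_sky_cover(pixels, ratio_threshold=0.6):
    """Fraction of image pixels classified as cloud.
    pixels: iterable of (R, G, B) tuples. Cloud pixels tend to be whitish,
    so their red/blue ratio is high; clear sky is strongly blue (low ratio)."""
    cloudy = sum(1 for r, g, b in pixels if b > 0 and r / b > ratio_threshold)
    return cloudy / len(pixels)
```

Comparing this fraction against a human observer's okta estimate is the kind of validation the abstract describes.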

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed an upscaling procedure based on a Bayesian sequential simulation approach. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this upscaling approach is tested and verified by performing and comparing flow and transport simulation through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.

Relevance: 30.00%

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have much greater control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
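A toy 1-D version of the central idea, constraining a realization only at small lags, might look like the sketch below. The Gaussian perturbation and cooling schedule are illustrative assumptions; in the paper, perturbations are drawn from a distribution conditioned to the geophysical data:

```python
import math
import random

def variogram(z, max_lag):
    """Experimental semivariogram of a 1-D sequence for lags 1..max_lag."""
    return [sum((z[i] - z[i + h]) ** 2 for i in range(len(z) - h)) / (2 * (len(z) - h))
            for h in range(1, max_lag + 1)]

def anneal_small_lags(z, target, max_lag, steps=2000, t0=1.0, cooling=0.999, seed=0):
    """Perturb z by annealing so that only its small-lag variogram
    (lags <= max_lag) matches the target values."""
    rng = random.Random(seed)
    z = list(z)
    cost = sum((a - b) ** 2 for a, b in zip(variogram(z, max_lag), target))
    temp = t0
    for _ in range(steps):
        i = rng.randrange(len(z))
        old = z[i]
        z[i] = old + rng.gauss(0.0, 0.1)  # simple Gaussian perturbation (illustrative)
        new_cost = sum((a - b) ** 2 for a, b in zip(variogram(z, max_lag), target))
        if new_cost <= cost or rng.random() < math.exp(-(new_cost - cost) / temp):
            cost = new_cost          # accept the move
        else:
            z[i] = old               # reject and restore the old value
        temp *= cooling
    return z, cost
```

Because the objective touches only lags up to `max_lag`, longer-range structure in the starting field (here standing in for the geophysically resolved features) is left to assert itself.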

Relevance: 30.00%

Abstract:

The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially across the different approaches to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which then translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.

Relevance: 30.00%

Abstract:

This paper presents and illustrates a theoretical model for understanding the functions of identity, as well as the psychosocial mechanisms associated with its construction: the "Modelo Evolutivo y Funcional de la Identidad Mediada" (MEBIM). Identity, narratively mediated, serves a personal function oriented toward directing one's own life, as well as a sociocultural function linked to the pursuit of recognition of the rights of the social groups to which one feels attached. The factors associated with the construction of personal identity (possible selves, life transitions, affective bonds) and sociocultural identity (action-transformation and symbolic identification) are illustrated through 12 life histories conducted with mestizo and indigenous students of the Universidad Intercultural de Chiapas (Mexico). It is suggested that formal educational settings, such as school or university, should foster personal and sociocultural narratives in order to optimize identity in a world that is at once globalized and plural.

Relevance: 30.00%

Abstract:

Impressive developments in X-ray imaging are associated with X-ray phase contrast computed tomography based on grating interferometry, a technique that provides increased contrast compared with conventional absorption-based imaging. A new "single-step" method capable of separating phase information from other contributions has been recently proposed. This approach not only simplifies data-acquisition procedures, but, compared with the existing phase step approach, significantly reduces the dose delivered to a sample. However, the image reconstruction procedure is more demanding than for traditional methods and new algorithms have to be developed to take advantage of the "single-step" method. In the work discussed in this paper, a fast iterative image reconstruction method named OSEM (ordered subsets expectation maximization) was applied to experimental data to evaluate its performance and range of applicability. The OSEM algorithm with different subsets was also characterized by comparison of reconstruction image quality and convergence speed. Computer simulations and experimental results confirm the reliability of this new algorithm for phase-contrast computed tomography applications. Compared with the traditional filtered back projection algorithm, in particular in the presence of a noisy acquisition, it furnishes better images at a higher spatial resolution and with lower noise. We emphasize that the method is highly compatible with future X-ray phase contrast imaging clinical applications.
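The OSEM update itself is compact; below is a minimal sketch for a tiny linear system y ≈ Ax with a nonnegative unknown x. The toy dimensions and subset choice are illustrative, not the paper's phase-contrast geometry:

```python
def osem(A, y, subsets, iters=10):
    """Ordered-subsets EM for y ≈ A x with nonnegative x.
    A: list of measurement rows; y: measured projections;
    subsets: ordered lists of row indices, cycled through each iteration."""
    n = len(A[0])
    x = [1.0] * n                         # flat nonnegative starting image
    for _ in range(iters):
        for sub in subsets:
            # forward-project the current estimate over this subset's rows
            ratios = []
            for i in sub:
                proj = sum(A[i][j] * x[j] for j in range(n))
                ratios.append(y[i] / proj if proj > 0 else 0.0)
            # multiplicative EM update using only this subset
            for j in range(n):
                norm = sum(A[i][j] for i in sub)
                if norm > 0:
                    x[j] *= sum(A[i][j] * r for i, r in zip(sub, ratios)) / norm
    return x
```

Each subset pass applies a full EM-style multiplicative correction using only part of the data, which is what gives OSEM its convergence speed advantage over plain ML-EM.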

Relevance: 30.00%

Abstract:

We propose a class of models of social network formation based on a mathematical abstraction of the concept of social distance. Social distance attachment is represented by the tendency of peers to establish acquaintances via a decreasing function of the relative distance in a representative social space. We derive analytical results (corroborated by extensive numerical simulations) showing that the model reproduces the main statistical characteristics of real social networks: a large clustering coefficient, positive degree correlations, and the emergence of a hierarchy of communities. The model is confronted with the social network formed by people who share confidential information using the Pretty Good Privacy (PGP) encryption algorithm, the so-called PGP web of trust.
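The heart of this model class is a tie probability that decays with social distance. A minimal 1-D sketch follows; the specific kernel r(d) = 1/(1 + (d/b)^α) and the parameter values are one common choice, used here as an illustration rather than the paper's exact formulation:

```python
import random

def connection_probability(d, b=1.0, alpha=3.0):
    """Tie probability as a decreasing function of social distance d;
    b sets the characteristic distance, alpha the sharpness of the decay."""
    return 1.0 / (1.0 + (d / b) ** alpha)

def build_network(positions, b=1.0, alpha=3.0, seed=0):
    """Connect every pair of agents (given 1-D social coordinates)
    independently, with probability set by their social distance."""
    rng = random.Random(seed)
    edges = set()
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = abs(positions[i] - positions[j])
            if rng.random() < connection_probability(d, b, alpha):
                edges.add((i, j))
    return edges
```

Agents close in social space almost surely connect while distant agents rarely do, which is what drives the clustering and community structure the abstract reports.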

Relevance: 30.00%

Abstract:

Several approaches have been developed to estimate both the relative and absolute rates of speciation and extinction within clades based on molecular phylogenetic reconstructions of evolutionary relationships, according to an underlying model of diversification. However, the macroevolutionary models established for eukaryotes have scarcely been used with prokaryotes. We have investigated the rate and pattern of cladogenesis in the genus Aeromonas (γ-Proteobacteria, Proteobacteria, Bacteria) using the sequences of five housekeeping genes and an uncorrelated relaxed-clock approach. To our knowledge, until now this analysis has never been applied to all the species described in a bacterial genus and thus opens up the possibility of establishing models of speciation from sequence data commonly used in phylogenetic studies of prokaryotes. Our results suggest that the genus Aeromonas began to diverge between 248 and 266 million years ago, exhibiting a constant divergence rate through the Phanerozoic, which could be described as a pure birth process.
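A pure birth process is the simplest diversification model: each extant lineage speciates independently at a constant rate λ, so with n lineages the waiting time to the next speciation anywhere in the tree is exponential with rate nλ, and the expected lineage count grows as e^(λt). A minimal simulation sketch (illustrative only, not the authors' relaxed-clock machinery):

```python
import random

def yule_lineage_count(rate, t_total, seed=0):
    """Number of lineages after time t_total under a pure-birth (Yule)
    process started from a single lineage."""
    rng = random.Random(seed)
    t, n = 0.0, 1
    while True:
        wait = rng.expovariate(rate * n)  # next speciation event in the whole tree
        if t + wait > t_total:
            return n
        t += wait
        n += 1
```

Under this model, a lineage-through-time plot is linear on a log scale, which is the signature of the constant divergence rate the abstract describes for Aeromonas.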

Relevance: 30.00%

Abstract:

This project was undertaken to study the relationships between the performance of locally available asphalts and their physicochemical properties under Iowa conditions, with the ultimate objective of developing a local, performance-based asphalt specification for durable pavements. Physical and physicochemical tests were performed on three sets of asphalt samples: (a) twelve samples from local asphalt suppliers and their TFOT residues, (b) six core samples with known service records, and (c) a total of 79 asphalts from 10 pavement projects, including original, lab-aged and recovered asphalts from field mixes, as well as from lab-aged mixes. Tests included standard rheological tests, HP-GPC and TMA. Some specific viscoelastic tests (at 5 deg C) were run on the b samples and on some a samples. DSC and X-ray diffraction studies were performed on the a and b samples. Furthermore, NMR techniques were applied to some a, b and c samples. Efforts were made to identify physicochemical properties that correlate with physical properties known to affect field performance. The significant physicochemical parameters were used as the basis for an improved performance-based trial specification for Iowa to ensure more durable pavements.

Relevance: 30.00%

Abstract:

Following Erikson's seminal work on personal identity, Marcia's identity status model has been one of the most enduring paradigms. The Ego Identity Process Questionnaire (EIPQ; Balistreri, Busch-Rossnagel, & Geissinger, 1995) is a widely used measure of identity status. The purpose of this study was to evaluate the factor structure and the reliability of a French version of the EIPQ. The hypothesized structures were not confirmed. In light of the failed attempts to validate the original version, an alternative short-form version of the EIPQ (EIPQ-SF), maintaining the integrity of the original model, was developed in one sample and cross-validated in another. Additionally, theoretically consistent associations between the EIPQ-SF dimensions and self-esteem confirmed convergent validity. Overall, the results indicate that the French short-form version of the EIPQ may be a useful instrument for assessing identity statuses in adolescence and emerging adulthood.