939 results for data-driven simulation


Relevance: 80.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so that we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être of the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
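As an illustration of the kind of analysis described above, the following is a minimal sketch (not the thesis's code) of running independent component analysis on small image patches; with genuine natural images the learned components resemble Gabor-like simple-cell receptive fields, whereas the random image used here is only a runnable stand-in.

```python
# Minimal sketch: ICA on image patches; the random "image" is a stand-in.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
image = rng.standard_normal((256, 256))          # stand-in for a natural image
patch_size, n_patches = 8, 5000

# Sample random patches and flatten them into rows of a data matrix.
rows = rng.integers(0, image.shape[0] - patch_size, n_patches)
cols = rng.integers(0, image.shape[1] - patch_size, n_patches)
patches = np.stack([
    image[r:r + patch_size, c:c + patch_size].ravel()
    for r, c in zip(rows, cols)
])
patches -= patches.mean(axis=1, keepdims=True)   # remove the DC component

ica = FastICA(n_components=32, whiten="unit-variance", max_iter=500)
ica.fit(patches)
filters = ica.components_.reshape(-1, patch_size, patch_size)
print(filters.shape)                             # 32 learned "receptive fields"
```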

Relevance: 80.00%

Abstract:

Kafka On The Shore consists of three simple concrete letterforms floating on a gallery wall. Reminiscent of minimalist sculpture, the mathematical expression formed by the letterforms states that ‘r’ is greater than ‘g’. Despite this material simplicity, the solemn presentation of the formula suggests a sense of foreboding, a quiet menace. The work was created as a response to the economic theories of Thomas Piketty presented in his book Capital in the Twenty-First Century. The primary finding of Piketty’s data-driven research is the formula presented by the work: historically, wealth and inequality both flourish when the rate of return on capital (r) is greater than the rate of economic growth (g). With this simple mathematical summary, the book acts as a sobering indictment of the present state of economic inequality.

Relevance: 80.00%

Abstract:

Inadvertent failure of power transformers has serious consequences for power system reliability, economics, and revenue. Insulation is the weakest link in a power transformer, which prompts periodic inspection of the insulation's condition at different points in time. Close monitoring of the electrical, chemical, and other insulation properties that are sensitive to time-dependent degradation is mandatory to judge the condition of the equipment. The focus here is on data-driven Diagnostic Testing and Condition Monitoring (DTCM) specific to power transformers. The authors develop a Monte Carlo approach for augmenting the rather scanty experimental data normally acquired from power transformer prototypes. A validation procedure for estimating the accuracy of the resulting model is also described.
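The abstract does not detail the Monte Carlo procedure, so the following is only a hedged sketch of one common way to augment scanty test data: fit a simple parametric distribution to a handful of prototype measurements and draw synthetic samples from it. The breakdown-voltage figures are invented for illustration.

```python
# Hedged sketch: parametric Monte Carlo augmentation of a few measurements.
import numpy as np

measured_kv = np.array([152.0, 148.5, 157.2, 150.1, 154.8])   # few prototype tests (made up)
mu, sigma = measured_kv.mean(), measured_kv.std(ddof=1)

rng = np.random.default_rng(5)
synthetic_kv = rng.normal(mu, sigma, size=1000)                # augmented data set

# A simple sanity check of the augmented set against the originals.
print("measured  mean/std:", mu, sigma)
print("synthetic mean/std:", synthetic_kv.mean(), synthetic_kv.std(ddof=1))
```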

Relevance: 80.00%

Abstract:

The purpose of this research is to identify the optimal poverty policy for a welfare state. Poverty is defined by income. Policies for reducing poverty are considered primary, and those for reducing inequality secondary. Poverty is seen as a function of the income transfer system within a welfare state. This research presents a method for optimising this function for the purpose of reducing poverty, and applies it to the representative population sample in the Income Distribution Data. The SOMA simulation model is used. The iterative simulation process is continued until a level of poverty is reached at which no further improvements can be made; expenditures and taxes are kept in balance throughout. The result consists of two programmes. The first programme (the social assistance programme) was formulated using five social assistance parameters, all of which concern the social assistance norms for adults (€/month). In the second programme (the basic benefits programme), in which social assistance was frozen at its 2003 legislative level, the parameter with the strongest poverty-reducing effect turned out to be one of the basic unemployment allowances. This was followed by the norm of the national pension for a single person, two parameters related to the housing allowance, and the norm for financial aid for students of higher education institutions. The most effective financing parameter, as measured by the Gini coefficient, was in all programmes the capital taxation percentage. The programmes can also be examined in relation to their costs. The social assistance programme is significantly cheaper than the basic benefits programme and is therefore, with regard to poverty, the more cost-effective of the two. Public demand for raising the level of basic benefits thus does not seem to correspond to the most cost-effective poverty policy. Raising basic benefits mainly reduces poverty within the group whose benefits are raised, whereas raising social assistance appears to have a strong influence on the poverty of all population groups. The most significant outcome of this research is the development of a method through which a welfare state's income-transfer-based safety net, which has severely deteriorated in recent decades, might be mended. The only way of doing so involves either social assistance or certain forms of basic benefits, supplemented by adjustments to social assistance.
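The following toy sketch illustrates the iterative logic described above: raise one transfer parameter at a time, re-simulate the poverty rate, keep the step that helps most, and stop when no step improves matters or the budget is exhausted. The incomes, eligibility rules, and budget constraint are made-up stand-ins, not the SOMA model.

```python
# Toy sketch of greedy, simulation-driven parameter search (stand-in model).
import numpy as np

rng = np.random.default_rng(6)
incomes = rng.lognormal(mean=7.2, sigma=0.6, size=10_000)      # synthetic monthly incomes
poverty_line = 1000.0                                          # EUR/month, illustrative
params = {"social_assistance": 0.0, "basic_benefit": 0.0}      # EUR/month increases

def poverty_rate(p):
    # Stand-in eligibility rules: social assistance reaches the poorest 20%,
    # basic benefits a broader 40% of the population.
    sa_eligible = incomes < np.quantile(incomes, 0.20)
    bb_eligible = incomes < np.quantile(incomes, 0.40)
    adjusted = (incomes
                + sa_eligible * p["social_assistance"]
                + bb_eligible * p["basic_benefit"])
    return np.mean(adjusted < poverty_line)

step, budget, spent = 10.0, 400.0, 0.0                         # crude stand-in for budget balance
while spent + step <= budget:
    current = poverty_rate(params)
    trials = {name: poverty_rate(dict(params, **{name: params[name] + step}))
              for name in params}                              # try raising each parameter
    best = min(trials, key=trials.get)
    if trials[best] >= current:                                # no further improvement
        break
    params[best] += step
    spent += step

print(params, "poverty rate:", poverty_rate(params))
```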

Relevance: 80.00%

Abstract:

The purpose of this research is to examine whether short-term communication training can improve the communication capacity of working communities, and what the prerequisites for creating such capacity are. The subjects of this research were short-term communication trainings aimed at managers and experts in enterprises and communities. The research endeavors to find out how effective communication trainings should be devised and implemented, and what this requires from the client and the provider of the training service. The research data consist mostly of quantitative feedback collected at the end of a training day, as well as delayed interviews. The evaluations are based on a stakeholder approach; the stakeholders were the training participants, the clients who commissioned the trainings, and the communication trainers. The principal method of the qualitative analysis is data-driven content analysis. Two research instruments were constructed for the analysis and the presentation of the results: an evaluation circle for holistic evaluation, and a development matrix for structuring an effective training. The core concept of the matrix is the carrier wave effect, which is needed to carry the abstractions from the training into concrete practices in everyday life. The relevance of the results was tested in a pilot organization. The immediate assessments and the delayed evaluations gave very different pictures of the trainings. The immediate feedback was nearly commendable, but the effects carried forward into the everyday situations of the working community were small and the learning was rarely applied in practice. A training session that receives good feedback does not automatically develop individual competence, let alone that of the community. The results show that even short-term communication training can promote communication competence that eventually changes the working culture at the organizational level, provided that the training is designed as a process and that connections to the participants’ work are ensured. It is essential that all eight elements of the carrier wave effect are taken into account. The entire purchaser-provider process must function, without omitting the contribution of the participants themselves. The research illustrates the so-called bow-tie model of effective communication training based on the carrier wave effect. Testing the results in pilot trainings showed that a rather small change in the training approach may have a significant effect on the outcome of the training as well as on the effects that are carried on into the working community. The evaluation circle proved to be a useful tool that can be used in planning, executing, and evaluating training in practice. The development matrix serves as a tool for those producing the training service, those using the service, and those deciding on its purchase, when planning and evaluating training that sustainably improves communication capacity. Thus the evaluation circle also supports and helps ensure the long-term effects of short-term trainings. In addition to communication trainings, the tools developed in this research can be used for many other needs where an organization seeks to improve its operations and profitability through training.

Relevance: 80.00%

Abstract:

Deterministic models have been widely used to predict water quality in distribution systems, but their calibration requires extensive and accurate data sets for numerous parameters. In this study, alternative data-driven modeling approaches based on artificial neural networks (ANNs) were used to predict temporal variations of two important characteristics of water quality: chlorine residual and biomass concentration. The authors considered three types of ANN algorithms. Of these, the Levenberg-Marquardt algorithm provided the best results in predicting residual chlorine and biomass with both error-free and "noisy" data. The ANN models developed here can generate water quality scenarios of piped systems in real time to help utilities determine weak points of low chlorine residual and high biomass concentration and select optimum remedial strategies.
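As a hedged sketch of the modeling idea (not the authors' network), the example below fits a one-hidden-layer neural network to a synthetic chlorine-residual time series using the Levenberg-Marquardt algorithm via scipy.optimize.least_squares(method="lm").

```python
# Sketch: one-hidden-layer network fitted with Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.linspace(0.0, 48.0, 200)                     # hours
chlorine = 0.8 + 0.3 * np.sin(2 * np.pi * t / 24)   # synthetic diurnal pattern
chlorine += rng.normal(scale=0.02, size=t.size)     # "noisy" observations

n_hidden = 5

def unpack(p):
    w1 = p[:n_hidden]; b1 = p[n_hidden:2 * n_hidden]
    w2 = p[2 * n_hidden:3 * n_hidden]; b2 = p[-1]
    return w1, b1, w2, b2

def predict(p, x):
    w1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(np.outer(x, w1) + b1)           # (N, n_hidden)
    return hidden @ w2 + b2

def residuals(p):
    return predict(p, t / 48.0) - chlorine           # scale input to [0, 1]

p0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)
fit = least_squares(residuals, p0, method="lm")      # Levenberg-Marquardt
print("RMSE:", np.sqrt(np.mean(fit.fun ** 2)))
```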

Relevance: 80.00%

Abstract:

This paper presents a new algorithm for extracting Free-Form Surface Features (FFSFs) from a surface model. The extraction algorithm is based on a taxonomy of FFSFs modified from that proposed in the literature. A new classification scheme is proposed for FFSFs to enable their representation and extraction. The paper proposes a separating curve as the signature of an FFSF in a surface model. FFSFs are classified based on the characteristics of the separating curve (number and type) and the influence region (the region enclosed by the separating curve). A method to extract these entities is presented. The algorithm has been implemented and tested for various free-form surface features on different types of free-form surfaces (base surfaces) and is found to correctly identify and represent the features irrespective of the type of underlying surface. The representation and the extraction algorithm are both based on topology and geometry. The algorithm is data-driven and does not use any predefined templates. The definition presented for a feature is unambiguous and application-independent. The proposed classification of FFSFs can be used to develop an ontology to determine semantic equivalences for features to be exchanged, mapped, and used across PLM applications. (C) 2011 Elsevier Ltd. All rights reserved.
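Purely as an illustration of the classification scheme described above (not the paper's data structures), a feature could be represented by its separating curves and influence region roughly as follows.

```python
# Illustrative sketch of an FFSF record keyed on separating curves.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class CurveType(Enum):
    OPEN = "open"      # separating curve ends on the boundary of the base surface
    CLOSED = "closed"  # separating curve forms a loop interior to the base surface

@dataclass
class SeparatingCurve:
    curve_type: CurveType
    points: List[tuple]                 # sampled (u, v) points on the base surface

@dataclass
class FreeFormSurfaceFeature:
    separating_curves: List[SeparatingCurve]
    influence_region_faces: List[int]   # ids of faces enclosed by the separating curves

    def classify(self) -> str:
        """Classify by the number and type of separating curves."""
        n = len(self.separating_curves)
        kinds = sorted({c.curve_type.value for c in self.separating_curves})
        return f"{n} separating curve(s), types: {kinds}"

feature = FreeFormSurfaceFeature(
    separating_curves=[SeparatingCurve(CurveType.CLOSED, [(0.2, 0.3), (0.4, 0.3)])],
    influence_region_faces=[17, 18, 21],
)
print(feature.classify())
```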

Relevance: 80.00%

Abstract:

The recent focus of flood frequency analysis (FFA) studies has been on developing methods to model joint distributions of variables such as peak flow, volume, and duration that characterize a flood event, as comprehensive knowledge of a flood event is often necessary in hydrological applications. A diffusion-process-based adaptive kernel (D-kernel) is suggested in this paper for this purpose. It is data-driven, flexible, and, unlike most kernel density estimators, always yields a bona fide probability density function. It overcomes shortcomings associated with the use of conventional kernel density estimators in FFA, such as the boundary leakage problem and the normal reference rule. The potential of the D-kernel is demonstrated by application to synthetic samples of various sizes drawn from known unimodal and bimodal populations, and to five typical peak flow records from different parts of the world. It is shown to be effective when compared to the conventional Gaussian kernel and the best of seven commonly used copulas (Gumbel-Hougaard, Frank, Clayton, Joe, Normal, Plackett, and Student's t) in estimating the joint distribution of peak flow characteristics and extrapolating beyond historical maxima. Selection of the optimum number of bins is found to be critical in modeling with the D-kernel.
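For context, the following sketch shows the conventional bivariate Gaussian kernel density estimate that the D-kernel is compared against, applied to synthetic stand-ins for peak flow and flood volume; the D-kernel itself is not reproduced here.

```python
# Sketch: conventional bivariate Gaussian KDE on synthetic flood variables.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
peak_flow = rng.lognormal(mean=6.0, sigma=0.4, size=60)        # m^3/s, synthetic
volume = 20.0 * peak_flow * rng.lognormal(0.0, 0.2, size=60)   # correlated volume

kde = gaussian_kde(np.vstack([peak_flow, volume]))             # joint density estimate

# Evaluate the joint density on a coarse grid for inspection/plotting.
q = np.linspace(peak_flow.min(), peak_flow.max(), 50)
v = np.linspace(volume.min(), volume.max(), 50)
Q, V = np.meshgrid(q, v)
density = kde(np.vstack([Q.ravel(), V.ravel()])).reshape(Q.shape)
print(density.max())
```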

Relevance: 80.00%

Abstract:

Real-world biological systems such as the human brain are inherently nonlinear and difficult to model. However, most previous studies have employed either linear models or parametric nonlinear models for investigating brain function. In this paper, a novel application of a nonlinear measure of phase synchronization based on recurrences, the correlation between probabilities of recurrence (CPR), is proposed for studying connectivity in the brain. Being non-parametric, this method makes very few assumptions, making it suitable for investigating brain function in a data-driven way. CPR's utility is demonstrated with application to multichannel electroencephalographic (EEG) signals. Brain connectivity obtained using the thresholded CPR matrix of multichannel EEG signals showed clear differences in the number and pattern of connections between (a) epileptic seizure and pre-seizure states and (b) eyes-open and eyes-closed states. The corresponding brain headmaps provide meaningful insights about synchronization in the brain in those states. K-means clustering of connectivity parameters obtained from CPR and from linear correlation for global epileptic seizure and pre-seizure data showed significantly larger cluster centroid distances for CPR than for linear correlation, demonstrating the superior ability of CPR to discriminate seizure from pre-seizure. In the case of focal epilepsy, the headmap clearly enables identification of the focus of the epilepsy, which provides certain diagnostic value. (C) 2013 Elsevier Ltd. All rights reserved.
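A hedged sketch of the CPR measure follows, using its standard definition from the recurrence literature applied to scalar (non-embedded) signals; the synthetic "EEG channels", sampling rate, and recurrence threshold are illustrative choices only.

```python
# Sketch: correlation between probabilities of recurrence (CPR) of two signals.
import numpy as np

def recurrence_probability(x, eps, max_lag):
    """P(tau): fraction of points whose value recurs within eps after lag tau."""
    n = len(x)
    p = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        p[tau - 1] = np.mean(np.abs(x[: n - tau] - x[tau:]) < eps)
    return p

def cpr(x, y, eps_frac=0.1, max_lag=200):
    """Correlation between the recurrence probabilities of two signals."""
    px = recurrence_probability(x, eps_frac * np.std(x), max_lag)
    py = recurrence_probability(y, eps_frac * np.std(y), max_lag)
    return np.corrcoef(px, py)[0, 1]

# Two synthetic "EEG channels": a shared 10 Hz rhythm plus channel noise.
rng = np.random.default_rng(2)
t = np.arange(0, 10, 1 / 256.0)                      # 10 s at 256 Hz
common = np.sin(2 * np.pi * 10 * t)
ch1 = common + 0.5 * rng.standard_normal(t.size)
ch2 = common + 0.5 * rng.standard_normal(t.size)
print("CPR(ch1, ch2) =", cpr(ch1, ch2))
```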

Relevance: 80.00%

Abstract:

Complex biological systems such as the human brain can be expected to be inherently nonlinear and hence difficult to model. Most previous studies investigating brain function have used either linear models or parametric nonlinear models. In this paper, we propose a novel application of a nonlinear measure of phase synchronization based on recurrences, the correlation between probabilities of recurrence (CPR), to study seizures in the brain. The advantage of this nonparametric method is that it makes very few assumptions, making it possible to investigate brain functioning in a data-driven way. We demonstrate the utility of the CPR measure for the study of phase synchronization in multichannel seizure EEG recorded from patients with global as well as focal epilepsy. For the case of global epilepsy, brain synchronization using the thresholded CPR matrix of multichannel EEG signals showed clear differences between the results obtained for epileptic seizure and pre-seizure. Brain headmaps obtained for the seizure and pre-seizure cases provide meaningful insights about synchronization in the brain in those states. In the case of focal epilepsy, the headmap clearly enables us to identify the focus of the epilepsy, which provides certain diagnostic value. Comparative studies with linear correlation have shown that the nonlinear measure CPR outperforms the linear correlation measure. (C) 2014 Elsevier Ltd. All rights reserved.
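As a small follow-on sketch, the example below turns a matrix of pairwise synchronization values into a thresholded binary connectivity graph with per-channel connection counts, as described above; for brevity, plain absolute correlation stands in for CPR, and the threshold is illustrative.

```python
# Sketch: thresholded connectivity graph from a pairwise synchronization matrix.
import numpy as np

rng = np.random.default_rng(3)
n_channels, n_samples = 8, 2560
common = np.sin(2 * np.pi * 10 * np.arange(n_samples) / 256.0)
eeg = 0.7 * common + 0.5 * rng.standard_normal((n_channels, n_samples))

sync = np.abs(np.corrcoef(eeg))          # stand-in for the pairwise CPR matrix
np.fill_diagonal(sync, 0.0)

threshold = 0.5                          # illustrative threshold
connectivity = (sync > threshold).astype(int)
degree = connectivity.sum(axis=1)        # number of connections per channel
print("connections per channel:", degree)
```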

Relevance: 80.00%

Abstract:

Task-parallel languages are increasingly popular. Many of them provide expressive mechanisms for intertask synchronization. For example, OpenMP 4.0 will integrate data-driven execution semantics derived from the StarSs research language. Compared to the more restrictive data-parallel and fork-join concurrency models, the advanced features being introduced into task-parallel models in turn enable improved scalability through load balancing, memory latency hiding, mitigation of the pressure on memory bandwidth, and, as a side effect, reduced power consumption. In this article, we develop a systematic approach to compile loop nests into concurrent, dynamically constructed graphs of dependent tasks. We propose a simple and effective heuristic that selects the most profitable parallelization idiom for every dependence type and communication pattern. This heuristic enables the extraction of interband parallelism (cross-barrier parallelism) in a number of numerical computations that range from linear algebra to structured grids and image processing. The proposed static analysis and code generation alleviate the burden of a full-blown dependence resolver to track the readiness of tasks at runtime. We evaluate our approach and algorithms in the PPCG compiler, targeting OpenStream, a representative dataflow task-parallel language with explicit intertask dependences and a lightweight runtime. Experimental results demonstrate the effectiveness of the approach.
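The article targets OpenMP 4.0 and OpenStream; as a language-neutral illustration only, the toy Python sketch below builds a small graph of dependent tasks dynamically, where each task starts from the results of the tasks it depends on. The blocked stencil update is a made-up stand-in for the compiled loop nests.

```python
# Toy sketch (not OpenMP/OpenStream): dynamically built graph of dependent tasks.
from concurrent.futures import ThreadPoolExecutor

N_BLOCKS, N_STEPS = 4, 3

def update(left, mid, right):
    # Placeholder "computation" on one block per time step.
    return (left + mid + right) / 3.0

with ThreadPoolExecutor() as pool:
    # futures[b] holds the task producing block b of the current time step.
    futures = [pool.submit(lambda v=float(b): v) for b in range(N_BLOCKS)]
    for _ in range(N_STEPS):
        prev = futures
        futures = []
        for b in range(N_BLOCKS):
            deps = (prev[max(b - 1, 0)], prev[b], prev[min(b + 1, N_BLOCKS - 1)])
            # A new task is constructed whose inputs are the results of its
            # dependences; .result() waits until each producer has completed.
            futures.append(pool.submit(
                lambda d=deps: update(*(f.result() for f in d))))
        print([f.result() for f in futures])
```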

Relevance: 80.00%

Abstract:

Partial differential equations (PDEs) with multiscale coefficients are very difficult to solve due to the wide range of scales in their solutions. In this thesis, we propose some efficient numerical methods for both deterministic and stochastic PDEs based on the model reduction technique.

For the deterministic PDEs, the main purpose of our method is to derive an effective equation for the multiscale problem. An essential ingredient is to decompose the harmonic coordinate into a smooth part and a highly oscillatory part whose magnitude is small. Such a decomposition plays a key role in our construction of the effective equation. We show that the solution to the effective equation is smooth and can be resolved on a regular coarse mesh. Furthermore, we provide error analysis and show that the solution to the effective equation plus a correction term is close to the original multiscale solution.

For the stochastic PDEs, we propose a model-reduction-based data-driven stochastic method and a multilevel Monte Carlo method. In the multi-query setting, and on the assumption that the ratio of the smallest scale to the largest scale is not too small, we propose the multiscale data-driven stochastic method: we construct a data-driven stochastic basis and solve the coupled deterministic PDEs to obtain the solutions. For the tougher problems, we propose the multiscale multilevel Monte Carlo method: we apply the multilevel scheme to the effective equations and assemble the stiffness matrices efficiently on each coarse mesh grid. In both methods, the Karhunen-Loève (KL) expansion plays an important role in extracting the main parts of some stochastic quantities.
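For readers unfamiliar with it, the following is a minimal sketch of a truncated Karhunen-Loève expansion of a one-dimensional random field; the exponential covariance, grid, and truncation level are illustrative choices, not the thesis's setup.

```python
# Sketch: truncated Karhunen-Loeve expansion via covariance eigendecomposition.
import numpy as np

n, corr_len, n_modes = 200, 0.2, 10
x = np.linspace(0.0, 1.0, n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # exponential covariance

# Symmetric eigendecomposition; sort modes by decreasing eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Truncated KL expansion: field(x) = sum_k sqrt(lambda_k) * xi_k * phi_k(x).
rng = np.random.default_rng(4)
xi = rng.standard_normal(n_modes)
field = eigvecs[:, :n_modes] @ (np.sqrt(eigvals[:n_modes]) * xi)
print("fraction of variance captured:",
      eigvals[:n_modes].sum() / eigvals.sum())
```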

For both the deterministic and stochastic PDEs, numerical results are presented to demonstrate the accuracy and robustness of the methods. We also show the reduction in computational cost in the numerical examples.

Relevance: 80.00%

Abstract:

Among the different phase unwrapping approaches, weighted least-squares minimization methods are gaining attention. In these algorithms, the weighting coefficients are generated from a quality map. The intrinsic drawbacks of existing quality maps constrain the application of these algorithms: they often fail to handle wrapped phase data that contain error sources such as phase discontinuities, noise, and undersampling. In order to deal with such intractable wrapped phase data, a new weighted least-squares phase unwrapping algorithm based on a derivative variance correlation map is proposed. In this algorithm, the derivative variance correlation map, a novel quality map, faithfully reflects wrapped phase quality, ensuring a more reliable unwrapped result. The definition of the derivative variance correlation map and the principle of the proposed algorithm are presented in detail. The performance of the new algorithm has been tested on simulated wrapped data of a spherical surface and on experimental interferometric synthetic aperture radar (IFSAR) wrapped data. Computer simulation and experimental results have verified that the proposed algorithm can work effectively even when a wrapped phase map contains intractable error sources. (c) 2006 Elsevier GmbH. All rights reserved.
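As background, the sketch below computes a conventional phase-derivative-variance quality map of the kind this work builds on (the derivative variance correlation map itself is not reproduced); the wrapped spherical test surface and window size are illustrative.

```python
# Sketch: conventional derivative-variance quality map for wrapped phase.
import numpy as np

def wrap(phi):
    """Wrap phase into (-pi, pi]."""
    return np.angle(np.exp(1j * phi))

def derivative_variance_quality(wrapped, win=3):
    """Low local variance of wrapped phase derivatives -> high quality."""
    dx = wrap(np.diff(wrapped, axis=1, append=wrapped[:, -1:]))
    dy = wrap(np.diff(wrapped, axis=0, append=wrapped[-1:, :]))
    half = win // 2
    padx = np.pad(dx, half, mode="edge")
    pady = np.pad(dy, half, mode="edge")
    var = np.zeros_like(wrapped)
    for i in range(wrapped.shape[0]):
        for j in range(wrapped.shape[1]):
            wx = padx[i:i + win, j:j + win]
            wy = pady[i:i + win, j:j + win]
            var[i, j] = wx.var() + wy.var()
    return 1.0 / (1.0 + var)      # map variance to a (0, 1] quality score

# Synthetic spherical-cap phase, wrapped into (-pi, pi].
y, x = np.mgrid[-1:1:128j, -1:1:128j]
true_phase = 12 * np.sqrt(np.clip(2.0 - x**2 - y**2, 0, None))
quality = derivative_variance_quality(wrap(true_phase))
print(quality.min(), quality.max())
```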

Relevance: 80.00%

Abstract:

In 1828, a phenomenon was observed under the microscope in which tiny pollen grains immersed in a liquid at rest moved about randomly, tracing a disordered motion. The question was how to understand this movement. About 80 years later, Einstein (1905) developed a mathematical formulation to explain this phenomenon, known as Brownian motion, a theory that has since been developed further in many areas of knowledge, including, more recently, computational modeling. The aim here is to set out the basic assumptions underlying the simple random walk, considering experiments with and without a boundary value problem, for a better understanding of the use of algorithms applied to computational problems. The tools needed to apply simulation models of the simple random walk in the first three spatial dimensions are made explicit. Attention is directed both to the simple random walk itself and to possible applications to the gambler's ruin problem and the spread of viruses in computer networks. Algorithms for the one-dimensional simple random walk, with and without the boundary value problem, were developed on the R platform. They were similarly implemented for two- and three-dimensional spaces, enabling future applications to the problem of virus spread in computer networks and serving as motivation for the study of the heat equation, although a stronger grounding in concepts from physics and probability is needed to carry that application forward.
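The thesis implemented its algorithms in R; the following is a minimal sketch, in Python for consistency with the other examples here, of a one-dimensional simple random walk with absorbing boundaries in the spirit of the gambler's-ruin application mentioned above.

```python
# Sketch: 1-D simple random walk with absorbing boundaries (gambler's ruin).
import random

def random_walk_1d(start=10, lower=0, upper=20, p=0.5, max_steps=10_000):
    """Step +-1 per iteration until an absorbing boundary is hit."""
    position, path = start, [start]
    for _ in range(max_steps):
        if position in (lower, upper):   # absorbed at a boundary
            break
        position += 1 if random.random() < p else -1
        path.append(position)
    return path

random.seed(0)
walk = random_walk_1d()
print("steps:", len(walk) - 1, "final position:", walk[-1])
```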

Relevance: 80.00%

Abstract:

This paper describes work performed as part of the U.K. Alvey sponsored Voice Operated Database Inquiry System (VODIS) project in the area of intelligent dialogue control. The principal aims of the work were to develop a habitable interface for the untrained user, to investigate the degree to which dialogue control can be used to compensate for deficiencies in recognition performance, and to examine the requirements on dialogue control for generating natural speech output. A data-driven methodology is described based on the use of frames in which dialogue topics are organized hierarchically. The concept of a dynamically adjustable scope is introduced to permit adaptation to recognizer performance, and the use of historical and hierarchical contexts is described to facilitate the construction of contextually relevant output messages. © 1989.
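Purely as an illustration (not the VODIS implementation), the sketch below shows frames holding a hierarchy of dialogue topics together with an adjustable scope that can be narrowed when recognition performance is poor.

```python
# Illustrative sketch: hierarchical dialogue frames with an adjustable scope.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Frame:
    topic: str                         # e.g. "journey", "departure", "date"
    value: Optional[str] = None        # filled in from the recognizer
    children: List["Frame"] = field(default_factory=list)

def frames_in_scope(root: Frame, scope: int) -> List[Frame]:
    """Return the frames the dialogue manager will ask about / listen for.

    A small scope confines the dialogue to one topic at a time (helpful when
    recognition is poor); a larger scope allows several topics per utterance.
    """
    out = [root]
    if scope > 0:
        for child in root.children:
            out.extend(frames_in_scope(child, scope - 1))
    return out

inquiry = Frame("journey", children=[
    Frame("departure", children=[Frame("station"), Frame("time")]),
    Frame("arrival", children=[Frame("station")]),
])
print([f.topic for f in frames_in_scope(inquiry, scope=1)])
```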