983 results for Transactional Distance Theory


Relevance:

20.00%

Publisher:

Abstract:

The extensional theory of arrays is one of the most important theories for applications of SAT Modulo Theories (SMT) to hardware and software verification. Here we present a new T-solver for arrays in the context of the DPLL(T) approach to SMT. The main characteristics of our solver are: (i) no translation of writes into reads is needed, (ii) there is no axiom instantiation, and (iii) the T-solver interacts with the Boolean engine by asking it to split on equality literals between indices. As far as we know, this is the first accurate description of an array solver integrated in a state-of-the-art SMT solver and, unlike most state-of-the-art solvers, it is not based on lazy instantiation of the array axioms. Moreover, it is very competitive in practice, especially on problems that require heavy reasoning on array literals.
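The read-over-write case split that characteristic (iii) alludes to is standard array-theory reasoning and can be demonstrated with any off-the-shelf SMT solver. A minimal sketch using the Z3 Python API (Z3 is chosen purely for illustration; it is not the solver the abstract describes):

```python
# Read-over-write in the theory of arrays: reading store(A, i, v)
# at index j forces a case split on the index equality i == j.
from z3 import Array, Int, IntSort, Select, Store, Solver

A = Array('A', IntSort(), IntSort())
i, j, v = Int('i'), Int('j'), Int('v')

s = Solver()
s.add(Select(Store(A, i, v), j) != v)  # the read disagrees with v ...
s.add(i == j)                          # ... at the written index
print(s.check())  # unsat: with i == j the read must return v
```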

Relevance:

20.00%

Publisher:

Abstract:

Peer-reviewed

Relevance:

20.00%

Publisher:

Abstract:

In this thesis I argue that the psychological study of concepts and categorisation, and the philosophical study of reference are deeply intertwined. I propose that semantic intuitions are a variety of categorisation judgements, determined by concepts, and that because of this, concepts determine reference. I defend a dual theory of natural kind concepts, according to which natural kind concepts have distinct semantic cores and non-semantic identification procedures. Drawing on psychological essentialism, I suggest that the cores consist of externalistic placeholder essence beliefs. The identification procedures, in turn, consist of prototypes, sets of exemplars, or possibly also theory-structured beliefs. I argue that the dual theory is motivated both by experimental data and theoretical considerations. The thesis consists of three interrelated articles. Article I examines philosophical causal and description theories of natural kind term reference, and argues that they involve, or need to involve, certain psychological elements. I propose a unified theory of natural kind term reference, built on the psychology of concepts. Article II presents two semantic adaptations of psychological essentialism, one of which is a strict externalistic Kripkean-Putnamian theory, while the other is a hybrid account, according to which natural kind terms are ambiguous between internalistic and externalistic senses. We present two experiments, the results of which support the strict externalistic theory. Article III examines Fodor’s influential atomistic theory of concepts, according to which no psychological capacities associated with concepts constitute them, or are necessary for reference. I argue, contra Fodor, that the psychological mechanisms are necessary for reference.

Relevance:

20.00%

Publisher:

Abstract:

In this thesis X-ray tomography is discussed from the Bayesian statistical viewpoint. The unknown parameters are treated as random variables and, in contrast to traditional methods, the solution is obtained as a large sample from the distribution of all possible solutions. As an introduction to tomography, an inversion formula for the Radon transform on the plane is presented, and the widely used filtered backprojection algorithm is derived. Traditional regularization methods are presented in sufficient detail to ground the Bayesian approach. The measurements are photon counts at the detector pixels, so the assumption of a Poisson-distributed measurement error is justified. Often the error is assumed Gaussian, although the electronic noise caused by the measurement device can change the error structure; the assumption of Gaussian measurement error is discussed. The thesis discusses the use of different prior distributions in X-ray tomography. Especially in severely ill-posed problems, the choice of a suitable prior is the main part of the whole solution process. In the empirical part, the presented prior distributions are tested using simulated measurements, and the effects produced by different priors are shown. The use of a prior is shown to be indispensable in the case of severely ill-posed problems.
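For orientation, the two objects the abstract refers to can be written in their textbook forms (this is standard material, not the thesis's own notation; h denotes the ramp filter and A a linear forward projector):

```latex
% Filtered backprojection on the plane: ramp-filter each projection
% of the Radon transform (Rf)(\theta, s), then backproject:
f(x, y) = \int_0^{\pi} \bigl( (Rf)(\theta, \cdot) * h \bigr)
          (x\cos\theta + y\sin\theta)\, d\theta ,
\qquad \hat{h}(\omega) = |\omega| .

% Bayesian model with Poisson photon counts m_i at detector pixel i:
\pi(f \mid m) \;\propto\; \pi_{\mathrm{pr}}(f)
   \prod_i \frac{(Af)_i^{\,m_i}\, e^{-(Af)_i}}{m_i!} .
```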

Relevance:

20.00%

Publisher:

Abstract:

New economic and enterprise needs have increased the interest in, and utility of, grouping methods based on the theory of uncertainty. A fuzzy grouping (clustering) process is a key phase of knowledge acquisition and of complexity reduction across different groups of objects. Here we consider some elements of the theory of affinities and of uncertain pretopology that form a significant support tool for a fuzzy clustering process. A Galois lattice is introduced in order to provide a clearer view of the results. We carried out a homogeneous grouping of the economic regions of the Russian Federation and Ukraine. The results give a broad panorama of the regional economic situation in the two countries, as well as key guidelines for decision-making. The mathematical method is very sensitive to any changes in the regional economy. We thereby provide an alternative method for grouping under uncertainty.
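As a point of comparison, one classical route to a homogeneous grouping from an uncertain (fuzzy) similarity relation is the max-min transitive closure followed by alpha-cuts. The sketch below illustrates only this generic idea, not the affinity/pretopology machinery of the paper, and the toy similarity data are invented:

```python
# Max-min transitive closure of a fuzzy similarity relation, then
# alpha-cut classes: a classical fuzzy grouping, shown as a sketch.
import numpy as np

def maxmin_compose(r):
    """(R o R)_ij = max_k min(R_ik, R_kj)."""
    n = r.shape[0]
    return np.array([[np.minimum(r[i], r[:, j]).max()
                      for j in range(n)] for i in range(n)])

def transitive_closure(r):
    """Iterate R <- max(R, R o R) until a fixpoint is reached."""
    while True:
        r2 = np.maximum(r, maxmin_compose(r))
        if np.allclose(r2, r):
            return r2
        r = r2

def alpha_cut_groups(r, alpha):
    """Equivalence classes of the alpha-cut of a closed relation."""
    n = r.shape[0]
    labels = [-1] * n
    group = 0
    for i in range(n):
        if labels[i] == -1:
            for j in np.nonzero(r[i] >= alpha)[0]:
                labels[j] = group
            group += 1
    return labels

# toy similarity matrix over four hypothetical "regions"
R = np.array([[1.0, 0.8, 0.3, 0.2],
              [0.8, 1.0, 0.4, 0.2],
              [0.3, 0.4, 1.0, 0.7],
              [0.2, 0.2, 0.7, 1.0]])
print(alpha_cut_groups(transitive_closure(R), alpha=0.7))  # [0, 0, 1, 1]
```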

Relevance:

20.00%

Publisher:

Abstract:

In nature, variation in, for example, herbivory, wind exposure, moisture and pollution often creates variation in physiological stress and plant productivity. This variation is seldom clear-cut, but rather results in clines of decreasing growth and productivity towards the high-stress end. These clines of unidirectionally changing stress are generally known as ‘stress gradients’. Through its effect on plant performance, stress has the capacity to fundamentally alter the ecological relationships between individuals, and through variation in survival and reproduction it also causes evolutionary change, i.e. local adaptations to stress and eventually speciation. In certain conditions, local adaptations to environmental stress have been documented in a matter of just a few generations. In plant-plant interactions, the intensities of both negative interactions (competition) and positive ones (facilitation) are expected to vary along stress gradients. The stress-gradient hypothesis (SGH) suggests that net facilitation will be strongest in conditions of high biotic and abiotic stress, while a more recent ‘humpback’ model predicts the strongest net facilitation at intermediate levels of stress. Plant interactions on stress gradients, however, are affected by a multitude of confounding factors, making studies of facilitation-related theories challenging. Among these factors are plant ontogeny, spatial scale, and local adaptation to stress. The last of these has very rarely been included in facilitation studies, despite the potential co-occurrence of local adaptations and changes in net facilitation along stress gradients. Current theory would predict both competitive effects and facilitative responses to be weakest in populations locally adapted to withstand high abiotic stress. This thesis is based on six experiments, conducted both in greenhouses and in the field in Russia, Norway and Finland, with mountain birch (Betula pubescens subsp. czerepanovii) as the model species. The aims were to study potential local adaptations in multiple stress gradients (both natural and anthropogenic), changes in plant-plant interactions under conditions of varying stress (as predicted by the SGH), potential mechanisms behind intraspecific facilitation, and factors confounding plant-plant facilitation, such as spatiotemporal, ontogenetic, and genetic differences. I found rapid evolutionary adaptations (occurring within a time-span of 60 to 70 years) towards heavy-metal resistance around two copper-nickel smelters, a phenomenon that has resulted in a trade-off of decreased performance in pristine conditions. Heavy-metal-adapted individuals had lowered nickel uptake, indicating a possible mechanism behind the detected resistance. Seedlings adapted to heavy-metal toxicity were not co-resistant to other forms of abiotic stress, but showed co-resistance to biotic stress by being consumed to a lesser extent by insect herbivores. Conversely, populations from conditions of high natural stress (wind, drought etc.) showed no local adaptations, despite much longer evolutionary time scales. Due to decreasing emissions, I was unable to test the SGH in the pollution gradients. In natural stress gradients, however, plant performance was in accordance with the SGH, with the strongest host-seedling facilitation found at the high-stress sites in two different stress gradients.
Factors confounding this pattern included (1) plant size / ontogenetic status, with seedling-seedling interactions being competition-dominated and host-seedling interactions potentially switching towards competition as seedlings grow, and (2) spatial distance, with competition dominating at very short planting distances and facilitation being strongest at a distance of circa one quarter of the benefactor's height. I found no evidence for changes in facilitation with respect to the evolutionary histories of plant populations. Despite the support for the SGH, it may be that the ‘humpback’ model is more relevant when the main stressor is resource-related, while what I studied were the effects of ‘non-resource’ stressors (i.e. heavy-metal pollution and wind). The results have potential practical applications: the utilisation of locally adapted seedlings and of plant facilitation may increase the success of future restoration efforts in industrial barrens as well as in other wind-exposed sites. The findings also have implications with regard to the effects of global change in subarctic environments: the documented potential of mountain birch for rapid evolutionary change, together with the general lack of evolutionary ‘dead ends’ due to not (over)specialising to current natural conditions, increases the chances of this crucial forest-forming tree persisting even under the anticipated climate change.

Relevance:

20.00%

Publisher:

Abstract:

Cataloguing in progress

Relevance:

20.00%

Publisher:

Abstract:

Cataloguing in progress

Relevance:

20.00%

Publisher:

Abstract:

The proposed transdisciplinary field of ‘complexics’ would bring together all contemporary efforts in any specific disciplines, or by any researchers, specifically devoted to constructing tools, procedures, models and concepts intended for transversal application that are aimed at understanding and explaining the most interwoven and dynamic phenomena of reality. Our aim needs to be, as Morin says, not “to reduce complexity to simplicity, [but] to translate complexity into theory”. New tools for the conception, apprehension and treatment of the data of experience will need to be devised to complement existing ones and to enable us to make headway toward practices that better fit complexic theories. New mathematical and computational contributions have already continued to grow in number, thanks primarily to scholars in statistical physics and computer science, who are now taking an interest in social and economic phenomena. Certainly, these methodological innovations put into question, and again make us take note of, the excessive separation between the training received by researchers in the ‘sciences’ and in the ‘arts’. Closer collaboration between these two subsets would, in all likelihood, be much more energising and creative than their current mutual distance. Human complexics must be seen as multi-methodological, insofar as it necessarily combines quantitative-computational methodologies and more qualitative methodologies aimed at understanding the mental and emotional world of people. In the final analysis, however, models always have a narrative running behind them that reflects the attempts of a human being to understand the world, and models are always interpreted on that basis.

Relevance:

20.00%

Publisher:

Abstract:

This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray-level images are presented, and as a new application they are applied to gray-level image compression. Both new transforms extend the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer values (DTOCS) and a real-valued distance transform (EDTOCS) on gray-level images. Both the DTOCS and the EDTOCS require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image that defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map whose weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are shown. The best results are obtained using the Delaunay triangulation; the obtained image quality equals that of DCT images with a 4 x 4
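A sketch of the two-pass recursion the abstract describes is given below. The local step cost |gray(p) - gray(q)| + 1 follows the published DTOCS definition; the buffer layout, mask shape and iteration control are illustrative assumptions rather than the thesis's exact implementation:

```python
# Two-pass chamfer-style DTOCS sketch: forward and backward raster
# scans over the gray-level image, local cost |dGray| + 1.
import numpy as np

def dtocs(gray, region, n_rounds=4):
    """gray: 2-D gray-level image; region: boolean mask, True inside
    the region of calculation (distance is 0 on its complement)."""
    h, w = gray.shape
    g = gray.astype(float)
    d = np.where(region, np.inf, 0.0)   # the only extra buffer needed
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]  # upper-left half-mask
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]      # lower-right half-mask
    for _ in range(n_rounds):  # complicated images may need 3-10 rounds
        for y in range(h):                       # forward raster scan
            for x in range(w):
                for dy, dx in fwd:
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        c = d[ny, nx] + abs(g[y, x] - g[ny, nx]) + 1
                        if c < d[y, x]:
                            d[y, x] = c
        for y in range(h - 1, -1, -1):           # backward raster scan
            for x in range(w - 1, -1, -1):
                for dy, dx in bwd:
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        c = d[ny, nx] + abs(g[y, x] - g[ny, nx]) + 1
                        if c < d[y, x]:
                            d[y, x] = c
    return d
```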

Relevance:

20.00%

Publisher:

Abstract:

Scientific studies focusing specifically on references do not seem to exist. However, the utilization of references is an important practice for many companies involved in industrial marketing. The purpose of the study is to increase understanding of the utilization of references in international industrial marketing, in order to contribute to the development of a theory of reference behavior. Specifically, the study explores the modes of reference usage in industry, the factors affecting a supplier's reference behavior, and the question of how references are actually utilized. Due to the explorative nature of the study, a research design was followed in which theory and empirical studies alternated. An Exploratory Framework was developed to guide a pilot case study that resulted in Framework 1. The results of the pilot study guided an expanded literature review that was used to develop first a Structural Framework and a Process Framework, which were combined in Framework 2. Then the second empirical phase of the case study was conducted in the same (pilot) case company. In this phase, Decision Systems Analysis (DSA) was used as the analysis method. The DSA procedure consists of three interviewing waves: initial interviews, reinterviews, and validating interviews. Four reference decision processes were identified, described and analyzed in the form of flowchart descriptions. The flowchart descriptions were used to explore new constructs and to develop new propositions refining Framework 2. The quality of the study was ascertained by several measures in both empirical parts. Construct validity was ascertained by using multiple sources of evidence and by asking the key informant to review the pilot case report; the DSA method itself includes procedures assuring validity. Because of the choice to conduct a single case study, external validity was not pursued. High reliability was pursued through detailed documentation and thorough reporting of evidence. It was concluded that the core of the concept of reference is a customer relationship, regardless of the concrete forms a reference might take in its utilization. Depending on various contingencies, references might have various tasks within four roles: increasing (1) the efficiency of sales and sales management, (2) the efficiency of the business, (3) the effectiveness of marketing activities, and (4) effectiveness in establishing, maintaining and enhancing customer relationships. Thus, references have internal as well as external tasks. A supplier's reference behavior might be affected by many hierarchical conditions. Additionally, the empirical study showed that the supplier can utilize its references as a continuous, all-pervasive decision-making process through various practices. The process includes both individual and unstructured decision-making subprocesses. The proposed concept of reference can be used to guide a reference policy recommendable for companies for which the utilization of references is important. The significance of the study is threefold: it proposes the concept of reference, develops a framework of a supplier's reference behavior and its short-term process of utilizing references, and gives conceptual structure, in the form of four roles, to a phenomenon that is unstructured yet important in industrial marketing.

Relevance:

20.00%

Publisher:

Abstract:

A theoretical model for the noise properties of Schottky barrier diodes in the framework of the thermionic-emission–diffusion theory is presented. The theory incorporates both the noise induced by the diffusion of carriers through the semiconductor and the noise induced by the thermionic emission of carriers across the metal–semiconductor interface. Closed analytical formulas are derived for the junction resistance, series resistance, and contributions to the net noise localized in different space regions of the diode, all valid in the whole range of applied biases. An additional contribution to the voltage-noise spectral density is identified, whose origin may be traced back to the cross-correlation between the voltage-noise sources associated with the junction resistance and those for the series resistance. It is argued that an inclusion of the cross-correlation term as a new element in the existing equivalent-circuit models of Schottky diodes could explain the discrepancies between these models and experimental measurements or Monte Carlo simulations.
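For reference, the corresponding quantities in the pure thermionic-emission (textbook) limit look as follows; the paper's thermionic-emission–diffusion formulas generalize these, and the cross-correlation term S_V^cross is exactly what vanishes in this simplified uncorrelated picture:

```latex
% Ideal thermionic-emission I-V and the junction (differential)
% resistance derived from it:
I = I_s \left( e^{qV/kT} - 1 \right), \qquad
R_j = \left( \frac{dI}{dV} \right)^{-1} = \frac{kT}{q\,(I + I_s)} .

% Full shot noise of the junction current, and the voltage-noise
% spectral density when junction and series-resistance sources are
% taken as uncorrelated (S_V^cross = 0 in this textbook picture):
S_I = 2q\,(I + 2 I_s), \qquad
S_V = S_I R_j^2 + 4 k T R_s + S_V^{\mathrm{cross}} .
```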