16 results for interval-valued fuzzy set
in Helda - Digital Repository of the University of Helsinki
Abstract:
The study focused on the different ways that forest-related rights can be devolved to the local level according to the current legal frameworks in Laos, Nepal, Vietnam, Kenya, Mozambique and Tanzania. The eleven case studies represented the main ways in which forest-related rights can be devolved to communities or households in these countries. The objectives of this study were to 1) analyse the contents and extent of forest-related rights that can be devolved to the local level, 2) develop an empirical typology that represents the main types of devolution, and 3) compare the cases against a theoretical ideal type to assess in what way and to what extent the cases are similar to or differ from the theoretical construct. Fuzzy set theory, Qualitative Comparative Analysis and ideal type analysis were used in analysing the case studies and in developing an empirical typology. The theoretical framework, which guided data collection and analyses, was based on institutional economics and theories on property rights, common pool resources and collective action. On the basis of the theoretical and empirical knowledge, the most important attributes of rights were defined as use rights, management rights, exclusion rights, transfer rights and the duration and security of the rights. The ideal type was defined as one where local actors have been devolved comprehensive use rights, extensive management rights, rights to exclude others from the resource and rights to transfer these rights. In addition, the rights are to be secure and held perpetually. The ideal type was used to structure the analysis and as a tool against which the cases were analysed. The contents, extent and duration of the devolved rights varied greatly. In general, the results show that devolution has mainly meant the transfer of use rights to the local level, and has not really changed the overall state control over forest resources. 
In most cases the right holders participate in, or have only a limited role in, decision making regarding the harvesting and management of the resource. There was a clear tendency to devolve the rights to enforce rules and to monitor resource use and condition more extensively than the powers to decide on the management and development of the resource. The empirical typology differentiated between five types of devolution, characterised by the devolution of 1) restricted use and control rights, 2) extensive use rights but restricted control rights, 3) extensive rights, 4) insecure, short-term use and restricted control rights, and 5) insecure extensive rights. Overall, the case studies' conformity to the ideal type was very low: only two cases were similar to the ideal type, while all other cases differed considerably from it. Restricted management rights were the most common reason for the low conformity to the ideal type (eight cases). In three cases, the short term of the rights, restricted transfer rights, restricted use rights or restricted exclusion rights were the reason, or one of the reasons, for the low conformity to the ideal type. In two cases the rights were not secure.
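In fuzzy-set analysis, a case's membership in an ideal type defined as a conjunction of conditions is conventionally the minimum of its memberships in the individual conditions. A minimal sketch of this conformity scoring, with hypothetical attribute names and scores that are not taken from the actual case studies:

```python
# Fuzzy-set ideal type conformity: membership in a conjunction of conditions
# is the minimum of the memberships in the individual conditions (standard
# fuzzy-set QCA convention). Attribute names and scores are hypothetical.

def ideal_type_membership(case: dict) -> float:
    """Membership in the ideal type = min across all rights attributes."""
    return min(case.values())

# Hypothetical case: scores in [0, 1] for each devolved right.
case = {
    "use_rights": 0.9,
    "management_rights": 0.3,   # restricted management rights
    "exclusion_rights": 0.8,
    "transfer_rights": 0.7,
    "security_duration": 0.8,
}

score = ideal_type_membership(case)
# A membership below 0.5 means the case is more out of the ideal type than in it.
conforms = score > 0.5
```

Under this convention a single weak attribute, such as restricted management rights, is enough to pull a case's conformity below the crossover point.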
Abstract:
The topic of my doctoral thesis is to demonstrate the usefulness of incorporating tonal and modal elements into a pitch-web square analysis of Béla Bartók's (1881-1945) opera, 'A kékszakállú herceg vára' ('Duke Bluebeard's Castle'). My specific goal is to demonstrate that different musical materials, which exist as foreground melodies or long-term key progressions, are unified by the unordered pitch set {0,1,4}, which becomes prominent in different sections of Bartók's opera. In Bluebeard's Castle, the set {0,1,4} is also found as a subset of several tetrachords: {0,1,4,7}, {0,1,4,8}, and {0,3,4,7}. My claim is that {0,1,4} serves to link music materials between themes, between sections, and also between scenes. This study develops an analytical method, drawn from various theoretical perspectives, for conceiving superposed diatonic spaces within a hybrid pitch-space comprised of diatonic and chromatic features. The integrity of diatonic melodic lines is retained, which allows for a non-reductive understanding of diatonic superposition, without appealing to pitch centers or specifying complete diatonic collections. Through combining various theoretical insights of the Hungarian scholar Ernő Lendvai, and the American theorists Elliott Antokoletz, Paul Wilson and Allen Forte, as well as the composer himself, this study gives a detailed analysis of the opera's pitch material in a way that combines, complements, and expands upon the studies of those scholars. The analyzed pitch sets are represented on Aarre Joutsenvirta's note-web square, which adds a new aspect to the field of Bartók analysis. Keywords: Bartók, Duke Bluebeard's Castle (Op. 11), Ernő Lendvai, axis system, Elliott Antokoletz, intervallic cycles, intervallic cells, Allen Forte, set theory, interval classes, interval vectors, Aarre Joutsenvirta, pitch-web square, pitch-web analysis.
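The subset claim above can be checked mechanically with ordinary pitch-class set arithmetic mod 12. A small sketch (this is plain set-class containment, not the pitch-web square itself; the helper names are my own):

```python
# Check that some transposition or inversion of the pitch-class set {0,1,4}
# is contained in each tetrachord named in the abstract. Arithmetic is mod 12.

def transpositions(pcs):
    return [frozenset((p + n) % 12 for p in pcs) for n in range(12)]

def t_and_i_forms(pcs):
    inverted = [(-p) % 12 for p in pcs]
    return set(transpositions(pcs) + transpositions(inverted))

def contains_form(superset, pcs):
    sup = frozenset(superset)
    return any(form <= sup for form in t_and_i_forms(pcs))

cell = {0, 1, 4}
tetrachords = [{0, 1, 4, 7}, {0, 1, 4, 8}, {0, 3, 4, 7}]
results = [contains_form(t, cell) for t in tetrachords]
# Each tetrachord contains a transposed or inverted form of {0,1,4};
# e.g. {3,4,7} inside {0,3,4,7} is {0,1,4} transposed up three semitones.
```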
Abstract:
Objectives. The sentence span task is a complex working memory span task used for estimating total working memory capacity for both processing (sentence comprehension) and storage (remembering a set of words). Several traditional models of working memory suggest that performance on these tasks relies on phonological short-term storage. However, long-term memory effects, as well as the effects of expertise and strategies, have challenged this view. This study uses a working memory task that aids the creation of retrieval structures in the form of stories, which have been shown to form integrated structures in long-term memory. The research question is whether sentence and story contexts boost memory performance in a complex working memory task. The hypothesis is that storage of the words in the task takes place in long-term memory. Evidence of this would be better recall for words as parts of sentences than for separate words, and, particularly, a beneficial effect for words as part of an organized story. Methods. Twenty stories consisting of five sentences each were constructed, and the stimuli in all experimental conditions were based on these sentences and sentence-final words, reordered and recombined for the other conditions. Participants read aloud sets of five sentences that either formed a story or did not. In one condition they had to report all the last words at the end of the set; in another, they memorised an additional separate word with each sentence. The sentences were presented on the screen one word at a time (500 ms). After the presentation of each sentence, the participant verified a statement about the sentence. After five sentences, the participant repeated back the words in their correct positions. Experiment 1 (n=16) used immediate recall; Experiment 2 (n=21) used both immediate recall and recall after a distraction interval (the operation span task).
In Experiment 2 a distracting mental arithmetic task was presented instead of recall in half of the trials, and an individual word was added before each sentence in the two experimental conditions in which the participants were to memorize the sentence-final words. Subjects also performed a listening span task (Experiment 1) or an operation span task (Experiment 2) to allow comparison of the estimated span and performance in the story task. Results were analysed using correlations, repeated measures ANOVA and a chi-square goodness-of-fit test on the distribution of errors. Results and discussion. Both the relatedness of the sentences (the story condition) and the inclusion of the words in sentences helped memory. An interaction showed that the story condition had a greater effect on last words than on separate words. The beneficial effect of the story was shown in all serial positions. The effects remained in delayed recall. When the sentences formed stories, performance in verifying the statements about sentence context was better. This, as well as the differing distributions of errors in the different experimental conditions, suggests that different levels of representation are in use in the different conditions. In the story condition, these representations could take the form of an organized memory structure, a situation model. The other working memory tasks had only a few weak correlations with the story task. This could indicate that different processes are in use in the tasks. The results do not support short-term phonological storage, but are instead compatible with the words being encoded into LTM during the task.
Abstract:
The aim of this thesis was to increase our knowledge about the effects of seed origin on the timing of height growth cessation and field performance of silver birch from different latitudes, with special attention paid to browsing damage by moose in young birch plantations. The effect of seed origin latitude and sowing time on the timing of height growth cessation of first-year seedlings was studied in a greenhouse experiment with seven seed origins (lat. 58°–67°N). Variation in critical night length (CNL) for 50% bud set within two latitudinally distant stands (60° and 67°N) was studied in three phytotron experiments. Browsing by moose on 5- to 11-year-old silver birch saplings from latitudinally different seed origins (53°–67°N) was studied in a field experiment in southern Finland. Yield and stem quality of 22-year-old silver birch trees of Baltic, Finnish and Russian origin (54°–63°N) and the effect of latitudinal seed transfers were studied in two provenance trials, at Tuusula in southern Finland and Viitasaari in central Finland. The timing of height growth cessation depended systematically on the latitude of seed origin and the sowing date. The more northern the seed origin, the earlier the growth cessation and the shorter the growth period. Later sowing dates delayed growth cessation but also shortened the growth period. The mean CNL of the southern ecotype, 6.3 ± 0.2 h (95% confidence interval), was longer than that of the northern ecotype, 3.1 ± 0.3 h. Within-ecotype variance of the CNL was higher in the northern ecotype (0.484 h²) than in the southern ecotype (0.150 h²). Browsing by moose decreased with increasing latitude of seed origin and sapling height. Origins transferred from more southern latitudes were more heavily browsed than the more northern native ones. Southern Finnish seed origins produced the highest volume per unit area in central Finland (lat. 63°11'N).
Estonian and north Latvian stand seed origins, and the southern Finnish plus-tree origins, were the most productive in southern Finland (lat. 60°21'N). Latitudinal seed transfer distance had a significant effect on survival, stem volume/ha and the proportion of trees with a stem defect. The relationship of both survival and stem volume/ha to the latitudinal seed transfer distance was curvilinear. Volume was increased by transferring seed from ca. 2 degrees of latitude to the south. A longer transfer from the south, or a transfer from the north, decreased the yield. The proportion of trees with a stem defect increased linearly with the latitudinal seed transfer distance from the south.
Abstract:
A composition operator is a linear operator between spaces of analytic or harmonic functions on the unit disk, which precomposes a function with a fixed self-map of the disk. A fundamental problem is to relate properties of a composition operator to the function-theoretic properties of the self-map. During recent decades these operators have been very actively studied in connection with various function spaces. The study of composition operators lies in the intersection of two central fields of mathematical analysis: function theory and operator theory. This thesis consists of four research articles and an overview. In the first three articles the weak compactness of composition operators is studied on certain vector-valued function spaces. A vector-valued function takes its values in some complex Banach space. In the first and third articles sufficient conditions are given for a composition operator to be weakly compact on different versions of vector-valued BMOA spaces. In the second article characterizations are given for the weak compactness of a composition operator on harmonic Hardy spaces and spaces of Cauchy transforms, provided the functions take values in a reflexive Banach space. Composition operators are also considered on certain weak versions of the above function spaces. In addition, the relationships between different vector-valued function spaces are analyzed. In the fourth article weighted composition operators are studied on the scalar-valued BMOA space and its subspace VMOA. A weighted composition operator is obtained by first applying a composition operator and then a pointwise multiplier. A complete characterization is given for the boundedness and compactness of a weighted composition operator on BMOA and VMOA. Moreover, the essential norm of a weighted composition operator on VMOA is estimated. These results generalize many previously known results about composition operators and pointwise multipliers on these spaces.
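The two operators defined in prose above can be stated compactly. With φ an analytic self-map of the disk and u a pointwise multiplier:

```latex
% Composition operator induced by a self-map \varphi of the unit disk,
% and the weighted composition operator with multiplier symbol u:
\[
  C_\varphi f = f \circ \varphi,
  \qquad
  uC_\varphi f = u \cdot (f \circ \varphi).
\]
```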
Abstract:
Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. Classical examples are genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search for only non-redundant rules, which express the real causes of dependence without any occasional extra factors. The extra factors do not add any new information on the dependence, but can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10000-20000 attributes. Still, the results are globally optimal, which is a remarkable improvement over the existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data, or whether the data still contains better but undiscovered dependencies.
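One of the significance measures named above, Fisher's exact test, can be computed for a single rule X->A directly from its contingency counts as a hypergeometric tail probability. A minimal stdlib sketch with invented counts (this illustrates only the scoring of one rule, not the pruning or search machinery of the thesis):

```python
# One-sided Fisher's exact test for a dependency rule X -> A:
# the probability of seeing at least the observed co-occurrence count
# under independence of X and A (hypergeometric tail).
from math import comb

def fisher_one_sided(n_xa: int, n_x: int, n_a: int, n: int) -> float:
    """P(co-occurrence >= n_xa) under independence.

    n_xa: rows containing both X and A; n_x: rows containing X;
    n_a: rows containing A; n: total number of rows.
    """
    total = comb(n, n_x)
    k_max = min(n_x, n_a)
    return sum(
        comb(n_a, k) * comb(n - n_a, n_x - k)
        for k in range(n_xa, k_max + 1)
    ) / total

# Hypothetical data: 10 rows; X and A each occur in 5 rows, always together.
p = fisher_one_sided(n_xa=5, n_x=5, n_a=5, n=10)
# A small p-value means the rule is unlikely to arise by chance alone.
```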
Abstract:
Electric activity of the heart consists of repeated cardiomyocyte depolarizations and repolarizations. Abnormalities in repolarization predispose to ventricular arrhythmias. In the body surface electrocardiogram, ventricular repolarization generates the T wave. Several electrocardiographic measures have been developed, both for clinical and research purposes, to detect repolarization abnormalities. The aim of the study was to investigate modifiers of ventricular repolarization, with a focus on the relationship of left ventricular mass, antihypertensive drugs, and common gene variants to electrocardiographic repolarization parameters. The prognostic value of repolarization parameters was also assessed. The study subjects originated from a population of more than 200 middle-aged hypertensive men attending the GENRES hypertension study, and from an epidemiological survey, the Health 2000 Study, which included more than 6000 participants. Ventricular repolarization was analysed from digital standard 12-lead resting electrocardiograms with two QT-interval based repolarization parameters (QT interval, T-wave peak to T-wave end interval) and with a set of four T-wave morphology parameters. The results showed that in hypertensive men, a linear change in repolarization parameters is present even in the normal range of left ventricular mass, and that even mild left ventricular hypertrophy is associated with potentially adverse electrocardiographic repolarization changes. In addition, treatments with losartan, bisoprolol, amlodipine, and hydrochlorothiazide have divergent short-term effects on repolarization parameters in hypertensive men. Analyses of the general population sample showed that single nucleotide polymorphisms in the KCNH2, KCNE1, and NOS1AP genes are associated with changes in QT-interval based repolarization parameters, but not consistently with T-wave morphology parameters.
T-wave morphology parameters, but not QT interval or T-wave peak to T-wave end interval, provided independent prognostic information on mortality. The prognostic value was specifically related to cardiovascular mortality. The results indicate that, in hypertension, altered ventricular repolarization is already present in mild left ventricular mass increase, and that commonly used antihypertensive drugs may relatively rapidly and treatment-specifically modify electrocardiographic repolarization parameters. Common variants in cardiac ion channel genes and NOS1AP gene may also modify repolarization-related arrhythmia vulnerability. In the general population, T-wave morphology parameters may be useful in the risk assessment of cardiovascular mortality.
Abstract:
Self-similarity, a concept taken from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity often refers to multi-scalar feature repetition in a set of relationships, and it is commonly valued as an indication of musical coherence and consistency. This investigation provides a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. This research analyzes the modalities of such correlations, exploring their general and particular traits, and their operational bounds. Looking forward in this direction, the notion of analogy is used as a rich concept through its two definitions quoted in the Classical literature: proportion and paradigm, enormously valuable in establishing measurement, likeness and affinity criteria. Using quantitative and qualitative methods, evidence is presented to justify a parallel study of different modalities of musical self-similarity. For this purpose, original arguments by Benoît B. Mandelbrot are revised, alongside a systematic critique of the literature on the subject. Furthermore, connecting Charles S. Peirce's synechism with Mandelbrot's fractality is one of the main developments of the present study. This study provides elements for explaining Bolognesi's (1983) conjecture, which states that the most primitive, intuitive and basic musical device is self-reference, extending its functions and operations to self-similar surfaces.
In this sense, this research suggests that, with various modalities of self-similarity, synecdochic intersemiosis acts as a system of systems in coordination with greater or lesser development of structural consistency, and with greater or lesser contextual dependence.
Abstract:
We present a distributed algorithm that finds a maximal edge packing in O(Δ + log* W) synchronous communication rounds in a weighted graph, independent of the number of nodes in the network; here Δ is the maximum degree of the graph and W is the maximum weight. As a direct application, we have a distributed 2-approximation algorithm for minimum-weight vertex cover, with the same running time. We also show how to find an f-approximation of minimum-weight set cover in O(f²k² + fk log* W) rounds; here k is the maximum size of a subset in the set cover instance, f is the maximum frequency of an element, and W is the maximum weight of a subset. The algorithms are deterministic, and they can be applied in anonymous networks.
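The vertex cover result builds on a classical idea: a maximal edge packing (or, in the unweighted case, a maximal matching) certifies a 2-approximate cover. The sketch below shows only the simple sequential, unweighted analogue of this idea, not the distributed weighted algorithm of the paper:

```python
# Sequential 2-approximation for unweighted vertex cover via a greedy
# maximal matching: for every edge with both endpoints uncovered,
# take both endpoints. Each matched edge forces at least one of its
# endpoints into any optimal cover, so the result is at most 2x optimal.

def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # edge is unmatched: take both endpoints
    return cover

# Example: a path 0-1-2-3. Greedy matches (0,1) and (2,3).
edges = [(0, 1), (1, 2), (2, 3)]
cover = vertex_cover_2approx(edges)
# The cover {0,1,2,3} touches every edge; the optimum {1,2} has size 2,
# so the guarantee |cover| <= 2 * OPT holds here with equality.
```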
Abstract:
Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although hypertext has been studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to it remain relatively rare. This study examines coherence negotiation in hypertext, with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, making use of text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to cognitively operate between local and global coherence by processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence.
Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be in effect in the way coherence is actively manipulated in hypertext narratives.
Abstract:
Various Tb theorems play a key role in modern harmonic analysis. They provide characterizations of the boundedness of Calderón-Zygmund type singular integral operators. The general philosophy is that to conclude the boundedness of an operator T on some function space, one needs only to test it on some suitable function b. The main object of this dissertation is to prove very general Tb theorems. The dissertation consists of four research articles and an introductory part. The framework is general with respect to the domain (a metric space), the measure (an upper doubling measure) and the range (a UMD Banach space). Moreover, the testing conditions used are weak. In the first article a (global) Tb theorem on non-homogeneous metric spaces is proved. One of the main technical components is the construction of a randomization procedure for the metric dyadic cubes. The difficulty lies in the fact that metric spaces do not, in general, have a translation group. Also, the measures considered are more general than in the existing literature. This generality is genuinely important for some applications, including the result of Volberg and Wick concerning the characterization of measures for which the analytic Besov-Sobolev space embeds continuously into the space of square integrable functions. In the second article a vector-valued extension of the main result of the first article is considered. This theorem is a new contribution to the vector-valued literature, since previously such general domains and measures were not allowed. The third article deals with local Tb theorems in both the homogeneous and non-homogeneous situations. A modified version of the general non-homogeneous proof technique of Nazarov, Treil and Volberg is extended to cover the case of upper doubling measures. This technique is also used in the homogeneous setting to prove local Tb theorems with the weak testing conditions introduced by Auscher, Hofmann, Muscalu, Tao and Thiele.
This gives a completely new and direct proof of such results utilizing the full force of non-homogeneous analysis. The final article has to do with sharp weighted theory for maximal truncations of Calderón-Zygmund operators. This includes a reduction to certain Sawyer-type testing conditions, which are in the spirit of Tb theorems and thus of the dissertation. The article extends the sharp bounds previously known only for untruncated operators, and also proves sharp weak type results, which are new even for untruncated operators. New techniques are introduced to overcome the difficulties introduced by the non-linearity of maximal truncations.