Abstract:
In this dissertation I study language complexity from a typological perspective. Since the structuralist era, it has been assumed that local complexity differences in languages are balanced out in cross-linguistic comparisons and that complexity is not affected by the geopolitical or sociocultural aspects of the speech community. However, these assumptions have seldom been studied systematically from a typological point of view. My objective is to define complexity so that it is possible to compare it across languages and to approach its variation with the methods of quantitative typology. My main empirical research questions are: i) does language complexity vary in any systematic way in local domains, and ii) can language complexity be affected by the geographical or social environment? These questions are studied in three articles, whose findings are summarized in the introduction to the dissertation. In order to enable cross-language comparison, I measure complexity as the description length of the regularities in an entity; I separate it from difficulty, focus on local instead of global complexity, and break it up into different types. This approach helps avoid the problems that plagued earlier metrics of language complexity. My approach to grammar is functional-typological in nature, and the theoretical framework is basic linguistic theory. I delimit the empirical research functionally to the marking of core arguments (the basic participants in the sentence). I assess the distributions of complexity in this domain with multifactorial statistical methods and use different sampling strategies, implementing, for instance, the Greenbergian view of universals as diachronic laws of type preference. My data come from large and balanced samples (up to approximately 850 languages), drawn mainly from reference grammars. 
The results suggest that various significant trends occur in the marking of core arguments in regard to complexity and that complexity in this domain correlates with population size. These results provide evidence that linguistic patterns interact among themselves in terms of complexity, that language structure adapts to the social environment, and that there may be cognitive mechanisms that limit complexity locally. My approach to complexity and language universals can therefore be successfully applied to empirical data and may serve as a model for further research in these areas.
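The abstract above operationalizes complexity as the description length of the regularities in an entity. One rough, language-agnostic way to approximate description length is compressed size; the sketch below is purely illustrative (it uses zlib as a stand-in metric, which is my assumption, not the dissertation's actual measure): a highly regular string admits a short description, while an irregular one resists compression.

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Crude proxy for complexity: the length in bytes of a lossless
    compressed encoding of `data` (shorter = more regular)."""
    return len(zlib.compress(data, level=9))

# A maximally regular string: one short pattern repeated many times.
regular = b"na" * 200

# An irregular string of the same length: seeded pseudo-random bytes.
rng = random.Random(0)
irregular = bytes(rng.getrandbits(8) for _ in range(400))
```

On these two 400-byte inputs, the regular string compresses to a far shorter description than the pseudo-random one, matching the intuition that regularity lowers description length.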
Abstract:
This dissertation examines the impacts of energy and climate policies on the energy and forest sectors, focusing on the case of Finland. The thesis consists of an introductory article and four separate studies. The dissertation was motivated by climate concerns and the increasing demand for renewable energy, in particular the renewable energy consumption and greenhouse gas emission reduction targets of the European Union. In Finland, both the forest and energy sectors play key roles in achieving these targets. In fact, the separation between the forest and energy sectors is diminishing, as the energy sector uses increasing amounts of wood in energy production and the forest sector is becoming an increasingly important energy producer. The objective of this dissertation is to identify and measure the impacts of climate and energy policies on the forest and energy sectors. In climate policy the focus is on emissions trading, and in energy policy on the promotion of renewable forest-based energy use. The dissertation relies on empirical numerical models based on microeconomic theory: numerical partial equilibrium mixed complementarity problem models were constructed to study the markets under scrutiny. The separate studies focus on co-firing of wood biomass with fossil fuels, liquid biofuel production in the pulp and paper industry, and the impacts of climate policy on the pulp and paper sector. The dissertation shows that policies promoting wood-based energy may have unexpected negative impacts: when a feed-in tariff is imposed together with emissions trading, the production of renewable electricity in some plants might decrease as the emissions price increases. The dissertation also shows that in liquid biofuel production, an investment subsidy may cause high direct policy costs and other negative impacts compared to other policy instruments.
The results of the dissertation also indicate that from the climate mitigation perspective, perfect competition is the favored wood market structure, at least if the emissions trading system is not global. In conclusion, this dissertation suggests that when promoting the use of wood biomass in energy production, the favored policy instruments are subsidies that directly promote renewable energy production (i.e., a production subsidy, renewables subsidy, or feed-in premium). The policy instrument should also be designed to depend on the emissions price or on the price of the substitute. In addition, this dissertation shows that when planning policies to promote wood-based renewable energy, the goals of the policy scheme should be clear before decisions are made on the choice of policy instruments.
Abstract:
Channel assignment in multi-channel multi-radio wireless networks poses a significant challenge due to the scarcity of channels available in the wireless spectrum. Further, care has to be taken to account for the interference characteristics of the nodes in the network, especially when nodes are in different collision domains. This work views the problem of channel assignment in multi-channel multi-radio networks with multiple collision domains as a non-cooperative game in which the objective of each player is to maximize its individual utility by minimizing its interference. Necessary and sufficient conditions are derived for a channel assignment to be a Nash equilibrium (NE), and the efficiency of the NE is analyzed by deriving a lower bound on the price of anarchy of this game. A new fairness measure in the multiple collision domain context is proposed, and necessary and sufficient conditions for NE outcomes to be fair are derived. The equilibrium conditions are then applied to solve the channel assignment problem through three proposed algorithms, based on perfect/imperfect information, which rely on explicit communication between the players to arrive at an NE. A no-regret learning algorithm, the Freund and Schapire Informed algorithm, which has the additional advantage of low overhead in terms of information exchange, is proposed and its convergence to stabilizing outcomes is studied. New performance metrics are proposed, and extensive simulations are carried out in Matlab to obtain a thorough understanding of the performance of these algorithms on various topologies with respect to these metrics. It was observed that the proposed algorithms achieve good convergence to NE, resulting in efficient channel assignment strategies.
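The Freund and Schapire algorithm referred to above is a multiplicative-weights (Hedge) no-regret learner. The sketch below is a drastically simplified, full-information illustration for two interfering radios and two channels; the learning rate, starting weights, and expected-loss update rule are my assumptions, not the thesis's exact formulation. Each player keeps a weight per channel, takes as loss of a channel the neighbor's probability of occupying it (expected interference), and reweights multiplicatively, which drives play toward the anti-coordinated Nash equilibrium.

```python
import math

def hedge_channel_game(rounds: int = 500, eta: float = 0.1):
    """Two interfering players, two channels. Each round, the loss of
    channel c for a player is the neighbor's probability of occupying c;
    weights are updated multiplicatively (Hedge / multiplicative weights).
    Returns each player's final channel distribution."""
    # Slightly asymmetric starting weights break the symmetric tie:
    # player 0 leans toward channel 0, player 1 toward channel 1.
    w = [[1.1, 1.0], [1.0, 1.1]]
    for _ in range(rounds):
        # Current mixed strategies (normalized weights).
        p = [[wi / sum(ws) for wi in ws] for ws in w]
        new_w = []
        for i, ws in enumerate(w):
            opp = p[1 - i]  # the neighbor's channel distribution
            # Expected interference on channel c is opp[c]; shrink the
            # weight of congested channels.
            new_w.append([ws[c] * math.exp(-eta * opp[c]) for c in range(2)])
        # Renormalize each round to avoid numerical underflow.
        w = [[wi / sum(ws) for wi in ws] for ws in new_w]
    return [[wi / sum(ws) for wi in ws] for ws in w]
```

Under these assumptions the dynamics settle onto disjoint channels: player 0 concentrates on channel 0 and player 1 on channel 1, i.e., the zero-interference assignment.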
Abstract:
We have developed a theory for an electrochemical way of measuring the statistical properties of a nonfractally rough electrode. We obtained an expression for the current transient on a rough electrode that exhibits three time regions: short and long time limits and the transition region between them. The expressions for these time ranges are exploited to extract morphological information about the surface roughness. In the short and long time regimes, we extract information on various morphological features such as the roughness factor, average roughness, curvature, correlation length, dimensionality of roughness, and a polynomial approximation for the correlation function. Formulas for the surface structure factors (the measure of surface roughness) of rough surfaces in terms of measured reversible and diffusion-limited current transients are also obtained. Finally, we explore the feasibility of making such measurements.
Abstract:
A new method based on analysis of a single diffraction pattern is proposed to measure deflections in micro-cantilever (MC) based sensor probes, achieving typical deflection resolutions of 1 nm and surface stress changes of 50 µN/m. The proposed method employs a double MC structure in which the deflection of one micro-cantilever relative to the other, caused by surface stress changes, results in a linear shift of the intensity maxima of the Fraunhofer diffraction pattern of the transilluminated MC. Such shifts in the intensity maxima of a particular order along the length of the structure can be measured to an accuracy of 0.01 mm, leading to the proposed sensitivity of deflection measurement in a typical micro-cantilever. This method can overcome the fundamental measurement sensitivity limit set by diffraction and by the pointing stability of the laser beam in the widely used Optical Beam Deflection method (OBDM).
Abstract:
Notched three-point bend (TPB) specimens were tested under crack mouth opening displacement (CMOD) control at a rate of 0.0004 mm/s, and acoustic emissions (AE) were simultaneously monitored during the fracture process. It was observed that AE energy could be related to fracture energy. An experimental study was carried out to understand the behavior of AE energy with respect to concrete parameters such as strength and size. In this study, AE energy was used as a quantitative measure of the size-independent specific fracture energy of concrete beams, and the concepts of boundary effect and local fracture energy were used to obtain size-independent AE energy, from which size-independent fracture energy was obtained.
Abstract:
The problem of quantifying the intelligence of humans, and of intelligent systems, has been a challenging and controversial topic. IQ tests have traditionally been used to quantify human intelligence based on the results of tests designed by psychologists, but in general it is very difficult to quantify intelligence. In this paper the authors consider a simple question-answering (Q-A) system and use it to quantify intelligence. The authors quantify intelligence as a vector with three components: a measure of knowledge in asking questions, the effectiveness of the questions asked, and the correctness of deduction. The authors formalize these parameters and have conducted experiments on humans to measure them.
Abstract:
The suitability of the European Centre for Medium-Range Weather Forecasts (ECMWF) operational wind analysis for the period 1980-1991 for studying interannual variability is examined. The changes in the model and the analysis procedure are shown to give rise to a systematic and significant trend in the large-scale circulation features. A new method of removing the systematic errors at all levels, using multivariate EOF analysis, is presented. The objectively detrended analysis of the three-dimensional wind field agrees well with the independent Florida State University (FSU) wind analysis at the surface: the interannual variations in the detrended surface analysis agree well in amplitude as well as in spatial pattern with those of the FSU analysis. Therefore, the detrended analyses at other levels are also expected to be useful for studies of variability and predictability on interannual time scales. It is demonstrated that this trend in the wind field is due to a shift in the climatologies from the period 1980-1985 to the period 1986-1991.
Abstract:
Two smectite samples with different layer charges were pillared using hydroxy-aluminium oligomers at an OH/Al ratio of 2.5 and at pH 4.3 to 4.6. Pillaring was carried out under different conditions of ageing, temperature, and base addition time of the pillaring solution, and also in the presence of the nonionic surfactant polyoxyethylene sorbitan monooleate (Tween-80). The primary objective of preparing the samples under different conditions was to introduce varied quantities of aluminium oligomer between the layers and to study the effect on the properties of the pillared products. A simple method was followed to estimate the amount of interlayer aluminium. A quantity called the pillar density number (PDN), based on the ratio of interlayer Al adsorbed to the CEC of the parent clay, has been used effectively to evaluate the nature of the resulting pillared product. For a given clay, the PDN was found to correlate well with the sharpness of the d(001) peaks for the air-dried samples. The calculated number of pillars varied from 3.00 × 10^18 to 5.32 × 10^18 per meq of charge. The present study shows that a higher PDN is indicative of better thermal stability, so the pillar density number may be conveniently used as a measure of the thermal stability of pillared samples.
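As defined in the abstract, the pillar density number is simply the ratio of interlayer Al adsorbed to the cation exchange capacity (CEC) of the parent clay. A minimal sketch of that bookkeeping follows; the numeric values in the example are invented for illustration and are not taken from the study.

```python
def pillar_density_number(interlayer_al_meq: float, cec_meq: float) -> float:
    """Pillar density number (PDN): interlayer Al adsorbed, in
    milliequivalents, divided by the CEC of the parent clay in the
    same units (so the quantities must share a basis, e.g. meq/100 g)."""
    if cec_meq <= 0:
        raise ValueError("CEC must be positive")
    return interlayer_al_meq / cec_meq

# Hypothetical sample: 76 meq/100 g of interlayer Al on a clay
# with a CEC of 95 meq/100 g gives a PDN of 0.8.
pdn = pillar_density_number(76.0, 95.0)
```

A higher PDN would then, per the study's finding, suggest better thermal stability of the pillared product.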
Abstract:
A new approach based on occupation measures is introduced for studying stochastic differential games. For two-person zero-sum games, the existence of values and optimal strategies for both players is established for various payoff criteria. For N-person games, the existence of equilibria in Markov strategies is established for various cases.
Abstract:
Seizure electroencephalography (EEG) was recorded from two channels, right (Rt) and left (Lt), during bilateral electroconvulsive therapy (ECT) (n = 12) and unilateral ECT (n = 12). The EEG was also acquired on a microcomputer and analyzed without knowledge of the clinical details. EEG recordings of both ECT procedures yielded seizures of comparable duration. The Strength Symmetry Index (SSI) was computed from the early- and mid-seizure phases using the fractal dimension of the EEG. The seizures of unilateral ECT were characterized by significantly smaller SSI in both phases, and more unilateral than bilateral ECT seizures had a smaller-than-median SSI in both phases. The seizures also differed on other measures reported in the literature. The findings indicate that the SSI may be a potential measure of seizure adequacy, which remains to be validated in future research.
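The SSI above is built on the fractal dimension of the EEG, but the abstract does not spell out the computation. The sketch below pairs a standard Higuchi fractal-dimension estimator with a hypothetical min/max symmetry ratio of the two channels' dimensions; both the choice of estimator and the ratio form are my assumptions, offered only to illustrate the kind of quantity involved.

```python
import math
import random

def higuchi_fd(x, kmax=8):
    """Higuchi estimate of the fractal dimension of a 1-D signal:
    slope of log L(k) against log(1/k) over curve lengths L(k)."""
    n = len(x)
    logs = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            n_steps = (n - m - 1) // k
            if n_steps < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_steps + 1))
            # Higuchi normalization of the decimated curve length.
            lengths.append(dist * (n - 1) / (n_steps * k * k))
        logs.append((math.log(1.0 / k), math.log(sum(lengths) / len(lengths))))
    # Least-squares slope of log L(k) vs log(1/k).
    mx = sum(p[0] for p in logs) / len(logs)
    my = sum(p[1] for p in logs) / len(logs)
    num = sum((p[0] - mx) * (p[1] - my) for p in logs)
    den = sum((p[0] - mx) ** 2 for p in logs)
    return num / den

def strength_symmetry_index(fd_right, fd_left):
    """Hypothetical symmetry index: ratio of the smaller to the larger
    channel fractal dimension (1.0 = perfectly symmetric)."""
    return min(fd_right, fd_left) / max(fd_right, fd_left)
```

A smooth signal has a Higuchi dimension near 1, while a noise-like one approaches 2, so asymmetric right/left signal character pushes this illustrative index below 1.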
Abstract:
The conventional definition of redundancy is applicable to skeletal structural systems only, whereas the concept of redundancy has never been discussed in the context of a continuum. Structures in civil engineering generally constitute a combination of both skeletal and continuum segments. Hence, this paper presents a generalized definition of redundancy in terms of structural response sensitivity, which is applicable to both continuum and discrete structures. In contrast to the conventional definition of redundancy, which is assumed to be fixed for a given structure and is believed to be independent of loading and material properties, the new definition depends on the strength and response of the structure at a given stage of its service life. The redundancy measure proposed in this paper is linked to structural response sensitivities; thus, the structure can have different degrees of redundancy during its lifetime, depending on the response sensitivity under consideration. It is believed that this new redundancy measure will be more relevant in structural evaluation, damage assessment, and reliability analysis of structures at large.
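The idea of tying redundancy to response sensitivity can be seen in a toy example (the model, load, and finite-difference scheme here are my own illustration, not the paper's formulation): for a load carried by springs in parallel, the more independent load paths there are, the less the displacement responds to a change in any one member's stiffness.

```python
def displacement(load: float, stiffnesses) -> float:
    """Static displacement of a load carried by springs in parallel:
    u = P / sum(k_i)."""
    return load / sum(stiffnesses)

def response_sensitivity(load: float, stiffnesses, member: int = 0,
                         h: float = 1e-6) -> float:
    """Finite-difference sensitivity of the displacement response to a
    perturbation of one member's stiffness, du/dk_member."""
    bumped = list(stiffnesses)
    bumped[member] += h
    return (displacement(load, bumped) - displacement(load, stiffnesses)) / h
```

In this toy setting the two-spring (redundant) system has a displacement sensitivity of about -0.25 per unit stiffness change versus -1.0 for the single-spring (determinate) system, so a sensitivity-based measure would rate the former as more redundant.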
Abstract:
We introduce a multifield comparison measure for scalar fields that helps in studying relations between them. The comparison measure is insensitive to noise in the scalar fields and to noise in their gradients. Further, it can be computed robustly and efficiently. Results from the visual analysis of various data sets from climate science and combustion applications demonstrate the effective use of the measure.