903 results for Mixing rule


Relevance:

20.00%

Publisher:

Abstract:

The control of molecular architectures may be essential for optimizing materials properties when producing luminescent devices from polymers, especially in the blue region of the spectrum. In this article, we report on the fabrication of Langmuir-Blodgett (LB) films of polyfluorene copolymers mixed with the phospholipid dimyristoyl phosphatidic acid (DMPA). The copolymers poly[(9,9-dioctylfluorene)-co-phenylene] (copolymer 1) and poly[(9,9-dioctylfluorene)-co-quaterphenylene] (copolymer 2) were synthesized via the Suzuki reaction. Copolymer 1 could not form a monolayer on its own, but it yielded stable films when mixed with DMPA. In contrast, Langmuir monolayers could be formed from either the neat copolymer 2 or its mixture with DMPA. Surface pressure and surface potential measurements, in addition to Brewster angle microscopy, indicated that DMPA provided a suitable matrix for copolymer 1 to form a stable Langmuir film, amenable to transfer as LB films, while enhancing the ability of copolymer 2 to form LB films with enhanced emission, as indicated by fluorescence spectroscopy. Because high emission was obtained with the mixed LB films, and since the molecular-level interactions between the film components can be tuned by changing the experimental conditions to allow for further optimization, one may envisage applications of these films in optical devices such as organic light-emitting diodes (OLEDs).

Relevance:

20.00%

Publisher:

Abstract:

Felsic microgranular enclaves with structures indicating that they interacted in a plastic state with their chemically similar host granite are abundant in the Maua Pluton, SE Brazil. Larger plagioclase xenocrysts are in textural disequilibrium with the enclave groundmass and show complex zoning patterns with partially resorbed An-rich cores (locally with patchy textures) surrounded by more sodic rims. In situ laser ablation-(multi-collector) inductively coupled plasma mass spectrometry trace element and Sr isotopic analyses performed on the plagioclase xenocrysts indicate open-system crystallization; however, no evidence of derivation from more primitive basic melts is observed. The An-rich cores have more radiogenic initial Sr isotopic ratios that decrease towards the outermost part of the rims, which are in isotopic equilibrium with the matrix plagioclase. These profiles may have been produced by either (1) diffusional re-equilibration after rim crystallization from the enclave-forming magma, as indicated by relatively short calculated residence times, or (2) episodic contamination with a decrease of the contaminant ratio proportional to the extent to which the country rocks were isolated by the crystallization front. Profiles of trace elements with high diffusion coefficients would require unrealistically long residence times, and can be modeled in terms of fractional crystallization. A combination of trace element and Sr isotope data suggests that the felsic microgranular enclaves from the Maua Pluton are the products of interaction between end-member magmas that had similar compositions, thus recording "self-mixing" events.
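For reference, the fractional-crystallization modelling invoked above is usually based on the Rayleigh fractionation law; a standard form (not quoted from the study itself, with symbols chosen here for illustration) is

\[ C_L = C_0 \, F^{\,D-1}, \]

where \(C_L\) is the concentration of a trace element in the residual liquid, \(C_0\) its initial concentration, \(F\) the fraction of melt remaining, and \(D\) the bulk solid/liquid partition coefficient; compatible elements (\(D>1\)) are depleted and incompatible elements (\(D<1\)) enriched as crystallization proceeds.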

Relevance:

20.00%

Publisher:

Abstract:

We define topological and measure-theoretic mixing for nonstationary dynamical systems and prove that for a nonstationary subshift of finite type, topological mixing implies the minimality of any adic transformation defined on the edge space, while if the Parry measure sequence is mixing, the adic transformation is uniquely ergodic. We also show that this measure-theoretic mixing is equivalent to weak ergodicity of the edge matrices in the sense of inhomogeneous Markov chain theory.
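For context, weak ergodicity of a sequence of stochastic edge matrices \((M_n)_{n\ge 1}\), in the sense of inhomogeneous Markov chain theory, is the standard requirement that forward products forget their starting state (a textbook definition, not quoted from the paper):

\[ \lim_{n \to \infty} \Big[ \big(M_{k+1} M_{k+2} \cdots M_{k+n}\big)_{ij} - \big(M_{k+1} M_{k+2} \cdots M_{k+n}\big)_{i'j} \Big] = 0 \quad \text{for all } k \ge 0 \text{ and all states } i, i', j, \]

i.e. the rows of the matrix products become asymptotically identical.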

Relevance:

20.00%

Publisher:

Abstract:

This thesis takes as its starting point a questioning of the effectiveness of the EU's conditionality policy regarding minority rights. Based on the rationalist theoretical model, the External Incentives Model of Governance, this hypothesis-testing thesis aims to explain whether the temporal distance to potential EU membership affects the level of legislation on minority language rights. The measurement of the level of legislation on minority language rights is limited to non-discrimination, the use of minority languages in official contexts, and the linguistic rights of minorities in education. Methodologically, a comparative approach is used both with respect to the time frame of the study, which extends from 2003 to 2010, and with respect to the selection of states. On the basis of the "most similar systems" design, the states are categorized into three groups according to their different temporal distances from potential EU membership. The hypothesis tested is the following: the shorter the temporal distance to potential EU membership, the greater the likelihood that the states' level of legislation in the three areas studied has developed to a high level. The study shows that the hypothesis is only partially confirmed. The results regarding non-discrimination show that the relationship between temporal distance and the level of legislation increased markedly during the period examined. This relationship was strengthened only between the category of states furthest in time from potential EU membership and the two categories that are closer and closest, respectively, to potential EU membership. The results regarding the use of minority languages in official contexts and the linguistic rights of minorities in education show no, and almost no, relationship, respectively, between temporal distance and the development of legislation between 2003 and 2010.

Relevance:

20.00%

Publisher:

Abstract:

Solutions to combinatorial optimization problems, such as the p-median problem of locating facilities, frequently rely on heuristics to minimize the objective function. The minimum is sought iteratively, and a criterion is needed to decide when the procedure (almost) attains it. However, pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small branch of the literature suggests using statistical principles to estimate the minimum and to use the estimate either for stopping or for evaluating the quality of the solution. In this paper we use test problems taken from Beasley's OR-Library and apply simulated annealing to these p-median problems. We do this for the purpose of comparing suggested methods of minimum estimation and, eventually, providing a recommendation for practitioners. The paper ends with an illustration: the problem of locating some 70 distribution centres of the Swedish Post in a region.
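A minimal sketch of how simulated annealing can be applied to a p-median instance is given below; the distance matrix, the swap move and the geometric cooling schedule are illustrative assumptions, not the implementation used in the paper.

import math
import random

def pmedian_cost(dist, medians):
    """Total distance from every demand point to its nearest chosen median."""
    return sum(min(dist[i][j] for j in medians) for i in range(len(dist)))

def simulated_annealing_pmedian(dist, p, t0=100.0, cooling=0.995, iters=20000, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    current = rng.sample(range(n), p)            # random initial set of p facilities
    best, best_cost = list(current), pmedian_cost(dist, current)
    cost, temp = best_cost, t0
    for _ in range(iters):
        # swap move: replace one chosen facility with a non-chosen candidate
        out_idx = rng.randrange(p)
        candidate = rng.choice([j for j in range(n) if j not in current])
        neighbour = current[:out_idx] + [candidate] + current[out_idx + 1:]
        new_cost = pmedian_cost(dist, neighbour)
        # accept improvements always, worse moves with Boltzmann probability
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / temp):
            current, cost = neighbour, new_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
        temp *= cooling                          # geometric cooling schedule
    return best, best_cost

Stopping after a fixed number of iterations, as in this sketch, is precisely the practice the paper questions; the statistical estimators it compares would instead use the sample of objective values visited by the heuristic to judge how close best_cost is likely to be to the unknown minimum.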

Relevance:

20.00%

Publisher:

Abstract:

A decision support system (DSS) based on a fuzzy logic inference system (FIS) was implemented to provide assistance with dose alteration of Duodopa infusion in patients with advanced Parkinson's disease, using data from motor-state assessments and dosage. A three-tier architecture with an object-oriented approach was used. The DSS has a web-enabled graphical user interface that presents alerts indicating non-optimal dosage and motor states, new recommendations (typical advice with a typical dose), and statistical measurements. One data set was used for design and tuning of the FIS, and another data set was used for evaluating performance against the actually given dose. Overall goodness-of-fit was 0.65 for the new patients (design data) and 0.98 for the ongoing patients (evaluation data). User evaluation is now ongoing. The system could work as an assistant to clinical staff for Duodopa treatment in advanced Parkinson's disease.
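A minimal sketch of the kind of Mamdani-style fuzzy inference such a DSS might perform is shown below; the membership functions, rule base, variable names and scales are hypothetical illustrations, not the rules of the system described in the abstract.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def recommend_dose_change(motor_score, current_dose):
    """Toy Mamdani-style inference: two rules, min for AND, centroid defuzzification."""
    # antecedent memberships (scales are illustrative)
    motor_low  = tri(motor_score, -3.0, -2.0, 0.0)    # patient too 'off'
    motor_high = tri(motor_score,  0.0,  2.0, 3.0)    # patient too dyskinetic
    dose_low   = tri(current_dose, 10.0, 40.0, 80.0)
    dose_high  = tri(current_dose, 60.0, 100.0, 140.0)

    # rule strengths
    increase = min(motor_low, dose_low)    # IF 'off' AND dose low THEN increase
    decrease = min(motor_high, dose_high)  # IF dyskinetic AND dose high THEN decrease

    # centroid defuzzification over a coarse grid of dose adjustments
    grid = [x / 10.0 for x in range(-20, 21)]          # -2.0 .. +2.0
    num = den = 0.0
    for dx in grid:
        mu = max(min(increase, tri(dx, 0.0, 1.0, 2.0)),
                 min(decrease, tri(dx, -2.0, -1.0, 0.0)))
        num += mu * dx
        den += mu
    return num / den if den > 0 else 0.0

For example, recommend_dose_change(-2.5, 35.0) yields a positive adjustment (about +1 in these toy units), i.e. a suggested flow-rate increase for a patient in an 'off' state on a low dose.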

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. The same data source has been used for both methods. The performance of both methods has been evaluated for the country as a whole, using resident employed population, self-containment levels and job ratios as criteria. A more detailed evaluation has been carried out for the Göteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in this area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
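A minimal sketch of the self-containment criterion commonly used when evaluating such regionalizations is given below; the flow-matrix layout and the 75% acceptance threshold mentioned in the comment are assumptions for illustration, not figures taken from the paper.

def self_containment(flows, region):
    """Supply- and demand-side self-containment of a candidate region.

    flows[i][j] = number of commuters living in unit i and working in unit j.
    region      = set of unit indices grouped into one labour market area.
    """
    internal = sum(flows[i][j] for i in region for j in region)
    live_in  = sum(flows[i][j] for i in region for j in range(len(flows)))  # resident employed population
    work_in  = sum(flows[i][j] for i in range(len(flows)) for j in region)  # jobs located in the region
    supply_side = internal / live_in if live_in else 0.0   # share of residents working locally
    demand_side = internal / work_in if work_in else 0.0   # share of local jobs filled locally
    return supply_side, demand_side

# a candidate region is typically accepted when both shares exceed a threshold, e.g. 0.75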

Relevance:

20.00%

Publisher:

Abstract:

This article sets out to analyse recent regime developments in Ukraine in relation to semi-presidentialism. It asks to what extent, and in what ways, theoretical arguments against semi-presidentialism (premier-presidential and president-parliamentary systems) are relevant for understanding the changing direction of the Ukrainian regime since the 1990s. The article also reviews the by now overwhelming evidence suggesting that President Yanukovych is turning Ukraine into a more authoritarian hybrid regime, and it raises the question of the extent to which the president-parliamentary system might serve this end. The article argues that both kinds of semi-presidentialism have, in different ways, exacerbated rather than mitigated institutional conflict and political stalemate. The return to the president-parliamentary system in 2010, the constitutional arrangement with the most dismal record of democratisation, was a step in the wrong direction. The premier-presidential regime was by no means ideal, but it had at least two advantages: it weakened presidential dominance, and it explicitly anchored the survival of the government in parliament. The return to the 1996 constitution ties in well with the notion that President Viktor Yanukovych has embarked on an outright authoritarian path.

Relevance:

20.00%

Publisher:

Abstract:

The most widely used updating rule for non-additive probabilities is the Dempster-Shafer rule. Schmeidler and Gilboa have developed a model of decision making under uncertainty based on non-additive probabilities, and in their paper "Updating Ambiguous Beliefs" they justify the Dempster-Shafer rule by a maximum-likelihood procedure. This note shows, in the context of Schmeidler-Gilboa preferences under uncertainty, that the Dempster-Shafer rule is in general not ex-ante optimal. This contrasts with Brown's result that Bayes' rule is ex-ante optimal for standard Savage preferences with additive probabilities.
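For reference, the Dempster-Shafer update of a capacity \(\nu\) on an event \(B\) is commonly written as (a standard formulation, not quoted from the note itself)

\[ \nu(A \mid B) \;=\; \frac{\nu\big((A \cap B) \cup B^{c}\big) - \nu(B^{c})}{1 - \nu(B^{c})}, \qquad \nu(B^{c}) < 1, \]

which reduces to Bayes' rule when \(\nu\) is additive, since the numerator then equals \(\nu(A \cap B)\) and the denominator equals \(\nu(B)\).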

Relevance:

20.00%

Publisher:

Abstract:

The determination of the term structure of interest rates is one of the main topics in the management of financial assets. Given the great importance of financial assets for the conduct of economic policies, it is fundamental to understand the structure through which they are determined. The main objective of this study is to estimate the term structure of Brazilian interest rates together with the short-term interest rate. The term structure is modeled using an affine model. The estimation was carried out with three latent factors and two macroeconomic variables, using the Bayesian technique of Markov Chain Monte Carlo (MCMC).
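In an affine term-structure model of this kind, yields are affine functions of the state vector; a standard statement of this (not taken from the study itself) is

\[ P_t(\tau) = \exp\!\big\{A(\tau) + B(\tau)^{\prime} X_t\big\}, \qquad y_t(\tau) \;=\; -\frac{1}{\tau}\Big[A(\tau) + B(\tau)^{\prime} X_t\Big], \]

where \(X_t\) stacks the latent factors and the macroeconomic variables, and the loadings \(A(\tau)\), \(B(\tau)\) solve the usual Riccati-type recursions implied by no arbitrage.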

Relevance:

20.00%

Publisher:

Abstract:

This paper demonstrates that for a very general class of monetary models (Sidrauski-type models and cash-in-advance models), Bailey's rule for evaluating the welfare effect of inflation is indeed accurate. The result applies for any technology or preference, provided the long-run capital stock does not depend on the inflation rate. In general, a dynamic version of Bailey's rule is established. In particular, the result extends to models in which there is a banking sector that supplies money-substitute services. Additionally, it is argued that the relevant money demand concept for this issue, the impact of inflation on welfare, is the monetary base.
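Bailey's rule measures the welfare cost of inflation by the area under the inverse money demand curve; a standard statement of the measure (following the usual textbook formulation, not quoted from the paper) is

\[ w(i) \;=\; \int_{m(i)}^{m(0)} \psi(x)\, dx \;=\; \int_{0}^{i} m(s)\, ds \;-\; i\, m(i), \]

where \(m(i)\) is the demand for real balances (here, the monetary base, as the paper argues) at nominal interest rate \(i\) and \(\psi\) is its inverse.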

Relevance:

20.00%

Publisher:

Abstract:

Consumption is an important macroeconomic aggregate, amounting to about 70% of GNP. Finding sub-optimal behavior in consumption decisions casts serious doubt on whether optimizing behavior is applicable on an economy-wide scale, which, in turn, challenges whether it is applicable at all. This paper makes several contributions to the literature on consumption optimality. First, we provide a new result on the basic rule-of-thumb regression, showing that it is observationally equivalent to the one obtained in a well-known optimizing real-business-cycle model. Second, for rule-of-thumb tests based on the Asset-Pricing Equation, we show that the omission of the higher-order term in the log-linear approximation yields inconsistent estimates when lagged observables are used as instruments; however, these are exactly the instruments that have traditionally been used in this literature. Third, we show that nonlinear estimation of a system of N Asset-Pricing Equations can be done efficiently even if the number of asset returns (N) is high vis-à-vis the number of time-series observations (T). We argue that efficiency can be restored by aggregating returns into a single measure that fully captures intertemporal substitution. Indeed, we show that there is no reason why return aggregation cannot be performed in the nonlinear setting of the Pricing Equation, since the latter is a linear function of individual returns. This forms the basis of a new test of rule-of-thumb behavior, which can be viewed as testing for the importance of rule-of-thumb consumers when the optimizing agent holds an equally weighted portfolio or a weighted portfolio of traded assets. Using our setup, we find no signs of either rule-of-thumb behavior by U.S. consumers or habit formation in consumption decisions in econometric tests. Indeed, we show that a simple representative-agent model with CRRA utility is able to explain the time-series data on consumption and aggregate returns. In that model, the intertemporal discount factor is significant and ranges from 0.956 to 0.969, while the relative risk-aversion coefficient is precisely estimated, ranging from 0.829 to 1.126. There is no evidence of rejection in over-identifying-restriction tests.
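The Asset-Pricing Equation referred to above takes, under CRRA utility, the familiar Euler-equation form

\[ \mathbb{E}_t\!\left[\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} R_{j,t+1}\right] = 1, \qquad j = 1, \dots, N, \]

where \(\beta\) is the intertemporal discount factor, \(\gamma\) the relative risk-aversion coefficient and \(R_{j,t+1}\) the gross return on asset \(j\); the estimates reported above correspond to \(\beta\) between 0.956 and 0.969 and \(\gamma\) between 0.829 and 1.126.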