191 results for Realizations
Abstract:
We show that a broad class of quantum critical points can be stable against locally correlated disorder even if they are unstable against uncorrelated disorder. Although this result seemingly contradicts the Harris criterion, it follows naturally from the absence of a random-mass term in the associated order parameter field theory. We illustrate the general concept with explicit calculations for quantum spin-chain models. Instead of the infinite-randomness physics induced by uncorrelated disorder, we find that weak locally correlated disorder is irrelevant. For larger disorder, we find a line of critical points with unusual properties such as an increase of the entanglement entropy with the disorder strength. We also propose experimental realizations in the context of quantum magnetism and cold-atom physics. Copyright (C) EPLA, 2011
Abstract:
The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images that present no smoothing and reproduce the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. However, a problem remains: the lack of local accuracy of simulated images. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, yet at the expense of providing a single deterministic estimate of the random function.
Abstract:
A branched covering of degree d between closed surfaces determines a collection of partitions of d, the branch data. In this work we show that any branch data are realized by an indecomposable primitive branched covering on a connected closed surface N with chi(N) <= 0. This shows that decomposable and indecomposable realizations may coexist. Moreover, we characterize the branch data of a decomposable primitive branched covering. Bibliography: 20 titles.
Abstract:
Foreign accent can range from hardly detectable to rendering the second-language speech unintelligible. It is assumed that certain aspects of a specific target language contribute more than others to making foreign-accented speech intelligible and listener-friendly. The present thesis examines a teaching strategy for Swedish pronunciation in second-language education. The teaching strategy "Basic prosody", or BP, gives priority to temporal aspects of Swedish prosody, that is, the temporal phonological contrasts of word stress and quantity, as well as the durational realizations of these contrasts. BP does not prescribe any specific tonal realizations; this standpoint is based on the great regional variety in the realization and distribution of Swedish word accents. The teaching strategy essentially consists of three directives:
· Stress the proper word in the sentence.
· Stress the proper syllables in stressed words and make them longer.
· Lengthen the proper segment, vowel or subsequent consonant, in the stressed syllable.
These directives reflect the view that all phonological length is stress-induced, and that vowel length and consonant length are equally important as learning goals. BP is examined in the light of existing findings in the field of second-language pronunciation and with respect to the phonetic correlates of Swedish stress and quantity. Five studies examine the relation between segment durations and the categorization made by native Swedish listeners. The results indicate that postvocalic consonant duration contributes to quantity categorization as well as to giving stressed syllables their proper duration. Furthermore, native Swedish speakers are shown to apply the complementary /V:C/ - /VC:/ pattern also when speaking English and German, by lengthening postvocalic consonants.
The correctness of this prioritization is not directly addressed, but important aspects of BP are supported by earlier findings as well as by the results of the present studies.
Abstract:
This study presents an approach to combining the uncertainties of hydrological model outputs predicted by a number of machine learning models. Machine-learning-based uncertainty prediction is very useful for estimating a hydrological model's uncertainty in a particular hydro-meteorological situation in real-time applications [1]. In this approach, hydrological model realizations from Monte Carlo simulations are used to build different machine learning uncertainty models that predict the uncertainty (quantiles of the pdf) of the deterministic output of the hydrological model. The uncertainty models are trained using antecedent precipitation and streamflows as inputs. The trained models are then employed to predict the model output uncertainty specific to new input data. We used three machine learning models, namely artificial neural networks, model trees, and locally weighted regression, to predict output uncertainties. These three models produce similar verification results, which can be improved by merging their outputs dynamically. We propose an approach that forms a committee of the three models to combine their outputs. The approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model in the Brue catchment in the UK and the Bagmati catchment in Nepal. The verification results show that the merged output is better than any individual model's output. [1] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press, 2013.
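As an illustration of the committee idea, the sketch below merges the quantile predictions of three models with weights inversely proportional to their recent errors; the weighting scheme and all numbers are assumptions for demonstration, not the paper's exact combination rule:

```python
import numpy as np

# Hypothetical sketch: dynamically merge quantile predictions from three
# uncertainty models (e.g. ANN, model tree, locally weighted regression).
# Weights are inversely proportional to each model's recent absolute
# verification error, so better-performing models dominate the output.

def committee_merge(predictions, recent_errors):
    """predictions: (n_models, n_steps) quantile predictions;
    recent_errors: (n_models,) recent verification errors per model."""
    weights = 1.0 / (np.asarray(recent_errors) + 1e-9)  # avoid div by zero
    weights /= weights.sum()                            # normalize to 1
    return weights @ np.asarray(predictions)

# Two time steps, three models; model 2 has twice the recent error.
preds = np.array([[10.0, 12.0], [11.0, 13.0], [9.0, 12.5]])
errors = np.array([1.0, 2.0, 1.0])
merged = committee_merge(preds, errors)  # weights [0.4, 0.2, 0.4]
```

Any other skill-based weighting (e.g. softmax over negative errors) would fit the same interface.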
Abstract:
Climate change has resulted in substantial variations in annual extreme rainfall quantiles across different durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for various water resources design, assessment, and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are random variables whose distributions vary across return periods; there are therefore uncertainties even under current climate conditions. Regarding future conditions, our large-scale knowledge is obtained from global climate models forced with certain emission scenarios. There are widely known deficiencies in climate models, particularly with respect to precipitation projections, and the limitations of emission scenarios in representing future global change are also recognized. Apart from these large-scale uncertainties, downscaling methods add further uncertainty to estimates of future extreme rainfall when they convert the larger-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall for different durations and return periods. We combined three emission scenarios with two global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale the daily climate model projections for the city of Saskatoon, Canada, up to 2100. The downscaled projections were further disaggregated to hourly resolution using our new stochastic, non-parametric rainfall disaggregator. The extreme rainfall quantiles can then be identified for different durations (1-hour, 2-hour, 4-hour, 6-hour, 12-hour, 18-hour and 24-hour) and return periods (2-year, 10-year, 25-year, 50-year, 100-year) using the Generalized Extreme Value (GEV) distribution.
By providing multiple realizations of future rainfall, we attempt to measure the extent of the total predictive uncertainty contributed by climate models, emission scenarios, and downscaling/disaggregation procedures. The results show that these contributors account for different proportions at different durations and return periods.
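As a minimal illustration of the quantile-estimation step, the sketch below fits a GEV distribution to synthetic annual maxima and reads off return-period quantiles; the parameter values and data are invented for demonstration, not Saskatoon results:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual rainfall maxima drawn from an assumed GEV (values are
# illustrative only; a real study would use observed or downscaled maxima).
rng = np.random.default_rng(42)
annual_maxima = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=100,
                               random_state=rng)

# Fit GEV parameters by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_maxima)

# T-year quantile = value exceeded with probability 1/T per year.
return_periods = [2, 10, 25, 50, 100]
quantiles = {T: genextreme.isf(1.0 / T, shape, loc, scale)
             for T in return_periods}
```

Repeating this over every downscaled realization yields the spread of quantiles from which the predictive uncertainty is measured.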
Abstract:
We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, in the class generated by convex non-additive probabilities with a convex core. We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case in which there is a "true" probability distribution behind the successive realizations of the uncertain random variable. In this case convergence occurs. This result is important because it renders true the intuition that it is possible "to learn" the "true" additive distribution behind an uncertain event if one observes it repeatedly (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
Abstract:
The most widely used stochastic sequential simulation algorithm is sequential Gaussian simulation (ssG). In theory, stochastic methods reproduce the uncertainty space of the random variable Z(u) better the larger the number L of realizations executed. However, L sometimes needs to be so large that the use of the technique becomes prohibitive. This thesis presents a more efficient strategy. The sequential Gaussian simulation algorithm was modified to increase its efficiency: replacing the Monte Carlo method with Latin Hypercube Sampling (LHS) allows the uncertainty space of Z(u) to be characterized, for a given precision, more quickly. The proposed technique also guarantees that the whole theoretical uncertainty model is sampled, especially in its extreme portions.
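A minimal sketch of the core idea, assuming a standard normal target as in sequential Gaussian simulation; this illustrates LHS itself, not the thesis's modified ssG code:

```python
import numpy as np
from scipy.stats import norm

# Latin Hypercube Sampling of a standard normal, as a replacement for
# plain Monte Carlo draws: the unit interval is split into L equal strata
# and one uniform draw is taken per stratum, so every part of the
# distribution (including the tails) is sampled by construction.

def lhs_normal(L, rng):
    strata = (np.arange(L) + rng.random(L)) / L  # one point per stratum
    rng.shuffle(strata)                          # randomize pairing order
    return norm.ppf(strata)                      # map to normal quantiles

rng = np.random.default_rng(0)
draws = lhs_normal(100, rng)
```

With plain Monte Carlo, 100 draws may leave whole strata of the distribution empty; here each of the 100 strata contributes exactly one sample.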
Abstract:
This work investigates the possibility of broadening the spectrum of political participation in Public Administration, considering that the Brazilian State has ends and objectives to accomplish and is a "procedural-deliberative" democracy, in which the people must take part in the decisions that affect their lives. Moreover, the pursuit of the general interest gains in effectiveness when state decisions escape, as far as possible, the technocratic logic of "administrative secrecy": through the participation of interested parties, who know the concrete data and the human and technical factors conditioning a decision, some element may be brought forward that modifies it, obliging the Administration to explain the reasons for its action and thus facilitating its execution. The study therefore examines the meaning of the democratic principle and of the principle of publicity, the possible forms of citizen participation in the Administration, and the concrete realizations of administrative publicity as a right to know, a right of control, and a right to take part in the administrative process.
Abstract:
A repeated moral hazard setting in which the Principal privately observes the Agent's output is studied. It is shown that there is no loss from restricting the analysis to contracts in which the Agent is supposed to exert effort every period, receives a constant efficiency wage, and gets no feedback until he is fired. The optimal contract for a finite horizon is characterized and shown to require burning of resources; these are burnt only after the worst possible realization sequence, and the amount is independent of both the length of the horizon and the discount factor (δ). For the infinite horizon case, a family of fixed-interval review contracts is characterized and shown to achieve first best as δ → 1. The optimal contract when δ << 1 is partially characterized. Incentives are optimally provided by a combination of efficiency wages and the threat of termination, which exhibits memory over the whole history of realizations. Finally, tournaments are shown to provide an alternative solution to the problem.
Abstract:
Based on three versions of a small macroeconomic model for Brazil, this paper presents empirical evidence on the effects of parameter uncertainty on monetary policy rules and on the robustness of optimal and simple rules across different model specifications. By comparing the optimal policy rule under parameter uncertainty with the rule calculated under purely additive uncertainty, we find that parameter uncertainty should make policymakers react less aggressively to the economy's state variables, as suggested by Brainard's "conservatism principle", although this effect seems to be relatively small. We then informally investigate each rule's robustness by analyzing the performance of policy rules derived from each model under each of the alternative models. We find that optimal rules derived from each model perform very poorly under the alternative models, whereas a simple Taylor rule is relatively robust. We also find that, even within a specific model, the Taylor rule may perform better than the optimal rule under particularly unfavorable realizations from the policymaker's loss distribution function.
Abstract:
We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We prove a formula for the computation of the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. We prove the almost sure convergence of these decomposition methods when the relatively complete recourse assumption holds. We also prove the almost sure convergence of these algorithms when applied to risk-averse multistage stochastic linear programs that do not satisfy the relatively complete recourse assumption. The analysis is first done assuming the underlying stochastic process is interstage independent and discrete, with a finite set of possible realizations at each stage. We then indicate two ways of extending the methods and convergence analysis to the case when the process is interstage dependent.
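The cuts referred to above take, in the standard risk-neutral linear SDDP setting, the generic Benders form shown below; this is a textbook illustration, not the paper's exact risk-averse formula, and the symbols (recourse function Q_t, trial point x_{t-1}^k, subgradient β_t^k computed at iteration k) are generic:

```latex
\mathcal{Q}_{t}(x_{t-1}) \;\ge\;
\mathcal{Q}_{t}\bigl(x_{t-1}^{k}\bigr)
+ \bigl\langle \beta_{t}^{k},\, x_{t-1} - x_{t-1}^{k} \bigr\rangle ,
```

i.e. each iteration adds a supporting hyperplane at the trial point, and the pointwise maximum of the accumulated cuts is the outer linearization of the recourse function.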
Abstract:
The aim of this work is to analyse tourism events and the performance of this market segment as a strategy to combat hotel seasonality in Natal, from the perspective of the sector's executives and managers. Two types of research were carried out for this study: initially, a bibliographic survey of the concepts associated with the theme, to provide the theoretical-scientific grounding, and then a field survey of facts, applied to the establishments in the study population with the aid of a form completed through personal interviews. The data analysis techniques were descriptive statistics and the Kolmogorov-Smirnov test. Among the results found, the main reasons given by the hotels for entering the events segment were alignment with competing companies, diversification of options in order to fill the establishments during the low season, and response to market demand. The profile of the events held in Natal hotels was investigated with respect to size, origin of the public, types of events, and frequency, as well as the capacity of these establishments to serve this segment. It was noted that, although the hotels agree that events are an important strategy to combat seasonality, the establishments still suffer from fluctuation, which can be explained by the fact that events themselves also behave seasonally, concentrating in certain periods of the year. The main advantage the hotels perceive in hosting events is the use of food and beverage services, surpassing the advantage of increased apartment occupancy rates.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Research in the area of teacher training in English as a Foreign Language (CELANI, 2003, 2004, 2010; PAIVA, 2000, 2003, 2005; VIEIRA-ABRAHÃO, 2010) articulates the complexity of beginning teachers' classroom contexts, aligned with the teaching of language as a social and professional practice of the teacher in training. To better understand this relationship, the present study is based on a corpus of transcribed interviews and questionnaires applied to 28 undergraduate students majoring in Letters/English emphasis at a public university located in the interior of the Western Amazon region, soliciting their opinions about the reforms made in the curriculum of this major. Interviews and questionnaires were used as data collection instruments to trace a profile of the students, organized into Group 1, freshman and sophomore undergraduates following the 2009 curriculum, and Group 2, junior and senior undergraduates following the 2006 curriculum. The objectives are to identify, characterize, and analyze the types of pronouns, roles, and social actors represented in the opinions of these students in relation to their teacher training curriculum. The theoretical support focuses on the challenge of historical and contemporary routes of English teachers' initial education programs (MAGALHÃES; LIBERALLI, 2009; PAVAN; SILVA, 2010; ALVAREZ, 2010; VIANA, 2011; PAVAN, 2012). Our theoretical perspective is based on the Systemic Functional Grammar of Halliday (1994), Halliday and Hasan (1989), Halliday and Matthiessen (2004), Eggins (1994, 2004) and Thompson (2004). We focus on the concept of interpersonal meaning, specifically regarding the roles articulated in the studies by Delu (1991) and Thompson and Thetela (1995), and, in the Portuguese language, by Ramos (1997), Silva (2006) and Cabral (2009).
Moreover, we adopt van Leeuwen's (1997, 2003) theory of the Representation of Social Actors as a theoretical framework in order to identify the sociological aspect of the social actors represented in the students' discourse. Within this scenario, the analysis unfolds on three levels: grammatical (pronouns), semantic (roles), and discursive (social actors). For the analysis of the interpersonal realizations present in the students' opinions, we use the computational program WordSmith Tools (SCOTT, 2010) and its tools Wordlist and Concord to quantify the occurrences of the pronouns I, You and They, which characterize the roles and social actors of the corpus. The results show that the students assigned the following roles to themselves: (i) apprentice, to express their initial process of English language learning; (ii) freshman, to reveal their choice of the major in Letters/English emphasis; (iii) future teacher, to relate their expectations towards a practicing professional. To assign roles to the professors in the major, the students used the metaphor of modality (I think) to indicate the relationship of teacher training while they are in the role of a student and of a future teacher. From this evidence, the representation of the students as social actors emerges in roles such as: (i) active roles; (ii) passive roles; and (iii) personalized roles. The social actors represented in the opinions of the students reflect the inclusion of these roles assigned to the actions expressed about their experiences and expectations derived from their teacher-training classroom.
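The pronoun counts described above can be reproduced in spirit with a few lines of code; this is a generic sketch, not WordSmith Tools, and the sample sentences are invented for demonstration:

```python
import re
from collections import Counter

# Count occurrences of the pronouns "I", "you" and "they" in a corpus of
# transcribed interviews, the kind of frequency list Wordlist/Concord
# produce. The texts below are invented examples, not the study's data.

corpus = [
    "I think they prepare us well, but you need practice.",
    "When I started, I was an apprentice; they guided us.",
]

pronouns = ("i", "you", "they")
tokens = [t for text in corpus for t in re.findall(r"[a-z']+", text.lower())]
counts = Counter(t for t in tokens if t in pronouns)
```

A concordancer would additionally show each hit in context; the raw counts alone already support the role analysis sketched in the abstract.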