981 results for Output-only Modal Analysis
Abstract:
Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong collinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcing rather than the few highly replicated ensembles more commonly used in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 were analyzed. The changes in natural and oceanic forcing, the latter of which itself contains contributions from anthropogenic and natural influences, have the greatest effect. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. An interaction between these two anthropogenic effects was also found to exist in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, these show that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
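A minimal sketch of the factorial idea described in this abstract: every combination of k binary anthropogenic forcings is run once, and a single linear model with pairwise interaction terms is estimated by ordinary least squares. The forcing names, the "true" coefficients, and the noise level are illustrative assumptions, not the paper's data.

```python
import itertools
import numpy as np

k = 3                                    # e.g. GHG, sulphate aerosol, ozone (hypothetical)
runs = np.array(list(itertools.product([0, 1], repeat=k)))  # all 2**k forcing combinations

def design_matrix(runs):
    """Intercept, k main effects, and all pairwise interaction columns."""
    cols = [np.ones(len(runs))]
    cols += [runs[:, i] for i in range(k)]
    cols += [runs[:, i] * runs[:, j] for i, j in itertools.combinations(range(k), 2)]
    return np.column_stack(cols)

X = design_matrix(runs)
beta_true = np.array([0.0, 0.8, 0.3, 0.2, -0.15, 0.0, 0.05])  # hypothetical effects (K)
rng = np.random.default_rng(0)
y = X @ beta_true + 0.05 * rng.standard_normal(len(runs))     # simulated temperature response

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)              # one OLS fit estimates everything
names = ["const", "f1", "f2", "f3", "f1:f2", "f1:f3", "f2:f3"]
print(dict(zip(names, np.round(beta_hat, 3))))
```

Because the design is a full factorial, the main effects and all pairwise interactions are estimable from a single regression, which is the efficiency argument the abstract makes.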
Abstract:
Adaptive filters used in code division multiple access (CDMA) receivers to counter interference have been formulated both with and without the assumption that training symbols are transmitted; they are known as training-based and blind detectors, respectively. We show that the convergence behaviour of the blind minimum-output-energy (MOE) detector can be derived quite easily, contrary to what the procedure outlined in a previous paper implied. The simplification results from the observation that the correlation matrix determining convergence performance can be made symmetric, after which many standard results from the literature on least mean square (LMS) filters apply immediately.
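For context, a minimal sketch of the standard training-based least mean square (LMS) update, whose convergence results the abstract argues carry over to the blind MOE detector once the relevant correlation matrix is symmetrized. The signal model, filter length, and step size here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, taps, mu = 5000, 8, 0.01
x = rng.standard_normal(N)                        # input (e.g. chip-rate samples)
h = rng.standard_normal(taps)
h /= np.linalg.norm(h)                            # unknown response to be learned
d = np.convolve(x, h)[:N] + 0.05 * rng.standard_normal(N)  # desired (training) output

w = np.zeros(taps)
for n in range(taps - 1, N):
    u = x[n - taps + 1:n + 1][::-1]               # regressor: u[k] = x[n - k]
    e = d[n] - w @ u                              # a priori error
    w += mu * e * u                               # LMS update

print("final tap error:", np.linalg.norm(w - h))
```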
Abstract:
In wireless communication systems, all in-phase and quadrature-phase (I/Q) signal processing receivers face the problem of I/Q imbalance. In this paper, we investigate the effect of I/Q imbalance on the performance of multiple-input multiple-output (MIMO) maximal ratio combining (MRC) systems that perform the combining at the radio frequency (RF) level, thereby requiring only one RF chain. To perform the MIMO MRC, we propose a channel estimation algorithm that accounts for the I/Q imbalance. Moreover, a compensation algorithm for the I/Q imbalance in MIMO MRC systems is proposed, which first employs the least-squares (LS) rule to jointly estimate the coefficients of the channel gain matrix, the beamforming and combining weight vectors, and the parameters of the I/Q imbalance, and then makes use of the received signal together with its conjugate to detect the transmitted signal. The performance of the MIMO MRC system under study is evaluated in terms of average symbol error probability (SEP), outage probability, and ergodic capacity, which are derived for transmission over Rayleigh fading channels. Numerical results are provided and show that the proposed compensation algorithm can efficiently mitigate the effect of I/Q imbalance.
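A minimal sketch of baseline maximal ratio combining over a Rayleigh fading channel, without the RF-level combining, joint channel estimation, or I/Q-imbalance compensation the paper develops; it only illustrates the combining rule the compensation algorithm builds on. The constellation, antenna count, and SNR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_rx, n_sym, snr = 4, 100_000, 10.0     # receive antennas, symbols, linear SNR
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
s = rng.choice(qpsk, n_sym)             # transmitted QPSK symbols

shape = (n_rx, n_sym)
h = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
w_noise = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2 * snr)
r = h * s + w_noise                     # per-antenna received samples

# MRC: weight each branch by the conjugate channel gain, then normalise
y = (np.conj(h) * r).sum(axis=0) / (np.abs(h) ** 2).sum(axis=0)
s_hat = (np.sign(y.real) + 1j * np.sign(y.imag)) / np.sqrt(2)   # QPSK slicer
print("symbol error rate:", np.mean(s_hat != s))
```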
Abstract:
The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC) and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol re-analysis and ground-based aerosol observations. Because of differences in wet deposition rates, the residence time of GLOMAP-mode sulphate aerosol is two days longer than that of CLASSIC sulphate aerosol, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by differences in the first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. Results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol-cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, it is important to quantify how uncertainties in the data affect a model's output. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at the large scales required for climate models, such spatial modelling can be difficult. Moreover, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining the spatial information in the land cover map with its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which presents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
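A minimal sketch of the kind of Monte Carlo scheme described here: mapped class counts at a site are combined with a confusion matrix to draw realisations of the true land-cover proportions. The Dirichlet/multinomial formulation and all counts are illustrative assumptions, not the article's exact Bayesian model.

```python
import numpy as np

rng = np.random.default_rng(3)
# confusion[i, j]: reference pixels of true class i that the map labelled class j
confusion = np.array([[80,  5,  2],
                      [10, 60,  8],
                      [ 4,  9, 72]])
mapped_counts = np.array([120, 90, 40])   # pixels of each mapped class at the site

def sample_proportions(n_draws=1000):
    draws = np.empty((n_draws, 3))
    for t in range(n_draws):
        true_counts = np.zeros(3)
        for j, n_j in enumerate(mapped_counts):
            # Dirichlet draw of P(true class | mapped = j), then multinomial relabelling
            p_true = rng.dirichlet(confusion[:, j] + 1)
            true_counts += rng.multinomial(n_j, p_true)
        draws[t] = true_counts / mapped_counts.sum()
    return draws

draws = sample_proportions()
print("posterior mean proportions:", draws.mean(axis=0).round(3))
print("95% intervals:\n", np.percentile(draws, [2.5, 97.5], axis=0).round(3))
```

Each draw is a complete realisation of the site's class proportions, so the ensemble can be propagated through a land-surface model to quantify output uncertainty.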
Abstract:
Existing distributed hydrologic models are complex and computationally demanding to use as rapid-forecasting policy-decision tools, or even as classroom educational tools. In addition, platform dependence, specific input/output data structures, and non-dynamic data interaction with pluggable software components inside the existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation with subsequent parameter optimization and visualization schemes. RWater provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulations, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster-based data obtained through conventional GIS pre-processing, and it integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater is demonstrated through application to two watersheds in Indiana for multiple rainfall events.
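A minimal sketch of the parameter-optimization step in the spirit of this abstract: a toy linear-reservoir runoff model is calibrated against observed flows by maximizing Nash-Sutcliffe efficiency. SciPy's differential evolution stands in for the Shuffled Complex Evolution (SCE) algorithm, which SciPy does not provide; the model, data, and bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
rain = rng.gamma(0.3, 5.0, 200)           # synthetic rainfall series (mm)

def simulate(params, rain):
    """Toy linear reservoir: a runoff fraction fills storage, drained at rate k."""
    k, frac = params
    q, store = np.zeros(len(rain)), 0.0
    for t, r in enumerate(rain):
        store += frac * r
        q[t] = k * store
        store -= q[t]
    return q

q_obs = simulate([0.3, 0.6], rain) + 0.1 * rng.standard_normal(200)

def neg_nse(params):                      # minimising this maximises Nash-Sutcliffe
    q_sim = simulate(params, rain)
    return np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)

result = differential_evolution(neg_nse, bounds=[(0.01, 0.99), (0.0, 1.0)], seed=4)
print("calibrated (k, frac):", result.x.round(3), "NSE:", round(1 - result.fun, 3))
```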
Abstract:
This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency: nonparametric Analysis of Covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is on a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis, and likely to affect efficiency, are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans; the latter is a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric Analysis of Covariance. The significance of a factor, however, varies according to the model fitted, although there is some agreement between the best models. A highly significant association in all models fitted is observed only for non-performing loans. The nonparametric Analysis of Covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings of the analysis reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
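A minimal sketch of an output-oriented DEA efficiency score computed by linear programming, using the standard envelopment form of the CCR model: maximize phi subject to X @ lam <= x0, Y @ lam >= phi * y0, lam >= 0. The bank inputs and the combined output measure below are illustrative assumptions, not the paper's 2001 data.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],       # inputs (rows) per DMU (columns)
              [1.0, 2.0, 1.5, 3.0]])
Y = np.array([[4.0, 6.0, 5.0, 8.0]])      # combined output measure per DMU

def output_efficiency(k):
    """phi >= 1; phi == 1 means DMU k lies on the efficient frontier."""
    m, n = X.shape
    # decision vector: [phi, lam_1, ..., lam_n]; maximise phi => minimise -phi
    c = np.r_[-1.0, np.zeros(n)]
    A_input = np.c_[np.zeros((m, 1)), X]            # X @ lam <= x_k
    b_input = X[:, k]
    A_output = np.c_[Y[:, [k]], -Y]                 # phi * y_k - Y @ lam <= 0
    b_output = np.zeros(Y.shape[0])
    res = linprog(c, A_ub=np.vstack([A_input, A_output]),
                  b_ub=np.r_[b_input, b_output],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

for k in range(X.shape[1]):
    print(f"DMU {k}: phi = {output_efficiency(k):.3f}")
```

The DEA "residuals" the abstract regresses on bank factors would then be derived from these phi scores (or their reciprocals), one LP per bank.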
Abstract:
“A Narratological Analysis of D. M. Thomas’s The White Hotel (1981)” originated within a seminar on British Postmodernist Literature during the first Master’s Degree in “British and North-American Culture and Literature” (2001-04) at the Universidade da Madeira, set up by the Department of English and German Studies. This dissertation seeks to present a narratological analysis of Thomas’s novel. The White Hotel stands as a paradigmatic example of the kind of literature that has dominated the British literary scene in the past three decades, commonly referred to as postmodernist fiction, owing to its formal craftsmanship (multiplicity of narrative voices and perspectives, mixing of differing genres and text types, inclusion of embedded narratives) alongside its handling of what are deemed postmodernist topoi (the distinction between truth and lies, history and fantasy, fact and fiction; the questioning of the nature of aesthetic representation; the roles the author and the reader hold in the narrative process; the instability of the linguistic sign; the notion of originality; and the moral responsibility the author has towards his/her work). The narratological approach carried out in this research reveals that Thomas’s text constitutes an aesthetic endeavour to challenge the teleological drive inherent in any narrative, i.e., the inevitable progression towards a reassuring end. Hence, the subversion of narrative telling, a recurrent feature in Thomas’s remaining literary output, mirrors the contemporary distrust of totalising, hierarchised and all-encompassing narratives. In its handling of historical events, namely the Holocaust, The White Hotel invites us to reassess the most profound beliefs we were taught to take for granted: progress, reality and truth. In their place the novel proposes a more flexible conception of both the world and art, especially of literary fiction. In other words, the world appears as a brutal, chaotic place the subject is forced to adjust to; accordingly, the literary work is deemed hybrid, fragmented and open. To put forth the above-mentioned issues, this research work is structured in three main chapters. The initial chapter – “What is Postmodernism?” – scrutinises not only the seminal but also more recent studies in postmodernist literary criticism. Following this, Chapter II – “Postmodernist British Fiction” – provides a brief overview of postmodernist British fiction, focusing on the fictional works that, in my opinion, are fundamental to the periodising of British postmodernism. In addition, I felt the need to include a section – “D. M. Thomas as a Postmodernist Novelist” – in which the author’s remaining literary output is briefly examined. Finally, Chapter III – “A Narratological Analysis of The White Hotel” – proposes a narratological analysis of the novel according to the Genettian analytical model. To conclude, my dissertation approaches D. M. Thomas’s The White Hotel as a text whose very existence is substantiated in the foregrounding of the contingency of all discourses, meeting the postmodernist precepts of openness and subversion of any narrative that claims to be true, globalising and all-inclusive.
Abstract:
The objective of this paper is to show an alternative time-domain representation of a non-transposed three-phase transmission line decomposed into its exact modes by using two transformation matrices. The first matrix is Clarke's matrix, which is real, frequency independent, easily represented in computational transient programs (EMTP), and separates the line into quasi-modes α, β and zero. After that, quasi-modes α and zero are decomposed into their exact modes by using a modal transformation matrix whose elements can be synthesized in the time domain through standard curve-fitting techniques. The main advantage of this alternative representation is the reduced processing time: a frequency-dependent modal transformation matrix of a three-phase line has nine elements to be represented in the time domain, whereas a modal transformation matrix of a two-phase line has only four. This paper shows the modal decomposition process and the eigenvectors of a non-transposed three-phase line with a vertical symmetry plane whose nominal voltage is 440 kV and whose length is 500 km.
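A minimal numerical sketch of the first step described here, assuming the phases are ordered so that the symmetric pair (a, c) maps to the β quasi-mode: a real, frequency-independent Clarke-type matrix is applied to a hypothetical phase-impedance matrix with the vertical-symmetry structure Zaa = Zcc and Zab = Zbc. The impedance values are illustrative, not the 440 kV line's parameters.

```python
import numpy as np

# Clarke-type transformation arranged for a line whose phases a and c are
# mirror images about a vertical plane through the centre phase b:
# alpha is aligned with phase b, beta = (a - c)/sqrt(2), zero is the common mode.
T = np.array([
    np.sqrt(2 / 3) * np.array([-0.5, 1.0, -0.5]),       # alpha
    np.array([1.0, 0.0, -1.0]) / np.sqrt(2),            # beta
    np.ones(3) / np.sqrt(3),                            # zero
])

# Hypothetical phase-impedance matrix (ohm/km) with Zaa == Zcc, Zab == Zbc
Z = np.array([[0.30 + 0.9j, 0.10 + 0.4j, 0.08 + 0.3j],
              [0.10 + 0.4j, 0.32 + 1.0j, 0.10 + 0.4j],
              [0.08 + 0.3j, 0.10 + 0.4j, 0.30 + 0.9j]])

Zq = T @ Z @ T.T        # quasi-mode impedance matrix
np.set_printoptions(precision=3, suppress=True)
print(Zq)               # beta decouples exactly; alpha and zero stay coupled,
                        # motivating the paper's second 2x2 modal transformation
```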
Abstract:
This paper deals with the use of the linear matrix inequality (LMI) approach in active vibration control problems for smart structures. To illustrate the main proposal, a robust controller for active damping was designed for a panel with piezoelectric actuators in optimal locations. The closed-loop simulations used a model identified by the eigensystem realization algorithm (ERA) and reduced by modal decomposition. We tested two different techniques to solve the problem. The first uses the LMI approach for state feedback based on an observer design, considering several simultaneous constraints: a decay rate, limited input on the actuators, a bounded output peak (output energy), and robustness to parametric uncertainties. The results demonstrated vibration attenuation in the structure by controlling only the first modes, and increased damping in the bandwidth of interest. However, spillover effects can occur, because the design did not consider the dynamic uncertainties related to high-frequency modes. In this sense, the second technique uses classical H∞ output feedback control, also solved by the LMI approach, considering robustness to residual dynamics to overcome the problem found in the first test. The results are compared and discussed. The responses show the robust performance of the system and a good reduction of the vibration level, without adding mass.
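A minimal LMI sketch of the first technique's core step, assuming a toy two-state plant: find Q > 0 and Y such that A Q + Q A' + B Y + Y' B' + 2*alpha*Q < 0, then recover the state-feedback gain K = Y Q^{-1} enforcing decay rate alpha. The observer, input and output bounds, and parametric uncertainties handled in the paper are omitted; the plant and alpha are illustrative assumptions.

```python
import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0], [-4.0, -0.1]])   # lightly damped mode (toy model)
B = np.array([[0.0], [1.0]])
alpha = 1.0                                 # required closed-loop decay rate

n, m = A.shape[0], B.shape[1]
Q = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
lmi = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T + 2 * alpha * Q
prob = cp.Problem(cp.Minimize(0),           # pure feasibility problem
                  [Q >> 1e-6 * np.eye(n), lmi << -1e-6 * np.eye(n)])
prob.solve()

K = Y.value @ np.linalg.inv(Q.value)        # state-feedback gain, u = K x
print("K =", K.round(3))
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K).round(3))
```

The change of variables Y = K Q is what makes the synthesis condition linear in the decision variables, which is the standard trick behind the simultaneous-constraint designs the abstract describes.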
Abstract:
This article provides a systemic analysis of the health sector in Brazil, based on a study of its productive structure and its interactions with the other sectors of the economy. The article draws on unpublished data on the National Health Accounts provided by the Brazilian Geographical and Statistical Institute (IBGE), and it proposes a methodology for harmonizing the System of National Accounts (input-output matrix) with the Health Satellite Accounts for 2000 and 2005. This sheds light on the relations between the health sector and the other sectors of the economy, through input-output indicators.
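A minimal sketch of the input-output indicators mentioned here: given a technical-coefficients matrix A, the Leontief inverse L = (I − A)^{-1} gives the total (direct plus indirect) output required per unit of final demand, from which sector multipliers follow. All coefficients below are illustrative assumptions, not the IBGE satellite-account figures.

```python
import numpy as np

sectors = ["health", "industry", "services"]
A = np.array([[0.05, 0.02, 0.03],      # A[i, j]: input from sector i required
              [0.20, 0.30, 0.15],      # per unit of sector j's output
              [0.25, 0.20, 0.10]])
f = np.array([100.0, 400.0, 300.0])    # hypothetical final demand by sector

L = np.linalg.inv(np.eye(3) - A)       # Leontief inverse
x = L @ f                              # gross output needed to meet demand
print("output multipliers:", dict(zip(sectors, L.sum(axis=0).round(3))))
print("gross output:", dict(zip(sectors, x.round(1))))
```

The column sum of L for the health sector is its output multiplier: the total production stimulated across the whole economy by one additional unit of final demand for health, which is the kind of linkage the harmonized accounts make measurable.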