Abstract:
Queensland appears intent on dismantling its public and preventive health services. Health Minister Lawrence Springborg last week outlined the rationale for getting rid of more than 150 jobs in nutrition, health promotion and Indigenous health, arguing previous “campaigns” and “messaging” around obesity were “piecemeal” and had “grossly failed”. The plan now, the minister argued, is to focus on a new centrally-driven and high-profile approach...
Abstract:
The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed, and these quantities are then adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of the EKF for ORT in single-resolution and multiresolution formulations, as well as the use of adaptive estimation of the EKF's noise covariances. (C) 2010 Optical Society of America
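The adaptive-estimation idea at the core of this abstract can be illustrated in one dimension. The sketch below (plain Python, scalar state) is a toy Kalman filter whose measurement-noise variance R is adapted from the innovation sequence — a minimal illustration of the general innovation-based adaptation technique, not the paper's multiresolution ORT formulation; all names and tuning values are assumptions.

```python
# Toy scalar Kalman filter with innovation-based adaptation of the
# measurement-noise variance R. Illustrative only: the function name and
# all default parameters (q, r0, alpha) are assumptions, not from the paper.

def adaptive_kalman(measurements, x0=0.0, p0=1.0, q=1e-3, r0=1.0, alpha=0.05):
    """Track a nearly constant state; adapt R from the innovations."""
    x, p, r = x0, p0, r0
    estimates = []
    for z in measurements:
        p = p + q                      # predict (random-walk state model)
        nu = z - x                     # innovation
        s = p + r                      # predicted innovation variance
        k = p / s                      # Kalman gain
        x = x + k * nu                 # state update
        p = (1.0 - k) * p              # covariance update
        # pull R toward the empirically observed innovation power
        r = (1.0 - alpha) * r + alpha * max(nu * nu - p, 1e-12)
        estimates.append(x)
    return estimates
```

Fed noisy measurements of a constant, the estimate converges toward the constant while R settles near the true noise power, without R being specified in advance.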
Abstract:
Perhaps the most fundamental prediction of financial theory is that the expected returns on financial assets are determined by the amount of risk contained in their payoffs. Assets with a riskier payoff pattern should provide higher expected returns than assets that are otherwise similar but provide payoffs that contain less risk. Financial theory also predicts that not all types of risks should be compensated with higher expected returns. It is well-known that the asset-specific risk can be diversified away, whereas the systematic component of risk that affects all assets remains even in large portfolios. Thus, the asset-specific risk that the investor can easily get rid of by diversification should not lead to higher expected returns, and only the shared movement of individual asset returns – the sensitivity of these assets to a set of systematic risk factors – should matter for asset pricing. It is within this framework that this thesis is situated. The first essay proposes a new systematic risk factor, hypothesized to be correlated with changes in investor risk aversion, which manages to explain a large fraction of the return variation in the cross-section of stock returns. The second and third essays investigate the pricing of asset-specific risk, uncorrelated with commonly used risk factors, in the cross-section of stock returns. The three essays mentioned above use stock market data from the U.S. The fourth essay presents a new total return stock market index for the Finnish stock market beginning from the opening of the Helsinki Stock Exchange in 1912 and ending in 1969 when other total return indices become available. Because a total return stock market index for the period prior to 1970 has not been available before, academics and stock market participants have not known the historical return that stock market investors in Finland could have achieved on their investments. 
The new stock market index presented in essay 4 makes it possible, for the first time, to calculate the historical average return on the Finnish stock market and to conduct further studies that require long time-series of data.
Abstract:
Static characteristics of an analog-to-digital converter (ADC) can be directly determined from the histogram-based quasi-static approach by measuring the ADC output when excited by an ideal ramp/triangular signal of sufficiently low frequency. This approach requires only a fraction of the time needed by the conventional dc voltage test, is straightforward, is easy to implement, and, in principle, is an accepted method as per the revised IEEE 1057. Its only drawback is that ramp signal sources are not ideal. The nonlinearity present in the ramp signal is thus superimposed on the measured ADC characteristics, rendering them unusable as such. In recent years, some solutions have been proposed to alleviate this problem by devising means to eliminate the contribution of signal source nonlinearity. Alternatively, a straightforward step would be to get rid of the ramp signal nonlinearity before it is applied to the ADC. Driven by this logic, this paper describes a simple method that uses a nonlinear ramp signal yet causes little influence on the measured ADC static characteristics. This is possible because even in a nonideal ramp there exist regions or segments that are nearly linear. The task, essentially, is to identify these near-linear regions in a given source and employ them to test the ADC, with a suitable amplitude to match the ADC full-scale voltage range. Implementation of this method reveals that a significant reduction in the influence of source nonlinearity can be achieved. Simulation and experimental results on 8- and 10-bit ADCs are presented to demonstrate its applicability.
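The near-linear-segment idea lends itself to a simple numerical sketch: slide a window over samples of a nonideal ramp, least-squares fit a line in each window, and keep the window with the smallest residual. This is only an illustration of the search step described in the abstract; the window length and the cubic "bow" used in the usage note are assumptions, not values from the paper.

```python
# Find the most linear window of a sampled, nonideal ramp by least-squares
# line fitting. Illustrative sketch; not the paper's actual procedure.

def best_linear_segment(samples, window):
    """Return (start, max_abs_residual) of the most linear window."""
    best = (0, float("inf"))
    xs = list(range(window))
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    for start in range(len(samples) - window + 1):
        ys = samples[start:start + window]
        sy = sum(ys)
        sxy = sum(x * y for x, y in zip(xs, ys))
        denom = window * sxx - sx * sx
        slope = (window * sxy - sx * sy) / denom
        icept = (sy - slope * sx) / window
        # worst-case deviation from the fitted line inside this window
        resid = max(abs(y - (slope * x + icept)) for x, y in zip(xs, ys))
        if resid < best[1]:
            best = (start, resid)
    return best
```

For a ramp with a cubic bow, e.g. `t + 0.2 * (t - 0.5) ** 3` sampled on [0, 1], the selected window sits near the inflection point at t = 0.5, where curvature vanishes — exactly the kind of region the method would hand to the ADC.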
Abstract:
High frequency PWM inverters produce an output voltage spectrum at the fundamental reference frequency and around the switching frequency. Thus, ideally, PWM inverters do not introduce any significant lower-order harmonics. In real systems, however, lower-order harmonics are present due to the dead-time effect, device voltage drops and other non-idealities. In order to attenuate these lower-order harmonics, and hence improve the quality of the output current, this paper presents an adaptive harmonic elimination technique. This technique uses an adaptive filter to estimate a particular harmonic that is to be attenuated and generates a voltage reference which is added to the voltage reference produced by the current control loop of the inverter. This has the effect of cancelling the voltage that was producing the particular harmonic. The effectiveness and the limitations of the technique are verified experimentally in a single-phase PWM inverter in stand-alone as well as grid-interactive modes of operation.
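A common way to realize the adaptive filter described above is a two-weight LMS estimator using sine/cosine references at the target harmonic frequency; the estimate is subtracted to cancel that harmonic. The sketch below illustrates this generic technique, which is assumed here as a plausible reading of the abstract; the function name, step size and signal values are illustrative, not from the paper.

```python
import math

# Two-weight LMS estimator of one harmonic (a classic adaptive notch).
# Illustrative sketch of the general technique, not the paper's controller.

def cancel_harmonic(signal, fs, f_h, mu=0.01):
    """Estimate the f_h component by LMS; return (residual, amplitude)."""
    wc = ws = 0.0                       # adaptive weights (cosine / sine)
    residual = []
    for n, x in enumerate(signal):
        c = math.cos(2 * math.pi * f_h * n / fs)
        s = math.sin(2 * math.pi * f_h * n / fs)
        y = wc * c + ws * s             # current estimate of the harmonic
        e = x - y                       # error drives the adaptation
        wc += 2 * mu * e * c
        ws += 2 * mu * e * s
        residual.append(e)
    return residual, math.hypot(wc, ws)
```

On a pure tone at f_h, the weights converge so that the residual goes to zero and `math.hypot(wc, ws)` recovers the tone's amplitude; in an inverter loop the estimated component would be fed back as a correcting voltage reference.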
Abstract:
The GasBench II peripheral in combination with the MAT 253 provides a sensitive platform for the determination of water isotope ratios. Here, we examined the role of moisture adsorbed within the gas chromatography (GC) column of the GasBench II in measurement uncertainties. The uncertainty in O-18/O-16 ratio measurements is determined by several factors, including the presence of water in the GC column. Contamination of the GC column with water vapour originating from samples over a longer timeframe is a critical factor in determining the reproducibility of O-18/O-16 ratios in water samples. The shift in isotope ratios observed in the experiment under dry and wet conditions correlates strongly with the retention time of the analyte CO2, indicating the effect of accumulated moisture. Two possible methods to circumvent or minimise the effect of adsorbed water on isotope ratios are presented here. The proposed methodology involves either regular baking of the GC column at a higher temperature (120 degrees C) after analysis of a batch of 32 samples, or conducting the experiment at a low GC column temperature (22.5 degrees C). The effects of water contamination on the long-term reproducibility of reference water measurements, with and without the baking protocol, are described.
Abstract:
Super-resolution microscopy has tremendously advanced our understanding of cellular biophysics and biochemistry. Specifically, the 4pi fluorescence microscopy technique stands out because of its axial super-resolution capability. All types of 4pi microscopy work well in conjunction with deconvolution techniques to get rid of artifacts due to side-lobes. In this regard, we propose a technique based on a spatial filter in a 4pi-type-C confocal setup to get rid of these artifacts. Using a special spatial filter, we have reduced the depth-of-focus. Interference of two similar depth-of-focus beams in a 4pi geometry results in a substantial reduction of side-lobes. Studies show a reduction of side-lobes by 46% and 76% for the single- and two-photon variants, respectively, compared to the 4pi-type-C confocal system. This is remarkable considering the resolving capability of the existing 4pi-type-C confocal microscopy. Moreover, the main lobe is found to be 150 nm for the proposed spatial filtering technique, as compared to 690 nm for the state-of-the-art confocal system. Reconstruction of experimentally obtained 2PE-4pi data of a green fluorescent protein (GFP)-tagged mitochondrial network shows near elimination of artifacts arising from side-lobes. The proposed technique may find interesting applications in fluorescence microscopy, nano-lithography, and cell biology. (C) 2013 AIP Publishing LLC.
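Why a reduced depth-of-focus suppresses 4pi side-lobes can be seen in a deliberately simple 1D model: take the axial response as an envelope multiplied by the cos^2 interference pattern, so the first side-lobe sits one interference period from focus, where a narrower envelope has already decayed further. Both the Gaussian envelope and the numbers below are assumptions for illustration, not the paper's optical model.

```python
import math

# Toy 1D model of a 4pi axial response: envelope(z) * cos^2(k z).
# The first side lobe sits at the next cos^2 maximum, z = pi / k, so a
# narrower envelope (smaller depth-of-focus) yields a smaller side lobe.
# Envelope shape and k are illustrative assumptions.

def sidelobe_ratio(envelope_width, k=10.0):
    """Height of the first side lobe relative to the main lobe."""
    z_side = math.pi / k                     # first off-focus cos^2 maximum
    env = lambda z: math.exp(-((z / envelope_width) ** 2))
    return env(z_side) / env(0.0)
```

For example, `sidelobe_ratio(0.2)` is well below `sidelobe_ratio(0.5)`: shrinking the depth-of-focus from 0.5 to 0.2 (arbitrary units) sharply reduces the relative side-lobe height, which is the qualitative effect the proposed spatial filter exploits.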
Abstract:
The design methodology for flexible pavements needs to address the mechanisms of pavement failure and loading intensities, and also to develop suitable approaches for the evaluation of pavement performance. In recent years, the use of geocells to improve pavement performance has been receiving considerable attention. This paper studies the influence of geocells on the required thickness of pavements by placing them below the granular layers (base and sub-base) and above the subgrade. The reduction in thickness here refers to the reduction in the thickness of the GSB (Granular Sub-base) layer, with the possibility of eliminating it altogether. To facilitate the analysis, a simple linear elastic approach is used, considering six sections given in the Indian Roads Congress (IRC) code. All analyses were carried out using the pavement analysis package KENPAVE. The results show that the use of geocells enables a reduction in pavement thickness.
Abstract:
The occurrence of spurious solutions is a well-known limitation of the standard nodal finite element method when applied to electromagnetic problems. The two commonly used remedies for this problem are (i) the addition of a penalty term, with the penalty factor based on the local dielectric constant, which reduces to a Helmholtz form on homogeneous domains (regularized formulation); and (ii) a formulation based on a vector and a scalar potential. Both strategies have shortcomings. The penalty method does not completely get rid of the spurious modes, and both methods are incapable of predicting singular eigenvalues in non-convex domains. Some non-zero spurious eigenvalues are also predicted by these methods on non-convex domains. In this work, we develop mixed finite element formulations which predict the eigenfrequencies (including their multiplicities) accurately, even for non-convex domains. The main feature of the proposed mixed finite element formulation is that no ad-hoc terms are added to the formulation as in the penalty formulation; the improvement is achieved purely by an appropriate choice of finite element spaces for the different variables. We show that the formulation works even for inhomogeneous domains, where 'double noding' is used to enforce the appropriate continuity requirements at an interface. For two-dimensional problems the shape of the domain can be arbitrary, while for three-dimensional ones, with our current formulation, only regular domains (which can be non-convex) can be modeled. Since the eigenfrequencies are modeled accurately, these elements also yield accurate results for driven problems. (C) 2014 Elsevier Ltd. All rights reserved.
Abstract:
This paper investigates the actions of the Spanish Embassy regarding the entry into Argentina of immigrants with anarchist backgrounds during the late nineteenth century and the first two years of the twentieth century. It also records the interest in following the movements of these immigrants within the country. The role of diplomatic documentation stands out: through it the Spanish Embassy remained in permanent contact with the central government, with the Argentine authorities and with the consulates of the interior in order to exchange information. The sources reveal the work of "ideological intelligence" that was carried out and the detailed knowledge held about the anarchists: their routes into the country, their professions, whereabouts, addresses and relationships, and the meticulous physical descriptions of the suspects. This policy shows the cooperation between the Spanish Embassy and Argentine institutions in ridding themselves of elements they labelled "undesirable". The relations between Argentine and Spanish anarchism are also considered, as well as the importance of the revolutionary press, even within Spain itself. The surveillance that the Spanish Embassy exercised over immigrants, the control of Spanish anarchists in Argentina and their influence on events in the Peninsula clearly demonstrate the Spanish government's concern with this question.
Abstract:
Waking up from a dreamless sleep, I open my eyes, recognize my wife’s face and am filled with joy. In this thesis, I used functional Magnetic Resonance Imaging (fMRI) to gain insights into the mechanisms involved in this seemingly simple daily occurrence, which poses at least three great challenges to neuroscience: how does conscious experience arise from the activity of the brain? How does the brain process visual input to the point of recognizing individual faces? How does the brain store semantic knowledge about people that we know? To start tackling the first question, I studied the neural correlates of unconscious processing of invisible faces. I was unable to image significant activations related to the processing of completely invisible faces, despite existing reports in the literature. I thus moved on to the next question and studied how recognition of a familiar person was achieved in the brain; I focused on finding invariant representations of person identity – representations that would be activated any time we think of a familiar person, read their name, see their picture, hear them talk, etc. There again, I could not find significant evidence for such representations with fMRI, even in regions where they had previously been found with single unit recordings in human patients (the Jennifer Aniston neurons). Faced with these null outcomes, the scope of my investigations eventually turned back towards the technique that I had been using, fMRI, and the recently praised analytical tools that I had been trusting, Multivariate Pattern Analysis. After a mostly disappointing attempt at replicating a strong single unit finding of a categorical response to animals in the right human amygdala with fMRI, I put fMRI decoding to an ultimate test with a unique dataset acquired in the macaque monkey. 
There I showed a dissociation between the ability of fMRI to pick up face viewpoint information and its inability to pick up face identity information, which I mostly traced back to the poor clustering of identity-selective units. Though fMRI decoding is a powerful new analytical tool, it does not rid fMRI of its inherent limitations as a hemodynamics-based measure.
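The decoding logic referred to above (Multivariate Pattern Analysis) can be sketched in a few lines: classify multivoxel response patterns with leave-one-out cross-validation. The version below uses a nearest-centroid classifier on synthetic data — an illustration of the analysis style only; it is not the thesis's pipeline, and all names and data are made up.

```python
import random

# Minimal MVPA-style decoding sketch: leave-one-out cross-validated
# nearest-centroid classification of simulated "voxel" patterns.
# Entirely illustrative; no relation to the thesis's actual datasets.

def loo_nearest_centroid(patterns, labels):
    """Return leave-one-out decoding accuracy over feature vectors."""
    correct = 0
    for i in range(len(patterns)):
        train = [(p, l) for j, (p, l) in enumerate(zip(patterns, labels))
                 if j != i]
        centroids = {}
        for lab in set(l for _, l in train):
            members = [p for p, l in train if l == lab]
            centroids[lab] = [sum(v) / len(members) for v in zip(*members)]
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
        pred = min(centroids, key=lambda lab: dist(patterns[i], centroids[lab]))
        correct += pred == labels[i]
    return correct / len(patterns)
```

With two well-separated synthetic classes, accuracy approaches 1.0; with information that is present in single units but not spatially clustered (as for face identity in the thesis), class means in voxel space coincide and accuracy falls to chance — which is the dissociation the text describes.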
Abstract:
This work brings together contributions from social representations theory and social memory studies for the understanding of the religious field, specifically Spiritism, recognizing the importance of the remembrance of personalities for religious dynamics. The research aims to analyze the content of the social representation of perfection, the content and structure of the memory of Spiritist personalities, and the relation between the two. It is a descriptive study carried out in two stages. Seventy-five self-declared Spiritists took part - 38 in the first stage and 37 in the second, 24 of whom were interviewed. On average, participants were 37.3 years old and had been Spiritists for 16.7 years. In the first stage, the free-evocation technique was applied over the Internet with the inducing term "superior spirits", participants indicating which people they associated with the term. In the second stage, free evocations continued, together with a questionnaire to characterize the participants. Based on the twelve most remembered personalities, semi-structured interviews were conducted, with questions about characteristics, virtues, memories and the hierarchy of the personalities, as well as about the meaning of perfection and how to attain it. The evocation data were analyzed using the four-quadrant chart technique and the construction of a maximum similarity tree. The interviews were subjected to thematic category analysis. The most remembered personalities were: Chico Xavier, Jesus, Allan Kardec, Emmanuel, Bezerra de Menezes, Mother Teresa of Calcutta, Joanna de Ângelis, Gandhi, André Luiz, Francis of Assisi, Mary of Nazareth and Divaldo P. Franco. 
The social representation of perfection was expressed, in simplified form, in the sentence: a long and difficult path in which the human being leaves a condition of inferiority and moves toward perfection, through knowledge (derived from work, study and self-knowledge), ridding oneself of selfishness and expressing love, as demonstrated and lived by Jesus. It was also found that these memories are organized mainly into two complementary value models within Spiritism: 1) knowledge, intelligence, reason, study, book; and 2) love, lived experience, faith, work, example. These constitute the two essential conditions for attaining that perfection. The first model is personified mainly in the figure of Allan Kardec and the second in Jesus. In this sense, Spiritism operates, in the minds of the faithful, a synthesis of the two models, with Chico Xavier as the personification of that synthesis, constituting the ideal type of Spiritist.
Abstract:
Single-layer ZrO_2 thin films prepared by electron-beam evaporation (EBE) were post-processed in three ways: thermal annealing in an oxygen atmosphere, laser conditioning, and post-deposition ion treatment. The optical properties and laser-induced damage threshold (LIDT) of the samples were studied for each treatment. The experimental results show that thermal annealing can effectively drive out water vapour adsorbed in the film and compensate for the oxygen loss incurred during deposition, shifting the spectrum toward shorter wavelengths, reducing absorption and raising the damage threshold. Laser conditioning can, to some extent, reduce defects and raise the damage threshold, but has no obvious effect on the spectrum or absorption of the film. Post-deposition ion treatment can increase the packing density of the film, reduce defects and lower absorption, thereby raising the damage threshold. Since the three treatments act through different mechanisms, the appropriate treatment should be chosen in practice according to the required film properties.
Abstract:
Several questions currently challenge institutional water-resources policy with regard to implementing what is laid down in Brazilian water legislation. The first concerns the implementation of environmental management by river basin itself, that is, the decentralized form of management through basin organizations. The second is to ensure that these organizations develop in a shared and participatory way, incorporating all important local segments into the direction of the basin organizations, especially residents, who are the most affected and who until very recently were excluded from decisions. A third question is to guarantee that water-resources management is integrated with environmental management as a whole. The State of Rio de Janeiro, like all the others, has its own water legislation and has been implementing the State Water Resources Policy and System since 1999. Environmental Macroregion 4 of the State, which covers the well-known Lakes region and the São João river basin, has been developing its environmental management, particularly of water resources, since 1999, through the creation of the Consórcio Intermunicipal Lagos São João and, in 2005, of the Basin Committee. These basin organizations act jointly and have managed to promote the organization of all segments of local society, most notably artisanal fishers and residents, as well as all the municipal governments and the most important water-using companies. Many results of these organizations' action have benefited the environment and the region's residents, in particular solutions to the sanitation problems of the Araruama and Saquarema lagoons and the ending of shell and sand extraction in the lagoons and the São João river, which has allowed a recovery of fish stocks in these water bodies. 
The political-institutional analysis of the creation and development of the region's basin organizations, of their successes and failures, and of their institutional sustainability is the objective of this dissertation. For this purpose, the methodology of the World Bank project Integrated River Basin Management and the Principle of Managing Water Resources at the Lowest Appropriate Level: When and Why Does It (Not) Work in Practice? by Kemper et al., 2005, was used, with adaptations.