887 results for Nonrandom two-liquid model


Relevance: 100.00%

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics, while Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates, analyzing important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH (FIGARCH) model with a skewed Student-t error distribution was identified, and its forecasting performance was compared with that of a random walk model. Results supported the contention that nominal exchange rates are unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can arise endogenously in a new open economy macroeconomic model; the model thus has the potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, i.e., the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries that are observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction to relative consumption.
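
A minimal sketch (not the dissertation's code or exact specification) of the Chapter Two exercise: fit a FIGARCH model with skewed Student-t innovations to daily log returns using the `arch` package and compare its out-of-sample mean forecasts with a driftless random walk, which predicts a zero return. The series `rates` is an assumed pandas Series of daily exchange rates.

```python
import numpy as np
import pandas as pd
from arch import arch_model

def compare_with_random_walk(rates: pd.Series, n_test: int = 250) -> dict:
    returns = 100 * np.log(rates).diff().dropna()   # daily log returns, in percent
    train, test = returns[:-n_test], returns[-n_test:]

    # Constant mean + FIGARCH(1,d,1) volatility + skewed Student-t innovations.
    am = arch_model(train, mean="Constant", vol="FIGARCH", p=1, q=1, dist="skewt")
    res = am.fit(disp="off")

    # One-step-ahead mean forecast is the estimated constant mean, mu;
    # the random walk (no drift) forecasts a zero return.
    mu = res.params["mu"]
    mse_model = float(np.mean((test - mu) ** 2))
    mse_rw = float(np.mean(test ** 2))
    return {"mse_figarch": mse_model, "mse_random_walk": mse_rw}
```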

Relevance: 100.00%

Abstract:

Metamaterials have attracted great attention in recent decades because of electromagnetic properties that are not found in nature. Since metamaterials are synthesized by inserting artificially manufactured inclusions into a specified homogeneous medium, researchers can work with a wide range of independent parameters, such as the electromagnetic properties of the material. The properties of ring resonators were investigated, as were those of metamaterials. The major theories that explain superconductivity were reviewed: BCS theory, the London equations, and the Two-Fluid Model all support the application of superconducting microstrip antennas. This thesis therefore presents theoretical, numerical, and computational analyses using a full-wave formalism, the Transverse Transmission Line (LTT) method, applied in the Fourier Transform Domain (FTD). The LTT is a full-wave method that, as a rule, obtains the electromagnetic fields in terms of the transverse components of the structure. The superconducting patch is included through a complex resistive boundary condition. Results for resonant frequency as a function of the antenna parameters are obtained. To validate the analysis, computer programs were developed in Fortran, simulations were run in commercial software, and curves were drawn using commercial software and MATLAB; the conventional patch is compared with the superconducting one, and a metamaterial substrate with a conventional one, combining substrate and patch and observing what improves in both cases.
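
A small illustrative sketch (not from the thesis) of the Gorter-Casimir two-fluid model that underlies the superconducting-patch boundary condition: the complex conductivity and surface impedance of a superconducting film. The parameter values (`sigma_n`, `Tc`, `lambda0`, 10 GHz, 77 K) are assumed for illustration, not measured data.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def two_fluid_surface_impedance(f, T, Tc, sigma_n, lambda0):
    """Surface impedance Zs of a thick superconductor at frequency f (Hz)."""
    w = 2 * np.pi * f
    t4 = (T / Tc) ** 4                      # normal-fluid fraction
    lam = lambda0 / np.sqrt(1 - t4)         # temperature-dependent penetration depth
    sigma = sigma_n * t4 - 1j / (w * MU0 * lam**2)   # sigma1 - j*sigma2
    return np.sqrt(1j * w * MU0 / sigma)    # Zs = Rs + j*Xs

# Example: YBCO-like parameters at 10 GHz (assumed values).
Zs = two_fluid_surface_impedance(f=10e9, T=77.0, Tc=92.0,
                                 sigma_n=2e6, lambda0=150e-9)
print(f"Rs = {Zs.real:.2e} ohm, Xs = {Zs.imag:.2e} ohm")
```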

Relevance: 100.00%

Abstract:

Since the creation of supersonic vehicles during the Second World War, engineers have paid special attention to the interaction between aerodynamic loads and aircraft structures, owing to a highly destructive phenomenon called panel flutter. Panel flutter is a self-excited aeroelastic phenomenon that can occur during supersonic flight due to the dynamic instability arising from the inertial, elastic, and aerodynamic forces of the system. In the flutter condition, when the critical aerodynamic pressure is reached, the vibration amplitudes of the panel become dynamically unstable and increase exponentially with time, significantly affecting the fatigue life of the aeronautical components involved. This paper therefore investigates the possibility of reducing the supersonic aeroelastic instability of rectangular plates by applying passive constrained viscoelastic layers. The rationale is that, since the addition of viscoelastic materials reduces vibration amplitudes, it becomes important to quantify the suppression of plate flutter coalescence modes that can be obtained. Moreover, although much research on the suppression of panel flutter has been carried out using passive, semi-active, and active control techniques, very few studies are adapted to the problem of estimating the flutter speeds of viscoelastic systems, since they must properly account for the frequency- and temperature-dependent behavior of the viscoelastic material. In this context, two different models of viscoelastic material behavior are developed and applied to a finite element model of a sandwich plate. After the presentation of the theoretical foundations of the methodology, a numerical study on the flutter analysis of a three-layer sandwich plate is described.
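
An illustrative sketch of one common way (not necessarily either of the paper's two models) to capture the frequency dependence of a viscoelastic core: a fractional Zener (four-parameter fractional-derivative) complex modulus. The parameter values below are assumed for illustration only.

```python
import numpy as np

def fractional_zener_modulus(omega, G0, Ginf, tau, alpha):
    """Complex modulus G*(w) = (G0 + Ginf*(1j*w*tau)**alpha) / (1 + (1j*w*tau)**alpha)."""
    s = (1j * omega * tau) ** alpha
    return (G0 + Ginf * s) / (1 + s)

omega = 2 * np.pi * np.logspace(0, 4, 5)          # 1 Hz to 10 kHz
G = fractional_zener_modulus(omega, G0=0.5e6, Ginf=50e6, tau=1e-4, alpha=0.6)
eta = G.imag / G.real                              # material loss factor
for w, g, n in zip(omega, G, eta):
    print(f"f = {w/2/np.pi:8.1f} Hz  |G*| = {abs(g):.3e} Pa  eta = {n:.3f}")
```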

Relevance: 100.00%

Abstract:

Basement intersected in Holes 525A, 528, and 527 on the Walvis Ridge consists of submarine basalt flows and pillows with minor intercalated sediments. These holes are situated on the crest and the mid- and lower NW flank of a NNW-SSE-trending ridge block that would have closely paralleled the paleo mid-ocean ridge. The basalts were erupted approximately 70 Ma, a date consistent with formation at the paleo mid-ocean ridge. The basalt types vary from aphyric quartz tholeiites on the Ridge crest to highly plagioclase-phyric olivine tholeiites on the flank. These show systematic differences in incompatible trace element and isotopic composition, and many element and isotope ratio pairs form systematic trends with the Ridge crest basalts at one end and the highly phyric Ridge flank basalts at the other. The low 143Nd/144Nd (0.51238) and high 87Sr/86Sr (0.70512) ratios of the Ridge crest basalts suggest derivation from a mantle source with long-standing enrichment in Nd/Sm and Rb/Sr. This isotopic signature is similar to that of alkaline basalts on Tristan da Cunha but offset to somewhat lower 143Nd/144Nd values. The isotopic ratio trends may be extrapolated beyond the Ridge flank basalts (which have 143Nd/144Nd of 0.51270 and 87Sr/86Sr of 0.70417) in the direction of typical MORB compositions. These isotopic correlations are equally consistent with mixing of depleted and enriched end-member melts or with partial melting of an inhomogeneous, variably enriched mantle source. However, the observed Zr-Ba-Nb-Y interelement relationships are inconsistent with any simple two-component model of magma mixing or partial melting. They also preclude extensive involvement of depleted (N-type) MORB material or its mantle sources in the petrogenesis of the Walvis Ridge basalts.
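
A minimal sketch (not from the paper) of the simple two-component mixing calculation the abstract tests against: Sr-Nd isotope ratios of a binary mixture, using the Ridge-crest and Ridge-flank compositions quoted above as end-members. The elemental Sr and Nd concentrations are assumed values, since the shape of a mixing curve in ratio-ratio space depends on them.

```python
import numpy as np

# End-member isotope ratios from the abstract; concentrations (ppm) assumed.
crest = {"Sr87_86": 0.70512, "Nd143_144": 0.51238, "Sr": 350.0, "Nd": 25.0}
flank = {"Sr87_86": 0.70417, "Nd143_144": 0.51270, "Sr": 200.0, "Nd": 10.0}

def mix(f):
    """Isotope ratios of a mixture containing mass fraction f of 'crest'."""
    sr = f * crest["Sr"] + (1 - f) * flank["Sr"]
    nd = f * crest["Nd"] + (1 - f) * flank["Nd"]
    sr_ratio = (f * crest["Sr"] * crest["Sr87_86"]
                + (1 - f) * flank["Sr"] * flank["Sr87_86"]) / sr
    nd_ratio = (f * crest["Nd"] * crest["Nd143_144"]
                + (1 - f) * flank["Nd"] * flank["Nd143_144"]) / nd
    return sr_ratio, nd_ratio

for f in np.linspace(0, 1, 5):
    sr, nd = mix(f)
    print(f"f_crest = {f:.2f}  87Sr/86Sr = {sr:.5f}  143Nd/144Nd = {nd:.5f}")
```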

Relevance: 100.00%

Abstract:

A north-south transect of 17 cores was constructed along the eastern boundary of the California Current system, from 33° to 42°N, to investigate changes in biogenic sedimentation over the past 30 kyr. Percentages and mass accumulation rates of CaCO3, Corg, and biogenic opal were assembled at a relatively high resolution of 500 to 1000 years per sample. Time-space maps reveal a complex pattern of changes that does not follow a simple glacial-interglacial two-mode model. Biogenic sedimentation shows responses that are sometimes time-transgressive and sometimes coeval, and most responses show more consistency within a limited geographic area than through time. Reconstructed conditions during late oxygen isotope stage 3 were more like early Holocene conditions than those at any other time during the last 30 kyr. Coastal upwelling and productivity during oxygen isotope stage 3 were relatively strong along the central California margin but weak along the northern California margin. Precipitation increased during the last glacial interval in the central California region, and the waters of the southern California margin had relatively low productivity. Productivity on the southern Oregon margin was relatively low at the beginning of the last glacial interval, but by about 20 ka productivity in this area had increased significantly. This change suggests that the center of the divergence of the West Wind Drift shifted south at this time. The end of the last glacial interval was characterized by increased productivity along the southern California margin and increased upwelling along the central California margin, while upwelling remained weak along the northern California margin. A sudden (<300 years) decrease in CaCO3, Corg, and biogenic opal occurred at 13 ka. These changes suggest a major reorientation of the atmospheric circulation in the North Pacific and western North America and the establishment of strong seasonality in the central California region. A carbonate preservation event at 10 ka appears to reflect the uptake of CO2 by the terrestrial biosphere as the northern latitudes were reforested following the retreat of the glaciers. The Holocene has been a period of relatively high productivity along the southern California margin, relatively strong coastal upwelling along the central California margin, relatively weak upwelling along the northern California margin, and northward migration of the divergence zone of the West Wind Drift.
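
A small sketch of how the mass accumulation rate (MAR) of a sediment component is conventionally computed from its measured weight percent, the dry bulk density, and the linear sedimentation rate; the inputs below are assumed example values, not the paper's data.

```python
def mass_accumulation_rate(weight_percent, dry_bulk_density_g_cm3, sed_rate_cm_kyr):
    """MAR in g/cm^2/kyr = (wt% / 100) * DBD (g/cm^3) * LSR (cm/kyr)."""
    return (weight_percent / 100.0) * dry_bulk_density_g_cm3 * sed_rate_cm_kyr

# Example: 40 wt% CaCO3, 0.8 g/cm^3 dry bulk density, 10 cm/kyr sedimentation.
print(mass_accumulation_rate(40.0, 0.8, 10.0), "g/cm^2/kyr")  # -> 3.2
```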

Relevance: 100.00%

Abstract:

This work examines analytically the forced convection in a channel partially filled with a porous material and subjected to a constant wall heat flux. The Darcy–Brinkman–Forchheimer model is used to represent fluid transport through the porous material, and a local thermal non-equilibrium, two-equation model is employed for the solid and fluid heat transport. Two fundamental models (models A and B) represent the thermal boundary conditions at the interface between the porous medium and the clear region. The governing equations of the problem are manipulated and, for each interface model, exact solutions for the solid and fluid temperature fields are developed. These solutions incorporate the porous material thickness, Biot number, fluid-to-solid thermal conductivity ratio, and Darcy number as parameters. The results can be readily used to validate numerical simulations. They are further applicable to the analysis of enhanced heat transfer in heat exchangers using porous materials.
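
A generic form (notation assumed; coefficients as usually defined, not necessarily the paper's exact non-dimensionalization) of the local thermal non-equilibrium, two-equation energy model referred to above: separate fluid and solid equations coupled through an interstitial heat-transfer term with coefficient h_sf and specific surface area a_sf.

```latex
\begin{align}
  k_{f,\mathrm{eff}}\,\frac{\partial^2 T_f}{\partial y^2}
    + h_{sf}\, a_{sf}\,(T_s - T_f)
    &= \rho c_p\, u\,\frac{\partial T_f}{\partial x}, \\
  k_{s,\mathrm{eff}}\,\frac{\partial^2 T_s}{\partial y^2}
    - h_{sf}\, a_{sf}\,(T_s - T_f) &= 0 .
\end{align}
```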

Relevance: 100.00%

Abstract:

A self-consistent relativistic two-fluid model is proposed for one-dimensional electron-ion plasma dynamics. A multiple-scales perturbation technique is employed, leading to an evolution equation for the wave envelope in the form of a nonlinear Schrödinger-type equation (NLSE). The inclusion of relativistic effects is shown to introduce density-dependent factors, not present in the non-relativistic case, into the conditions for modulational instability. The role of relativistic effects on the linear dispersion laws and on envelope soliton solutions of the NLSE is discussed.
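
The generic form (notation assumed) of the envelope equation referred to above, for the slowly varying envelope psi, with dispersion coefficient P and nonlinearity coefficient Q determined by the carrier wavenumber and, in the relativistic case, by the plasma density:

```latex
\begin{equation}
  i\,\frac{\partial \psi}{\partial \tau}
  + P\,\frac{\partial^2 \psi}{\partial \xi^2}
  + Q\,|\psi|^2 \psi = 0 .
\end{equation}
```

A plane-wave solution of this equation is modulationally unstable when PQ > 0, which is the condition the density-dependent relativistic factors modify; for PQ < 0 the envelope instead supports dark or grey solitons.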

Relevance: 100.00%

Abstract:

Costs related to inventory usually represent a significant share of a company's total assets. Despite this, companies generally pay little attention to inventory, even though the benefits of effective inventory management are obvious: less tied-up capital, increased customer satisfaction, and a better working environment. Permobil AB, Timrå is in an intense period of revenue growth, with the production unit aiming for a 30% increase in output over the next two years. To make this possible the company has to improve the way it distributes and handles material. The purpose of the study is to provide useful information and concrete proposals for action, so that the company can build a strategy for effective and sustainable inventory management. Alternative forecasting methods are suggested in order to reach a more nuanced view of different articles and how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected staff decide the criteria by which articles should be valued; the criteria they agreed on were annual volume value, lead time, frequency rate, and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency were the criteria that determined the class of an article. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecasting methods were tested and compared with the current solution. The current forecasting method turned out to perform worse than both a moving average and exponential smoothing with trend. The small sample of ten random articles is not large enough to reject the current solution, but the result is still reason enough for the company to monitor the quality of its forecasts.
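
A minimal sketch (illustrative demand series and assumed smoothing parameters) of the two forecasting methods the study compared against the current solution: a simple moving average and exponential smoothing with trend (Holt's method).

```python
import numpy as np

def moving_average_forecast(demand, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    return float(np.mean(demand[-window:]))

def holt_forecast(demand, alpha=0.3, beta=0.1):
    """One-step-ahead forecast from Holt's linear (trend) exponential smoothing."""
    level, trend = demand[0], demand[1] - demand[0]
    for y in demand[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend

demand = [12, 15, 14, 18, 20, 19, 23, 25]   # illustrative monthly spare-part demand
print("moving average:", moving_average_forecast(demand))
print("Holt:          ", holt_forecast(demand))
```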

Relevance: 100.00%

Abstract:

This thesis develops bootstrap methods for factor models, which have been widely used to generate forecasts since the pioneering article of Stock and Watson (2002) on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. The thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two written in collaboration with Sílvia Gonçalves and Benoit Perron. The first chapter studies how bootstrap methods can be used for inference in models that forecast h periods into the future. To this end, it examines bootstrap inference in a factor-augmented regression setting where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for confidence intervals of the estimated coefficients using these approaches, relative to asymptotic theory and to the wild bootstrap, in the presence of serial correlation in the regression errors. The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of normality of the innovations. We propose bootstrap prediction intervals for an observation h periods in the future and for its conditional mean, assuming that these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat the factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct prediction intervals that are valid under more general conditions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014). The third chapter suggests consistent model-selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion whose validity we establish generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improvement in the probability of parsimoniously selecting the estimated factors compared with available selection methods.
The empirical application revisits the relationship between macroeconomic and financial factors and the excess return on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, factors strongly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
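
A minimal sketch (not the authors' code) of the block wild bootstrap idea for regression residuals: the residuals are split into non-overlapping blocks, and every residual within a block is multiplied by the same external N(0,1) draw, preserving serial dependence within blocks. `fitted` and `residuals` are assumed to come from a first-stage regression of y on estimated factors; the placeholders below just make the sketch run end-to-end.

```python
import numpy as np

def block_wild_bootstrap_sample(fitted, residuals, block_size, rng):
    """Generate one bootstrap sample y* = fitted + eta_block * residuals."""
    T = len(residuals)
    n_blocks = int(np.ceil(T / block_size))
    eta = rng.standard_normal(n_blocks)           # one draw per block
    scale = np.repeat(eta, block_size)[:T]        # same draw within a block
    return fitted + scale * residuals

rng = np.random.default_rng(0)
T = 200
fitted = np.linspace(0.0, 1.0, T)                 # placeholder fitted values
residuals = rng.standard_normal(T)                # placeholder residuals
y_star = block_wild_bootstrap_sample(fitted, residuals, block_size=10, rng=rng)
print(y_star[:5])
```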

Relevance: 100.00%

Abstract:

Introduction: Handedness is the difference in control ability between the two sides of the body. Methods used to assess hand preference include direct observation of dominant-hand use and self-report inventories completed by the person being assessed. The Edinburgh Handedness Inventory (EHI) is the most widely used instrument for assessing hand preference. Despite its wide use, there are no studies in Portugal evaluating its validity and reliability. Objectives: To study the psychometric properties of the Edinburgh Handedness Inventory in a sample of the Portuguese population. Methods: The sample consists of 290 people (135 men and 155 women), aged between 18 and 65 years. All participants signed an informed consent form and completed a battery of neuropsychological tests. Results: The mean EHI score was 62.36 (SD = 38.00). Of the six sociodemographic variables examined (age, sex, education, area of residence, region, and profession), three had a significant influence on EHI scores: age, area of residence, and region. The reliability and temporal stability of the EHI were adequate. Confirmatory factor analysis showed that the model is not well explained by a single factor; a two-factor model was also inadequate. Conclusion: Although good internal consistency was obtained, we cannot consider this test the most appropriate measure of the handedness construct.
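
A small sketch of the EHI's standard scoring rule (the study's exact scoring conventions are assumed to follow it): the laterality quotient LQ = 100 * (R - L) / (R + L), where R and L are the summed right- and left-hand preference marks across the ten items.

```python
def laterality_quotient(right_marks, left_marks):
    """EHI laterality quotient: -100 (fully left) to +100 (fully right)."""
    r, l = sum(right_marks), sum(left_marks)
    return 100.0 * (r - l) / (r + l)

# Example: ten items with a strong right preference (illustrative data).
right = [2, 2, 1, 2, 2, 1, 2, 2, 1, 2]
left  = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
print(laterality_quotient(right, left))  # -> 70.0
```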

Relevance: 100.00%

Abstract:

This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and is able to realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a workstation with suitable menu-driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work performed during the first year of the study. Briefly, the following has been accomplished: a two-dimensional model of a typical 3-S2 tractor-trailer combination was created; a finite element structural analysis program, ANSYS, was used to model the pavement; and computer runs were performed varying the parameters defining both vehicle and road elements. The resulting time-specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated. A damage function relating load replications to further pavement deterioration must be assumed. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.
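
A brief sketch of one common assumption for such a damage function (not necessarily the one this study adopts): the classic fourth-power law, in which relative pavement damage grows with the fourth power of axle load, combined with Miner's rule to accumulate damage over load repetitions. The traffic figures below are illustrative.

```python
def equivalent_damage_factor(axle_load_kip, reference_load_kip=18.0, exponent=4.0):
    """Relative damage of one axle pass versus the 18-kip reference axle."""
    return (axle_load_kip / reference_load_kip) ** exponent

def miners_damage(loads_and_reps, allowable_reps_at_reference):
    """Accumulated damage fraction; failure is predicted when it reaches 1.0."""
    esals = sum(n * equivalent_damage_factor(p) for p, n in loads_and_reps)
    return esals / allowable_reps_at_reference

# Example: 1e5 passes of a 20-kip axle and 5e4 passes of a 16-kip axle.
print(miners_damage([(20.0, 1e5), (16.0, 5e4)], allowable_reps_at_reference=1e6))
```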

Relevance: 100.00%

Abstract:

Purpose: To study the psychometric properties and normative data of Raven's Standard Progressive Matrices in a Portuguese community sample. Method: The sample consists of 697 people (314 men and 383 women), aged between 12 and 90 years. All participants signed an informed consent form and completed a battery of neuropsychological tests, which included Raven's Standard Progressive Matrices (RSPM), the Rey 15-Item Memory Test, the Zung Self-Rating Anxiety Scale, the Frontal Assessment Battery, and the Rey Complex Figure Test. Results: The mean RSPM score was 44.47 (SD = 10.78). All of the sociodemographic variables (age, sex, education, profession, region, and typology of urban areas), with the exception of marital status, had a significant influence on RSPM scores. The reliability and temporal stability of the RSPM were adequate. Exploratory and confirmatory factor analysis showed that a one-factor model is not adequate; a four-factor model was also not adequate. Conclusion: The data from this study suggest that the instrument has potential for use with the Portuguese population.

Relevance: 100.00%

Abstract:

In Fall 2015, the Engineering and Physical Science Library (EPSL) began lending anatomical models as part of its course reserves program. EPSL received a partial skeleton and two muscle model figures from instructors of BSCI105. These models circulate for 4 hours at a time and are generally used by small, collaborative groups of students in the library. This poster looks at the challenges and rewards of adding these items to EPSL's course reserves.

Relevance: 100.00%

Abstract:

Quantitative conditions are derived under which electrically excitable membranes can undergo a phase transition induced by an externally applied voltage noise. The results obtained for a non-cooperative and a cooperative form of the two-state model are compared.
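
A generic sketch (notation assumed; not the paper's exact formulation) of the non-cooperative two-state channel kinetics underlying such an analysis: the open fraction n relaxes with voltage-dependent rates, and the applied voltage carries an external noise term.

```latex
\begin{align}
  \frac{dn}{dt} &= \alpha(V)\,(1 - n) - \beta(V)\,n, \\
  V(t) &= V_0 + \sigma\,\xi(t),
\end{align}
```

Here xi(t) is Gaussian white noise of intensity sigma. Because alpha and beta depend nonlinearly on V, the noise enters multiplicatively, which is what permits a noise-induced transition in the stationary distribution of n.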

Relevance: 100.00%

Abstract:

To understand the evolution of bipedalism among the hominoids in an ecological context, we need to be able to estimate the energetic cost of locomotion in fossil forms. Ideally such an estimate would be based entirely on morphology since, except for the rare instances where footprints are preserved, this is the only primary source of evidence available. In this paper we use evolutionary robotics techniques (genetic algorithms, pattern generators and mechanical modeling) to produce a biomimetic simulation of bipedalism based on human body dimensions. The mechanical simulation is a seven-segment, two-dimensional model with motive force provided by tension generators representing the major muscle groups acting around the lower-limb joints. Metabolic energy costs are calculated from the muscle model, and bipedal gait is generated using a finite-state pattern generator whose parameters are produced using a genetic algorithm with locomotor economy (maximum distance for a fixed energy cost) as the fitness criterion. The model is validated by comparing the values it generates with those for modern humans. The result (maximum efficiency of 200 J m-1) is within 15% of the experimentally derived value, which is very encouraging and suggests that this is a useful analytic technique for investigating the locomotor behaviour of fossil forms. Initial work suggests that in the future this technique could be used to estimate other locomotor parameters such as top speed. In addition, the animations produced by this technique are qualitatively very convincing, which suggests that this may also be a useful technique for visualizing bipedal locomotion.
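
A toy sketch of the optimization loop described above: a genetic algorithm searching pattern-generator parameters with locomotor economy (distance per fixed energy budget) as the fitness criterion. The `simulate_gait` function is a stand-in for the seven-segment forward-dynamic model, which is far too large to reproduce here; everything below it is illustrative only.

```python
import random

def simulate_gait(params):
    """Placeholder for the forward-dynamic simulation: returns distance (m)
    travelled before a fixed metabolic energy budget is exhausted."""
    # A smooth dummy fitness landscape so the sketch runs end-to-end.
    return sum(p * (1.0 - p) for p in params)

def genetic_algorithm(n_params=8, pop_size=40, generations=100,
                      mutation_sd=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_gait, reverse=True)     # economy as fitness
        parents = pop[: pop_size // 4]                # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_params)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [min(1.0, max(0.0, g + rng.gauss(0, mutation_sd)))
                     for g in child]                  # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=simulate_gait)

best = genetic_algorithm()
print("best parameters:", [round(g, 3) for g in best])
```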