957 results for Two fluid model
Abstract:
The concept of a "Superheavy Quasiatom" is discussed. Radiative transition times are compared with the lifetime of the intermediate system, cross sections are calculated within a two-collision model, and induced transitions and their anisotropic emission are discussed. Recent experimental and theoretical results are presented for collision systems obtained by bombarding various heavy targets with iodine beams, giving combined Z-values between 120 and 145. Results include the energy dependence of the peak structure interpreted as M X-rays from superheavy quasiatoms, and the anisotropy of X-ray emission with respect to the beam direction. The data are discussed within the available models. These cannot explain the strong emission of anisotropic radiation in the X-ray energy range of quasiatomic M X-rays at small bombarding energies.
Abstract:
In psycholinguistic research, the assumption is widespread that the evaluation of information with regard to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, directly or indirectly call this two-step model of comprehension and validation into question. In particular, findings on Stroop-like stimulus-response compatibility effects that arise when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; epistemic Stroop effect, Richter et al., 2009) suggest that readers perform a non-strategic check of the validity of information already during comprehension. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routine, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not clearly true or false, but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to gradual differences in the fit between information and world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by the differing predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to the findings of other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study found a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that signal the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast and non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process. Overall, the findings argue against a conceptualization of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. Implications of the findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are discussed.
Abstract:
In this work, we present an atomistic-continuum model for simulating ultrafast laser-induced melting processes in semiconductors, using silicon as an example. The kinetics of transient non-equilibrium phase transition mechanisms is addressed with the molecular dynamics (MD) method on the atomic level, whereas the laser light absorption, the strong generated electron-phonon nonequilibrium, fast heat conduction, and photo-excited free carrier diffusion are accounted for with a continuum TTM-like model (called nTTM). First, we independently consider the applications of nTTM and MD for the description of silicon, and then construct the combined MD-nTTM model. Its development and thorough testing are followed by a comprehensive computational study of fast nonequilibrium processes induced in silicon by ultrashort laser irradiation. The new model allowed us to investigate the effect of laser-induced pressure and lattice temperature on the melting kinetics. Two competing melting mechanisms, heterogeneous and homogeneous, were identified in our large-scale simulations. Apart from the classical heterogeneous melting mechanism, the homogeneous nucleation of the liquid phase inside the material contributes significantly to the melting process. The simulations showed that, due to the open diamond structure of the crystal, the laser-generated internal compressive stresses reduce the crystal's stability against homogeneous melting. Consequently, the latter can take on a massive character within several picoseconds of laser heating. Due to the large negative volume of melting of silicon, the material contracts upon the phase transition, relaxing the compressive stresses, and the subsequent melting proceeds heterogeneously until the excess thermal energy is consumed. A series of simulations over a range of absorbed fluences allowed us to find the threshold fluence at which homogeneous liquid nucleation starts contributing to the classical heterogeneous propagation of the solid-liquid interface. A series of simulations over a range of material thicknesses showed that the sample width chosen in our simulations (800 nm) corresponds to a thick sample. Additionally, in order to support the main conclusions, the results were verified with a different interatomic potential. Possible improvements of the model to account for nonthermal effects are discussed, and certain restrictions on suitable interatomic potentials are identified. As a first step towards the inclusion of these effects into MD-nTTM, we performed nanometer-scale MD simulations with a new interatomic potential designed to reproduce ab initio calculations at a laser-induced electronic temperature of 18946 K. The simulations demonstrated that, similarly to thermal melting, the nonthermal phase transition occurs through nucleation. A series of simulations showed that higher (lower) initial pressure reinforces (hinders) the creation and growth of nonthermal liquid nuclei. Using Si as the example, the laser melting kinetics of semiconductors was found to differ noticeably from that of metals with a face-centered cubic crystal structure. The results of this study therefore have important implications for the interpretation of experimental data on the melting kinetics of semiconductors.
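The abstract does not reproduce the nTTM equations; for orientation, here is a minimal sketch of the classical two-temperature model (TTM) that nTTM generalizes (the carrier-density equation and silicon-specific source terms are omitted, and the symbols are generic):

$$ C_e\,\frac{\partial T_e}{\partial t} = \nabla\cdot\big(k_e \nabla T_e\big) - G\,(T_e - T_l) + S(\mathbf{r},t), \qquad C_l\,\frac{\partial T_l}{\partial t} = \nabla\cdot\big(k_l \nabla T_l\big) + G\,(T_e - T_l), $$

where $T_e$ and $T_l$ are the carrier and lattice temperatures, $C$ and $k$ the corresponding heat capacities and conductivities, $G$ the carrier-phonon coupling, and $S$ the laser source. In a combined MD-continuum scheme of this kind, the lattice equation is typically replaced by the explicit MD atoms, which is what makes nucleation and interface kinetics accessible.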
Abstract:
This paper presents a theoretical and empirical analysis of strategic competition in retail banking when some of the financial firms are non-profit organisations that invest in social activities. The banking literature on competition is fairly large, but the strategic interaction between profit-maximizing and non-profit firms has not been extensively analysed, except by Purroy and Salas (1999). In this paper, a completely different approach is taken. An adaptation of Hotelling's two-stage model of spatial competition is developed to take into account consumer perceptions with respect to the two different types of financial institutions. The empirical analysis confirms that consumers take into account features other than price, such as social contribution or proximity of service, when making a deposit or mortgage decision. These conclusions are of interest in the debate about a firm's social or ethical activities. It is shown that if consumers value social activities, firms can improve…
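As a stylized illustration of the spatial-competition setup (my notation, not necessarily the paper's): consumers are uniform on $[0,1]$, and a consumer at $x$ buying from bank $i$ located at $x_i$ at price $p_i$, with perceived social value $s_i$, obtains utility

$$ U_i(x) = v + s_i - p_i - t\,|x - x_i| . $$

With banks at the endpoints, the indifferent consumer sits at $\hat{x} = \tfrac{1}{2} + \frac{(p_2 - s_2) - (p_1 - s_1)}{2t}$, so a non-profit's social activity $s_i$ shifts demand exactly like a price cut, which is how non-price features can matter in equilibrium.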
Abstract:
A new approach for controlling the size of particles fabricated using the Electrohydrodynamic Atomization (EHDA) method is being developed. In short, the EHDA process produces solution droplets in a controlled manner, and as the solvent evaporates from the surface of the droplets, polymeric particles are formed. By varying the applied voltage, the size of the droplets can be changed, and consequently the size of the particles can also be controlled. By using both a nozzle electrode and a ring electrode placed axisymmetrically and slightly above the nozzle electrode, we are able to produce a Single Taylor Cone Single Jet over a wide range of voltages, in contrast to using a single nozzle electrode alone, where the range of permissible voltages for the creation of the Single Taylor Cone Single Jet is usually very small. Phase Doppler Particle Analyzer (PDPA) measurements have shown that the droplet size increases with the applied voltage. This trend is predicted by the electrohydrodynamic theory of the Single Taylor Cone Single Jet based on a perfect dielectric fluid model. Particles fabricated at different voltages do not show much change in particle size, which may be attributed to the solvent evaporation process. Nevertheless, these preliminary results do show that this method has the potential to provide fine control of particle size with a relatively simple setup, with trends predictable by existing theories.
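A back-of-the-envelope mass balance (my own illustration; the abstract does not quantify this) links the two sizes: if a droplet of diameter $d_d$ carries polymer at volume fraction $\phi$, complete solvent evaporation leaves a dense particle of diameter

$$ d_p = d_d\,\phi^{1/3} . $$

For example, a 10 μm droplet at $\phi = 10^{-3}$ yields a particle of about 1 μm, so at dilute concentrations the particle is roughly an order of magnitude smaller than the droplet and modest voltage-driven changes in $d_d$ translate into small absolute changes in $d_p$.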
Abstract:
The economic literature has not examined how competition between educational institutions specifically affects the choice of educational standards and tuition fees. Using a theoretical model, I analyze how competition between educational institutions affects the choice of academic standards, comparing the competitive solution with the efficient solution and the monopoly solution. Individuals are heterogeneous and differ in their ability; educational institutions compete by setting the educational standard in a first stage and tuition in a second stage. Once standards and tuition levels are set, they are public information, allowing individuals to choose whether to enroll in an educational institution at all, and if so which one, according to their innate ability and the cost associated with effort. The results show that social welfare increases when the economy has more than one educational institution with different standards, and that the market solution, whether monopoly or competition, forces students to exert greater effort to obtain the degree. Regardless of the cost relationship, tuition is always higher at the institution with the higher educational standard, and higher in the market solution. When the unit cost of the higher-standard institution is greater than or equal to that of the lower-standard institution, the educational standards chosen by the planner are higher, and the effort required of individuals is lower, than in the market solution.
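A stylized version of such a two-stage model (purely illustrative notation, not the paper's): a student with ability $a$ who enrolls at institution $j$ facing standard $e_j$ and tuition $p_j$ obtains payoff

$$ U_j(a) = v - p_j - c\,\frac{e_j}{a}, $$

enrolling wherever $U_j(a)$ is largest, provided it is non-negative. Institutions choose $e_j$ in stage one and $p_j$ in stage two, and the subgame-perfect equilibrium is found by solving the tuition stage first. Because a higher $e_j$ raises the effort cost more for low-ability students, the high-standard institution attracts only high-$a$ students and, as in the result above, can sustain higher tuition.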
Abstract:
The common assumptions that the labor income share does not change over time or across countries and that factor income shares are equal to the elasticity of output with respect to factors have had important implications for economic theory. However, there are various theoretical reasons why the elasticity of output with respect to reproducible factors should be correlated with the stage of development. In particular, the behavior of international trade and capital flows and the existence of factor-saving innovations imply such a correlation. If this correlation exists, and if factor income shares are equal to the elasticity of output with respect to factors, then the labor income share must be negatively correlated with the stage of development. We propose an explanation for why the labor income share shows no correlation with income per capita: the existence of a labor-intensive sector that produces non-tradable goods.
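The share-equals-elasticity link is a one-line result under competitive factor markets: with $Y = A K^{\alpha} L^{1-\alpha}$, the wage is $w = \partial Y/\partial L = (1-\alpha)\,Y/L$, so

$$ \frac{wL}{Y} = \frac{(1-\alpha)\,A K^{\alpha} L^{-\alpha}\cdot L}{A K^{\alpha} L^{1-\alpha}} = 1-\alpha, $$

i.e., the labor share equals the output elasticity of labor and is constant only because $\alpha$ is. If the elasticity of reproducible factors instead rises with development, a constant measured labor share requires an offsetting mechanism such as the non-tradable, labor-intensive sector proposed here.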
Abstract:
Even though antenatal care is universally regarded as important, the determinants of demand for antenatal care have not been widely studied. Evidence concerning which socioeconomic conditions influence whether a pregnant woman attends at least one antenatal consultation, and how these factors affect missed antenatal consultations, is very limited. To generate this evidence, a two-stage analysis was performed with data from the Demographic and Health Survey carried out by Profamilia in Colombia in 2005. The first stage was a logit model reporting the marginal effects on the probability of attending the first visit, and an ordinary least squares model was estimated for the second stage. It was found that mothers living in the Pacific region, as well as young mothers, seem to have a lower probability of attending the first visit, but these factors are not related to the number of missed antenatal consultations once the first visit has taken place. The effect of health insurance was surprising because of the differing effects across insurers. Some family and personal conditions, such as whether the last child was wanted and the number of previous children, proved important in determining demand. The mother's educational attainment proved important, whereas the father's did not. This paper provides some elements for policy making aimed at increasing demand inducement for antenatal care, as well as stimulating research on demand for specific health issues.
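A minimal sketch of such a two-stage estimation on synthetic data (the variable names and data-generating process below are my own placeholders, not the Profamilia/DHS 2005 fields):

```python
# Two-stage demand estimation sketch: logit for any attendance (with marginal
# effects), then OLS for missed consultations among attendees.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
young = rng.integers(0, 2, n)      # hypothetical: mother under 20
pacific = rng.integers(0, 2, n)    # hypothetical: lives in Pacific region
educ = rng.integers(0, 12, n)      # hypothetical: mother's years of schooling
X = sm.add_constant(np.column_stack([young, pacific, educ]))

# Synthetic data-generating process, for illustration only.
p_attend = 1.0 / (1.0 + np.exp(-(0.5 - 0.4 * young - 0.5 * pacific + 0.08 * educ)))
attend = (rng.random(n) < p_attend).astype(int)
missed = rng.poisson(2.0 - 0.1 * educ.clip(max=10), n)

# Stage 1: logit for attending at least one consultation, marginal effects.
logit_res = sm.Logit(attend, X).fit(disp=False)
print(logit_res.get_margeff().summary())

# Stage 2: OLS for the number of missed consultations, attendees only.
mask = attend == 1
ols_res = sm.OLS(missed[mask], X[mask]).fit()
print(ols_res.summary())
```

The sample-selection issue (stage 2 is only observed for attendees) is left untreated here, as in a plain two-part model.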
Abstract:
Asset correlations are of critical importance in quantifying portfolio credit risk and economic capital in financial institutions. Estimation of asset correlation with rating transition data has focused on point estimation of the correlation without giving any consideration to the uncertainty around these point estimates. In this article we use Bayesian methods to estimate a dynamic factor model for default risk using rating data (McNeil et al., 2005; McNeil and Wendin, 2007). Bayesian methods allow us to formally incorporate human judgement in the estimation of asset correlation through the prior distribution, and to fully characterize a confidence set for the correlations. Results indicate that: i) a two-factor model, rather than the one-factor model proposed by the Basel II framework, better represents the historical default data; ii) the importance of unobserved factors in this type of model is reinforced, and the levels of the implied asset correlations depend critically on the latent state variable used to capture the dynamics of default, as well as on other assumptions of the statistical model; iii) the posterior distributions of the asset correlations show that the Basel-recommended bounds for this parameter understate the level of systemic risk.
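For context, the one-factor model underlying the Basel II correlation assumption is the Vasicek model (a standard result, not code from the article): conditional on a systematic factor $Z$, an obligor with unconditional PD $p$ and asset correlation $\rho$ defaults with probability $\Phi\big((\Phi^{-1}(p) - \sqrt{\rho}\,Z)/\sqrt{1-\rho}\big)$. A minimal simulation sketch with illustrative parameters:

```python
# Vasicek one-factor default model: simulate annual default rates for a
# homogeneous pool (illustrative parameters, not the article's estimates).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
p, rho = 0.02, 0.15                 # unconditional PD and asset correlation
n_obligors, n_years = 1000, 10000
thr = norm.ppf(p)                   # default threshold on the asset return

Z = rng.standard_normal(n_years)    # systematic factor, one draw per year
cond_pd = norm.cdf((thr - np.sqrt(rho) * Z) / np.sqrt(1 - rho))
defaults = rng.binomial(n_obligors, cond_pd)

print(defaults.mean() / n_obligors)   # close to p
print(np.quantile(cond_pd, 0.999))    # tail conditional PD, grows with rho

# A two-factor extension (as the article favors) would give different obligor
# groups loadings on two systematic factors instead of the single Z above.
```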
Abstract:
Temporal misalignment is the mismatch between two signals caused by a distortion of the time axis. Fault Detection and Diagnosis (FDD) enables the detection, diagnosis, and correction of faults in a process. FDD methodology is divided into two categories: model-based and non-model-based techniques. This doctoral thesis studies the effect of temporal misalignment on FDD. Our attention focuses on the analysis and design of FDD systems in the presence of data communication problems, such as delays and losses. Two techniques are proposed to mitigate these problems: one based on dynamic programming and the other on optimization. The proposed methods have been validated on several dynamic systems: position control of a DC motor, a laboratory plant, and an electrical-systems problem known as voltage sag.
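The abstract does not spell out its dynamic-programming technique; the canonical dynamic-programming approach to temporal misalignment is dynamic time warping (DTW), sketched here on the assumption that something similar is meant:

```python
# Minimal dynamic time warping (DTW): compare two 1-D signals whose time
# axes are distorted relative to each other, via dynamic programming.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: match, insertion, deletion
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 100)
reference = np.sin(t)
delayed = np.sin(t - 0.5)                 # same signal, shifted time axis
print(dtw_distance(reference, delayed))   # small despite the delay
print(np.abs(reference - delayed).sum())  # plain pointwise distance is larger
```

The point of the comparison in the last two lines is that a warping-aware distance stays small under delays that would break a naive residual-based fault detector.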
Abstract:
The origin of the eddy variability around the 25°S band in the Indian Ocean is investigated. We have found that the surface circulation east of Madagascar shows an anticyclonic subgyre bounded to the south by eastward flow from southwest Madagascar, and to the north by the westward-flowing South Equatorial Current (SEC) between 15° and 20°S. The shallow, eastward-flowing South Indian Ocean Countercurrent (SICC) extends above the deep-reaching, westward-flowing SEC out to 95°E around the latitude of the high-variability band. Applying a two-layer model reveals that regions of large vertical shear along the SICC-SEC system are baroclinically unstable. Estimates of the frequencies (3.5–6 cycles/year) and wavelengths (290–470 km) of the unstable modes are close to observations of the mesoscale variability derived from altimetry data. It is likely, then, that Rossby wave variability generated locally in the subtropical South Indian Ocean by baroclinic instability is the origin of the eddy variability around 25°S seen, for example, in satellite altimetry.
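A quick arithmetic check using the abstract's own numbers (the pairing of wavelengths with frequencies below is my assumption; both ends of the quoted ranges are shown):

```python
# Phase speeds implied by the quoted unstable modes:
# wavelengths 290-470 km at frequencies 3.5-6 cycles/year, c = wavelength * frequency.
year = 365.25 * 24 * 3600                       # seconds per year
for wavelength_km, freq_per_year in [(290, 6.0), (470, 3.5)]:
    c = wavelength_km * 1e3 * freq_per_year / year
    print(f"{wavelength_km} km at {freq_per_year}/yr -> {c:.3f} m/s")
# Both combinations give roughly 0.05 m/s, a typical westward phase speed
# for baroclinic Rossby waves near 25 degrees S.
```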
Abstract:
This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated, along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests that the results depend substantially on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to obtain more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burden of conducting DEA (with bootstrapping).
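As an illustration of the efficiency computation that the bootstrap wraps around, here is a minimal input-oriented, constant-returns DEA solver (a textbook CCR formulation on synthetic data; not the SLICE MODEL/GAMS implementation or the Simar-Wilson procedure itself):

```python
# Input-oriented CCR DEA: for farm o, minimize theta subject to the reference
# technology dominating (theta * inputs_o, outputs_o).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """X: inputs (m x n), Y: outputs (s x n), o: index of the evaluated farm."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.concatenate(([1.0], np.zeros(n)))   # variables [theta, lambda_1..n]
    A_in = np.hstack([-X[:, [o]], X])          # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # Y @ lam >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun                             # efficiency score in (0, 1]

rng = np.random.default_rng(2)
X = rng.uniform(1, 10, (2, 20))   # 2 inputs, 20 hypothetical farms
Y = rng.uniform(1, 10, (1, 20))   # 1 output
print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(3)])
# Variable returns to scale adds the convexity constraint sum(lambda) = 1;
# the smoothed bootstrap resamples efficiencies and re-solves these programs.
```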
Abstract:
Ab initio calculations using density functional theory have shown that the reactions that occur between artemisinin, 1, a cyclic trioxane active against malaria, and some metal ions and complexes lead to a series of radicals that are probably responsible for its therapeutic activity. In particular, it has been shown that the interaction of Fe(II) with artemisinin causes the O-O bond to be broken, as indeed do Fe(III) and Cu(I), while Zn(II) does not. Calculations were carried out with Fe(II) in several different forms, including the bare ion, [Fe(H2O)5]2+, and [FeP(Im)] (P, porphyrin; Im, imidazole), and similar results were obtained. The resulting oxygen-based radicals are readily converted to more stable carbon-based radicals and/or stable products. Similar radicals and products are also formed from two simple model trioxanes, 2 and 3, which show little or no therapeutic action against malaria, although some subtle differences were observed. This suggests that the scaffold surrounding the pharmacophore may be involved in molecular recognition events allowing efficient uptake of this trioxane warhead into the parasite.
Abstract:
X-ray reflectivity (XR) and grazing incidence X-ray diffraction (GIXD) have been used to examine an oxyethylene-b-oxybutylene (E23B8) copolymer film at the air-water interface. The XR data were fitted using both a one-layer and a two-layer model that yielded the film thickness, roughness, and electron density. The best fit to the experimental data was obtained using a two-layer model (representing the oxyethylene and oxybutylene blocks, respectively), which showed a rapid thickening of the copolymer film at pressures above 7 mN/m. The large roughness values found indicate a significant degree of intermixing between the blocks and support the GIXD data, which showed no long-range lateral ordering within the layer. The electron density model results indicate a large film densification at 7 mN/m, possibly suggesting conformational changes within the film, even though no such change appears on the pressure-area isotherm at the same surface pressure.
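To make the "two-layer model" concrete, here is a minimal sketch of how a two-slab electron-density profile produces a reflectivity curve in the kinematic (Born) approximation; all parameter values are placeholders rather than the fitted ones, and real XR fitting typically uses the exact Parratt recursion:

```python
# Kinematic X-ray reflectivity of a two-layer film on water: interfaces at
# depths z_i with electron-density steps drho_i and Gaussian roughness sigma_i.
import numpy as np

q = np.linspace(0.02, 0.5, 200)      # wavevector transfer (1/Angstrom)

# air | block 1 | block 2 | water   (electron densities in e/A^3; placeholders)
rho = np.array([0.0, 0.30, 0.38, 0.334])
z = np.array([0.0, 12.0, 30.0])      # interface depths (Angstrom)
sigma = np.array([3.5, 4.0, 3.0])    # interface roughnesses (Angstrom)

drho = np.diff(rho)
# Structure factor: sum of roughness-damped density steps.
phi = np.sum(drho[None, :] * np.exp(1j * q[:, None] * z[None, :])
             * np.exp(-(q[:, None] * sigma[None, :]) ** 2 / 2), axis=1)
R_over_RF = np.abs(phi / rho[-1]) ** 2   # reflectivity / Fresnel reflectivity
print(R_over_RF[:5])
```

The fitted thicknesses set the fringe spacing of such a curve, the roughnesses damp it at high q, and the densities set its amplitude, which is exactly the information the abstract reports extracting.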
Abstract:
Ellipsometry and atomic force microscopy (AFM) were used to study the film thickness and surface roughness of both 'soft' and solid thin films. 'Soft' polymer thin films of polystyrene and poly(styrene-ethylene/butylene-styrene) block copolymer were prepared by spin-coating onto planar silicon wafers. The ellipsometric parameters were fitted by the Cauchy approach using a two-layer model with planar boundaries between the layers. The smooth surfaces of the prepared polymer films were confirmed by AFM. There is good agreement between AFM and ellipsometry in the 80-130 nm thickness range. Semiconductor surfaces (Si) obtained by anisotropic chemical etching were investigated as an example of a randomly rough surface. To determine roughness parameters by ellipsometry, the top rough layers were treated as thin films according to the Bruggeman effective medium approximation (BEMA). Surface roughness values measured by AFM and ellipsometry show the same tendency of increasing roughness with increasing etching time, although the AFM results depend on the scan window size used. The combined use of both methods appears to offer the most comprehensive route to quantitative surface roughness characterisation of solid films.
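For reference, the two standard relations behind these fits (textbook forms; the coefficients and volume fractions are specific to each fit): the Cauchy dispersion model for the film's refractive index, and the Bruggeman effective-medium condition used to treat the rough top layer as a mixture of material and voids,

$$ n(\lambda) = A + \frac{B}{\lambda^{2}} + \frac{C}{\lambda^{4}}, \qquad \sum_i f_i\,\frac{\varepsilon_i - \varepsilon_{\mathrm{eff}}}{\varepsilon_i + 2\,\varepsilon_{\mathrm{eff}}} = 0, $$

where $f_i$ are the volume fractions and $\varepsilon_i$ the dielectric functions of the constituents. Solving the second equation for $\varepsilon_{\mathrm{eff}}$ gives the optical response of the rough layer, whose fitted thickness serves as the ellipsometric roughness measure compared against AFM.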