919 results for Two-step model


Relevance: 80.00%

Abstract:

This paper presents a theoretical and empirical analysis of strategic competition in retail banking when some of the financial firms are non-profit organisations that invest in social activities. The banking literature on competition is fairly large, but the strategic interaction between profit-maximizing and non-profit firms has not been extensively analysed, apart from Purroy and Salas (1999). In this paper a completely different approach is taken: an adaptation of Hotelling's two-stage model of spatial competition is developed to take into account consumer perceptions of the two different types of financial institution. The empirical analysis confirms that, when making a deposit or mortgage decision, consumers take into account features other than price, such as a bank's social contribution or the proximity of its service. These conclusions are of interest in the debate about a firm's social or ethical activities. It is shown that if consumers value social activities, firms can improve …
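
For readers unfamiliar with the underlying framework, the following is a minimal sketch of the textbook two-stage Hotelling linear-city game in standard notation; it is not the paper's exact specification, which augments this setup with consumer perceptions of the banks' social activities.

```latex
% Textbook two-stage Hotelling linear city (standard notation): banks A and B at
% the endpoints of [0,1], transport cost t, marginal cost c.
\[
  u(x) =
  \begin{cases}
    v - p_A - t\,x       & \text{buy from bank } A,\\
    v - p_B - t\,(1 - x) & \text{buy from bank } B,
  \end{cases}
  \qquad
  x^{*} = \tfrac{1}{2} + \frac{p_B - p_A}{2t},
\]
\[
  \text{stage-2 price equilibrium: } p_A^{*} = p_B^{*} = c + t .
\]
```

In the paper's adaptation, a consumer's perception of a bank's social activities would presumably enter this comparison alongside price and distance.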

Relevance: 80.00%

Abstract:

In this paper we describe a system for underwater navigation with AUVs in partially structured environments, such as dams, ports or marine platforms. An imaging sonar is used to obtain information about the location of planar structures present in such environments. This information is incorporated into a feature-based SLAM algorithm in a two-step process: (1) the full 360° sonar scan is undistorted (to compensate for vehicle motion), thresholded and segmented to determine which measurements correspond to planar environment features and which should be ignored; and (2) SLAM proceeds once the data association is obtained: both the vehicle motion and the measurements whose correct association has previously been determined are incorporated into the SLAM algorithm. This two-step delayed SLAM process makes it possible to robustly determine the feature and vehicle locations in the presence of large numbers of spurious or unrelated measurements that might correspond to boats, rocks, etc. Preliminary experiments show the viability of the proposed approach.
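
As a rough illustration of the two-step idea, the sketch below (with hypothetical helper names; this is not the authors' implementation) thresholds a synthetic sonar scan, fits a line to the surviving returns, and gates the resulting measurement against the map before it would be handed to the SLAM update.

```python
# Illustrative sketch of the two-step idea with hypothetical helper names (not
# the authors' implementation): (1) threshold and segment a sonar scan into a
# candidate planar feature, (2) gate the measurement against the map so that
# only correctly associated returns reach the SLAM update.
import numpy as np

def threshold_scan(bearings, ranges, intensities, min_intensity=0.5):
    """Step 1a: keep high-intensity returns and convert them to 2-D points."""
    keep = intensities >= min_intensity
    x = ranges[keep] * np.cos(bearings[keep])
    y = ranges[keep] * np.sin(bearings[keep])
    return np.column_stack([x, y])

def fit_line(points):
    """Step 1b: total-least-squares line fit -> (normal angle alpha, distance rho)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # direction of smallest variance = line normal
    rho = float(centroid @ normal)       # signed distance of the line from the origin
    if rho < 0:                          # normalise so rho is non-negative
        normal, rho = -normal, -rho
    alpha = float(np.arctan2(normal[1], normal[0]))
    return alpha, rho

def associate(line, mapped_lines, alpha_gate=0.2, rho_gate=0.5):
    """Step 2: accept the measurement only if it matches a mapped planar feature."""
    alpha, rho = line
    for j, (a, r) in enumerate(mapped_lines):
        if abs(alpha - a) < alpha_gate and abs(rho - r) < rho_gate:
            return j                     # index of the matching map feature
    return None                          # spurious return (boat, rock, ...) -> ignore

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bearings = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    # Synthetic wall at x = 5 m seen over a narrow bearing sector; the rest is clutter.
    ranges = np.full_like(bearings, 30.0)
    sector = bearings < 0.4
    ranges[sector] = 5.0 / np.cos(bearings[sector])
    intensities = np.where(sector, 0.9, 0.1) + 0.05 * rng.standard_normal(bearings.size)

    points = threshold_scan(bearings, ranges, intensities)
    line = fit_line(points)
    print("estimated wall (alpha, rho):", line, "-> map feature", associate(line, [(0.0, 5.0)]))
```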

Relevance: 80.00%

Abstract:

The economics literature has not studied how competition between educational institutions specifically affects the choice of educational standards and tuition fees. Using a theoretical model, I analyse how competition between educational institutions affects the choice of academic standards, comparing the competitive solution with the efficient solution and the monopoly solution. Individuals are heterogeneous and differ in their ability; educational institutions compete by setting the educational standard in a first stage and the tuition fee in a second stage. Once standards and tuition fees are set, they are public information, allowing individuals to choose whether or not to enrol in an educational institution, and which institution to enrol in, according to their innate ability and the cost associated with effort. The results show that social welfare increases when the economy has more than one educational institution with different standards, and that the market solution, whether monopoly or competition, forces students to exert greater effort to obtain the degree. Regardless of the cost relationship, the tuition fee is always higher for the institution with the higher educational standard, and higher in the market solution. When the unit cost of the institution with the higher standard is greater than or equal to the cost of the institution with the lower standard, the educational standards chosen by the planner are higher and the effort required of individuals is lower than in the market solution.
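
Purely as an illustration of the timing described above, the two-stage game can be written schematically as follows (the notation is hypothetical, not the paper's exact specification).

```latex
% Hypothetical notation for the timing described above (illustration only):
%   Stage 1: institution i chooses its academic standard s_i.
%   Stage 2: institution i chooses its tuition fee p_i.
%   Then a student of ability a picks the option with the highest payoff:
\[
  U_i(a) = v(s_i) - p_i - c\bigl(e_i(a, s_i)\bigr),
  \qquad
  U_0(a) = 0 \quad \text{(no enrolment)},
\]
% where e_i(a, s_i) is the effort needed to meet standard s_i, decreasing in the
% student's ability a and increasing in s_i.
```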

Relevance: 80.00%

Abstract:

Introduction: Hypertensive disorders of pregnancy are the leading cause of maternal morbidity and mortality worldwide. During the postpartum period they are usually treated with either nifedipine or enalapril, used interchangeably, but there are no studies comparing them. Methods: A cross-sectional study with analytical aims was conducted, including the clinical records of patients with a hypertensive disorder during the postpartum period who received one of these two drugs; blood pressure control, the need for additional antihypertensives, adverse effects and complications were assessed in both groups. Results: A representative, homogeneous sample of 139 patients was studied (p = 0.43). All patients achieved blood pressure control with the drug received. 45% (n = 62) received enalapril 20 mg every 12 hours, 40% (n = 56) received nifedipine 30 mg every 8 hours, and 15% (n = 21) received nifedipine 30 mg every 12 hours. No adverse effects, complications or deaths occurred in either group. Patients on enalapril required additional antihypertensives more often than patients on nifedipine, a statistically significant difference (p = 0.001). Discussion: The choice of antihypertensive during the postpartum period should be guided by the type of hypertensive disorder: those presenting for the first time during pregnancy were given nifedipine, with excellent results, while patients with a history of pre-existing hypertension were given enalapril, with good results. Both drugs controlled blood pressure adequately, with no complications or mortality.

Relevance: 80.00%

Abstract:

The common assumptions that the labor income share does not change over time or across countries and that factor income shares are equal to the elasticity of output with respect to factors have had important implications for economic theory. However, there are various theoretical reasons why the elasticity of output with respect to reproducible factors should be correlated with the stage of development. In particular, the behavior of international trade and capital flows and the existence of factor-saving innovations imply such a correlation. If this correlation exists and if factor income shares are equal to the elasticity of output with respect to factors, then the labor income share must be negatively correlated with the stage of development. We propose an explanation for why the labor income share shows no correlation with income per capita: the existence of a labor-intensive sector which produces non-tradable goods.
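
The assumption that factor income shares equal output elasticities is the standard competitive-markets result; the short derivation below, with the Cobb-Douglas special case, shows why it ties the labor share to a constant.

```latex
% Competitive factor markets: the wage equals the marginal product of labor, so
% the labor income share equals the output elasticity of labor.
\[
  w = \frac{\partial Y}{\partial L}
  \;\Rightarrow\;
  \frac{wL}{Y} = \frac{\partial Y}{\partial L}\,\frac{L}{Y} = \varepsilon_{Y,L},
  \qquad
  Y = A K^{\alpha} L^{1-\alpha} \;\Rightarrow\; \varepsilon_{Y,L} = 1-\alpha .
\]
% Under Cobb--Douglas the labor share is therefore the constant 1 - alpha,
% independent of the stage of development.
```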

Relevance: 80.00%

Abstract:

Even though antenatal care is universally regarded as important, the determinants of demand for antenatal care have not been widely studied. Evidence on which socioeconomic conditions influence whether a pregnant woman attends at least one antenatal consultation, and on how these factors affect the number of missed antenatal consultations, is very limited. To generate this evidence, a two-stage analysis was performed with data from the Demographic and Health Survey carried out by Profamilia in Colombia in 2005. The first stage was a logit model reporting the marginal effects on the probability of attending the first visit, and an ordinary least squares model was estimated for the second stage. Mothers living in the Pacific region, as well as young mothers, appear to have a lower probability of attending the first visit, but these factors are not related to the number of missed antenatal consultations once the first visit has taken place. The effect of health insurance was surprising because of the differing effects across health insurers. Some family and personal characteristics, such as whether the last child was wanted and the number of previous children, proved important in determining demand. The mother's educational attainment proved important, whereas the father's did not. This paper provides some elements for policy making aimed at increasing demand for antenatal care, as well as stimulating research on demand for specific health issues.
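
A minimal sketch of such a two-stage estimation on synthetic data (not the Profamilia DHS data; variable names and coefficients are invented for illustration) could look as follows: a logit with marginal effects for attendance, then OLS for missed consultations among attenders.

```python
# Illustrative two-stage setup on synthetic data (not the Profamilia DHS data):
# stage 1, logit for attending the first antenatal visit (marginal effects);
# stage 2, OLS for the number of missed consultations among attenders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
educ = rng.integers(0, 12, n)               # mother's years of schooling
age = rng.integers(15, 45, n)
pacific = rng.integers(0, 2, n)             # 1 = lives in the Pacific region

# Stage 1: probability of attending the first visit.
z = -0.5 + 0.15 * educ - 0.4 * pacific - 0.5 * (age < 20)
attend = (rng.random(n) < 1 / (1 + np.exp(-z))).astype(int)
X1 = sm.add_constant(np.column_stack([educ, age, pacific]))
logit_fit = sm.Logit(attend, X1).fit(disp=False)
print(logit_fit.get_margeff().summary())    # marginal effects on P(attend)

# Stage 2: missed consultations, only for women who attended the first visit.
absences = np.clip(np.round(3 - 0.1 * educ + rng.normal(0, 1, n)), 0, None)
mask = attend == 1
X2 = sm.add_constant(np.column_stack([educ[mask], age[mask], pacific[mask]]))
ols_fit = sm.OLS(absences[mask], X2).fit()
print(ols_fit.summary())
```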

Relevance: 80.00%

Abstract:

Asset correlations are of critical importance in quantifying portfolio credit risk and economic capital in financial institutions. Estimation of asset correlation with rating transition data has focused on point estimation of the correlation, without giving any consideration to the uncertainty around these point estimates. In this article we use Bayesian methods to estimate a dynamic factor model for default risk using rating data (McNeil et al., 2005; McNeil and Wendin, 2007). Bayesian methods allow us to formally incorporate human judgement in the estimation of asset correlation through the prior distribution, and to fully characterize a confidence set for the correlations. Results indicate that: (i) a two-factor model, rather than the one-factor model proposed by the Basel II framework, better represents the historical default data; (ii) the importance of unobserved factors in this type of model is reinforced, and the levels of the implied asset correlations depend critically on the latent state variable used to capture the dynamics of default, as well as on other assumptions of the statistical model; and (iii) the posterior distributions of the asset correlations show that the bounds recommended by Basel for this parameter understate the level of systemic risk.
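
To illustrate how asset correlation enters this class of models, the sketch below evaluates the conditional default probability of a Gaussian factor model; the one-factor case underlies the Basel II formula, and the loadings and factor values are hypothetical. This is not the paper's Bayesian dynamic estimation.

```python
# Conditional default probability in a Gaussian factor model; the one-factor
# case underlies the Basel II formula, and a second factor is added purely for
# illustration. PDs, loadings and factor values below are hypothetical.
import numpy as np
from scipy.stats import norm

def conditional_pd(pd, loadings, factors):
    """P(default | systematic factors Z) for an obligor with factor loadings b."""
    b = np.asarray(loadings, dtype=float)
    z = np.asarray(factors, dtype=float)
    idio = np.sqrt(1.0 - np.sum(b ** 2))        # weight of the idiosyncratic shock
    return norm.cdf((norm.ppf(pd) - b @ z) / idio)

# One-factor (Basel-style) example: PD = 1%, asset correlation rho = 0.12,
# evaluated in a bad year (factor two standard deviations below its mean).
rho = 0.12
print(conditional_pd(0.01, [np.sqrt(rho)], [-2.0]))

# Two-factor example with loadings 0.3 and 0.2 on two systematic factors.
print(conditional_pd(0.01, [0.3, 0.2], [-2.0, -1.0]))
```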

Relevance: 80.00%

Abstract:

Temporal misalignment is the mismatch between two signals caused by a distortion of the time axis. Fault Detection and Diagnosis (FDD) enables faults in a process to be detected, diagnosed and corrected. FDD methodologies are divided into two categories: model-based and non-model-based techniques. This doctoral thesis studies the effect of temporal misalignment on FDD. Our attention focuses on the analysis and design of FDD systems in the presence of data communication problems such as delays and losses. Two techniques are proposed to reduce these problems: one based on dynamic programming and the other on optimization. The proposed methods have been validated on different dynamic systems: position control of a DC motor, a laboratory plant, and an electrical power systems problem known as voltage sag.
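
As an illustration of the dynamic-programming idea mentioned above (not necessarily the exact algorithm developed in the thesis), the classic dynamic-time-warping recursion aligns two signals whose time axes are distorted.

```python
# Minimal dynamic-time-warping sketch: dynamic-programming alignment of two
# signals whose time axes are distorted (an illustration of the DP idea, not
# necessarily the algorithm developed in the thesis).
import numpy as np

def dtw(a, b):
    """Return the DTW distance and the accumulated-cost matrix for 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m], cost[1:, 1:]

t = np.linspace(0, 1, 100)
reference = np.sin(2 * np.pi * t)
delayed = np.sin(2 * np.pi * (t - 0.1))     # same signal, shifted in time
distance, _ = dtw(reference, delayed)
print("DTW distance between reference and delayed signal:", round(distance, 3))
```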

Relevance: 80.00%

Abstract:

The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's abilities to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 were tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long time period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and looks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of the hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition have altered the nitrogen balance with a shift from possible reduction in soil fertility but little environmental impact in 1922, to a situation of nitrogen accumulation in the soil, groundwater and instream biota in 2002. In 1922 and 2002 it was estimated that approximately 2 and 18 kg N ha⁻¹ yr⁻¹ respectively were exported from the land to the stream. The utility of the approach and further considerations for the best use of models are discussed.
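
To put the per-area export figures in whole-catchment terms, they can be multiplied by the catchment area; the calculation below uses a hypothetical round area of 200 km² (20,000 ha) purely for illustration, since the catchment area is not given in the abstract.

```latex
% Hypothetical round catchment area of 200 km^2 (20 000 ha), for illustration only:
\[
  1922:\ 2\ \mathrm{kg\,N\,ha^{-1}\,yr^{-1}} \times 20\,000\ \mathrm{ha}
       = 40\ \mathrm{t\,N\,yr^{-1}},
  \qquad
  2002:\ 18\ \mathrm{kg\,N\,ha^{-1}\,yr^{-1}} \times 20\,000\ \mathrm{ha}
       = 360\ \mathrm{t\,N\,yr^{-1}}.
\]
```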

Relevance: 80.00%

Abstract:

This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first is a questionnaire study which examines the respondents' familiarity with concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module works in two steps. In the first step, a few trigger questions allow the activation of a stereotype that includes a "body" and an "inference component". The body is the representation of the body of knowledge that a class of users is expected to know, along with the probability that this knowledge is known. In the inference component, the process of learning concepts is represented as a belief network. Hence, in the second step, the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions about those concepts for which it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of its operation is presented, and advantages and limitations of the approach are discussed.
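
A toy sketch of the two-step idea is given below: the stereotype supplies prior probabilities that each concept is known, and the answer to one question about an uncertain concept is propagated to a related concept with Bayes' rule. Concept names and probabilities are hypothetical, and EPIAIM itself propagates evidence through a full belief network rather than a single link.

```python
# Toy sketch of the two-step idea: a stereotype gives prior probabilities that
# concepts are known; an answer about one uncertain concept is propagated to a
# related concept with Bayes' rule. Concept names and numbers are hypothetical;
# EPIAIM uses a full belief network rather than this single-link update.

# Step 1: stereotype "body" activated by trigger questions (prior beliefs).
stereotype = {"relative_risk": 0.9, "odds_ratio": 0.5, "logistic_regression": 0.4}

# Single dependency used for propagation: P(knows logistic_regression | odds_ratio).
p_lr_given_or = 0.9       # if odds ratios are known, logistic regression likely is
p_lr_given_not_or = 0.2   # otherwise it is unlikely

def update_after_answer(knows_logistic_regression: bool) -> float:
    """Step 2: revise P(knows odds_ratio) after asking about logistic regression."""
    prior = stereotype["odds_ratio"]
    if knows_logistic_regression:
        like_or, like_not_or = p_lr_given_or, p_lr_given_not_or
    else:
        like_or, like_not_or = 1 - p_lr_given_or, 1 - p_lr_given_not_or
    posterior = like_or * prior / (like_or * prior + like_not_or * (1 - prior))
    return posterior

print(update_after_answer(True))    # ~0.82: user probably also knows odds ratios
print(update_after_answer(False))   # ~0.11: probably does not
```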

Relevance: 80.00%

Abstract:

HFC-134a (CF3CH2F) is the most rapidly growing hydrofluorocarbon in terms of atmospheric abundance. It is currently used in a large number of household refrigerators and air-conditioning systems, and its concentration in the atmosphere is forecast to increase substantially over the next 50–100 years. Previous estimates of its radiative forcing per unit concentration have differed significantly, by about 25%. This paper uses a two-step approach to resolve this discrepancy. In the first step, six independent absorption cross-section datasets are analysed. We find that, for the integrated cross-section in the spectral bands that contribute most to the radiative forcing, the differences between the various datasets are typically smaller than 5%, and that the dependence on pressure and temperature is not significant. A "recommended" HFC-134a infrared absorption spectrum was obtained based on the average band intensities of the strongest bands. In the second step, the "recommended" HFC-134a spectrum was used in six different radiative transfer models to calculate the HFC-134a radiative forcing efficiency. The clear-sky instantaneous radiative forcing, using a single global and annual mean profile, differed by 8% between the six models, and the latitudinally resolved, adjusted cloudy-sky radiative forcing estimates differed by a similar amount.
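
The two steps can be mimicked numerically as follows; all numbers are hypothetical, and only the structure of the calculation (average band intensities across datasets, then quote the spread of forcing efficiencies across models) is taken from the abstract.

```python
# Small numeric sketch of the two steps with hypothetical numbers: (1) average
# integrated band intensities across datasets to form a "recommended" spectrum,
# (2) quote the percent spread of the forcing efficiencies across models.
import numpy as np

# Hypothetical integrated band intensities from six laboratory datasets
# (arbitrary units; the real inputs are the six cross-section datasets analysed).
band_intensities = np.array([
    [1.02, 0.98, 1.00, 1.01, 0.99, 0.97],   # band 1, six datasets
    [2.01, 1.95, 2.05, 2.00, 1.98, 2.02],   # band 2, six datasets
])
recommended = band_intensities.mean(axis=1)
spread = np.ptp(band_intensities, axis=1) / recommended * 100
print("recommended band intensities:", recommended)
print("spread across datasets per band (%):", np.round(spread, 1))

# Hypothetical clear-sky forcing efficiencies from six radiative transfer models.
forcing = np.array([0.154, 0.158, 0.160, 0.161, 0.163, 0.167])  # W m-2 ppb-1
print("spread across models (%):", round(float(np.ptp(forcing) / forcing.mean() * 100), 1))
```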

Relevance: 80.00%

Abstract:

The origin of the eddy variability around the 25°S band in the Indian Ocean is investigated. We find that the surface circulation east of Madagascar shows an anticyclonic subgyre bounded to the south by eastward flow from southwest Madagascar and to the north by the westward-flowing South Equatorial Current (SEC) between 15° and 20°S. The shallow, eastward-flowing South Indian Ocean Countercurrent (SICC) extends above the deep-reaching, westward-flowing SEC to 95°E, around the latitude of the high-variability band. Applying a two-layer model reveals that regions of large vertical shear along the SICC–SEC system are baroclinically unstable. Estimates of the frequencies (3.5–6 cycles per year) and wavelengths (290–470 km) of the unstable modes are close to observations of the mesoscale variability derived from altimetry data. It is therefore likely that Rossby wave variability generated locally in the subtropical South Indian Ocean by baroclinic instability is the origin of the eddy variability around 25°S as seen, for example, in satellite altimetry.
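
As a back-of-envelope consistency check on the quoted numbers, the phase speed implied by the unstable modes is c = λf; pairing the quoted extremes brackets it at a few centimetres per second, typical of westward-propagating Rossby-wave features at these latitudes.

```latex
% Back-of-envelope check: phase speed c = \lambda f, pairing the quoted extremes.
\[
  c_{\min} \approx 290\ \mathrm{km} \times 3.5\ \mathrm{yr^{-1}}
           \approx 1.0 \times 10^{3}\ \mathrm{km\,yr^{-1}} \approx 3\ \mathrm{cm\,s^{-1}},
\]
\[
  c_{\max} \approx 470\ \mathrm{km} \times 6\ \mathrm{yr^{-1}}
           \approx 2.8 \times 10^{3}\ \mathrm{km\,yr^{-1}} \approx 9\ \mathrm{cm\,s^{-1}}.
\]
```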

Relevance: 80.00%

Abstract:

Surfactin is a bacterial lipopeptide produced by Bacillus subtilis and is a powerful surfactant, which also has antiviral, antibacterial and antitumor properties. The recovery and purification of surfactin from complex fermentation broths is a major obstacle to its commercialization; therefore, a two-step membrane filtration process was developed using a lab-scale tangential flow filtration (TFF) unit with 10 kDa MWCO regenerated cellulose (RC) and polyethersulfone (PES) membranes at three different transmembrane pressures (TMP) of 1.5 bar, 2.0 bar and 2.5 bar. Two modes of filtration were studied, with and without cleaning of the membranes prior to UF-2. In the first ultrafiltration step (UF-1), surfactin above its critical micelle concentration (CMC) was effectively retained by the membranes; subsequently, in UF-2, the retentate micelles were disrupted by addition of a 50% (v/v) methanol solution to allow recovery of surfactin in the permeate. The main protein contaminants were effectively retained by the membrane in UF-2. The permeate flux and the rejection coefficient (R) of surfactin and protein were measured during the filtrations. Overall, the three different TMPs applied had no significant effect on the filtrations, and PES was the more suitable membrane for selectively separating surfactin from fermentation broth, achieving a high recovery and level of purity. In addition, this two-step UF process is scalable to larger sample volumes without affecting the original functionality of surfactin, although membrane permeability can be affected by exposure to the methanolic solution used in UF-2.
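
The rejection coefficient R quoted above is conventionally the observed rejection of a membrane, defined from feed and permeate concentrations.

```latex
% Observed rejection coefficient as conventionally defined for UF membranes:
\[
  R \;=\; 1 - \frac{C_{\text{permeate}}}{C_{\text{feed}}},
\]
% so R tends to 1 when surfactin micelles are fully retained (UF-1) and towards 0
% when the methanol-disrupted monomers pass into the permeate (UF-2).
```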

Relevance: 80.00%

Abstract:

There is a need for better links between hydrology and ecology, specifically between landscapes and riverscapes, to understand how processes and factors controlling the transport and storage of environmental pollution have affected or will affect the freshwater biota. Here we show how the INCA modelling framework, specifically INCA-Sed (the Integrated Catchments model for Sediments), can be used to link sediment delivery from the landscape to sediment changes in-stream. INCA-Sed is a dynamic, process-based, daily time step model. The first complete description of the equations used in the INCA-Sed software (version 1.9.11) is presented. This is followed by an application of INCA-Sed made to the River Lugg (1077 km²) in Wales. Excess suspended sediment can negatively affect salmonid health. The Lugg has a large and potentially threatened population of both Atlantic salmon (Salmo salar) and Brown Trout (Salmo trutta). With the exception of the extreme sediment transport processes, the model satisfactorily simulated both the hydrology and the sediment dynamics in the catchment. Model results indicate that diffuse soil loss is the most important sediment generation process in the catchment. In the River Lugg, the mean annual Guideline Standard for suspended sediment concentration, proposed by UKTAG, of 25 mg l⁻¹ is only slightly exceeded during the simulation period (1995–2000), indicating only minimal effect on the Atlantic salmon population. However, the daily time step simulation of INCA-Sed also allows the investigation of the critical spawning period. It shows that the sediment may have a significant negative effect on the fish population in years with high sediment runoff. It is proposed that the fine settled particles probably do not affect the salmonid egg incubation process, though suspended particles may damage the gills of fish and make the area unfavourable for spawning if the conditions do not improve.