926 results for Business Value Two-Layer Model
Abstract:
This project developed a reformulation of the business model of Lucentezza Accesorios, beginning with an exploration and observation of the processes carried out in the company. Once this investigation was complete, the weak points we needed to work on were identified. We then carried out a series of activities, including virtuous circles, a Moodboard, value curves, a Blueprint, and the Canvas model, among others. Using these tools together with the company's experience in the market, modifications were made step by step. We identified the path the company should take from this point onward to be sustainable and scalable over time. This research also helped us to know our customers better and to build a clearer profile of them, allowing us to focus our promotion on our target customer.
Abstract:
Introduction: Hypertensive disorders of pregnancy are the leading cause of maternal morbidity and mortality worldwide. They are usually treated postpartum with either nifedipine or enalapril interchangeably, but no studies have compared the two. Methods: We conducted a cross-sectional study with analytical aims that included the medical records of patients with a postpartum hypertensive disorder who received one of these two drugs; blood pressure control, the need for additional antihypertensives, adverse effects, and complications were assessed in both groups. Results: A representative, homogeneous sample of 139 patients was studied (p = 0.43). All patients achieved blood pressure control with the drug received. 45% (n = 62) received enalapril 20 mg every 12 hours, 40% (n = 56) received nifedipine 30 mg every 8 hours, and 15% (n = 21) received nifedipine 30 mg every 12 hours. No adverse effects, complications, or deaths occurred in either group. Patients on enalapril required additional antihypertensives more often than patients on nifedipine, a statistically significant difference (p = 0.001). Discussion: The choice of antihypertensive during the postpartum period should be guided by the type of hypertensive disorder: patients whose disorder first presents during pregnancy are given nifedipine with excellent results, while those with pre-existing hypertension are given enalapril with good results. Both drugs controlled blood pressure adequately, with no complications or mortality.
Abstract:
This investigation explores the link between a strategic conception of philanthropy and innovation. The research question addresses a largely unexplored field in the CSR and innovation-management literature. It begins with an interest in understanding what benefits accrue to a firm that invests strategically in philanthropy. The analysis contributes to filling this gap by pursuing several objectives from an exploratory perspective. Throughout the research, the concept and the current and past contributions to the different branches of innovation (product innovation, managerial innovation, technological innovation) are analyzed, in order to highlight the relationship between an accurate strategic approach to philanthropy and its impact on organizational value. Analyzing philanthropic innovation may provide insights into business opportunities and notions related to social investment and profit, including the strategic decisions a firm can take to maximize those investments as if they were part of its core business. It also demonstrates the existing link between CSR and innovation, and the possibilities open to enterprises in this area.
Abstract:
The common assumptions that the labor income share does not change over time or across countries and that factor income shares are equal to the elasticity of output with respect to factors have had important implications for economic theory. However, there are various theoretical reasons why the elasticity of output with respect to reproducible factors should be correlated with the stage of development. In particular, the behavior of international trade and capital flows and the existence of factor-saving innovations imply such a correlation. If this correlation exists, and if factor income shares are equal to the elasticity of output with respect to factors, then the labor income share must be negatively correlated with the stage of development. We propose an explanation for why the labor income share has no correlation with income per capita: the existence of a labor-intensive sector which produces non-tradable goods.
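For context, the equality the abstract leans on is the textbook competitive-markets result that each factor's income share equals the output elasticity with respect to that factor. A minimal sketch for the Cobb-Douglas case (our notation, not the paper's):

```latex
Y = A K^{\alpha} L^{1-\alpha}, \qquad
w = \frac{\partial Y}{\partial L} = (1-\alpha)\,\frac{Y}{L}
\;\Longrightarrow\; \frac{wL}{Y} = 1-\alpha .
```

If the output elasticity of the reproducible factor, alpha, rises with development, the labor share 1 - alpha must fall; this is the tension the paper resolves with a labor-intensive non-tradable sector.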
Abstract:
Asset correlations are of critical importance in quantifying portfolio credit risk and economic capital in financial institutions. Estimation of asset correlation with rating transition data has focused on the point estimation of the correlation without giving any consideration to the uncertainty around these point estimates. In this article we use Bayesian methods to estimate a dynamic factor model for default risk using rating data (McNeil et al., 2005; McNeil and Wendin, 2007). Bayesian methods allow us to formally incorporate human judgement in the estimation of asset correlation through the prior distribution, and to fully characterize a confidence set for the correlations. Results indicate that: i) a two-factor model, rather than the one-factor model proposed by the Basel II framework, better represents the historical default data; ii) the importance of unobserved factors in this type of model is reinforced, and the levels of the implied asset correlations depend critically on the latent state variable used to capture the dynamics of default, as well as on other assumptions of the statistical model; iii) the posterior distributions of the asset correlations show that the Basel recommended bounds for this parameter understate the level of systemic risk.
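For orientation, the one-factor benchmark that the paper's two-factor model is tested against, together with the Basel II supervisory correlation bounds mentioned at the end, can be sketched as follows. The Vasicek conditional default probability and the Basel corporate correlation formula are standard; the code is an illustrative sketch, not the authors' Bayesian estimation.

```python
import numpy as np
from scipy.stats import norm

def basel_corporate_rho(pd):
    # Basel II IRB supervisory asset correlation for corporate exposures:
    # interpolates between the bounds 0.12 and 0.24 as PD rises.
    w = (1 - np.exp(-50 * pd)) / (1 - np.exp(-50))
    return 0.12 * w + 0.24 * (1 - w)

def conditional_pd(pd, rho, z):
    # Vasicek one-factor model: default probability conditional on a
    # realisation z of the single systematic factor.
    return norm.cdf((norm.ppf(pd) - np.sqrt(rho) * z) / np.sqrt(1 - rho))

pds = np.array([0.005, 0.01, 0.05])
print(basel_corporate_rho(pds))          # stays inside the 0.12-0.24 band
print(conditional_pd(0.01, 0.12, -2.0))  # PD in an adverse systematic scenario
```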
Abstract:
Temporal misalignment is the mismatch between two signals caused by a distortion of the time axis. Fault Detection and Diagnosis (FDD) enables the detection, diagnosis and correction of faults in a process. FDD methodology is divided into two categories: model-based and non-model-based techniques. This doctoral thesis studies the effect of temporal misalignment on FDD. Our attention focuses on the analysis and design of FDD systems in the presence of data-communication problems such as delays and losses. Two techniques are proposed to mitigate these problems: one based on dynamic programming and the other on optimization. The proposed methods have been validated on several dynamic systems: position control of a DC motor, a laboratory plant, and an electrical-systems problem known as voltage sag.
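The abstract does not spell out the dynamic-programming technique; dynamic time warping (DTW) is the canonical dynamic-programming remedy for temporal misalignment, so the sketch below illustrates the general idea only and should not be read as the thesis's actual algorithm.

```python
import numpy as np

def dtw_distance(x, y):
    # Classic dynamic-programming alignment of two 1-D signals:
    # D[i, j] is the cheapest cost of aligning x[:i] with y[:j].
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # deletion
                                 D[i, j - 1],      # insertion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

# A time-shifted copy of a signal aligns far better under DTW than pointwise:
t = np.linspace(0, 1, 100)
x = np.sin(2 * np.pi * t)
y = np.sin(2 * np.pi * (t - 0.1))
print(dtw_distance(x, y))
```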
Abstract:
This paper describes laboratory observations of inertia–gravity waves emitted from balanced fluid flow. In a rotating two-layer annulus experiment, the wavelength of the inertia–gravity waves is very close to the deformation radius. Their amplitude varies linearly with Rossby number in the range 0.05–0.14, at constant Burger number (or rotational Froude number). This linear scaling challenges the notion, suggested by several dynamical theories, that inertia–gravity waves generated by balanced motion will be exponentially small. It is estimated that the balanced flow leaks roughly 1% of its energy each rotation period into the inertia–gravity waves at the peak of their generation. The findings of this study imply an inevitable emission of inertia–gravity waves at Rossby numbers similar to those of the large-scale atmospheric and oceanic flow. Extrapolation of the results suggests that inertia–gravity waves might make a significant contribution to the energy budgets of the atmosphere and ocean. In particular, emission of inertia–gravity waves from mesoscale eddies may be an important source of energy for deep interior mixing in the ocean.
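For reference, the dimensionless parameters above have standard definitions (not restated in the paper's abstract); for a two-layer flow with velocity scale U, horizontal length scale L, Coriolis parameter f, reduced gravity g' and layer depth H:

```latex
\mathrm{Ro} = \frac{U}{fL}, \qquad
L_d = \frac{\sqrt{g' H}}{f}, \qquad
\mathrm{Bu} = \left(\frac{L_d}{L}\right)^{2} .
```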
Abstract:
Our ability to identify thin non-stoichiometric and amorphous layers beneath mineral surfaces has been tested by undertaking X-ray photoelectron spectroscopy (XPS) and transmission electron microscopy (TEM) work on alkali feldspars from pH 1 dissolution experiments. The outcomes of this work were used to help interpret XPS and TEM results from alkali feldspars weathered for <10,000 years in soils overlying the Shap Granite (north-west England). The chemistry of effluent solutions indicates that silica-rich layers a few nanometers in thickness formed during the pH 1 experiments. These layers can be successfully identified by XPS and have lower Al/Si, Na/Si, K/Si and Ca/Si values than the outermost ~9 nm of unweathered controls. Development of Al-Si non-stoichiometry is coupled with loss of crystal structure to produce amorphous layers that are identifiable by TEM where greater than ~2.5 nm thick, whereas the crystallinity of albite is retained despite leaching of Na to depths of tens to hundreds of nanometers. Integration of XPS data over the outermost 6-9 nm of naturally weathered Shap feldspars shows that they have stoichiometric Al/Si and K/Si ratios, which is consistent with the finding of previous TEM work on the same material that they lack amorphous layers. There is some XPS evidence for loss of K from the outermost couple of nanometers of Shap orthoclase, and the possibility of leaching of Na from albite to greater depths cannot be excluded using the XPS or TEM results. This study demonstrates that the leached layer model, as formulated from laboratory experiments, is inapplicable to the weathering of alkali feldspars within acidic soils, which is an essentially stoichiometric reaction.
Abstract:
Canopy interception of incident precipitation is a critical component of the forest water balance during each of the four seasons. Models have been developed to predict precipitation interception from standard meteorological variables because of acknowledged difficulty in extrapolating direct measurements of interception loss from forest to forest. No known study has compared and validated canopy interception models for a leafless deciduous forest stand in the eastern United States. Interception measurements from an experimental plot in a leafless deciduous forest in northeastern Maryland (39°42'N, 75°5'W) for 11 rainstorms in winter and early spring 2004/05 were compared to predictions from three models. The Mulder model maintains a moist canopy between storms. The Gash model requires few input variables and is formulated for a sparse canopy. The WiMo model optimizes the canopy storage capacity for the maximum wind speed during each storm. All models showed marked underestimates and overestimates for individual storms when the measured ratio of interception to gross precipitation was far more or less, respectively, than the specified fraction of canopy cover. The models predicted the percentage of total gross precipitation (PG) intercepted to within the probable standard error (8.1%) of the measured value: the Mulder model overestimated the measured value by 0.1% of PG; the WiMo model underestimated by 0.6% of PG; and the Gash model underestimated by 1.1% of PG. The WiMo model’s advantage over the Gash model indicates that the canopy storage capacity increases logarithmically with the maximum wind speed. This study has demonstrated that dormant-season precipitation interception in a leafless deciduous forest may be satisfactorily predicted by existing canopy interception models.
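The logarithmic dependence reported for the WiMo model can be written generically as

```latex
S(u_{\max}) = a + b \,\ln u_{\max},
```

where S is the canopy storage capacity and u_max the maximum wind speed during the storm; the functional form follows the abstract's statement, but the coefficients a and b are placeholders, not values fitted in the study.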
Abstract:
This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to get more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA (with bootstrapping).
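For readers unfamiliar with the machinery, an input-oriented, constant-returns DEA score is the optimum of a small linear program solved once per farm. The sketch below is a generic illustration in Python; the article's actual analysis adds the Simar-Wilson smoothed bootstrap and uses the SLICE MODEL in GAMS, neither of which is reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, k):
    """Input-oriented CCR (constant returns to scale) efficiency of unit k.
    X: (n_units, n_inputs) input matrix; Y: (n_units, n_outputs) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)        # decision variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0                 # minimise the radial input contraction theta
    A_ub, b_ub = [], []
    for i in range(m):         # sum_j lambda_j * x_ji <= theta * x_ki
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):         # sum_j lambda_j * y_jr >= y_kr
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.fun             # efficiency score in (0, 1]

# Example: four farms, two inputs, one output (illustrative numbers)
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])
Y = np.ones((4, 1))
print([round(dea_ccr_input(X, Y, k), 3) for k in range(4)])
```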
Abstract:
Ab initio calculations using density functional theory have shown that the reactions that occur between artemisinin, 1, a cyclic trioxane active against malaria, and some metal ions and complexes lead to a series of radicals which are probably responsible for its therapeutic activity. In particular, it has been shown that the interaction of Fe(II) with artemisinin causes the O-O bond to be broken, as indeed do Fe(III) and Cu(I), while Zn(II) does not. Calculations were carried out with Fe(II) in several different forms, including the bare ion, [Fe(H2O)5]2+ and [FeP(Im)] (P, porphyrin; Im, imidazole), and similar results were obtained. The resulting oxygen-based radicals are readily converted to more stable carbon-based radicals and/or stable products. Similar radicals and products are also formed from two simple model trioxanes, 2 and 3, that show little or no therapeutic action against malaria, although some subtle differences were observed. This suggests that the scaffold surrounding the pharmacophore may be involved in molecular recognition events allowing efficient uptake of this trioxane warhead into the parasite.
Abstract:
Supplier selection has a great impact on supply chain management. The quality of supplier selection also affects the profitability of organisations operating in the supply chain. As suppliers can provide a variety of services and customers demand ever higher quality of service provision, organisations face the challenge of making the right choice of supplier for the right needs. Existing methods for supplier selection, such as data envelopment analysis (DEA) and the analytic hierarchy process (AHP), can automatically shortlist competitive suppliers and then decide the winning supplier(s). However, these methods are not capable of determining the right selection criteria, which should be derived from the business strategy. The ontology model described in this paper integrates the strengths of DEA and AHP with new mechanisms which ensure the right supplier is selected by the right criteria for the right customer's needs.
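One concrete ingredient the paper builds on: AHP conventionally derives criteria weights from the principal eigenvector of a pairwise-comparison matrix, with a consistency check on the judgements. The sketch below is generic AHP, not the paper's ontology model.

```python
import numpy as np

# Saaty's random consistency index for matrices of size n (standard values)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_weights(A):
    # Priority weights = normalised principal eigenvector of A (n >= 3).
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)
    cr = ci / RI[n]            # consistency ratio; < 0.1 is conventionally OK
    return w, cr

# Example: three selection criteria compared on Saaty's 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print(w, cr)
```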
Abstract:
The new HadKPP atmosphere–ocean coupled model is described and then used to determine the effects of sub-daily air–sea coupling and fine near-surface ocean vertical resolution on the representation of the Northern Hemisphere summer intra-seasonal oscillation. HadKPP comprises the Hadley Centre atmospheric model coupled to the K Profile Parameterization ocean-boundary-layer model. Four 30-member ensembles were performed that varied in oceanic vertical resolution between 1 m and 10 m and in coupling frequency between 3 h and 24 h. The 10 m, 24 h ensemble exhibited roughly 60% of the observed 30–50 day variability in sea-surface temperatures and rainfall and very weak northward propagation. Enhancing either only the vertical resolution or only the coupling frequency produced modest improvements in variability and only a standing intra-seasonal oscillation. Only the 1 m, 3 h configuration generated organized, northward-propagating convection similar to observations. Sub-daily surface forcing produced stronger upper-ocean temperature anomalies in quadrature with anomalous convection, which likely affected lower-atmospheric stability ahead of the convection, causing propagation. Well-resolved air–sea coupling did not improve the eastward propagation of the boreal summer intra-seasonal oscillation in this model. Upper-ocean vertical mixing and diurnal variability in coupled models must be improved to accurately resolve and simulate tropical sub-seasonal variability. In HadKPP, the mere presence of air–sea coupling was not sufficient to generate an intra-seasonal oscillation resembling observations.
Abstract:
Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a shortcut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and that each has its own probability distribution. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
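The probability-based DCF built in Crystal Ball can be approximated with a plain Monte Carlo simulation: draw the uncertain inputs from distributions, compute the DCF for each draw, and read a distribution of value off the results. In the sketch below every number (rent, growth, discount rate, exit yield) is a hypothetical placeholder, not a figure from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000               # simulation trials
rent0 = 100_000.0         # current net rent (illustrative)
years = 5

# Uncertain inputs as distributions (all parameters hypothetical):
growth = rng.normal(0.02, 0.01, N)                 # annual rental growth
disc = rng.normal(0.08, 0.005, N)                  # discount rate
exit_yield = rng.triangular(0.05, 0.06, 0.08, N)   # capitalisation yield at exit

value = np.zeros(N)
for t in range(1, years + 1):                      # discounted rents
    value += rent0 * (1 + growth) ** t / (1 + disc) ** t
terminal = rent0 * (1 + growth) ** years / exit_yield
value += terminal / (1 + disc) ** years            # discounted exit value

print(f"mean value: {value.mean():,.0f}")
print("5th/95th percentile:", np.percentile(value, [5, 95]).round(0))
```

The output is not a single figure but an interval, which is precisely the shortcoming of the point-estimate model that the paper addresses.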
Abstract:
Consumer studies of meat have tended to use quantitative methodologies providing a wealth of statistically malleable information, but little in-depth insight into consumer perceptions of meat. The aim of the present study was therefore to understand the factors perceived as important in the selection of chicken meat, using a qualitative methodology. Focus group discussions were tape-recorded, transcribed verbatim and content-analysed for major themes. The themes arising implied that “appearance” and “convenience” were the most important determinants of choice of chicken meat, and these factors appeared to be associated with perceptions of freshness, healthiness, product versatility and concepts of value. A descriptive model has been developed to illustrate the interrelationship between factors affecting chicken meat choice. This study indicates that those involved in the production and retailing of chicken products should concentrate upon product appearance and convenience as market drivers for their products.