55 results for A Model for Costing Absenteeism in Hotels


Relevance:

100.00%

Publisher:

Abstract:

Background: Experimental evidence demonstrates that vegetable-derived extracts inhibit cholesterol absorption in the gastrointestinal tract. To further explore the underlying mechanisms, we modeled duodenal contents with several vegetable extracts. Results: Employing a widely used cholesterol quantification method based on a cholesterol oxidase-peroxidase coupled reaction, we analyzed the effects on cholesterol partition. The evidenced interferences were analyzed by studying specific and unspecific inhibitors of the cholesterol oxidase-peroxidase coupled reaction. Cholesterol was also quantified by LC/MS. We found a significant interference of diverse (cocoa- and tea-derived) extracts with this method. The interference was strongly dependent on the model matrix: in phosphate-buffered saline, the development of unspecific fluorescence was inhibitable by catalase (but not by heat denaturation), suggesting vegetable-extract-derived H2O2 production, whereas in bile-containing model systems this interference also comprised cholesterol-oxidase inhibition. Several strategies, such as cholesterol standard addition and the use of suitable blanks containing vegetable extracts, were tested. When those failed, the use of a mass-spectrometry-based chromatographic assay allowed quantification of cholesterol in models of duodenal contents in the presence of vegetable extracts. Conclusions: We propose that the use of cholesterol-oxidase- and/or peroxidase-based systems for cholesterol analyses in foodstuffs should be carefully monitored, as important interferences in all the components of the enzymatic chain were evident. The use of adequate controls, standard addition and, finally, chromatographic analyses solved these issues.
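The standard-addition strategy mentioned above can be sketched numerically: known amounts of cholesterol are spiked into the sample, the assay response is regressed on the added concentration, and the unknown concentration is read off the x-intercept. The readings below are hypothetical illustration values, not data from the study.

```python
import numpy as np

# Standard addition: spike known cholesterol amounts into the sample,
# regress the assay signal on the added concentration, and extrapolate
# to zero signal. All readings here are hypothetical illustration values.
added = np.array([0.0, 0.1, 0.2, 0.3])       # mM cholesterol spiked in
signal = np.array([0.50, 0.75, 1.00, 1.25])  # assay response (arb. units)

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope                # magnitude of the x-intercept
print(f"estimated sample concentration: {c_unknown:.3f} mM")
```

A matrix-dependent interference like the one described would show up as a change in slope between matrices, which is why the authors also needed suitable blanks containing the vegetable extracts.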

Relevance:

100.00%

Publisher:

Abstract:

We analyze the neutron skin thickness in finite nuclei with the droplet model and effective nuclear interactions. In the droplet model, the ratio of the bulk symmetry energy J to the so-called surface stiffness coefficient Q plays a prominent role in driving the size of neutron skins. We present a correlation between the density derivative of the nuclear symmetry energy at saturation and the J/Q ratio. We emphasize the role of the surface widths of the neutron and proton density profiles in the calculation of the neutron skin thickness when one uses realistic mean-field effective interactions. Next, taking as experimental baseline the neutron skin sizes measured in 26 antiprotonic atoms along the mass table, we explore the constraints that neutron skins place on the value of the J/Q ratio. The results favor a relatively soft symmetry energy at subsaturation densities. Our predictions are compared with recent constraints derived from other experimental observables. Though the various extractions predict different ranges of values, one finds a narrow window L∼45-75 MeV for the coefficient L that characterizes the density derivative of the symmetry energy and is compatible with all the different empirical indications.

Relevance:

100.00%

Publisher:

Abstract:

Besley (1988) uses a scaling approach to model merit good arguments in commodity tax policy. In this paper, I question this approach on the grounds that it produces 'wrong' recommendations--taxation (subsidisation) of merit (demerit) goods--whenever the demand for the (de)merit good is inelastic. I propose an alternative approach that does not suffer from this deficiency, and derive the ensuing first and second best tax rules, as well as the marginal cost expressions to perform tax reform analysis.

Relevance:

100.00%

Publisher:

Abstract:

As a response to the rapidly growing empirical literature on social capital and the evidence of its correlation with government performance, we build a theoretical framework to study the interactions between social capital and government action. This paper presents a model of homogeneous agents in an overlapping generations framework, incorporating social capital as the values transmitted from parent to child. The government's role is to provide public goods. First, government expenditure is exogenously given; then, it is chosen at the level preferred by the representative agent. For both setups the equilibrium outcomes are characterized and the resulting dynamics studied. We also briefly analyze the effect of productivity growth on the evolution of social capital. The results obtained caution against both the crowding-out effect of the welfare state and the impact of sustained economic growth on social capital.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we consider a model of cooperative production in which rational agents have the possibility to engage in sabotage activities that decrease output. It is shown that sabotage depends on the interplay between the degree of congestion, the technology of sabotage, the number of agents, the degree of meritocracy and the form of the sharing rule. In particular, it is shown that, ceteris paribus, meritocratic systems give more incentives to sabotage than egalitarian systems. We address two questions: the degree of meritocracy that is compatible with the absence of sabotage, and the existence of a Nash equilibrium with and without sabotage.

Relevance:

100.00%

Publisher:

Abstract:

The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.


Relevance:

100.00%

Publisher:

Abstract:

The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by monetary authorities has both positive and normative implications for economic performance. We reexamine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by issuing currency. When we evaluate the performance of the two monetary instruments with respect to the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of the two targeting procedures displays unambiguously higher welfare levels.

Relevance:

100.00%

Publisher:

Abstract:

We study a model where agents, located in a social network, decide whether to exert effort or not in experimenting with a new technology (or acquiring a new skill, innovating, etc.). We assume that agents have strong incentives to free ride on their neighbors' effort decisions. In the static version of the model efforts are chosen simultaneously. In equilibrium, agents exerting effort are never connected with each other and all other agents are connected with at least one agent exerting effort. We propose a mean-field dynamics in which agents choose in each period the best response to the last period's decisions of their neighbors. We characterize the equilibrium of such a dynamics and show how the pattern of free riders in the network depends on properties of the connectivity distribution.
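As a toy illustration (not the paper's exact specification), the best-response structure can be simulated on a small graph: an agent exerts effort only if no neighbor exerted effort last period. Asynchronous updates are used here so the dynamics settle; the rest point is a maximal independent set of exerters, matching the equilibrium pattern described above.

```python
# Toy best-response dynamics for free riding on a network (illustrative,
# not the paper's exact model): exert effort iff no neighbor exerts effort.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # a small line network
effort = {0: True, 1: True, 2: False, 3: False}     # arbitrary starting profile

for _ in range(5):                # asynchronous updates settle quickly here
    for i in neighbors:
        effort[i] = not any(effort[j] for j in neighbors[i])

exerters = {i for i, e in effort.items() if e}
# At rest: no two exerters are linked, and every free rider
# has at least one exerting neighbor.
print(exerters)
```

The simultaneous-move version described in the paper can cycle under naive iteration, which is one reason the authors study a mean-field dynamics instead.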

Relevance:

100.00%

Publisher:

Abstract:

We extend the model of collective action in which groups compete for a budget by endogenizing the group platform, namely the specific mixture of public/private goods and the distribution of the private good to group members, which can be uniform or performance-based. While the group-optimal platform contains a degree of publicness that increases in group size and divides the private benefits uniformly, a success-maximizing leader uses incentives and distorts the platform towards more private benefits, a distortion that increases with group size. In both settings we obtain the anti-Olson-type result that the win probability increases with group size.

Relevance:

100.00%

Publisher:

Abstract:

Suburbanization is changing the urban spatial structure, and less monocentric metropolitan regions are becoming the new urban reality. Most works, focused only on centers, have studied these spatial changes while neglecting the role of transport infrastructure and its related location model, the “accessibility city”, in which employment and population concentrate in low-density settlements close to transport infrastructure. For the case of Barcelona, we consider this location model and study the population spatial structure between 1991 and 2006. The results reveal a mix between polycentricity and the accessibility city, with movements away from the main centers but close to the transport infrastructure.

Relevance:

100.00%

Publisher:

Abstract:

The sustainability of Catalonia's energy model is conditioned by aspects such as energy dependence, security of supply, energy efficiency, environmental impacts and growing demand. On the other hand, incorporating renewable energy into the energy mix implies greater energy autonomy, long-term security of supply and energy efficiency, as well as a lower environmental impact. However, the contribution to the electricity system of an already significant and growing volume of renewable energy requires a complex task of technical and economic integration. To achieve this, it is necessary to develop a stable regulation that complements the liberalization process of the sector, with the aim of accommodating renewable generation within a sustainable energy model. The (in)formation and participation of the demand side is presented as a key condition for setting out on the path towards a new energy culture.

Relevance:

100.00%

Publisher:

Abstract:

Three multivariate statistical tools (principal component analysis, factor analysis and discriminant analysis) have been tested to characterize and model the sags registered in distribution substations. These models use several features, obtained from voltage and current waveforms, to represent the magnitude, duration and unbalance grade of the sags. The techniques are tested and compared using 69 registers of sags, and the advantages and drawbacks of each technique are listed.
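For the first of those tools, a bare-bones principal component analysis over a feature matrix of sag records can be sketched as follows. The data below are synthetic placeholders standing in for the 69 registered sags, with the columns only standing in for features such as magnitude, duration and unbalance grade.

```python
import numpy as np

# PCA via eigendecomposition of the covariance matrix. The data are
# synthetic placeholders, not the substation registers from the study.
rng = np.random.default_rng(0)
X = rng.normal(size=(69, 3))            # 69 sag records x 3 features

Xc = X - X.mean(axis=0)                 # center each feature
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]       # components sorted by variance
scores = Xc @ eigvecs[:, order]         # sag coordinates on the components
explained = eigvals[order] / eigvals.sum()
```

Plotting the sags in the space of the first two `scores` columns is the usual way to inspect how well a low-dimensional model separates the different sag types.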

Relevance:

100.00%

Publisher:

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
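The two-stage idea can be illustrated with a minimal generative sketch: stage 1 decides which parts are essential zeros (an independent binomial incidence pattern); stage 2 distributes the unit over the non-zero parts via a logistic-normal law (a softmax of a latent normal). All parameter values are illustrative choices, not estimates from the paper.

```python
import numpy as np

# Stage 1: independent binomial incidence decides the non-zero pattern.
# Stage 2: a logistic-normal (softmax of a latent normal) distributes
# the unit among the non-zero parts. Parameters are illustrative only.
rng = np.random.default_rng(1)
D, p_present = 5, 0.7                   # number of parts, inclusion probability

incidence = rng.random(D) < p_present   # stage 1: which parts are non-zero
incidence[0] = True                     # keep at least one part non-zero

z = rng.normal(size=D)                  # stage 2: latent normal scores
w = np.where(incidence, np.exp(z), 0.0)
composition = w / w.sum()               # essential zeros stay exactly zero
```

Repeating this draw over many samples yields exactly the data structure the abstract describes: an incidence matrix plus a conditional compositional matrix.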

Relevance:

100.00%

Publisher:

Abstract:

Forest fire models have been widely studied, both in the context of self-organized criticality and in terms of the ecological properties of forests and combustion. Reaction-diffusion equations, on the other hand, have interesting applications in biology and physics. We propose here a model for fire propagation in a forest using hyperbolic reaction-diffusion equations. The dynamical and thermodynamical aspects of the model are analyzed in detail.
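As a sketch of what a hyperbolic (telegraph-type) reaction-diffusion equation looks like numerically, the following explicit finite-difference scheme integrates tau*u_tt + u_t = D*u_xx + r*u*(1-u) in one dimension. The equation form, the logistic reaction term and all parameter values are illustrative stand-ins, not the paper's specific fire model.

```python
import numpy as np

# Explicit scheme for tau*u_tt + u_t = D*u_xx + r*u*(1 - u), a
# telegraph-type (hyperbolic) reaction-diffusion toy model of a
# spreading front. Parameters and reaction term are illustrative.
tau, D, r = 0.1, 1.0, 1.0
nx, dx, dt = 200, 0.1, 0.002
u_prev = np.zeros(nx)
u_prev[:10] = 1.0                       # "ignited" region on the left edge
u = u_prev.copy()                       # start at rest (u_t = 0)

for _ in range(500):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    # central differences in time for both u_tt and u_t
    u_next = (2 * tau * u - tau * u_prev + 0.5 * dt * u_prev
              + dt**2 * (D * lap + r * u * (1 - u))) / (tau + 0.5 * dt)
    u_prev, u = u, u_next
```

Unlike the parabolic case, the hyperbolic form propagates disturbances at a finite speed bounded by sqrt(D/tau), which is the physically appealing feature such fire-front models exploit.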