31 results for "Delivery ratio"
at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The paper models the practice of charging bribes for faster delivery of essential services in third-world countries. It then examines the possibility of curbing corruption, first by supervision and then by introducing competition among delivery agents. It is argued that a supervisory solution sidesteps the problem, because no hard evidence of a reduction in corruption can be established for this type of offense. It is also shown that using more than one supplier cannot eliminate the practice, and that the bribe-paying part of the market settles at a determinate proportion as the number of suppliers increases. However, the bribe rate and the average waiting time come down at a diminishing rate as the number of suppliers grows, and this property can be used to determine an optimal number of suppliers.
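The diminishing returns from adding suppliers can be illustrated with a standard M/M/c queue. This is a generic queueing sketch, not the authors' bribery model; the arrival rate `lam`, service rate `mu`, and supplier counts are made-up values:

```python
from math import factorial

def erlang_c_wait(lam, mu, c):
    """Mean queueing delay in an M/M/c system (Erlang C formula)."""
    rho = lam / (c * mu)
    assert rho < 1, "system must be stable"
    a = lam / mu  # offered load
    # Probability that an arriving customer has to wait.
    num = (a ** c / factorial(c)) / (1 - rho)
    den = sum(a ** k / factorial(k) for k in range(c)) + num
    p_wait = num / den
    return p_wait / (c * mu - lam)

# Waiting time versus number of suppliers c = 1..5.
waits = [erlang_c_wait(lam=0.9, mu=1.0, c=c) for c in range(1, 6)]
drops = [waits[i] - waits[i + 1] for i in range(len(waits) - 1)]
# waits fall as c grows, but each extra supplier helps less than the last
```

Plotting `waits` against `c` shows the convex, diminishing-rate decline the abstract describes, which is what makes an optimal number of suppliers well defined once a cost per supplier is attached.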
Abstract:
This paper empirically analyses the hypothesis that a dual market for contracts exists in local services. Large firms that operate on a national basis control the delivery contracts in the most populated and/or urban municipalities, whereas small firms that operate at a local level hold the contracts in the least populated and/or rural municipalities. The dual market implies high concentration and dominance of major firms in large municipalities, and local monopolies in the smaller ones. This market structure is harmful to competition for the market because the effective number of competitors is low across all municipalities, and it thus reduces the likelihood of obtaining cost savings from privatization.
Abstract:
Housing adaptation in Bristol (UK). Presentation at the "2º Espacio de Encuentro: Rehabilitación y Adaptación Funcional de la Vivienda" (San Sebastián, 9 June 2010).
Abstract:
Based on the critique of Ahumada et al. (2007, Review of Income and Wealth), we revise existing estimates of the size of the German underground economy. Among other things, it turns out that most of these estimates are untenable and that the tax-pressure-induced size of the German underground economy may be much lower than previously thought. To this extent, German policy and law makers have been misguided during the last three decades. We therefore introduce the Modified-Cash-Deposit-Ratio (MCDR) approach, which is not subject to the recent critique, and apply it to Germany for the period 1960 to 2008. JEL: O17, Q41, C22. Keywords: underground economy, shadow economy, cash-deposit ratio, currency demand approach, MIMIC approach
Abstract:
This article discusses the lessons learned from developing and delivering the Vocational Management Training for the European Tourism Industry (VocMat) online training programme, which aimed to provide flexible, online distance learning for the European tourism industry. The programme was designed to address managers' need for flexible, senior-management-level training which they could access at a time and place that fitted in with their work and non-work commitments. The authors present two main approaches to using the Virtual Learning Environment, the feedback from the participants, and the implications of online technology for extending tourism training opportunities.
Abstract:
We compare correspondence analysis to the logratio approach based on compositional data. We also compare correspondence analysis with an alternative approach using the Hellinger distance for representing categorical data in a contingency table. We propose a coefficient which globally measures the similarity between these approaches. This coefficient can be decomposed into several components, one for each principal dimension, indicating the contribution of each dimension to the difference between the two representations. These three methods of representation can produce quite similar results. One illustrative example is given.
Abstract:
We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
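The weighted-logratio idea can be sketched numerically: log-transform the table, double-centre it with the row and column masses as weights, and take a weighted SVD. This is a minimal illustration with a made-up 3x3 table, not the authors' data or code:

```python
import numpy as np

# Toy contingency table (rows x cols); entries must be positive for logs.
N = np.array([[25., 10.,  5.],
              [12., 30.,  8.],
              [ 6.,  9., 20.]])

P = N / N.sum()
r = P.sum(axis=1)          # row masses (weights)
c = P.sum(axis=0)          # column masses
L = np.log(P)

# Weighted double-centering: subtract weighted row and column means.
L = L - (L @ c)[:, None]   # centre each row w.r.t. the column weights
L = L - (r @ L)[None, :]   # centre each column w.r.t. the row weights

# Weighted SVD: scale by the square roots of the masses, then decompose.
S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal row coordinates for the first two dimensions of the biplot.
F = (U / np.sqrt(r)[:, None]) * sv
```

After the weighted centering, both the row-weighted column means and the column-weighted row means vanish, which is what makes the margin-weighted logratio map obey both principles at once.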
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that, by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds a further layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
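The classical difficulty the abstract refers to, fitting a binomial with both N and p unknown, can be seen in a few lines of profile-likelihood code. The data are simulated and the values are hypothetical; the point is that the profile in N is notoriously flat, so the estimate of N can wander far from the truth:

```python
import math
import random

random.seed(0)
true_N, true_p = 50, 0.3
# Observed purchase counts per period; the market size N is unobserved.
data = [sum(random.random() < true_p for _ in range(true_N))
        for _ in range(100)]

def log_lik(N, p, xs):
    """Binomial log-likelihood of counts xs given size N and probability p."""
    if any(x > N for x in xs) or not 0 < p < 1:
        return float("-inf")
    return sum(math.log(math.comb(N, x)) + x * math.log(p)
               + (N - x) * math.log(1 - p) for x in xs)

# Profile likelihood: for each candidate N, the MLE of p is mean(x) / N,
# so only N has to be searched over.
xbar = sum(data) / len(data)
best_N = max(range(max(data), 201), key=lambda N: log_lik(N, xbar / N, data))
p_hat = xbar / best_N
```

Because the likelihood barely changes along the ridge N * p = mean(x), small sampling fluctuations can move `best_N` a long way, which is exactly why the abstract calls the joint estimation problem challenging.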
Abstract:
We reanalyze the two-nucleon-induced decay mode of Lambda hypernuclei, modifying previous numerical results and the interpretation of the process. The repercussions of this channel for the ratio of neutron- to proton-induced Lambda decay are studied in detail in connection with the present experimental data. This leads to ratios that are in greater contradiction with the usual one-pion-exchange models than those deduced before.
Abstract:
The photoproduction of η′ mesons off different nuclei has been measured with the CBELSA/TAPS detector system for incident photon energies between 1500 and 2200 MeV. The transparency ratio has been deduced and compared to theoretical calculations describing the propagation of η′ mesons in nuclei. The comparison indicates a width of the η′ meson of the order of Γ = 15–25 MeV at ρ = ρ0 for an average momentum pη′ = 1050 MeV/c, at which the η′ meson is produced in the nuclear rest frame. The inelastic η′N cross section is estimated to be 3–10 mb. Parameterizing the photoproduction cross section of η′ mesons by σ(A) = σ0·A^α, a value of α = 0.84 ± 0.03 has been deduced.
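A power-law parameterization like σ(A) = σ0·A^α is usually fitted by linear regression in log-log space, since log σ = log σ0 + α·log A. A minimal sketch with made-up cross sections (not the CBELSA/TAPS data) that follow an exact power law, so the fit recovers the inputs:

```python
import numpy as np

# Hypothetical per-nucleus cross sections for illustration only:
# mass numbers for C, Ca, Nb, Pb and sigma in arbitrary units.
A = np.array([12., 40., 93., 208.])
sigma = 1.3 * A ** 0.84          # exact power law for the demo

# Fit sigma = sigma0 * A**alpha as a straight line in log-log space.
alpha, log_s0 = np.polyfit(np.log(A), np.log(sigma), 1)
sigma0 = np.exp(log_s0)          # recovers alpha = 0.84, sigma0 = 1.3
```

With real data the points scatter around the line, and the quoted ±0.03 on α would come from the uncertainty of the fitted slope.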
Abstract:
We present a class of systems for which the signal-to-noise ratio always increases as the noise increases, and diverges at infinite noise level. This new phenomenon is a direct consequence of the existence of a scaling law for the signal-to-noise ratio and implies the appearance of stochastic resonance in some monostable systems. We outline applications of our results to a wide variety of systems pertaining to different scientific areas. Two particular examples are discussed in detail.
Abstract:
The ability to entrap drugs within vehicles and subsequently release them has led to new treatments for a number of diseases. Based on an associative phase separation and interfacial diffusion approach, we developed a way to prepare DNA gel particles without adding any kind of cross-linker or organic solvent. Among the various agents studied, cationic surfactants offered particularly efficient control over the encapsulation of DNA and its release from these DNA gel particles. The driving force for this strong association is the electrostatic interaction between the two components, as induced by the entropic increase due to the release of the respective counter-ions. However, little is known about the influence of the counter-ions on this surfactant–DNA interaction. Here we examined the effect of different counter-ions on the formation and properties of DNA gel particles by mixing DNA (either single-stranded (ssDNA) or double-stranded (dsDNA)) with the single-chain surfactant dodecyltrimethylammonium (DTA). In particular, we used as counter-ions of this surfactant the hydrogen sulfate and trifluoromethane sulfonate anions and the two halides, chloride and bromide. Effects on the morphology of the particles obtained, on the encapsulation of DNA and its release, and on the haemocompatibility of these particles are presented, using the counter-ion structure and the DNA conformation as controlling parameters. Analysis of the data indicates that the degree of counter-ion dissociation from the surfactant micelles and the polar/hydrophobic character of the counter-ion are important parameters for the final properties of the particles. The stronger interaction of amphiphiles with ssDNA than with dsDNA suggests that hydrophobic interactions play an important role in DNA.