10 results for quality requirements
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In the integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QoS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem can be much simplified if the different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). In this case, however, resources may not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
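The convolution approach described in this abstract admits a compact illustration. Assuming each connection is modeled as an on/off source with a two-point bandwidth distribution (a simplifying assumption for this sketch; the paper's traffic model may be richer), the aggregate bandwidth distribution is the convolution of the per-connection distributions, and the PC is the tail of that distribution above the link capacity:

```python
def convolve(a, b):
    """Discrete convolution of two probability mass functions."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def bandwidth_pmf(peak_bw, activity):
    """Two-point bandwidth distribution of a bursty on/off source:
    idle with probability (1 - activity), at peak_bw otherwise."""
    pmf = [0.0] * (peak_bw + 1)
    pmf[0] = 1.0 - activity
    pmf[peak_bw] = activity
    return pmf

def congestion_probability(connections, capacity):
    """Convolve the per-connection distributions into the aggregate
    bandwidth distribution, then sum the tail above the link capacity."""
    agg = [1.0]
    for peak_bw, activity in connections:
        agg = convolve(agg, bandwidth_pmf(peak_bw, activity))
    return sum(agg[capacity + 1:])
```

For example, two on/off sources each with peak rate 1 unit and activity 0.5 sharing a link of capacity 1 give PC = 0.25 (the probability that both are on simultaneously).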
Abstract:
Background: The COSMIN checklist (COnsensus-based Standards for the selection of health status Measurement INstruments) was developed in an international Delphi study to evaluate the methodological quality of studies on measurement properties of health-related patient-reported outcomes (HR-PROs). In this paper, we explain our choices for the design requirements and preferred statistical methods for which no evidence is available in the literature or on which the Delphi panel members had substantial discussion. Methods: The issues described in this paper are a reflection of the Delphi process in which 43 panel members participated. Results: The topics discussed are internal consistency (relevance for reflective and formative models, and distinction from unidimensionality), content validity (judging relevance and comprehensiveness), hypotheses testing as an aspect of construct validity (specificity of hypotheses), criterion validity (relevance for PROs), and responsiveness (concept and relation to validity, and (in)appropriate measures). Conclusions: We expect that this paper will contribute to a better understanding of the rationale behind the items, thereby enhancing the acceptance and use of the COSMIN checklist.
Abstract:
The optimal location of services is one of the most important factors affecting service quality in terms of consumer access. On the other hand, services in general need a minimum catchment area in order to be efficient. In this paper a model is presented that locates the maximum number of services that can coexist in a given region without incurring losses, taking into account that they need a minimum catchment area to exist. The objective is to minimize the average distance to the population. The formulation presented belongs to the class of discrete P-median-like models. A tabu search heuristic is presented to solve the problem. Finally, the model is applied to the location of pharmacies in a rural region of Spain.
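A P-median tabu search of the kind the abstract mentions can be sketched as follows. This is a minimal best-improvement swap heuristic with a short tabu list and an aspiration criterion, not the paper's exact method (the paper's model additionally enforces minimum catchment areas, which this sketch omits); all parameter names are illustrative:

```python
import itertools
import random

def avg_distance(dist, medians):
    """Mean distance from each demand point to its nearest open facility."""
    return sum(min(dist[i][j] for j in medians) for i in range(len(dist))) / len(dist)

def tabu_pmedian(dist, p, iters=50, tabu_len=5, seed=0):
    """Tabu search for the p-median problem on a distance matrix `dist`:
    repeatedly swap an open facility for a closed one, forbidding the
    reverse swap for `tabu_len` iterations, with aspiration for new bests."""
    rng = random.Random(seed)
    n = len(dist)
    current = set(rng.sample(range(n), p))
    best, best_cost = set(current), avg_distance(dist, current)
    tabu = []
    for _ in range(iters):
        moves = []
        for out_f, in_f in itertools.product(current, set(range(n)) - current):
            cand = (current - {out_f}) | {in_f}
            moves.append((avg_distance(dist, cand), out_f, in_f, cand))
        moves.sort(key=lambda m: m[0])
        for cost, out_f, in_f, cand in moves:
            # Accept the best non-tabu move, or a tabu move that sets a new best.
            if (out_f, in_f) not in tabu or cost < best_cost:
                current = cand
                tabu.append((in_f, out_f))   # forbid undoing this swap
                tabu[:] = tabu[-tabu_len:]
                if cost < best_cost:
                    best, best_cost = set(cand), cost
                break
    return best, best_cost
```

On a toy instance of four demand points on a line at positions 0, 1, 2 and 10, the single best median sits at position 1 or 2, with an average distance of 2.75.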
Abstract:
Under IAS 40, companies are required to report fair values of investment properties on the balance sheet or to disclose them in the notes. The standard also requires companies to disclose the methods and significant assumptions applied in determining the fair values of investment properties. However, IAS 40 does not include any illustrative examples or other guidance on how to apply the disclosure requirements. We use a sample of publicly traded companies from the real estate sector in the EU. We find that a majority of the companies use income-based methods for the measurement of fair values, but there are considerable cross-country variations in the level of disclosure about the assumptions used in determining fair values. More specifically, we find that companies of Scandinavian and German origin disclose more than companies of French and English origin. We also test whether disclosure quality is associated with enforcement quality, measured with the “Rule of Law” index of Kaufmann et al. (2010), and with a secrecy-versus-transparency measure based on Gray (1988). We find a positive association between disclosure and earnings quality and a negative association with secrecy.
Abstract:
The effects of the addition to sausage mix of tocopherols (200 mg/kg), a conventional starter culture with or without Staphylococcus carnosus, celery concentrate (CP) (0.23% and 0.46%), and two doses of nitrate (70 and 140 mg/kg expressed as NaNO₃) on residual nitrate and nitrite amounts, instrumental CIE Lab color, tocol content, oxidative stability, and overall acceptability were studied in fermented dry-cured sausages after ripening and after storage. Nitrate doses were provided by nitrate-rich CP or a chemical-grade source. The lower dose complies with EU requirements governing maximum ingoing amounts in organic meat products. Tocopherol addition protected against oxidation, whereas the nitrate dose, nitrate source, and starter culture had little influence on secondary oxidation values. The residual nitrate and nitrite amounts found in the sausages with the lower nitrate dose were within EU-permitted limits for organic meat products, and residual nitrate can be further reduced by the presence of the S. carnosus culture. Color measurements were not affected by the CP dose. Consumer acceptability of the product was not negatively affected by any of the factors studied. As the two nitrate sources behaved similarly for the parameters studied, CP is a useful alternative to chemical ingredients in organic dry-cured sausage production.
Abstract:
A method for dealing with monotonicity constraints in optimal control problems is used to generalize some results in the context of monopoly theory, also extending the generalization to a large family of principal-agent programs. Our main conclusion is that many results on diverse economic topics, achieved under assumptions of continuity and piecewise differentiability in connection with the endogenous variables of the problem, still remain valid after replacing such assumptions by two minimal requirements.
Abstract:
This study analyses efficiency levels in Spanish local governments and their determining factors through the application of Data Envelopment Analysis (DEA). It aims to find out to what extent inefficiency arises from external factors beyond the entity's control, and how much is due to inadequate management of productive resources. The results show that, on the whole, there is still a wide margin within which managers could increase local government efficiency levels, although a great deal of inefficiency is shown to be due to exogenous factors. Specifically, the size of the entity, per capita tax revenue, per capita grants and the amount of commercial activity are found to be among the factors determining local government inefficiency.
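The core DEA idea can be illustrated in a deliberately simplified form: with a single input and a single output, CCR efficiency reduces to each unit's output/input ratio normalized by the best ratio in the sample (the study's actual model would use multiple inputs and outputs and a linear program per unit; the data below are hypothetical):

```python
def dea_single_ratio(units):
    """Single-input, single-output CCR-style efficiency: each unit's
    output/input ratio divided by the best ratio observed in the sample,
    so the frontier unit scores 1.0 and the rest score below it."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}
```

For instance, with hypothetical municipalities A (input 2, output 4), B (3, 3) and C (4, 4), A defines the frontier with efficiency 1.0, while B and C score 0.5.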
Abstract:
This paper examines competition in a spatial model of two-candidate elections, where one candidate enjoys a quality advantage over the other candidate. The candidates care about winning and also have policy preferences. There is two-dimensional private information. Candidate ideal points as well as their tradeoffs between policy preferences and winning are private information. The distribution of this two-dimensional type is common knowledge. The location of the median voter's ideal point is uncertain, with a distribution that is commonly known by both candidates. Pure strategy equilibria always exist in this model. We characterize the effects of increased uncertainty about the median voter, the effect of candidate policy preferences, and the effects of changes in the distribution of private information. We prove that the distribution of candidate policies approaches the mixed equilibrium of Aragones and Palfrey (2002a), when both candidates' weights on policy preferences go to zero.
Abstract:
We construct estimates of educational attainment for a sample of OECD countries using previously unexploited sources. We follow a heuristic approach to obtain plausible time profiles for attainment levels by removing sharp breaks in the data that seem to reflect changes in classification criteria. We then construct indicators of the information content of our series and a number of previously available data sets and examine their performance in several growth specifications. We find a clear positive correlation between data quality and the size and significance of human capital coefficients in growth regressions. Using an extension of the classical errors in variables model, we construct a set of meta-estimates of the coefficient of years of schooling in an aggregate Cobb-Douglas production function. Our results suggest that, after correcting for measurement error bias, the value of this parameter is well above 0.50.
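The attenuation-bias logic behind such measurement-error corrections can be sketched as follows (the numbers below are illustrative, not the paper's estimates). In the classical errors-in-variables model, OLS converges to the true coefficient scaled by the reliability ratio λ = var(signal) / (var(signal) + var(noise)), so a corrected estimate divides the OLS coefficient by λ:

```python
def reliability_ratio(signal_var, noise_var):
    """Share of the variance in the measured regressor that is true signal."""
    return signal_var / (signal_var + noise_var)

def correct_attenuation(beta_ols, reliability):
    """Classical errors-in-variables: plim(beta_OLS) = beta * reliability,
    so dividing by the reliability ratio undoes the attenuation bias."""
    return beta_ols / reliability
```

For example, a measured schooling coefficient of 0.35 combined with a hypothetical reliability ratio of 0.6 implies a corrected coefficient of about 0.58, consistent with the abstract's conclusion that the corrected parameter lies well above 0.50.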