721 results for Empirical


Relevance: 20.00%

Publisher:

Abstract:

The conventional wisdom is that offenders have very high discount rates, not only with respect to income and fines but also with respect to time incarcerated. These rates are difficult to measure objectively; the usual approach is to ask subjects hypothetical questions and infer time preference from their answers. In this article, we propose estimating the rates at which offenders discount time incarcerated from their equilibrium plea, defined as the discount rate that equates the discounted time spent in jail following a guilty plea with the expected discounted time following a trial. Offenders are assumed to exhibit positive time preference and to discount time spent in jail at a constant rate. Our choice of sample is interesting because the offenders are not on bail, punishment is not delayed, and the offences are planned, thereby conforming to Becker's model of the decision to commit a crime. Contrary to the discussion in the literature, we do not find evidence of consistently high time discount rates, and therefore cannot unequivocally infer that the prison experience always results in low levels of specific deterrence.
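The identification idea can be made concrete under standard exponential discounting; the article's exact specification is not given in the abstract, and the sentence lengths and conviction probability below are hypothetical. The equilibrium rate is the r at which the discounted plea sentence equals the expected discounted trial sentence:

```python
import math

def pv_jail_time(sentence, r):
    """Discounted disutility of `sentence` years served from now, at constant rate r."""
    if r == 0:
        return sentence
    return (1.0 - math.exp(-r * sentence)) / r

def equilibrium_discount_rate(s_plea, s_trial, p_convict, lo=1e-6, hi=5.0):
    """Bisection for the rate equating the discounted plea sentence with the
    expected discounted trial sentence (all inputs are hypothetical)."""
    f = lambda r: pv_jail_time(s_plea, r) - p_convict * pv_jail_time(s_trial, r)
    assert f(lo) < 0 < f(hi), "no equilibrium rate in bracket"
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical case: a 2-year plea offer versus a 50% chance of 5 years at trial.
r = equilibrium_discount_rate(2.0, 5.0, 0.5)
```

An offender who accepts the plea despite its longer discounted length than the trial gamble reveals a rate at least this high, which is the sense in which observed pleas bound the discount rate.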


Previous research identifies various reasons companies invest in information technology (IT), often as a means to generate value. To add to the discussion of IT value generation, this study investigates investments in enterprise software systems that support business processes. Managers of more than 500 Swiss small and medium-sized enterprises (SMEs) responded to a survey regarding the levels of their IT investment in enterprise software systems and the perceived utility of those investments. The authors use logistic and ordinary least squares regression to examine whether IT investments in two business processes affect SMEs' performance and competitive advantage. Using cluster analysis, they also develop a firm typology with four distinct groups that differ in their investments in enterprise software systems. These findings offer key implications for both research and managerial practice.
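The study's survey data are not public, but the logistic-regression step it describes can be sketched on synthetic data; the variable names and effect sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: IT investment level vs. a binary outcome such as
# perceived competitive advantage (invented numbers, not the study's data).
n = 2000
it_invest = rng.normal(size=n)
true_logit = -0.5 + 1.5 * it_invest
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit a logistic regression by gradient ascent on the average log-likelihood.
X = np.column_stack([np.ones(n), it_invest])
beta = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n
```

With enough observations the fitted slope recovers the sign and rough magnitude of the simulated investment effect, which is the kind of evidence the logistic model yields on survey data.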


A vast share of public services and goods is contracted through procurement auctions, so it is important to design these auctions optimally. Typically, we are interested in two objectives. The first is efficiency: the contract should be awarded to the bidder that values it the most, which in a procurement setting means the bidder with the lowest cost of providing a service of a given quality. The second is maximizing public revenue, that is, minimizing the costs of procurement. Both goals matter from a welfare point of view. In this thesis, I analyze field data from procurement auctions and show how empirical analysis can help design auctions that maximize public revenue. In particular, I concentrate on how competition, that is, the number of bidders, should be taken into account in auction design.

In the first chapter, the main policy question is whether the auctioneer should spend resources to induce more competition. The information paradigm is essential in analyzing the effects of competition. We speak of a private values paradigm when bidders know their valuations exactly; in a common values paradigm, information about the value of the object is dispersed among the bidders. With private values, more competition always increases public revenue, but with common values the effect of competition is ambiguous. I study the effects of competition in the City of Helsinki bus transit market by conducting tests for common values, and I extend an existing test by allowing bidder asymmetry. The information paradigm appears to be one of common values: bus companies with garages close to the contracted routes are influenced more by the common value elements than those whose garages are further away. Attracting more bidders therefore does not necessarily lower procurement costs, and the City should not implement costly policies to induce more competition.

In the second chapter, I ask how the auctioneer can increase its revenue by changing contract characteristics such as contract sizes and durations. I find that the City of Helsinki should shorten the contract duration in the bus transit auctions, because that would decrease the importance of the common value components and cheaply increase entry, which would then have a more beneficial impact on public revenue.

Typically, cartels decrease public revenue significantly. In the third chapter, I propose a new statistical method for detecting collusion and compare it with an existing test. I argue that my test, unlike the existing one, is robust to unobserved heterogeneity. I apply both methods to auctions contracting snow removal at schools in Helsinki. According to these tests, the bidding behavior of two of the bidders is consistent with a contract allocation scheme.
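The private values benchmark, where more bidders always lower the expected price, can be illustrated with a small simulation. This uses the textbook symmetric equilibrium for a first-price procurement auction with i.i.d. U[0,1] costs, where the lowest bid wins and b(c) = c + (1 − c)/n; it is a stylized sketch, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(1)

def expected_price(n_bidders, n_auctions=20000):
    """Mean winning bid in simulated first-price procurement auctions with
    i.i.d. U[0,1] costs; equilibrium bid is b(c) = c + (1 - c)/n."""
    costs = rng.random((n_auctions, n_bidders))
    winner_cost = costs.min(axis=1)
    return float(np.mean(winner_cost + (1.0 - winner_cost) / n_bidders))

# Expected procurement cost falls as competition increases.
p2, p10 = expected_price(2), expected_price(10)
```

With common values this monotone relationship can break down, which is why testing the information paradigm, as the first chapter does, matters before subsidizing entry.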


Business process models have become an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally efficacious, they can be quite time consuming and risk introducing inaccuracies when information is forgotten or misinterpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification is examined. This method allows stakeholders to enter a virtual world and role-play actions much as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. An empirical investigation comparing both the modelling outputs and participant behaviour of this virtual world elicitor with an S-BPM process modelling tool found that, while the modelling approaches of the two groups varied greatly, the virtual world elicitor may not only improve the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
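The automatic model-building step can be sketched simply: each completed action is logged, and consecutive actions become precedence edges in the derived process model. The event format and task names below are hypothetical:

```python
from collections import defaultdict

def build_process_model(action_log):
    """Derive task-order relations from role-played action traces: each
    consecutive pair (a, b) in a trace becomes a directed edge a -> b."""
    successors = defaultdict(set)
    for trace in action_log:
        for a, b in zip(trace, trace[1:]):
            successors[a].add(b)
    return dict(successors)

# Hypothetical traces captured while stakeholders role-play order handling.
log = [
    ["receive_order", "check_stock", "pack_goods", "ship"],
    ["receive_order", "check_stock", "back_order", "pack_goods", "ship"],
]
model = build_process_model(log)
```

Merging traces from several role-play sessions this way yields the branching structure (here, the stock check leading either to packing or to a back order) without anyone drawing a diagram.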


Electroencephalogram (EEG) recordings are often contaminated with ocular artifacts such as eye blinks and eye movements. These artifacts may obscure underlying brain activity in the EEG data and make analysis difficult. In this paper, we explore the use of an empirical mode decomposition (EMD) based filtering technique to correct eye blink and eye movement artifacts in single-channel EEG data. The single-channel EEG data containing an ocular artifact is segmented such that the artifact in each segment can be treated as a slowly varying trend, and EMD is used to remove that trend. The filtering is done by partial reconstruction from the components of the decomposition. The method is completely data dependent, and hence adaptive and nonlinear. Experimental results on real EEG data are provided to check the applicability of the method, and the results are quantified using power spectral density (PSD) as a measure. The method gives fairly good results and requires no prior knowledge of the artifacts or the EEG data.
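The detrending idea can be sketched on a synthetic segment. This is a simplified illustration, not the paper's implementation: it extracts a single intrinsic mode function by sifting with linear interpolation envelopes (standard EMD uses cubic splines) and treats the sifting residue as the slowly varying ocular trend to be discarded:

```python
import numpy as np

def local_extrema(x):
    d = np.diff(x)
    maxima = np.where((np.hstack([0.0, d]) > 0) & (np.hstack([d, 0.0]) < 0))[0]
    minima = np.where((np.hstack([0.0, d]) < 0) & (np.hstack([d, 0.0]) > 0))[0]
    return maxima, minima

def sift_imf(x, n_sift=8):
    """Extract one intrinsic mode function by repeatedly subtracting the mean
    of the upper and lower envelopes (linear envelopes for simplicity)."""
    h, t = x.copy(), np.arange(len(x))
    for _ in range(n_sift):
        maxima, minima = local_extrema(h)
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = np.interp(t, maxima, h[maxima])
        lower = np.interp(t, minima, h[minima])
        h = h - 0.5 * (upper + lower)
    return h

# Synthetic "EEG" segment: a fast oscillation plus a slow blink-like trend.
t = np.linspace(0.0, 1.0, 1000)
fast = np.sin(2 * np.pi * 40 * t)
blink = 5.0 * np.exp(-((t - 0.5) ** 2) / 0.005)
contaminated = fast + blink
corrected = sift_imf(contaminated)  # keep the fast IMF, drop the slow trend
```

Keeping only the fast mode is the one-IMF case of the partial reconstruction the paper describes; on real EEG several IMFs would be retained and only the slowest components discarded.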


Inventory management (IM) plays a decisive role in enhancing the competitiveness of manufacturing industry, and major manufacturers therefore follow IM practices with the intention of improving their performance. However, efforts to introduce IM in small and medium enterprises (SMEs) are very limited, owing to lack of initiative, lack of expertise, and financial constraints. This paper aims to provide a guideline for entrepreneurs to enhance their IM performance; it presents the results of a survey-based study of machine tool SMEs in Bangalore. Having established the significance of inventory as an input, we probed the relationship between the IM performance and the economic performance of these SMEs. To the extent possible, all factors of production and performance indicators were deliberately measured in purely economic terms. All the economic performance indicators adopted show a positive and significant association with IM performance in SMEs. On the whole, we find that SMEs which are IM-efficient are likely to perform better economically as well, and to experience higher returns to scale.


Electrical conduction in insulating materials is a complex process, and several theories have been suggested in the literature. Many phenomenological empirical models are in use in the DC cable literature, yet the impact of choosing one model over another for cable insulation has not been investigated until now, beyond claims of relative accuracy. The steady-state electric field in DC cable insulation is known to be a strong function of the DC conductivity, which in turn is a complex function of electric field and temperature. As a result, under certain conditions, the stress at the cable screen is higher than that at the conductor boundary. This paper presents detailed investigations of the different empirical conductivity models suggested in the literature for HVDC cable applications. It is expressly shown that certain models give rise to erroneous results in electric field and temperature computations, and it is pointed out that using these models in the design or evaluation of cables will lead to errors.
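The stress-inversion mechanism can be sketched with one commonly used empirical form, σ(T,E) = σ0·exp(αT + βE); the paper compares several such models, and every parameter value below is illustrative rather than a real cable design. Current continuity fixes the radial leakage current density J(r), and the local field follows from solving σ(T,E)·E = J(r):

```python
import math

def field_at(T, J, sigma0=1e-15, alpha=0.1, beta=1e-7):
    """Solve sigma0*exp(alpha*T + beta*E)*E = J for E by bisection.
    One empirical conductivity form; all parameters are illustrative."""
    lo, hi = 0.0, 1e9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if sigma0 * math.exp(alpha * T + beta * mid) * mid < J:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Coaxial geometry: conductor at r_i (hot under load), screen at r_o (cooler).
r_i, r_o = 0.01, 0.02        # m
T_i, T_o = 60.0, 40.0        # degC
I_leak = 1e-9                # leakage current per metre of cable, illustrative
E_i = field_at(T_i, I_leak / (2 * math.pi * r_i))  # field at conductor
E_o = field_at(T_o, I_leak / (2 * math.pi * r_o))  # field at screen
```

With this temperature drop the hotter insulation near the conductor is much more conductive, so the computed stress at the screen exceeds that at the conductor, the inversion the abstract describes; a different conductivity model with different α and β can shift or suppress this effect, which is why the model choice matters.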


Supercritical processes have gained importance in recent years in food, environmental and pharmaceutical product processing. The design of any supercritical process needs accurate experimental data on the solubilities of solids in supercritical fluids (SCFs). Empirical equations are quite successful in correlating the solubilities of solid compounds in SCFs, both in the presence and absence of cosolvents. In this work, existing solvate complex models are discussed and a new set of empirical equations is proposed. These equations correlate the solubilities of solids in supercritical carbon dioxide (with and without cosolvents) as a function of temperature, the density of supercritical carbon dioxide and the mole fraction of cosolvent. The accuracy of the proposed models was evaluated by correlating 15 binary and 18 ternary systems; the proposed models provided the best overall correlations. © 2009 Elsevier B.V. All rights reserved.
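The paper's new equations are not reproduced in the abstract, but Chrastil's density-based equation is a classic member of this family of empirical correlations and gives their flavor. Because ln S = k·ln ρ + a/T + b is linear in its parameters, it can be fitted by ordinary least squares; a sketch on synthetic, hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic solubility data generated from a Chrastil-type law
# (parameter values are hypothetical, not from any measured system).
k_true, a_true, b_true = 5.0, -4000.0, 2.0
rho = rng.uniform(300.0, 900.0, size=40)   # CO2 density, kg/m^3
T = rng.uniform(308.0, 338.0, size=40)     # temperature, K
ln_S = k_true * np.log(rho) + a_true / T + b_true

# ln S = k*ln(rho) + a/T + b is linear in (k, a, b): fit by least squares.
A = np.column_stack([np.log(rho), 1.0 / T, np.ones_like(T)])
k_fit, a_fit, b_fit = np.linalg.lstsq(A, ln_S, rcond=None)[0]
```

Extending such correlations with a cosolvent mole-fraction term, as the paper does for ternary systems, keeps the fit linear in the parameters and equally cheap to evaluate.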


This paper deals with the development of simplified semi-empirical relations for predicting the residual velocities of small calibre projectiles impacting mild steel target plates, normally or at an angle, and the ballistic limits of such plates. It is shown, for several impact cases for which test results on perforation of mild steel plates are available, that most existing semi-empirical relations, which are applicable only to normal projectile impact, do not yield satisfactory estimates of residual velocity. Furthermore, it is difficult to quantify some of the empirical parameters present in these relations for a given problem. With an eye towards simplicity and ease of use, two new regression-based relations employing standard material parameters are proposed here for predicting residual velocity and ballistic limit for both normal and oblique impact. The two expressions differ in their use of quasi-static versus strain-rate-dependent average plate material strength. Residual velocities yielded by the present semi-empirical models compare well with experimental results, and ballistic limits from these relations correlate closely with the corresponding finite element predictions.
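The paper's own regression relations are not reproduced in the abstract, but the widely used Lambert-Jonas form shows the shape of such semi-empirical residual-velocity models; the constants a and p are fit parameters for a given projectile-plate pairing, and the values below are hypothetical:

```python
def residual_velocity(v_impact, v_bl, a=1.0, p=2.0):
    """Lambert-Jonas empirical relation: Vr = a*(Vi^p - Vbl^p)^(1/p) when the
    impact velocity Vi exceeds the ballistic limit Vbl, else 0.
    Constants a and p here are hypothetical fit parameters."""
    if v_impact <= v_bl:
        return 0.0
    return a * (v_impact ** p - v_bl ** p) ** (1.0 / p)

# Hypothetical mild steel plate with a 300 m/s ballistic limit.
vr = residual_velocity(400.0, 300.0)
```

Below the ballistic limit the projectile is stopped, and just above it the residual velocity rises steeply, which is why small errors in the fitted Vbl translate into large errors in predicted Vr near the limit.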


This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models of compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average prediction error) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
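A toy version of the model-then-search loop looks like this; a hypothetical linear-plus-interaction function stands in for measured performance, with four binary flags, and none of the numbers come from the paper:

```python
import numpy as np
from itertools import product

# Hypothetical ground truth: runtime as a function of 4 binary flags,
# including one flag-flag interaction (illustrative, not measured data).
def runtime(flags):
    f = np.asarray(flags, dtype=float)
    return 10.0 - 1.0 * f[0] - 0.5 * f[1] + 0.8 * f[2] - 1.5 * f[0] * f[3]

configs = np.array(list(product([0, 1], repeat=4)), dtype=float)
y = np.array([runtime(c) for c in configs])

# Linear model with pairwise interactions: rich enough to capture the truth.
def features(c):
    pairs = [c[i] * c[j] for i in range(4) for j in range(i + 1, 4)]
    return np.hstack([1.0, c, pairs])

X = np.array([features(c) for c in configs])
w = np.linalg.lstsq(X, y, rcond=None)[0]

# Model-based search for the fastest configuration.
pred = X @ w
best = configs[np.argmin(pred)]
```

In the paper's setting the model is trained on a small sample of configurations and then queried over the full space; the interaction weights in w are also what quantifies flag-microarchitecture interactions.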


Objective(s): To describe how doctors define and use the terms "futility" and "futile treatment" in end-of-life care.

Design, Setting, Participants: A qualitative study using semi-structured interviews with 96 doctors across a range of specialties who treat adults at the end of life. Doctors were recruited from three large Australian teaching hospitals and were interviewed from May to July 2013.

Results: Doctors' conceptions of futility focused on the quality and chance of patient benefit. Aspects of benefit included physiological effect, weighing benefits and burdens, and quantity and quality of life. Quality and length of life were linked, but many doctors discussed instances when benefit was determined by quality of life alone. Most doctors described the assessment of chance of success in achieving patient benefit as a subjective exercise. Despite a broad conceptual consensus about what futility means, doctors noted variability in how the concept was applied in clinical decision-making. Over half the doctors also identified treatment that is futile but nevertheless justified, such as short-term treatment as part of supporting the family of a dying person.

Conclusions: There is an overwhelming preference for a qualitative approach to assessing futility, which brings with it variation in clinical decision-making. "Patient benefit" is at the heart of doctors' definitions of futility. Determining patient benefit requires discussions with patients and families about their values and goals as well as the burdens and benefits of further treatment.


This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the earlier literature. In particular, we are interested in probit models with an autoregressive model structure.

In Chapter 2, the main objective is to compare the predictive performance of static and dynamic probit models in forecasting U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictors. The empirical results suggest that recession periods are predictable, and that dynamic probit models, especially those with an autoregressive structure, outperform the static model.

Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite-sample properties of the LM test are examined with simulation experiments. The results indicate that the two alternative LM test statistics have reasonable size and power in large samples; in small samples, a parametric bootstrap method is suggested to obtain approximately correct size.

In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the sign of the stock return. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures.

Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2 to 4 to a bivariate model. A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle. Evidence of predictability of both cycle indicators is obtained, and the bivariate model is found to outperform the univariate models in terms of predictive power.
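A minimal sketch of a dynamic probit, with a lagged dependent variable and maximum likelihood estimation on simulated data, shows the kind of model involved; this is an illustration, not the thesis's exact specification, and the coefficients are invented:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Simulate y_t ~ Bernoulli(Phi(w0 + w1*x_t + w2*y_{t-1})): a dynamic probit
# where the lagged state feeds back into the next period's probability.
T = 3000
x = rng.normal(size=T)
true_w = np.array([-0.3, 0.8, 1.0])
y = np.zeros(T)
for t in range(1, T):
    pi = norm.cdf(true_w[0] + true_w[1] * x[t] + true_w[2] * y[t - 1])
    y[t] = float(rng.random() < pi)

Z = np.column_stack([np.ones(T - 1), x[1:], y[:-1]])
yy = y[1:]

def neg_loglik(w):
    p = np.clip(norm.cdf(Z @ w), 1e-10, 1 - 1e-10)
    return -np.sum(yy * np.log(p) + (1 - yy) * np.log(1 - p))

w_hat = minimize(neg_loglik, np.zeros(3), method="BFGS").x
```

A positive fitted lag coefficient captures the persistence of recession states; the autoregressive variants studied in the thesis go further by letting the latent probit index itself be autoregressive.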