137 results for Variable selection
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Over the last 30 years, the proliferation of quantitative models for predicting corporate insolvency in the accounting and finance literature has attracted great interest among specialists and researchers in the field. What began as models built with a single objective has evolved into a constant source of research. In this paper we formulate an insolvency prediction model by combining different quantitative variables extracted from the financial statements of a sample of firms for the years 1994-1997. A stepwise procedure is used to select and interpret the variables that contribute the most information. Once this first type of model is formulated, an alternative to the previous variables is sought through the factorial technique of principal component analysis. This technique is used to select variables, and univariate analysis is then applied to them together with the previous ratios. Finally, the resulting models are compared, and we conclude that although the prior literature reports better classification rates, the models obtained through principal component analysis should not be rejected, given how clearly they explain the causes that lead a firm to insolvency.
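A minimal sketch of the two approaches this abstract contrasts, assuming a logistic insolvency model and scikit-learn: forward stepwise selection over accounting ratios, and principal component analysis as the alternative source of variables. The data, model choice, and number of retained variables are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))      # 10 hypothetical accounting ratios, 200 firms
y = rng.integers(0, 2, size=200)    # 1 = insolvent, 0 = healthy (simulated labels)

# Approach 1: stepwise (forward) selection of the most informative ratios.
stepwise = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=3, direction="forward").fit(X, y)
print("ratios kept:", stepwise.get_support(indices=True))

# Approach 2: principal component analysis as an alternative set of variables.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(Z)
print("variance explained:", pca.explained_variance_ratio_.round(2))
```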
Abstract:
The availability of rich firm-level data sets has recently led researchers to uncover new evidence on the effects of trade liberalization. First, trade openness forces the least productive firms to exit the market. Second, it induces surviving firms to increase their innovation efforts, and third, it increases the degree of product market competition. In this paper we propose a model aimed at providing a coherent interpretation of these findings. We introduce firm heterogeneity into an innovation-driven growth model in which incumbent firms operating in oligopolistic industries perform cost-reducing innovations. In this framework, trade liberalization leads to higher product market competition, lower markups, and higher quantities produced. These changes in markups and quantities, in turn, promote innovation and productivity growth through a direct competition effect, based on the increase in the size of the market, and a selection effect, produced by the reallocation of resources towards more productive firms. Calibrated to match US aggregate and firm-level statistics, the model predicts that a 10 percent reduction in variable trade costs reduces markups by 1.15 percent and firm survival probabilities by 1 percent, and induces an increase in productivity growth of about 13 percent. More than 90 percent of the trade-induced growth increase can be attributed to the selection effect.
Abstract:
Current technology trends in the medical device industry call for the fabrication of massive arrays of microfeatures, such as microchannels, on non-silicon substrates with high accuracy, superior precision, and high throughput. Microchannels are typical features used in medical devices for dosing medication into the human body and for analyzing DNA arrays or cell cultures. In this study, the capabilities of machining systems for micro-end milling are evaluated through experiments, regression modeling, and response surface methodology. In micromilling experiments, arrays of microchannels are fabricated on aluminium and titanium plates, and the feature size, accuracy (width and depth), and surface roughness are measured. Multicriteria decision making for the selection of materials and process parameters for a desired accuracy is investigated using the particle swarm optimization (PSO) method, an evolutionary computation technique inspired by genetic algorithms (GA). Appropriate regression models are utilized within the PSO, and optimum selection of micromilling parameters for microchannel feature accuracy and surface roughness is performed. An analysis of optimal micromachining parameters in the decision variable space is also conducted. This study demonstrates the advantages of evolutionary computation algorithms in micromilling decision making and process optimization investigations, and it can be expanded to other applications.
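A compact sketch of the optimization step described above: a fitted regression model serves as the objective, and a particle swarm searches the feasible parameter space. The quadratic roughness surrogate and the normalized parameter ranges below are hypothetical stand-ins, not the paper's fitted models.

```python
import numpy as np

def roughness(p):
    # Hypothetical quadratic surrogate for surface roughness over
    # normalized (feed rate, spindle speed); not the paper's fitted model.
    f, s = p[..., 0], p[..., 1]
    return (f - 0.3) ** 2 + 0.5 * (s - 0.7) ** 2 + 0.2 * f * s

rng = np.random.default_rng(1)
n, dims = 30, 2
w, c1, c2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients
x = rng.uniform(0, 1, (n, dims))    # particle positions in the scaled space
v = np.zeros_like(x)                # particle velocities
pbest, pbest_val = x.copy(), roughness(x)
for _ in range(100):
    g = pbest[pbest_val.argmin()]   # global best position so far
    r1, r2 = rng.uniform(size=(2, n, dims))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = np.clip(x + v, 0, 1)
    val = roughness(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
print("best (feed, speed):", pbest[pbest_val.argmin()].round(3))
```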
Abstract:
Markowitz portfolio theory (1952) has induced research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single-period portfolio selection from a theoretical perspective and generalises currently used efficiency measures to the full mean-variance space. To this end, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of the Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows us to distinguish between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
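As a concrete illustration of the quadratic-optimisation machinery behind such frontier measures, the sketch below asks how much the variance of an evaluated portfolio could be reduced while holding its expected return fixed. This is a simplified, one-dimensional cousin of the efficiency improvement possibility function described above, which contracts risk and expands return simultaneously; the returns and covariance matrix are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.10])        # expected returns (hypothetical)
S = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.06]])       # covariance matrix (hypothetical)
w0 = np.array([0.5, 0.3, 0.2])           # portfolio under evaluation

# Minimize variance subject to full investment and unchanged expected return.
res = minimize(lambda w: w @ S @ w, w0,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1},
                            {"type": "eq", "fun": lambda w: w @ mu - w0 @ mu}],
               bounds=[(0, 1)] * 3)
print("variance reduction possible:", w0 @ S @ w0 - res.fun)
```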
Abstract:
This comment corrects errors in the estimation process that appears in Martins (2001). The first error is in the parametric probit estimation: the previously presented results do not maximize the log-likelihood function, and at the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may lie outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
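A minimal sketch of the parametric step at issue: maximising a probit log-likelihood numerically, so that the reported coefficients are an actual maximiser rather than a premature stopping point. The data are simulated; this is not Martins's dataset or specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
beta_true = np.array([0.2, 1.0, -0.5])
y = (X @ beta_true + rng.normal(size=500) > 0).astype(float)

def neg_loglik(b):
    # Probit participation probabilities, clipped to keep logs finite.
    p = norm.cdf(X @ b).clip(1e-10, 1 - 1e-10)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()

res = minimize(neg_loglik, np.zeros(3))
print("probit MLE:", res.x.round(2))
```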
Abstract:
Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of a variable based on the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules stating that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs of degrees of accuracy and complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that determining whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that attains a certain value of R² is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
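The hardness claim concerns exactly this search problem: the obvious general way to certify that some k-variable subset reaches a target R² is to enumerate subsets, whose number grows combinatorially with the number of candidate variables. A brute-force sketch on simulated data (all names and thresholds illustrative):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 12))                  # 12 candidate regressors
y = X[:, 0] + 0.5 * X[:, 5] + rng.normal(size=100)

def r2(cols):
    # R^2 of an OLS fit of y on an intercept plus the chosen columns.
    A = np.column_stack([np.ones(100), X[:, list(cols)]])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    tss = (y - y.mean()) @ (y - y.mean())
    return 1 - resid @ resid / tss

k, target = 2, 0.5
subsets = list(combinations(range(12), k))      # grows as C(n, k) with n variables
hits = [c for c in subsets if r2(c) >= target]
print(len(subsets), "subsets checked;", len(hits), "reach R^2 >=", target)
```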
Abstract:
We study whether selection affects motivation. In our experiment, subjects first answer a personality questionnaire. They then play a 3-person game. One of the three players decides between an outside option, which assigns him a positive amount but leaves the other two empty-handed, and letting one of the other two players distribute a pie. Treatments differ in the procedure by which distributive power is assigned: to a randomly determined partner or to a knowingly selected one. Before making her decision, the selecting player could consult the personality questionnaires of the other two players. Results show that knowingly selected players keep less for themselves than randomly selected ones and reward the selecting player more generously.
Abstract:
We study a problem of adverse selection in the context of environmental regulation, where the firm may suffer from a certain degree of ignorance about its own type. In a setting such as the construction of an infrastructure project, ignorance about its impact on the environment can play an important role in the determination of the regulatory policy. First, an optimal contract is constructed for any exogenous level of ignorance. Second, the presence of potentially informed third parties is studied from the perspective of the regulator, which allows us to analyze how the presence of environmentalists and experts affects the efficiency of the contract. We then obtain some insights into how the problem differs when the degree of ignorance is a choice variable for the firm. Finally, we use our results to derive policy implications concerning existing environmental regulation and the potential role of interested parties as information providers.
Abstract:
This paper studies collective choice rules whose outcomes consist of a collection of simultaneous decisions, each one of which is the only concern of some group of individuals in society. The need for such rules arises in different contexts, including the establishment of jurisdictions, the location of multiple public facilities, and the election of representative committees. We define a notion of allocation consistency requiring that each partial aspect of the global decision taken by society as a whole be ratified by the group of agents who are directly concerned with that particular aspect. We investigate the possibility of designing envy-free allocation-consistent rules, and we also explore whether such rules may respect the Condorcet criterion.
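As a small illustration of the Condorcet criterion mentioned above, the sketch below checks whether some alternative beats every other one in pairwise majority comparisons. The rankings and alternatives are hypothetical.

```python
def condorcet_winner(rankings, alternatives):
    """Return the alternative that beats every other one in pairwise
    majority comparisons, or None if no such alternative exists."""
    def beats(a, b):
        wins = sum(r.index(a) < r.index(b) for r in rankings)
        return wins > len(rankings) / 2
    for a in alternatives:
        if all(beats(a, b) for b in alternatives if b != a):
            return a
    return None

# Three agents ranking three candidate locations for a public facility.
rankings = [["north", "centre", "south"],
            ["centre", "south", "north"],
            ["centre", "north", "south"]]
print(condorcet_winner(rankings, ["north", "centre", "south"]))  # -> centre
```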
Abstract:
The aim of this paper is to propose a performance measure suited to equity mutual funds. The specific characteristics of this type of portfolio call for an approach based on the Capital Market Line (CML), so the portfolio's total risk (σp) is chosen as the risk measure. Passive and active strategies are introduced into the analysis, which makes it possible to develop a performance measure that, besides measuring the return due to active management, weights it by the degree of activity assumed by the portfolio under evaluation.
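An illustrative sketch of a CML-based appraisal that uses total risk, in the spirit of the measure described above. The paper's exact weighting by degree of activity is not reproduced here; the activity proxy and all figures below are assumptions for illustration only.

```python
# Hypothetical inputs: risk-free rate, market return and risk, and the
# evaluated fund's return and total risk.
r_f, r_m, sigma_m = 0.03, 0.10, 0.15
r_p, sigma_p = 0.11, 0.18
activity = 0.6   # assumed proxy for the fund's degree of active management

# CML return for a passive portfolio bearing the same total risk sigma_p.
cml_benchmark = r_f + (r_m - r_f) / sigma_m * sigma_p
active_return = r_p - cml_benchmark          # return due to active management
print("activity-weighted performance:", round(active_return * activity, 4))
```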
Abstract:
We study competition in experimental markets in which two incumbents face entry by three other firms. Our treatments vary with respect to three factors: sequential versus block (simultaneous) entry, the cost functions of entrants, and the length of time during which incumbents are protected from entry. Before entry, incumbents are able to collude in all cases. When all firms' costs are the same, entry always drives consumer surplus and profits to their equilibrium levels. When entrants are more efficient than incumbents, entry drives consumer surplus to equilibrium; however, total profits remain below equilibrium because the inefficient incumbents produce too much and the efficient entrants produce too little. Market behavior is thus satisfactory from the consumers' standpoint but does not send adequate signals to other potential entrants. These results are not affected by whether entry is simultaneous or sequential. The length of the incumbency phase does have some subtle effects.