6 results for Panel data probit model
at Universidad de Alicante
Abstract:
Nowadays, data mining is based on low-level specifications of the employed techniques typically bounded to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Here, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on the analysis problem via conceptual data-mining models instead of low-level programming tasks related to the underlying-platform technical details. These tasks are now entrusted to the model-transformations scaffolding.
Abstract:
Data mining is one of the most important analysis techniques to automatically extract knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques typically bounded to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Bearing in mind this situation, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the underlying-platform technical details. These time-consuming tasks are now entrusted to the model-transformations scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time series analysis is required.
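The core idea of the approach described above is model-to-text transformation: a platform-independent mining specification drives generators that emit both the warehouse-side data definition and a platform-specific analysis artifact. The following toy sketch illustrates that pattern only; the keys, view naming, and target formats (SQL view, Weka ARFF header) are illustrative assumptions, not the paper's actual metamodel or transformations.

```python
# Toy model-to-text transformation: one platform-independent spec,
# two platform-specific generators (names and formats are illustrative).
spec = {"table": "sales", "target": "amount"}

def to_sql_view(spec):
    """Generate the data under analysis as a SQL view (warehouse side)."""
    return (f"CREATE VIEW {spec['table']}_mining AS "
            f"SELECT {spec['target']} FROM {spec['table']};")

def to_weka_header(spec):
    """Generate an ARFF header for the analysis platform (Weka side)."""
    return (f"@RELATION {spec['table']}\n"
            f"@ATTRIBUTE {spec['target']} NUMERIC\n"
            f"@DATA")

sql = to_sql_view(spec)
arff = to_weka_header(spec)
```

The analyst edits only `spec`; regenerating both artifacts keeps the warehouse view and the mining input consistent, which is the point of the transformation scaffolding.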
Abstract:
Paper presented at the IX Workshop de Agentes Físicos (WAF'2008), Vigo, 11-12 September 2008.
Abstract:
The transitions and reactions involved in the thermal treatment of several commercial azodicarbonamides (ADC) in an inert atmosphere have been studied by dynamic thermogravimetric analysis (TGA), mass spectrometry and Fourier transform infrared (FTIR) spectroscopy. A pseudo-mechanistic model, involving several competitive and non-competitive reactions, has been suggested and applied to the correlation of the weight-loss data. The model applied is capable of accurately representing the different processes involved, and can be of great interest for the understanding and quantification of such phenomena, including the simulation of the instantaneous amount of gases evolved in a foaming process. In addition, a brief discussion on the methodology related to the mathematical modeling of TGA data is presented, taking into account the complex thermal behaviour of the ADC.
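The kind of pseudo-mechanistic TGA model described above typically represents the sample as a sum of fractions that decompose through parallel nth-order Arrhenius reactions under a linear heating ramp. The sketch below shows that general scheme with two hypothetical parallel steps; all kinetic parameters are invented for illustration and are not the values fitted in the paper.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def simulate_tga(A, E, n, w_frac, beta=10 / 60, T0=300.0, T_end=600.0,
                 steps=20000):
    """Normalized TGA weight-loss curve for parallel nth-order Arrhenius
    reactions under a linear heating ramp.

    A: pre-exponential factors (1/s); E: activation energies (J/mol);
    n: reaction orders; w_frac: weight fraction lost by each reaction;
    beta: heating rate (K/s). Integrated with a simple Euler scheme.
    """
    T = np.linspace(T0, T_end, steps)
    dt = (T[1] - T[0]) / beta  # time per temperature step
    alpha = np.zeros((len(A), steps))  # conversion of each reaction
    for i in range(len(A)):
        for j in range(1, steps):
            k = A[i] * np.exp(-E[i] / (R * T[j - 1]))
            da = k * max(1.0 - alpha[i, j - 1], 0.0) ** n[i] * dt
            alpha[i, j] = min(alpha[i, j - 1] + da, 1.0)
    weight = 1.0 - np.sum(np.array(w_frac)[:, None] * alpha, axis=0)
    return T, weight

# Two hypothetical parallel decomposition steps (illustrative parameters)
T, w = simulate_tga(A=[1e12, 1e9], E=[150e3, 130e3], n=[1.0, 1.0],
                    w_frac=[0.2, 0.1])
```

Fitting such a model means adjusting `A`, `E`, `n` and `w_frac` until the simulated curve matches the experimental weight-loss data; the per-reaction `alpha` histories then give the instantaneous gas evolution mentioned in the abstract.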
Abstract:
The objective of this paper is to estimate technical efficiency in retailing, and the influence of inventory investment, wage levels, and firm age on this efficiency. We use as output the supermarket chains' sales volume, calculated by isolating the retailer price effect on sales revenue. This output allows us to estimate a strictly technical concept of efficiency. The methodology is based on the estimation of a stochastic parametric function. The empirical analyses, applied to panel data on a sample of 42 supermarket chains between 2000 and 2002, show that inventory investment and wage level have an impact on technical efficiency. In comparison, the effect of these factors on efficiency calculated through a monetary output (sales revenue) shows some differences that could be due to aspects related to product prices.
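The stochastic parametric approach mentioned above rests on a production frontier with a composed error: noise plus a one-sided inefficiency term, with technical efficiency recovered from the distance to the frontier. The following minimal sketch uses simulated data and corrected OLS (shifting the intercept to the best-performing unit) rather than the paper's exact maximum-likelihood specification; the functional form and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Cobb-Douglas frontier with inefficiency:
#   ln y = b0 + b1*ln x + v - u,  v ~ N(0, 0.05) noise, u >= 0 inefficiency
n = 500
ln_x = rng.normal(3.0, 0.5, n)          # log input (e.g., labour)
v = rng.normal(0.0, 0.05, n)            # symmetric noise
u = np.abs(rng.normal(0.0, 0.2, n))     # half-normal inefficiency
ln_y = 1.0 + 0.7 * ln_x + v - u         # log output (e.g., sales volume)

# Corrected OLS: fit by least squares, then shift the intercept so the
# best observation sits on the frontier.
X = np.column_stack([np.ones(n), ln_x])
beta, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
resid = ln_y - X @ beta
efficiency = np.exp(resid - resid.max())  # technical efficiency in (0, 1]
```

With a physical output such as sales volume, `efficiency` captures a strictly technical notion; regressing these scores on inventory investment, wages, and firm age would mirror the second stage of the analysis described in the abstract.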
Abstract:
Several studies have analyzed discretionary accruals to address earnings-smoothing behaviors in the banking industry. We argue that the characteristic link between accruals and earnings may be nonlinear, since both the incentives to manipulate income and the practical way to do so depend partially on the relative size of earnings. Given a sample of 15,268 US banks over the period 1996–2011, the main results in this paper suggest that, depending on the size of earnings, bank managers tend to engage in earnings-decreasing strategies when earnings are negative (“big-bath”), use earnings-increasing strategies when earnings are positive, and use provisions as a smoothing device when earnings are positive and substantial (“cookie-jar” accounting). This evidence, which cannot be explained by the earnings-smoothing hypothesis, is consistent with the compensation theory. Neglecting nonlinear patterns in the econometric modeling of these accruals may lead to misleading conclusions regarding the characteristic strategies used in earnings management.
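The nonlinearity described above can be captured econometrically by letting the accruals-earnings slope differ across earnings regimes (losses, small profits, large profits) instead of imposing a single linear coefficient. The sketch below illustrates that piecewise idea on simulated data; the cutoffs, coefficients, and the use of provisions as the accrual are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated earnings (scaled by assets) and provisions with different
# managerial regimes by earnings size (illustrative data-generating process):
#   earnings < 0           -> extra provisions ("big bath")
#   0 <= earnings < 0.02   -> provisions cut (income-increasing)
#   earnings >= 0.02       -> provisions rise with earnings ("cookie jar")
n = 2000
e = rng.normal(0.01, 0.02, n)
noise = rng.normal(0.0, 0.002, n)
prov = np.where(e < 0, 0.010 - 0.3 * e,
       np.where(e < 0.02, 0.004 - 0.1 * e,
                          0.002 + 0.25 * e)) + noise

def regime_slopes(e, prov, cut=0.02):
    """OLS slope of provisions on earnings within each earnings regime."""
    slopes = {}
    for name, mask in [("loss", e < 0),
                       ("small_profit", (e >= 0) & (e < cut)),
                       ("large_profit", e >= cut)]:
        X = np.column_stack([np.ones(mask.sum()), e[mask]])
        b, *_ = np.linalg.lstsq(X, prov[mask], rcond=None)
        slopes[name] = b[1]
    return slopes

slopes = regime_slopes(e, prov)
```

A single pooled regression would average these opposite-signed slopes toward zero, which is exactly the misleading conclusion the abstract warns against when nonlinear patterns are neglected.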